One of the most captivating books of 2010 was not a gory science-fiction thriller or a gripping end-of-the-world page-turner, though its subject matter is equally engrossing and out of the ordinary. It is about somewhat crazy people doing crazy things as seen through the lens of the man who has been treating them for decades. The Naked Lady Who Stood On Her Head is a first-of-its-kind psych ward memoir, the tale of a curious doctor/scientist and his most extreme, bizarre, and sometimes touching cases from the nation’s most prestigious neurology centers and universities. Included in ScriptPhD.com’s review is a podcast interview with Dr. Small, as well as the opportunity to win a free autographed copy of his book. Our end-of-the-year science library pick is under the “continue reading” cut.
Gary Small is a very unlikely candidate for the chaos that many of us associate with a psych ward. Whether through frantic psych consults on ER or fond remembrance of Jack Nicholson and his cohorts in One Flew Over The Cuckoo’s Nest, most of us naturally link psychiatry with insanity or pandemonium. Meeting Dr. Small in real life is the antithesis of these scenarios. Warm, welcoming, serene and genuinely affable, his voice translates directly from the pages of his latest book. Told in chronological order—from his days as a young, curious, inexperienced intern at Harvard’s Massachusetts General Hospital to his tenure as a world-renowned neuroscientist at UCLA—The Naked Lady Who Stood On Her Head feels like an enormous learning and growing experience for Dr. Small, his patients, and the reader.
The scene plays out like a standard medical drama or movie. In the beginning, the young, bright-eyed, bushy-tailed, trepidatious doctor is exploring while learning the ropes on duty. There is, in the self-titled chapter, literally a naked lady standing on her head in the middle of a Boston psych ward. Dr. Small is the only doctor who can cure her baffling ailment, but in doing so, he only begins to peel away at what is really troubling her. There is a bevy of inexplicable fainting schoolgirls afflicting the Boston suburbs; only through a fresh pair of eager eyes is the root cause uncovered, a cause that to this day sets the standard for mass hysteria treatment nationwide. And there is a mute hip painter from Venice Beach, immobile for weeks until Small, fighting the rigid senior attendings, arrives at the unlikely diagnosis. As the book, and Dr. Small’s career, flourish, we meet a WebMD mom, a young man literally blinded by his family’s pressure, a man whose fiancée’s obsession with Disney characters resurfaces a painful childhood secret, and Dr. Small’s touching story of having to watch as the mentor he introduced at the book’s beginning hires him as a therapist so that Small can diagnose his teacher’s dementia. Ultimately, all of the characters of The Naked Lady Who Stood on Her Head, and Dr. Small’s dedication and respect, have a common thread. They are real, they are diverse, and they are us. Psych patients are not one-dimensional figments of a screenwriter’s imagination. They are the brother who has childhood trauma, the friend with a dysfunctional or abusive family, the husband or wife with a rare genetic predisposition, and all of us are but one degree away from the abnormal behavior that these conditions can ignite. In his book, Dr. Small has pulled back the curtain of a notoriously secretive and mysterious field. It’s a riveting reveal, and absolutely worth an appointment. The Naked Lady Who Stood On Her Head has been optioned by 20th Century Fox, and may be coming to your television soon!
Podcast Interview
In addition to his latest memoir, Gary Small is the author of the best-selling global phenomenon The Memory Bible: An Innovative Strategy For Keeping Your Brain Young and a regular contributor to The Huffington Post (several excellent recent articles can be found here and here). His seminal research on Alzheimer’s disease, aging and brain training has been featured in recent coverage by NPR and Newsweek. A landmark brain imaging study recently completed in his laboratory garnered worldwide media attention for suggesting that Google searching can stimulate the brain and keep aging minds agile. Dr. Small regularly updates his research and musings on his personal blog.
ScriptPhD.com Editor Jovana Grbić sat down for a one-on-one podcast with Dr. Small and discussed inspiration for the book, current and future aspects of psychiatry, and the role that media and entertainment have played and continue to play in shaping our perception of this important field. Click the “play” button to listen:
In addition to our review and podcast, ScriptPhD.com will be giving away a free signed copy of The Naked Lady Who Stood On Her Head via our Facebook fan page as a little holiday gift for our Facebook fans. Join us and drop a comment on the giveaway announcement for eligibility. Happy Holidays from ScriptPhD.com!
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
Each of the brain’s 100 billion neurons has somewhere in the realm of 7,000 connections to other neurons, creating a tangled roadmap of about 700 trillion possible turns. But thinking of the brain as roads makes it sound very fixed—you know, pavement, and rebar, and steel girders and all. But the opposite is true: at work in our brains are never-sleeping teams of Fraggles and Doozers who rip apart the roads, build new ones, and are constantly at work retooling the brain’s intersections. This study of Fraggles and Doozers is the booming field of neuroplasticity: how the basic architecture of the brain changes over time. Scientist, neuro math geek, Science Channel personality and accomplished author Garth Sundem writes for ScriptPhD.com about the phenomenon of brain training and memory.
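Incidentally, the roadmap arithmetic above checks out, as a two-line Python sanity check confirms:

```python
neurons = 100e9          # ~100 billion neurons (figure from the article)
connections_each = 7000  # ~7,000 connections per neuron
print(f"{neurons * connections_each:,.0f} total connections")  # 700,000,000,000,000
```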
Certainly the brain is plastic—the gray matter you wake up with is not the stuff you take to sleep at night. But what changes the brain? How do the Fraggles know what to rip apart and how do the Doozers know what to build? Part of the answer lies in a simple idea: neurons that fire together, wire together. This is an integral part of the process we call learning. When you have a thought or perform a task, a car leaves point A in your brain and travels to point B. The first time you do something, the route from point A to B might be circuitous and the car might take wrong turns, but the more the car travels this same route, the more efficient the pathway becomes. Your brain learns to more efficiently pass this information through its neural net.
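For readers who like code, “fire together, wire together” is the intuition behind Hebbian learning, and its simplest form fits in a few lines. Here is a toy Python sketch of a single A-to-B connection strengthening with repeated co-activation (illustrative only; biological plasticity is far messier):

```python
# Toy Hebbian update: co-activation strengthens the A->B connection.
# Illustrative only; real synaptic plasticity is far more complex.
def hebbian_update(weight, pre_active, post_active, lr=0.2):
    """Strengthen the connection when both neurons fire together."""
    if pre_active and post_active:
        weight += lr * (1.0 - weight)  # saturating growth toward 1.0
    return weight

weight = 0.05  # a weak, circuitous first route from A to B
for trip in range(1, 9):
    weight = hebbian_update(weight, pre_active=True, post_active=True)
    print(f"trip {trip}: connection strength = {weight:.3f}")
```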
A simple example of this “firing together is wiring together” is seen in the infant hippocampus. The hippocampus packages memories for storage deeper in the brain: an experience goes in and a bundle comes out. I think of it like the pegboard at the Seattle Science Center: you drop a bouncy ball in the top and it ricochets down through the matrix of pegs until exiting a slot at the bottom. In the hippocampus, it’s a defined path: you drop an experience in slot number 5,678,284 and it comes out exit number 1,274,986. How does the hippocampus possibly know which entrance leads to which exit? It wires itself by trial and error (oversimplification alert…but you get the point). Infants constantly fire test balls through the matrix, and the ones that reach a worthwhile endpoint reinforce worthwhile pathways. These neurons fire together, wire together, and eventually the hippocampus becomes efficient. It’s just that easy. (And because it’s so easy, researchers aren’t far away from creating an artificial hippocampus.)
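To make the trial-and-error wiring concrete, here is a toy simulation of our own devising (not from the research): test balls choose among candidate pathways in proportion to their strength, and the pathway that reaches the worthwhile exit is reinforced until it dominates:

```python
import random

# Toy trial-and-error wiring: candidate pathways compete, and the one that
# reaches a worthwhile endpoint is reinforced. Entirely illustrative.
strengths = {"path_a": 1.0, "path_b": 1.0, "path_c": 1.0}
WORTHWHILE = "path_b"  # the exit that leads somewhere useful

for _ in range(1000):
    total = sum(strengths.values())
    pick = random.uniform(0, total)
    for path, s in strengths.items():  # roulette-wheel selection by strength
        pick -= s
        if pick <= 0:
            break
    if path == WORTHWHILE:
        strengths[path] += 0.1  # fire together, wire together

print(strengths)  # path_b ends up dominating the matrix
```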
Now let’s think about Sudoku. The first time you discover which missing numbers go in which empty boxes, you do so inefficiently. But over time, you get better at it. You learn tricks. You start to see patterns. You develop a workflow. And practice creates efficiency in your brain as neurons create the connections necessary for the quick processing of Sudoku. This is true of any puzzle: your plastic brain changes its basic architecture to allow you to complete subsequent puzzles more efficiently. Okay, that’s great and all, but studies are finding that the vast majority of brain-training attempts don’t generalize to overall intelligence. In other words, by doing Sudoku, you only get better at Sudoku. This might gain you street cred in certain circles, but it doesn’t necessarily make you smarter. Unfortunately, the same is true of puzzle regimens: you get better at the puzzles, but you don’t necessarily get smarter in a general way.
That said, one type of puzzle offers some hope: the crossword. In fact, researchers at Wake Forest University suggest that crossword puzzles strengthen the brain (even in later years) the same way that lifting weights can increase muscle strength. Still, it remains true that doing the crossword only reinforces the mechanism needed to do the crossword. But the crossword uses a very specific mechanism: it forces you to pull a range of facts from deep within your brain into your working memory. This is a nice thing to get better at. Think about it: there are few tasks that don’t require some sort of recall, be it of facts or experiences. And so training a nimble working memory through crosswords seems a more promising regimen than any other single type of brain-training exercise.
This is borne out by research. A Columbia University study published in 2008 found that training working memory increased overall fluid intelligence. So the answer to this article’s title question is yes, brain training is very real. (Only, there’s a lot of schlock out there.) But hidden in this article lies the new key that many researchers hope will point the way to the brain training of the future. Any ONE brain-training regimen only makes you better at the one thing being trained. But NEW EXPERIENCES in general promise a varied and continual rewiring of the brain for a fluid and ever-changing development of intelligence. In other words, if you stay in your comfort zone, the comfort zone decays around you. In order to build intelligence, or even to keep what you have, you need to be building new rooms outside your comfort zone. If you consume a new media source in the morning, experiment with a new route to work, eat a new food for lunch, talk to a new person, or…try a NEW puzzle, you’re forcing your brain to rewire itself to be able to deal with these new experiences—you’re growing new neurons and forcing your old ones to scramble to create new connections.
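The article doesn’t name the specific training task, but a staple of the working-memory-training literature is the n-back: you watch a stream of items and flag any item that matches the one n steps earlier. A minimal Python sketch of the scoring (our own illustration, not the study’s actual software):

```python
def n_back_hits(stream, n=2):
    """Positions where the current item matches the item n steps earlier."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

letters = list("BKBMKMQKQ")
print(n_back_hits(letters))  # [2, 5, 8]: each a 2-back match the trainee must flag
```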
Here’s what that means for your brain-training regimen: doing a puzzle is little more than busywork; it’s the act of figuring out how to do it that makes you smarter. Sit down and read the directions. If you understand them immediately and know how you should go about solving a puzzle, put it down and look for something else…something new. It’s not just use it or lose it. It’s use it in a novel way or lose it. Try it. Your brain will thank you for it.
Garth Sundem works at the intersection of math, science, and humor with a growing list of bestselling books including the recently released Brain Candy: Science, Puzzles, Paradoxes, Logic and Illogic to Nourish Your Neurons, which he packed with tasty tidbits of fun, new experiences in hopes of making readers just a little bit smarter without boring them into stupidity. He is a frequent on-screen contributor to The Science Channel and has written for magazines including Wired, Seed, and Esquire. You can visit him online or follow his Twitter feed.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
Scientists are becoming more interested in trying to pinpoint precisely what’s going on inside our brains while we’re engaged in creative thinking. Which brain chemicals play a role? Which areas of the brain are firing? Is the magic of creativity linked to one specific brain structure? The answers are not entirely clear. But thanks to brain scan technology, some interesting discoveries are emerging. ScriptPhD.com was founded to focus on the creative applications of science and technology in entertainment, media and advertising, fields traditionally defined by “right brain” propensity. It stands to reason, then, that we would be fascinated by the very technology and science that is attempting to deduce and quantify what, exactly, makes for creativity. To help us in this endeavor, we are pleased to welcome computer scientist and writer Ravi Singh’s guest post to ScriptPhD.com. For his complete article, please click “continue reading.”
Before you can measure something, you must be able to clearly define what it is. It’s not easy to find consensus among scientists on the definition of creativity. But then, it’s not easy to find consensus among artists, either, about what’s creative and what’s not. Psychologists have traditionally defined creativity as “the ability to combine novelty and usefulness in a particular social context.” But newer models argue that these types of definitions, which rely on extremely subjective criteria like ‘novelty’ and ‘usefulness,’ are too vague. John Kounios, a psychologist at Drexel University who studies the neural basis of insight, defines creativity as “the ability to restructure one’s understanding of a situation in a non-obvious way.” His research shows that creativity is not a singular concept. Rather, it’s a collection of different processes that emerge from different areas of the brain.
In attempting to measure creativity, scientists have had a tendency to correlate creativity with intelligence—or at least link creativity to intelligence—probably because we believe that we have a handle on intelligence. We believe we can measure it with some degree of accuracy and reliability. But not creativity. No consensus measure for creativity exists. Creativity is too complex to be measured through tidy, discrete questions. There is no standardized test. There is yet to be a meaningful “Creativity Quotient.” In fact, creativity defies standardization. In the creative realm, one could argue, there’s no place for “standards.” After all, doesn’t the very notion of standardization contradict what creativity is all about?
To test creativity, researchers have historically attempted to test divergent thinking, an assessment construct originally developed in the 1950s by psychologist J. P. Guilford, who believed that standardized IQ tests favored convergent thinkers (who stay focused on solving a core problem) over divergent thinkers (who go ‘off on tangents’). Guilford believed that scores on IQ tests should not be taken as a unidimensional measure of intelligence. He observed that creative people often score lower on standard IQ tests because their approach to solving the problems generates a larger number of possible solutions, some of which are thoroughly original; the test’s designers would never have thought of those possibilities. Testing divergent thinking, he believed, allowed for greater appreciation of the diversity of human thinking and abilities. A test of divergent thinking might ask the subject to come up with new and useful functions for a familiar object, such as a brick or a pencil. Or the subject might be asked to draw the taste of chocolate. You can see how it would be very difficult, if not impossible, to standardize a “correct” answer.
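While a “correct” answer can’t be standardized, the scoring of divergent-thinking responses can be partly systematized. Researchers commonly score, among other dimensions, fluency (how many distinct responses) and originality (how rare each response is across the sample). A simplified Python sketch of that bookkeeping (our illustration, not Guilford’s exact protocol; real studies use a rarity threshold rather than strict uniqueness):

```python
from collections import Counter

def score_divergent(responses_by_subject):
    """Toy fluency/originality scoring for an alternative-uses task."""
    counts = Counter(r for rs in responses_by_subject.values() for r in rs)
    scores = {}
    for subject, rs in responses_by_subject.items():
        unique = set(rs)
        fluency = len(unique)                                   # distinct uses given
        originality = sum(1 for r in unique if counts[r] == 1)  # uses no one else gave
        scores[subject] = {"fluency": fluency, "originality": originality}
    return scores

data = {
    "ann": ["doorstop", "paperweight", "grind into pigment"],
    "bob": ["doorstop", "paperweight"],
}
print(score_divergent(data))  # ann: fluency 3, originality 1; bob: fluency 2, originality 0
```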
Eastern traditions have their own ideas about creativity and where it comes from. In Japan, where students and factory workers are stereotyped as being too methodical, researchers are studying schoolchildren for a possible correlation between playfulness and creativity. Nath philosopher Mahendranath wrote that man’s “memory became buried under the artificial superstructure of civilization and its artificial concepts,” his way of saying that too much convergent thinking can inhibit creativity. Sanskrit authors described the spontaneous and divergent mental experience of sahaja meditation, where new insights occur after allowing the mind to rest and return to the natural, unconditioned state. But while modern scientific research on meditation is good at measuring physiological and behavioral changes, the “creative” part is much more elusive.
Some western scientists suggest that creativity is mostly ascribed to neurochemistry. High intelligence and skill proficiency have traditionally been associated with fast, efficient firing of neurons. But the research of Dr. Rex Jung, a research professor in the department of neurosurgery at the University of New Mexico, shows that this is not necessarily true. In researching the neurology of the creative process, Jung has found that subjects who tested high in “creativity” had thinner white matter and connecting axons in their brains, which has the effect of slowing nerve traffic. Jung believes that this slowdown in the left frontal cortex, a brain region where emotion and cognition are integrated, may allow us to be more creative, and to connect disparate ideas in novel ways. Jung has found that when it comes to intellectual pursuits, the brain is “an efficient superhighway” that gets you from Point A to Point B quickly. But creativity follows a slower, more meandering path that has lots of little detours, side roads and rabbit trails. Sometimes, it is along those rabbit trails that our most revolutionary ideas emerge.
You just have to be willing to venture off the main highway.
We’ve all had aha! moments—those sudden bursts of insight that solve a vexing problem, solder an important connection, or reinterpret a situation. We know what it is, but often, we’d be hard-pressed to explain where it came from or how it originated. Dr. Kounios, along with Northwestern University psychologist Mark Beeman, has extensively studied the “Aha! moment.” They presented study participants with simple word puzzles that could be solved either through a quick, methodical analysis or an instant creative insight. Participants are given three words and are then asked to come up with one word that could be combined with each of the three to form a familiar term; for example: crab, pine and sauce. (Answer: “apple.”) Or eye, gown and basket. (Answer: “ball.”)
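These triads come from the classic Remote Associates Test. In contrast to the human “Aha!”, a computer can solve them by brute force, checking each candidate word against every cue. Here is a toy Python sketch with a tiny hand-built compound list (a real solver would need a full dictionary):

```python
# Toy Remote Associates Test solver with a tiny hand-built compound list;
# a real solver would consult a full dictionary of compounds and phrases.
COMPOUNDS = {"crabapple", "pineapple", "applesauce",
             "eyeball", "ballgown", "basketball"}

def solves(candidate, cues):
    """True if the candidate forms a known compound with every cue, in either order."""
    return all(cue + candidate in COMPOUNDS or candidate + cue in COMPOUNDS
               for cue in cues)

print(solves("apple", ["crab", "pine", "sauce"]))  # True
print(solves("ball", ["crab", "pine", "sauce"]))   # False
print(solves("ball", ["eye", "gown", "basket"]))   # True
```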
About half the participants arrived at solutions by methodically thinking through possibilities; for the other half, the answer popped into their minds suddenly. During the “Aha! moment,” neuroimaging showed a burst of high-frequency activity in the participants’ right temporal lobe, regardless of whether the answer popped into the subjects’ minds instantly or they solved the problem methodically. But there was a big difference in how each group mentally prepared for the test question. The methodical problem solvers prepared by paying close attention to the screen before the words appeared—their visual cortices were on high alert. By contrast, those who received a sudden Aha! flash of creative insight prepared by automatically shutting down activity in the visual cortex for an instant—the neurological equivalent of closing their eyes to block out distractions so that they could concentrate better. These creative thinkers, Kounios said, were “cutting out other sensory input and boosting the signal-to-noise ratio” to enable themselves to retrieve the answer from the subconscious.
Creativity, in the end, is about letting the mind roam freely, giving it permission to ignore conventional solutions and explore uncharted waters. Accomplishing that requires an ability, and willingness, to inhibit habitual responses and take risks. Dr. Kenneth M. Heilman, a neurologist at the University of Florida, believes that this capacity to let go may involve a dampening of norepinephrine, a neurotransmitter that triggers the fight-or-flight alarm. Since norepinephrine also plays a role in long-term memory retrieval, its reduction during creative thought may help the brain temporarily suppress what it already knows, which paves the way for new ideas and discovering novel connections. This neurochemical mechanism may explain why creative ideas and Aha! moments often occur when we are at our most peaceful, for example, relaxing or meditating.
The creative mind, by definition, is always open to new possibilities, and often fashions new ideas from seemingly irrelevant information. Psychologists at the University of Toronto and Harvard University believe they have discovered a biological basis for this behavior. They found that the brains of creative people may be more receptive to incoming stimuli from the environment that the brains of others would shut out through the process of “latent inhibition,” our unconscious capacity to ignore stimuli that experience tells us are irrelevant to our needs. In other words, creative people are more likely to have low levels of latent inhibition. The average person becomes aware of such stimuli, classifies them and forgets about them. But the creative person maintains connections to that extra data that’s constantly streaming in from the environment and uses it.
Sometimes, just one tiny strand of information is all it takes to trigger a life-changing “Aha!” moment.
Ravi Singh is a California-based IT professional with a Masters in Computer Science (MCS) from the University of Illinois. He works on corporate information systems and is pursuing a career in writing.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
As Comic-Con winds down on the shortened Day 4, we conclude our coverage with two panels that exemplify what Comic-Con is all about. As promised, we dissect the “Comics Design” panel, in which the world’s top logo designers deconstruct their craft, coupled with images of their work. We also bring you an interesting panel of ethnographers, consisting of undergraduate and graduate students, studying the culture and the varying forces that shape Comic-Con. Seriously, they’re studying nerds! Finally, we are delighted to shine our ScriptPhD.com spotlight on new sci-fi author Charles Yu, who presented his new novel at his first of what we are sure will be many Comic-Con appearances. We sat down and chatted with Charles, and are pleased to publish the interview. And of course, our Day 4 Costume of the Day. Comic-Con 2010 (through the eyes of ScriptPhD.com) ends under the “continue reading” cut!
Comics Design
We are not ashamed to admit that here at ScriptPhD.com, we are secret design nerds. We love it, particularly since good design so often elevates the content of films, television, and books, but is a relatively mysterious process. One of THE most fascinating panels that we attended at Comic-Con 2010 was on the design secrets behind some of your favorite comics and book covers. A panel of the world’s leading designers revealed their methodologies (and sometimes failures) in the design process behind their hit pieces, lifting the shroud of secrecy that designers often envelop themselves in. It was an unparalleled window into the mind of the designer, and into the visual appeal that so often subliminally contributes to the success of a graphic novel, comic, or even regular book. We do, as it turns out, judge books by their covers.
As promised, we revisit this illuminating panel, and thank Christopher Butcher, co-founder of The Toronto Comic Arts Festival and co-owner of The Beguiling, Canada’s finest comics bookstore. Chris was kind enough to provide us with high-quality images of the Comics Design panel’s work, for which we at ScriptPhD.com are grateful. Chris had each of the graphic artists discuss their work with an example of design that worked, and design that didn’t (if available or so inclined). The artist was asked to deconstruct the logo or design and talk about the thought process behind it.
Mark Chiarello – (art + design director at DC Comics)
Mark chose to design the cover of this book with an overall emphasis on the individual artist. Hence the white space on the book, and a focus on the logo above the “solo” artist.
Adam Grano – (designer at Fantagraphics)
Adam took the title of this book quite literally, and let loose with his design to truly emphasize the title. He called it “method design.” He wanted the cover to look like a drunken dream.
For the Humbug collection, Grano tried hard not to imprint too much of himself (and his tastes) on the design of the cover. He wanted to inject simplicity into a project that would stand the test of time, because it was a collector’s series.
Grano considered this design project his “failure.” It contrasts greatly with the simplicity and elegance of Humbug. He mentioned that everything on the page is scripted and gridded, something that designers try to avoid in comics.
Chip Kidd – (designer at Random House)
Chip Kidd had the honor of working on the first posthumous Peanuts release after Charles M. Schulz’s death, and took the project quite seriously. In the cover, he wanted to deconstruct a Peanuts strip. All of the human element is taken out of the strip, with the characters on the cover up to their necks in suburban anxiety.
Kidd likes this cover because he considers it an updated spin on Superman. It’s not a classic Superman panel, so he designed a logo that deviated from the classic “Superman” logo to match.
Kidd chose this as his design “failure,” though the failure was not the design itself. The cover represents one of seven volumes, in which the logo pictured disintegrates by the seventh issue, to match the crisis in the title. Kidd’s only regret here is that he was too subtle. He wishes he’d chosen to start the logo’s disintegration sooner, as there’s very little difference between the first few volumes.
Fawn Lau – (designer at VIZ)
Fawn was commissioned to redesign this book cover for an American audience. Keeping this in mind, and wanting the Japanese artwork to be more legible for the American audience, she didn’t want too heavy-handed a logo. In an utterly genius stroke of creativity, Lau went to an art store, bought $70 worth of art supplies, and played around with them until she constructed the “Picasso” logo. Clever, clever girl!
Mark Siegel – (First Second Books)
Mark Siegel was hired to create the cover of the new biography Feynman, an eponymous title about one of the most famous physicists of all time. Feynman was an amazing man who lived an amazing life, one that included the 1965 Nobel Prize in Physics. His biographer, Jim Ottaviani, a nuclear engineer and speed skating champion, is an equally accomplished individual. The design of the cover was therefore chosen to reflect their dynamic personalities. The colors were chosen to represent the atomic bomb and Los Alamos, New Mexico, where Feynman worked on the Manhattan Project. Incidentally, the quote on the cover – “If that’s the world’s smartest man, God help us!” – is from Feynman’s own mother.
Keith Wood – (Oni Press)
Wood remarked that this was the first time he was able to do design on a large scale, which really worked for this project. He chose a very basic color scheme, again to emphasize a collection standing the test of time, and designed all the covers simultaneously, including color schemes and graphics. He felt this gave the project a sense of connectedness.
Wood chose a Pantone silver as the base of this design, with a stenciled typeface meant to look very modern. The back cover and the front cover were initially going to be reversed when the artists first brought him the renderings. However, Wood felt that since the book is about a girl traveling across the United States, it would be more compelling and evocative to use the feet/baggage image as the front of the book. He was also the only graphic artist to show a progression of 10-12 renderings, playing with colors, panels and typeface, that led to the final design. He believes in a very traditional approach to design, which includes hand sketches and multiple renderings.
The Culture of Popular Things: Ethnographic Examinations of Comic-Con 2010
For the past four years, Comic-Con has ended on an academic note. Matthew J. Smith, a professor at Wittenberg University in Ohio, takes along a cadre of students, graduate and undergraduate, to study Comic-Con: the nerds, the geeks, the entertainment component, the comics component, to ultimately understand the culture of what goes on in this fascinating microcosm of consumerism and fandom. By “culture,” the students embrace the definition accepted from famous anthropologist Raymond J. DeMallie: “what is understood by members of a group.” The students ultimately wanted to ask why people come to Comic-Con in general. Attendees are united by the general forces of being fans; this is what is understood in their group. After milling around the various locales that constituted the Con, the students deduced that two ultimate forces were simultaneously at play: the fan culture drives and energizes the Con as a whole, while strong marketing forces dominate the exhibit halls and panels.
Maxwell Wassmann, a political economy student at Wayne State University, pointed out that “secretly, what we’re talking about is the culture of buying things.” He compared Comic-Con to a giant shopping mall, a microcosm of our economic system in one place. “If you’ve spent at least 10 minutes at Comic-Con,” he pointed out, “you probably bought something or had something pitched to you. Everything is about marketing.” As a whole, Comic-Con is subliminally designed to reinforce the idea that this piece of pop culture, which ultimately advertises an even greater subset of pop culture, is worth your money. Wassmann pointed out an advertising meme present throughout the weekend that we took notice of as well—garment-challenged ladies advertising the new Green Hornet movie. The movie itself is not terribly sexy, but by using garment-challenged ladies as the very face of the movie, when you leave Comic-Con and see a poster for Green Hornet, you will subconsciously link it to the sexy images you were exposed to in San Diego, greatly increasing your chances of wanting to see the film. By contrast, Wassmann also pointed out a concomitant old-town economy happening in small-press comics: on the fringes of the exhibition center and in the artists’ space, a totally different microcosm of consumerism and content exchange thrives.
Kane Anderson, a PhD student at UC Santa Barbara getting his doctorate in “Superheroology” (seriously, why didn’t I think of that back in graduate school??), came to San Diego to observe how costumes relate to the superhero experience. To fully immerse himself in the experience, and to gain the trust of the Con attendees he’d be interviewing, Anderson came in full costume (see above picture). Overall, he deduced that costumed attendees, whom we will openly admit to enjoying and photographing during our stay in San Diego, act as goodwill ambassadors for the characters and superheroes they represent. They also add to the fantasy and adventure of Comic-Con goers, creating the “experience.” The negative side is that it evokes a certain “looky-loo” effect, where people actively seek out, and single out, costume-wearers, even though they constitute only 5% of all attendees.
Tanya Zuk, a media masters student from the University of Arizona, and Jacob Sigafoos, an undergraduate communications major at Wittenberg University, both took on the mighty Hollywood forces invading the Con, primarily the distribution of independent content, an enormous portion of the programming at Comic-Con (and a growing presence on the web). Zuk spoke about how original video content, a hallmark of new media, is distributed primarily online. It allows for more exchange between creators and their audience than traditional content (such as film and cable television), and builds a community fanbase through organic interaction. Sigafoos expanded on this by talking about how to properly market such material to gain viral popularity—with no marketing at all! A lack of marketing, at least in its traditional forms, is the most successful way to promote a product. Producing a high-quality product, handing it off to friends, and promoting through social media is still the best way to grow a devoted following.
And speaking of Hollywood, their presence at Comic-Con is undeniable. Emily Saidel, a Master’s student at NYU, and Sam Kinney, a business/marketing student at Wittenberg University, both took on the behemoth forces of major studios hawking their products in what originally started out as a quite independent gathering. Saidel tackled Hollywood’s presence at Comic-Con, people’s acceptance or rejection thereof, and how comics are accepted by traditional academic disciplines as didactic tools in and of themselves. The common thread is a clash between the culture and the community. Being a member of a group is a relatively simple idea, but because Comic-Con is so large, it incorporates multiple communities, leading to tensions between those feeling on the outside (i.e. fringe comics or anime fans) and those feeling on the inside (i.e. the more common mainstream fans). Comics fans would like to be part of that mainstream group and do show interest in those adaptations and changes (we’re all movie buffs, after all), noted Kinney, but feel that Comic-Con has grown bigger than it should be.
But how much tension is there between the different subgroups and forces? The most salient example from last year’s Con was the invasion of the uber-mainstream Twilight fans, who not only created a ruckus on the streets of San Diego, but also usurped all the seats of the largest pavilion, Hall H, to wait for their panel, locking out fans hoping to see the other panels. (No one was stabbed.) In reality, the supposed clash of cultures is blown out of proportion, with most fans not really feeling the tension. To boot, Saidel pointed out that tension isn’t necessarily a bad thing, either. She gave the metaphor of a rubber band, which only fulfills its purpose under tension. The different forces of Comic-Con work in different ways, if sometimes imperfectly. And that’s a good thing.
Incidentally, if you are reading this and interested in participating in the week-long program in San Diego next year, visit the official website of the Comic-Con field study for more information. Some of the benefits include: attending the Comic-Con programs of your choice, learning the tools of ethnographic investigation, and presenting the findings as part of a presentation to the Comics Arts Conference. Dr. Matthew Smith, who leads the field study every year, is not just a veteran attendee of Comic-Con, but also the author of The Power of Comics.
COMIC-CON SPOTLIGHT ON: Charles Yu, author of How To Live Safely in a Science Fictional Universe.
Here at ScriptPhD.com, we love hobnobbing with the scientific and entertainment elite and talking to writers and filmmakers at the top of their craft as much as the next website. But what we love even more is seeking out new talent, the makers of the books, movies and ideas that you’ll be talking about tomorrow, and being proud to be the first to showcase their work. This year, in our preparation for Comic-Con 2010, we ran across such an individual in Charles Yu, whose first novel, How To Live Safely in a Science Fictional Universe, debuts this fall, and who spoke about it at a panel over the weekend. We had an opportunity to have lunch with Charles in Los Angeles just prior to Comic-Con, and spoke in-depth about his new book, along with the state of sci-fi in current literature. We’re pretty sure Charles Yu is a name science fiction fans are going to be hearing for some time to come. ScriptPhD.com is proud to shine our 2010 Comic-Con spotlight on Charles and his debut novel, which is available September 7, 2010.
How To Live Safely in a Science Fictional Universe is the story of a son searching for his father… through quantum space-time. The story takes place in Minor Universe 31, a vast story-space on the outskirts of fiction, where paradox fluctuates like the stock market, lonely sexbots beckon failed protagonists, and time travel is serious business. Every day, people get into time machines and try to do the one thing they should never do: change the past. That’s where the main character, Charles Yu, time travel technician, steps in. Accompanied by TAMMY (who we consider the new HAL), an operating system with low self-esteem, and a nonexistent but ontologically valid dog named Ed, Charles helps save people from themselves. When he’s not on the job, Charles visits his mother (stuck in a one-hour cycle, she makes dinner over and over and over) and searches for his father, who invented time travel and then vanished.
Questions for Charles Yu
ScriptPhD.com: Charles, the story has tremendous traditional sci-fi roots. Can you discuss where the inspiration for this came from?
Charles Yu: Well, the sci-fi angle definitely comes from being a kid in the 80s, when there were blockbuster sci-fi things all over the place. I’ve always loved [that time], as a casual fan, but also wanted to write it. I didn’t even start doing that until after I’d graduated from law school. I did write, growing up, but I never wrote fiction—I didn’t think I’d be any good at it! I wrote poetry in college, minored in it, actually. Fiction and poetry are both incredibly hard, and poetry takes more discipline, but at least when I failed in my early writing, it was 100 words of failure, instead of 5,000 words of it.
SPhD: What were some of your biggest inspirations growing up (television or books) that contributed to your later work?
CY: Definitely The Foundation Trilogy. I remember reading that in the 8th grade, and I remember spending every waking moment reading, because it was the greatest thing I’d ever read. First of all, I was in the 8th grade, so I hadn’t read that many things, but the idea that Asimov created this entire self-contained universe, it was the first time that I’d been exposed to that idea. And then to have this psychohistory on top, it was kind of trippy. Psychohistory is the idea that social sciences can be just as rigorously captured with equations as any physical science. I think that series of books is the main thing that got me into sci-fi.
SPhD: Any regrets about having named the main character after yourself?
CY: Yes. For a very specific reason. People in my life are going to think it’s biographical, which it’s very much not. And it’s very natural for people to do that. And in my first book of short stories, none of the main characters was named after anyone, and still I had family members that asked if that was about our family, or people that gave me great feedback but then said, “How could you do that to your family?” And it was fiction! I don’t think the book could have gotten written had I not left that placeholder in, because the one thing that drove any sort of emotional connection for the story for me was the idea of having less things to worry about. The other thing is that because the main character is named after you, as you’re writing the book, it acts as a fuel or vector to help drive the emotional completion.
SPhD: In the world of your novel, people live in a lachrymose, technologically-driven society. Any commentary therein whatsoever on the technological numbing of our own current culture?
CY: Yes. But I didn’t mean it as a condemnation, in a sense. I wouldn’t make an overt statement about technology and society, but I am more interested in the way that technology can sometimes not connect people, but enable people’s tendency to isolate themselves. Certainly, technology has amazing connective possibilities, but that would have been a much different story, obviously. The emotional plot-level core of this book is a box. And that sort of drove everything from there. The technology is almost an emotional technology that [Charles, the main character] has invented with his dad. It’s a larger reflection of his inability to move past certain limitations that he’s put on himself.
SPhD: What drives Charles, the main character of this book?
CY: What’s really driving Charles emotionally is looking for his dad. But more than that, is trying to move through time, to navigate the past without getting stuck in it.
SPhD: Both of his companions are non-human. Any significance to that?
CY: It probably speaks more to my limitations as a writer [laughs]. That was all part of the lonely guy type that Charles is being portrayed as. If he had a human with him, he’d be a much different person.
SPhD: The book abounds in scientific jargon and technological terminology, which is par for the course in science fiction, but was still very ambitious. Do you have high expectations of the audience that will read this book?
CY: Yeah. I was just reading an interview where the writer essentially said “You can never go wrong by expecting too much [of your audience].” You can definitely go wrong the other way, because that would come off as terrible, or assuming that you know more. But actually, my concerns were more in the other direction, because I knew I was playing fast and loose with concepts that I know I don’t have a great grasp of. I’m writing from the level of amateur who likes reading science books, and studied science in college—an entertainment layreader. My worry was whether I was BSing too much [of the science]. There are parts where it’s clearly fictional science, but there are other parts that I cite things that are real, and is anyone who reads this who actually knows something about science going to say “What the heck is this guy saying?”
SPhD: How To Live… is written in a very atavistic, retro 80s style of science fiction, and really reminded me of the best of Isaac Asimov. How do you feel about the current state of sci-fi literature as relates to your book?
CY: Two really big keys for me, and things I was thinking about while writing [this book], were, one, there is kind of a kitschiness to sci-fi, and I think that’s kind of intentional. It has a kind of do-it-yourself aesthetic to it. In my book, you basically have a guy in the garage with his dad, and yes the dad is an engineer, but it’s in a garage without great equipment, so it’s not going to look sleek; you can imagine what it’s going to look like—it’s going to look like something you’d build with things you have lying around in the garage. On the other hand, it is supposed to be this fully realized time machine, and you’re not supposed to be able to imagine it. Even now, when I’m in the library in the science-fiction section, I’ll often look for anthologies that are from the 80s, or the greatest time travel stories from the 20th Century, that cover a much greater range of time than what’s being published now. It’s almost like the advancement of real-world technology is edging closer to what used to be the realm of science fiction. The way that I would think about that is that it’s not exploiting what the real possibility of science fiction is, which is to explore a current world or any other completely strange world, but not a world totally envisionable ten years from now. You end up speculating on what’s possible or what’s easily extrapolatable from here; that’s not necessarily going to make for super emotional stories.
Charles Yu is a writer and attorney living in Los Angeles, CA.
Last, but certainly not least, is our final Costume of the Day. We chose this young ninja not only because of the coolness of his costume, but because of his quick wit. As we were taking the snapshot he said, “I’m smiling, you just can’t see it.” And checkmate to you, young sir.
Incidentally, you can find much more photographic coverage of Comic-Con on our Facebook fan page. Become a fan, because this week, we will be announcing Comic-Con swag giveaways that only Facebook fans are eligible for.
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
Dr. Mark Changizi, a cognitive science researcher and professor at Rensselaer Polytechnic Institute, is one of the most exciting rising stars of science writing and the neurobiology of popular culture phenomena. His latest book, The Vision Revolution, expounds on the evolution and nuances of the human eye—a meticulously designed, highly precise technological marvel that allows us to have superhuman powers. You heard me right: superhuman! X-ray vision, color telepathy, spirit reading, and even seeing into the future. Dr. Changizi spoke about these ideas, and how they might be applied to everything from sports stars with great hand-eye coordination to modern reading and typeface design, with us in ScriptPhD.com’s inaugural audio podcast. He also provides an exclusive teaser for his next book with a guest post on the surprising mindset that makes for creative people. Read Dr. Changizi’s guest post and listen to the podcast under the “continue reading” cut.
You are an idea-monger. Science, art, technology—it doesn’t matter which. What matters is that you’re all about the idea. You live for it. You’re the one who wakes your spouse at 3am to describe your new inspiration. You’re the person who suddenly veers the car to the shoulder to scribble some thoughts on the back of an unpaid parking ticket. You’re the one who, during your wedding speech, interrupts yourself to say, “Hey, I just thought of something neat.” You’re not merely interested in science, art or technology, you want to be part of the story of these broad communities. You don’t just want to read the book, you want to be in the book—not for the sake of celebrity, but for the sake of getting your idea out there. You enjoy these creative disciplines in the way pigs enjoy mud: so up close and personal that you are dripping with it, having become part of the mud itself.
Enthusiasm for ideas is what makes an idea-monger, but enthusiasm is not enough for success. What is the secret behind people who are proficient idea-mongers? What is behind the people who have a knack for putting forward ideas that become part of the story of science, art and technology? Here’s the answer many will give: genius. There are a select few who are born with a gift for generating brilliant ideas beyond the ken of the rest of us. The idea-monger might well check to see that he or she has the “genius” gene, and if not, set off to go monger something else.
Luckily, there’s more to having a successful creative life than hoping for the right DNA. In fact, DNA has nothing to do with it. “Genius” is a fiction. It is a throwback to antiquity, when scientists of the day had the bad habit of “explaining” a phenomenon by labeling it as having some special essence. The idea of “the genius” is imbued with a special, almost magical quality. Great ideas just pop into the heads of geniuses in sudden eureka moments; geniuses make leaps that are unfathomable to us, and sometimes even to them; geniuses are qualitatively different; geniuses are special. While most people labeled as geniuses are probably somewhat smart, most smart people don’t get labeled as geniuses.
I believe that is because there are no geniuses, not, at least, in the qualitatively special sense. Instead, what makes some people better at idea-mongering is their style, their philosophy, their manner of hunting ideas. Whereas good hunters of big game are simply called good hunters, good hunters of big ideas are called geniuses, though they only deserve the moniker “good idea-hunter.” If genius is not a prerequisite for good idea-hunting, then perhaps we can take courses in idea-hunting. And there would appear to be lots of skilled idea-hunters from whom we may learn.
There are, however, fewer skilled idea-hunters than there might at first seem. One must distinguish between the successful hunter and the proficient hunter – between the one-time fisherman who accidentally bags a 200 lb fish, and the experienced fisherman who regularly comes home with a big one (even if not 200 lbs). Communities can be creative even when no individual member is a skilled idea-hunter. This is because communities are dynamic, evolving environments, and with enough individuals, there will always be people who do generate fantastically successful ideas. There will always be successful idea-hunters within creative communities, even if these individuals are not skilled idea-hunters, i.e., even if they are unlikely to ever achieve the same caliber of idea again. One wants to learn to fish from the fisherman who repeatedly comes home with a big one; these multiple successful hunts are evidence that the fisherman is a skilled fish-hunter, not just a lucky tourist with a record catch.
And what is the key behind proficient idea-hunters? In a word: aloofness. Being aloof—from people, from money, from tools, and from oneself—endows one’s brain with amplified creativity. Being aloof turns an obsessive, conservative, social, scheming status-seeking brain into a bubbly, dynamic brain that resembles in many respects a creative community of individuals. Being a successful idea-hunter requires understanding the field (whether science, art or technology), but acquiring the skill of idea-hunting itself requires taking active measures to “break out” from the ape brains evolution gave us, by being aloof.
I’ll have more to say about this concept over the next year, as I have begun writing my fourth book, tentatively titled Aloof: How Not Giving a Damn Maximizes Your Creativity. (See here and here for other pieces of mine on this general topic.) In the meantime, given the wealth of creative ScriptPhD.com readers and contributors, I would be grateful for your ideas in the comment section about what makes a skilled idea-hunter. If a student asked you how to be creative, how would you respond?
Mark Changizi is an Assistant Professor of Cognitive Science at Rensselaer Polytechnic Institute in New York and the author of The Vision Revolution and The Brain From 25,000 Feet. More of Dr. Changizi’s writing can be found on his blog, Facebook Fan Page, and Twitter.
ScriptPhD.com was privileged to sit down with Dr. Changizi for a half-hour interview about the concepts behind his current book, The Vision Revolution, out in paperback June 10, the magic that is human ocular perception, and their applications in our modern world. Listen to the podcast below:
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Follow us on Twitter and our Facebook fan page. Subscribe to free email notifications of new posts on our home page.
You spot someone across a crowded room. There is eye contact. Your heart beats a little faster, palms are sweaty, you’re light-headed, and your suddenly squeamish stomach has dropped to your knees. You’re either suffering from an onset of food poisoning or you’re in love. But what does it mean, scientifically, to fall in love, to be in love, to stay in love? In our special Valentine’s Day post, Editor Jovana Grbić expounds on the neuronal and biophysical markers of love, how psychologists and mathematicians have harnessed (and sometimes manipulated) this information to foster 21st Century digital-style romance, and concludes with a personal reflection on what love really means in the face of all of this science. You might just be surprised. So, Cupid, draw back your bow… and click “continue reading” for more!
What is This Thing Called Love?
Scientists are naturally attracted to romance. Why do we love? Falling in love can be many things emotionally, ranging from the exhilarating to the truly frightening. It is, however, also remarkably methodical, following three stages identified by legendary biological anthropologist Helen Fisher of Rutgers University—lust, attraction and attachment—each with its own distinct neurochemistry. During lust, in both men and women, two basic sex hormones, testosterone and estrogen, primarily drive behavior and psychology. Interestingly enough, although lust has been memorialized by artists aplenty, this stage of love is remarkably analytical. Psychologists have shown that individuals primed with thoughts of lust had the highest levels of analytical thinking, while those primed with thoughts of love had the highest levels of creativity and insight. But we’ll get to love in a minute.
During attraction, a crucial trio of neurotransmitters, adrenaline, dopamine and serotonin, literally change our brain chemistry, leading to the phase of being obsessed and love-struck. Remember how we talked about a racing heart and sweaty palms upon seeing someone you’re smitten with? That would be a rush of adrenaline (also referred to as epinephrine), the “fight or flight” hormone/neurotransmitter, responsible for increased heart rate, contraction of blood vessels, and dilation of air passages. Dopamine is an evolutionarily conserved, ubiquitous neurotransmitter that regulates basic functions such as movement, motivation and cognition—a reason the loss of dopamine in Parkinson’s Disease patients can be so devastating. Dopamine is also responsible for the pleasure and reward mechanisms in the brain, hyperactivated by abuse of drugs such as cocaine and heroin. It has even been linked to creativity and idea generation via interactions of the frontal and temporal lobes and the limbic system. This is, therefore, that link between love and creativity that we mentioned above. Incidentally, the releasing or induction agent of norepinephrine and dopamine is a chemical called phenethylamine (PEA). Did you give your sweetheart chocolates for Valentine’s Day? If so, you did well, because chocolate is loaded with some of the highest naturally-occurring levels of phenethylamine, leading to a “chocolate theory of love.” If you can’t stop thinking about your beloved, it’s because of serotonin, one of love’s most important chemicals. Its neuronal functions include regulation of mood, appetite, sleep, and cognition—all affected by love. Most modern-generation antidepressants work by altering serotonin levels in the brain.
During latent attachment, two important chemicals “seal the deal” for long-term commitment: oxytocin and vasopressin. Oxytocin, often referred to as “the hormone of love,” is a neurotransmitter released during childbirth, breastfeeding and orgasms, and is crucial for species bonding, trust, and unconditional love. A sequence of experiments showed that trust formation in group activities, social interaction, and even psychological betrayal hinged on oxytocin levels. Vasopressin is a hormone responsible for memory formation and aggressive behavior. Recent research also suggests a role for vasopressin in sexual activity and in pair-bond formation. The vasopressin receptor gene was isolated in the prairie vole, one of the select few habitually monogamous mammals; when the gene was transplanted into mice (natural loners), they exhibited gregarious, social behaviors. And when the receptor was introduced into their highly promiscuous Don Juan meadow vole relatives, they reformed their wicked rodent ways, fixated on one partner, guarded her jealously, and helped rear their young.
With all these chemicals floating around in the brain of the aroused and the amorous, it’s not surprising that scientists have deduced that the same brain chemistry responsible for addiction is also responsible for love!
The aforementioned Dr. Fisher gave an exceptional TED Talk in 2006 about her research in romantic love; its evolution, its biochemistry, and its social importance:
While the heart may hold the key to love, the brain helps unlock it. In fact, modern neuroscience and magnetic resonance imaging (MRI) have helped answer a lot of questions about lasting romances, what being in love looks like, and whether there is a neurological difference between how we feel about casual sex, platonic friends, and those we’re in love with. In a critical fMRI study, the brains of people who were newly in love were scanned while they looked at photographs, some of their friends and some of their lovers. Pictures of lovers activated specific areas of the brain that were not active when looking at pictures of good friends or thinking about sexual arousal, suggesting that romantic love and mate attachment aren’t so much an emotion or state of mind as they are a deeply rooted drive akin to hunger, thirst and sex. Furthermore, a 2009 Florida State study showed that people who are in a committed relationship and thinking of their partner subconsciously avert their eyes from an attractive member of the opposite sex. The most heartwarming part of all? It lasts. fMRI imaging of 10 women and 7 men still claiming to be madly in love with their partners after an average of 21 years of marriage showed brain activation equal to that in the earlier studies of nascent romances.
In case you’re blinded by all this science, remember this central fact about love: it’s good for you! The art of kissing has been shown to promote many health benefits, including stress relief, prevention of tooth and gum decay, a muscle workout, anti-aging effects, and therapeutic healing. If everything goes well with the kissing, it could lead to an even healthier activity… sex! Not only does sex improve your sense of smell, boost fitness and weight loss, and mitigate depression and pain, it also strengthens the immune system and helps prevent heart disease and prostate cancer. In fact, “I have a headache” may be a specious excuse to avoid a little lovin’, since sex has been shown to cure migraines (and cause them, so be careful!). All of these facts and more, along with everything you ever wanted to know about sex, were collected and studied by neuroscientist Barry Komisaruk, endocrinologist Carlos Beyer-Flores and sexuality researcher Beverly Whipple in The Science of Orgasm. Add it to your shopping list today! The above activities may find you marching down the aisle, which, especially for men, is a very, very good thing. Studies show that married men not only live longer and healthier lives but also make more money and are more successful professionally (terrific New York Times article here).
Love in the Age of Algebra
While science can pinpoint the biological markers of love, can it act as a prognosticator of who will get together and, more importantly, stay together? Mathematicians and statisticians are sure trying! One of the world’s foremost experts on relationship and marriage modeling is University of Washington psychology professor John Gottman, head of The Gottman Institute and author of Why Marriages Succeed or Fail. Dr. Gottman uses complex mathematical modeling and microexpression analysis to predict with 90% accuracy which newlyweds will remain married four to six years later, and with 83% accuracy seven to nine years thereafter. In this terrific profile of Gottman’s “love lab,” we see that his methodology includes use of a facial action coding system (FACS) to analyze videotapes for minute signs of negative expressions such as contempt or disgust during simple conversations or stories. Take a look at this brief, fascinating video of how it all works:
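For readers who want to peek under the mathematical hood, here is a minimal sketch (in Python) of the kind of coupled difference equations Gottman describes in The Mathematics of Marriage: each partner’s next emotional score depends on his or her own “emotional inertia” plus an influence function of the other’s last remark. Every parameter value below is an illustrative assumption, not one of Gottman’s fitted numbers.

```python
# A toy, Gottman-style interaction model. Scores are coded affect per
# conversational turn (positive = affection, negative = contempt).

def influence(score, positive_gain=0.4, negative_gain=0.8):
    """Piecewise influence function: negativity weighs more than positivity."""
    return positive_gain * score if score >= 0 else negative_gain * score

def simulate(turns=50, wife=1.0, husband=-0.5, inertia=0.5,
             base_w=0.2, base_h=0.2):
    """Iterate the coupled difference equations and return the trajectory."""
    history = [(wife, husband)]
    for _ in range(turns):
        wife, husband = (base_w + inertia * wife + influence(husband),
                         base_h + inertia * husband + influence(wife))
        history.append((wife, husband))
    return history

# A trajectory that settles at a positive steady state models a stable
# marriage; convergence to negative values flags a couple at risk.
print(simulate()[-1])
```

In Gottman’s published models the parameters are fitted to each couple from coded conversation data, and it is the shape of the influence functions, especially how steeply negativity escalates, that carries the predictive power.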
Naturally, the next evolutionary step has been to cash in on this science in the online dating game, where successful matchmaking hinges on predicting which couples will be ideally suited to each other on paper. Dr. Helen Fisher has used her expertise in the chemicals of love to match couples through their brain-chemistry personality profiles on Chemistry.com. eHarmony has an in-house research psychologist, Gian Gonzaga, an ardent proponent of personality assessment and skeptic of opposites attracting. Finally, the increasingly popular Match.com boasts a radically advanced new personality profile called “match insights,” devised by none other than Dr. Fisher and medical doctor Jason Stockwood. If you don’t believe in the power of the soft sciences, you can take your matchmaking to a molecular level, with several new companies claiming to connect couples based on DNA fingerprints and the biological instinct to breed with people whose immune systems differ significantly from ours, a recipe for genetic diversity. ScientificMatch.com promises that its pricey, patent-pending technology “uses your DNA to find others with a natural odor you’ll love, with whom you’d have healthier children, a more satisfying sex life and more,” while GenePartner.com tests couples based on only one group of genes: human leukocyte antigens (HLAs), which play an essential role in immune function. The accuracy of all of these sites? Mixed. Despite rampant lying in internet dating profiles and the need to date in volume to pinpoint the right match, some research has shown remarkably high success rates (as high as 94%) among e-partners who had met in person.
What’s Science Got To Do With It?
In the shadow of such vast technological advancement and the reduction of romance to the binary and the biological, readers of this blog might imagine that its scientist editor would champion decidedly empirical views of love. They would be wrong. For while numbers and test tubes and brain-scanning machines can help us describe love’s physiological and psychological nimbus, its esoteric nucleus will forever be elusive. And thank heavens for that! There exists no mathematical formula (other than perhaps chaos theory) that can explain the idea of two people, diametrical as day and night, falling in love and somehow making it work. No MRI is equipped with a magnet strong enough to properly quantify the utter heartbreak of those that don’t. There is not a statistical deviation alive that could categorize my grandparents’ unlikely 55-year marriage, a marriage that survived World War II, Communism, a miscarriage, the death of a child, poverty, imprisonment in a political gulag, and yes, even a torrid affair. After my grandfather died, my grandmother eked out another feeble few years before succumbing to what else but a broken heart. It is that enigma in which generations of poets, scribes, musicians, screenwriters, and artists dating all the way back to humanity’s cultural dawning—the Stone Age—have found inexhaustible material, and they always will.
Love exists outside of all the things that science is—the ordered, the examined, the sterile, the safe, and the rational. It is inherently irrational, messy, disordered and frustrating. Science and technology forever aim to eliminate mistakes, imperfection and any obstacles to precision, which in matters of the heart would be a downright shame. Love is not about impersonal personality surveys, neurotransmitter cascades or the incessant beeping of laboratory machines measuring its output. It’s about magic, mystery, voodoo and charm. It’s about experimenting, floating off the ground, being scared out of your mind, laughing uncontrollably and inexplicably, flowers, bad dates, good dates, tubs of ice cream and chocolate with your closest friends, picking yourself up and starting the whole process all over again. It’s not about guarantees or prognostications, not even by smart University of Washington psychologists. It’s about having no clue what you’re doing, figuring it out as you go along, deviating from formulas, books, and everything scientists have ever told you, taking a chance on the stranger across a crowded room, and the moon hitting your eye like a big-a pizza pie.
Now that’s amore!
Hope everyone had a great Valentine’s Day.
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Follow us on Twitter and our Facebook fan page. Subscribe to free email notifications of new posts on our home page.
“First of all, let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” These inspiring words, borrowed from scribes Henry David Thoreau and Michel de Montaigne, were spoken by President Franklin Delano Roosevelt at his first inauguration, during the only era more perilous than the one we currently face. But FDR had it easy. All he had to face was 25% unemployment and 2 million homeless Americans. We have, among other things, climate change, carcinogens, leaky breast implants, the obesity epidemic, the West Nile virus, SARS, avian/swine flu, flesh-eating disease, pedophiles, predators, herpes, satanic cults, mad cow disease, crack cocaine, and let’s not forget that paragon of Malthusian fatalism—terror. In his brilliant book The Science of Fear, journalist Daniel Gardner delves into the psychology and physiology of fear and the incendiary factors that drive it, including media, advertising, government, business and our own evolutionary mold. For our final blog post of 2009, ScriptPhD.com extends the science into a personal reflection: a discussion of why, despite there never having been a better time to be alive, we are more afraid than ever, and how we can turn a more rational leaf in the year 2010.
Prehistoric Predispositions and the Human Brain
Let’s talk about psychology for a moment. The psychology of fear, to be specific. Our minds largely evolved to cope with the “Environment of Evolutionary Adaptedness”: Stone Age survival needs hard-wired into our brains to create a two-tiered system of conscious and subconscious thought. Elucidated by Daniel Kahneman, winner of the 2002 Nobel Prize in economics, the systems divide into the prehistoric System One (Gut) and System Two (Head). Gut is quick, evolutionary, and designed to react to mortal threats, while Head is more modern, conscious thought capable of analyzing statistics and being rational. In a seminal 1974 paper published in the journal Science, Kahneman and his research partner Amos Tversky punctured the long-held belief that decision-making occurs via a Homo economicus (rational man) by showing that decisions are mostly made by the Gut using three simple heuristics, or rules. The anchoring and adjustment heuristic (Anchoring Rule) involves grabbing hold of the nearest or most recent number when uncertain about a correct answer. This helps explain why the number 50,000 has been used to describe everything from how many predators were on the Internet in the 2000s to how many children were kidnapped by strangers each year in the 80s to the number of murders committed by Satanic cults in the 90s. The representativeness heuristic, or Rule of Typical Things, is our Gut judging things by largely learned intuition. This explains why predictions by experts are often as wrong as they are right, and why, despite being convinced they are not racist, Western societies perpetuate a dangerous stereotype of the non-white male. Finally, and most importantly, the availability heuristic, or Example Rule, dictates that the easier it is to recall examples of something, the more common it must be. This rule is especially sensitive to our formation and retention of memories, above all of violent or painful events, which were key to species survival in dangerous prehistoric times.
University of Oregon psychologist Paul Slovic added to these the affect heuristic, or Good/Bad Rule. When faced with something unfamiliar, Gut instantly decides how likely it is to harm us based on whether it feels good or bad. This explains why we irrationally fear nuclear power, which has been shown to be nowhere near as dangerous as we think, while we have no qualms about suntanning on a beach, which feels good, or getting an X-ray at the doctor’s office, despite both being more dangerous than we estimate. Psychologists Mark Frank and Thomas Gilovich showed that in all but one season from 1970 to 1986, the five NFL teams and three NHL teams that wore black uniforms (black = bad) drew more penalty yards and penalty minutes, respectively, than the league average, even when wearing their alternate uniforms. Finally, psychologist Peter Wason demonstrated that people judge risk based not on scientific information but on what he termed confirmation bias: once having formed a view, we cling to information that supports that view while rejecting or ignoring information that casts any doubt on it. Reinforced by herd mentality and conformity, this can be seen in internet blogs of like-minded individuals that act as echo chambers, and in media and organizations perpetuating a fear as rumor until the group accepts it as a mortal danger, despite a lack of rational evidence.
It’s a downright shame that this hallowed body of research took scientists such a long time to amass and ascertain, because they could have easily found it in the skyscrapers of Madison Avenue, where the psychology of fear has not only been long-defined, but long-exploited by advertising agencies and media moguls to sell products, news, and…more fear.
At its heart, effective advertising has always been about forming an emotional bond with consumers on an individual level. People are more likely to engage with a brand or buy a product if they feel a deep connection or personal stake. This can be achieved through targeted storytelling, creativity, and the tapping and marketing of subconscious fear, coined “shockvertising” by ad agencies. “X is a frightening threat to your safety or health, but use our product to make it go away.” It’s a surprisingly effective strategy and has been applied of late to disease (terrific read on pharmaceutical advertising tactics), the organic/natural movement, and politics. Purell (which The ScriptPhD discloses being a huge fan of) was originally created by Pfizer for in-hospital use by medical professionals. In 1997, it was introduced to market with an ad blitz that included the slogan “Imagine a Touchable World.” I just did; it’s called the world up until 1997. Erectile dysfunction, hair loss, osteoporosis, restless leg syndrome, shyness, and even toenail fungus are now serious ailments that you need to ask your doctor about today. This camouflaged marketing extends to health lobby groups, professional associations, and even awareness campaigns. Roundly excoriated by medical professionals, a 2007 ad campaign warned of the looming dangers of skin cancer. Though the logo on the poster is that of the American Cancer Society, the ad was sponsored by Neutrogena—a leading manufacturer of sunscreen.
“For the first time in the history of the world, every human being is now subjected to contact with dangerous chemicals, from the moment of conception until death,” wrote marine biologist Rachel Carson in her 1962 environmental bombshell Silent Spring. Up until then, ‘chemical’ was not a dirty word. In fact it was associated with progress, modernity, and prosperity, as evidenced by DuPont Corporation’s 1935 slogan “Better things for better living…through chemistry” (the latter part was dropped in 1982). Carson’s book preyed on what has become the biggest fear of the last half-century: cancer. It famously predicted that one in every four people would be stricken with the disease over the course of their lifetimes. These fears have been capitalized on by the health, nutrition and wellness industry to peddle organic, natural foods and supplements that veer far away from laboratories and manufactured synthesis. While there is nothing wrong with digging into a delicious meal of organic produce or popping a ginseng pill, the naturally occurring chemicals in the food supply exceed one million, and none of them are subject to the rigorous safety guidelines and regulations applied to prescription drugs, pesticides and non-organic foods. In fact, the lifetime cancer risk figures invoked by everyone from Greenpeace to Whole Foods are not nearly as scary when adjusted for age and lifestyle. “Exposure to pollutants in occupational, community, and other settings is thought to account for a relatively small percentage of cancer deaths,” according to the American Cancer Society in Cancer Facts and Figures 2006. It’s lifestyle—smoking, drinking, diet, obesity and exercise—that accounts for 65% of all cancers, but it’s not nearly as sexy to stop smoking as it is to buy that imported Nepalese pear hand-picked by a Himalayan sherpa. Of course, this hasn’t prevented the organic market from growing 20% each year since the early 90s, complete with a 20-year marketing plan. The last frontier of fear advertising is politics. Anyone who has seen a grainy, black-and-white negative ad may be cognizant of being manipulated, but may not know exactly how. In Campaigning for Hearts and Minds, political scientist Ted Brader notes that 72 percent of such ads lead with an appeal to emotions rather than logic, with nearly half appealing to anger, fear or pride—a model coined “get ‘em sick, get ‘em well” in the 1980s. Take a look at the gem that started them all, a controversial, game-changing 1964 commercial called “Daisy Girl”. Emotional, visceral, and aired only once, this ad was widely credited with helping Lyndon B. Johnson defeat Barry Goldwater in the Presidential election:
And finally, we have the media, that venerated congregation of communicators dedicated to broadcasting all that’s newsworthy with integrity. Perhaps in an alternate universe. Psychologists specializing in fear perception have concluded that the media disproportionately cover dramatic, violent, catastrophic causes of death. So after watching a Law and Order or CSI marathon, a Dateline special about online predators, and a CNN special on the missing blonde white woman du jour, you tune in to your local news (“Find out how your windshield wipers could kill you, tonight at 11!”). While the simplest explanation is that media profit from fear, fear is also tailor-made for two of the rules discussed above: the Example Rule and the Good/Bad Rule. We don’t recognize that a disease covered on House, or a kidnapping or violent suburban murder spree on a news special, is rare, only that it is Bad and the last thing we saw, so it heads straight to our Gut. The overwhelming preoccupation of the modern media is crime and terrorism, with a heavy focus on individual acts bereft of broader context. Just as in advertising, emotions and individual connection are essential in media crime reporting. The evolutionary desire to punish someone for wrongdoing towards someone else is hard-wired into our brains, and easier to conjure when watching an isolated story about someone relatable (a daughter, a mother, a little old lady) than about scores of dead or suffering millions of miles away, no matter how tragic. A convincing body of psychology research has concluded that individuals who watch a large amount of television are more likely to feel a greater threat from crime, to believe crime is more prevalent than statistics indicate, and to take more precautions against crime. Furthermore, crime portrayed on television is significantly more violent, random, and dangerous than crime in the “real” world. While this blog diverges from Gardner’s offhand minimization of the very real threat posed by radical terrorist organizations, he does bring forth a valid argument about reality versus risk perception. Excluding the State of Israel, the lifetime risk of injury or death in a terror attack is between 1 in 10,000 and 1 in a million. Your risk of being killed by a venomous plant or animal? 1 in 39,873. Drowning in a bathtub? 1 in 11,289. Being killed in a car crash? 1 in 84. Think about the last time CNN devoted a Wolf Blitzer special to venomous plants, bathtubs, or car crashes. Furthermore, the total cost of counterterror spending in the US between 2001 and 2007 was $58.3 billion, not including the $500 billion to $2 trillion price tag of the Iraq war. And the extra half-hour delay at the airport for security, the effectiveness of which we have witnessed this very week, costs the US economy $15 billion per year. A panel of experts gathered a month ago at the Paley Center for Media in New York City to discuss the future of news media, audience preferences, and content in a competing marketplace. The conclusion? More sex, more scandal. In other words, more of the same.
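To put those odds side by side, here is a quick back-of-the-envelope comparison in Python. The figures are simply the lifetime odds quoted above, so treat them as illustrative numbers rather than actuarial data.

```python
# Lifetime odds quoted in the post, expressed as "1 in N".
lifetime_odds = {
    "terror attack (non-Israel upper bound)": 10_000,
    "venomous plant or animal": 39_873,
    "drowning in a bathtub": 11_289,
    "car crash": 84,
}

# Convert each "1 in N" figure to a percentage and sort from most to
# least likely; the mundane risks dwarf the televised ones.
for cause, n in sorted(lifetime_odds.items(), key=lambda kv: kv[1]):
    print(f"{cause}: 1 in {n:,} ({100 / n:.3f}% lifetime risk)")
```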
It is rare, and highly unusual, for me to break the fourth wall as Editor of ScriptPhD.com, to get personal and let readers in as Jovana Grbić. But, in the spirit of New Year’s resolutions, holiday warmth, and a little too much champagne, what the heck… lean in closely, because I’ll only say this once for posterity. I am more often afraid than unafraid. I am not an outlying exception to the principles listed above, but rather an adherent to their very human origins. From as far back as I can remember, I have wanted to be a professional writer. I was very good at math and science, but creative doodling, daydreaming, and filling my head and notebooks with words were what I lived for. Dramatic, artsy types were who I hung out with, admired, even lived with in college. So why not study film or creative writing? And, in the middle of graduate school, when it dawned on me that I loved science but didn’t live it, why not change course and pursue my dreams right away? Because I chose to follow the tenable rewards of the logical and attainable rather than risk the abstract (Gut beat out Head). Quite simply, I was afraid. I began 2009 by standing on the Mall in Washington, DC with a million or so of my closest friends to witness our nation’s first African-American President take the oath of office. 2009 was also the year that I broke free to launch this blog, its accompanying creative consulting practice, and my career as a writer. Neither accomplishment, one so public and one so personal, could have transpired through fear. Naturally, since we move and think as a herd, I had a lot of help from my friends. But the bitter irony is that cracking my fears has left me more scared than ever. There are still a million things that could go wrong, a million burdens and responsibilities that I now shoulder alone, and let’s not romanticize the creative lifestyle, the ultimate high-risk, low-reward proposition. But you know what, dear readers? I’ve never been happier! So, as we head into this, the last year of the first decade of our new millennium, we must individually and collectively ask ourselves, “What do I fear?” What great glucocorticoid blockade, personal or professional, prevents you from existential liberation?
Especially amidst vulnerable times like these, a retrospective look through the rational lens reveals the psychological trail of breadcrumbs behind the mirage we’ve been living under this past decade. We would see that Jim Cramer, a so-called expert in the capricious art of the stock market (the fluctuations of which psychologists have linked to weather patterns), was simply evoking the Anchoring Rule as he confidently reassured viewers about investing in Bear Stearns. The company went under a week later. We would see that scores of bankers, investors, insurance giants, and people just like you and me were lulled by the Example Rule into falsely believing the housing bubble would never burst because, hey, a week ago, everything was fine. And we can still see the media (print, social, digital, and everything in between) painfully trying to bend the Rule of Typical Things into a pretzel to forecast when and how the current recession will end, with optimistic economists predicting a 2010 recovery and more austere types warning that high unemployment might take a full decade to assuage. Caught in this miasmic cloud is a primed public receptive to the advertising, entertainment and news messages aimed straight at our primordial gut instincts to perpetuate a culture of fear. To step out of the cloud is to shed the hypothetical for the actual. While recessions are frightening and uncertain, they are also a natural part of the economic cycle, and have given birth to the upper echelon of the Fortune 500 and even the iPod you’re listening to as you read this blog. Rather than focusing on the myriad byzantine ways we could die, we might devote energy and economy to putting a dent in our true terrorists: heart and cardiovascular diseases, cancer and diabetes. Rather than worrying about whether we’re eating the perfect, macrobiotic, garden-grown heirloom tomato, we should all just eat more tomatoes. Instead of acquiescing to appeals of vanity from pharmaceutical companies, we should worry about feeding the hungry and rebuilding inner-city neighborhoods. And every once in a while, we should let our Gut worry about digesting dinner, and let our Head worry about risk assessment. We might stop worrying so much. I will conclude 2009 by echoing President Roosevelt’s preamble to not fearing fear: “This great Nation will endure as it has endured, will revive and will prosper.” I personally extend his wishes to every single one of you around the world. Just make sure to look both ways before crossing the street. There’s a 1 in 626 chance you could die.
Happy New Year.
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and pop culture. Follow us on Twitter and our Facebook fan page. Subscribe to free email notifications of new posts on our home page.
Music. We all know what it sounds like when we hear it. It has the ability to create powerful emotions, or bring back memories from the distant past of our lives. We may use it when we exercise, study for exams, read, or travel from point A to point B. But what is music? Is it unique to human beings? How do we interpret music when we hear it? Are the emotions created by music universal, irrespective of the listener’s culture and country of origin? Can music teach us anything about how the human brain functions? These interesting questions and more are explored in the fascinating documentary The Music Instinct: Science and Song, a PBS production that was recently honored with the top prize at the prestigious annual Pariscience International Science Film Festival. ScriptPhD.com’s newest regular contributor NeuroScribe provides his review and discussion under the “continue reading” jump.
REVIEW: The Music Instinct: Science and Song
ScriptPhD.com Grade: A+
In his stirring and enormously popular 2007 opus Musicophilia: Tales of Music and the Brain, neurology professor Oliver Sacks examines the extreme effects of music on the human brain and how lives can be utterly transformed by the simplest of harmonies. These very themes are the foundation of The Music Instinct, hosted by Dr. Daniel Levitin, a neuroscientist and author of This Is Your Brain on Music: The Science of a Human Obsession, and ten-time Grammy Award-winning musician and vocalist Bobby McFerrin. Together they bridge the gap between listening to and creating music, working alongside neuroscientists seeking a greater understanding of how the brain works.
Dr. Levitin states in the program, “The brain is teaching us about music and music is teaching us about the brain.” Well, how does this happen? Music is, after all, information that the brain processes, and how the brain processes and stores information is a huge area of basic neuroscience study, typically funded by the National Institutes of Health in Bethesda, Maryland. The results of these studies lead to improved treatment of patients suffering from a variety of human diseases. The impact music has on people is remarkable. For example, the program mentions how music is used in hospitals to steady the breathing of premature babies and the heart rate of cardiac patients. A three-year study from 2002-2005 on music therapy for infants was named by The Council for the Humanities, Arts and Social Sciences as one of 12 exemplar studies for the year 2006. For more videos and information on the power of music therapy, please visit the official American Music Therapy Association website.
Is music built into who we are as people? Some scientists do not think so, yet many others believe it is. What is music, and how do we hear it? A basic definition would be that music is the vibration of air molecules at a certain pitch and rhythm. These vibrations enter the ear and cause your eardrum to vibrate. The vibrating eardrum moves three small bones called ossicles, which transmit the vibration to the nerves of the inner ear. It is here that the vibrations (initially music) are transduced into nerve signals sent directly to your brain to be processed. In a nutshell, that is how we hear. While this is well-known information, what isn’t well known is how and why music generates emotional changes in the brain after it has been physically interpreted. Scientific research is attempting to answer these and many other questions.
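To make “pitch is just vibration frequency” concrete, here is a minimal Python sketch, using only the standard library, that synthesizes one second of concert A, the 440 Hz tone orchestras tune to, and saves it as a WAV file your eardrum can verify. The file name and parameters are illustrative choices, not anything from the documentary.

```python
import math
import struct
import wave

SAMPLE_RATE = 44_100   # samples per second (CD quality)
FREQUENCY = 440.0      # vibrations per second; heard as the pitch "A"
DURATION = 1.0         # seconds

# Write a mono, 16-bit WAV file one sine-wave sample at a time.
with wave.open("concert_a.wav", "w") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    for i in range(int(SAMPLE_RATE * DURATION)):
        sample = math.sin(2 * math.pi * FREQUENCY * i / SAMPLE_RATE)
        f.writeframes(struct.pack("<h", int(sample * 32_767)))
```

Double the frequency and the ossicles relay a vibration the brain hears as the same note an octave higher; vary loudness and timing across such tones and you have rhythm.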
Scientists know a developing fetus begins hearing at seventeen to nineteen weeks of development. Can the fetus hear music? Does the fetus respond to music? The program shows a pregnant mother listening to music while the fetus’s heart rate is monitored. Researchers found that when music was played, the fetus’s heart rate increased. Heart rate is one indicator of the “startle response,” the “fight or flight” instinctual reflex the brain carries out to ensure an animal’s survival, generated from very primitive areas of the brain. The researchers also implanted a miniature hydrophone inside the mother’s womb to see if there was any natural sound there, and to record what the music would sound like to the fetus. An audio recording from inside the mother transmits the pulsing of blood from the uterine artery. One can also hear the music fairly clearly, as the fetus would hear it: a muffled sound, as if you were underwater in the ocean or a pool. Many scientists think we are “wired” for rhythm, the essential difference between sound and music, even before we are born.
The program also goes beyond examining the cellular and physiological responses to music to discuss the differences and universal similarities in people’s emotional responses to it. In an interesting study looking at the emotions generated by music, participants listened to different types of classical music while various biological indicators of emotional change were recorded. The neuroscientists observed that when people listened to lighthearted sounds, the body was at a resting state; when they heard deep sounds, say from a piano, heart rates increased and people began to sweat, both indicators of stress. A very interesting recent study asked whether there are things about music that go beyond culture (read about it here). A group of scientists, led by Dr. Thomas Fritz, traveled to Cameroon, in Africa, and visited the Mafa, a very remote group of people living in the mountains. The Mafa have never been exposed to Western music; in fact, their language doesn’t even have a word for music, yet they make musical instruments and play them daily. They were asked to listen to different pieces of Western classical music and then point to one of three faces in a book, each representing a different emotion: happy, sad, or scary. The participants knew which emotion corresponded to which face. The scientists found that despite never having heard Western music, the Mafa were able to identify music as happy, sad or scary, just as Westerners could. Their conclusion was that the emotional content of music is inherent in the music itself, and is not solely the result of cultural imprinting. This study tells us there are some things about music and emotion which may indeed be universal, not bound by country or culture.
Nevertheless, our responses to music are not always universal. In Western cultures, a certain type of pitch is typically identified as “sad,” yet the same pitch in a different culture is identified as “happy.” This tells us there are outside factors, such as culture, which help shape a person’s emotional response to music; some of this behavior is in fact learned. Did you ever notice that when you hear some music, without even thinking, you start moving your head or tapping your feet? Why do we do that? Well, the answers aren’t entirely known. However, it is no longer believed that the brain has a central “music region”: the listening and processing of music occurs throughout many regions of the brain. In fact, music occupies more regions of the brain than language does! The auditory cortex, moreover, has been shown to have strong ties to the motor regions of the brain, and people with neurological disorders (strokes, Parkinson’s disease, etc.) have often been found to have improved neurological function after receiving music therapy.
One final interesting scientific argument discussed in the program is the origin of music: specifically, which evolved first, music or language? Many scientists believe language developed first, but The Music Instinct presents compelling evidence that music may have come first. A 2001 Scientific American article suggests that music not only predates language, but may be more ancient than the human race. Anyone who has watched the news or traveled to foreign countries has seen, time after time, how all types of music, sung in a language other than the listeners’ native tongue, still influence the emotional behavior of those listening.
Music is indeed a universal form of communication which may supersede language. If so, maybe the next time I encounter a foreigner, instead of trying to say “Hello” I will jam out some Beatles tunes!
Trailer:
The Music Instinct: Science and Song was released on DVD November 4.
NeuroScribe holds a B.S. in Biology and a Ph.D. in Cell Biology with a strong emphasis in Neuroscience. When he’s not busy freelancing for ScriptPhD.com, he is out in the field perfecting his photography, reading science policy, and throwing some Frisbee.
~*NeuroScribe*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and pop culture. Follow us on Twitter and our Facebook fan page. Subscribe to free email notifications of new posts on our home page.
Among the many manifestations of mental illness, psychological and developmental deficits, and behavioral disorders regularly portrayed on television and in film, one is conspicuously missing: Asperger’s Syndrome, a developmental autism-spectrum disorder. ScriptPhD.com eagerly took in a private screening of Fox Searchlight’s thoughtful new film Adam, a sensitive love story with a groundbreaking portrayal of Asperger’s in a leading character. In addition to simply reviewing the film, we wanted to provide a background primer on the basics of Asperger’s diagnosis and treatment, as well as an ensuing discussion of the filmmakers’ accuracy in portrayal and plot. To do so, ScriptPhD.com enlisted the help of two leading international Asperger’s experts, Timothy P. Kowalski and Dr. Tony Attwood. Please click “continue reading” for our full story.
Timothy P. Kowalski, M.A., C.C.C. is a world-renowned speech-language pathologist with extensive experience in treating individuals with Asperger’s Syndrome, high-functioning autism, and various psychiatric and behavioral/emotional deficits. Frequently asked to give national and international seminars as well as university guest lectures, Mr. Kowalski is the director of Professional Communication Services, Inc., a resource and treatment consulting center based in Orlando, Florida for a range of behavioral disorders. He is also the author of one of the most widely acclaimed books on this topic: The Source for Asperger’s Syndrome. Mr. Kowalski graciously and generously took time away from his vacation to provide ScriptPhD.com with a fundamental primer on the basics of Asperger’s Syndrome.
ScriptPhD: Asperger’s (in comparison with Autism) is rarely portrayed by leading characters in the collective media. As such, there is a lot of misinformation within the general public about what it is and what it is not. Can you dispel some of those notions for us?
Timothy P. Kowalski: It’s been said that whether or not an individual receives a diagnosis of Asperger’s Syndrome (AS) is highly dependent upon the skill and knowledge of the diagnostician and how fully he or she understands what constitutes AS. Too many individuals are misdiagnosed because the diagnostician has a preconceived notion of what constitutes AS. While many individuals present with the classic speech pattern in which they talk incessantly about a specific topic, one does not have to have this characteristic to be diagnosed as such. Likewise, many also have difficulty with eye contact, although this too is not a prerequisite for the diagnosis. When presenting on this topic I use a chart that compares three major classification systems used throughout the world to help comprehend the features seen in AS. Based on the criteria established by the DSM-IV-TR [Diagnostic and Statistical Manual of Mental Disorders, the psychiatric bible of diagnosing mental disorders], these features may be best described as follows: a social impairment characterized by two “hits” from the following four characteristics:
1. A difficulty with appropriate use of eye gaze. They may look at something else while talking or listening to someone, without regard to how that is perceived by their communicative partner, or they may look straight through the person as if focused on something about 18 inches behind them, causing the person to feel extremely uncomfortable. Or…
A difficulty with facial expression. Frequently their face does not match their current emotional state. As such, others cannot “read” them and misinterpret their intentions. Or…
Inappropriate or unusual body posture. Many individuals don’t understand that body posture communicates a message. Slouching, lounging, or lying on the floor all communicate ambivalence or a lack of concern to another person. However, they don’t realize this message is being “read” by their partner. Or…
A difficulty reading or using gestures. The fluidity of movement when gesturing is extremely difficult for many individuals with AS. Their movements may be perceived as awkward, stilted or just plain unusual.
But one doesn’t have to have any of the above, because there are still three other criteria:
2. Difficulty developing appropriate peer relationships that are consistent with their age.
They often gravitate to younger children, who are thrilled that an older child is playing with them. But the child with AS typically is not “playing” with them; instead, he is dictating what and how they will act, as if he were a director on a movie set. While many younger children will not recognize this form of manipulation, his peers do. Adults, however, have learned social graces. It is politically incorrect to tell someone they are obnoxious, so we communicate the message through subtle body language. Those of us who don’t have AS can read the message and alter our social interaction accordingly, but the child with AS can’t read it and may continue to talk incessantly about his current topic of interest. As a result, he may gravitate to adults, whom he perceives as more accepting of his behavior.
3. Inappropriate sharing of enjoyment, interests, or achievements with others. Besides the obvious sharing of toys and objects, he may also have difficulty sharing his friends. He may avoid them or smother them, and in doing so continue to isolate himself from others.
4. A lack of social or emotional reciprocity. The inability to relate appropriately to others, coupled with inappropriate facial expression, often leads others to believe they lack empathic concern. However, parents indicate their child is often extremely empathic; they just don’t communicate it with the same degree of finesse as those who don’t have AS.
A restricted range of interest characterized by two “hits” from the following characteristics:
1. Preoccupation with an interest that is highly intense: Someone with AS may obsess over a given topic by collecting things or discussing it with such intensity that it causes others to recognize this behavior pattern as unusual. Or…
2. A demand for following a routine or ritual that he may impose on himself or others. Or…
3. A preoccupation with parts of objects or the object itself that is seen as odd by others.
These symptoms must cause the individual a “clinically significant impairment” in social or occupational functioning; there must be no delay in language development (although many articles in peer-reviewed journals now state a delay is possible); and the behaviors must not be characteristic of another form of Pervasive Developmental Disorder or of Schizophrenia.
The behaviors described above are required for the individual to be given the diagnostic label of Asperger’s Syndrome. Unfortunately, some diagnosticians do not look at the “whole picture” and instead focus on just one aspect when making a diagnostic decision.
SPhD: Can Asperger’s be categorized in the same vein as Autism? Does it have a similarly nebulous body of research as to the root cause?
TK: Asperger’s Syndrome is one of five diagnostic labels under the DSM umbrella of “Pervasive Developmental Disorders,” often referred to as Autism Spectrum Disorders. The other four are Autism, Pervasive Developmental Disorder – Not Otherwise Specified, Rett’s Disorder and Childhood Disintegrative Disorder.
There are two philosophies of where to place AS within the [above] spectrum. Some believe it is synonymous with “High Functioning Autism,” while others believe it is a separate entity. My belief is that AS is a separate entity, a distinction based on the desire for social interaction: these individuals want friends but have significant difficulty making and keeping them. Causal factors have yet to be determined; however, there is widespread agreement that AS is a neurobiological deficit, and some families seem to have a predisposition to autistic spectrum disorders, with many family members carrying ASD diagnoses.
SPhD: How treatable/manageable is this condition?
TK: There seem to be two “types” of individuals presenting with AS: those who realize they are having a difficult time navigating the social arena and understand they have to try to do so to get ahead, and those who simply don’t care how they are perceived by others, refuse to admit they have any difficulty, and simply want to do what they do without regard to others. It is my opinion that the outcomes for the latter are much worse than for the former. Structured intervention based on the principles of Cognitive Behavior Therapy, coupled with extensive use of visual cues and a heavy emphasis on cognitive retraining, will produce positive change. Whatever approach is used, one must emphasize social skills using a meta-cognitive approach in which the individual is always thinking about how he relates to his surroundings; doing so produces greater generalization of functional gains. A simple teaching of specific skills will not produce the desired result of greater social acceptance.
SPhD: In the movie Adam, a high-functioning engineer with Asperger’s forms a relationship with his understanding neighbor. Because of his condition, there are a lot of ups and (mostly) downs for them. Are people with this condition capable of forming strong relationships and marriages despite their diminished emotional capacity?
TK: For the spouse with AS, reading the subtleties required in a relationship is an extreme struggle. Quite frequently they want to please and try hard to do so, but unfortunately often fall short of what their spouse desires. Sadly, many don’t realize they aren’t measuring up to their spouse’s expectations until their partner threatens to move out. It’s at this point that the spouse with AS realizes there’s a problem; unfortunately, by then the partner is completely exasperated.
Just to make things more interesting, add in an extreme dose of anxiety, because the spouse with AS is consumed with a constant concern about committing social blunders. Be aware that when an individual is under a heightened degree of stress, linguistic competence is one of the first areas to decrease. History has taught him to be wary of new situations, group outings, and settings where he may be unsure of himself. Ultimately the spouse needs to realize where his social “Achilles heel” is in order to deal with it more effectively. Blaming others may seem like an appropriate way to fix the problem, but more than likely it serves only to compound it.
What’s needed is not a quick fix in communication strategies. AS deficits are neurobiological in origin; they are not learned. Cognitive intervention is highly dependent upon self-reporting of behavior and feelings, but how can the spouse with AS describe what he doesn’t realize he’s feeling, especially when he lacks the words for it? Effective intervention focuses not on a quick fix but on assisting the client to identify and learn the social reasoning skills that those of us not impacted by AS have acquired through natural osmosis.
Despite these difficulties it is possible to have a meaningful relationship. It may not be filled with the same degree of “warm fuzziness” that others may have, but not everyone wants that in a relationship. As long as both partners find satisfaction in their relationship, then it’s a positive relationship.
SPhD: How do you feel about Boston Legal’s portrayal of Asperger’s in the form of the character Jerry “Hands” Espensen? Some Asperger’s awareness groups have complained about the extent of some of his tics and outbursts (the stiff hands at the side of the body, the random bleeps and noises, etc.). In your opinion, does his character represent a realistic take on someone with Asperger’s, and if not, where do the showrunners go astray?
TK: When I talk about AS in my seminars I am often asked about how the media portrays specific characters. Some are presented as “over-the-top oddballs” whom we are made to laugh at for their disability (something I find discomforting), and some are presented as having patterns of behavior that are definitely odd, yet the character holds a job and functions in society in his own unique way. The character Jerry “Hands” Espensen is presented in a manner that has characteristics of AS. Does he “typify” the classic form of AS? Of course not. But then, if the character were modeled after the stereotype of AS, I believe it would be just as much of a disservice to the community as the current model is, because the public would take that one performance as characterizing all of AS. To assume all individuals presenting with AS are going to be similar is as unrealistic as assuming all individuals with learning disabilities are exactly the same.
Movie Review: Adam
ScriptPhD Grade: A-
Adam, a new independent feature from Fox Searchlight Films, is more than the love story it’s being marketed as. Written and directed by acclaimed theater director Max Mayer, Adam is a movie about friendship, fidelity, and connection in an often ephemeral world. Adam (Hugh Dancy) is a brilliant but sheltered electrical engineer with an insatiable curiosity for space and astronomy. After his father dies, his lone remaining friend is an old buddy of his dad’s who encourages Adam to expand his social circle. He gets that chance when a new neighbor moves in. Beth (Rose Byrne) is an aspiring children’s author, a well-taken-care-of city girl with a penchant for choosing the wrong guy. She feels an immediate connection with Adam, who showers her with unusual affection. She soon learns that the root of his quirky behavior is Asperger’s Syndrome. Rather than being repelled by Adam’s condition, Beth learns more about it, bringing her closer to Adam in the process. Their romance is tested by their diametrically opposed views of the world and by learning to trust one another—themes universal to any relationship. In an extremely satisfying ending, both Beth and Adam fulfill their dreams and grow as individuals by embracing the other’s philosophy.
Actor Hugh Dancy took his portrayal of Adam very seriously. In addition to researching the disorder in books, online, and with Mayer, Dancy also met with several Aspies. “People were very generous in talking with me,” he says, “and it was invaluable to me both to listen to what they had to say and to observe them. I felt a real responsibility to do that.” It shows. Dancy’s every flitter of the eye, every facial expression and verbal enunciation so transforms him that by the end of shooting, co-star Rose Byrne remarked, “Oh my goodness, this is not the person I just spent the last few weeks with!” Byrne, too, brought a personal connection to the role. “I have a family friend who has Asperger’s and I think it’s becoming more common. More and more people are touched by some form of autism,” she remarked. Some of Byrne’s own natural charm and zest for life bleeds over into her portrayal of romantically challenged New Yorker Beth, creating an immediate chemistry between herself and Dancy. That chemistry lies at the heart of making the love story between Adam and Beth so satisfying and believable. In addition to its two talented leads, Adam touts a rich supporting cast that includes Peter Gallagher as Beth’s doting yet imperfect wealthy father, the delectable Amy Irving as her wise mother, Frankie Faison as Adam’s caretaker and lone close friend, and Mark Linn-Baker as Adam’s boss.
In addition to first-rate performances by the two leads and a rich supporting cast, Adam benefits from a well-researched, tightly composed screenplay by writer/director Max Mayer. “When I heard [a] man on the radio talking about Asperger’s Syndrome, I realized that not only was he describing his own very moving journey, but also something about the general condition,” says Mayer. “That’s what inspired me to begin [writing] Adam. As I started writing, I realized that Adam’s relationship to Beth is an extreme version of a very common dilemma we all face in life: the urge for an intimate connection to that which is necessarily strange—another person with their own view of the world.” Mayer’s script develops this unlikely relationship with sweetness, a humor that never plays on its lead’s disability, and material that stays true to the characters. Adam’s courtship of Beth is certainly unusual—when is the last time someone recreated the galaxy for you in their living room or showed up in a space suit to squeegee your windows? Likewise, funny moments abound, such as Adam rattling off the history of a New York theater when meeting Beth’s parents—a symptom of his condition—and yet they never veer into the absurd or pejorative satire. The only quibble, and our reason for the A- grade, is that the script gets disjointed, preachy, and didactic when explaining Adam’s condition. It felt a bit like, “Let us now pause this film while our character provides a brief run-down of Asperger’s symptoms.” The script, and the condition, is best served when it shows the characters in their respective elements. Mayer’s direction, too, provides another layer of depth to the visual cinematic experience. At times shooting with a jerky handheld camera from Hugh Dancy’s point of view, Mayer gives the audience a visceral sense of the frenetic, fast-paced, noisy world inside Adam’s brain.
Ultimately, the strength of Adam lies in the commitment of its on- and off-screen talent to providing a genuine story without succumbing to the staple clichés of the Hollywood romantic comedy.
Adam goes into limited release this week. Check your local listings for showtimes.
To help us get in-depth answers about the portrayal of Asperger’s in Adam, ScriptPhD.com contacted Dr. Tony Attwood. Based in Queensland, Australia, where he runs the Asperger’s Syndrome Clinic, Dr. Attwood is considered a world-renowned expert in the research, diagnosis and treatment of Asperger’s Syndrome. His book, The Complete Guide to Asperger’s Syndrome, has now been translated into several languages. Dr. Attwood graciously provided us his thoughts on the film and its accuracy:
ScriptPhD: You said that you were really impressed by this film’s portrayal of Asperger’s symptoms and the storyline. Can you elaborate on certain details from the film (Adam’s habits, his tidiness, the freakouts, etc.) that are accurate portrayals of what Aspies go through in real life?
Dr. Tony Attwood: The details that I noticed were aspects such as Adam’s eye gaze, mannerisms, interests and social confusion that are typical of some adults with Asperger’s syndrome. The actor gave a very credible portrayal of the life circumstances and romantic relationships of a man with Asperger’s syndrome.
SPhD: Why was Beth’s little white lie such an instigator for Adam? He couldn’t seem to differentiate between a tiny fib and a major lie, and was basically ready to cut her off entirely as a liar and a betrayer.
TA: People with Asperger’s syndrome seek the truth and find it very difficult to understand and appreciate why someone would lie, even if it is a ‘white lie’. They often have clear expectations with regard to friendships and, if these are broken, can be very black and white in terms of continuing the friendship. There can be reluctance to tolerate a friendship transgression. The lie can also be a trigger to intense emotions, sometimes due to past experiences of confusion as to why someone would lie and whether someone could be trusted.
SPhD: Adam seems to get “better” as the movie progresses. He can pick up on social cues at the end (such as helping a lady with heavy boxes or improving interactions with colleagues) that he couldn’t at the beginning. Is this realistic? Can an Aspie “learn” certain behaviors to help them function socially or improve their skill sets? How difficult is this for them?
TA: People with Asperger’s syndrome can acquire social abilities through guidance and positive experiences. This takes time. Other people may not realise and appreciate the amount of intellectual effort that goes into achieving social success, which can be exhausting.
SPhD: The trust factor for Adam seemed extremely difficult. In fact, his only seeming friendship was with his father’s best friend, who looked out for him since childhood. Is it typical of people with Asperger’s to have a small social circle like that, or to form the one really close friendship?
TA: An adult with Asperger’s syndrome can usually cope very well with a small circle of friends and acquaintances. The person can feel most comfortable in a one-to-one relationship. I use the phrase ‘two’s company, three’s a crowd’. One of the characteristics that I enjoyed about the film was that, at the end, he has achieved a wider circle of friends and adapted to a different environment where his abilities were recognized and appreciated.
~*ScriptPhD*~
***************
Follow us on Twitter and our Facebook page.
Original Premise
Who among us hasn’t wanted, nay desperately needed, to forget a painful event, relationship, person, or circumstance that they can’t seem to escape? Oh, to be able to just wipe it from your brain and pretend it never happened! The concept sounds like something straight out of the imaginative mind of screenwriter Charlie Kaufman. In his movie Eternal Sunshine of the Spotless Mind, ex-lovers Joel and Clementine, played by Jim Carrey and Kate Winslet, erase memories of each other after their relationship sours. To do this, they seek out the bioengineering company Lacuna Inc., whose scruples are more than a little ambiguous. All’s well that ends well for the lovers, as they reconnect towards the end of the movie, build new memories of one another and fall back in love.
Indeed, plenty of recent movies deal with memory loss of varying degree, origin and consequence. In Christopher Nolan’s brilliant and esoteric Memento, Leonard Shelby (Guy Pearce), suffering from anterograde amnesia that renders him unable to form new memories, tries to piece together the events of the vicious attack on and murder of his wife. Drew Barrymore’s character in the romantic comedy 50 First Dates suffers a similar condition and has to “meet” her love interest anew every day. In Paycheck, the film adaptation of Philip K. Dick’s science fiction story, Ben Affleck’s character takes the extreme measure of wiping his own memory to protect his clients’ intellectual property, which almost costs him his life when his last deal embroils him in a standoff with the FBI.
Indeed, a slew of medical and psychological syndromes can cause, or are associated with, memory loss. But the idea of selective memory engineering has been the stuff of science fiction fancy.
Until now.
Current Research
While watching the television version of This American Life, I was struck by an episode entitled “Pandora’s Box,” which profiled the work of SUNY Downstate Medical Center researchers Drs. Todd Sacktor and Andre Fenton. Dr. Sacktor had a revolutionary idea about how memory is formed in the brain, and an elementary yet powerful way to manipulate it by eradicating the function of one regulatory molecule. And what a Pandora’s box they opened! Take a look at this short clip:
Powerful stuff, no? This research suggests, in effect, that a single molecule, protein kinase M zeta (PKMzeta), regulates the brain’s ability to form and retain memories, and consequently lies at the heart of memory erasure’s potential. In a recent New York Times interview, Dr. Sacktor admitted that his scientist dad directed him to a family of molecules called Protein Kinase C in 1985, from which his lab derived PKMzeta as a brain-specific member of that family. In a 1999 paper in the journal Nature Neuroscience, Drs. Jeff Lichtman and Joshua Sanes narrowed down 117 hypothetical molecules involved in long-term potentiation (LTP), the strengthening of communication between two neurons that are stimulated simultaneously. In a subsequent 2002 Nature Neuroscience paper, Dr. Sacktor’s lab isolated PKMzeta as the absolute “it” memory factor, showing that it congregates semi-permanently en masse around these activated neuronal connections. At that point, he was off to the races. He joined forces with his friendly downstairs neighbor, neuroscientist Dr. Andre Fenton, who happened to study spatial memory in mice and rats. Fenton had previously shown that rats placed in a circular chamber learn how to move around to avoid getting their feet shocked, a memory they retain days, weeks, even months later. Sacktor’s lab then injected a PKMzeta inhibitor into the rats’ hippocampus, the part of the brain that regulates memory formation. The results were stunning. Two pioneering papers (paper 1 and paper 2) in the elite research journal Science showed that the “blocker” both reversed established long-term potentiation in the rats’ neurons and made them forget the spatial information they’d learned in the chamber, an effect that lasted for weeks. Drs. Sacktor and Fenton had erased the rats’ memory!
Dr. Fenton and Dr. Sacktor’s reaction to their own research in the This American Life piece was notable. Normally, scientists are shielded well behind the safe solitude of the ivory tower: long work hours, constant pressure, the next research milestone to achieve. It’s not that scientists don’t think about the implications of their work, but they rarely have the luxury of time for such contemplation, or the fortune of far-reaching results. While reading letters from victims of post-traumatic stress disorder, Dr. Fenton broke down crying and expressed a desire simply to help these people.
Less than two months ago, scientists at Toronto’s Hospital for Sick Children [sorry, I can’t help myself… as opposed to healthy ones? I love Canadians!] added an important piece to this canon of research. In a Science paper, the scientists identified the exact group of neurons—lateral amygdala (LA) neurons with increased cyclic adenosine monophosphate response element-binding protein (CREB)—responsible for the formation of a given memory (the neuronal memory trace). Selectively targeting and deleting these neurons with an injectable, inducible neurotoxin erased the learned fear memory.
Eventually, of course, this body of science will coalesce into a more coherent picture of how memories are formed, which subsets of neurons in which portions of the brain store them, and which molecules and proteins we can manipulate to control, enhance or erase memory altogether. But that still leaves us to grapple with some very powerful bioethical dilemmas. Assuming this translates into a medical procedure or pharmaceutical treatment for memory manipulation, who will regulate it? How will rules be established for how far to take such therapy? Is memory erasure the equivalent of altering our personalities, the essence of who we are, a psychological lobotomy? Most important of all is the question of how much we need memories, even painful, negative ones, to build the cornerstones of human morality, empathy, and our sense of right and wrong.
Sheena Josselyn, one of the researchers involved in the University of Toronto study, acknowledged the double-edged ethical implications of the research: “Our experiences, both good and bad, teach us things,” she said. “If we didn’t remember that the last time we touched a hot stove we got burned, we would be more likely to do it again. So in this sense, even memories of bad or frightening experiences are useful. However, there are some cases in which fearful memories become maladaptive, such as with post-traumatic stress disorder or severe phobia. Selectively erasing these intrusive memories may improve the lives of afflicted individuals.” In fact, Anjan Chatterjee, M.D., a neuroethicist at the University of Pennsylvania Ethics Center, penned an incredibly prescient piece two years ago that equated the pharmacological mitigation of painful memories to “cosmetic neurology”. “If, as many religions and philosophies argue, struggle and even pain are important to the development of character,” Dr. Chatterjee asks, “does the use of pharmacological interventions to ameliorate our struggles undermine this essential process?”
To shed some light on this ethical quandary, ScriptPhD.com enlisted the help of Mary Devereaux, PhD, a bioethics expert at The Center for Ethics in Science and Technology in San Diego, CA, and Peter Wagner, MD, a professor in the Schools of Medicine and Bioengineering at UCSD.
To continue reading these enlightening interviews, click “Continue Reading”…
Interview #1: Peter Wagner, MD, biomedical/pharmacological ethics of memory erasure
1) When an issue presents an ethical dilemma to the medical community, what is the role of bioethicists and the American Medical Association in any FDA approval process for a treatment or drug?
The AMA and the FDA themselves would be the best sources of answers to these questions, not me.
That said, my understanding of the general process for bringing a treatment or drug to market is something like this:
The investigator wishing to gain FDA approval will have had the product go through a series of clinical trials (Phases I, II, III, etc.) that sequentially establish safety and efficacy, and often side effects, indications and exclusions as well, all over some period of time. These trials may well have had their experimental protocols dictated in part by the FDA itself. The FDA has expert panels to review the trial data and render their verdict. As far as I am aware, the investigator would have the choice of including an ethicist in the process. I do not know whether the FDA gets involved in ethics issues, even at a high level, but I suspect not. This would be a very slippery slope, and deciding what constitutes a trigger for FDA involvement might create more problems than ethical intervention would solve. Thus, genetic testing technology exists, but the ethical issues of being able to obtain specific genetic information do not seem to be part of the FDA piece. Ditto for organ transplantation and gene therapy and their risks versus benefits. I also suspect that the AMA would not become involved until and unless, after FDA approval and once the treatment or drug was in use, some major medical dilemma was reported that prompted them to make a statement.
I would point out that there are many cases of treatments getting FDA approval after research suggesting safety and efficacy, yet which, after more experience in the field, were found to have serious risks not evident at the time of approval. How many years should pass, and how many patients should be studied, before such longer-term risks are declared non-existent? And during that time, how many patients who would benefit from the new treatment are denied access to it? That is an ethical dilemma intrinsic to treatment development and approval, and it can never go away.
2) When anesthesia and epidurals first became available, many people resisted their use on similar grounds… uncertainty about safety, pain as strengthening character. Couldn’t you argue that once we become familiar enough with such a technology, we might look back on our reluctance in the same light?
I would separate medical from ethical issues here as much as possible: safety is a medical issue, and the inventor, the FDA, and the user (surgeon, prescriber, etc.) all have major, cut-and-dried responsibilities to maximize safety during development and use. Ethics enters the room in deciding when a treatment has been tested enough to be sure bad side effects have “reasonably” been identified, as stated above.
Putting up with pain is different – to me, that has to be a choice for each person to make, balancing their own beliefs, their own informed concerns over safety and side effects, and, in this example, their pain tolerance. Medical professionals have the absolute responsibility of informing the patient of risks and benefits honestly and accurately, but the patient must be responsible for making the choice. Ethics comes in here when the professional misinforms the patient through ignorance or malfeasance.
3) If memory erasure becomes a medical reality, what kind of evaluations or consultations would a patient have to undergo before being allowed to undergo this procedure? What can possibly prepare a patient to mentally comprehend that their memories will be gone?
Nothing is ever simple, and memory manipulation may seem thornier to grapple with than something less mysterious such as heart surgery. Thus, messing with the mind conjures up images of brainwashing by mad scientists; heart surgery just opens clogged vessels but with well-defined physical risks. Yet my answer to your question is exactly as above – the caregiving health professional has to give the patient a detailed, honest and informed account of the risks and benefits. Then the patient has to be the one to decide. In either case, if I were a prospective patient, I would ask a bunch of questions. They would clearly be different between heart surgery and memory erasure. I would want to know if the memory treatment was permanent, would it wipe out good memories along with the bad, could I still form new memories going forward, would it have neural effects on other brain functions from emotional reaction to taste and smell to motor control to control of heart rate – and so on and so on. But the core principle seems to be no different than for heart surgery: properly informed consent so the patient can weigh the risks and benefits and balance them to reach their own decision. In the case of memory erasure, the unknowns may be so many and profound that for many, I expect the answer would be “no thanks”.
The more complicated we try to make the rules imposed on others, the more it actually becomes an ethical problem for us all.
The FDA has to be responsible for regulating treatment availability by requiring and evaluating the studies of treatment development, and its job is to be as certain as possible that the efficacy and side effects have all been identified and the risks quantified, such that the risk/benefit ratio is acceptable. While the FDA has to grapple with the ethics of what constitutes the right balance of risks to benefits, it should not be charged with the societal ethics or moral issues of treatment choice once a treatment is available.
The physician has to be responsible for informing the patient about treatment options and his/her ethical responsibility is to be complete, honest, accurate and unbiased.
The patient then has the responsibility of accepting or rejecting the treatment, be it for heart surgery or memory erasure.
Interview #2: Mary Devereaux, PhD
The ScriptPhD: Back when [Eternal Sunshine of the Spotless Mind] came out, the research hadn’t caught up. And one thing that kind of caught my attention was that now this has actually been done in the lab. I mean, in rats and in mice, but they served as an excellent template for human research. And so there’s no reason to believe that a researcher would not cross over and say, “Well if we can do this in mice, and if there are these chemicals that we can manipulate, or groups of cells….” What interests me is not necessarily the idea of “Can we do the research?” because I think technology races ahead; that’s not something that we can stop. I’m more interested in stopping and saying to ourselves, “What are the long-term ramifications of this and what are some important questions to ask before we race ahead?” And so my first question is a very general one. What is the relationship between memory and self? Between memory erasure and self? And then I’d go on to ask, in erasing painful memories, do we not irrevocably alter what we define as the major component that constitutes personality, and do we have the right to do that? And the reason that I ask this question is that I think it’s an important one to consider before doing something this drastic. What are your thoughts on this?
Mary Devereaux: Well, you know I tend to respond to these things in terms of ethics, because I’m a bioethicist. So, my first question before we get into the more philosophical things, from an ethical point of view, would be, “How good is the technology? How well does it work?” And I think the answer to that is “We don’t know.” That something works in mice doesn’t mean that it works in people. But in order to establish that it works in people, you would have to attempt it. I mean, I take it you have to run some kind of clinical trial. And that raises questions of safety, doing this kind of thing in human beings. Where, I take it, the aim would be to target, or erase, specific memories. But you might get this wrong.
SPhD: Mmm-hmm. Absolutely.
MD: So I think there are real questions about whether it would work and, if it would work, how safe it is and how you’re going to establish that with human subjects. Of course you would need to get people’s informed consent, and they’d need to understand the risks just like they would for any other kind of scientific research. In terms of actually targeting specific memories, where the idea is to erase those memories, one of my first questions would be about the coherence of the sort of narrative that’s left. That is, supposing, as in a lot of the discussion, we target memories that have to do with trauma, something like the kinds of things that lead to post-traumatic stress disorder. For example, somebody is attacked in the street, or somebody has a particular war memory. I’m not sure what happens here. If what’s left [is], “I had been walking down the street, somebody asks me for the time, and the next thing I know, I wake up in the hospital and I have all of these bruises and maybe I’ve been shot and so on, but I have no recollection of this.” So when you move to thinking about the impact of memory erasure on the self, there are scientific questions that, until answered, make it very difficult to answer your more general questions about what this is going to mean for ethics or personhood.
SPhD: Absolutely. But I think the point you raise about erasing a war memory is worth talking about in more depth, because it leads really well into my second question. Let’s look at an instance where you have someone who has survived, let’s say, Hurricane Katrina, which was a tremendously stressful event. Or they’ve come back from the Iraq War with images in their mind that are literally keeping them from living a normal life. Or you’ve been raped—look at the survivors of the Darfur crisis and what they have to confront on a daily basis. One of the things I’ve seen in the scientific and ethical literature is that, across religions and across cultures, pain and painful memories are written about as a shared human experience. They bring people together; they cause people to bond, let’s say, over the death of a loved one. Another important point I’ve seen brought up is that painful memory acts as a social deterrent. For example, if you have something like the Holocaust, and it’s so painful for people that they just choose to eradicate it from their memories, how does that give you the impetus to prevent something like it from happening again? Or to codify moral imperatives as a society? And so the question that I have for you is: [assuming that the technology is safe], is there a danger in making it really easy? Should you do it? Because although there are these incredibly painful things that we go through as human beings, in a way, there’s this risk of numbing us as a society.
MD: Wellll, I think that seems like it’s jumping ahead in two ways. One thing is that there’s a sort of pattern of argument that is constantly used when talking about human enhancements. In a way, it’s kind of funny to talk about memory erasure as a sort of cognitive enhancement because you’re taking something away, but in another sense, you could clearly use the same technology to improve memory. But in taking away something that’s painful, you’re also improving the quality of someone’s cognitive or emotional life. So that too is an enhancement. My one response is that we always say, “Let’s assume that it’s safe and that it’s effective.” I think that’s a big assumption. And I think we too readily make that assumption. I think we’re a long way from it being safe and effective. So that’s point #1. The second point is this: even if I give you your assumption, let’s assume that memory erasure works, that we haven’t harmed anyone in demonstrating scientifically that it works, and that we now have something that not only works, but works safely. Well, it seems to me your question is sliding between two levels. One level of question is: should we do this in individual cases, for very specific memories? So, [a fictitious person named] Sarah Jones at Stanford is brutally attacked some night by a group of rowdy people on campus. And we now have the expertise to target and remove that memory. That’s a very different kind of question from your examples of the Holocaust or Hurricane Katrina. There you’re not talking about targeting individual memories in particular individuals. You’re talking about much more than a given particular memory. You’re talking about a whole historical event, which is days, if not years, of activity. But the other thing is that, to erase the memory of the Holocaust, you would be talking about having to erase everyone’s—almost everyone living’s—memory in some specific way. And that seems to me…
SPhD: I think what I meant was more like an iterative effect. OK, let’s bring it to a much simpler level, like your example. [Fictitious] Sarah Jones is raped at Stanford. And we can erase that memory for her. It still kind of goes to asking about codifying moral imperatives. Because if it becomes easy enough to just erase her memory—I hate to say this, this is a horrible thing to say, but just bear with me for ethical purposes—then does her rape still seem as painful? In a way it seems—
MD: I get what you’re saying. Why punish these young guys who are all on their way to becoming engineers and senators, and they didn’t beat her up, they just raped her. And she didn’t get pregnant, and she didn’t get any sexually transmitted diseases, and now we can remove the memory. So might this actually change our view of the moral awfulness of what they’ve done?
SPhD: Yeah, and what’s to prevent future people from committing this crime? Because part of the horror of what we go through also prevents us from hurting other people. I do think that there’s a certain level of morality that is intrinsic, if you really take religion out of the question, or how we codify our moral standards. I think that human beings, because of our ability to feel and think and process these events, and to store them as memories if we really want to talk about it like that, I think it acts as a certain impetus to not hurt other people. Well, if you can just take [Sarah] Jones to the hospital, and take away her memory—maybe the Holocaust was [too big] of an example—but on an iterative scale—
MD: I think Hurricane Katrina is a much better example.
SPhD: Hurricane Katrina is an EXCELLENT example. But what I was worrying about, on an iterative scale, is that when all of these add up, there’s a numbing effect: what makes rape [as an example], or any other terrible event for that matter, so horrible now if you can just wipe it out of your mind?
MD: Well, one thing is that it’s horrible for those people who do it. It affects other people’s safety and so on. And it isn’t usually the case that there are no physical or other kinds of consequences. But the other thing is that putting the question this way suggests that if we open the door to the cases we started with, the rape victim and the Iraq veteran, both of whom are having symptoms, say, of PTSD, then it will be available to everyone who has ANY kind of painful memory. And I don’t think that follows. I’m also not sure that I share your intuition that the main thing that keeps us from hurting each other is our memory of ourselves being hurt.
SPhD: But there are consequences. And so take, for example, the Iraq veteran. War is horrible. It’s absolutely awful. And if you erase those memories from them [so easily], in a way it is a bit of a numbing down.
MD: Well that’s true. I mean, if you could simply send people off to war and they come back and not remember the two years at all, or however many years they were serving. On the other hand, then part of the appeal of military life, the kind of experience, the good things you gain from those experiences, it would also, I take it, be gone. In short, I’m not sure that you could eliminate something that—I mean, I’m not sure how, scientifically, they locate these memories and how they target them.
SPhD: That’s why I share your concern.
MD: In mice, you administer a shock, they learn to avoid a particular place on the floor of the cage. And then you eradicate that memory. But in order to eradicate that memory, you have to be able to target it. Now, with the mice, it may not matter if we’re targeting the rest of their memory. I mean, do we know, for example, that in these mice experiments, that the only thing the mice have lost is the memory of how to avoid the shock? I mean, they might have also lost where to find their water and where to find their food, whereas with human beings, we’re going to have to be much more fine-grained. We want to lose the memory of, say, the rape, but we don’t want to lose the memory of who we are or who our friends were when we left the bar.
SPhD: And that is the collateral. It could be that we can just never get the technology perfect. There’s a risk with everything we do medically. And so that’s why I think some of the issues you bring forth are really important. And you know, we’ve been talking about these horrific examples, and you can see why there’s a definite yin and yang to it, in the sense that you can really help people who are suffering, but it compels you to ask these very fundamental questions of how you erase these memories and what the collateral is.
MD: Well, I think you’re right, because there are different kinds of memory. I mean, there’s memory of an event, like that I had dinner last night with such-and-such. But then, I take it there’s memory of a more general kind of experience. But personally, I do think that the real issues concern how you would establish the effectiveness of removing specific memories, and how you would do this safely. My own guess is that if we were able to do this at all, that it would be used very sparingly. Because the risks would be really significant. I mean, would you want somebody to put a probe into your brain to try to identify the correct memory? Well, maybe if you couldn’t sleep and you couldn’t eat and you couldn’t study, then you would be willing to risk that. But if it’s for something more moderate than that, you know the boyfriend or the girlfriend you don’t want to remember anymore—
SPhD: Well, that actually, you’re leading into my next question really well, which is given the assumption, and I think it’s a very fair assumption, that we’re never going to have 100% effectiveness with anything, and the human mind is so complex, how are you ever going to be able to pinpoint a perfect technology—
MD: Well, but it doesn’t have to be perfect. Most of what we do in medicine isn’t perfect. It just has to be good enough.
SPhD: Exactly. Good enough. Let’s say it’s good enough. How do you determine, then, that a certain memory is painful enough to warrant erasing? If you’re the physician, or the psychologist working with the physician, how do you assess that? Where is that line drawn in the sand? And I think that’s another really important question to ask.
MD: But from an ethical point of view, that wouldn’t be up to the physician. I mean the physician or the psychologist, I take it, would say to the patient, “You’re clearly suffering from a severe form of PTSD. You’ve not responded to our standard-of-care strategies. We do have something new that’s available. It has higher risks, but since these other things haven’t worked, you might want to consider it.” And then it would really be up to the patient who would be informed of the risks and benefits, and then you’d have to do informed consent, and all of that kind of thing beforehand, just like with any other surgery.
SPhD: That’s a really tough thing with informed consent. That’s actually something that Dr. Wagner talked about a lot: ultimately, it REALLY comes down to the patient. And to me, the hard part is grasping how, in a state of suffering, you could ever fully comprehend the idea that all of your memories to date might be erased.
MD: But they wouldn’t be doing THAT. They could never. I mean, if that was the program, nobody would sign on for it. I think the only discussion about memory erasure technology has been targeting specific memories. Because if you were wiping somebody’s memory, I mean what is that show—Dollhouse—where they DO wipe people’s memories and so on, no one would agree. So no, I think this is something that’s [potentially] used to target specifically distressing memories. Of course the other question might be, and it might be interesting as a background for what it is you’re writing or posting, and that is, what is the state of PTSD research and therapy now? Because from what I understand, there are reasonably effective desensitizing or de-briefing strategies that people are developing for PTSD. And so, I don’t need you to put a probe in my brain if there are other ways we can de-intensify my memory and my suffering. And I think there are strategies. I think there are various behavioral kinds of things of bringing up memories in various sorts of ways that are effective. So I think it would be interesting to look at the risks and benefits of memory erasure in comparison to what else the neurosciences are allowing us to understand and do.
SPhD: Absolutely! I agree with you. I heard you earlier use the phrase “cognitive enhancement”, which is another really interesting aspect of this. The article by Dr. Anjan Chatterjee, over at the University of Pennsylvania Ethics Center, was really interesting. He really seemed to take the stance that this is no different than, and I don’t want to misquote him, but based on his article he really likened it to, athletes taking steroids to enhance performance. And out of that article comes this very important question: does this kind of psychotropic evolution present an unfair advantage to those people for whom it’s either affordable or accessible?
MD: Well, the answer to that is yes, and yes. But in that sense, cognitive enhancement or any of these enhancements are no different from almost everything else we’re doing in medicine. There’s very little in the United States medical system that isn’t equally, if not more, unfair. I mean, I don’t know if you’re familiar with this, but apparently there’s literature suggesting that if you look at soldiers in the same theater of operations, those who are officers, as opposed to enlisted, actually have lower rates of PTSD. And I think there’s some suggestion that they have perhaps more resources for self-understanding or managing stress or what have you. So it’s not just money, there are other things—education, psychological resilience, how people were brought up, what kind of security they had as children, and so on. So yes, I think it’s very unfair, but again, I don’t think that’s saying anything specific about cognitive enhancement, because almost all medical care in the United States is unfair in that respect. Unless…well…
SPhD: Well, unless you have access to it, and in our country unless you have insurance, which is another—I mean, it just goes back to the idea of inequity in Western society, and even within subsets of it. But I thought that particular question was an interesting one if you look at it at the meta level, 10, 15, 20, 50 years down the line, which is why I was asking some of those questions about the long-term consequences of just being able to say, “Poof! Bad memory gone!” There is a sort of evolution of the mind there, in terms of personality, in terms of the consequences of bad things, in terms of how we relate to each other, of being able to bond with another human being over a tragedy as opposed to saying, “Doctor, doctor, make it go away.”
MD: Well, I mean I think if you’re really going to pursue this, then you need to distinguish between shared experience and empathy. Sometimes we commiserate with another because we in fact have experienced the same things. Somebody tells you that they’ve had a heart attack, and you’ve had one, so you know exactly what that’s like. But the other case is where somebody loses the use of their legs and we have our legs and we have no idea what that’s like but we’re able to empathize anyway through the power of the imagination. And I think a lot of what you’re talking about in terms of moral connection and community and how we bond to each other, I don’t think depends upon direct shared experiences of actual events. I think a lot of it comes from watching things, from literature, from storytelling—
SPhD: And I think 9/11 was such a great example of that. There were people who were thousands of miles away from 9/11 and it still brought communities together.
MD: Exactly. That’s a very good example. And the same thing with Katrina, although maybe less so. I think 9/11 really united the country in a way that the slogan “We’re all New Yorkers” exemplified. That’s exactly the kind of thing I’m talking about.
SPhD: And that’s sort of uplifting to think about. Because in researching this topic, it did get me thinking that, gosh, there’s a nightmare scenario of living in this numb world of erased memories, and, I don’t know, it’s kind of a very futuristic, dark, sci-fi way of looking at it. And maybe that’s my cynical nature, but it is good to know, I think, that there is this delineation, and that in a way we could never become [completely] numb. There is something in the human existence such that even if you were to erase a memory, you can’t wipe away people’s ability to come together, to have this shared experience. And that is wonderful, I think.
I’d like to give a sincere and heartfelt thanks to Dr. Peter Wagner and Dr. Mary Devereaux for the engaging discussion and now open the floor to you, faithful readers. Feel free to comment or give your own opinion on what we’ve been discussing, or bring forth your own ethical concerns!