History abounds with examples of unsung science heroes: researchers and visionaries whose tireless efforts led to enormous breakthroughs and advances, often without credit or lasting widespread esteem. This is particularly true for women and minorities, who have historically been under-represented in STEM fields. English mathematician Ada Lovelace is broadly considered the first great tech and computing visionary; in the mid-1800s, she wrote what is regarded as the first computer program, an algorithm for Charles Babbage’s proposed Analytical Engine. Physical chemist Dr. Rosalind Franklin performed the essential X-ray crystallography work that ultimately revealed the double-helix shape of DNA (her Photograph 51 is one of the most important images in the history of science). Her work was shown (without her permission) to the rival Cambridge duo of Watson and Crick, who used the indispensable information to elucidate and publish the molecular structure of DNA, for which they would win a Nobel Prize. Dr. Percy Julian, the grandson of slaves and the first African-American chemist elected to the National Academy of Sciences, ingeniously pioneered the synthesis of hormones and other medicinal compounds from plants and soybeans. The new movie Hidden Figures, based on the exhaustively researched book by Margot Lee Shetterly, tells the story of three such hitherto obscure heroes: Katherine Johnson, Dorothy Vaughan and Mary Jackson, standouts in a cohort of African-American mathematicians who helped NASA launch key missions during the tense Cold War “space race” of the 1960s. More importantly, Hidden Figures is a significant prototype for purpose-driven popular science communication: a narrative and a vehicle for integrated multimedia platforms to encourage STEM diversity and scientific achievement.
The participation of women in astrophysics, space exploration and aeronautics goes back to the 1800s at the Harvard College Observatory, as chronicled by Dava Sobel in The Glass Universe, a fitting companion to Hidden Figures. These women, every bit as intellectually capable and scientifically curious as their male counterparts, took the only opportunity afforded to them: working as human “computers,” literally calculating, measuring and analyzing the observations used to classify our universe. By the 1930s, the National Advisory Committee for Aeronautics (NACA), a precursor agency that would be subsumed by the creation of NASA in 1958, had hired five of these female computers for its Langley aeronautical laboratory in Hampton, Virginia. During World War II, with the expansion of the female workforce and the war fueling innovation for better planes, NACA began hiring college-educated African-American women as computers. They were segregated to the west side of the Langley campus (and referred to as the “West Computers”), and were required to use separate bathroom and dining facilities, all while being paid less than their white counterparts for equal work. Many of these daily indignities are chronicled in Hidden Figures. By the 1960s, the space program at NASA was defined by the two biggest sociopolitical events of the era: the Cold War and the Civil Rights Movement. Embroiled in an anxious race with Soviet cosmonauts to launch a man into orbit (and eventually, to the Moon), NASA needed to recruit the brightest minds available to invent the seemingly impossible math that would make the mission possible. Katherine Goble (later Johnson) was one of those minds.
Katherine Johnson (portrayed by Taraji P. Henson) was a math prodigy. A high school freshman by the time she was 10 years old, Johnson followed her fascination with numbers to a teaching position and, eventually, to a job as a human computer at NASA’s Langley facility. She was hand-picked to assist the Space Task Group, led in the movie by Al Harrison (Kevin Costner), a fictionalized amalgamation of three directors Johnson worked with during her time at NASA, and had to navigate institutionalized racism, sexism and antagonistic collaborators along the way. Johnson would go on to calculate the trajectories that sent both Alan Shepard and John Glenn into space, as well as key data for the Apollo Moon landing. Supporting Johnson are her good friends and fellow NASA colleagues Dorothy Vaughan (Octavia Spencer) and Mary Jackson (Janelle Monáe). Vaughan was a NASA pioneer in her own right, becoming the agency’s first black computing section leader and an expert in IBM FORTRAN programming. Jackson became NASA’s first black female engineer, obtaining special permission to take advanced math courses at a segregated school.
Katherine Johnson’s legacy in science, mathematics and civil rights cannot be overstated. Current NASA chief Charles Bolden thoughtfully paid tribute to the iconic role model in Vanity Fair. “She is a human computer, indeed, but one with a quick wit, a quiet ambition, and a confidence in her talents that rose above her era and her surroundings,” he writes. The Langley NASA facility where she broke barriers and pioneered discovery honored Johnson by dedicating a building in her name last May. In late 2015, Johnson was awarded the Presidential Medal of Freedom by President Barack Obama.
Featured prominently in Hidden Figures, technology giant IBM has had a long-standing relationship with NASA ever since the IBM 7090 became the first mainframe used for flight simulations; the iconic System/360 mainframe later supported the Apollo Moon landing. Although IBM mainframes are no longer used for mathematical calculations at NASA, the two organizations are now partnering on artificial intelligence for space missions. IBM Watson can sift through thousands of pages of information to deliver critical data to pilots in real time, and could even monitor and diagnose astronauts’ health as a virtual intelligent agent.
More importantly, IBM is taking a leadership role in developing STEM outreach and education programs, with a continued commitment to diversifying the technology workforce for the demands of the 21st century. Fifty years after Katherine Johnson’s monumental feats at NASA, the K-12 achievement gap between white and black students has barely budged. Furthermore, a 2015 STEM index analysis shows that even as the number of STEM-related degrees and jobs proliferates, deeply entrenched gaps between men and women, and even wider gaps between whites and minorities, remain in the attainment of STEM degrees. This is exacerbated in the STEM workforce, where diversity has largely stalled and women and minorities remain deeply under-represented. And yet, technology companies will need to fill 650,000 new STEM jobs (the fastest-growing sector) by 2018, with the highest overall demand for so-called “middle-skill” jobs that may require only technical or community college preparation. Launched in 2011 by IBM, in collaboration with the New York City Department of Education, P-TECH is an ambitious six-year education model, aimed predominantly at minority students, that combines college courses, internships and mentoring with a four-year high school education. Armed with a combined high school diploma and associate’s degree, these students are immediately ready to fill high-tech, diverse workforce needs. Indeed, IBM’s original P-TECH school in Brooklyn has eclipsed national two-year community college graduation rates, and the company has committed to widely expanding the program in the coming years. Technology companies becoming stakeholders in, and even innovators of, educational models and partnerships can have profound impacts on innovation, economic growth and the reduction of poverty through opportunity.
Dovetailing with the release of Hidden Figures, IBM has also partnered with The New York Times to launch the newspaper’s first augmented reality experience. Combining advocacy, outreach and data mining, the free, downloadable app “Outthink Hidden” pairs the inspirational stories portrayed in Hidden Figures with digitally interactive content to create a Pokémon Go-style nationwide hunt featuring STEM figures, historical leaders, places and areas of research across the country. The app can be used interactively at 150 locations in 10 U.S. cities, including STEM centers (such as NASA’s Langley Research Center and Kennedy Space Center) and STEM universities, to learn not just about the three mathematicians featured in Hidden Figures but also about other diverse STEM pioneers. Coupled with the powerful, wide-reaching impact of Hollywood storytelling and a complementary book release, “Outthink Hidden” could be an important prototype for engaging young, tech-savvy students, possibly even in organized classroom environments, and promoting interest in STEM education, careers and mentorship opportunities.
There are no easy solutions for reforming STEM education or diversifying the talent pool in research labs and technology companies. But we can provide compelling narratives through movies, TV shows and, increasingly, digital content. Perhaps the first step to inspiring and cultivating the next Katherine Johnson is simply to start by telling more stories like hers.
View a trailer for Hidden Figures:
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development. Follow us on Twitter and Facebook. Subscribe to our podcast on SoundCloud or iTunes.
It has become compulsory for modern medical (or scientifically relevant) shows to rely on a team of advisors and experts for maximal technical accuracy and verisimilitude on screen. Many of these shows have become so culturally embedded that they have changed people’s perceptions and influenced policy. Even the Gates Foundation has partnered with popular television shows to embed important storyline messages pertinent to public health, HIV prevention and infectious diseases. But this was not always the case. When Neal Baer joined ER as a young writer while simultaneously a medical student, he became the first technical expert to be embedded as an official part of a production team. His subsequent canon of work has not only reshaped the integration of socially relevant issues in television content, but also ushered in an age of public health awareness in Hollywood, and outreach beyond it. Dr. Baer sat down with ScriptPhD to discuss how lessons from ER have fueled his public health efforts as a professor and founder of UCLA’s Global Media Center for Social Impact, including storytelling through public health metrics and leveraging digital technology to propel action.
Neal Baer’s passion for social outreach and for lending a voice to vulnerable and disadvantaged populations was embedded in his genetic code from a very young age. “My mother was a social activist from as long as I can remember. She worked for the ACLU for almost 45 years and she participated in and was arrested for the migrant workers’ grape boycott in the 60s. It had a true and deep impact on me that I saw her commitment to social justice. My father was a surgeon and was very committed to health care and healing. The two of them set my underlying drives and goals by their own example.” Indeed, his diverse interests and innate curiosity led Baer to study writing at Harvard and the American Film Institute and, eventually, medicine at Harvard Medical School. What might have been a professional dichotomy instead gave him the perfect niche, medical storytelling, which he parlayed into a critical role on the hit show ER.
During his seven-year run as medical advisor and writer on ER, Baer helped usher the show to indisputable influence and critical acclaim. Through the narration of important, germane storylines and the communication of health messages that educated and resonated with viewers, ER‘s authenticity caught the attention of the health care community and inspired many television successors. “It had a really profound impact on me, that people learn from television, and we should be as accurate as possible,” Baer reflects. “[Viewers] believe it’s real, because we’re trying to make it look as real as possible. We’re responsible, I think. We can’t just hide behind the façade of: it’s just entertainment.” As showrunner of Law & Order: SVU, Baer spearheaded a storyline about rape kit backlogs in New York City that led to a real-life push to clear 17,000 backlogged kits and established a foundation to help other major US cities do the same. With the help of the CDC and USC’s prestigious Norman Lear Center, Baer launched Hollywood, Health and Society, which has become an indispensable and inexhaustible source of expert information for entertainment industry professionals looking to incorporate health storylines into their projects. In 2013, Baer co-founded the Global Media Center for Social Impact at UCLA’s Fielding School of Public Health, with the aim of addressing public health issues through a combination of storytelling and traditional scientific metrics.
Soda Politics
One of Baer’s seminal accomplishments at the helm of the Global Media Center was convincing public health activist Marion Nestle to write the book Soda Politics: Taking on Big Soda (And Winning). Nestle has had a long and storied career in food and social policy work, including the seminal book Food Politics. Baer first took note of the nutritional and health impact soda was having on children in his pediatrics practice. “I was just really intrigued by the story of soda, and the power that a product can have on billions of people, and make billions of dollars, where the product is something that one can easily live without,” he says. That story, as told in Soda Politics, is a powerful indictment of soda’s deleterious contribution to the United States’ obesity crisis, environmental damage and the political exploitation of sugar producers, among other harms. More importantly, it’s an anthology of the history of dubious marketing strategies, insider lobbying power and subversive “goodwill” campaigns employed by Big Soda to broaden brand loyalty.
Even more than a public health cautionary tale, Soda Politics is a case study in the power of activism and persistent advocacy. According to a New York Times exposé, the drop in soda consumption represents the “single biggest change in the American diet in the last decade.” Nestle meticulously details the exhaustive, persistent and unyielding efforts that have collectively chipped away at the Big Soda hegemony: highly successful soda taxes that have curbed consumption and obesity rates in Mexico, public health efforts to curb soda access in schools and in advertising that specifically targets young children, and emotion-based messaging that has increased public awareness of the deleterious effects of soda and shifted preference towards healthier options, notably water. And as soda companies inevitably shift their advertising and sales strategies towards underdeveloped nations that lack access to clean water, the lessons outlined in Soda Politics, which will soon be adapted into a documentary, can be implemented on a global scale.
ActionLab Initiative
Few technological advancements have had an impact on television consumption and creation like the evolution of digital transmedia and social networking. The (fast-crumbling) traditional model of television was linear: content was produced and broadcast by a select conglomerate of powerful broadcast networks, and somewhat less powerful cable networks, for direct viewer consumption, measured by demographic ratings and advertising revenue. This model has been disrupted by web-based streaming services such as YouTube, Netflix, Hulu and Amazon, which, in conjunction with fragmented networks, may soon displace traditional TV watching altogether. At the same time, this shifting media landscape has fostered a powerful new dynamic among the public: engagement. On-demand content has not only broadened access to high-quality storytelling platforms, but also provides more diverse opportunities to tackle socially relevant issues. This is buoyed by the increased role of social media as an entertainment vector: it raises awareness of TV programs (and influences Hollywood content), but it also fosters intimate, influential and symbiotic conversation alongside the very content it promotes. Enter ActionLab.
One of the critical pillars of the Global Media Center at UCLA, ActionLab aims to bridge the gap between popular media and social change on topics of critical importance. People often find inspiration in watching a show, reading a book or even seeing an important public advertising campaign, and are compelled to act. However, they rarely have the resources to translate that desire into direct results. “We first designed ActionLab about five or six years ago, because I saw the power that the shows were having – people were inspired, but they just didn’t know what to do,” says Baer. “It’s like catching lightning in a bottle.” As a pilot program, the site will offer pragmatic, interactive steps that people can implement to change their lives, families and communities. ActionLab offers personalized campaigns centered on specific inspirational projects Baer has been involved in, such as the Soda Politics book, the If You Build It documentary and a collaboration with New York Times columnist Nicholas Kristof on his book/documentary A Path Appears. As the initiative expands, however, more entertainment and media content will be tailored towards specific issues, such as wellness, female empowerment in poor countries, eradicating poverty and community-building.
“We are story-driven animals. We collect our thoughts and our memories in story bites,” Baer notes. “We’re always going to be telling stories. We just have new tools with which to tell them and share them. And new tools where we can take the inspiration from them and ignite action.”
Baer joined ScriptPhD.com for an exclusive interview, where he discussed how his medical education and the wide-reaching impact of ER influenced his social activism, why he feels multi-media and cross-platform storytelling are critical for the future of entertainment, and his future endeavors in bridging creative platforms and social engagement.
ScriptPhD: Your path to entertainment is an unusual one – not too many Harvard Medical School graduates go on to produce and write for some of the most impactful television shows in entertainment history. Did you always have this dual interest in medicine and creative pursuits?
Neal Baer: I started out as a writer, and went to Harvard as a graduate student in sociology, [where] I started making documentary films because I wanted to make culture instead of studying it from the ivory tower. So, I got to take a documentary course, and it’s come full circle because my mentor Ed Pincus made his last film called “One Cut, One Life” recently and I was a producer, before his death from leukemia. That sent me to film school at the American Film Institute in Los Angeles as a directing fellow, which then sent me to write and direct an ABC after-school special called “Private Affairs” and to work on the show China Beach. I got cold feet [about the entertainment industry] and applied to medical school. I was always interested in medicine. My father was a surgeon, and I realized that a lot of the [television] work I was doing was medically oriented. So I went to Harvard Medical School thinking that I was going to become a pediatrician. Little did I know that my childhood friend John Wells, who had hired me on China Beach, would [also] hire me on “ER” by sending me the script, originally written by Michael Crichton in 1969, and dormant for 25 years until it was discovered in a trunk owned by Steven Spielberg. [Wells] asked me what I thought of the script and I said “It’s like my life only it’s outdated.” I gave him notes on how to update it, and I ultimately became one of the first doctor-writers on television with ER, which set that trend of having doctors on the show to bring verisimilitude.
SPhD: From the series launch in 1994 through 2000, you wrote 19 episodes and created the medical storylines for 150 episodes. This work ran parallel to your own medical education as a medical student, and subsequently an intern and resident. How did the two go hand in hand?
NB: I started writing on ER when I was still a fourth year medical student going back and forth to finish up at Harvard Medical School, and my internship at Children’s Hospital of Los Angeles over six years. And I was very passionate about bringing public health messages to the work that I was doing because I saw the impact that television can have on the audience, particularly the large numbers of people that were watching ER back then.
I was Noah Wyle’s character Dr. Carter. He was a third year [medical student], I was a fourth year. So I was a little bit ahead of him, and I was able to capture what it was like to be the low person on the totem pole and to reflect his struggles through many of the things my friends and I had gone through or were going through. Some of my favorite episodes we did were really drawn on things that actually happened. A friend of mine was sequestered away from the operating table while the surgeons were playing a game of world capitals. And she knew the capital of Zaire, when no one else did, so she was allowed to join the operating table [because of that]. So [we used] that same circumstance for Dr. Carter in an episode. You wouldn’t know those things, you had to live through them, and that was the freshness that ER brought. It wasn’t what you think doctors do or how they act but truly what goes on in real life, and a lot of that came from my experience.
SPhD: Do you feel like the storylines that you were creating for the show were germane both to things happening socially as well as reflective of the experience of a young doctor in that time period?
NB: Absolutely. We talked to opinion leaders, we had story nights with doctors, residents and nurses. I would talk to people like Donna Shalala, who was the head of the Department of Health and Human Services. I asked the then-head of the National Institutes of Health Harold Varmus, a Nobel Prize winner, “What topics should we do?” And he said “Teen alcohol abuse.” So that is when we had Emile Hirsch do a two-episode arc directly because of that advice. Donna Shalala suggested doing a story about elderly patients sharing prescriptions because they couldn’t afford them and the terrible outcomes that were happening. So we were really able to draw on opinion leaders and also what people were dealing with [at that time] in society: PPOs, all the new things that were happening with medical care in the country, and then on an individual basis, we were struggling with new genetics, new tests, we were the first show to have a lead character who was HIV-positive, and the triple cocktail therapy didn’t even come out until 1996. So we were able to be path-breaking in terms of digging into social issues that had medical relevance. We had seen that on other shows, but not to the extent that ER delved in.
SPhD: One of the legacies of a show like ER is how ahead of its time it was with many prescient storylines and issues that it tackled that are relevant to this very day. Are there any that you look back on that stand out to you in that regard as favorites?
NB: I really like the storyline we did with Julianna Margulies’s character [Nurse Carole Hathaway] when she opened up a clinic in the ER to deal with health issues that just weren’t being addressed, like cervical cancer in Southeast Asian populations and dealing with gaps in care that existed, particularly for poor people in this country, and they still do. Emergency rooms [treating people] is not the best way efficiently, economically or really humanely. It’s great to have good ERs, but that’s not where good preventative health care starts. The ethical dilemmas that we raised in an episode I wrote called “Who’s Happy Now?” involving George Clooney’s character [Dr. Doug Ross] treating a minor child who had advanced cystic fibrosis and wanted to be allowed to die. That issue has come up over and over again and there’s a very different view now about letting young people decide their own fate if they have the cognitive ability, as opposed to doing whatever their parents want done.
SPhD: You’ve held an appointment at UCLA’s Fielding School of Public Health since 2013 as one of the co-founders of the Global Media Center for Social Impact, with extremely lofty ambitions at the intersection of entertainment, social outreach, technology and public health. Tell me a bit about that partnership and how you decided on an academic appointment at this time in your life.
NB: Well, I’m still doing TV. I just finished a three-year stint running the CBS series Under the Dome, which was a small-scale parable about global climate change. While I was doing that, I took this adjunct professorship at UCLA because I felt that there’s a lot we don’t know about how people take stories, learn them, use them, and I wanted to understand that more. I wanted to have a base to do more work in this area of understanding how storytelling can promote public health, both domestically and globally. Our mission at the Global Media Center for Social Impact (GMI) is to draw on both traditional storytelling modes like film, documentaries, music, and innovative or ‘new’ or transmedia approaches like Facebook, Twitter, Instagram, graphic novels and even cell phones to promote and tell stories that engage and inspire people and that can make a difference in their lives.
SPhD: One of the first major initiatives is a very important book “Soda Politics” by the public health food expert Dr. Marion Nestle. You were actually partially responsible for convincing her to write this book. Why this topic and why is it so critical right now?
NB: I went to Marion Nestle because I was convinced after having read a number of studies, particularly those by Kelly Brownell (who is now Dean of Public Policy at Duke University), that sugar-sweetened sodas are the number one contributor to childhood obesity. Just [recently], in the New York Times, they chronicled a new study that showed that obesity is still on the rise. That entails horrible costs, both emotionally and physically for individuals across this country. It’s very expensive to treat Type II diabetes, and it has terrible consequences – retinal blindness, kidney failure and heart disease. So, I was very concerned about this issue, seeing large numbers of kids coming into Children’s Hospital with Type II diabetes when I was a resident, which we had never seen before. I told Marion Nestle about my concerns because I know she’s always been an advocate for reducing the intake and consumption of soda, so I got [her] research funds from the Robert Wood Johnson Foundation. What’s really interesting is the data on soda consumption really aren’t readily available and you have to pay for it. The USDA used to provide it, but advocates for the soda industry pushed to not make that data easily available. I [also] helped design an e-book, with over 50 actionable steps that you can take to take soda out of the home, schools and community.
SPhD: How has social media engagement via our phones and computers, directly alongside television watching, changed the metric for viewing popularity, content development and key outreach issues that you’re tackling with your ActionLab initiative?
NB: ActionLab does what we weren’t [previously] able to do, because we have the web platforms now to direct people in multi-directional ways. When I first started on ER in 1994, television and media were uni-directional endeavors. We provided the content, and the viewer consumed it. Now, with Twitter [as an example], we’ve moved from a uni-directional to a multi-directional approach. People are responding, we are responding back to them as the content creators, they’re giving us ideas, they’re telling us what they like, what they don’t like, what works, what doesn’t. And it’s reshaping the content, so it’s this very dynamic process now that didn’t exist in the past. We were really sheltered from public opinion. Now, public opinion can drive what we do and we have to be very careful to put on some filters, because we can’t listen to every single thing that is said, of course. But we do look at what people are saying and we do connect with them in ways they never had access to us before.
This multi-directional approach is not just actors and writers and directors discussing their work on social media, but it’s using all of the tools of the internet to build a new way of storytelling. Now, anyone can do their own shows and it’s very inexpensive. There are all kinds of YouTube shows on now that are watched by many people. It’s a kind of Wild West, where anything goes and I think that’s very exciting. It’s changed the whole world of how we consume media. I [wrote] an episode of ER 20 years ago with George Clooney, where he saved a young boy in a water tunnel, that was watched by 48 million people at once. One out of six Americans. That will never happen again. So, we had a different kind of impact. But now, the media landscape is fractured, and we don’t have anywhere near that kind of audience, and we never will again. It’s a much more democratic and open world than it used to be, and I don’t even think we know what the repercussions of that will be.
SPhD: If you had a wish list, what are some other issues or global obstacles that you’d love to see the entertainment world (and media) tackle more than they do?
NB: In terms of specifics, we really need to talk about civic engagement, and we need to tell stories about how [it] can change the world, not only in micro-ways, say like Habitat For Humanity or programs that make us feel better when we do something to help others, but in a macro policy-driven way, like asking how we are going to provide compulsory secondary education around the world, particularly for girls. How do we instate that? How do we take on child marriage and encourage countries, maybe even through economic boycotts, to raise the age of child marriage, a problem that we know places girls in terrible situations, often with no chance of ever making a good living, much less getting out of poverty. So, we need to think both macroly and microly in terms of our storytelling. We need to think about how to use the internet and crowdsourcing for public policy and social change. How can we amalgamate individuals to support [these issues]? We certainly have the tools now, with Facebook, Twitter and Instagram, and our friends and social networks, to spread the word – and a very good way to spread the word is through short stories.
SPhD: You’ve enjoyed a storied career, and achieved the pinnacle of success in two very competitive and difficult industries. What drives Dr. Neal Baer now, at this stage of your life?
NB: Well, I keep thinking about new and innovative ways to use trans media. How do I use graphic novels in Africa to tell the story of HIV and prevention? How do we use cell phones to tell very short stories that can motivate people to go and get tested? Innovative financing to pay for very expensive drugs around the world? So, I’m very much interested in how storytelling gets the word out, because stories don’t just change minds, they change hearts. Stories tickle our emotions in ways that I think we don’t fully understand yet. And I really want to learn more about that. I want to know about what I call the “epigenetics of storytelling.” I’m writing a book about that, looking into research that [uncovers] how stories actually change our brain and how do we use that knowledge to tell better stories.
Neal Baer, MD is a television writer and producer behind the hit shows China Beach, ER, Law & Order: SVU, Under the Dome, and others. He is a graduate of Harvard Medical School and completed a pediatrics internship at Children’s Hospital Los Angeles. A former co-chair of Hollywood, Health and Society at USC’s Norman Lear Center, Dr. Baer is a co-founder of the Global Media Center for Social Impact at the UCLA Fielding School of Public Health.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development. Follow us on Twitter and Facebook. Subscribe to our podcast on SoundCloud or iTunes.
Engineering has an unfortunate image problem. With a seemingly endless array of socioeconomic, technological and large-scale problems to address, and with STEM fields set to comprise the most lucrative 21st Century careers, studying engineering should be a no-brainer. Unfortunately, attracting a wide array of students — or even appreciating engineers as cool — remains difficult, most noticeably among women. When Google Research found out that the #2 reason girls avoid studying STEM fields is perception and stereotypes on screen, they decided to work with Hollywood to change that. Recently, they partnered with the National Academy of Sciences and USC’s prestigious Viterbi School of Engineering to proactively seek out ideas for creating a television program that would showcase a female engineering hero to inspire a new generation of female engineers. The project, entitled “The Next MacGyver,” came to fruition last week in Los Angeles at a star-studded event. ScriptPhD.com was extremely fortunate to receive an invite and have the opportunity to interact with the leaders, scientists and Hollywood representatives that collaborated to make it all possible. Read our full comprehensive coverage below.
“We are in the most exciting era of engineering,” proclaims Yannis C. Yortsos, the Dean of USC’s Engineering School. “I look at engineering technology as leveraging phenomena for useful purposes.” These purposes have recently been unified as the 14 Grand Challenges of Engineering — everything from securing cyberspace to reverse engineering the brain to solving our environmental catastrophes to ensuring global access to food and water. These are monumental problems, and they will require a full-scale workforce to solve. It’s no coincidence that STEM jobs are set to grow by 17% by 2024, more than any other sector. Recognizing this opportunity, the US Department of Education (in conjunction with the National Science and Technology Council) launched a five-year strategic plan to prioritize STEM education and outreach in all communities.
Despite this golden age, where the possibilities for STEM innovation seem as vast as the challenges facing our world, there is a disconnect in maximizing the full array of talent for the next generation of engineers. There is a noticeable paucity of women and minority students studying STEM fields, with women earning just 18-20% of all STEM bachelor’s degrees, even though more students are earning STEM degrees than ever before. Madeline Di Nonno, CEO of the Geena Davis Institute on Gender in Media and a judge at the Next MacGyver competition, attributes much of the disinterest to a consistent lack of female STEM portrayals in television and film. “It’s a 15:1 ratio of male to female characters for just STEM alone. And most of the science related positions are in life sciences. So we’re not showing females in computer science or mathematics, which is where all the jobs are going to be.” Media portrayal of women (and by proxy minorities) in science remains shallow, biased and appearance-focused (as profiled in-depth by Scientific American). Why does this matter? There is a direct correlation between positive media portrayal and STEM career participation.
It has been 30 years since the debut of television’s MacGyver, an action adventure series about clever agent Angus MacGyver, who works to right the wrongs of the world through innovation. Rather than using a conventional weapon, MacGyver thwarts enemies with his vast array of scientific knowledge — sometimes possessing no more than a paper clip, a box of matches and a roll of duct tape. Creator Lee Zlotoff notes that in those 30 years, the show has run continuously around the world, fueled in part by a love of MacGyver’s endless ingenuity. Zlotoff noted the uncanny parallels between MacGyver’s thought process and the scientific method: “You look at what you have and you figure out, how do I turn what I have into what I need?” Three decades later, in the spirit of the show, the USC Viterbi School of Engineering partnered with the National Academy of Sciences and the MacGyver Foundation to search for a new MacGyver: a television show centered around a female protagonist engineer who must solve problems, create new opportunities and, most importantly, save the day. The initiative started back in 2008 at the National Academy of Sciences, aiming to rebrand engineering entirely, away from geeks and techno-gadget builders and towards the much bigger impact that engineering technology has on the world – solving big, global societal problems. USC’s Yortsos says that this big picture resonates distinctly with female students who would otherwise be reluctant to choose engineering as a career. Out of thousands of submitted TV show ideas, twelve were chosen as finalists, each given five minutes to pitch to a distinguished panel of judges comprising writers, producers, CEOs and successful showrunners. Five winners will have the opportunity to pair with an established Hollywood mentor to write a pilot and showcase it for potential television production.
If The Next MacGyver feels far-reaching in scope, it’s because it has aspirations that stretch beyond simply getting a clever TV show on air. No less than the White House lent its support to the initiative, with an encouraging video from Chief Technology Officer Megan Smith reiterating the importance of STEM to the future of the 21st Century workforce. As Al Romig of the National Academy of Engineering noted, the great 1950s and 1960s era of engineering growth was fueled by intense competition with the USSR; today, we need to be unified and driven by the 14 grand challenges of engineering and their offshoots. Part of that will include diversifying the engineering workforce and attracting new talent with fresh ideas. As I noted in a 2013 TEDx talk, television and film carry tremendous power and influence to fuel science passion. And the desire to marry engineering and television extends as far back as 1992, when Lockheed Martin’s Norman Augustine proposed a high-profile show named LA Engineer. Since then, he has remained a passionate advocate for elevating engineers to the highest ranks of decision-making, governance and celebrity status. Andrew Viterbi, namesake of USC’s engineering school, echoed this imperative to elevate engineering to “celebrity status” in a 2012 Forbes editorial. “To me, the stakes seem sufficiently high,” said Adam Smith, Senior Manager of Communications and Marketing at USC’s Viterbi School of Engineering. “If you believe that we have real challenges in this country, whether it is cybersecurity, the drought here in California, making cheaper, more efficient solar energy, whatever it may be, if you believe that we can get by with half the talent in this country, that is great. But I believe, and the School believes, that we need a full creative potential to be tackling these big problems.”
So how does Hollywood feel about this movement and the realistic goal of increasing its array of STEM content? “From Script To Screen,” a panel discussion featuring leaders in the entertainment industry, gave equal parts cautionary advice and hopeful encouragement to aspiring writers and producers. Ann Merchant, the director of the Los Angeles-based Science And Entertainment Exchange, an offshoot of the National Academy of Sciences that connects filmmakers and writers with scientific expertise for accuracy, sees the biggest obstacle facing television depictions of scientists and engineers as a connectivity problem: writers know so few scientists and engineers that they incorporate stereotypes in their writing or eschew the content altogether. Ann Blanchard, of the Creative Artists Agency, largely concurred, noting that writers are often so right-brain focused that they naturally gravitate towards telling creative stories about creative people. But Danielle Feinberg, a computer engineer and lighting director for Oscar-winning Pixar animated films, sees these misconceptions about scientists and what they do as an illusion: when people find out that they can combine these careers with their natural passions to solve real problems, science becomes both possible and exciting. ABC Family’s Marci Cooperstein, who oversaw and developed the crime drama Stitchers, centered around engineers and neuroscientists, remains optimistic about keeping the doors open and encouraging these types of stories, because the demand for new and exciting content is very real. With 42 scripted networks alone, and many more independent channels, she feels we should celebrate the diversity of science and medical programming that already exists and build from it. Put together a room of writers and engineers, and they will find a way to tell cool stories.
At the end of the day, Hollywood is in the business of entertaining: telling stories that reflect the contemporary zeitgeist and filling a demand for the subjects people are most passionate about. The challenge isn’t wanting it, but finding and showcasing it. The panel’s universal advice was ultimately to tell exciting new stories centered around science characters that feel new, flawed and interesting. Be innovative, think about why people are going to care about this character and storyline enough to come back each week for more, and incorporate a central engine that will drive the show over several seasons. “Story does trump science,” Merchant pointed out. “But science does drive story.”
The twelve pitches represented a diverse array of procedural, adventure and sci-fi plots, from writers whose backgrounds spanned traditional screenwriting and scientific training. The five winners, as chosen by the judges and mentors, were as follows:
Miranda Sajdak — Riveting
Sajdak is an accomplished film and TV writer/producer and founder of screenwriting service company ScriptChix. She proposed a World War II-era adventure drama centered around engineer Junie Duncan, who joins the military engineer corps after her fiancé is tragically killed on the front line. Her ingenuity in tackling engineering and technology development ultimately helps win the war.
Beth Keser, PhD — Rule 702
Dr. Keser, unique among the winners for being the only pure scientist, is a global leader in the semiconductor industry and leads a technology initiative at San Diego’s Qualcomm. She proposed a crime procedural centered around Mimi, a brilliant scientist with dual PhDs who forgoes corporate life to become a traveling expert witness, investigating and uncovering the most complex criminal cases in the world.
Jayde Lovell — SECs (Science And Engineering Clubs)
Jayde, a rising STEM communication star, launched the YouTube pop science network “Did Someone Say Science?” Her show proposal is a fun fish-out-of-water drama about 15-year-old Emily, a pretty, popular and privileged high school student. After accidentally burning down her high school gym, she avoids expulsion only by joining the dreaded, geeky SECs club and, in turn, helping them win an engineering competition while learning to be cool.
Craig Motlong — Q Branch
Craig is a USC-trained MFA screenwriter and now a creative director at a Seattle advertising agency. His spy action thriller centers on mad scientist Skyler Towne, an engineer leading a corps of researchers on the fringes of the CIA’s “Q Branch,” where they develop and test the gadgets that help agents stay three steps ahead of the world’s biggest criminals.
Shanee Edwards — Ada and the Machine
Shanee, an award-winning screenwriter, is the film reviewer at SheKnows.com and the host/producer of the web series She Blinded Me With Science. A fan of historical scientific figures, Shanee proposed a fictionalized series about real-life 1800s mathematician Ada Lovelace, famous for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. In this drama, Ada works with Babbage to help Scotland Yard fight opponents of the Industrial Revolution, exploring themes of technology ethics that remain relevant today.
Craig Motlong, one of five ultimate winners, and one of the few finalists with absolutely no science background, spent several months researching his concept with engineers and CIA experts to see how theoretical technology might be incorporated and utilized in a modern criminal lab. He told me he was equal parts grateful and overwhelmed. “It’s an amazing group of pitches, and seeing everyone pitch their ideas today made me fall in love with each one of them a little bit, so I know it’s gotta be hard to choose from them.”
Whether inspired by social change, pragmatic inquisitiveness or pure scientific ambition, this seminal event was ultimately both a cornerstone for strengthening a growing science/entertainment alliance and a deeply personal quest for all involved. “I don’t know if I was as wrapped up in these issues until I had kids,” said USC’s Smith. “I’ve got two little girls, and I tried thinking about what kind of shows [depicting female science protagonists] I should have them watch. There’s not a lot that I feel really good sharing with them, once you start scanning through the channels.” Motlong, the only male winner, has been profoundly influenced by the strong women around him, including a beloved USC screenwriting instructor. “My grandmother worked during the Depression and had to quit because her husband got a job. My mom had like a couple of options available to her in terms of career. My wife wanted to be a genetic engineer when she was little and can’t remember why she stopped,” he reflected. “So I feel like we are losing generations of talent here, and I’m on the side of the angels, I hope.” NAS’s Ann Merchant sees an overarching institutional vision for achieving the STEM goals set forth by this competition and influencing the next generation of scientists. “It’s why the National Academy of Sciences and Engineering as an institution has a program [The Science and Entertainment Exchange] based out of Los Angeles, because it is much more than this [single competition].”
Indeed, The Next MacGyver event, while glitzy and glamorous in a way befitting the entertainment industry, seemed to have succeeded wildly beyond its sponsors’ collective expectations. It was ambitious, sweeping and the first of its kind, requiring the collaboration of many academic, industry and entertainment alliances. But it might have the power to influence and transform an entire pool of STEM participants, the way ER and CSI sparked renewed interest in emergency medicine and forensic science, respectively. If not this group of pitched shows, then the next. If not this group of writers, then the ones who come after them. Searching for a new MacGyver transcends finding an engineering hero for a new age with new, complex problems. It’s about being a catalyst for meaningful academic change and creative inspiration. Or, at the very least, opening up Hollywood’s eyes and time slots. Zlotoff, whose MacGyver Foundation supported the event and continually seeks to promote innovation and peaceful change through education opportunities, recognized this in his powerful closing remarks. “The important thing about this competition is that we had this competition. The bell got rung. Women need to be a part of the solution to fixing the problems on this planet. [By recognizing that], we’ve succeeded. We’ve already won.”
The Next MacGyver event was held at the Paley Center For Media in Beverly Hills, CA on July 28, 2015. Follow all of the competition information on their site. Watch a full recap of the event on the Paley Center YouTube Channel.
On February 28, 1998, the revered British medical journal The Lancet published a brief paper by then-high-profile but controversial gastroenterologist Andrew Wakefield that claimed to link the MMR (measles, mumps and rubella) vaccine with regressive autism and inflammation of the colon in a small case series of children. A subsequent paper published four years later claimed to have isolated the strain of attenuated measles virus used in the MMR vaccine in the colons of autistic children through polymerase chain reaction (PCR) amplification. The effect on vaccination rates in the UK was immediate, with MMR vaccinations reaching a record low in 2003/2004, and parts of London losing herd immunity with vaccination rates of 62%. Fifteen American states currently have immunization rates below the recommended 90% threshold. Wakefield was eventually exposed as a scientific fraud and an opportunist trying to cash in on people’s fears with ‘alternative clinics’ and a ‘safe’ vaccine of his own, planned before the Lancet paper was ever published. Even the 12 children in his study turned out to have been selectively referred by parents convinced of a link between the MMR vaccine and their children’s autism. The original Lancet paper was retracted and Wakefield was stripped of his medical license. By that point, irreparable damage had been done that may take decades to reverse.
How could a single fraudulent scientific paper, one that the medical community could neither replicate nor validate, cause such widespread panic? How could it influence legions of otherwise rational parents not to vaccinate their children against devastating, preventable diseases, at a cost of millions of dollars in treatment and, worse yet, unnecessary child fatalities? And why, despite all evidence to the contrary, have people remained adamant in their belief that vaccines harm otherwise healthy children, whether through autism or other insidious side effects? In his brilliant, timely, meticulously researched book The Panic Virus, author Seth Mnookin dissects the aggregate effect of media coverage, echo-chamber information exchange, cognitive biases and the desperate anguish of autism parents as fuel for the recent anti-vaccine movement. In doing so, he retraces the triumphs and missteps in the history of vaccines, examines the broader social impact of rejecting the scientific method, and suggests ways that this utterly preventable public health crisis can be avoided in the future. A review of The Panic Virus, an enthusiastic ScriptPhD.com Editor’s Selection, follows below.
Such fervent controversy over inoculating young children against communicable diseases might have seemed unimaginable to the pre-vaccine generations. It wasn’t long ago, Mnookin chronicles, that death and suffering at the hands of diseases like polio and smallpox were the accepted norm. In 18th Century Europe, for example, 400,000 people per year regularly died of smallpox, and it caused one third of all cases of blindness. So desperate were people to avoid the illnesses’ ravages that crude, rudimentary inoculation methods were employed, even at high risk of death, to achieve life-long immunity. A 1916 polio outbreak in New York City, with fatality rates between 20 and 25 percent, frayed nerves and public health infrastructure to the point of near-anarchy. As the disease waxed and waned in outbreaks throughout the decades that followed, distraught parents had no idea how to protect their children, who were often far more susceptible to fatality than adults. When Jonas Salk’s polio vaccine breakthrough was announced as “safe, effective and potent” on April 12, 1955, pandemonium broke out. “Air raid sirens were set off,” Mnookin writes. “Traffic lights blinked red; churches’ bells rang; grown men and women wept; schoolchildren observed a moment of silence.” Salk’s discovery was hailed as “one of the greatest events in the history of medicine.”
Mnookin doesn’t let scientists off the hook where vaccines are concerned, however, and rightfully so. Starting around World War II, as advances such as the smallpox and polio vaccines and the dawning of the Antibiotics Age began eradicating death and suffering from communicable diseases and bacterial infections, a hubris and sense of superiority crept into the scientific establishment, with dangerous consequences. Fearing the threat of biological warfare during World War II, a hastily constructed 1941 US military campaign to vaccinate all US troops against yellow fever used batches contaminated with Hepatitis B, resulting in 300,000 infections and 60 deaths. The first iteration of Salk’s polio vaccine was only 60-90% effective before being perfected and eventually replaced by the more effective Sabin vaccine. Furthermore, dozens of children who had received doses from the first batch of vaccines were paralyzed or killed by contaminated vaccines that had failed safety tests. In 1976, spurred by the death of a soldier from a flu virus that bore striking genetic similarity to the 1918 Spanish flu epidemic strain, President Gerald Ford instituted a nationwide mass vaccination initiative against a “swine flu” epidemic. Although 40 million Americans were vaccinated in three months, 500 developed symptoms of Guillain-Barré syndrome (30 died), seven times more than would normally be expected from this rare side effect of vaccination. Many people feel that the scars of the 1976 fiasco instilled a permanent distrust of the medical establishment and have haunted public health influenza immunization efforts to this day.
These black marks on the otherwise miraculous, life-saving history of vaccine development not only instilled a gradual mistrust in public health officials, but laid the groundwork for the incendiary autism-vaccine scandal. The only missing components were a proper context of panic, a snake oil salesman and a compliant media willing to spread his erroneous message.
Enter the autism epidemic and Andrew Wakefield’s hoax. Because this seminal event had such a profound effect on the formation and proliferation of the current anti-vaccine movement, Mnookin chronicles it in far greater detail than our introduction above. From the precursor incidents that primed the public for deception, to Wakefield’s shoddy methodology, to the naive medical community that took him at his word, Mnookin weaves through this case with well-researched scientific facts, interesting interviews and logic. A large chunk of the book is ultimately devoted to the psychology of what the anti-vaccine movement really is: a cognitive bias and a willingness to stay adamant in the belief that vaccines cause harm despite all evidence to the contrary. “If you assume,” he writes, “as I had, that human beings are fundamentally logical creatures, this obsessive preoccupation with a theory that has for all intents and purposes been disproved is hard to fathom. But when it comes to decisions around emotionally charged topics, logic often takes a back seat to a set of unconscious mechanisms that convince us that it is our feelings about a situation and not the facts that represent the truth.”
Given this blog’s objective to cover science and technology in entertainment and media, it would be disingenuous to write about the anti-vaccine movement without recognizing the implicit role played by the media and entertainment industries in exacerbating the polemic. By lending a voice to the anti-vaccine argument, even subtly or in a journalistic attempt to “be fair to the other side,” they allowed an echo chamber of lies to grow, over time, into an inferno. In 1982, an hour-long NBC documentary called DPT: Vaccine Roulette overemphasized rare side effects of vaccination in babies to a nation of alarmed parents while completely undermining the vaccines’ benefits. It was a propaganda piece, but an important hallmark of what would come later. A 2008 episode of the popular ABC hit show Eli Stone irresponsibly aired anti-vaccination propaganda, featuring a lawyer questioning a pharmaceutical company that manufactures vaccines over the even then-debunked link to autism. In recent years, actress and Playboy model Jenny McCarthy (who is given an entire chapter by Mnookin) became a tireless advocate against vaccinations, believing that they gave her son autism. She had no scientific proof for this, but was nevertheless given a platform by everyone from Larry King on CNN to a fawning Oprah Winfrey.
As it turns out, McCarthy’s son never even had autism, but rather a very rare and treatable neurological disorder. In a self-penned editorial for the Chicago Sun-Times, she retroactively disavowed her anti-vaccine stance, saying she simply wants “more research on their effectiveness.” An extremely sympathetic 2014 eight-page Washington Post magazine profile of prominent anti-vaccine activist Robert F. Kennedy, Jr. (who believes in the link between vaccines and autism) repeated his talking points numerous times throughout. These were just entries in an endless cycle of interviews and appearances by defiant anti-vaccine proponents, given equal air time side-by-side with frustrated scientists, as if both positions were somehow viable and worthy of journalistic debate. Once the can of worms had been opened, no amount of rational discourse could temper the visceral antipathy that had been created. This is irresponsible, dangerous and flat-out wrong. When the public is confused about an esoteric issue pertaining to science, medicine or technology, influencers in the public eye cannot perpetuate misinformation.
Despite the unanimous medical repudiation of Wakefield’s fraudulent methods and conclusions and the retraction of his Lancet paper, an irreversible and insidious myth had begun to permeate, first through the autism community, then among proponents of organic and holistic approaches to health and finally, into mainstream society. In the aftermath of the controversy, epidemiological studies debunking the autism-vaccination “link,” combined with a growing disease crisis, have forced the largest US-based autism advocacy organization to reverse its stance and fully endorse vaccination to a still-divided community. Wakefield remains more defiant than ever, insisting to this day that his research was valid, attempting to sue the British journalism outlet that funded the inquiry into his fraud, and peddling holistic treatments for autism as well as his “alternative” vaccine. Sadly, the public health ramifications have been nothing short of disastrous, with a dangerous recurrence of several major childhood diseases.
A few examples of the many systemic casualties of the anti-vaccination movement (many occurring just since the publication of Mnookin’s book):
• A summary from the American Medical Association about the nascence of the measles crisis in 2011, when the US saw more measles cases than it had in 15 years
• Immunization rates falling so low that schools in some communities are being forced to terminate personal exemption waivers and, in some cases, legally mandate immunization for public school attendance
• California’s worst whooping cough epidemic in 70 years
• Most recently, a measles outbreak at Disneyland, resulting in 26 cases spread across four states, after an unvaccinated woman visited the theme park
• Anti-vaccine hysteria spreading to Europe, which has seen measles cases rise by 348% from 2013 to 2014 (and growing), along with an alarming resurgence of pertussis
The scientific evidence that vaccines work is indisputable, and as the infographic below summarizes, their impact on morbidity from communicable diseases is miraculous. Sadly, now that the anti-vaccine movement has seeped into the general population, anxious parents are conflicted as to whether vaccinating is the right choice for their children. We must start by going back to the basics of what a vaccine actually is and how it works. Next, we must reiterate the critical role that maintaining herd immunity above 92-95% plays in protecting not only those too young or immunocompromised to be vaccinated, but even fully vaccinated populations. If all else fails, try emailing skeptical friends and family a clever graphic cartoon that breaks down digestible vaccine facts. Simply put: refusing vaccination is not a personal choice, it’s a selfish and dangerous one.
The Panic Virus is first and foremost an incredibly entertaining, well-written narrative of the dawn of an anti-vaccine phenomenon that has reached critical mass. It is also an important case study and cautionary tale about how we process and disseminate information in the age of the Internet and instant information, and an indictment of a trigger-happy, ratings-driven, sensationalist media that reports “news” as it interprets it first and bothers to check the facts later. In the case of the anti-vaccine movement of the last few years, the media fueled the fire that Andrew Wakefield started, and once a gaggle of angry, sympathetic parents was unleashed, it was difficult (if not near-impossible) to undo the damage. This type of journalism, Mnookin writes, “gives credence to the belief that we can intuit our way through all the various decisions we need to make in our lives and it validates the notion that our feelings are a more reliable barometer of reality than the facts.” Sadly, the autism-vaccine panic is not an outlying incident, but rather a disconcerting emblem of a growing anti-science agenda. The UN just released its most dire and alarming report ever issued on man-made climate change impacts, warning that temperature changes and industrial pollution will affect not just the environment, extreme weather events and coastal cities, but the stability of our global economy itself. Immediate rebuttals from an influential lobbying group tried to undermine the majority of the scientists’ findings. So toxic is the corporate and political resistance to any kind of mitigating action that some feel we need a technological or political miracle to stave off a certain environmental crisis. Meanwhile, scientists are drawn into serious debates over evolution versus creationism, and thousands of public schools across the United States use taxpayer funds to teach creationism in the classroom.
Mnookin’s book is an important resource and conversation starter for scientists, researchers and frustrated physicians as they carve out talking points and communication strategies to establish a dialogue with the public at large. When young parents have questions about vaccines (no matter how erroneous or ill-informed), pediatricians should already have materials for engaging in a positive, thoughtful discussion with them. When scientists and researchers encounter anti-science proclivities or subversive efforts to undermine their advocacy for a pressing issue, they should be armed with powerful, articulate communicators — ready and willing to deliberate in the media and convey factual information in an accessible way. When Jenny McCarthy and a gaggle of new-age holistic herbologists were peddling their “mommy instincts” and conspiracy theories against vaccines, far too many scientists and physicians simply thought it was beneath them to even engage in a discussion about something whose certainty and proof of concept was beyond reproach. Now, the newest polling suggests that nothing will change an anti-vaxxer’s mind, not even factual reasoning. Going forward, regardless of the issue at hand, this type of response can never happen again. The cost of complacency or arrogance is nothing short of life or death.
The Panic Virus by Seth Mnookin is currently available in paperback and on Kindle wherever books are sold. For further reading on how to navigate the complexities of the anti-vaccine movement’s aftermath, we suggest the recent book On Immunity: An Inoculation by Northwestern University lecturer Eula Biss.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development. Follow us on Twitter and Facebook. Subscribe to our podcast on SoundCloud or iTunes.
The history of science movies nominated for Oscars is not a very long one. Aside from technical achievement awards or an occasional nomination for acting merits, the Best Picture category has historically not opened its doors to scientific content, save for notable nominees “A Clockwork Orange,” “District 9,” “Inception” and “Avatar.” A documentary about science had never been nominated in the Best Documentary category until this year, when How To Survive a Plague, director David France’s stunning account of the brave activists who brought the AIDS epidemic to the attention of the government and science community in the disease’s darkest early days, broke through. “Plague” made history last weekend by becoming the first Best Documentary nominee with an almost entirely scientific/biomedical narrative. More importantly, it also established a standard by which future science documentaries can use emotional storytelling to captivate audiences and inspire action. ScriptPhD review and discussion under the “continue reading” cut.
“How To Survive a Plague” picks up where groundbreaking companion AIDS film “And the Band Played On” leaves off, around 1987, with the formation of the AIDS Coalition to Unleash Power (ACT UP), the advocacy group that serves as the documentary’s central protagonist. The AIDS virus had been identified, isolated and diagnosed in patients. But as a running ticker throughout the movie reminds us, the AIDS death toll knew no limit. By 1988, 70,000 people had already perished, a number that would climb to an astonishing 500,000 by 1997. Complacency and frustration were the norm among medical professionals, who treated patients with a series of “what the hell” drugs, when they’d even consider treating them. The scientific community, although recognizing that research was necessary, devoted little money or manpower. Even early drugs that showed experimental efficacy or relieved symptoms in AIDS patients were dismissed.
Scientists and the government were the targets of ACT UP’s fury and protests.
But by the late 80s/early 90s, ACT UP’s mission had reached a critical second phase: conformity. Extremists and truculent zealots were dismissed to the sidelines, while the group’s members became self-made scientists, learning everything from medicine and virology to immunology and chemistry. Rather than shut down the FDA for a day as they had done years earlier, they showed up to a scientific meeting in suits and ties to hand out a well-thought-out, publication-worthy proposal on AIDS research and treatment timelines. Impressed scientists took note. By the time charming ACT UP leader Peter Staley addresses an international meeting of the American Society for Microbiology, he is given a standing ovation. ACT UP’s fight, the fight of the gay community, had now become a global fight.
In many ways, “How To Survive a Plague” is an emotional contrast to “And the Band Played On,” even though the former is a documentary largely consisting of reel footage of the events it portrays, while the latter is a dramatized account of scientists racing to identify the AIDS virus. Although “Band” touches briefly on the fear, government insouciance and distrust within the gay community in the earliest days of the burgeoning epidemic, it is very much a pure science film. Its themes of persistence, no-holds-barred competition, stunningly accurate epidemiology and virology details, and the race to an answer could be about any virus in any historical era. “Plague” puts all of the science and medicine of the AIDS crisis in an emotional and historical context. A running death ticker as the years pass lends urgency to the battle of the ACT UP activists. Moreover, France inserts actual footage of their protests (the most famous being an all-day takeover of the National Institutes of Health), meetings and press conferences, along with difficult-to-watch footage of the disease’s toll, which shines an intimate spotlight of realism on the crisis. As France notes, the AIDS crisis burgeoned concomitantly with the appearance of the camcorder, making early AIDS activists “the very first social movement to shoot a world the dominant culture was ignoring.”
Scientists in the movie range from heroes and anti-heroes to ordinary people, which is a rarity in entertainment media. By far the biggest hero is Dr. Iris Long, a chemist with 20 years of experience in retroviral drug development. Although she knew no one with AIDS and had never met a homosexual in her life, Dr. Long became a mentor and science advisor to ACT UP. Her fearless leadership and ability to educate the members led to direct reforms at the FDA and NIH. Other members like Bill Bahlman (the first to demand a direct drug treatment for AIDS) and Garance Franke-Ruta (a high school dropout and science nerd who became the group’s leading advocate for science-based activism) led the internal change to join forces with scientists rather than fight them. And for every scientist who ignored the AIDS crisis, there was a research pioneer like Dr. Anthony Fauci, now head of the National Institute of Allergy and Infectious Diseases, or a Merck chemist leading the development of anti-retroviral drugs. Moments after a graphics-filled technical explanation of how anti-retroviral drugs inhibit HIV replication, one of the Merck scientists interviewed in the film breaks down in tears recollecting the enormity of what they’d accomplished. It’s a stunning, raw moment in a film filled with them. Recent advances in writing for sci-fi have painted more complex, human depictions of scientists and researchers. But such insights are far too rare in documentaries.
In a strangely macabre way, “Plague” is an emotional feel-good story, but one that isn’t over yet. Through the darkest days of rallying a tone-deaf world, all the while losing members day by day, ACT UP’s commitment and perseverance never failed. By the time surviving members, some of whom professed in footage that they never expected to live, are finally revealed in the present day in the film’s last act, the audience is flooded with gratitude and catharsis. The science world, which once didn’t know what to make of this emerging virus, needed only one year from the time the first protease inhibitor hit the market to devise and approve the current three-drug treatment cocktail.
But the film’s unstated, looming conclusion is that we will never get back the millions of people who died during a decade of silence. Too many people continue to perish, most in what has become a new frontier for the AIDS crisis. The fight for a cure or prevention is not over. And a new plague could always be around the corner. It is our hope that future documentarians take note of both the film’s message and its delivery style.
Take a look at the official trailer for “How To Survive a Plague”:
~*ScriptPhD*~
“The wars of the 21st Century will be fought over water.” —Ismail Serageldin, World Bank
Watching the devastation and havoc caused by Hurricane Sandy and several recent water-related natural disasters, it’s hard to imagine that global water shortages represent an environmental crisis on par with climate change. But if current water usage habits do not abate, or if major technological advances to help recycle clean water are not implemented, this is precisely the scenario we are facing—a majority of 21st Century conflicts being fought over water. From the producers of socially-conscious films An Inconvenient Truth and Food, Inc., Last Call at the Oasis is a timely documentary that chronicles current challenges in worldwide water supply, outlines the variables that contribute to chronic shortages and interviews leading environmental scientists and activists about the ramifications of chemical contamination in drinking water. More than just an environmental polemic, Last Call is a stirring call to action for engineering and technology solutions to a decidedly solvable problem. A ScriptPhD.com review under the “continue reading” cut.
Although Earth’s surface is 70% water, less than 1% of the total supply is fresh and accessible, which presents a considerable resource challenge for a growing population expected to hit 9 billion people by 2050. In a profile series of some of the world’s most populous metropolises, Last Call vividly demonstrates that stark imagery of shortage crises is no longer confined to third world countries or women traveling miles with a precious gallon of water perched on their heads. The Aral Sea, a critical climate buffer for Russia and its Central Asian neighbors, is half its original size and devoid of fish. The worst global droughts in a millennium have increased food prices 10% and raised the very real prospect of food riots. Urban water shortages, such as the epic 2008 shortage that forced Barcelona to import emergency water, will become far more common. The United States, by far the biggest consumer of water in the world, could also face the biggest impact. Lake Mead, the largest reservoir in America and the gateway to the electricity-generating Hoover Dam, is only 40% full. Hoover Dam, which stops generating electricity when water levels fall to 1,050 feet, faces that daunting prospect in less than 4 years!
One strength of Last Call is that it is framed around a fairly uniform and well-substantiated hypothesis: water shortage is directly related to profligate, unsustainable water usage. Some usage, such as the 80% devoted to agriculture and food production, will merit evaluation for future conservation methods. California and Australia, two agricultural behemoths half a world apart, face similar threats to their industries. But other uses, such as watering lawns, are unnecessary habits that can be reduced or eliminated. Toilets, most of which still flush 6 gallons in a single use, are the single biggest user of water in our homes—6 billion gallons per day! The US is also the largest consumer of bottled water in the world, with $11 billion in sales, even though bottled water, unlike municipal tap water, falls under the jurisdiction of the FDA rather than the EPA. As chronicled in the documentary Tapped, 45% of all bottled water starts off as tap water, and bottled water has been subject to over 100 recalls for contamination.
A cohort of science and environmental experts bolsters Last Call’s message with the latest scientific research in the area. NASA scientists at the Jet Propulsion Laboratory are using the Gravity Recovery and Climate Experiment (GRACE) satellites to measure changes in the oceans, including water depletion, rising sea levels and circulation, through a series of gravity maps. Erin Brockovich, famously portrayed by Julia Roberts in the eponymous film, appears throughout the documentary to discuss still-ongoing issues with water contamination, corporate pollution and lack of EPA regulation. UC Berkeley biologist Tyrone Hayes expounds on what we can learn from genetic irregularities in amphibians found in contaminated habitats.
Take a look at a trailer for Last Call at the Oasis:
Indeed, chemical contamination is the only issue that rivals overuse as a threat to our water supply. Drugs, antibiotics and other chemicals, which cannot be removed at sewage treatment plants, are increasingly finding their way into the water supply, many of them at the hands of large corporations. Between 2004 and 2009, there were half a million violations of the Clean Water Act. Last Call doesn’t spare the eye-opening details that will make you think twice when you take a sip of water. Atrazine, for example, is the best-selling pesticide in the world, and the most-used on the US corn supply. Unfortunately, it has also been associated with breast cancer and altered testosterone levels in sea life, and is being investigated for safety by the EPA, with a decision expected in 2013. More disturbing is the contamination from concentrated animal feeding operations (CAFOs) near major rivers and lakes. Tons of manure from cows, each of which produces as much waste as 23 humans, is dumped into artificial lagoons that then seep into interconnected groundwater supplies.
It’s not all doom and gloom with this documentary, however. Unlike other polemics in its genre, Last Call doesn’t simply outline the crisis; it also offers implementable solutions and a challenge for an entire generation of engineers and scientists. At the top of the list is greater scrutiny of polluters and the pollutants they release into the water supply with impunity. But solutions such as recycling sewage water, which has made Singapore a global model for water technology and reuse, are at our fingertips, if developed and marketed properly. The city of Los Angeles has already announced plans to recycle 4.9 billion gallons of waste water by 2019. Last Call is an effective final call to save a fast-dwindling resource through science, innovation and conservation.
Last Call at the Oasis was released on DVD November 6th.
~*ScriptPhD*~
This past weekend, over 130,000 people descended on the San Diego Convention Center to take part in Comic-Con 2012. Each year, a growing amalgamation of costumed superheroes, comics geeks, sci-fi enthusiasts and die-hard fans of more mainstream entertainment pop culture mixes together to celebrate and share the popular arts. Some are there to observe, some to find future employment and others to do business, as beautifully depicted in this year’s Morgan Spurlock documentary Comic-Con Episode IV: A Fan’s Hope. But Comic-Con San Diego is more than just a convention or a pop culture phenomenon. It is a symbol of the big business that comics and transmedia pop culture have become. It is a harbinger of future profits in the entertainment industry, which often uses Comic-Con to gauge buzz about releases and spot emerging trends. And it is also a cautionary tale for anyone working at the intersection of television, film, video games and publishing about the meteoric rise of an industry and the uncertainty of where it goes next. We review Rob Salkowitz’s new book Comic-Con and the Business of Pop Culture, an engaging insider perspective on the convergence of geekdom and big business.
Comic-Con wasn’t always the packed, “see and be seen” cultural juggernaut it’s become, as Salkowitz details in the early chapters of his book. In fact, 43 years ago, when the first Con was held at the US Grant Hotel in San Diego, led by the efforts of comics superfan Shel Dorf, only 300 people came! In its early days, Comic-Con was a casual place where the titans of comics publishers such as DC and Marvel would gather with fans and semi-professional artists to exchange ideas and critique one another’s work in an intimate setting. Indeed, Salkowitz, a long-time Con attendee who has garnered quite a few insider perks over the years, recalls that his early days of attendance were not nearly so harried and frantic. The audience for Comic-Con grew steadily until about 2000, when attendance began skyrocketing, to the point that the event now takes over an entire American city for a week each year. Why did this happen? Salkowitz argues that this period saw a quantum shift away from comic books and towards comics culture, a platform that transcends graphic novels and traditional comic books and usurps the entertainment and business matrices of television, film, video games and other “mainstream” art. Indeed, when ScriptPhD last covered Comic-Con in 2010, even its slogan had changed to “celebrating the popular arts,” a seismic shift in focus and attention. (This year, Con organizers made explicit attempts to explore the event’s history and heritage, partially to assuage purists who argue that it has lost sight of its roots.) In theory, this meteoric rise is wonderful, right? With all that money flowing, everyone wins! Not so fast.
Lost amidst the pomp and circumstance of the yearly festivities is the fact that within this mixed array of cultural forces, there are cracks in the armor. For one thing, comics themselves are not doing well at all. For example, more than 70 million people bought a ticket to the 2008 movie The Dark Knight, but fewer than 70,000 people bought the July 2011 issue of Batman: The Dark Knight. Salkowitz postulates that we may be nearing the unimaginable: a “future of the comics industry that does not include comic books.” To unravel the backstory behind the upstaging of an industry at its own event, Salkowitz structures the book around the four days of the 2011 San Diego Comic-Con. In a rather clever bit of exposition, he weaves between four days of events, meetings, exclusive parties, panels of various sizes and one-on-one interactions to take the reader on a guided tour of Comic-Con, while in the process peeling back the layers of transmedia and business collaborations
that are the underbelly of the current “peak geek” saturation. A brief digression into the downfall of the traditional publishing industry, including bookstores (the traditional sellers of comics), the reliance of comics on movie adaptations and the pitfalls of digital publication, is a must-read primer for anyone wishing to work in the industry. Even more strapped are the merchants who sell rare comics and collectibles on the convention floor. Often relegated to the side corners with famous comics artists so that entertainment conglomerates can occupy prime real estate, many dealers struggle just to break even. Among them are independent comics, self-published artists and “alternative” comics, all hoping to cash in on the Comic-Con sweepstakes. Comics may be big business, but not for everyone. Forays into the world of grass-roots publishing, the microcosm of the yearly Eisner Awards for achievement in comics and the alternative con-within-a-Con called Trickster (a more low-key networking event that harkens back to the days of yore) all remind the reader of the tight-knit relationship that comics have with their fan base.
In many ways, the comics crisis that Salkowitz describes is not only very real, but difficult to resolve. The erosion of print media is unlikely to be reversed, nor is the penchant for acquiring free content in the digital universe. Furthermore, video games represent one of the biggest single causes of the erosion of comics over the last 20 years. Games such as Halo, Mass Effect and Grand Theft Auto execute recurring linear storylines in a more cost-conscious, three-dimensional interactive platform. On the other hand, there are also myriad reasons to be positive about the future of comics. The advent of tablets (notably the iPad) represents an unprecedented opportunity to re-establish comics’ popularity and distribution profits. Traditional and casual fans of comics haven’t gone anywhere; they’re just temporarily drowned out by the lines for the Twilight panel. A rising demographic of geek girls represents a potential growth segment in the audience. And finally, a tremendous rise in the popularity of traditional comics (even the classics) in global markets such as India and China portends a new global model for marketing and distribution. If superheroes are to continue as the mainstay of live-action media, the entertainment industry is highly dependent on a viable, continued production of good stories. Movies need comics to stay robust. The creativity and ingenuity that have been the hallmark of great comics continue to thrive with independent artists, some of whose work has gone viral and garnered publishing contracts.
Make no mistake, comics fans and enthusiastic geeks: Comic-Con and the Business of Pop Culture is very much a business and brand strategy book, centered around a very trendy and chic brand. There’s no question that casual fans and people interested in the more technical side of comics transmedia will find it an interesting, if at times esoteric, read. But for those working in (or aspiring to) the intersection of comics and entertainment, it is essential. Cautioning both the entertainment and comics industries against complacency about what could be a temporary “gold rush” cultural phenomenon, Salkowitz nevertheless peppers the book with salient advice for sustaining comics-based entertainment and media, while fortifying traditional comics and their creative integrity for the next generation of fans. The final portion of the book is its strongest: a hypothetical journey several years into the future, utilizing what he calls “scenario planning” to prognosticate what might happen. Comic-Con (and all the business it represents) might grow larger than ever, an absolute phenomenon; might scale back to account for diminishing fan interest; might stay the same; or might fracture into a series of global events to account for the growing overseas interest in traditional comics. Which scenario comes to fruition depends on brand synergy, fan growth and engagement, distribution through digital and interactive media, and a carefully cultivated relationship between comics audiences, creators and publishers. Salkowitz calls Comic-Con a “laboratory in which the global future of media is unspooling in real time.” What will happen next? As any good scientist knows, experiments, even under controlled circumstances, are entirely unpredictable. See you in San Diego next year!
Rob Salkowitz is the cofounder and Principal Consultant of Seattle-based MediaPlant LLC and is the author of two other books, Young World Rising and Generation Blend. He also teaches in the Digital Media program at the University of Washington. Follow Rob on Twitter.
~*ScriptPhD*~
Earlier this year, the Susan G. Komen Foundation made headlines around the world after its politically charged decision to cut funding for breast cancer screening at Planned Parenthood caused outrage and negatively impacted donations. Despite reversing the decision and apologizing, many people in the health care and fundraising community feel that the aftermath of the controversy still dogs the foundation. Indeed, Advertising Age referred to it outright as a PR crisis. If all of this sounds more like spin for a brand than a charity working towards the cure of a devastating illness, that’s not far from the truth. Susan G. Komen For the Cure, the Avon Walk For Breast Cancer and the Revlon Run/Walk For Women represent a triumvirate hegemony in the “pink ribbon” fundraising domain. Over time, their initial breast cancer awareness movement (and everything the pink ribbon stood for symbolically) has moved from activism to pure consumerism. The new documentary Pink Ribbons, Inc. deftly and devastatingly examines the rise of corporate culture in breast cancer fundraising. Who is really profiting from these pink ribbon campaigns, brands or people with the disease? How has the positional messaging of these “pink ribbon” events impacted the women actually facing the illness? And finally, has the profit motive allowed the very companies whose products cause cancer to benefit from the disease? ScriptPhD.com’s Selling Science Smartly advertising series continues with a review of Pink Ribbons, Inc.
“We used to march in the streets. Now you’re supposed to run for a cure, or walk for a cure, or jump for a cure, or whatever it is,” states Barbara Ehrenreich, breast cancer survivor and author of Welcome to Cancerland, in the opening minutes of the documentary Pink Ribbons, Inc. Directed by Canadian filmmaker Lea Pool and based on the book Pink Ribbons, Inc.: Breast Cancer and the Politics of Philanthropy by Samantha King, the documentary features in-depth interviews with leading authors, experts, activists and medical professionals. It also includes an important look at the leading players in breast cancer fundraising and marketing. The production crew filmed a number of prominent fundraising events across North America, using the upbeat festivities (some of which didn’t even visibly display the word ‘cancer’) as the backdrop for exploring the “pinkwashing” of breast cancer through marketing, advertising and slick gimmicks. At the same time, showcasing well-meaning, enthusiastic walkers, runners and fundraisers is a double-edged sword, and it was handled with the appropriate sensitivity by the filmmakers. Pool wanted to “make sure we showed the difference between the participants, and their courage and will to do something positive, and the businesses that use these events to promote their products to make money.”
Often lost amidst the pomp and circumstance of these bright pink feel-good galas is the fact that the origins of the pink ribbon are quite inauspicious. The original pink ribbon wasn’t even pink. It was a salmon-colored cloth ribbon made by breast cancer activist Charlotte Haley as part of a grass-roots organization she called the Peach Corps. From a kitchen-counter mail-in operation, Haley’s vision grew to hundreds of thousands of supporters, so much so that it caught the attention of Estée Lauder executive Evelyn Lauder. The company wanted to buy the peach ribbon from Haley, who refused, so it simply rebranded breast cancer in a comforting, reassuring, non-threatening color: bright pink. And before our very eyes, a stroke of marketing genius was born.
As the pink ribbon movement took hold of the fundraising community, the money started to spill over into mainstream advertising, adorning everything from the food we eat to the clothes we wear, all under the auspices of philanthropy. In theory, people should feel great about buying products that return some of their profits for such a great cause. In practice, many of these campaigns simply throw a bright pink cloak over false, if not cynical, advertising. Take Yoplait’s yearly “Save Lids to Save Lives” campaign:
For every lid you save from a Yoplait yogurt (and mail in, using a $0.44 stamp, mind you!), the company will donate 10 cents to breast cancer research. If you ate three yogurts per day for four months, you would have raised a grand total of $36 for breast cancer research, while spending more on stamps and environmental shipping waste. Not as impressive when you break it down, eh?
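For readers who want to check the math themselves, the campaign’s economics can be sketched in a few lines of Python. The per-lid donation and stamp price are the figures cited above; the one-envelope-per-lid postage scenario is our own illustrative worst-case assumption, not Yoplait’s actual mailing terms:

```python
# Back-of-the-envelope check of the "Save Lids to Save Lives" arithmetic.
DONATION_PER_LID = 0.10  # dollars donated per mailed-in lid
STAMP_COST = 0.44        # dollars per first-class stamp (2012 rate)
LIDS_PER_DAY = 3
DAYS = 120               # roughly four months

lids = LIDS_PER_DAY * DAYS              # 360 lids collected
donation = lids * DONATION_PER_LID      # $36.00 raised for research
worst_case_postage = lids * STAMP_COST  # postage if each lid were mailed alone

print(f"Lids collected: {lids}")
print(f"Total donation: ${donation:.2f}")
print(f"Postage, one envelope per lid: ${worst_case_postage:.2f}")
```

Even in the best case, batching every lid into a single stamped envelope, four months of dedicated daily yogurt eating yields just $36 for research.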
A recent American Express campaign called “Every Dollar Counts” pledged that every purchase during a four-month period would incur a one-cent donation to breast cancer research. Unfortunately, donations were never scaled to the amount spent, meaning that whether you charged a pack of gum or a big-screen TV to your AmEx, the company would donate a penny. The breast cancer community was so outraged by this hubris that it staged a successful campaign to rescind the ads. The fact is, the above examples demonstrate that pink ribbons have become an industry, with demographics and talking points, just like everything else. Pinkwashing campaigns tend to target middle-class, ultra-feminine white women. Why? Because they are the typical targets who buy the products these industries are trying to sell. Take the NFL’s recent pink screening campaign. Well-meaning or not, it came amidst a series of crimes and violence by NFL players, some of it domestic in nature. One can imagine that outfitting players in hot pink gear was seen as a smart way to mollify the league’s rather impressive female fanbase.
As Ehrenreich states in the documentary, the collective effect of this marketing has been to soften breast cancer into a pretty, pink and feminine disease. Nothing too scary, nothing too controversial. Just enough to keep raising money that goes… somewhere. Take a look at the recent chipper television campaign for the Breast Cancer Centre of Australia:
While some of the breast cancer-related branding and pink sponsorships mislead through good intentions, others are a dangerous, bald-faced lie. Some of the very companies that sponsor fundraising events and make money off of pink revenue either make deleterious products linked to cancer or stand to profit from its treatment. Revlon, sponsor of the Run/Walk for Women, manufactures many cosmetics (searchable in the Skin Deep database) that are linked to cancer. The average woman puts on 12 cosmetics products per day, yet only 20% of all cosmetics have undergone FDA examination and safety testing. The pharmaceutical giant AstraZeneca can’t seem to decide whether it’s for or against cancer. It produces the anti-estrogen breast cancer drug Tamoxifen, yet the pesticide atrazine, linked to cancer as an estrogen-boosting compound, was developed under its corporate umbrella and is now manufactured by the Swiss-based spinoff Syngenta. Breast Cancer Awareness Month (October) is nothing more than a PR stunt invented by a marketing expert at… drumroll please… AstraZeneca! The goal was to promote mammography as a powerful weapon in the war against breast cancer. But as the American arm of one of the largest chemical companies in the world, the reality is that AstraZeneca was, and is, benefiting from the very illness it was urging women to get screened for. Perhaps the most audacious example of them all is pharmaceutical giant Eli Lilly. A sponsor of cancer research and treatment, both in medicine and the community, Lilly produced the cancer- and infertility-causing DES (diethylstilbestrol), and currently manufactures rBGH, an artificial hormone given to cows to make them produce more milk. rBGH has been linked to breast cancer and a host of other health problems. These strong corporate links in many ways explain the uplifting, happy, sterile messaging behind the pink ribbon.
Corporations are, quite bluntly, making money off of marketing cancer, so if they don’t put a smiley face on the disease, they will alienate their customers and the conglomerate businesses pouring money into these campaigns.
Juxtaposed with the uplifting, bombastic, bright pink backdrop of the various cancer fundraisers and rallies, Pink Ribbons, Inc. quietly profiles the IV League, an Austin, TX-based support group for metastatic breast cancer. The women meet on a regular basis to share stories, help each other cope and accept the rigors of the disease and the realities of dying. Many of the group members interviewed found current breast cancer campaign marketing offensive, tastelessly positive and falsely empowering (“If you just get screened and get mammograms and eat healthy, breast cancer can’t happen to you!”). The group, which lost 10 members last year alone, is among a large faction of cancer sufferers who feel left out of the pinkwashing tide of marketing campaigns. Acknowledging that sometimes you get cancer for no discernible reason, and that sometimes no treatment works, is a downer. It’s not the kind of uplifting story that advertising campaigns are built around, leaving these women feeling as if they’re living alone with the fact that they are dying. “You’re the angel of death,” remarks IV Leaguer Jeanne Collins. “You’re the elephant in the room. And they’re learning to live and you’re learning to die.” By utilizing powerful messaging keywords like BATTLE, WAR and SURVIVOR, cancer foundations and brands subliminally put down those who didn’t survive. And there are many who don’t survive — someone dies of breast cancer every 69 seconds. Are they suggesting that people who died or didn’t respond to treatment simply didn’t try hard enough? One of the most poignant moments in the film comes from an IV League stage 4 cancer patient, probably weeks or months from her death: “You can die in a perfectly healed state.”
Although Pink Ribbons, Inc. is a sobering polemic against the mindless trivialization of commercializing breast cancer and even misdirecting funds from where they can be most helpful, it is not a hopeless film. Far from projecting pessimism, the film showcases the tremendous willpower and manpower that these three-day walks engender. It is simply misdirected. If hundreds of thousands of women and men can be motivated to fundraise, walk, run and (in some cases) jump out of planes, the effort is absolutely there to stop breast cancer. “Do something besides worry to make a difference,” concludes Barbara Brenner. “We have enormous power, if only we’d use it.” Director Lea Pool hopes that the film will encourage people to “be more critical about our actions and stop thinking that by buying pink toilet paper we’re doing what needs to be done. I don’t want to say that we absolutely shouldn’t be raising money. We are just saying ‘Think before you pink.'”
Watch the trailer for Pink Ribbons, Inc. here:
Pink Ribbons, Inc. goes into wide release in theaters nationwide June 8, 2012 and was released on DVD in September of 2012.
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
Read through any archive of science fiction movies, and you quickly realize that the merger of pop culture and science dates as far back as the silent films of the early 1920s. Even more surprising than the enduring prevalence of science in film is that the relationship between film directors, scribes and the science advisors who have influenced their works is equally rich and timeless. Lab Coats in Hollywood: Science, Scientists, and Cinema (2011, MIT Press), one of the most in-depth books on the intersection of science and Hollywood to date, serves as the backdrop for recounting the history of science and technology in film, how it influenced real-world research, and the scientists who contributed their ideas to improve the cinematic realism of science and scientists. For a full ScriptPhD.com review and in-depth extended discussion of science advising in the film industry, please click the “continue reading” cut.
Written by David A. Kirby, Lecturer in Science Communication Studies at the Centre for the History of Science, Technology and Medicine at the University of Manchester, England, Lab Coats offers a surprising, detailed analysis of the symbiotic—if sometimes contentious—partnership between filmmakers and scientists. This includes the wide-ranging services science advisors can be asked to provide to members of a film’s production staff, how their ideas are subsequently incorporated into the film, and why the depiction of scientists in film carries such enormous real-world consequences. Thorough, detailed, and honest, Lab Coats in Hollywood is an exhaustive account of the history of scientists’ impact on cinema and storytelling. It’s also an essential and realistic road map of the challenges that scientists, engineers and other technical advisors might face as they seriously pursue science advising for the film industry as a career.
The essential questions that Lab Coats in Hollywood addresses are these—is it worth it to hire a science advisor for a movie production? Is it worth it for the scientist to be an advisor? The book’s purposefully vague conclusion is that it depends on how much the scientist can contribute to a film’s storyline and visual effects. Kirby wisely writes with an objective tone here because the topic is open to a considerable amount of debate among the scientists and filmmakers profiled in the book. Sometimes a scientist is so key to a film’s development that he or she becomes an indispensable part of the day-to-day production. A good example of this is Jack Horner, paleontologist at the Museum of the Rockies in Bozeman, MT, and technical science advisor to Steven Spielberg on Jurassic Park and both of its sequels. Horner, who drew from his own research on the link between dinosaurs and birds for a more realistic depiction of the film’s contentious science, helped filmmakers construct visuals, write dialogue, shape character reactions and animal behaviors, and map out entire scenes. J. Marvin Herndon, a geophysicist at the Transdyne Corporation, approached the director of the disaster film The Core when he learned the plot was going to be based on his controversial hypothesis about a giant uranium ball in the center of the Earth. Herndon’s ideas were fully incorporated into the film’s plot, while Herndon rode the wave of publicity from the film to publish his research in a PNAS paper. The gold standard of science input, however, was Stanley Kubrick’s use of multiple science and engineering advisors for 2001: A Space Odyssey, discussed in much further detail below.
Kirby hypothesizes that sometimes, a film’s poor reception might have been avoided with a science advisor. He provides the example of the Arnold Schwarzenegger futuristic sci-fi bomb The 6th Day, which contained a ludicrously implausible use of human cloning in its main plot. While the film may have been destined for failure, Kirby posits that it could only have benefited from proper script vetting by a scientist. By contrast, the 1998 action adventure thriller Armageddon came under heavy expert criticism for its basic assertion that an asteroid “the size of Texas” could go undetected until eighteen days before impact. Director Michael Bay patently refused to take the advice of his advisor, NASA researcher Ivan Bekey, and admitted he was sacrificing science for plot, but Armageddon went on to be a huge box office hit regardless. Quite often, the presence of a science advisor is helpful, albeit not strictly necessary. One of the book’s more amusing anecdotes is about Dustin Hoffman’s hyper-obsessive shadowing of a scientist for the making of the pandemic thriller Outbreak (a great guide to the movie’s science can be found here). Hoffman was preparing to play a virologist and wanted to infuse realism into all of his character’s reactions. Hoffman kept asking the scientist to document reactions in mundane situations that we all encounter—a traffic jam, for example—only to come to the shocking conclusion that the scientist was a real person just like everyone else.
Most of the time, including scientists in the filmmaking process is at the discretion of the studios because of the one immutable decree reiterated throughout the book: the story is king. When a writer, producer or director hires a science consultant, that expertise is utilized solely to facilitate, improve or augment story elements for the purposes of entertaining the audience. Because of this, one of the most difficult adjustments a science consultant may face is secondary status on set, even if they are a superstar in their own field. Some of the other less glamorous aspects of film consulting include heavy negotiations with unionized writers for script or storyline changes, long working hours, a delicate balance between side consulting work and a day job, and most importantly, an inconsistent (sometimes nonexistent) payment structure per project. I was particularly thrilled to see Kirby mention the pros and cons of programs such as the National Science Foundation’s Creative Science Studio (a collaboration with USC’s School of Cinematic Arts) and the National Academy of Sciences’ Science and Entertainment Exchange, which both provide on-demand scientific expertise to the Hollywood filmmaking community in the hope of increasing and promoting the realism of scientific portrayal in film. While valuable commodities to science communication, both programs have had the unfortunate effect of acclimating Hollywood studios to expect high-level scientific consulting for free.
1968’s 2001: A Space Odyssey is widely considered the greatest sci-fi movie ever made, and certainly the most influential. As such, Kirby devotes an entire chapter to detailing the film’s production and integration of science. Director Stanley Kubrick took painstaking care with scientific accuracy to explore complex ideas about the relationship between humanity and technology, hiring advisors ranging from anthropologists and aeronautical engineers to statisticians and nuclear physicists for various stages of production. Statistician I. J. Good provided advice on supercomputers, aerospace designer Harry Lange handled production design, and NASA space scientist Frederick Ordway lent over three years of his time to develop the space technology used in the film. In doing so, Kubrick’s staff consulted with over sixty-five different private companies, government agencies, university groups and research institutions. So real was the space technology in 2001 that moon landing hoax supporters have claimed the real moon landing by United States astronauts, which took place in 1969, was filmed on the same sets. Not every science-based film since has used science input as meticulously or thoroughly, but Kubrick’s influence on the film industry’s fascination with science and technology is an undeniable legacy.
One of the real treats of Lab Coats in Hollywood is the exploration of the two-way relationship between scientists and filmmakers, and how film in turn influences the course of science, as we discuss in more detail below. Between film case studies, critiques and interviews with past science advisors are interstitial vignettes of the ways scientists have shaped films we know and love. Even the animated feature Finding Nemo had an oceanography advisor to get the marine biology correct. The seminal moment of the most recent Star Trek installment was due to a piece of offhand scientific advice from an astronomer. The cloning science of Jurassic Park, so thoroughly researched and pieced together by director Steven Spielberg and science advisor Jack Horner, was actually published in a top-notch journal days ahead of the movie’s premiere. Even the rare spots where the book drags a bit with highly technical analysis are leavened by cinematic backstories with details that readers will salivate over. (For example, there’s a very good reason all the kelp went missing from Finding Nemo between its cinematic and DVD releases.)
As the director of a creative scientific consulting company based in Los Angeles, one of the biggest questions I get asked on a regular basis is “What does a science advisor do, exactly?” Lab Coats in Hollywood does an excellent job of recounting stories and case studies of high-profile scientist consultants, all of whom contributed their creative talents to their respective films in different ways, of explaining what might be expected (and not expected) of scientists on set, and of identifying the areas of expertise currently in demand in Hollywood. Kirby breaks down cinematic fact checking, the most frequent task scientists are hired to perform, into three areas within textbook science (known, proven facts that cannot be disputed, such as gravity): public science, facts we all know and would think ridiculous if filmmakers got them wrong; expert science, facts that are known to specialists and scientific experts but not the lay audience; and (most problematic) folk science, incorrect science that has nevertheless been accepted as true by the public. Filmmakers are most likely to alter or modify facts that they perceive as expert science to minimize repercussions at the box office.
A science advisor is constantly navigating cinematic storytelling constraints and a filmmaker’s desire to utilize only the most visually appealing and interesting aspects of science (regardless of whether the context is always academically appropriate). Another broad area of high demand is helping actors look and act like real scientists on screen. Scientists have been hired to do everything from doctoring dialogue to add realism to an actor’s portrayal (the movie Contact and Jodie Foster’s depiction of Dr. Ellie Arroway is a good example of this), to training actors in using equipment and pronouncing foreign-sounding jargon, to replicating laboratory notebooks or chalkboard scribbles with the symbols and shorthand of science (such as in the mathematics film A Beautiful Mind), to recreating the physical space of an authentic laboratory. Finally, the scientist’s expertise in the known is used to help construct plausible scenarios and storylines for the speculative, an area that requires the greatest degree of flexibility and compromise from the science advisor. Uncertainty, unexplored research and “what if” scenarios, the bane of every scientist’s existence, happen to be Hollywood’s favorites, because they allow the greatest creative freedom in storytelling and speculative conceptualization without being negated by a proven scientific impossibility. An entire chapter—the book’s finest—is devoted to two case studies, Deep Impact and The Hulk, where real science concepts (near-Earth asteroid impacts and genetic engineering, respectively) were researched and integrated into the stories that unfolded in the films. (Side note: if you are ever planning on being a science advisor, read this section of the book very carefully.)
In years past, consulting on films didn’t necessarily bring acclaim to scientists within their own research communities; indeed, Lab Coats recounts many instances where scientists were viewed as betraying science or undermining its seriousness with Hollywood frivolity, including popular media figures such as Carl Sagan and Paul Ehrlich. Recently, however, consultants have come to be viewed as publicity investments, both by studios that hire high-profile researchers for the recognition value of their film’s science content and by institutes that benefit from branding and exposure. Science films from the last 10-15 years such as GATTACA, Outbreak, Armageddon, Contact and The Day After Tomorrow, along with a panoply of space-related flicks, have attached big-name scientists as consultants (gene therapy pioneer French Anderson, epidemiologist David Morens, NASA researcher Ivan Bekey, SETI Institute astronomers Seth Shostak and Jill Tarter, and climatologist Michael Molitor, respectively). They also happen to revolve around the research salient to our modern era: genetic bioengineering, global infectious diseases, near-Earth objects, global warming and (as always) exploring deep space. As such, a mutually beneficial marketing relationship has emerged between science advisors and studios that transcends the film itself, bringing funding and visibility to individual scientists, their research, and even institutes and research centers. The National Severe Storms Laboratory (NSSL) promoted itself in two recent films, Twister and Dante’s Peak, using the films as a vehicle to promote its scientific work, to brand itself as a hero underfunded by the government, and to temper public expectations about storm predictions. No institute has had a deeper relationship with Hollywood than NASA, extending back to the Star Trek television series, with intricate involvement and prominent logo display in the films Apollo 13, Armageddon, Mission to Mars, and Space Cowboys.
Some critics have argued that this relationship played an integral role in helping NASA maintain a positive public profile after the devastating 1986 Challenger space shuttle disaster. The end result of the aforementioned promotion via cinematic integration can only benefit scientific innovation and public support.
Accurate and favorable portrayal of science content in modern cinema has an even bigger beneficiary than specific research institutes: society itself. Fictional technology portrayed in film – termed a “diegetic prototype” – has often inspired or led directly to real-world application and development. Kirby offers as the most impactful case of diegetic prototyping the 1981 film Threshold, which portrayed the first successful implantation of a permanent artificial heart, a medical marvel that became reality only a year later. Robert Jarvik, inventor of the Jarvik-7 artificial heart used in the transplant, was also a key medical advisor for Threshold, and felt that his participation in the film could both facilitate technological realism and, by doing so, help ease public fears about what was then considered a freak surgery, one that even engendered a ban in Great Britain. Of the many obstacles that expensive, ambitious, large-scale research faces, Kirby argues that skepticism or lack of enthusiasm from the public can be the most difficult to overcome, precisely because it feeds directly into the political support that makes funding possible. A later example of film as an avenue for promotion of futuristic technology is Minority Report, set in the year 2054, and featuring realistic gestural interfacing technology and visual analytics software used to predict crime before it actually happens. Less than a decade later, technology and gadgets featured in the film have come to fruition in the form of multi-touch interfaces like the iPad and retina scanners, with others in development including insect robots (mimics of the film’s spider robots), facial recognition advertising billboards, crime prediction software and electronic paper. A much more recent example not featured in the book is the 2011 film Limitless, featuring a writer who is able to stimulate and access 100% of his brain at will by taking a nootropic drug.
While the fictitious drug portrayed in the film is not yet a neurochemical reality, brain enhancement is a rising field of biomedical research, and may one day indeed yield a brain-boosting pill.
No other scientific feat has been a bigger beneficiary of diegetic prototyping than space travel, starting with 1929’s prophetic masterpiece Frau im Mond [Woman in the Moon], sponsored by the German Rocket Society and advised masterfully by Hermann Oberth, a pioneering German rocket research scientist. The first film to ever present the basics of rocket travel in cinema, and credited with inspiring the now-standard countdown to zero before real-life launches, Frau im Mond also featured a prototype of the liquid-fuel rocket and inspired a generation of physicists to contribute to the eventual realization of space travel. Destination Moon, a 1950 American sci-fi film about a privately financed trip to the Moon, was the first film produced in the United States to deal realistically with the prospect of space travel, utilizing the technical and screenplay input of notable science fiction author Robert A. Heinlein. Released seven years before the start of the USSR’s Sputnik program, Destination Moon set off a wave of iconic space films and television shows such as When Worlds Collide, Red Planet Mars, Conquest of Space and Star Trek in the midst of the 1950s and 1960s Cold War “space race” between the United States and the Soviet Union. What theoretical scientific feat will propel the next diegetic prototype? A mission to Mars? Space colonization? Anti-aging research? Advanced stem cell research? Only time will tell.
Ultimately, readers will enjoy Lab Coats in Hollywood for its engaging writing style, its detailed exploration of the history of science in film and, most of all, the valuable advice from fellow scientists who transitioned from the lab to consulting on a movie set. Whether you are a sci-fi film buff or a research scientist aspiring to be a Hollywood consultant, you will find some aspect of this book fascinating. Given the rapid proliferation of science and technology content in movies (even those outside the traditional sci-fi genre), and the input from the scientific community it will surely necessitate, knowing the benefits and pitfalls of this increasingly in-demand career choice is as important as ever for ensuring an accurate portrayal of scientists to the general public.
As Comic-Con winds down on the shortened Day 4, we conclude our coverage with two panels that exemplify what Comic-Con is all about. As promised, we dissect the “Comics Design” panel of the world’s top logo designers deconstructing their work, coupled with images of that work. We also bring you an interesting panel of ethnographers, consisting of undergraduate and graduate students, studying the culture and the varying forces that shape Comic-Con. Seriously, they’re studying nerds! Finally, we are delighted to shine our ScriptPhD.com spotlight on new sci-fi author Charles Yu, who presented his new novel at his first (of what we are sure will be many) Comic-Con appearances. We sat down and chatted with Charles, and are pleased to publish the interview. And of course, our Day 4 Costume of the Day. Comic-Con 2010 (through the eyes of ScriptPhD.com) ends under the “continue reading” cut!
Comics Design
We are not ashamed to admit that here at ScriptPhD.com, we are secret design nerds. We love it, particularly since good design so often elevates the content of films, television, and books, yet remains a relatively mysterious process. One of THE most fascinating panels that we attended at Comic-Con 2010 was on the design secrets behind some of your favorite comics and book covers. A panel of the world’s leading designers revealed their methodologies (and sometimes failures) in the design process behind their hit pieces, lifting the shroud of secrecy that designers often envelop themselves in. It was an unparalleled window into the mind of the designer, and into the visual appeal that so often subliminally contributes to the success of a graphic novel, comic, or even regular book. We do, as it turns out, judge books by their covers.
As promised, we revisit this illuminating panel, and thank Christopher Butcher, co-founder of The Toronto Comic Arts Festival and co-owner of The Beguiling, Canada’s finest comics bookstore. Chris was kind enough to provide us with high-quality images of the Comics Design panel’s work, for which we at ScriptPhD.com are grateful. Chris had each of the graphic artists discuss their work with an example of design that worked, and design that didn’t (if available or so inclined). The artist was asked to deconstruct the logo or design and talk about the thought process behind it.
Mark Chiarello – (art + design director at DC Comics)
Mark chose to design the cover of this book with an overall emphasis on the individual artist. Hence the white space on the book, and a focus on the logo above the “solo” artist.
Adam Grano – (designer at Fantagraphics)
Adam took the title of this book quite literally, and let loose with his design to truly emphasize the title. He called it “method design.” He wanted the cover to look like a drunken dream.
For the Humbug collection, Grano tried hard not to impress too much of himself (and his tastes) in the design of the cover. He wanted to inject simplicity in a project that would stand the test of time, because it was a collector’s series.
Grano considered this design project his “failure.” It contrasts greatly with the simplicity and elegance of Humbug. He mentioned that everyone on the page is scripted and gridded, something that designers try to avoid in comics.
Chip Kidd – (designer at Random House)
Chip Kidd had the honor of working on the first posthumous Peanuts release after Charles M. Schulz’s death, and took the project quite seriously. In the cover, he wanted to deconstruct a Peanuts strip. All of the human element is taken out of the strip, with the characters on the cover up to their necks in suburban anxiety.
Kidd likes this cover because he considers it an updated spin on Superman. It’s not a classic Superman panel, so he designed a logo that deviated from the classic “Superman” logo to match.
Kidd chose this as his design “failure”, but not the design itself. The cover represents one of seven volumes, in which the logo pictured disintegrates by the seventh issue, to match the crisis in the title. Kidd’s only regret here is that he was too subtle. He wishes he’d chosen to start the logo disintegration progression sooner, as there’s very little difference between the first few volumes.
Fawn Lau – (designer at VIZ)
Fawn was commissioned to redesign this book cover for an American audience. Keeping this in mind, and wanting the Japanese animation to be more legible for the American audience, she didn’t want too heavy-handed of a logo. In an utterly genius stroke of creativity, Lau went to an art store, bought $70 worth of art supplies, and played around with them until she constructed the “Picasso” logo. Clever, clever girl!
Mark Siegel – (First Second Books)
Mark Siegel was hired to create the cover of the new biography Feynman, an eponymous graphic biography of one of the most famous physicists of all time. Feynman was an amazing man who lived an amazing life, including winning the Nobel Prize in Physics in 1965. His biographer, Jim Ottaviani (illustrated by Leland Myrick), a nuclear physicist and speed skating champion, is an equally accomplished individual. The design of the cover was therefore chosen to reflect their dynamic personalities. The colors were chosen to represent the atomic bomb and Los Alamos, New Mexico, where Feynman worked on the Manhattan Project. Incidentally, the quote on the cover – “If that’s the world’s smartest man, God help us!” – is from Feynman’s own mother.
Keith Wood – (Oni Press)
Wood remarked that this was the first time he was able to do design on a large scale, which really worked for this project. He chose a very basic color scheme, again to emphasize a collection standing the test of time, and designed all the covers simultaneously, including color schemes and graphics. He felt this gave the project a sense of connectedness.
Wood chose a pantone silver as the base of this design, with a stenciled typeface meant to look very modern. The back cover and the front cover were initially going to be reversed when the artists first brought him the renderings. However, Wood felt that since the book’s content is about a girl traveling across the United States, it would be more compelling and evocative to use the feet/baggage image on the front of the book. He was also the only graphic artist to show a progression of 10-12 renderings, playing with colors, panels and typefaces, that led to the final design. He believes in a very traditional approach to design, which includes hand sketches and multiple renderings.
The Culture of Popular Things: Ethnographic Examinations of Comic-Con 2010
Each year, for the past four years, Comic-Con ends on an academic note. Matthew J. Smith, a professor at Wittenberg University in Ohio, brings along a cadre of graduate and undergraduate students to study Comic-Con (the nerds, the geeks, the entertainment component, the comics component) in order to understand the culture of this fascinating microcosm of consumerism and fandom. By culture, the students embrace the accepted definition by famous anthropologist Raymond J. DeMallie: “what is understood by members of a group.” The students ultimately wanted to ask why people come to Comic-Con at all. Attendees are united by the general forces of being fans; this is what is understood in their group. After milling around the various locales that constituted the Con, the students deduced that two forces were simultaneously at play. Fan culture drives and energizes the Con as a whole, while strong marketing forces are on display in the exhibit halls and panels.
Maxwell Wassmann, a political economy student at Wayne State University, pointed out that “secretly, what we’re talking about is the culture of buying things.” He compared Comic-Con to a giant shopping mall, a microcosm of our economic system in one place. “If you’ve spent at least 10 minutes at Comic-Con,” he pointed out, “you probably bought something or had something tried to be sold to you. Everything is about marketing.” As a whole, Comic-Con is subliminally designed to reinforce the idea that this piece of pop culture, which ultimately advertises an even greater subset of pop culture, is worth your money. Wassmann pointed out an advertising meme present throughout the weekend that we took notice of as well—garment-challenged ladies advertising the new Green Hornet movie. The movie itself is not terribly sexy, but by using garment-challenged ladies as the very picture of the movie, when you leave Comic-Con and see a poster for Green Hornet, you will subconsciously link it to the sexy images you were exposed to in San Diego, greatly increasing your chances of wanting to see the film. By contrast, Wassmann also pointed out that there is a concomitant old-town economy happening in small comics: in the fringes of the exhibition center and the artists’ space, a totally different microcosm of consumerism and content exchange thrives.
Kane Anderson, a PhD student at UC Santa Barbara getting his doctorate in “Superheroology” (seriously, why didn’t I think of that back in graduate school??), came to San Diego to observe how costumes relate to the superhero experience. To fully immerse himself in the experience, and to gain the trust of the Con attendees he’d be interviewing, Anderson came in full costume (see above picture). Overall, he deduced that the costumed attendees, whom we will openly admit to enjoying and photographing during our stay in San Diego, act as goodwill ambassadors for the characters and superheroes they represent. They also add to the fantasy and adventure of Comic-Con goers, creating the “experience.” The negative side is that this evokes a certain “looky-loo” effect, where people actively seek out, and single out, costume-wearers, even though they constitute only 5% of all attendees.
Tanya Zuk, a media masters student from the University of Arizona, and Jacob Sigafoos, an undergraduate communications major at Wittenberg University, both took on the mighty Hollywood forces invading the Con, primarily the distribution of independent content, an enormous portion of the programming at Comic-Con (and a growing presence on the web). Zuk spoke about how original video content, more distinctive of new media, is distributed primarily online. It allows for more exchange between creators and their audience than traditional content (such as film and cable television), and builds a community fanbase through organic interaction. Sigafoos expanded on this by talking about how to properly market such material to gain viral popularity—with no marketing at all! A lack of marketing, at least in its traditional forms, is the most successful way to promote such a product. Producing a high-quality product, handing it off to friends, and promoting through social media is still the best way to grow a devoted following.
And speaking of Hollywood, their presence at Comic-Con is undeniable. Emily Saidel, a Master’s student at NYU, and Sam Kinney, a business/marketing student at Wittenberg University, both took on the behemoth forces of major studios hawking their products in what originally started out as a quite independent gathering. Saidel tackled Hollywood’s presence at Comic-Con, people’s acceptance/rejection thereof, and how comics are accepted by traditional academic disciplines as didactic tools in and of themselves. The common thread is a clash between the culture and the community. Being a member of a group is a relatively simple idea, but because Comic-Con is so large, it incorporates multiple communities, leading to tensions between those feeling on the outside (i.e. fringe comics or anime fans) versus those feeling on the inside (i.e. the more common mainstream fans). Comics fans would like to be part of that mainstream group and do show interest in those adaptations and changes (we’re all movie buffs, after all), noted Kinney, but feel that Comic-Con is bigger than what it should be.
But how much tension is there between the different subgroups and forces? The most salient example from last year’s Con was the invasion of the uber-mainstream Twilight fans, who not only created a ruckus on the streets of San Diego, but also usurped all the seats of the largest pavilion, Hall H, to wait for their panel, locking out other fans from seeing theirs. (No one was stabbed.) In reality, the supposed clash of cultures is blown out of proportion, with most fans not really feeling the tension. Saidel pointed out that tension isn’t necessarily a bad thing, either. She offered the metaphor of a rubber band, which only fulfills its purpose under tension. The different forces of Comic-Con work in different ways, if sometimes imperfectly. And that’s a good thing.
Incidentally, if you are reading this and interested in participating in the week-long program in San Diego next year, visit the official website of the Comic-Con field study for more information. Some of the benefits include: attending the Comic-Con programs of your choice, learning the tools of ethnographic investigation, and presenting the findings as part of a presentation to the Comics Arts Conference. Dr. Matthew Smith, who leads the field study every year, is not just a veteran attendee of Comic-Con, but also the author of The Power of Comics.
COMIC-CON SPOTLIGHT ON: Charles Yu, author of How To Live Safely in a Science Fictional Universe.
Here at ScriptPhD.com, we love hobnobbing with the scientific and entertainment elite and talking to writers and filmmakers at the top of their craft as much as the next website. But what we love even more is seeking out new talent, the makers of the books, movies and ideas that you’ll be talking about tomorrow, and being proud to be the first to showcase their work. This year, in our preparation for Comic-Con 2010, we ran across such an individual in Charles Yu, whose first novel, How To Live Safely in a Science Fictional Universe, premieres this fall, and who spoke about it at a panel over the weekend. We had an opportunity to have lunch with Charles in Los Angeles just prior to Comic-Con, and spoke in depth about his new book, along with the state of sci-fi in current literature. We’re pretty sure Charles Yu is a name science fiction fans are going to be hearing for some time to come. ScriptPhD.com is proud to shine our 2010 Comic-Con spotlight on Charles and his debut novel, which is available September 7, 2010.
How To Live Safely in a Science Fictional Universe is the story of a son searching for his father… through quantum space-time. The story takes place in Minor Universe 31, a vast story-space on the outskirts of fiction, where paradox fluctuates like the stock market, lonely sexbots beckon failed protagonists, and time travel is serious business. Every day, people get into time machines and try to do the one thing they should never do: change the past. That’s where the main character, Charles Yu, time travel technician, steps in. Accompanied by TAMMY (whom we consider the new HAL), an operating system with low self-esteem, and a nonexistent but ontologically valid dog named Ed, Charles helps save people from themselves. When he’s not on the job, Charles visits his mother (stuck in a one-hour cycle, she makes dinner over and over and over) and searches for his father, who invented time travel and then vanished.
Questions for Charles Yu
ScriptPhD.com: Charles, the story has tremendous traditional sci-fi roots. Can you discuss where the inspiration for this came from?
Charles Yu: Well, the sci-fi angle definitely comes from being a kid in the 80s, when there were blockbuster sci-fi things all over the place. I’ve always loved [that time], as a casual fan, but also wanted to write it. I didn’t even start doing that until after I’d graduated from law school. I did write, growing up, but I never wrote fiction—I didn’t think I’d be any good at it! I wrote poetry in college, minored in it, actually. Fiction and poetry are both incredibly hard, and poetry takes more discipline, but at least when I failed in my early writing, it was 100 words of failure, instead of 5,000 words of it.
SPhD: What were some of your biggest inspirations growing up (television or books) that contributed to your later work?
CY: Definitely The Foundation Trilogy. I remember reading that in the 8th grade, and I remember spending every waking moment reading, because it was the greatest thing I’d ever read. First of all, I was in the 8th grade, so I hadn’t read that many things, but the idea that Asimov created this entire self-contained universe, it was the first time that I’d been exposed to that idea. And then to have this psychohistory on top, it was kind of trippy. Psychohistory is the idea that social sciences can be just as rigorously captured with equations as any physical science. I think that series of books is the main thing that got me into sci-fi.
SPhD: Any regrets about having named the main character after yourself?
CY: Yes. For a very specific reason. People in my life are going to think it’s biographical, which it’s very much not. And it’s very natural for people to do that. In my first book of short stories, none of the main characters was named after anyone, and still I had family members who asked if it was about our family, or people who gave me great feedback but then said, “How could you do that to your family?” And it was fiction! I don’t think the book could have gotten written had I not left that placeholder in, because the one thing that drove any sort of emotional connection to the story for me was the idea of having fewer things to worry about. The other thing is that because the main character is named after you, as you’re writing the book, it acts as a fuel or vector to help drive the emotional completion.
SPhD: In the world of your novel, people live in a lachrymose, technologically-driven society. Any commentary therein whatsoever on the technological numbing of our own current culture?
CY: Yes. But I didn’t mean it as a condemnation, in a sense. I wouldn’t make an overt statement about technology and society, but I am more interested in the way that technology can sometimes not connect people, but enable people’s tendency to isolate themselves. Certainly, technology has amazing connective possibilities, but that would have been a much different story, obviously. The emotional plot-level core of this book is a box. And that sort of drove everything from there. The technology is almost an emotional technology that [Charles, the main character] has invented with his dad. It’s a larger reflection of his inability to move past certain limitations that he’s put on himself.
SPhD: What drives Charles, the main character of this book?
CY: What’s really driving Charles emotionally is looking for his dad. But more than that, is trying to move through time, to navigate the past without getting stuck in it.
SPhD: Both of his companions are non-human. Any significance to that?
CY: It probably speaks more to my limitations as a writer [laughs]. That was all part of the lonely guy type that Charles is being portrayed as. If he had a human with him, he’d be a much different person.
SPhD: The book abounds in scientific jargon and technological terminology, which is par for the course in science fiction, but was still very ambitious. Do you have high expectations of the audience that will read this book?
CY: Yeah. I was just reading an interview where the writer essentially said “You can never go wrong by expecting too much [of your audience].” You can definitely go wrong the other way, because that would come off as terrible, or as assuming that you know more. But actually, my concerns were more in the other direction, because I knew I was playing fast and loose with concepts that I know I don’t have a great grasp of. I’m writing from the level of an amateur who likes reading science books, and studied science in college—an entertainment layreader. My worry was whether I was BSing too much [of the science]. There are parts where it’s clearly fictional science, but there are other parts where I cite things that are real, and is anyone who reads this who actually knows something about science going to say “What the heck is this guy saying?”
SPhD: How To Live… is written in a very atavistic, retro 80s style of science fiction, and really reminded me of the best of Isaac Asimov. How do you feel about the current state of sci-fi literature as relates to your book?
CY: Two really big keys for me, and things I was thinking about while writing [this book], were one, there is kind of a kitschiness to sci-fi, and I think that’s kind of intentional. It has a kind of do-it-yourself aesthetic to it. In my book, you basically have a guy in the garage with his dad, and yes, the dad is an engineer, but it’s in a garage without great equipment, so it’s not going to look sleek; you can imagine what it’s going to look like—it’s going to look like something you’d build with things you have lying around in the garage. On the other hand, it is supposed to be this fully realized time machine, and you’re not supposed to be able to imagine it. Even now, when I’m in the library in the science-fiction section, I’ll often look for anthologies that are from the 80s, or the greatest time travel stories from the 20th century, which cover a much greater range of time than what’s being published now. It’s almost like the advancement of real-world technology is edging closer to what used to be the realm of science fiction. The way that I would think about that is that it’s not exploiting what the real possibility of science fiction is, which is to explore a current world or any other completely strange world, but not a world totally envisionable ten years from now. You end up speculating on what’s possible or what’s easily extrapolatable from here; that’s not necessarily going to make for super emotional stories.
Charles Yu is a writer and attorney living in Los Angeles, CA.
Last, but certainly not least, is our final Costume of the Day. We chose this young ninja not only for the coolness of his costume, but for his quick wit. As we were taking the snapshot, he said, “I’m smiling, you just can’t see it.” Checkmate to you, young sir.
Incidentally, you can find much more photographic coverage of Comic-Con on our Facebook fan page. Become a fan, because this week, we will be announcing Comic-Con swag giveaways that only Facebook fans are eligible for.
~*ScriptPhD*~
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.