From its earliest incarnations, science fiction has blurred the line between reality and technological fantasy in a remarkably prescient manner. Many of the discoveries and gadgets that have integrated seamlessly into modern life were first conceived as theoretical fictions. More recently, the technologies behind ultra-realistic visual and motion-capture effects have simultaneously begun serving scientists as fine-grained, real-time research tools. The dazzling visual effects of the time-jumping space film Interstellar required writing original code for a physics-based, ultra-realistic depiction of what it would be like to orbit around and through a black hole; astrophysics researchers soon utilized the film’s code to visualize black hole surfaces and their effects on nearby objects. Virtual reality, whose initial development was largely rooted in imbuing realism into the gaming and video industries, has advanced towards multi-purpose applications in film, technology and science. The Science Channel is augmenting traditional programming with a ‘virtual experience’ that simulates the challenges and scenarios of an astronaut’s journey into space; VR-equipped GoPro cameras are documenting remote research environments to foster scientific collaboration and share knowledge; VR is even being implemented in health care to improve training, diagnosis and treatment concepts. The ability to record high-definition film of landscapes and isolated areas with drones, which will have an enormous impact on cinematography, carries with it the simultaneous capacity to aid scientists and health workers with disaster relief, wildlife conservation and remote geomapping.
The evolution of entertainment industry technology is sophisticated, computationally powerful and increasingly cross-functional. A cohort of interdisciplinary researchers at Northwestern University is adapting computing and screen-resolution technology developed at DreamWorks Animation Studios as a vehicle for data visualization, innovation and producing more rapid and efficient results. Their efforts, detailed below, and a collective trend towards integrating visual design into the interpretation of complex research, portend a collaborative future between science and entertainment.
Not long into his tenure as the lead visualization engineer at Northwestern University’s Center for Advanced Molecular Imaging (CAMI), Matt McCrory noticed a striking contrast between the quality of the aesthetic and computational toolkits used in scientific research and those of the entertainment industry. “When you spend enough time in the research community, with people who are doing the research and the visualization of their own data, you start to see what an immense gap there is between what Hollywood produces (in terms of visualization) and what research produces.” McCrory, a former lighting technical director at DreamWorks Animation, where he developed technical tools for the visual effects in Shark Tale, Flushed Away and Kung Fu Panda, believes that combining expertise in cutting-edge visual design with the emerging tools of molecular medicine, biochemistry and pharmacology can greatly speed up the process of discovery. Initially, it was science that offered the TV and film world the rudimentary seeds of technology that would fuel creative output. But the TV and film world ran with it — so much so that the line between science and art is less distinguishable there than in any other industry. “We’re getting to a point [on the screen] where we can’t discern anymore what’s real and what’s not,” McCrory notes. “That’s how good Hollywood [technology] is getting. At the same time, you see very little progress being made in scientific visualization.”
What is most perplexing about the stagnant computing power and visualization in science is that modern research across almost all fields is driven by extremely large, high-resolution data sets. A handful of select MRI scanners are now equipped with magnets ranging from 9.4 to 11.75 Tesla, capable of providing near-cellular spatial resolution (0.1 to 0.2 millimeters, versus roughly 1 millimeter for 1.5 Tesla hospital scanners) and of capturing cellular changes on the microsecond scale. This ultra-high-resolution imaging provides researchers with insight into everything from cancer to neurodegenerative diseases. While most biomedical drug discovery today is engineered by robotics equipment, which screens enormous libraries of chemical compounds for activity potential around the clock in a “high-throughput” fashion — from thousands to hundreds of thousands of samples — the data must still be analyzed, optimized and implemented by researchers. Astronomical observations (from black holes to colliding galaxies to detailed high-power telescope surveys in the billions of pixels) produce some of the largest data sets in all of science. Molecular biology and genetics, which in the genomics era have unveiled great potential for DNA-based sub-cellular therapeutics, have also produced petabytes of data that are a quandary for most researchers to store, let alone visualize.
Unfortunately, most scientists can’t allocate dual resources to both advancing their own research and finding the best technology with which to optimize it. As McCrory points out: “In a field like chemistry or biology, you don’t have people who are working day and night with the next greatest way of rendering photo-realistic images. They’re focused on something related to protein structures or whatever their research is.”
The entertainment industry, on the other hand, has a singular focus on developing and continuously perfecting these tools, as necessitated by the proliferation of divergent content sources, ever-higher screen resolutions and powerful capture devices. As an industry insider, McCrory appreciates the competitive evolution, driven by an urgency that science doesn’t often have to grapple with. “They’ve had to solve some serious problems out there and they also have to deal with issues involving timelines, since it’s a profit-driven industry,” he notes. “So they have to come up with [computing] solutions that are purely about efficiency.” Disney’s 2014 animated science film Big Hero 6 was rendered with cutting-edge visualization tools, including a 55,000-core computer and custom proprietary lighting software called Hyperion. Indeed, the render farms at LucasFilm and Pixar consist of core data centers and state-of-the-art supercomputing resources that could stand alone as enterprise server banks.
At Northwestern’s CAMI, this aggregate toolkit is leveraged by scientists and visual engineers as an integrated, collaborative research asset. In conjunction with a senior animation specialist and long-time video game developer, McCrory helped construct an interactive 3D visualization wall consisting of 25 high-resolution screens totaling 52 million pixels. Compared to a standard computer display (at 1-2 million pixels), the wall allows researchers to visualize and manage entire data sets acquired with higher-quality instruments. Researchers can view their data at native resolution from different perspectives, often standing in front of it in large groups, and analyze complex structures (such as proteins and DNA) in 3D. The interface facilitates real-time innovation and stunning clarity for complex multi-disciplinary experiments. Biochemists, for example, can partner with neuroscientists to visualize brain activity in a mouse as they perfect the design of a drug inhibiting an Alzheimer’s enzyme. Additionally, 7,000 in-house high-performance computing cores (comparable to most studios’ capacity) provide uninterrupted big-data acquisition, storage and mining.
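The wall’s published pixel count squares with simple arithmetic, under one assumption of ours (the article gives only the total): that each of the 25 panels is a standard 1080p display.

```python
# Back-of-the-envelope check on CAMI's display wall. The per-panel
# resolution of 1920 x 1080 is an illustrative assumption, not a
# figure from the article.
panels = 25
pixels_per_panel = 1920 * 1080          # ~2.07 million pixels per panel

wall_pixels = panels * pixels_per_panel
print(wall_pixels)                      # 51840000, i.e. ~52 million

# Versus a typical single-screen workstation at the same resolution,
# the wall displays 25x more data at once, at native resolution.
print(wall_pixels // pixels_per_panel)
```

That 25-fold jump is the practical payoff: an entire high-resolution data set can be on screen simultaneously instead of being panned and zoomed one slice at a time.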
Could there be a day when partnerships between science and entertainment are commonplace? Virtual reality studios such as Wevr, producing cutting-edge content and wearable technology, could become go-to virtual modeling destinations for physicists and structural chemists. Programs like RenderMan, the photo-realistic 3D rendering software developed by Pixar Animation Studios for image synthesis, could enable greater clarity on biological processes and therapeutic targets. Leading global animation studios could be a source of both render farm technology and talent for science centers looking to increase proficiency in data analysis. McCrory, now pushing pixels at animation studio Rainmaker Entertainment, posits that one day, as its visualization capacity grows, NUViz/CAMI could even become a mini-studio within Chicago for aspiring filmmakers.
The entertainment industry has always been at the forefront of inspiring us to “dream big” about what is scientifically possible. But now, it can play an active role in making these possibilities a reality.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development. Follow us on Twitter and Facebook. Subscribe to our podcast on SoundCloud or iTunes.
The current scientific landscape can best be thought of as transitional. With the proliferation of scientific innovation and the growing role technology plays in our lives, along with the demand for more of these breakthroughs, comes the simultaneous challenge of balancing affordable lab space, funding and opportunity for young investigators and inventors to shape their companies and test novel projects. Los Angeles science incubator Lab Launch is trying to simplify the process through a revolutionary not-for-profit approach that serves as a proof of concept for an eventual interconnected network of “discovery hubs”. Founder Llewellyn Cox sits down with ScriptPhD for an insightful podcast that assesses the current scientific climate, the backdrop that catalyzed Lab Launch, and why alternatives to traditional avenues of research are critical for fueling the 21st Century economy.
As science and biotechnology innovation go, we are, to put it in Dickensian terms, in the best of times and the worst of times.
On the one hand, we are in the midst of a pioneering golden age of discovery, biomedical cures and technological evolution. It seems that every day brings limitless possibility and unbridled imagination. The recent development of CRISPR gene-editing machinery will facilitate precise genome splicing and wholesale epigenetic insight into disease and function. Immunotherapy, which programs the body’s innate immune system to eradicate targeted tumors, represents the biggest progress in cancer research in decades. For the first time ever, physicists have detected and measured gravitational waves, confirming Einstein’s theory of general relativity and demonstrating how the spacetime continuum expands and contracts. The private company SpaceX landed a rocket on a drone ship for the first time, enabling faster, cheaper launches and reusable rockets.
Despite these exciting and hopeful advancements, many of which have the potential to greatly benefit society and quality of life, there remain tangible challenges to fostering and preserving innovation. Academic science produces too many PhDs, which saturates the job market, stifles viable prospects for the most talented scientists and even hurts science in the long run. Exacerbating this problem is a shortage of basic research funding in the United States that represents the worst crisis in 50 years. And while European countries experience a similar pullback in grant availability, developing countries are investing in research as an avenue of future economic growth. High-risk, high-reward research, particularly from young investigators, is suppressed at the expense of “safe research” and already-wealthy, established labs. Conduits towards entrepreneurship are possible, many through commercializing academic findings, but few come without strings attached; meanwhile, the rate of start-up company formation has declined 48% since the 1970s. With research and development stagnating at most big pharmaceutical companies and current biomedical research growth unsustainable, there is an unprecedented opportunity to disrupt the innovation pipeline and create a more robust economy.
In an effort to boost discovery and development, venture capital accelerators and think-tank-style early-stage incubators have spread from the technology sector into basic science, where they are experiencing a proliferating boom. Affordable space, world-class facilities, access to startup capital and an opportunity to explore high-risk ideas — all are attractive to young academics and scientific entrepreneurs. Even pharmaceutical giants are spawning innovation arms as potential sources of future ideas. Large cities like New York are even using incubator space as a catalyst for growing a localized biotechnology-fueled economy. Such opportunities, however, don’t come without risk and collateral to innovators. As Mike Jones of Science, Inc. warns, the single biggest question innovators must assess is: “Is the value I am getting equal to the risk I am saving, through equity?” Many incubators and accelerators act as direct conduits to academia and industry, both for talent recruitment and retention of intellectual material. In fact, the business model governing incubator space and asset allocation can often be nebulous, and is sometimes further complicated by mandatory “collaborative” sharing not just of materials and space, but of data and intellectual property. Even wealthy investors, who are now underwriting academic and private sector research, want a voice in the type of research conducted and how it is carried out.
Amidst this idea-driven revolution was born the concept of Lab Launch, a transformative permutation of incubator space for fostering pharmaceutical and biotechnology innovation. The fundamental principle behind Los Angeles-based Lab Launch is deceptively simple. As a not-for-profit endeavor, it provides simple, sleek, high-level equipment and space for life science and biotechnology experimentation. Because all shared equipment is donated as overflow from companies and laboratories that no longer need it, costs are limited to laboratory management fees and facility rental. This stripped-down discovery-engine model allows Lab Launch scientists to keep 100% of their intellectual property and equity, something virtually unheard of for young innovators at early development stages. On a broader level, the potential wide-scale benefits of Lab Launch (and future imitators) are profound and resonant. In an industry where the Boston-San Francisco-San Diego triumvirate holds a near-hegemony over biotechnology funding, development and intellectual assets, the growth of simple, inexpensive science incubators in large cities carries tremendous economic upside. Critics might point to such a platform’s lack of substantive guidance and elite think-tank access, yet 90% of all incubators and accelerators still fail regardless. Moreover, selection criteria are often biased towards specific business interests or research aims that buoy academia and venture capital profiteers, weeding out the most high-risk ideas and participants. How, for example, would a scientist without a PhD or prestigious pedigree get access to a mainstream incubator lab space? How would a radically non-traditional idea or approach merit mainstream support or funding?
A recent Harvard Business Review article suggests that lean start-ups with the most efficient, bare-bones development models have far higher success rates and should be the template for driving an innovation-based economy. As elucidated in the podcast below, opening doors to facilitate proof-of-concept innovation and linking a virtual network of lab spaces will give rise not just to the next Silicon Valley, but to the great scientific breakthroughs of tomorrow.
Lab Launch founder Dr. Llewellyn Cox sat down with ScriptPhD for a podcast interview to talk about his revolutionary not-for-profit startup incubator and the challenging scientific environment that inspired the idea. Among our topics of discussion:
• How a lack of funding and an overflow of PhDs in the current scientific climate stifle creativity and innovation
• Why biotechnology will cultivate exciting new industries in the 21st Century
• How no-strings-attached incubators like Lab Launch help give rise to the Silicon Valleys of the future
• Why we should in fact be hopeful about how scientific progress is advancing
The last 25 years have brought an unprecedented level of scientific and technological advances, impacting virtually all dimensions of society, from communication and the digital revolution, to economics and food production to nanotechnology and medicine – and that’s just a start. The next few decades will rapidly expand this progress with exponential discovery and innovation, amidst more pressing global challenges than we’ve ever faced before. The opportunities to develop faster, better and cheaper products that improve modern living are limitless – Tesla electric cars, energy-saving fuels and machines, robotics – but they all share a common basic need for developing and studying materials in a more efficient manner. This will require a real-time acceleration of sharing, analytics and simulation through readily accessible databases. Essentially, an open-source wiki for materials scientists. In our in-depth article below, ScriptPhD.com explains why materials science is the most critical gateway towards 21st Century technology and how California startup company Citrine Informatics is providing revolutionary new information extraction software to create a crowdsourced, open access database available to any scientist.
There has been a public access revolution of sorts transforming science. A necessary one, at that. Science funding is in crisis. The peer-review process is under heavy scrutiny. And scientists are turning to transparency to help. Some notable billionaires are circumventing public funding and privatizing science. Molecular biologist Ethan Perlstein has used crowdfunding to raise hundreds of thousands of dollars from the public at large for a basic research lab. Last fall, as the Ebola crisis gripped the world’s attention, an expert researcher at The Scripps Research Institute (where I received a PhD) appealed to public crowdfunding to raise $100,000 for vaccine research. With platforms like Experiment, RocketHub and Kickstarter proliferating support for science, the public at large has begun to serve as an important incubator of innovation. More importantly, researchers are increasingly rethinking traditional academic publishing and recognizing the value of open data sharing through publications and databases. The Public Library of Science (PLoS) is the biggest non-profit advocate and publisher of open access research. Mainstream journals like Science have begun publishing free web-based alternatives that are immediately accessible. Calls for unified data sharing have grown louder and more widespread. Even the FDA has announced plans for crowdsourcing a genomics research platform to improve the efficiency of diagnostic and clinical tests and their analysis. Far from hindering the scientific process and output, these efforts have accelerated alternative means of funding, data acquisition and exchange, collaboration and, ultimately, discovery.
Think about virtually any practical aspect of life and it relies on materials. Transportation of any kind, Kevlar vests, sports equipment, communication devices, clothing, growing and making food… it’s impossible to imagine modern society, even in the most impoverished developing countries, without them. Now think on a bigger scale. Superconductors. Carbon nanotubes. Graphene. The materials of the future that will make even these breakthroughs obsolete. So crucial is materials science to all facets of economics, research and quality of life that in 2011 the United States Government launched the Materials Genome Initiative. A partnership between private-sector industry, universities and the government, its primary goal is to use a materials-genome approach to cut the cost and time to market for basic materials products by 50%.
The only problem with materials research? Data. And lots of it. Aided largely by digital services and a proliferation of technology research – not to mention the marketplace for the products it makes possible – there is so much data produced today that the process of testing, developing, tweaking and inserting a material into a product takes about 20 years, according to the National Academy of Sciences. To combat this, materials scientists and engineers have increasingly embraced an open-access philosophy similar to that of life scientists. Leading science publisher Elsevier recently launched an infrastructure called “Open Data” to facilitate materials science data sharing across thirteen major publications. The Lawrence Berkeley National Laboratory just created the world’s largest database of elastic properties, a virtual gold mine for scientists working on materials whose mechanical properties matter for things like cars and airplanes. NASA has even opened a Physical Science Informatics database of all of its space station materials research in the hopes that crowdsourcing accelerates engineering research discovery, applicable both to space and to Earth.
Citrine Informatics, a startup company in California, is hoping to unify these concepts of data sharing and mining through open-access, web-based software (boosted by crowdsourcing) to build a comprehensive, open database of materials and chemical data. Essentially, Citrine is rolling out a cloud platform (Citrination) that acts as a digital middleman between data acquisition and product development. Rather than leaving data scattered across vastly different locations under differing access guidelines, Citrination’s algorithms gather all available data (from publications to databases to publicly shared data from private companies) in order to show material properties under different conditions, including when they might fail. A central registry database allows users to upload data, build customized data sets and see all known properties of a particular material under all different physical conditions (see below). Rather than testing and retesting materials data internally, thus lengthening R&D pipelines, research labs can simply model based on available information and adjust experiments accordingly.
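To make that upload-and-query workflow concrete, here is a minimal sketch in Python of the pattern such a registry enables. The class, field and method names are illustrative inventions of ours, not Citrination’s actual API; the point is only that pooled records from many sources become answerable with a single query.

```python
from dataclasses import dataclass

@dataclass
class PropertyRecord:
    material: str         # chemical formula, e.g. "GaN"
    prop: str             # property name, e.g. "band_gap_eV"
    value: float
    temperature_K: float  # measurement condition
    source: str           # where the value came from

class MaterialsRegistry:
    """Toy crowdsourced registry: contributors upload records from any
    source; users retrieve every known value for a material/property
    pair, optionally filtered by measurement condition."""

    def __init__(self):
        self._records = []

    def upload(self, record):
        self._records.append(record)

    def query(self, material, prop, max_temperature_K=None):
        hits = [r for r in self._records
                if r.material == material and r.prop == prop]
        if max_temperature_K is not None:
            hits = [r for r in hits if r.temperature_K <= max_temperature_K]
        return hits

registry = MaterialsRegistry()
# Records pooled from different (fictional) sources, mimicking how the
# platform pools publications, databases and shared company data.
registry.upload(PropertyRecord("GaN", "band_gap_eV", 3.4, 300.0, "journal article"))
registry.upload(PropertyRecord("GaN", "band_gap_eV", 3.5, 77.0, "public database"))
registry.upload(PropertyRecord("Si", "band_gap_eV", 1.12, 300.0, "company data set"))

# One query surfaces every known measurement at or below room temperature.
room_temp = registry.query("GaN", "band_gap_eV", max_temperature_K=300.0)
for r in room_temp:
    print(r.value, r.source)
```

A lab that can run a query like this before heading into the wet lab is exactly the lab that skips the redundant retesting described above.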
“What we do represents a big change in the status quo,” co-founder Bryce Meredig says. “A lot of the scientists in these industries don’t have a natural inclination to turn to software to solve these problems.” Until now. The software will be made free to not-for-profit research institutions and sold to materials and chemical companies that want to avail themselves of the wealth of data. Citrine has used a large recent infusion of grant funding to raise its ambition beyond a database and towards applications like 3D printing, renewable energy technology (co-founder Greg Mulholland was recognized by Forbes as a 30 Under 30 leader in energy) and the next generation of devices.
The practical benefits to society of extending open access and crowdsourcing to materials science are tremendous. On an individual level, virtually every computer and mobile device we put in our hands, and every gadget and machine we buy in the future, will depend on improved materials. On a wider scale, materials will be at the forefront of solving the most complex, important challenges facing our planet and its inhabitants, none more important than the energy crisis. Food production, water purification and distribution, transportation and infrastructure will all rely on creating sustainable energy. More ambitious endeavors such as space travel, medical treatments and advanced research collaborations will be even more reliant on new and improved materials. The faster that scientists, researchers and engineers can mine data from previous experiments, replicate it and design smarter studies based on computerized models of how those materials behave, the faster they can move breakthroughs from the laboratory into the marketplace. The stakes are high. The scientific rewards are infinite. The time to open and use a free-access materials database is now.
This article was sponsored by Citrine Informatics.
Engineering has an unfortunate image problem. With a seemingly endless array of socioeconomic, technological and large-scale problems to address, and with STEM fields set to comprise the most lucrative 21st Century careers, studying engineering should be a no-brainer. Unfortunately, attracting a wide array of students — or even appreciating engineers as cool — remains difficult, most noticeably among women. When Google Research found out that the #2 reason girls avoid studying STEM fields is perception and stereotypes on screen, they decided to work with Hollywood to change that. Recently, they partnered with the National Academy of Sciences and USC’s prestigious Viterbi School of Engineering to proactively seek out ideas for creating a television program that would showcase a female engineering hero to inspire a new generation of female engineers. The project, entitled “The Next MacGyver,” came to fruition last week in Los Angeles at a star-studded event. ScriptPhD.com was extremely fortunate to receive an invite and have the opportunity to interact with the leaders, scientists and Hollywood representatives that collaborated to make it all possible. Read our full comprehensive coverage below.
“We are in the most exciting era of engineering,” proclaims Yannis C. Yortsos, the Dean of USC’s engineering school. “I look at engineering technology as leveraging phenomena for useful purposes.” These purposes have recently been unified as the 14 Grand Challenges of Engineering — everything from securing cyberspace to reverse-engineering the brain to solving our environmental catastrophes to ensuring global access to food and water. These are monumental problems, and they will require a full-scale workforce to fully realize. It’s no coincidence that STEM jobs are set to grow by 17% by 2024, more than any other sector. Recognizing this opportunity, the US Department of Education (in conjunction with the Science and Technology Council) launched a five-year strategic plan to prioritize STEM education and outreach in all communities.
Despite this golden age, where the possibilities for STEM innovation seem as vast as the challenges facing our world, there is a disconnect in maximizing the full array of talent for the next generation of engineers. There is a noticeable paucity of women and minority students in STEM fields, with women earning just 18-20% of all STEM bachelor’s degrees, even though more students are earning STEM degrees than ever before. Madeline Di Nonno, CEO of the Geena Davis Institute on Gender in Media and a judge at the Next MacGyver competition, boils much of the disinterest down to a consistent lack of female STEM portrayals in television and film. “It’s a 15:1 ratio of male to female characters for just STEM alone. And most of the science-related positions are in life sciences. So we’re not showing females in computer science or mathematics, which is where all the jobs are going to be.” Media portrayal of women (and by proxy minorities) in science remains shallow, biased and appearance-focused (as profiled in depth by Scientific American). Why does this matter? There is a direct correlation between positive media portrayal and STEM career participation.
It has been 30 years since the debut of television’s MacGyver, an action adventure series about clever agent Angus MacGyver, working to right the wrongs of the world through innovation. Rather than using a conventional weapon, MacGyver thwarts enemies with his vast array of scientific knowledge — sometimes possessing no more than a paper clip, a box of matches and a roll of duct tape. Creator Lee Zlotoff notes that in those 30 years, the show has run continuously around the world, perhaps fueled in part by a love of MacGyver’s endless ingenuity. Zlotoff noted the uncanny parallels between MacGyver’s thought process and the scientific method: “You look at what you have and you figure out, how do I turn what I have into what I need?” Three decades later, in the spirit of the show, the USC Viterbi School of Engineering partnered with the National Academy of Sciences and the MacGyver Foundation to search for a new MacGyver: a television show centered around a female protagonist engineer who must solve problems, create new opportunities and, most importantly, save the day. It was an initiative that started back in 2008 at the National Academy of Sciences, aiming to rebrand engineering entirely, away from geeks and techno-gadget builders and towards an emphasis on the much bigger impact that engineering technology has on the world – solving big, global societal problems. USC’s Yortsos says that this big picture resonates distinctly with female students who would otherwise be reluctant to choose engineering as a career. Out of thousands of submitted TV show ideas, twelve were chosen as finalists, each of whom was given five minutes to pitch to a distinguished panel of judges comprising writers, producers, CEOs and successful showrunners. Five winners will have the opportunity to pair with an established Hollywood mentor to write a pilot and showcase it for potential television production.
If The Next MacGyver feels far-reaching in scope, it’s because its aspirations stretch beyond simply getting a clever TV show on air. No less than the White House lent its support to the initiative, with an encouraging video from Chief Technology Officer Megan Smith reiterating the importance of STEM to the 21st Century workforce. As Al Romig of the National Academy of Engineering noted, the great 1950s and 1960s era of engineering growth was fueled by intense competition with the USSR; now we need to be unified and driven by the 14 Grand Challenges of engineering and their offshoots. Part of that will include diversifying the engineering workforce and attracting new talent with fresh ideas. As I noted in a 2013 TEDx talk, television and film carry tremendous power and influence to fuel science passion. The desire to marry engineering and television extends as far back as 1992, when Lockheed Martin’s Norm Augustine proposed a high-profile show named LA Engineer. Since then, he has remained a passionate advocate for elevating engineers to the highest ranks of decision-making, governance and celebrity status. Andrew Viterbi, namesake of USC’s engineering school, echoed this imperative to elevate engineering to “celebrity status” in a 2012 Forbes editorial. “To me, the stakes seem sufficiently high,” said Adam Smith, Senior Manager of Communications and Marketing at USC’s Viterbi School of Engineering. “If you believe that we have real challenges in this country, whether it is cybersecurity, the drought here in California, making cheaper, more efficient solar energy, whatever it may be, if you believe that we can get by with half the talent in this country, that is great. But I believe, and the School believes, that we need a full creative potential to be tackling these big problems.”
So how does Hollywood feel about this movement and the realistic goal of increasing its array of STEM content? “From Script To Screen,” a panel discussion featuring leaders in the entertainment industry, gave equal parts cautionary advice and hopeful encouragement to aspiring writers and producers. Ann Merchant, director of the Los Angeles-based Science And Entertainment Exchange, an offshoot of the National Academy of Sciences that connects filmmakers and writers with scientific expertise for accuracy, sees the biggest obstacle to television depiction of scientists and engineers as a connectivity problem: writers know so few scientists and engineers that they incorporate stereotypes into their writing or eschew the content altogether. Ann Blanchard, of the Creative Artists Agency, largely concurred, noting that writers are often so right-brain focused that they naturally gravitate towards telling creative stories about creative people. But Danielle Feinberg, a computer engineer and lighting director for Oscar-winning Pixar animated films, sees these misconceptions about scientists and what they do as an illusion: when people discover they can combine these careers with what they are naturally passionate about to solve real problems, the work becomes both attainable and exciting. ABC Family’s Marci Cooperstein, who oversaw and developed the crime drama Stitchers, centered around engineers and neuroscientists, remains optimistic about keeping the doors open and encouraging these types of stories, because the demand for new and exciting content is very real. With 42 scripted networks alone, plus many more independent channels, she feels we should celebrate the diversity of science and medical programming that already exists, and build from it. Put a room of writers and engineers together, and they will find a way to tell cool stories.
At the end of the day, Hollywood is in the business of entertaining, telling stories that reflect the contemporary zeitgeist and filling a demand for the subjects that people are most passionate about. The challenge isn’t wanting it, but finding and showcasing it. The panel’s universal advice was to tell exciting new stories centered on science characters who feel new, flawed and interesting. Be innovative: think about why people are going to care about this character and storyline enough to come back each week for more, and incorporate a central engine that will drive the show over several seasons. “Story does trump science,” Merchant pointed out. “But science does drive story.”
The twelve pitches represented a diverse array of procedural, adventure and sci-fi plots, with writers representing an array of traditional screenwriting and scientific training. The five winners, as chosen by the judges and mentors, were as follows:
Miranda Sajdak — Riveting
Sajdak is an accomplished film and TV writer/producer and founder of screenwriting service company ScriptChix. She proposed a World War II-era adventure drama centered on engineer Junie Duncan, who joins the military engineer corps after her fiancé is tragically killed on the front line. Her ingenuity in tackling engineering and technology development ultimately helps win the war.
Beth Keser, PhD — Rule 702
Dr. Keser, unique among the winners for being the only pure scientist, is a global leader in the semiconductor industry and leads a technology initiative at San Diego’s Qualcomm. She proposed a crime procedural centered on Mimi, a brilliant scientist with dual PhDs who forgoes corporate life to become a traveling expert witness for the most complex criminal cases in the world.
Jayde Lovell — SECs (Science And Engineering Clubs)
Jayde, a rising STEM communication star, launched the YouTube pop science network “Did Someone Say Science?” Her show proposal is a fun fish-out-of-water drama about 15-year-old Emily, a pretty, popular and privileged high school student. After accidentally burning down her high school gym, she avoids expulsion only by joining the dreaded, geeky SECs club, and in turn helps them win an engineering competition while learning to be cool.
Craig Motlong — Q Branch
Craig is a USC-trained MFA screenwriter and now a creative director at a Seattle advertising agency. His spy action thriller centers on mad scientist Skyler Towne, an engineer leading a corps of researchers at the fringes of the CIA’s “Q Branch,” where they develop and test the gadgets that help agents stay three steps ahead of the world’s biggest criminals.
Shanee Edwards — Ada and the Machine
Shanee, an award-winning screenwriter, is the film reviewer at SheKnows.com and the host/producer of the web series She Blinded Me With Science. As a fan of traditional scientific figures, Shanee proposed a fictionalized series around real-life 1800s mathematician Ada Lovelace, famous for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. In this drama, Ada works with Babbage to help Scotland Yard fight opponents of the industrial revolution, exploring familiar themes of technology ethics relevant to our lives today.
Craig Motlong, one of five ultimate winners, and one of the few finalists with absolutely no science background, spent several months researching his concept with engineers and CIA experts to see how theoretical technology might be incorporated and utilized in a modern criminal lab. He told me he was equal parts grateful and overwhelmed. “It’s an amazing group of pitches, and seeing everyone pitch their ideas today made me fall in love with each one of them a little bit, so I know it’s gotta be hard to choose from them.”
Whether inspired by social change, pragmatic inquisitiveness or pure scientific ambition, this seminal event was ultimately both a cornerstone for strengthening a growing science/entertainment alliance and a deeply personal quest for all involved. “I don’t know if I was as wrapped up in these issues until I had kids,” said USC’s Smith. “I’ve got two little girls, and I tried thinking about what kind of shows [depicting female science protagonists] I should have them watch. There’s not a lot that I feel really good sharing with them, once you start scanning through the channels.” Motlong, the only male winner, is profoundly influenced by his experience of being surrounded by strong women, including a beloved USC screenwriting instructor. “My grandmother worked during the Depression and had to quit because her husband got a job. My mom had like a couple of options available to her in terms of career, my wife wanted to be a genetic engineer when she was little and can’t remember why she stopped,” he reflected. “So I feel like we are losing generations of talent here, and I’m on the side of the angels, I hope.” NAS’s Ann Merchant sees an overarching vision on an institutional level to help achieve the STEM goals set forth by this competition in influencing the next generation of scientists. “It’s why the National Academy of Sciences and Engineering as an institution has a program [The Science and Entertainment Exchange] based out of Los Angeles, because it is much more than this [single competition].”
Indeed, The Next MacGyver event, while glitzy and glamorous in a way befitting the entertainment industry, seemed to have succeeded wildly beyond its sponsors’ collective expectations. It was ambitious, sweeping, the first of its kind, and required the collaboration of many academic, industry and entertainment alliances. But it might have the power to influence and transform an entire pool of STEM participants, the way ER and CSI sparked renewed interest in emergency medicine and forensic science, respectively. If not this group of pitched shows, then the next. If not this group of writers, then the ones who come after them. Searching for a new MacGyver transcends finding an engineering hero for a new age with new, complex problems. It’s about being a catalyst for meaningful academic change and creative inspiration. Or, at the very least, opening up Hollywood’s eyes and time slots. Zlotoff, whose MacGyver Foundation supported the event and continually seeks to promote innovation and peaceful change through education opportunities, recognized this in his powerful closing remarks. “The important thing about this competition is that we had this competition. The bell got rung. Women need to be a part of the solution to fixing the problems on this planet. [By recognizing that], we’ve succeeded. We’ve already won.”
The Next MacGyver event was held at the Paley Center For Media in Beverly Hills, CA on July 28, 2015. Follow all of the competition information on their site. Watch a full recap of the event on the Paley Center YouTube Channel.
*****************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development. Follow us on Twitter and Facebook. Subscribe to our podcast on SoundCloud or iTunes.
“It’s like a war. You don’t know whether you’re going to win the war. You don’t know if you’re going to survive the war. You don’t know if the project is going to survive the war.” The war? Cancer, still one of the leading causes of death despite the 40 years that have passed since the National Cancer Act of 1971 catapulted Richard Nixon’s famous “War on Cancer.” The speaker of the above quote? A scientist at Genentech, a San Francisco-based biotechnology and pharmaceutical company, describing efforts to pursue a then-promising breast cancer treatment in the face of numerous obstacles, not the least of which was the patients’ rapidly progressing illness. If it sounds like a made-for-Hollywood story, it is. But I Want So Much To Live is no ordinary documentary. It was commissioned as an in-house documentary by Genentech, a rarity in the staid, secretive scientific corporate world. The production values and storytelling offer a tremendous template for Hollywood filmmakers, as science and biomedical content become ever more pervasive in film. Finally, the inspirational story behind Herceptin, one of the most successful cancer treatments of all time, offers a testament and rare insight into the dedication and emotion that makes science work. Full story and review under the “continue reading” cut.
For biotechnology and pharmaceutical companies, it is the best of times, it is the worst of times. On the one hand, many people consider this a Golden Era of pharmaceutical discovery and innovation for certain illnesses like cancer; other areas, such as HIV, receive poor grades for drug and vaccine development. Furthermore, the FDA recently imposed much more stringent controls on drugs brought to market, leaving some to posit that this will have a negative impact on future pharmaceutical breakthroughs. And while a recent documentary chronicles some of the unhealthy profits of the pharmaceutical industry, the enormous cost of developing and bringing medicines to market is often gravely overlooked. Today, the pharmaceutical industry as a whole has one of the lowest favorability scores of any major industry, despite some impressive social contributions, partnerships and global health investments. Much of this public hostility simply comes down to the fact that people don’t know very much about the pharmaceutical industry, which is notoriously reluctant to publicize or reveal anything about its inner workings.
Science in Hollywood is experiencing no such crisis. In many ways, it is a golden age for science, technology and medicine in film, with more big-budget mainstream films exploring themes and content germane to 21st Century science than ever before. Last year alone, three smart hit movies broached the realities, hopes and anxieties of the technological times we live in, each in a very different way. The stylish and ambitious thriller Limitless explored the possibility of limitless brain capacity through pharmacopeia, a magical pill that would maximize one’s intelligence and allow 100% brain function around the clock. Though it certainly echoed the credo of the modern pharmaceutical movement—there is a pill that can solve every problem, whether it’s been invented or not—Limitless fell slightly short in condemning (or even properly acknowledging) the impracticalities and ethical irresponsibility of developing such a drug, especially in its ending. Steven Soderbergh’s surgical and pinpoint-accurate epic Contagion gave audiences a spine-chilling, terrorizing purview into the medical and public health realities of a modern-day pandemic. But while it succeeded in showcasing how government agencies, university labs and medical establishments would contend with and fight off such a global disaster, Contagion was never able to connect audiences emotionally either with the characters impacted by the pandemic or with the scientists battling it. No recent movie is a better example of delicate introspection and exposition than the brilliant, poignant, funny and difficult 50/50. On the heels of CNN pondering whether Hollywood could take on cancer came a film that did so with reality, grace and even humor. Partially because it was based on screenwriter Will Reiser’s own brush with cancer, 50/50 set aside the clinical as a secondary backdrop to examine the psychological.
Each of the films above embodies qualities that are essential components of effective Hollywood science storytelling – scientific accuracy, emotional connection to the outside world and an overview of biomedical impact and innovation. We recently screened an industry documentary, filmed at the request of Genentech scientists, called I Want So Much To Live, that is an excellent blueprint for the way we’d like to see scientific stories portrayed in film. Best of all, it sacrifices neither the human story for the technical one, nor the very real, complex emotions that scientists, engineers and doctors feel when they develop and market potentially life-saving technology.
The miracle of Herceptin is really a decade-long journey that started in the labs of UCLA, moved to the pharmaceutical labs of San Francisco, and endured countless obstacles, street protests and controversies to end up as one of the most revolutionary breakthroughs in the history of breast cancer treatment research. Advances in cancer insight always seem to come in evolutionary leaps. For example, the cellular mechanism by which normal cells become cancerous was unknown until Harold Varmus and Michael Bishop established the presence of retroviral oncogenes, genes that control cellular growth and replication. When either disrupted or turned on, these genes contribute to the transformation of normal cells into tumors. Other than the discovery of anti-estrogen treatment for breast cancers, relatively little new ground had been gained in fighting the disease. Scientists continued to be perplexed why some women were cured by chemotherapy, which tries to stop cancer cell division by attacking the most rapidly-dividing cells in the body, while others didn’t respond at all. It was not until the late 80s that scientists Axel Ullrich and Michael Shepard (both featured in the film) discovered that about 20-30% of early-stage breast cancers amplify and overexpress a gene called HER-2, which encodes a protein embedded in the cell membrane that helps regulate cell growth and signaling. With the help of UCLA scientist Dennis Slamon, famously portrayed by Harry Connick, Jr. in a made-for-TV movie about the development of Herceptin, the scientists soon developed an anti-HER-2 antibody that significantly slowed tumor growth.
An early Phase I clinical trial was conducted with 20 volunteers simply to establish safety. The lone survivor, still alive to this day, had been given 10 weeks to live. Phase II trials homed in on dosage and on establishing that the drug performed its intended effects. This time, out of 85 volunteers, five responded completely—not a bad result, but not enough for the FDA and the science community. The scientists took a huge risk for their Phase III study: they combined their anti-HER-2 antibody with the current standard treatment. The results were astounding. Out of 450 patients, 50% survived — the highest ever success rate for metastatic cancer.
Think the story ends here? Think again. This is where it begins to take more emotional twists and turns than a fictitious Hollywood script. Unlike many Hollywood productions, though, the human impact angle was shared equally among all the players in this evolving story, easily this documentary’s most powerful aspect. In order to run its Phase III trials of Herceptin (in concert with the chemotherapy treatments available at that time), Genentech had to establish a highly controversial lottery system to pick those who would receive the highly limited life-saving quantities of Herceptin and those who would be assigned to the control group, and thereby handed a death sentence. So controversial was the lottery system that it engendered televised protests in the Bay Area, along with anguished pleas from dying patients—the documentary’s title is the first sentence of one such letter: “I want so much to live.” The scientists at Genentech were hardly immune to the weight of each decision, either. They agonized over the fairness of the lottery system, over producing enough high-quality treatment to complete the clinical trial, and even over keeping an unbiased eye on the science in order to save lives in the long run. Talking about the pressure of those days reduced one of the scientists to tears. And after all was said and done, the lone FDA scientist entrusted with the power to oversee the Herceptin study and green-light its approval as a drug? She had just lost her mother to breast cancer. These intertwining fortunes are summarized by executive producer Christie Castro: “By definition, groups of people are imperfect. But those who worked on Herceptin proved that the complexity – indeed, the fantastic mess – that simply comes with being human can sometimes result in something truly worthwhile.”
One of the first patients to get the experimental Herceptin treatment prior to FDA approval, though not profiled in the movie, is flourishing well over a decade after being diagnosed with the most aggressive form of breast cancer. Stories like hers lie at the emotional heart of the I Want So Much To Live story (and Genentech’s motivation for continuing the controversial studies):
Herceptin was officially approved as a drug on September 22, 2000. On October 20, 2010, Herceptin was approved as an adjuvant (joint) treatment with current chemotherapy drugs for the treatment of aggressive breast cancer. To date, the adjuvant therapy has had an impressive 58% success rate for a cancer that once carried a grim rate of survival for those afflicted.
Take a look at the trailer for I Want So Much To Live:
The powerful and well-crafted content of this documentary should serve as a valuable template for how the multi-faceted power of storytelling can be used across multiple industries. It smartly tells a gripping scientific story without either dumbing down the science or elevating it beyond a layperson’s understanding—an essential goal for the increasing amount of cinematic fare such as Contagion. It provides a functional breakdown of the enormous challenges and technical obstacles of the pharmaceutical drug development process, which, like many other aspects of science, is mysterious to the general public, out of their grasp and seemingly always occurring behind closed doors. Especially at a time when public perception of the pharmaceutical industry is at an all-time low, such transparency could strengthen reputations and increase business. “Corporations are,” executive producer Christine Castro reminds us, “groups of people who have ideas, ambitions, conflicts and dreams, and, at the end of the day, a desire to see their work result in something meaningful. That’s why we decided to take a creative chance and face the potential skepticism that a corporation would or could tell an unvarnished story about itself.”
Finally, the film develops a three-dimensional emotional tether to the three different sides impacted by the scientific process: scientists, the agencies that regulate them and society as a whole. There doesn’t always have to be a tacit bad guy, and sometimes, this protagonistic complexity makes for the best story of all. Holder, who started filming I Want So Much To Live around the same time that her late brother was diagnosed with a rare and virulent form of cancer, echoed our sentiment as she reflected on the process of making the film. It allowed her to discover “that science is a creative pursuit as well as a technical one; that science is beautiful and can be accessible; and that anyone, at any time, might have the idea that could one day save lives.”
We can only hope that the harmony of creativity, passion and emotion devoted to all sides of the drug discovery process within this film translates to more private and studio productions dealing with complex scientific and socio-technological issues.
ScriptPhD.com caught up with filmmaker Elizabeth Holder, who directed and produced I Want So Much To Live. Here are some of her thoughts on putting together this incredible story and interacting with the scientists and heroic patients that made it happen:
ScriptPhD.com: Can you tell me where the seeds of inspiration for the story of the drug Herceptin first arose, and what inspired you to tackle this material for your documentary?
Elizabeth Holder: The initial idea to make a documentary film about Herceptin came from executive producer Chris Castro, who upon joining Genentech in 2007 thought that the story would make a compelling documentary film. (She will have to share with you her experience.) I first heard about the project from a friend and began doing research on Herceptin and Genentech. I was excited to work on this film; excited to jump into and explore a new world. My first inspiration came from the people who were the story; the passionate men and women who faced adversity with courage and perseverance, never swaying from their pursuit, making difficult decisions laced with moral and ethical ramifications. I knew this story of individual and collective growth would resonate with many, and would be especially poignant to the employees of Genentech. (This at the time was the intended audience for the film.) When I began working on this film in 2008 I had no idea how personal this journey would become and how connected I would be to the people I would meet and the story I was going to tell.
While I was making the film, my younger brother David was battling cancer – a rare type of cancer for a 33 year old man. While I was meeting with scientists and learning about biotech and drug development for the movie, David was fighting the disease with everything science and medicine could offer. He wrote a blog about his journey, signing off each entry with the words “Plow On”. Each day, I would hope that the scientists would hurry up. Figure it out. But I learned firsthand that science is not a “hurry up” business and that many people are doing everything they can to find ways to stop cancer. My wish is that the film serves to inspire everyone who is on the frontlines in the battle against cancer, to encourage them to keep on fighting the good fight, no matter what, and even on a bad day, to Plow On.
SPhD: How willing were the patients and scientists to contribute to the project?
EH: As you can imagine, everyone, especially scientists, is skeptical. Some people took a bit more convincing than others, but once they started talking, the interviews, both on and off camera, were amazing.
I am grateful to the patients, scientists, activists, executives, and doctors for honestly and enthusiastically sharing their stories, perspective, and experience with me. I quickly became indebted to mentors and colleagues who diligently and without judgment explained and re-explained molecular biology and the drug development process to me. I hope the determination and delight in which they approach their work is reflected in the film.
SPhD: Any of your own preconceived notions that were shattered or altered throughout the making of this film?
EH: I discovered striking similarities between scientists and filmmakers which I did not expect to find. A research scientist and a filmmaker must each imagine an idea, convince others to recognize the value of funding the idea, and then prove the concept. Like many filmmakers, the scientists I met were impassioned about their work and showed great determination in the face of extraordinary odds. Like filmmaking, drug development takes a village. Before making this film I had no idea how many years and how many people it took to develop a drug; the process involves a huge collaborative effort between massive numbers of people in multiple organizations, in various countries.
It was incredible and amazing to me that the scientists would talk about “cells” and “exons” and “nucleotides” as if they could actually be seen by the human eye. It was also inspiring to me that a scientist is committed enough to work on a research project for their whole career with the knowledge that they might not ever see an outcome in their lifetime. And finally, I was pleased to confirm (though not statistically proven) that a lot of really smart and accomplished people do not have perfectly clean desks.
SPhD: Within the movie, we get a real feel for the dichotomy between the emotional appeals of the desperately ill patients, the cautious, careful FDA scientists, and the Genentech researchers who want to make sure the product they introduce is safe for patients. Was this a thematic element you foresaw or that developed as you pieced the film together?
EH: I carefully planned out the film, yet also left room for new discoveries along the way. (I was constantly learning – from each filmed interview, from advisors, from books.) For each defining moment in the film I made sure to film at least three people talking about the same experience with different opinions. I wanted to make sure that the topic was covered from various perspectives so I could intercut interviews together. I knew that I was not going to use narration. I only wanted people who were part of the story to be telling the story; to engage the audience with their firsthand accounts. I wanted the audience to feel connected emotionally to each person in the film, to empathize with the person on screen even if they disagreed with their tactic and/or goal. Additionally, I knew I was going to use archival footage, photos and authentic documents to organically reveal the isolation and miscommunication, the unwitting partnerships, the building mistrust and the eventual coming together. When I first saw and read the pile of letters saved by Geoff, I knew that I would use it in the film. I carried a few of those letters with me to every interview and pulled them out when it felt right, asking people to read them and respond. The scene was assembled to show how incorrect assumptions lead to strife; to show how each person’s journey was critical to the whole story; and to show how those intertwining stories eventually became the framework for the work that is continuing today.
SPhD: What are your own thoughts on the lottery system that Genentech ultimately used to determine who would be eligible to participate in the Herceptin clinical trials?
EH: I see both sides of the issue, and don’t think there is an easy answer. When interviewing people for this film, I went into each interview with a clean slate, without having any pre-conceived agenda or opinion. It was critical that I empathized with each person and was able to tell the story though the objectives and needs of those who I interviewed, those who had direct experience. I needed to be able to fully see and feel the situation from their point of view. And, to me, judgment is only something that pulls us apart, not together. I am thankful I am in the documentary business and not in the business of making the kind of decisions that had to be made during that time. I am not sure what I would have done if someone I loved needed the drug before it was approved.
~*ScriptPhD*~
Read through any archive of science fiction movies, and you quickly realize that the merger of pop culture and science dates as far back as the dawn of cinema in the early 1920s. Even more surprising than the enduring prevalence of science in film is that the relationship between film directors, scribes and the science advisors that have influenced their works is equally as rich and timeless. Lab Coats in Hollywood: Science, Scientists, and Cinema (2011, MIT Press), one of the most in-depth books on the intersection of science and Hollywood to date, serves as the backdrop for recounting the history of science and technology in film, how it influenced real-world research and the scientists that contributed their ideas to improve the cinematic realism of science and scientists. For a full ScriptPhD.com review and in-depth extended discussion of science advising in the film industry, please click the “continue reading” cut.
Written by David A. Kirby, Lecturer in Science Communication Studies at the Centre for History of Science, Technology and Medicine at the University of Manchester, England, Lab Coats offers a surprising, detailed analysis of the symbiotic—if sometimes contentious—partnership between filmmakers and scientists. This includes the wide-ranging services science advisors can be asked to provide to members of a film’s production staff, how these ideas are subsequently incorporated into the film, and why the depiction of scientists in film carries such enormous real-world consequences. Thorough, detailed, and honest, Lab Coats in Hollywood is an exhaustive tome of the history of scientists’ impact on cinema and storytelling. It’s also an essential and realistic road map of the challenges that scientists, engineers and other technical advisors might face as they seriously pursue science advising to the film industry as a career.
The essential questions that Lab Coats in Hollywood addresses are these—is it worth it to hire a science advisor for a movie production? Is it worth it for the scientist to be an advisor? The book’s purposefully vague conclusion is that it depends on how much the scientist can improve the film’s storyline and visual effects. Kirby wisely writes with an objective tone here because the topic is open to a considerable amount of debate among the scientists and filmmakers profiled in the book. Sometimes a scientist is so key to a film’s development, he or she becomes an indispensable part of the day-to-day production. A good example of this is Jack Horner, paleontologist at the Museum of the Rockies in Bozeman, MT, and technical science advisor to Steven Spielberg in Jurassic Park and both of its sequels. Horner, who drew from his own research on the link between dinosaurs and birds for a more realistic depiction of the film’s contentious science, helped filmmakers construct visuals, write dialogue, shape character reactions and animal behaviors, and map out entire scenes. J. Marvin Herndon, a geophysicist at the Transdyne Corporation, approached the director of the disaster film The Core when he learned the plot was going to be based on his controversial hypothesis about a giant uranium ball in the center of the Earth. Herndon’s ideas were fully incorporated into the film’s plot, while Herndon rode the wave of publicity from the film to publish his research in a PNAS paper. The gold standard of science input, however, was Stanley Kubrick’s roster of multiple science and engineering advisors for 2001: A Space Odyssey, discussed in much further detail below.
Kirby hypothesizes that sometimes, a film’s poor reception might have been avoided with a science advisor. He provides the example of the Arnold Schwarzenegger futuristic sci-fi bomb The Sixth Day, which contained a ludicrously implausible use of human cloning in its main plot. While the film may have been destined for failure, Kirby posits that it could only have benefited from proper script vetting by a scientist. By contrast, the 1998 action adventure thriller Armageddon came under heavy expert criticism for its basic assertion that an asteroid “the size of Texas” could go undetected until eighteen days before impact. Director Michael Bay patently refused to take the advice of his advisor, NASA researcher Ivan Bekey, and admitted he was sacrificing science for plot, but Armageddon went on to be a huge box office hit regardless. Quite often, the presence of a science advisor is helpful, albeit not strictly necessary. One of the book’s more amusing anecdotes is about Dustin Hoffman’s hyper-obsessive shadowing of a scientist for the making of the pandemic thriller Outbreak (a great guide to the movie’s science can be found here). Hoffman was preparing to play a virologist and wanted to infuse realism into all of his character’s reactions. Hoffman kept asking the scientist to document reactions in mundane situations that we all encounter—a traffic jam, for example—only to come to the shocking conclusion that the scientist was a real person just like everyone else.
Most of the time, including scientists in the filmmaking process is at the discretion of the studios because of the one immutable decree reiterated throughout the book: the story is king. When a writer, producer or director hires a science consultant, their expertise is utilized solely to facilitate, improve or augment story elements for the purposes of entertaining the audience. Because of this, one of the most difficult adjustments a science consultant may face is a secondary status on-set even though they may be a superstar in their own field. Some of the other less glamorous aspects of film consulting include heavy negotiations with unionized writers for script or storyline changes, long working hours, a delicate balance between side consulting work and a day job, and most importantly, an inconsistent (sometimes nonexistent) payment structure per project. I was notably thrilled to see Kirby mention the pros and cons of programs such as the National Science Foundation’s Creative Science Studio (a collaboration with USC’s school of the Cinematic Arts) and the National Academy of Science’s Science and Entertainment Exchange, which both provide on-demand scientific expertise to the Hollywood filmmaking community in the hope of increasing and promoting the realism of scientific portrayal in film. While valuable commodities to science communication, both programs have had the unfortunate effect of acclimating Hollywood studios to expect high-level scientific consulting for free.
1968’s 2001: A Space Odyssey is widely considered the greatest sci-fi movie ever made, and certainly the most influential. As such, Kirby devotes an entire chapter to detailing the film’s production and its integration of science. Director Stanley Kubrick took painstaking care with scientific accuracy to explore complex ideas about the relationship between humanity and technology, hiring advisors ranging from anthropologists and aeronautical engineers to statisticians and nuclear physicists for various stages of production. Statistician I. J. Good provided advice on supercomputers, aerospace artist Harry Lange handled production design, and NASA space scientist Frederick Ordway lent over three years of his time to develop the space technology used in the film. In the process, Kubrick’s staff consulted with over sixty-five different private companies, government agencies, university groups and research institutions. So real was the space technology in 2001 that moon landing hoax proponents have claimed the actual 1969 moon landing by United States astronauts was staged on the same sets. Not every science-based film since has used scientific input as meticulously or thoroughly, but Kubrick’s influence on the film industry’s fascination with science and technology is an undeniable legacy.
One of the real treats of Lab Coats in Hollywood is its exploration of the two-way relationship between scientists and filmmakers, and how film in turn influences the course of science, as we discuss in more detail below. Between film case studies, critiques and interviews with past science advisors are interstitial vignettes of the ways scientists have shaped films we know and love. Even the animated feature Finding Nemo had an oceanography advisor to get the marine biology right. The seminal moment of the most recent Star Trek installment was due to a piece of offhand scientific advice from an astronomer. The cloning science of Jurassic Park, so thoroughly researched and pieced together by director Steven Spielberg and science advisor Jack Horner, was actually published in a top-notch journal days ahead of the movie’s premiere. Even the rare spots where the book drags a bit with highly technical analysis are studded with cinematic backstories whose details readers will salivate over. (For example, there’s a very good reason all the kelp went missing from Finding Nemo between its cinematic and DVD releases.)
As the director of a creative scientific consulting company based in Los Angeles, one of the questions I am asked most often is “What does a science advisor do, exactly?” Lab Coats in Hollywood does an excellent job of recounting stories and case studies of high-profile scientist consultants (all of whom contributed their creative talents to their respective films in different ways), of explaining what might be expected (and not expected) of scientists on set, and of surveying the areas of expertise currently in demand in Hollywood. Kirby breaks down cinematic fact checking, the most frequent task scientists are hired to perform, into three areas within textbook science (known, proven facts that cannot be disputed, such as gravity): public science, facts we all know and would find ridiculous if filmmakers got wrong; expert science, facts known to specialists and scientific experts but not to the lay audience; and (most problematic) folk science, incorrect science that has nevertheless been accepted as true by the public. Filmmakers are most likely to alter or modify facts that they perceive as expert science, to minimize repercussions at the box office.
A science advisor is constantly navigating cinematic storytelling constraints and a filmmaker’s desire to utilize only the most visually appealing and interesting aspects of science (regardless of whether the context is always academically appropriate). Another broad area of high demand is helping actors look and act like real scientists on screen. Scientists have been hired to do everything from doctoring dialogue to add realism to an actor’s portrayal (the movie Contact and Jodie Foster’s depiction of Dr. Ellie Arroway is a good example), to training actors to use equipment and pronounce foreign-sounding jargon, to replicating laboratory notebooks or chalkboard scribbles with the symbols and shorthand of science (as in the mathematics film A Beautiful Mind), to recreating the physical space of an authentic laboratory. Finally, the scientist’s expertise of the known is used to help construct plausible scenarios and storylines for the speculative, an area that requires the greatest degree of flexibility and compromise from the science advisor. Uncertainty, unexplored research and “what if” scenarios, the bane of every scientist’s existence, happen to be Hollywood’s favorites, because they allow the greatest creative freedom in storytelling and speculative conceptualization without being negated by a proven scientific impossibility. An entire chapter (the book’s finest) is devoted to two case studies, Deep Impact and The Hulk, where real science concepts (near-Earth asteroid impacts and genetic engineering, respectively) were researched and integrated into the stories that unfolded on screen. (Side note: if you are ever planning on being a science advisor, read this section of the book very carefully.)
In years past, consulting on films didn’t necessarily bring acclaim to scientists within their own research communities; indeed, Lab Coats recounts many instances where scientists, including popular media figures such as Carl Sagan and Paul Ehrlich, were viewed as betraying science or undermining its seriousness with Hollywood frivolity. Recently, however, consultants have come to be viewed as publicity investments, both by studios that hire high-profile researchers for the recognition value of their film’s science content and by institutes that benefit from branding and exposure. Science films of the last 10-15 years such as GATTACA, Outbreak, Armageddon, Contact, The Day After Tomorrow and a panoply of space-related flicks have attached big-name scientists as consultants (gene therapy pioneer French Anderson, epidemiologist David Morens, NASA’s Ivan Bekey, SETI Institute astronomers Seth Shostak and Jill Tarter and climatologist Michael Molitor, respectively). They also happen to revolve around the research salient to our modern era: genetic bioengineering, global infectious diseases, near-Earth objects, global warming and (as always) deep space exploration. As such, a mutually beneficial marketing relationship has emerged between science advisors and studios that transcends the film itself, bringing funding and visibility to individual scientists, their research, and even institutes and research centers. The National Severe Storms Laboratory (NSSL) promoted itself in two recent films, Twister and Dante’s Peak, using them as a vehicle to publicize its scientific work, to brand its researchers as heroes underfunded by the government, and to temper public expectations about storm prediction. No institute has had a deeper relationship with Hollywood than NASA, extending back to the Star Trek television series, with intricate involvement and prominent logo display in the films Apollo 13, Armageddon, Mission to Mars, and Space Cowboys.
Some critics have argued that this relationship played an integral role in helping NASA maintain a positive public profile after the devastating 1986 Challenger space shuttle disaster. The end result of the aforementioned promotion via cinematic integration can only benefit scientific innovation and public support.
Accurate and favorable portrayal of science content in modern cinema has an even bigger beneficiary than specific research institutes: society itself. Fictional technology portrayed in film, termed a “diegetic prototype,” has often inspired or led directly to real-world application and development. Kirby offers as the most impactful case of diegetic prototyping the 1981 film Threshold, which portrayed the first successful implantation of a permanent artificial heart, a medical marvel that became reality only a year later. Robert Jarvik, inventor of the Jarvik-7 artificial heart used in the transplant, was also a key medical advisor for Threshold, and felt that his participation in the film could both facilitate technological realism and, by doing so, help ease public fears about what was then considered a freak surgery, one controversial enough to have been banned in Great Britain. Of the many obstacles that expensive, ambitious, large-scale research faces, Kirby argues that skepticism or lack of enthusiasm from the public can be the most difficult to overcome, precisely because it feeds directly into the political support that makes funding possible. A later example of film as an avenue for promoting futuristic technology is Minority Report, set in the year 2054 and featuring realistic gestural interfacing technology and visual analytics software used to predict crime before it happens. Less than a decade later, technology and gadgets featured in the film have come to fruition in the form of multi-touch interfaces like the iPad and retina scanners, with others in development, including insect robots (mimics of the film’s spider robots), facial recognition advertising billboards, crime prediction software and electronic paper. A much more recent example not featured in the book is the 2011 film Limitless, featuring a writer who is able to stimulate and access 100% of his brain at will by taking a nootropic drug.
While the fictitious drug portrayed in the film is not yet a neurochemical reality, brain enhancement is a rising field of biomedical research, and may one day indeed yield a brain-boosting pill.
No other scientific feat has been a bigger beneficiary of diegetic prototyping than space travel, starting with 1929’s prophetic masterpiece Frau im Mond [Woman in the Moon], sponsored by the German Rocket Society and advised masterfully by Hermann Oberth, a pioneering German rocket scientist. The first film ever to present the basics of rocket travel, and credited with inspiring the now-standard countdown to zero before real-life launches, Frau im Mond also featured a prototype of the liquid-fuel rocket and inspired a generation of physicists to contribute to the eventual realization of space travel. Destination Moon, a 1950 American sci-fi film about a privately financed trip to the Moon, was the first film produced in the United States to deal realistically with the prospect of space travel, thanks to the technical and screenplay input of notable science fiction author Robert A. Heinlein. Released seven years before the start of the USSR’s Sputnik program, Destination Moon set off a wave of iconic space films and television shows such as When Worlds Collide, Red Planet Mars, Conquest of Space and Star Trek in the midst of the 1950s and 1960s Cold War “space race” between the United States and the Soviet Union. What theoretical scientific feat will propel the next diegetic prototype? A mission to Mars? Space colonization? Anti-aging research? Advanced stem cell research? Only time will tell.
Ultimately, readers will enjoy Lab Coats in Hollywood for its engaging writing style, its detailed exploration of the history of science in film and, most of all, valuable advice from fellow scientists who transitioned from the lab to consulting on a movie set. Whether you are a sci-fi film buff or a research scientist aspiring to be a Hollywood consultant, you will find some aspect of this book fascinating. Given the rapid proliferation of science and technology content in movies (even those outside the traditional sci-fi genre), and the input from the scientific community it will surely necessitate, knowing the benefits and pitfalls of this increasingly in-demand career choice matters both to aspiring consultants and to the accurate portrayal of scientists to the general public.
~*ScriptPhD*~
***************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
In his State of the Union speech in January, US President Barack Obama proclaimed that “we need to teach our kids that it’s not just the winner of the Super Bowl who deserves to be celebrated, but the winner of the science fair.” A noble (and correct) assessment, to be sure, but one mired in numerous educational and cultural obstacles. For one thing, science fairs themselves are at a perilous crossroads. A New York Times report issued in February stated not only that participation in science fairs among high school kids is falling, but that the kind of creativity and independent exploration these competitions necessitate is impossible under current rigid, test-driven educational guidelines for teaching mathematics and science. Indeed, an interesting recent Newsweek article on “The Creativity Crisis” conveyed research showing that, for the first time, American creativity is declining. How appropriate, then, that this April (national math education month) brings the culmination of the Google World Science Fair, the first-ever competition of its kind taking place online and open to lab rats from all over the globe. ScriptPhD.com discusses why this could be a game-changer for the next generation of young scientists, under the “continue reading” cut.
One of the most chilling chapters in Thomas Friedman’s brilliant 2005 book “The World is Flat” discusses the ramifications of the globalization of science, and how quickly America is getting left behind. In addition to global “flatteners” (connectors) such as the internet, outsourcing, and yes, even access to free information via Google, Friedman details how hard third-world nations such as India and China work to attain first-rate educations in math and science. On the one hand, they are producing more raw talent than ever, which often (due to a lack of job opportunities and world-class facilities) finds its way into American and Western laboratories and corporations. On the other hand, it leaves American students and scientists ill-prepared to compete in a globalized economy based on information rather than raw production. (See Tom’s talk about global flattening at MIT here.) China will surpass the United States in patent filings by scientists by 2020, and is set to overtake the US in published research output even faster: within two years. Disturbingly, US teens ranked 25th out of 34 countries in math and science in the most recent world rankings, prompting President Obama to direct $250 million towards math and science education. How that education is conveyed in classrooms is a subject of ardent debate.
Clearly, science education in its current incarnation is not working. Unorthodox curricula have been proposed by numerous academic institutions, and even implemented with success in some countries. Furthermore, the idea of iconoclasts and self-taught geniuses, left alone to ferment their creativity, is not new. Albert Einstein famously clashed with authorities in primary school (which he barely finished), noting that “the spirit of creativity and learning were lost in strict rote learning.” In 2009, self-taught college dropout Erik Anderson proposed a major new theory on the structure of spiral galaxies and published it in one of the world’s most prestigious journals. (See ScriptPhD.com’s excellent post on whether creativity can really be measured in the lab.) Enter the Google World Science Fair. Capitalizing on the web and social media savvy of the current generation, Google aims not only to expand on traditional, well-known science competitions like Intel and Siemens, but to catapult them into the modern Internet era. Concomitantly, and even more importantly, as the fair’s organizers relayed over the weekend to the New York Times, they wish to improve science and math education in America by incorporating a brand that many kids already know and use with ease. Why not infuse the excitement of a Google search into the staid, antiquated methodologies afflicting much of math and science curricula today? The impacts of science and independent experimentation are wide-reaching and powerful. During a gathering of scientists, students and judges on the day of the science fair announcement at Google headquarters, self-taught African scientist William Kamkwamba shared how, from a library book, he built a windmill that powered his large family’s house and brought water to his impoverished village, then taught other villagers to build windmills and, by proxy, improved schools and living conditions.
Who knows how many of this year’s global entrants will make such sizable contributions to their communities, or even, as they’re encouraged to do, solve global-scale afflictions?
Beyond the originality factor, the Google competition is important in several ways. It’s virtual and literally open to anyone in the world, so long as they are a student between the ages of 13 and 18, thereby negating the most obvious roadblock to participation in many science competitions: location and affordability. (Though studies argue that internet access is still an overwhelming factor in economic and social equality, a not insignificant hurdle for aspiring third world participants.) Secondly, the competition is being judged on a passion for science and ideas, especially those relevant to the world today. In an age when we’re trying to ameliorate diseases, epidemics, the effects of global warming and violently changing weather patterns, urban sprawl and overpopulation, along with an ever-frustrating lack of access to water, food and sanitation among the poor, a few extra ideas and approaches can’t hurt. After all, a 15-year-old Louis Braille invented a system of reading for the blind, an 18-year-old Alexander Graham Bell sketched rough ideas for what would become the telephone, a 14-year-old Philo T. Farnsworth conceived the television, and the modern microscope that many entrants will likely use in their experiments was pioneered by Anton van Leeuwenhoek! (See more here.)
In the same spirit of hip novelty and digital cleverness that they’ve infused into the age-old science fair, Google hired the team from Los Angeles-based Synn Labs, the same team behind the viral OK Go music video, to create a thirty-second Rube Goldberg-themed video promoting the science fair. It is, perhaps, the highlight of the competition itself! Take a look:
The submission deadline for the 2011 online global science fair is today, April 4, 2011. All information about submission, judging, prizes, and blogs about entries can be found on the Google Global Science Fair homepage. The site also offers resources for teachers and educators looking for ways to bring the essence of Google’s science fair into their classrooms. You can also track all projects, as well as interact and exchange ideas with other science buffs, on the fair’s Facebook fan page and Twitter page.
ScriptPhD.com encourages all of our readers, clients, and fans who either submitted entries by the deadline, had their kids enter, or know someone who entered the competition to come back and tell us about the experience on our Facebook page. We’d love to hear about it! We wholeheartedly support programs that promote science and innovation, especially applicable to mitigating global social and technological obstacles. Our consulting company mantra is that great creative enterprises are fueled by great ideas. So, too, are science and technology. As such, we applaud Google for reinventing (and virtualizing) science outreach to encourage ideas and transform an entire generation of scientists, regardless of location, education or perceived ability. And if you’re bummed that you missed out on this year’s competition, think of it this way: you have plenty of time to prepare for 2012!
This post was sponsored by Unruly Media.
~*ScriptPhD*~
You can always tell you’ve gone too far when you reach the wind farms. They populate the barren wastes of California’s northern interior, rows of them spinning atop camel-haired hills starved of moisture to slake the thirst of the Los Angeles glitterati. These pinwheels are an ironic green afterthought to the ecological disaster that embraces the Interstate 5 freeway: now that we’ve created the dust bowl, we may as well use the wind to power our air filters. There’s more than wind and dust out here. This is where they put the kinds of facilities the government doesn’t want people snooping around in. Lawrence Livermore National Laboratory is one of them, a secretive development center for our nation’s nuclear arsenal during the Cold War. Here in Livermore, the world’s finest physicists are on the verge of a breakthrough that could power entire cities on a bathtub full of water. The National Ignition Facility, also known as the world’s largest laser, is on the cusp of achieving the first break-even nuclear fusion reaction. NIF is the U.S. Department of Energy’s Sagrada Familia. If successful, the four-billion-dollar facility will be the first ever to demonstrate Ignition: a fusion reaction that releases more energy than was put into it. The energy, national security, economic and environmental ramifications for the United States, if not the world, would be staggering. ScriptPhD.com’s Stephen Compson gained ultra-exclusive access to the normally reclusive facility, including tours, interviews, and a peek at the lasers that could hold the key to the United States’ global rebirth. With nuclear fusion on the brink of break-even, Stephen recounts his tour of the world’s next scientific revolution.
The Photon Valley
Livermore is not so much a city as it is a byproduct of the lab at its center. This small suburban community lies halfway between San Francisco’s Bay Area and the Interstate 5 that allows truckers a north-south passage between the “Two Californias.” They call it The Photon Valley, the world capital of laser technology and all things light-related. Moth-like, high-tech subcontractors orbit the facility’s perimeter. Charon Sue Wollard is Livermore’s second Poet Laureate; her poem Steller Gest hints at the secrets locked away in the cathedral of optics.
I check into a room that smells like a Pine-Sol explosion and cruise the two main drags looking for a meal. A pretty girl takes my order at a pizza place, and I realize for the first time what a physics town this truly is. Her eye shadow has been cross-sectioned into bandwidths, showing five different colors on each eyelid. It is a cosmetic display of the visible light spectrum. She smiles and asks if I’d like a beer.
Fusion is the process by which two atomic nuclei, driven together by an overwhelming force, merge into one heavier nucleus. It is the reverse of fission, the splitting process that gave us the atomic bomb and the conventional nuclear reactor. An atomic act of coupling, fusion powers our sun and every other light in the night sky. At the National Ignition Facility here in Livermore, scientists and engineers stand at the threshold of tapping into that process to create a limitless supply of clean energy. When the stars burn out, we’ll make our own.
The Birth of a Star
The car is completely covered in bird crap. This isn’t just one or two droppings caked onto a dirty hood, but a carpet-bombing by a flock of irritably-boweled miscreants. Lynda, the public affairs officer assigned to chaperoning me through the facility, smiles apologetically and gestures toward the culprits in the dense trees that loom over the parking lot.
Like a quiet family home in mid-Los Angeles, the Livermore Lab hides from prying eyes behind a protective wall of foliage that makes it look more like a nature preserve than a nuclear weapons facility. But behind the trees they have plenty of barbed wire and guards armed with assault rifles. I finger the clearance pass with my photo on it and thank her for the souvenir. She looks puzzled: “You won’t be allowed to keep it.”
Bruno Van Wonterghem is exactly the sort of Germanic super-scientist you’d imagine running operations at the world’s largest laser facility. His accent lends an easy, everyday quality to words like laser, optics, and neodymium. He has a propensity to trail off into a mumble that could fill an entire page, as though he’s been talking about the laser’s attributes merely for his own benefit. Bruno has been working on laser systems at the lab for almost two decades. Every time I ask a question, a strange light comes into his eyes, as though he’s realizing for the first time that there are people in this world who don’t know about the National Ignition Facility.
From the outside, the building doesn’t quite look real. It’s glossy, like one of those photos a developer might put up in front of a vacant lot to convince people that something could be built there. I’ve been trying to get into this facility for over a year, and to my disappointment it looks exactly like the pictures. Most of the facility’s iconic equipment is on display in the front lobby, and the walls throughout are lined with posters explaining how everything works. The whole thing screams field trip, and I’m starting to wonder if there is actually any science that takes place here. Then they show me the laser bays.
Every article you will ever read about the National Ignition Facility measures the size of its laser bays in football fields (four). It’s difficult to get a feel for the scale from a picture: this is only one of two laser bays, and what you don’t realize looking down on it is that the whole system is suspended a story above the ground, putting us three stories up. They do that because replacement modules have to be loaded from underneath in pre-assembled clean rooms, so that any stray particles fall out of the system instead of settling into it.
After giving me a minute to ogle, Bruno clears his throat. “The Facility is essentially an energy concentrator in time and space. It takes about sixty seconds to charge up the capacitor bank with six megawatts, but the laser releases that energy in a very short amount of time: billionths of a second. Those six megawatts are concentrated by over fifteen orders of magnitude, more power than the entire United States electrical grid.” The cool thing about lasers is that they have an unlimited threshold for delivering power. Photons, the packets that physicists use to quantify light, occupy no space, so you can pack an unlimited number of them into as focused a path as your lenses allow.
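Bruno's "energy concentrator in time" is just the relation power = energy / time. The sketch below works through the arithmetic in Python using round illustrative numbers (my own assumptions for this example, not NIF's official specifications) to show how a modest charging draw becomes a pulse that briefly out-powers the national grid:

```python
# Back-of-the-envelope: power = energy / time.
# All figures below are illustrative round numbers, not official NIF specs.
charge_time_s = 60.0      # capacitor bank charges over about a minute
pulse_time_s = 4e-9       # laser fires in a few billionths of a second
stored_energy_j = 400e6   # assume ~4 x 10^8 joules stored in the capacitor bank
laser_energy_j = 2e6      # assume ~2 megajoules of laser light reach the target

charge_power_w = stored_energy_j / charge_time_s  # steady draw while charging
pulse_power_w = laser_energy_j / pulse_time_s     # instantaneous power of the shot

us_grid_power_w = 5e11    # rough average U.S. electrical generation, ~500 GW

print(f"charging draw : {charge_power_w:.1e} W")   # a few megawatts
print(f"pulse power   : {pulse_power_w:.1e} W")    # hundreds of terawatts
print(f"pulse / grid  : {pulse_power_w / us_grid_power_w:.0f}x")
```

The pulse itself doesn't carry much total energy (a couple of megajoules is less than the chemical energy in a cup of gasoline); it's the nanosecond release window that produces the staggering wattage.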
The charge process concentrates the lasers through time, and the lenses concentrate them through space. Each of the tubes acts like a telescope, converging NIF’s 192 beams onto a single capsule the size of a vitamin, gaining another nine orders of magnitude in the process to create temperatures hotter than the inside of the sun.
At the molecular level, heat causes vibration. The hydrogen nuclei in the fuel capsule are both positively charged, so they electrostatically repel one another. However, the supercharged laser creates a perfectly symmetrical layer of plasma around the target. There’s nowhere for the nuclei to run, and as they heat up and accelerate to over a million miles per hour, the two cores inevitably fuse into a single helium nucleus, releasing some of their mass as energy carried off by a neutron.
Each laser bay contains 96 beams, one in each steel tube. Thick black high-voltage cables snake along the outside delivering power to the amplifiers. The tubes are filled with argon gas because air reacts with lasers and impedes their progress. Aside from the amplifiers, the lasers also pass through pink slabs of neodymium that add juice to the discharge. The shot bounces through the entire array four times before it enters the switchyard into the target bay.
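The mass-to-energy conversion described above can be checked with a one-line calculation from E = mc². Here is a sketch in Python using standard reference atomic masses for the deuterium-tritium reaction (D + T → helium-4 + neutron); the result, roughly 17.6 MeV per fusion event, is the textbook value:

```python
# D + T -> He-4 + n : the energy released comes from the mass defect (E = mc^2).
# Masses in unified atomic mass units (u), standard reference values:
m_deuterium = 2.014102
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

u_to_mev = 931.494  # energy equivalent of 1 u of mass, in MeV

# Mass that "disappears" in the reaction is released as kinetic energy,
# most of it carried away by the neutron.
mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * u_to_mev
print(f"mass converted: {mass_defect:.6f} u -> {energy_mev:.1f} MeV per fusion")
```

Less than half a percent of the reactants' mass is converted, but multiplied over the trillions of reactions in even a tiny fuel pellet, that fraction is what makes "a bathtub full of water" such a potent fuel source.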
This is the world’s largest optical system. There are over 70,000 large optical controllers and 30,000 smaller optics. As we look over it all from the third floor, Bruno sniffs, “If you could look through all these tubes and strip away all this steel, you would find a sea of optical elements. It would be beautiful. Right now, it just looks like tubes.”
Between the Conception and the Creation
Every shot starts in the Oscillator Room, which is a somewhat disappointing set of server-like cabinets containing the three oscillators that send out the initial pulse to the preamplifiers. This starting pulse is only a billionth of a joule, or 1/160th the kinetic energy of a flying mosquito. A fairly humble beginning for a star’s birth. They use the three oscillators in tandem to fine-tune the timing of the pulse down to a few trillionths of a second. The oscillators run constantly, but only one of these pulses will enter the amplifiers to begin powering up for a shot. A large red counter keeps track of all pulses that pass without notice through the fiber-optics, unable to reach their full potential in the adjacent laser bays.
In 1957, John Nuckolls began investigating peaceful applications of nuclear weapons technology. He proposed a novel scheme: the implosion of a Deuterium-Tritium (hydrogen isotopes) fuel capsule inside a tiny hohlraum, driven by an external energy source. This is the same scheme NIF uses today, but when Nuckolls proposed it the laser had not yet been invented, so he considered other exotic power sources like particle accelerators, plasma guns, and hypervelocity pellet guns, which sound suspiciously like glorified BB guns.
The reaction that takes place at NIF today is almost identical to the one Nuckolls designed in the 1960s, but the physicists needed a laser system capable of generating a pulse hotter than the core of our sun to achieve ignition. First they built the SHIVA laser, named for the many-armed Hindu deity its cluster of beam arms resembled. SHIVA was followed by NOVA, NIF’s predecessor and the first laser system Bruno worked on at the facility. In late 2009, almost fifty years after the idea was conceived, NIF successfully demonstrated the temperatures necessary for energy-gain ignition. “It was so much bigger than any of us ever imagined,” Bruno reflected. “When we bring in the original scientists to look at the target bay, they sort of look around in awe and say, ‘We can’t believe this is what we asked for.’”
The target chamber is a 10-meter diameter aluminum sphere with laser tubes sprouting from its surface. From the outside, it looks like one of those alien objects from Contact or Sphere. It spans three stories of the facility, so you can only ever look at it from above or below. We stand on the third floor watching a crew of workers install some newly hardened diagnostic equipment. They’ve spent the past six months preparing the target chamber to absorb the massive amount of energy released by the fusion reaction. The entire chamber is now surrounded by a concrete and boron barrier two meters thick. In the LIFE facility, a commercial power plant designed to actually generate electricity, the neutrons will be absorbed by a mantle of liquid salts that transfer their kinetic energy to heat energy which drives a conventional steam generator. However, because NIF is an experimental facility where they actually need to observe the reaction, its physicists were presented with the challenge of preparing cameras that could withstand the bombardment, since neutrons lay waste to electronics.
The density of the imploding target is so high that normal x-rays can’t penetrate the implosion’s surface, so they concentrate four x-ray beams to over a petawatt, which is a quadrillion watts (yes, a real number). The reaction itself is only a tenth the width of a human hair, and it boggles even the scientists who work there that they have an x-ray powerful enough to observe it. They call this camera Dante, “because it looks into the mouth of hell.” Despite the incredibly powerful forces at work, Inertial Confinement Fusion is safer than any of the power-generating technologies that have preceded it. The key thing to remember is that the reaction is an implosion, not an explosion. If anything goes wrong, the whole thing collapses on itself and nothing happens. The worst case is that all the fuel gets consumed and we’re out one more capsule. There’s no possibility of a catastrophic meltdown like Three Mile Island or Chernobyl. Lynda deals with this line of questioning all the time: “People hear that NIF generates temperatures hotter than the sun and that it creates a miniature star, and they wonder how the whole facility doesn’t melt down. But we’re talking about an implosion, at an incredibly small scale, for only a few billionths of a second. It’s no different than a supernova, the physics are all the same, but the scale’s a bit smaller.”
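The petawatt figure sounds fantastical, but it falls straight out of power = energy / time once the timescales shrink to billionths of a second. A quick back-of-envelope sketch; the numbers here are illustrative assumptions, not published NIF specifications:

```python
# Power is just energy divided by time; at nanosecond and picosecond
# timescales, modest energies become astronomical powers.
# All figures are illustrative, not official NIF numbers.

def average_power_watts(energy_joules: float, duration_seconds: float) -> float:
    """Average power delivered when energy_joules arrives in duration_seconds."""
    return energy_joules / duration_seconds

# A single kilojoule compressed into one picosecond is a petawatt (1e15 W):
print(f"{average_power_watts(1e3, 1e-12):.2e} W")   # 1.00e+15 W

# A megajoule-class pulse spread over a few nanoseconds is still
# hundreds of terawatts:
print(f"{average_power_watts(1.8e6, 4e-9):.2e} W")  # 4.50e+14 W
```

The point is simply that no steady-state power plant is involved: concentrating energy in time, not generating more of it, is what produces these headline numbers.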
Hohlraum is the German word for “hollow space.” It is most often analogized to a pill capsule, in reference to its size and shape. Its role is to act as an x-ray oven, containing the plasma generated around the target pellet and serving as a mold for the fusion reaction’s symmetry. The hohlraum holds the fuel pellet made up of the hydrogen isotopes tritium and deuterium. There’s no danger of scarcity with these two isotopes: tritium is derived from the relatively common lithium, and deuterium comes from plain old water. Another advantage of fusion is that no waste is created at all. Everything is consumed by the reaction, leaving behind no toxic radiation or weaponizable elements. In fact, physicists can use facilities like NIF to dispose of the nuclear waste from the previous generation of fission reactors, rendering the arguments about Yucca Mountain, or blasting the waste into space, completely moot.
The shot director begins the countdown, and alarms sound throughout the facility. Daylight hours at the facility mostly consist of construction and maintenance work; they fire the lasers at night. There are only twenty people in the facility during a shot, and they’re all here in the control room, but they sound the alarms just in case. The countdown is four minutes, which seems like an incredibly long time. Nearly all of the process is controlled by computers, a necessity given the minute level of control required to achieve symmetry. The control room looks a lot like NASA’s Mission Control. They like to hire nuclear submarine captains as shot directors because of the rigor of the operational requirements. “We can train them with the technical knowledge, but having that operations experience from a submarine, where things are going on all around you, is essential. There’s a lot of action during a shot, but each one can take up to twelve hours. It’s like loading a new missile every day.”
As we reach the final few seconds of the countdown, I look around nervously. The overhead lights flash, and that’s it. No sound effects, no shaking. I have to ask if it worked. The only noise comes from the physicists next door scrambling to be the first to retrieve the data. Lynda leans over, “When you take the amount of shots fired in an entire year, where each one only takes a fraction of a second, it’s almost like this thing is never really on.”
Countdown to Ignition
The National Ignition Campaign coincides with football season, but the stakes are a little higher. Ignition is the validation of fusion as a viable energy source, the point at which the critics are silenced and the rest of the world scrambles to duplicate the feat. “It’s going to be incredible, standing room only. There’s people lining up around the world that want to be here for that event. Every milestone along the way has been a major event; when we went from two to four lasers it was a major event. Now we have 192. And it’s all leading up to the ignition.” The question they must get tired of hearing is: when? “September, October. There’s a few [target] options we have: a plastic capsule, a beryllium capsule, a diamond capsule.” I ask Bruno which one he thinks will do the trick. It’s important not to underestimate the technological perfection required to achieve the symmetry necessary for ignition. Every aspect of the shot must be analyzed and optimized to an order of precision never before achieved. With all that said, he doesn’t see any reason why the plastic capsule won’t get the job done.
I have to press: “The plastic capsule, that’s going to be the one?” Bruno replies with a tremor in his voice. Like the girl who’s fallen for too many bad boys, he’s been hurt before: “I believe that will be the one, yes.”
Are you ready for some Physics?
The above title sounds a lot more enticing if you sing it to the Monday Night Football theme song. There’s no good reason why I should be allowed into a nuclear weapons lab. I’m nobody, another starving bookworm with a taste for Faulkner and single-barrel scotch. All I can surmise is that so far, no one has managed to pull this four-billion-dollar sword from its stone and shove it somewhere that will make the American public pay attention. I’m living proof that they’re desperate. No one I know has even heard of the National Ignition Facility, which, quite sadly, includes the Editor of ScriptPhD.com who sent me there. I’ve spoken with professional engineers and physicists who don’t recognize the name. There’s certainly been plenty of media coverage: the BBC, the Discovery Channel, Wired, Time. People’s ears tend to perk up when they hear something might kill them, but the general public hasn’t had a survival-based reason to pay attention to physics since the end of the Cold War. We know that unruly nations getting access to nuclear weapons is a bad thing, but beyond that, a nuclear warhead is implicitly bad enough that there’s been no reason to keep following the field’s progress.
Fusion offers us the solution to a problem most people don’t even know we have. The immediacy of it isn’t so clear, but in his excellent BBC Horizon documentary Can We Make a Star?, Professor Brian Cox paints a grim picture: it’s not possible to give the rest of the world access to even half the electricity the average American uses without exhausting every existing means of generating it, and laying waste to the environment in the process.
We have to figure out how to generate electricity for the world’s growing population without, as Bruno likes to say, choking ourselves. The actual numbers are sobering, but it’s a subject people don’t like to read about because in the past there’s been no clear solution. Now here we are, being handed one on a silver platter by a group of individuals no one has heard of, who have been working on it for the last fifty years. Which only makes the endeavor that much more noble.
The Holy Grail
When I began this trip, I had no idea that I would be making a pilgrimage. Bruno and his co-workers are like stonemasons laying the foundation for a church they will never live to see completed. It will be at least twenty years before the first commercial fusion reactors come online (skeptics put the time frame around 2050). John Nuckolls, the man who came up with Inertial Confinement Fusion, is in his nineties. Five thousand people have spent their lives building this facility up to this point. They are the clergy of the modern era, humanity’s most educated class, working selflessly to create a better world. “People want to be here to work on a mission. The goal is really very abstract to many of them, but they’re all motivated by being able to contribute to an event where they can make history. There’s a grand challenge to it, a vast significance in being a part of that.”
Gone are the days when the average American held down a job for the same company his or her entire working life, but many at the National Ignition Facility join the team when they finish their doctorates from the world’s finest universities and work there until they retire. There are fathers who have spent their entire careers working on fusion only to see their son or daughter take up the cause. Remarked Bruno: “We have to think long term. You need technology that can carry you beyond fifty and a hundred years into the future. When you think about your great grandchildren, this is the only solution, the only way that we can survive with the quality of life that we’ve become accustomed to. It is our holy grail.”
The Neutron Age
The achievement of energy-gain fusion is one that should fundamentally alter human existence. As technology advances, our quality of life will remain intimately connected with access to cheap, renewable energy. There are existential stakes as well. Fusion is the engine that drives our universe. By achieving Ignition, we begin our mastery of Mother Nature’s own energy source, a force so fundamental that cultures around the world worshiped its daily appearance at dawn. “Every fifty to a hundred years you reach a point where you make a quantum jump in technology. We went from coal in the industrial age to the atomic age in the fifties and now we will reach the neutron age. We can finally see an opening into the energy problem. Within fifty years you could provide a significant fraction of your basic power production from fusion. You can give this to developing countries and bring them into a new century. Suddenly someone with no refrigerator, no microwave will realize what it means to live in the modern age. We have a quality of life here in America that simply cannot be provided around the world with the resources we have. Fusion will make that possible.”
Clean power is only the beginning. Like the technologies that emerged from the Pandora’s box of quantum mechanics, fusion will undoubtedly open up a whole new wave of technological advancement. Every aspect of the National Ignition Facility, from the optics that carry the lasers to the supercomputers that process the data, has pushed the limits of its field. Bruno gives me a sly look. “We probably have no idea yet what applications will come from having a neutron source like this.” And thanks to NIF’s tireless believers, we enter that neutron age this fall.
Stephen Compson studied English and Physics at Pomona College. He writes fiction and screenplays and is currently working toward a Master of Fine Arts at UCLA’s School of Theater, Film & Television.
~*Stephen Compson*~
************************
ScriptPhD.com covers science and technology in entertainment, media and advertising. Hire our consulting company for creative content development.
Subscribe to free email notifications of new posts on our home page.
Each of the brain’s 100 billion neurons has somewhere in the realm of 7,000 connections to other neurons, creating a tangled roadmap of about 700 trillion possible turns. But thinking of the brain as roads makes it sound very fixed: you know, pavement, and rebar, and steel girders and all. The opposite is true: at work in our brains are never-sleeping teams of Fraggles and Doozers who rip apart the roads, build new ones, and are constantly retooling the brain’s intersections. This study of Fraggles and Doozers is the booming field of neuroplasticity: how the basic architecture of the brain changes over time. Scientist, neuro-math geek, Science Channel personality and accomplished author Garth Sundem writes for ScriptPhD.com about the phenomenon of brain training and memory.
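The roadmap arithmetic above checks out; as a quick sketch using the article’s own round numbers:

```python
# Rough connection count from the figures quoted above.
neurons = 100e9                  # ~100 billion neurons
connections_per_neuron = 7_000   # ~7,000 connections each
total_connections = neurons * connections_per_neuron
print(f"{total_connections:.0e}")  # 7e+14, i.e. about 700 trillion
```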
Certainly the brain is plastic: the gray matter you wake up with is not the stuff you take to sleep at night. But what changes the brain? How do the Fraggles know what to rip apart, and how do the Doozers know what to build? Part of the answer lies in a simple idea: neurons that fire together, wire together. This is an integral part of the process we call learning. When you have a thought or perform a task, a car leaves point A in your brain and travels to point B. The first time you do something, the route from A to B might be circuitous and the car might take wrong turns, but the more the car travels the same route, the more efficient the pathway becomes. Your brain learns to pass this information through its neural net more efficiently.
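The “fire together, wire together” idea is known in neuroscience as Hebbian learning, and the well-traveled-route analogy can be caricatured in a few lines of code. This is purely an illustrative toy with made-up numbers, not a model of real synapses:

```python
# Toy Hebbian update: each time the pre- and post-synaptic neurons are
# active together, the connection between them strengthens a little.
def hebbian_strengthen(weight, pre, post, rate=0.1, trials=50):
    for _ in range(trials):
        weight += rate * pre * post  # dw = rate * (pre activity) * (post activity)
    return weight

# A route traveled 50 times ends up far stronger than an untraveled one:
print(round(hebbian_strengthen(0.0, pre=1.0, post=1.0), 6))  # 5.0
print(round(hebbian_strengthen(0.0, pre=0.0, post=1.0), 6))  # 0.0 (never co-active)
```

The untraveled route never strengthens because the update is a product of both activities: no co-firing, no wiring.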
A simple example of this “firing together is wiring together” is seen in the infant hippocampus. The hippocampus packages memories for storage deeper in the brain: an experience goes in and a bundle comes out. I think of it like the pegboard at the Seattle Science Center: you drop a bouncy ball in the top and it ricochets down through the matrix of pegs until exiting a slot at the bottom. In the hippocampus, it’s a defined path: you drop an experience in slot number 5,678,284 and it comes out exit number 1,274,986. How does the hippocampus possibly know which entrance leads to which exit? It wires itself by trial and error (oversimplification alert, but you get the point). Infants constantly fire test balls through the matrix, and the ones that reach a worthwhile endpoint reinforce worthwhile pathways. These neurons fire together, wire together, and eventually the hippocampus becomes efficient. It’s just that easy. (And because it’s so easy, researchers aren’t far from creating an artificial hippocampus.)
Now let’s think about Sudoku. The first time you discover which missing numbers go in which empty boxes, you do so inefficiently. But over time, you get better at it. You learn tricks. You start to see patterns. You develop a workflow. And practice creates efficiency in your brain as neurons create the connections necessary for the quick processing of Sudoku. This is true of any puzzle: your plastic brain changes its basic architecture to allow you to complete subsequent puzzles more efficiently. Okay, that’s great and all, but studies are finding that the vast majority of brain-training attempts don’t generalize to overall intelligence. In other words, by doing Sudoku, you only get better at Sudoku. This might gain you street cred in certain circles, but it doesn’t necessarily make you smarter. Unfortunately, the same is true of puzzle regimens: you get better at the puzzles, but you don’t necessarily get smarter in a general way.
That said, one type of puzzle offers some hope: the crossword. In fact, researchers at Wake Forest University suggest that crossword puzzles strengthen the brain (even in later years) the same way that lifting weights can increase muscle strength. Still, it remains true that doing the crossword only reinforces the mechanism needed to do the crossword. But the crossword uses a very specific mechanism: it forces you to pull a range of facts from deep within your brain into your working memory. This is a nice thing to get better at. Think about it: there are few tasks that don’t require some sort of recall, be it of facts or experiences. And so training a nimble working memory through crosswords seems a more promising regimen than any other single type of brain-training exercise.
This is borne out by research. A Columbia University study published in 2008 found that training working memory increased overall fluid intelligence. So the answer to this article’s title question is yes, brain training is very real. (Only, there’s a lot of schlock out there.) But hidden in this article lies the new key that many researchers hope will point the way to the brain training of the future. Any ONE brain-training regimen only makes you better at the one thing being trained. But NEW EXPERIENCES in general promise a varied and continual rewiring of the brain for a fluid and ever-changing development of intelligence. In other words, if you stay in your comfort zone, the comfort zone decays around you. In order to build intelligence, or even to keep what you have, you need to be building new rooms outside your comfort zone. If you consume a new media source in the morning, experiment with a new route to work, eat a new food for lunch, talk to a new person, or try a NEW puzzle, you’re forcing your brain to rewire itself to be able to deal with these new experiences: you’re growing new neurons and forcing your old ones to scramble to create new connections.
Here’s what that means for your brain-training regimen: doing a puzzle is little more than busywork; it’s the act of figuring out how to do it that makes you smarter. Sit down and read the directions. If you understand them immediately and know how you should go about solving a puzzle, put it down and look for something else, something new. It’s not just use it or lose it. It’s use it in a novel way or lose it. Try it. Your brain will thank you for it.
Garth Sundem works at the intersection of math, science, and humor, with a growing list of bestselling books including the recently released Brain Candy: Science, Puzzles, Paradoxes, Logic and Illogic to Nourish Your Neurons, which he packed with tasty tidbits of fun, new experiences in hopes of making readers just a little bit smarter without boring them into stupidity. He is a frequent on-screen contributor to The Science Channel and has written for magazines including Wired, Seed, and Esquire. You can visit him online or follow his Twitter feed.
Scientists are becoming more interested in trying to pinpoint precisely what’s going on inside our brains while we’re engaged in creative thinking. Which brain chemicals play a role? Which areas of the brain are firing? Is the magic of creativity linked to one specific brain structure? The answers are not entirely clear. But thanks to brain scan technology, some interesting discoveries are emerging. ScriptPhD.com was founded on, and remains focused on, the creative applications of science and technology in entertainment, media and advertising, fields traditionally defined by ‘right brain’ propensity. It stands to reason, then, that we would be fascinated by the very technology and science that is attempting to deduce and quantify what, exactly, makes for creativity. To help us in this endeavor, we are pleased to welcome computer scientist and writer Ravi Singh’s guest post to ScriptPhD.com. For his complete article, please click “continue reading.”
Before you can measure something, you must be able to clearly define what it is. It’s not easy to find consensus among scientists on the definition of creativity. But then, it’s not easy to find consensus among artists, either, about what’s creative and what’s not. Psychologists have traditionally defined creativity as “the ability to combine novelty and usefulness in a particular social context.” But newer models argue that these types of definitions, which rely on extremely subjective criteria like ‘novelty’ and ‘usefulness,’ are too vague. John Kounios, a psychologist at Drexel University who studies the neural basis of insight, defines creativity as “the ability to restructure one’s understanding of a situation in a non-obvious way.” His research shows that creativity is not a singular concept. Rather, it’s a collection of different processes that emerge from different areas of the brain.
In attempting to measure creativity, scientists have had a tendency to correlate creativity with intelligence, or at least link the two, probably because we believe that we have a handle on intelligence. We believe we can measure it with some degree of accuracy and reliability. But not creativity. No consensus measure for creativity exists. Creativity is too complex to be measured through tidy, discrete questions. There is no standardized test. There is yet to be a meaningful “Creativity Quotient.” In fact, creativity defies standardization. In the creative realm, one could argue, there’s no place for “standards.” After all, doesn’t the very notion of standardization contradict what creativity is all about?
To test creativity, researchers have historically attempted to test divergent thinking, an assessment construct originally developed in the 1950s by psychologist J. P. Guilford, who believed that standardized IQ tests favored convergent thinkers (who stay focused on solving a core problem) over divergent thinkers (who go ‘off on tangents’). Guilford believed that scores on IQ tests should not be taken as a unidimensional measure of intelligence. He observed that creative people often score lower on standard IQ tests because their approach to solving the problems generates a larger number of possible solutions, some of which are thoroughly original, possibilities the tests’ designers would never have thought of. Testing divergent thinking, he believed, allowed for greater appreciation of the diversity of human thinking and abilities. A test of divergent thinking might ask the subject to come up with new and useful functions for a familiar object, such as a brick or a pencil. Or the subject might be asked to draw the taste of chocolate. You can see how it would be very difficult, if not impossible, to standardize a “correct” answer.
Eastern traditions have their own ideas about creativity and where it comes from. In Japan, where students and factory workers are stereotyped as being too methodical, researchers are studying schoolchildren for a possible correlation between playfulness and creativity. The Nath philosopher Mahendranath wrote that man’s “memory became buried under the artificial superstructure of civilization and its artificial concepts,” his way of saying that too much convergent thinking can inhibit creativity. Sanskrit authors described the spontaneous and divergent mental experience of sahaja meditation, where new insights occur after allowing the mind to rest and return to its natural, unconditioned state. But while modern scientific research on meditation is good at measuring physiological and behavioral changes, the “creative” part is much more elusive.
Some Western scientists suggest that creativity is largely a matter of neurochemistry. High intelligence and skill proficiency have traditionally been associated with fast, efficient firing of neurons. But the research of Dr. Rex Jung, a research professor in the department of neurosurgery at the University of New Mexico, shows that this is not necessarily true. In researching the neurology of the creative process, Jung has found that subjects who tested high in “creativity” had thinner white matter and connecting axons in their brains, which has the effect of slowing nerve traffic. Jung believes that this slowdown in the left frontal cortex, a brain region where emotion and cognition are integrated, may allow us to be more creative and to connect disparate ideas in novel ways. Jung has found that when it comes to intellectual pursuits, the brain is “an efficient superhighway” that gets you from Point A to Point B quickly. But creativity follows a slower, more meandering path with lots of little detours, side roads and rabbit trails. Sometimes, it is along those rabbit trails that our most revolutionary ideas emerge.
You just have to be willing to venture off the main highway.
We’ve all had aha! moments: those sudden bursts of insight that solve a vexing problem, solder an important connection, or reinterpret a situation. We know what one is, but often we’d be hard-pressed to explain where it came from or how it originated. Dr. Kounios, along with Northwestern University psychologist Mark Beeman, has extensively studied the “Aha! moment.” They presented study participants with simple word puzzles that could be solved either through quick, methodical analysis or through instant creative insight. Participants were given three words, then asked to come up with one word that could be combined with each of the three to form a familiar term; for example: crab, pine and sauce. (Answer: “apple.”) Or eye, gown and basket. (Answer: “ball.”)
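The puzzle format described above (three cue words, one linking word) is easy to mimic with a brute-force search, which makes a nice contrast with the instantaneous “Aha!” route. The tiny word set below is a hypothetical stand-in for a real compound-word dictionary:

```python
# Brute-force solver for the three-cue word puzzle described above.
# COMPOUNDS is a toy stand-in for a real dictionary of familiar terms.
COMPOUNDS = {"crabapple", "pineapple", "applesauce",
             "eyeball", "ballgown", "basketball"}

def solve(cues, candidates):
    """Return every candidate that forms a known compound with all the cues."""
    return [word for word in candidates
            if all(cue + word in COMPOUNDS or word + cue in COMPOUNDS
                   for cue in cues)]

print(solve(["crab", "pine", "sauce"], ["apple", "ball"]))  # ['apple']
print(solve(["eye", "gown", "basket"], ["apple", "ball"]))  # ['ball']
```

The methodical solvers in the study are, in effect, running this enumeration consciously; the insight group seems to arrive at the answer without it.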
About half the participants arrived at solutions by methodically thinking through possibilities; for the other half, the answer popped into their minds suddenly. During the “Aha! moment,” neuroimaging showed a burst of high-frequency activity in the participants’ right temporal lobe, regardless of whether the answer popped into the subjects’ minds instantly or they solved the problem methodically. But there was a big difference in how each group mentally prepared for the test question. The methodical problem solvers prepared by paying close attention to the screen before the words appeared: their visual cortices were on high alert. By contrast, those who received a sudden Aha! flash of creative insight prepared by automatically shutting down activity in the visual cortex for an instant, the neurological equivalent of closing their eyes to block out distractions so that they could concentrate better. These creative thinkers, Kounios said, were “cutting out other sensory input and boosting the signal-to-noise ratio to enable themselves to retrieve the answer from the subconscious.”
Creativity, in the end, is about letting the mind roam freely, giving it permission to ignore conventional solutions and explore uncharted waters. Accomplishing that requires an ability, and a willingness, to inhibit habitual responses and take risks. Dr. Kenneth M. Heilman, a neurologist at the University of Florida, believes that this capacity to let go may involve a dampening of norepinephrine, a neurotransmitter that triggers the fight-or-flight alarm. Since norepinephrine also plays a role in long-term memory retrieval, its reduction during creative thought may help the brain temporarily suppress what it already knows, paving the way for new ideas and the discovery of novel connections. This neurochemical mechanism may explain why creative ideas and Aha! moments often occur when we are at our most peaceful, for example while relaxing or meditating.
The creative mind, by definition, is always open to new possibilities, and often fashions new ideas from seemingly irrelevant information. Psychologists at the University of Toronto and Harvard University believe they have discovered a biological basis for this behavior. They found that the brains of creative people may be more receptive to incoming stimuli from the environment that the brains of others would shut out through the process of “latent inhibition,” our unconscious capacity to ignore stimuli that experience tells us are irrelevant to our needs. In other words, creative people are more likely to have low levels of latent inhibition. The average person becomes aware of such stimuli, classifies them and forgets about them. But the creative person maintains connections to that extra data constantly streaming in from the environment and uses it.
Sometimes, just one tiny strand of information is all it takes to trigger a life-changing “Aha!” moment.
Ravi Singh is a California-based IT professional with a Masters in Computer Science (MCS) from the University of Illinois. He works on corporate information systems and is pursuing a career in writing.