
Frankenstein Project is starting conversations about scientific responsibility

With great (scientific) power comes great (scientific) responsibility


"ASU’s School for the Future of Innovation in Society is using the story of Frankenstein to bring about conversations about scientific responsibility." Illustration published on Wednesday, Oct. 25, 2017.


During First Friday on Oct. 6 in downtown Phoenix, ASU’s School for the Future of Innovation in Society had a "Frankenstein"-themed booth to promote Future of X, the Frankenstein Bicentennial Project and other programs.

The story of Frankenstein's monster may be fiction, but it is being used by the Frankenstein Bicentennial Project to start a conversation about scientific innovation and the responsibilities of scientists. 

Future of X hosts a series of conversations with guest speakers from various disciplines on topics including artificial intelligence, human consciousness and biology.

“The Future of X series was designed to start conversations about what the future might look like and how new thinking and technology is going to affect it,” Andrew Maynard, a professor at ASU's School for the Future of Innovation in Society, said. 

Future of X once focused solely on guest speakers. Now, however, it is starting to take a more interactive and engaging approach by reaching out at First Fridays.

Mary Shelley, the author of "Frankenstein," published the novel in 1818. She was inspired by the Industrial Revolution, and the story reflected the fears of many in that era. Though it was published 200 years ago, its themes of humans losing control of technology are still relevant today.

“Projects like the Frankenstein Project allow people outside of the scientific community and academia to have an understanding about science and the processes behind science,” Diana Bowman, associate dean for international engagement at ASU’s Sandra Day O’Connor College of Law, said. 

“Anything that makes science accessible to people and gives them the tools and insight to question science can only be a good thing." 

The story of "Frankenstein" is one of the most well-known works of fiction, making it an accessible way to introduce the idea of scientific responsibility.

“People know the story of 'Frankenstein,'" Bowman said. "It’s a very easy narrative for them to connect to real-world science. It allows people to see that there is a very human dimension to science."   

Easy comparisons can be drawn between Dr. Frankenstein and revered scientists: researchers who pushed innovation forward and whose creations, in one way or another, grew beyond their control.

“I don't want to stereotype and say 'all scientists,'" Bowman said. "However, many traditionally see themselves as very independent and isolated from what may happen downstream."

She also said that talking to scientists about the potential negative and positive impacts of their work can help promote progress. 

“What I’d like to see is scientists thinking about what are the big problems that society needs help solving and how their work can be used to address those big challenges,” Bowman said. 

Scientific innovations like artificial intelligence, synthetic biology and neuroscience have the potential to benefit humankind. However, they also have the potential to cause harm. Fear of the negative potential of these innovations is reflected in modern-day literature and other media, just as "Frankenstein" reflected the fears people had in the 1800s.

“I think we have tremendous potential from a lot of these new and emerging technologies, but it's precisely because of their new and surprising properties that we don't know how they’ll interact with other natural and social systems,” Erik Fisher, a professor at ASU’s School for the Future of Innovation in Society, said.

Fisher said people like Alfred Nobel, who invented dynamite, are similar to Dr. Frankenstein in their motivation to do things that have never been done before.

“They happen to be working on projects that have some pretty obvious unintended consequences," Fisher said. "It becomes a question of being reflexive and being aware that your work can be used for things that go beyond you."  

Fisher is working on a project called Socio-Technical Integration Research, which is aimed at integrating philosophers and sociologists into scientific laboratories to get scientists to reflect on their work. Fisher hopes that the project will lead to interesting conversations. 

Innovative technology, such as artificial intelligence, is a hot topic in the scientific community. 

“Artificial intelligence is a great example," Maynard said. "There’s a lot of talk about whether we’re going to create 'Skynet' or super intelligence that decides humans aren't worth it and kills us all."

Maynard said that one thing to consider is the impact innovations like artificial intelligence will have on society. 

“Another thing to consider is our responsibility to society," Maynard said. "With something like artificial intelligence, how do we create a powerful technology that will benefit humanity without causing harm?”

Future of X used the Frankenstein Bicentennial Project as a way to initiate conversations with the public about scientific innovations and the responsibility to gauge their impact.

“One thing we know for certain is that if we allow scientists and engineers to do whatever they like without thinking about the consequences, you increase the chances of something going wrong,” Maynard said.

Future of X will have booths with varying themes to start conversations about the future of science at First Fridays for the rest of the year.   



Reach the reporter at jicazare@asu.edu or follow @sonic_429 on Twitter. 

Like State Press on Facebook and follow @statepress on Twitter.

