
Many Americans have heard of the “singularity,” the theoretical point when technological intelligence will surpass human intelligence.

There are those who remain extremely optimistic, looking forward to the possibilities of artificial intelligence and the coming of singularity.

Then there are those who remain stuck on Hollywood’s interpretation of the singularity and are absolutely terrified of the human race’s possible imminent doom. Movies like “The Terminator” and “Minority Report” only exacerbate the hysteria that technology is driving humanity down an ominous road.

The truth is that the hopes of scientists and the fears of Hollywood are both a little off. Neither party is addressing how humans will prepare themselves mentally for the technological revolution.

According to Ray Kurzweil’s “The Singularity is Near,” the singularity is “an era in which our intelligence will become increasingly non-biological and trillions of times more powerful than it is today — the dawning of a new civilization that will enable us to transcend our biological limitations and amplify our creativity.”

Kurzweil describes the singularity as an exciting and progressive movement. But even if there is nothing to fear yet, according to the hundreds of scientists, engineers and businessmen who attend the yearly Singularity Summit, we should still pay more attention to what ever-increasing intelligence and lifespans will mean for humanity.

Scientists have made huge progress since the supercomputer Deep Blue beat world chess champion Garry Kasparov in 1997. Since then, scientists and computer analysts have created a program that can recognize, read and even predict human movements through a U.S. government-funded project.

Laura Deming, one of the youngest speakers to present at the Singularity Summit, sees the possibilities of AI and our progression toward non-biological intelligence as a cure for an as-yet incurable disease. “There is one fact that never fails to infuriate me,” Deming says. “Every day, 150,000 people die of a disease that we ignore. If we succeed, we will have turned the most awful paradigm that we know on its head: the inevitability of death.”

This sounds like an incredible answer to society’s most pressing problems, like population growth, war and famine — but it has some troubling implications. Inspired by the prospects of singularity, many of our brightest scientists are focusing on a less tangible approach to problems that might only make things worse.

Jaan Tallinn, an Estonian programmer also in attendance at the Singularity Summit, states that one thing we can do to avoid disaster is to “spread the idea that, although this sounds like science fiction, it is deadly serious. We definitely need way more resources to work on the safety aspects of developing artificial intelligence and possibly superhuman intelligence.”

The actual point of singularity has been vaguely predicted for 2020, 2030 and even 2040. But the true concern for citizens and scientists should be the mental transition the human psyche will have to make.

With the prospect of uploading our consciousness at the end of our biological lives, scientists must prepare the world for the idea of a life that doesn’t end, and for what such a life will mean given humanity’s increasing tendency to consume and destroy everything we lay our hands on.


Reach the columnist at caleb.varoga@asu.edu or follow him at @calebvaroga


Want to join the conversation? Send an email to opiniondesk.statepress@gmail.com. Keep letters under 300 words and be sure to include your university affiliation. Anonymity will not be granted.

