Recognizing Technology’s Potential For Destruction
By Christopher Schrader | A month ago, I was on a 1950s refurbished diesel boat cruising in the Arctic Circle under the midnight sun. I was sharing a whiskey with Jonathan Nolan, creator of the hit HBO show Westworld. The series has a cult-like following, and in what may be my famous last words as a columnist, Westworld is far better than Game of Thrones.
Westworld is set in a theme park where the emergence of consciousness in its android hosts threatens the safety of park-goers and the goals of park management. Nolan, who is known for his colorful language, shared with me his take on the big ideas the show seeks to explore: “It’s not about the AI (Artificial Intelligence) we create, but what AI the AI creates. That’s fascinating. For the first time in our history, we’ll have no f– idea what the f– is going to happen.”
Generally speaking, when you create something, you can understand and control it. But if that creation invents an intelligence of its own, you lose control over its outcomes. Silicon Valley leaders ask what intelligence built by intelligence will look like when it never forgets, is powered by the collective knowledge of our species (i.e., the internet), and has the capacity to learn it all in seconds.
I’m reminded of the opening chapter of the Hugo Award-winning science fiction novel A Fire Upon the Deep: space-faring archaeologists accidentally awaken an ancient superintelligence, and the chapter follows the five seconds after its awakening, as it wipes out the settlement and plans the destruction of the galaxy. The chapter highlights that human concepts of time and cognition mean something entirely different to machines that can make, in seconds, the calculations and decisions that would take a human months.
While Hollywood directors and science fiction authors worry about what AIs will create, the rest of us should also pay attention to the unintended and destructive consequences of our technological creations today. At Harvard, one of my professors often reminded us that ‘the road to hell is paved with good intentions.’ Even now, we witness many adverse secondary effects happening around us.
Researchers at MIT recently showed that fake political news spreads three times faster than other news and is 60% more likely to be retweeted. In countries with gun regulation, ‘recipes’ for 3D printed, single-use weapons can spread across the internet and be downloaded by anyone in seconds. A study by the Royal Society for Public Health (a survey of roughly 1,500 young people) found that 70% of 14- to 24-year-olds reported that Instagram exacerbated feelings of anxiety and caused cyberbullying.
Technology is unique among industries in that a promising invention can propagate across the entire world and be used by millions within minutes. When inventors aren’t concerned enough with the unintended effects of their inventions, products can rapidly devolve into unmanageable conduits that spread harm and destruction.
Westworld’s most famous line is “these violent delights have violent ends” – a phrase repeated throughout the series by the main characters. It originates in Romeo and Juliet, where Friar Laurence urges Romeo to practice patience and moderation, cautioning him that an explosive and swift love would end in tragedy.
Like Romeo and Juliet’s romance, we are in love with technology and all the exciting possibilities it brings us. But even without AI, we are beginning to feel its destructive effects permeate society. We must be patient, responsible, and careful in our creations to avoid unintended and destructive tragedy.
About the Author
Christopher Schrader is General Manager of Surkus, an application that connects brands to their ideal audiences through experiences and events. Chris is an accomplished adventurer, having walked across the Gobi Desert, lived with Kazakh nomads, and cycled across Canada in pursuit of the speed record. He is the founder of the global charity the 24 Hour Race. Chris is one of the youngest-ever Fellows of the Royal Geographical Society, a Kairos Society fellow, and a founding member of Future Talks.