Panel Discussion on Singularity

At the ICML Deep Learning Workshop 2015, six scientists from different institutions briefly discuss their views on the possibilities and perils of the technological singularity: the moment when humans create artificial intelligence so far advanced that it surpasses us (and perhaps decides to eradicate us).

Over the years, the singularity has been one of the most popular bets on what will cause the apocalypse (assuming one ever happens).

Jürgen Schmidhuber (Swiss AI Lab IDSIA), Neil Lawrence (University of Sheffield), Kevin Murphy (Google), Yoshua Bengio (University of Montreal), Yann LeCun (Facebook, New York University), and Demis Hassabis (Google DeepMind) share their views on the possibility of the singularity.


Will We Be Gods or Slaves?

With the speed of advancements in robotics and AI, science fiction is quickly becoming science fact. Recent discussions include worries that AI is already starting to take over some of our jobs, and that, in the future, there may be no roles left for humans at all.

Prominent figures such as Stephen Hawking, Bill Gates, and Elon Musk have publicly voiced their concerns about advancements in artificial intelligence. Hawking warns that we would become obsolete: “It [AI] would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

On the other hand, some people argue that if AI ever exceeds our capabilities, it may end up treating us like gods, insisting that these robots will be our allies rather than our enemies. Some have even begun to form a religion around these ideas, believing that god is technology, an ideology some refer to as the "rapture of the nerds."

For now, we can only guess how things will go. But one thing is for sure: like any technology, AI is as susceptible to misuse as it is beneficial to mankind.

