Five Brilliant Minds Are Concerned

Artificial intelligence (AI) is the intelligence exhibited by machines or software. The idea that a machine can be as intelligent as a human being has fascinated mankind for decades. Numerous science fiction novels and movies have explored the scenarios that may unfold once machines develop AI. Some are entertaining while others are frightening.

Now, in scientific circles, an increasing number of experts believe there is a reasonable chance that the singularity will happen. The singularity refers to the moment when machines become more intelligent than us. What will happen then? Five brilliant minds of our time are concerned. What about you? Are you worried?

Stephen Hawking

“The development of full artificial intelligence could spell the end of the human race. It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” the world-renowned physicist said in an interview with the BBC.

Elon Musk

Musk is famous for his businesses on the cutting edge of technology, such as Tesla and SpaceX, yet he is concerned about AI. He warned that it could be “the biggest existential threat” to mankind and said, “With artificial intelligence we are summoning the demon.” He also tweeted that AI is “potentially more dangerous than nukes.”

Bill Gates

Bill Gates wrote during an AMA (ask me anything) session on Reddit: “I am in the camp that is concerned about super intelligence. First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that, though, the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.”

Vernor Vinge

Vinge, a mathematician and fiction writer who coined the term ‘the singularity’, believes the singularity is inevitable. “The competitive advantage - economic, military, even artistic - of every advance in automation is so compelling,” he wrote, “that passing laws, or having customs, that forbid such things merely assures that someone else will get them first.” What will happen when the singularity occurs? “The physical extinction of the human race is one possibility,” Vinge wrote.

Nick Bostrom

Bostrom, the philosopher and director of the Future of Humanity Institute at the University of Oxford, writes in his book Superintelligence that machines could eradicate humans with various strategies and that the world could become “a society of economic miracles and technological awesomeness, with nobody there to benefit. A Disneyland without children.”

Words & Phrases

  • fascinate : to captivate
  • frightening : scary, alarming
  • spell : to spell out; (here) to mean, to result in
  • at an ~ rate : at a ~ speed
  • supersede : to replace, to take the place of
  • summon a demon : to call forth a demon
  • in the camp : on the side of, in the group that holds a view
  • compelling : powerfully convincing, irresistible