Prominent Transhumanist Sounds Alarm on Artificial General Intelligence: ‘We Must Stop Everything’
by Jacob Rosenberg | All Israel News
At last week’s SXSW conference, prominent transhumanist Eliezer Yudkowsky said that if the development of artificial general intelligence is not stopped immediately across the globe, humanity may be destroyed.
“We must stop everything,” Yudkowsky said during a panel titled “How to Make AGI (Artificial General Intelligence) Not Kill Everyone.”
“We are not ready,” he continued. “We do not have the technological capability to design a superintelligent AI that is polite, obedient and aligned with human intentions – and we are nowhere close to achieving that.”
Yudkowsky, founder of the Machine Intelligence Research Institute, has issued similar warnings in recent years, repeatedly arguing that humanity must cease all work on AGI or face extinction.
In a 2023 article in Time magazine, Yudkowsky argued that no current project had a feasible plan to align AGI with the interests of humanity.
“We are not ready,” Yudkowsky wrote. “We are not on track to be significantly readier in the foreseeable future. If we go ahead on this everyone will die, including children who did not choose this and did not do anything wrong.”