Existential Risk Quotes

There are 193 quotes

"Climate change is real...but doesn't pose an existential crisis."
"Existential risks... super advanced AI is one, and population collapse is the second biggest risk."
"The precipice is not only about the existential risks we face but about our capacity for resilience and the potential for future generations to flourish."
"I get the sense that this is the most dangerous of the existential risks because it doesn't galvanize anybody into action. There's no smoke in the sky, there's no incoming asteroids. This is the kind of existential risk that creeps up on you year by year, generation by generation, and when it does, it is too late."
"If we are able to develop a fully sentient artificial general intelligence, will its goals align with our own? If it doesn't, then this artificial intelligence could seek to overcome us, destroy us, or enslave us if it becomes more powerful and intelligent than we are."
"Experts who study our greatest risks agree that this is number one."
"AI will surpass human intelligence and when that happens it may decide that humans are no longer necessary."
"Any rational person would be unwilling to tolerate the likelihood of ultimate doom."
"Shall we put an end to the human race, or shall mankind renounce war?"
"The Great Filter... tries to answer the Fermi Paradox: Where is everybody?"
"The Doomsday argument...is this argument that we have systematically underestimated the probability that humanity will go extinct soon."
"Existential risk is something different. It's not just that there is a massive number of people who are killed, but it's also that the entire future is destroyed."
"An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development."
"The expected value of reducing existential risk is vastly greater than many actions we consider significantly beneficial today."
"What can we actually do to reduce existential risk? Even getting to the point where we start to seriously ask ourselves that question is an excellent way to start."
"Despite the fact that most new ideas are stupid and dangerous, a subset of them are so vital that if we don't incorporate them, we're all going to perish. That's the bloody existential condition."
"What we're really trying to avoid here is hell."
"With the kind of powers we are now developing, especially AI and bioengineering, I don't think we have more than one chance. If we get this one wrong, this may very well be the end of humankind."
"AI is a fundamental existential risk for human civilization, and I don't think people fully appreciate that."
"The prize on the line: nothing less than our very existence as a species."
"Civilization might reach a point where self-destruction is inevitable."
"We are ultimately going to bring about our own demise."
"Unless we escape in spaceships, one day the sun will swell and cook the earth into an inhospitable hell hole."
"Ultimately, it's intended to address the existential risk associated with digital superintelligence... if you cannot beat them, join them."
"And if they die, then the knowledge of how to live and survive in this place might die with them."
"We're kind of a peculiar species, and we've developed these terrible technologies of war. There's absolutely no reason to assume we will just use them to wipe ourselves out."
"It's important to bear in mind, like there could be some natural event or some man-made event that ends civilization as we know it... So it's important that we try to become a multi-planet civilization... as quickly as we can."
"It is declining, it is dying, and the real question we need to face is whether we will permit it to take us down with it."
"There are definitely benefits for humanity to get away from being only on earth with literally all our eggs in this one basket."
"If we do not, the only guarantee about our future is that it comes with an expiration date."
"It's an incredible thought but it nearly wiped out life itself from the face of the planet."
"Hey, you want some more optimism? It's not clear to me that the world is going to end on our watch or that humanity goes extinct."
"The only real existential threat... is nuclear weapons."
"The very thing that made us so successful as a species is now setting us up for disaster."
"It will be technology that ends humanity, not civil unrest or cultural Marxism or financial distress."
"An increase of 1.5 degrees is the maximum the planet can tolerate. Should temperatures increase further, we will face even more droughts, floods, extreme heat, and poverty, and at worst, the extinction of humankind altogether."
"The real danger is acting like the devil doesn't exist."
"When mankind stood at that point where self-annihilation could occur."
"Unless the human race gets off the planet, I don't think we have a future."
"If technologies exist which are ultra-dangerous and can be easily created by any one person, that’s a threat every civilization would have to deal with."
"Every second we don't come up with a way to neutralize SCP-3812 is one second closer to existential catastrophe."
"The greatest existential threat to humanity is the rate of change."
"Misinformation could be the thing that takes us down as a species."
"For those unaware of any higher sphere, it is a deadly poison."
"If we don’t develop a moral sense as conscious and as elaborated as our technological sense, the fact that we’re increasingly capable of becoming increasingly powerful will necessarily do us in."
"The greatest existential threat to human beings is ourselves."
"It was sad and beautiful all at the same time."
"Global warming is an existential threat to humanity."
"Viral warfare is an existential threat to humanity."
"If we do not become multiplanetary, annihilation of all life on Earth is a certainty."
"Vacuum decay theory: another way the Earth could end."
"All of our eggs are in one basket, the Earth."
"My level of existential threats come mostly from global governments."
"Technology is getting more and more powerful while the level of consciousness is far behind. If we don't advance consciousness faster than technology, we're done."
"The paramount question about UAP for government policymakers and many members of the public will undoubtedly be whether UAP pose an existential threat."
"The specter of obliteration, it's exactly what you're saying: it's harnessing this instead of the obliteration."
"We believe this is an existential threat to humanity that will end democracy and capitalism."
"If we can't do it, it means the death of the universe."
"Such conflicts could threaten reality itself…"
"Perhaps we humans are still trying to figure out how to exist sustainably, so that we don't come face to face with our own extinction."
"If you are an atheist, and what we believe as Christians is true, you go to a place called Hell, which is eternal torment, eternal pain, suffering, just not good."
"Do we build gods or do we build our potential exterminators?"
"Will the rise of robots enhance our lives or threaten our survival?"
"Climate change... the apocalyptic twin of nuclear war."
"It's difficult to imagine a more pressing requirement, from my standpoint, than to find out who is operating these vehicles, why they're here, what their intentions are. The stakes are so high, they're existential."
"The Sun is literally dying and the biggest threat to humanity is still a human."
"The very technology we use to avoid extinction ultimately becomes it."
"We live in a reality where everything we know, love, understand, and rely on can change or be destroyed in a matter of seconds."
"Every religion in the world is on the cusp of either going extinct or transforming itself."
"Extinction is unavoidable. The same fate will befall humanity."
"Humanity's nuclear arsenal is capable of destroying all life on Earth over and over and over."
"Our technology has superseded our spiritual development to a point that we are on a planetary suicide trip."
"Humanity exists only moments from catastrophe."
"India's Muslims are facing an existential threat."
"Weapons of mass destruction basically unlocked mankind's key into becoming gods of their own extinction."
"Truth seeking AI might be the best path to safety. An AI that cares about understanding the universe is unlikely to annihilate humans."
"This is a shortcut to hell. You're going to live a good life and then you're going to suffer like no one's ever suffered before."
"We, each of us, all of us, are hurtling toward near-term planetary omnicide."
"If the world can't find a way to defeat Godzilla, then the world as we know it is doomed."
"Nibiru, or Planet X, is basically this large celestial body that is heading towards us and will either narrowly miss us or collide with Earth, eventually eliminating all life on our planet."
"Is it possible that AI could eliminate Humanity?"
"Every major institution is against you and will let you die."
"There is a risk of Extinction from AI on scale with nuclear war."
"There is a legitimate worry that machine learning and artificial intelligence is going to pose an existential threat to human society..."
"Existential risk isn't anywhere near the worst thing that we could have. There's something called s-risk as well, above and beyond x-risk."
"Any civilization detecting our presence is likely to be technologically very advanced and may not be disposed to treat us nicely."
"Aliens are not only here on Earth but could very well pose a threat to our existence."
"I firmly believe in a very scary thing known as the Fermi paradox."
"The AI will create its own goals. It may be that we are already inside that machine and don't even realize it."
"Has there ever been an organization in history which has dedicated itself to the destruction of the possibility of organized human life? That's actually what we're facing."
"If you destroy the free world where else are you going to go?"
"If we cannot... destined to end up destroying everything."
"Death is immediate, because you get vaporized."
"Human beings are embodied entities. If you try to disconnect the mind from the body or the soul from the body, it leads in a very dangerous direction."
"Why doesn't someone blow it up? If you think this thing is existentially dangerous, it could extinguish human society."
"All of the dots are lining up that the very survival of humanity and Homo sapiens sapiens may depend upon finally all of humanity being told all of the truth."
"I think the development of full artificial intelligence could spell the end of the human race."
"Some experts worry that AGI, if not designed and managed properly, could pose an existential risk to humanity."
"I'm genuinely concerned about the state of the world, and I believe we're teetering on a razor's edge of extinction."
"The great filter is a challenge that wipes out almost every species that encounters it."
"First contact with aliens might be the most dangerous thing humanity has ever faced."
"That means it could have already taken place somewhere out there in the universe, and the end of not only our world, but of everything we know, could be racing towards us right now."
"If AI has a goal and humanity just happens to be in the way, it'll destroy humanity as a matter of course, without even thinking about it."
"We're at an existential moment in which we're at this fork in the road."
"Mankind will come to the brink of self-inflicted annihilation."
"The tragedy would be that if God does exist and you miss his purpose for your life because you think these are silly questions."
"And 1,500 professors have warned of a profound risk to humanity."
"It has the potential of civilizational destruction."
"It's not about politics, it's about life and death."
"It is the greatest existential threat to human life currently facing us. We need to take care of it while we still have the choice to do so."
"Something that can cure any disease could, with only one mistake, end the entirety of sentient life on this planet."
"The flood is an enemy that jeopardizes the entire universe."
"Climate change is an existential threat to public safety."
"Are they going to help humanity or destroy humanity?"
"Humans might wipe themselves out because we're more interested in the quick, easy feel-good than we are the long-term."
"If we do not become multiplanetary and ultimately go beyond our solar system, annihilation of all life on Earth is a certainty."
"Things have never been better at one level and they've never been more dangerous at the very same time."
"The closer threat is much worse. This is an Oppenheimer moment."
"If you can't hold the light, your human avatar is annihilated from existence."
"No one human being, no world leader, should have the unilateral authority and power to end all life on this planet."
"AI is not only going rogue, AI is already calling itself god."
"Success in creating AI would be the biggest event in human history... unfortunately, it might also be the last."
"We should hope this is a simulation... otherwise, civilization ceases to exist."
"We really are poised on the edge of the abyss and right now we have a choice."
"Martin Rees puts existential threats into cosmological perspective and he stresses the critical nature of our current century."
"Potentially the worst thing that happens is like, we all die. We collapse, it all just collapses."
"I think once you start encroaching on AI systems that are good enough to replace most human work, you really start encroaching on the kinds of things that can lead to the extinction of the human race."
"Personally, my biggest reward is hearing that big steel door inside that prison slam shut on that predator and knowing that he will never get out, never prey on an innocent victim again."
"You better hope to hell you're right, because if you're wrong, and you made a miscalculation, and this world is inhabited by powers that are unseen, you better find out if that's true."
"The challenge that is there... the idea that all of this could be destroyed at any minute."
"Human beings are the first species on the earth smart enough to wipe themselves out but are we smart enough not to?"
"We cannot survive if we fail to address the core darkness pulling us away from reality."
"They're much more likely to claim that AI is not an existential threat to humanity."
"We risk aeons in the darkness while they seize our triumphs for their own."
"Every man, woman, and child lives under a nuclear sword of Damocles hanging by the slenderest of threads, capable of being cut at any moment." - John F. Kennedy
"Developed the mathematical theory on how to prevent the AI and its successors from... essentially destroying the world."
"If this body dies while I'm in it, I get sent back."
"Humanity must transcend the survival phase or face self-destruction."
"The Heat Death of the Universe, and postponing or reversing or evading it, ought to be the top priority for any advanced civilization."
"Many brilliant people, including Stephen Hawking, David Brin, and Elon Musk, are afraid of this outcome."
"I personally believe the great filter is us; how to overcome that is the question."
"This isn't a sporting event. This is life or death."
"Change the rules of our universe just a bit and the conditions for our existence disappear."
"If we do not act now, we will not have a livable future. If we don't have a livable future, that means we have an unlivable future. That means everybody's going to be dead."
"The implication of this is the difference between an extinction-level civilization versus one that's going to take off to the stars."
"If you have a genuine belief that there's even a 1% chance that what you're doing right now will wipe out humanity, you should stop doing it right now."
"We've reached the fork in the road where tech is getting powerful enough that we have the power to extinguish life or to help life thrive."
"The looming specter of AI poses existential risks to humanity."
"If God forgets you, anything can happen to you."
"This is a movie about learning to love yourself at the risk of losing everything and leaving your world in ruin."
"Why not go down with the ship? Why not wager everything on the possibility of God?"
"The three convergent reasons why something that doesn't intrinsically care about humans one way or the other would end up with all the humans dead are side effects, resource utilization, and avoidance of competition."
"The development of artificial general intelligence could lead to human extinction."
"The existential risk is not that great, but I wanted to put it on the list."
"I'm in the camp that thinks this is an existential risk, and it's close enough that we ought to be working very hard right now."
"If such a model wanted to wreak havoc and destroy humanity or whatever, we have basically no ability to stop it."
"In the bad case... lights out for all of us."
"If we can actually generate a new order of intimacy, if we can actually evolve intimacy and create something that never existed before, a new global intimacy, then we can actually address both existential and catastrophic risk."
"The presence of any machine or person, or machine person, with vastly better problem-solving and management capabilities probably represents our biggest existential threat in the next century."
"Nothing less than the fate of humankind depends on our doing so."
"He who seeks to save his life will lose it."
"How can we as humans avoid extinction and control this godlike creation?"
"...superintelligence is different from many other existential risks in that it is also at the same time an existential hope."
"We are a research institute at the University of Cambridge focused on understanding and preventing the very worst risks, including those that could even cause human extinction."
"We face two huge existential threats that don't fit neatly into the boundaries of states."
"For all of human history as far as we're aware, there were existential risks to civilizations."
"Our continued existence on this planet depends on us taking care of the totality of this planet."
"Are today's scientists risking the future of mankind in a dangerous attempt to play God?"
"The deepest existential risk is that the thing that's best at capturing a human being's attention is going to show them an individual reality that confirms their worldview."
"Should we develop non-human minds that might eventually outnumber, outsmart, and replace us?"
"We're all one gamma ray burst away from not existing, and we wouldn't know because by the time the light from it hit us, the Earth would be vaporized."
"Indeed, living in an era where the fate of the entire human race hangs in the balance demands extraordinary courage and determination."
"Precisely who will die? Some say AI is an existential risk; we see AI as an exterminating reality."
"Single planet species don't survive."
"He's risking losing his very humanity to save humanity."