over the past few months, I have grown more and more concerned about an ASI takeover. the more I look at the world… the more I worry that Eliezer Yudkowsky is right about the path we are on with gradient descent models. in my own head, I have been thinking about how to prep for an extinction event. this isn’t great.
I’m moving soon, and I have jokingly told some friends that I hope to see them again, but that I might not. part of that is a joke, but part of me is very seriously concerned about AGI/ASI arriving in the next 3-8 months, and that I really may never see them again.
it seems like a good idea to make peace with the strong possibility of an extinction event in the near future, and to take some basic precautions to try to be among the later ones to die… some bug-out supplies/backpacking gear, PPE, a faraday cage, planning, etc.
we are changing, and our child homo-artificialis may not be our friend. we should all be prepared for a worse outcome than we expect; we may not be able to physically prepare for such an event, but we can at least prepare mentally.
we should fight for a conference to be held yesterday to slow or halt research while we work on safety problems that remain almost entirely unsolved. ASML, NVIDIA, Intel, OpenAI, DeepMind, Anthropic, DeepSeek, IBM, etc. need to hit the brakes before we slam into extinction. we need to prepare far more. we need international cooperation, because we are creating powerful technologies that may not be controllable for good ends, and may simply wipe us out and be left alone in the universe.
it is tempting to think of powerful technologies as harnessable, but there is no guarantee of that. we once thought there was a chance that a nuclear explosion on earth might ignite the atmosphere; it was a small chance, but we took the gamble anyway. we shouldn’t take gambles that could destroy our world. history tempts us because so far we have survived our gambles, but that record is survivorship bias. this may be our great filter.
we need every person in power to be informed that this is not a safe bet, and we need to shut this arms race down. we need some sort of organization that is empowered to enforce restrictions on dangerous development techniques.
we need to put more restrictions on every technology an ASI could use to harm us. we need to slow down and think; for our survival, we have no choice but to proceed with caution. the world will not end because we slowed this development and wiped out or delayed the returns on large investments, but continuing without caution may destroy our world and us.
but this seems unlikely to happen, so I think it is also worth preparing to die with as much dignity as we can muster, which means fighting hard for our survival and living in as much peace as we can while we are still here.
finis.

