2025-10-25
How to persuade average person about extinction risk and totalitarian risk from artificial superintelligence
Disclaimer
- Quick Notes
 
- Everything on this page is a Quick Note unless specified otherwise
 
- Important: I have not solved this problem yet. This page is a collection of messy notes and untested hypotheses.
 
Scroll to bottom for newer notes, which are more likely to be improvements over older ones.
2025-10-12
Movement strategy
- I think we need a core coalition of youtubers, journalists and politicians who can build influence in the US govt to counter the influence that Big Tech and the natsec establishment hold there by default.
- Different youtubers will target different demographics and that is fine. We can build a loose coalition between people who otherwise disagree on lots of matters. For instance a youtuber targeting leftist young urban Americans and a youtuber targeting christian rural middle-aged and old Americans can both feed into the same political campaign against ASI, even though they disagree with each other on many other matters.
 
- I think it is important for most youtubers, journalists and politicians to actually be talking about superintelligence though, not the environment or job loss or whatnot. If environmental or job-loss concerns get addressed tomorrow, an audience recruited on those grounds might flip its position, and it is important that they don't flip later.
 
- I think all of the above can be done with integrity and honesty. I can ally with the christian youtuber and the leftist youtuber mentioned above without having to lie about my own position. I'm neither christian nor leftist for example.
 
 
- We also need a vast number of people spreading panic and fear.
- Spreading emotions is way more important than spreading technical arguments, most people look at their social circle and their preferred thought leaders when deciding how to feel about things.
 
- Only a few nerds (like me) care about technical arguments and even their biases are often very socially driven.
 
- Many of these people may not do anything productive besides spreading emotions via like/share/subscribe. This is fine. A few people will be motivated to take the big next step and actually devote some years of their life fulltime to growing the movement. This is good.
 
- Yes, spreading panic increases human suffering. It might even increase the number of suicides. I support providing mental health resources to such people. I am fine with all this happening.
 
 
2025-10-20
How to trigger fear and panic
(Contains untested hypotheses)
How to trigger fear and panic in general population, such that they organise mass protest against the creation of superintelligence
- If AI capabilities continue to grow on the same trend, then by 2027 I expect:
- The US natsec establishment will be convinced that AI is as important as nukes, even if they're not ASI-pilled, and will publicly declare that "we" are in a Cold War. This will be helpful for triggering fear in the general public.
 
- A lot more people in the general public will actually try the latest AI models and hence be convinced that AI is as important as nukes, even if they're not ASI-pilled.
 
 
- I think US natsec circles are a machinery with not much agency. Just convincing a few people about ASI or xrisk is not enough to change the direction of the entire machinery.
- This is also why I am not optimistic about just persuading people in natsec to stop building ASI.
 
- You need the woke protest machinery in the US to organise mass protest and counter the natsec machinery. It is important to work with people who already have the machinery required to organise large-scale protests; there is no time to build it from scratch.
 
- I expect mass protest will only have a medium influence on the electoral incentives of US congress and senate. Previous mass protests in the US have had only a medium-sized influence at best - be it against nuclear arms race, or Vietnam war, or Syria war and Iraq WMDs, or now Palestine and Ukraine wars.
 
 
- I think we need to use fear of death to trigger fight-or-flight response in people.
- War footage is especially important, to trigger people to protest against war. I think some protestors of Ukraine and Palestine wars have seen war footage and this motivated them.
 
- Talking about death, visiting graveyards, etc seems important.
 
- A lot of people have organised a lot of their psyche to cope with their fear of death. Actually breaking people out of this and reminding them they could die soon can trigger them into fight-or-flight.
 
 
- I think we need to use fear of outgroup and hatred of outgroup to trigger fight-or-flight response in people.
- Every well-organised political group in the US seems united in their hatred of another group. Christians hate atheists. Rural people hate urban people. Communists hate the tech industry. A lot of worker unions hate anyone who can take their job. Tech founders hate people who talk about ethics. Tech academia hates the tech industry too.
 
- With the right rhetoric, Silicon Valley and Big Tech in particular can easily become the outgroup for most of the groups in the US.
 
- See also: I can tolerate anybody except the outgroup, by Scott Alexander
 
- Each political group needs to be triggered into fight-or-flight by a youtuber thought leader who actually believes in the ideals of that group. People are very bad at finding truth but they're very good at detecting authenticity in other people. If you don't actually believe in the ideals of a group, and you are not exceptionally skilled, don't try to become a thought leader for that group. Ally with the existing thought leader instead, and ask them to talk about ASI.
 
- Fear of the outgroup comes from the outgroup having more power than you.
 
- "The world is complex and its problems are complex" is an explanation almost nobody wants. (And even people who do, probably do it more to signal intellectualism than to seek truth.) "My outgroup is responsible for all my problems" is an explanation everyone likes to hear.
 
 
- The truth takes too long to transmit. Do not appeal to truth alone.
- Most people are trapped in groups where only a few people are truth-seekers and most people will mindlessly copy-paste their family and friends. Triggering a belief cascade across an entire group takes too long, and humanity might be extinct before that happens.
 
- Reinforcing their existing false beliefs is way faster than convincing them of a truth, and this is how many youtubers become popular.
 
- I think just sharing the truth is powerful (more than a lot of people think), but the timelines are so short, this may not be enough to convince most of the public.
 
- Most people focus on the emotional content of a message more than the factual content.
 
- This also applies to groups that contain a lot of truth-seekers, but to a weaker degree. Truth is more useful when convincing such people. Such people also disproportionately occupy some (but not all) of the centres of power in society.
 
 
Three levels of stupid
- Convince people to change their ideology and community, and join a new ideology and community
- To grow this to a large scale will take 30 years at minimum, and centuries more realistically, as people need to leave their existing family and friends, which is painful.
 
- Don't do this for AI, we don't have 30 years left.
 
 
- Don't challenge people on their core ideology and community, but convince them using truth that ASI may be coming soon and can cause extinction.
- This will probably take a few years after we have clear empirical evidence we are on track to ASI. We may or may not have this much time.
 
- Find thought leaders in their own community and ideology who can talk about ASI and xrisk.
 
- They will probably find incorrect ways of interpreting these facts, but at least they will know them.
 
- For some audiences this is the right way to go.
 
 
- Don't challenge people on their core ideology and community, and don't try to convince them about ASI using truth or facts. Instead reinforce whatever nonsense they already believe, and just retarget their existing fear and hate onto Big Tech / US govt / AI labs / Silicon Valley / atheists / urban people etc as an outgroup.
- This can be done immediately.
 
- For a lot of audiences this is probably what is required. Most people are ignorant of the world and proud of their ignorance. You can't use truth to appeal to someone who does not form a lot of their opinions that way in the first place.
 
- Find thought leaders as ignorant as them and ask them to spread hate and fear against Big Tech / US govt / AI labs / Silicon Valley / etc.
 
 
2025-10-25
How to convince youtubers, journalists, politicians to talk about superintelligence
- Figure out how to persuade youtubers, journalists, politicians targeting various different demographics to talk about risks from superintelligence.
- Appeal to existing incentives
- Compile viewcount stats, to show it is their self-interest to talk about it?
 
- Compile rhetorical arguments (for example using outgroup hatred) to make it easier for them to inflame the emotions of their audience, and hence get views
 
 
- Convince them of the truth
- Figure out if they're technical, and whether they do any thinking of their own or just run a mindless copy-paste algorithm like their audience.
 
- If they use a copy-paste algo, figure out who they take seriously and go convince those people to talk about the truth about ASI instead. If you do this recursively (for instance by analysing retweets and repeated phrases on twitter), you can map out the entire social deference graph.
 
 
- Change their ideology
- No time for this, see the "why no religion" post
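The recursive deference-mapping idea above can be sketched as a small graph traversal. This is a minimal illustration, not a real scraper: the pair data and all names are hypothetical, and in practice the edges would come from analysing retweets and repeated phrases as described.

```python
from collections import defaultdict

def build_deference_graph(retweets):
    """retweets: iterable of (follower, leader) pairs, meaning
    `follower` amplifies `leader`. Returns an adjacency dict
    mapping each follower to the set of leaders they defer to."""
    graph = defaultdict(set)
    for follower, leader in retweets:
        graph[follower].add(leader)
    return graph

def root_influencers(graph):
    """People who are deferred to but who defer to nobody themselves --
    the natural end points of a recursive 'who do they take seriously?'
    search, and hence the people to persuade first."""
    leaders = {leader for leaders in graph.values() for leader in leaders}
    followers = set(graph)
    return leaders - followers

# Hypothetical data: alice and bob amplify carol; carol amplifies dan.
sample = [("alice", "carol"), ("bob", "carol"), ("carol", "dan")]
graph = build_deference_graph(sample)
print(root_influencers(graph))  # dan sits at the root of this chain
```

In this toy example, convincing dan would (per the note above) propagate down the chain to carol and then to alice and bob.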
 
 
 
2025-10-28
Why spread fear?
- Trying to answer this question: "How do I create the mass hysteria that followed the Hiroshima bombing, multiple years before the first bomb drops?"
- Almost everyone is stuck in a bad plan and a social circle to reinforce said plan. This includes most people currently working on AI capabilities, most people working on AI risk, and most people not working or aware of either.
 
- My current (untested) hypothesis is that making these people see mass hysteria in other people is the best way to shake them awake.
 
- There may be no time to trigger the belief cascades in social reality (via their preferred youtubers, religious leaders, whoever) so entire groups change their mind, and also changing their mind is not enough to get them to leave their careers and do mass protests or similar.
 
 
How to spread mass hysteria
(to do)
- Target audience ???
- Target a mentally vulnerable demographic if you want to increase likelihood of creating hysteria.
 
- Target society's deference nodes (researchers then youtubers and politicians) if you want the hysteria to spread further once you've made one person hysteric.
 
 
- Appeal to fear of death
- Use graphic violent imagery, such as war footage. This seems important for getting people to protest the Israel-Palestine or Russia-Ukraine wars, for example.
 
- Show conflict using automated militaries
 
- Maybe even show astronomical suffering threats.
 
- Maybe show a permanent dictatorship or AI takeover.
 
 
- Use video (maybe AI-generated) to make it easier to visualise these abstract scenarios.
- If making a video, it definitely needs to end with a section explicitly highlighting previous protestors and calling for a mass movement. This is what will make Altman more unhappy.
 
 
2025-10-27
How to decide between career plans
- If you're undecided between two career plans, pick the one that will cause Sam Altman and allies more suffering.
- People like Altman are extremely focussed on what is most important to their goals. They will only suffer when you're threatening what is most important to them. They will tend to ignore threats to things that are not that important to them.
 
- The more retaliation you see (legal notices, social ostracism, financial offers aka bribes), the more you know they are suffering as a result of your actions.
 
 