

2026-01-07

Who should you trust when fixing risks from ASI?


To have a 30 min in-person meeting

If you think there is a non-trivial probability the other person will physically injure you, sexually harass you, or similar, don't take the meeting.

You can bring someone with you for security (free or paid). You can meet online. You can maybe rely on prison security (assuming the other person is in prison).

I don't have firsthand experience with this, and it seems like a rare edge case. But it is still worth reminding people, IMO, that unless you think the other person might literally assault you, there is no danger in having a quick meeting. The realistic worst case is usually that they waste your time, that's it.

People have bodily reactions that tell them not to trust others; it is fundamental biological wiring with an evolutionary history. I am saying that unless you think this person will assault you, maybe you should ignore the message your wiring/gut/intuition is giving you, if the only decision at stake is whether or not to chat with them.

You can still be cautious about sharing your secrets with them. Taking a chat doesn't mean anything more than taking a chat.

Maybe you are particularly influential and are worried about the PR consequences of taking a private chat. (Example: everyone who ever took a meeting with Epstein is now considered suspicious by the public.) I don't think this matters all that much. It is very easy to explain to the public a reasonable reason why you took a meeting with Epstein or Hitler or whoever it was you met with, assuming you did in fact have a reasonable reason for taking the meeting.

To vote for them. Or to like/share/subscribe. Or to sign their petition. Or to endorse them to have more political power in the US govt.

Unless you personally plan to become a billionaire or major politician, you have to endorse people other than yourself to acquire power (political power, IMO).

There is actually a significant probability that almost every person in the ASI risk space right now who is trying to acquire power would do something useless or even actively harmful if they were suddenly given a billion dollars or a Congress/Senate seat.

Assume Alice is running for US office on a strongly anti-AI message (as I think she should). She will become a direct enemy of the US intelligence community, the Big Tech / AI company lobbies, multiple other pro-capitalist lobbies, and so on. These enemies will then use a variety of methods (persuasion, going after funding, going after allies, physical intimidation, etc.) to get Alice to dilute both her actual actions in the world and her public messaging. It is well known that what Alice claims in public she will do, what she privately wishes she could do, and what she actually ultimately does are three different things.

There is a lot of pressure on Alice to not do the right thing (in terms of what will actually fix risks from ASI).

In spite of all this, I strongly encourage you to endorse some political candidates ASAP: vote for them, sign their petitions, share their YouTube channels, and so on.

Every month you delay is another month that they fail to acquire political power because you didn't endorse them, which is another month closer to either human extinction or permanent dictatorship.

And like I said, if not you, and if not them, then who? Trump? Are you serious? (Trump, if you are reading this, there is still hope. You can still change your mind about how to deal with ASI risk.)

Politics has always been about voting for the lesser evil, and it still is that way today. The most useless voters are the people who vote None Of The Above.

Your endorsement can always carry disclaimers that you specify (on social media or whichever forum you use).

IMO the best option is to run for office yourself; the second best option is to endorse a politician (or YouTuber or similar) who has at least some chance of doing the right thing when they get more power; and the absolute worst option is to endorse no one, i.e. to wait for the current set of politicians, who clearly have no interest in doing the right thing, to suddenly change their minds. Pick one of these options and execute ASAP, because every month of delay is very costly.

Don't endorse someone if they are proposing something that would be illegal in your country, such that even endorsing them is enough to get you put in prison. But also, if you have some money, you have the freedom to pick which country you live in. Maybe you should go live in a country that will protect you. I think it is straightforwardly bad that the US govt is about to greatly accelerate towards ASI, and that so many people don't dare even utter the names of people in the US govt because they also happen to live in the US. How can you fight someone if you don't even dare utter their name in public? And remember, every govt is willing to twist the letter of the law at least a little in order to get its enemies into prison.

To be your cofounder

I don't have insights to share on this right now.

To be in your physical spaces, like conference venues or protest venues or so on

I don't have insights to share on this right now. Maybe later.

General heuristics that always apply

Figure out if this person values acquiring knowledge more than acquiring power, or acquiring power more than acquiring knowledge.

"Figure out" just means look at their past track record, ask them questions, look at costly sacrifices they have made in the past, and so on.

Examples of costly sacrifices that indicate valuing knowledge more than valuing power

If a person values knowledge more than power, you can predict their behaviour accordingly. They will screw you over if doing so gets them a lot more knowledge (or access to spaces and networks with knowledge). This is rare, but it happens.

If a person values power more than knowledge, you can predict their behaviour accordingly. They will screw you over if doing so gets them a lot more power (or access to spaces and networks with power). This is extremely common.

Some people care about long-term things such as reputation and self-esteem (I do, or at least claim to). But we are also barreling head first towards the literal end of the world. A clean reputation and high self-esteem will be worth less and less as we get closer to the end, and we will see more and more people do the thing that is short-term good but long-term bad, because they believe there is no long term (and maybe there actually is no long term).

A lot of people in the ASI risk space sure as shit don't value relationships or community (as much as they value either knowledge or power), and this is fine IMO. People who value relationships don't usually alter world history, and we need to alter world history here. Most politicians who claim to value relationships are just pretending in order to fool their audience IMO, since the audience contains lots of people who value relationships. Most successful politicians are first and foremost loyal to acquiring power for themselves.

In short, don't trust anyone absolutely, at least in your professional life. But also, endorse lots of people to acquire power as soon as possible, because the cost of endorsing no one is even higher. Inaction will cost you.

P.S. I am not currently endorsing anyone because at least one person has explicitly asked me not to talk about them. This is because I have proposed some plans to fix ASI risk that are illegal in the US. If they want to be endorsed, I would love to endorse them.

Useless people

If you are not planning to become a billionaire or major politician, and the person whose trustworthiness you are undecided about is also not planning to become a billionaire or major politician, my gut reaction is that there's a high probability you are both useless, so who cares what either of you does? Almost all the real trust issues only come in once actual power is involved. If you're busy having a catfight with someone over $1M or $10M or some equally small amount, in my mind there's a high probability you're wasting time that could have been directed towards actually fixing the problem of ASI risk.

There are probably edge cases here and I might take this back later. This is just my gut reaction rather than my actual well thought out opinion.
