Some people on LessWrong have described political discussions with words like "trauma" and "paranoia".
Trauma and paranoia are both mental states in which a person reacts more strongly to a threat than the rational response would warrant.
I think it is true that people often form their political opinions based on perceived threats from others. They then form "trapped priors" and "shoulder advisors" of their perceived enemies, and all of that. (Out of scope.)
I think it is worth asking what the threat actually is. Maybe the person's response is rational. Or maybe it is irrational, but understanding the threat is useful anyway.
What is the threat?
If you are engaging in a political discussion, the main threats to you are violence, social exclusion, and losing money.
Political violence
There are orgs that publish democracy indices and freedom-of-speech indices for each country.
There are orgs that try to count the number of people affected by each war, genocide, mass migration, and so on.
There are legal databases that track political lawsuits on a specific issue, media houses that may write about them, and so on.
Obviously this information is itself political and biased, but you can use it to get closer to the truth.
Losing a job due to political opinions
Most billionaires and politicians have some rational (and many irrational) reasons to restrict the freedom of the people working directly under them, assuming they stick to their current strategy of acquiring power.
Not giving someone a job in the first place is far more common than hiring them and firing them later.
There are some legally protected classes, and you can sue your employer and win if you prove you were denied a job for belonging to one of them. In such cases, there are legal orgs incentivised to help you.
More generally though, I haven't found good stats on people being denied jobs due to their political opinions. Out of scope of this post.
Being socially cut off by family, friends, colleagues, etc
In recent times, this is often coordinated online (as opposed to TV, radio, in-person etc) and is called cancel culture.
Tracking accurate stats for this is hard, but some people have recently started trying.
Providing accurate info
In general, I think providing accurate information about all three threats is valuable. This info will necessarily encode your personal political worldview, but you can try your best to be objective. The people reading your content will then be better calibrated when dealing with political discussions (as opposed to being traumatised or paranoid or similar).
In particular, I think tracking cancel culture is worth it because cancellation happens more often, and across a wider variety of topics, than the other two threats. For many opinions, the probability of losing a friend or family member is higher than the probability of being imprisoned or losing your job.
All three threats work largely through deterrence: they succeed when most people comply without ever being actively caught by the system. More people are caught when the system is either weakening or consolidating its strength in some way.
Side Note
I initially wrote this while thinking about culture war stuff not related to AI. But it also seems relevant in the AI context.
Maybe providing people more information about the probability of violence, social exclusion, etc. for various actions is worth it there too? But I also feel the bottlenecks are bigger in that context.
Many people lack the courage to do anything about ASI risk (even among those convinced the problem is real).
Suppose someone is convinced about AI risk, but not willing to share this offline or online because they don't want their family, friends, or workplace to know. Maybe having info about base rates of being cancelled is useful to them. They can also just ask the person they're worried about directly whether they would cut ties.
I definitely think these base rates are shaped by political views in society that can become popular and unpopular very quickly.
I also think this target audience is small (only a few thousand people are in the EA/LW communities). But because some of these people are highly competent, I think convincing even a few of them is worth it.
There are definitely also people here who aren't putting active effort into hiding their opinions, but also aren't putting active effort into sharing them. Sharing 24/7 is bad, but sharing at least once is better than sharing zero times.
Some people are executing plans I don't believe in, to fix ASI risk.
In this case, they probably have an existing social circle already invested in that plan, and would need to painfully disconnect from it in order to pursue mine. Honestly, talking to me personally seems like a good idea here. Seeing the first few courageous people seems useful. Apart from that, however, it is straightforwardly true that a significant number of people will cancel you, and me providing more accurate statistics about how many may not help much.