Why is the far-right more likely to spread and believe misinformation?

A recent study evaluated 32 million social media posts from parliamentarians in 26 countries over a span of six years and found that far-right political discourse is the most prone to spreading false information:

"Using multilevel analysis with random country intercepts, we find that radical-right populism is the strongest determinant for the propensity to spread misinformation. Populism, left-wing populism, and right-wing politics are not linked to the spread of misinformation. These results suggest that political misinformation should be understood as part and parcel of the current wave of radical right populism, and its opposition to liberal democratic institutions."

Other studies that analyzed differences in how websites moderate political speech found similar results: users associated with right-wing politics did experience more moderation and sanctions, but users in that cohort were also more likely to spread false information and rely on low-quality sources:

"We argue that differential sharing of misinformation by people identifying with different political groups could lead to political asymmetries in enforcement, even by unbiased policies. We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites—even when news quality was determined by politically balanced groups of laypeople, or groups of only Republican laypeople—and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. Political imbalance in enforcement need not imply bias on the part of social media companies implementing anti-misinformation policies."

Discussion:

Why is there such a strong correlation between far-right political ideology and spreading false information? Does one necessarily lead to the other, or does the question of which came first even matter?

What steps can be taken to limit the spread of false information?

Do you agree with the conclusion that an imbalance in platform moderation enforcement does not necessarily imply political bias, given that users with far-right political ideology are moderated more frequently because they are more likely to spread false information?