Summary: A new study shows that X’s verification system, which gives verified users priority in its recommendation algorithms, can increase polarization and trigger the formation of echo chambers. Researchers used computational modeling to simulate how verified users affect the spread of political opinions on social media.
They found that when verified users with entrenched opinions post, their influence can drive polarization, while centrist ideologues can reduce it if present in sufficient numbers. These findings highlight the unintended effects of prioritizing verified users in social networks and suggest that platforms should reconsider how algorithms promote content.
Key Facts:
- Verified users on X can increase polarization and create echo chambers.
- Verified ideologues with entrenched views have the most polarizing effect.
- Centrist ideologues can reduce polarization if present in large enough numbers.
Source: Cell Press
When X (formerly Twitter) changed its verification system in 2022, many foresaw its potential to impact the spread of political opinions on the platform.
In a modeling study publishing October 22 in the Cell Press journal iScience, researchers show that having verified users whose posts are prioritized by the platform’s algorithms can result in increased polarization and trigger the formation of echo chambers.
Because X’s new verification system allows almost anybody to become verified, this side effect could be exploited by users wishing to manipulate others’ opinions, the researchers say.
“Our findings confirm that ideologues and verified users play a crucial role in shaping the flow of information and opinions within the social network,” says first author Henrique Ferraz de Arruda, a computer scientist at George Mason University.
“When verified people post things, it can reach more people, which allows them to have a significant impact on the formation and reinforcement of echo chambers.”
Though many people speculated that X’s verification system might have ramifications, its actual impact hasn’t been studied in depth—in part because the platform no longer allows researchers to access its data.
For this reason, the researchers used a computational model that simulates how people post and receive messages on social media platforms to investigate how a larger number of verified users might affect polarization and the formation of echo chambers.
Within the model, they varied both the number of verified users and how stubborn those users were in their opinions.
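The paper’s full model is more detailed, but a minimal sketch of the general idea might look like the following: agents hold opinions, post to their followers, and move toward posts that are close enough to their own view, while “priority” users’ posts are pushed to extra readers and “stubborn” users never update. All parameter names and values below are illustrative assumptions, not the authors’ own.

```python
import numpy as np

# Minimal sketch (not the authors' exact model): agents hold opinions in [-1, 1]
# and post them to their followers. A reader moves toward a post only if it is
# close enough to their own view (a bounded-confidence rule). "Priority" users'
# posts are also delivered to extra, randomly chosen readers, and "stubborn"
# users never update. All parameter names and values are illustrative.

rng = np.random.default_rng(0)

N = 200                 # number of users
mu = 0.1                # how far a reader moves toward a post it accepts
epsilon = 0.4           # bounded-confidence threshold
boost = 5               # extra readers reached by each priority user's post

opinions = rng.uniform(-1, 1, N)
is_priority = rng.random(N) < 0.05        # ~5% verified / priority users
is_stubborn = rng.random(N) < 0.05        # ~5% stubborn ideologues
followers = [rng.choice(N, size=10, replace=False) for _ in range(N)]

def step():
    poster = rng.integers(N)
    audience = list(followers[poster])
    if is_priority[poster]:
        # the recommendation algorithm pushes the post to additional users
        audience += list(rng.choice(N, size=boost, replace=False))
    for reader in audience:
        if reader == poster or is_stubborn[reader]:
            continue
        if abs(opinions[reader] - opinions[poster]) < epsilon:
            opinions[reader] += mu * (opinions[poster] - opinions[reader])

for _ in range(50_000):
    step()

# crude polarization proxy: how spread out the final opinions are
print("opinion std dev:", round(float(opinions.std()), 3))
```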
They showed that verified users can actually facilitate consensus on the platform if they are not stubborn in their opinions. However, if verified users are “ideologues” with entrenched opinions that they hope to disseminate, their presence can drive polarization.
When verified ideologues held extreme views, their presence triggered the formation of echo chambers in addition to driving polarization. In contrast, the presence of verified centrist ideologues decreased polarization, while stubborn but unverified centrists drove polarization without triggering echo chambers.
“We found that even centrist ideologues, who may appear as a moderating force on the surface, can have a significant impact on the opinion dynamics when in enough numbers,” says Arruda.
These differences were driven by changes in the connections within the network: essentially, how users followed or unfollowed one another.
“When the number of ideologues in the network becomes sufficiently large, regardless of whether they exhibit centrist or extremist behavior, we observed that a significant portion of the messages exchanged in the network are either sent to or received from these influential users,” says Arruda.
“This suggests that, when social network algorithms prioritize visibility over content control, the users may be able to reach others to reinforce their opinions in groups, which could entrench echo chamber structures.”
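Continuing the sketch above, the follow/unfollow dynamics the researchers describe could be approximated with a rule along these lines; the probabilities and thresholds are assumptions, not values from the paper.

```python
import numpy as np

def maybe_rewire(reader, poster, opinions, followers, rng,
                 epsilon=0.4, p_rewire=0.01):
    """Illustrative follow/unfollow rule (an assumed mechanism, not the
    paper's exact update). followers[u] lists the users who receive u's posts."""
    too_far = abs(opinions[reader] - opinions[poster]) >= epsilon
    if not too_far or rng.random() > p_rewire:
        return
    # drop the discordant tie: the reader stops receiving the poster's messages
    followers[poster] = followers[poster][followers[poster] != reader]
    # ...and starts following a randomly chosen like-minded user instead
    close = np.where(np.abs(opinions - opinions[reader]) < epsilon)[0]
    close = close[close != reader]
    if close.size:
        new_source = rng.choice(close)
        followers[new_source] = np.append(followers[new_source], reader)
```

Inserted into the posting loop of the earlier sketch, a rule like this gradually concentrates ties among like-minded users, which is one way echo-chamber structure can become entrenched in a simulation of this kind.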
Though the study was based on X’s framework, the researchers say that the results are probably also relevant to other social media platforms. They say that social media companies should be aware of the possible impact they have on political polarization and attempt to mitigate this within their algorithms.
Though in some cases social media moguls could be attempting to polarize their networks, Arruda speculates that for other platforms, this “happens as a side effect because they want to make us use the platform more.”
In future research, the team plans to increase the realism of their model by adding features such as news feeds and reposting and to incorporate data from other social media platforms such as Bluesky.
Funding:
This research was supported by the Government of Aragón, Spain and the Ministerio de Ciencia e Innovación, Agencia Española de Investigación.
About this social behavior research news
Author: Kristopher Benke
Source: Cell Press
Contact: Kristopher Benke – Cell Press
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Echo chamber formation sharpened by priority users” by Henrique Ferraz de Arruda et al. iScience
Abstract
Echo chamber formation sharpened by priority users
On social media platforms, priority users (e.g., verified profiles on X) are users whose posts are promoted by recommendation algorithms. However, their influence on opinion dynamics, in particular polarization and echo chamber formation, is not well understood.
Through computational modeling, we investigate this influence in a stylized setting. By introducing priority user accounts, we find that prioritization can mitigate polarization.
However, by incorporating stubborn user behavior, we find that the results change and that priority accounts can exacerbate the formation of echo chambers. In other words, a minority of extremist ideologues can trigger a transition from consensus to polarization.
Our study suggests careful monitoring of platform prioritization policies to prevent potential misuse of users with enhanced influence.