A small subset of Facebook users is responsible for most of the content that expresses or encourages skepticism about Covid-19 vaccines, according to early results from an internal Facebook study.
The study, first reported by the Washington Post, confirms what researchers have long argued about the echo chamber effect on vaccine attitudes - an echo chamber being a closed system in which beliefs are reinforced through communication and repetition while being insulated from opposing views.
A document describing the study - which has not been made public - was obtained by the Washington Post. Researchers at Facebook divided users, groups and pages into 638 "population segments" and monitored them for "vaccine-hesitant beliefs," according to the Post. These could include statements such as "I'm worried about getting the vaccine because it's so new" or "I don't know if a vaccine is safe."
Each "segment" could contain as many as 3 million people, meaning the study could cover the activity of more than 1 billion people - still less than half of Facebook's roughly 2.8 billion monthly active users, the Post reported. The scale of the study also highlights how much information Facebook can draw from its user base, and how the company is using that data to examine public health outcomes.
The Post reported that, in the population segment with the highest incidence of vaccine hesitancy, just 111 users were responsible for half of all the content flagged within that segment. The study also showed that just 10 of the 638 population segments contained 50% of all the vaccine-hesitancy content on the platform.
Over the past year, Facebook has partnered with more than 60 global health experts to provide accurate information about Covid-19 and vaccines. The platform announced in December 2020 that it would ban all vaccine misinformation, suspending users who break the rules and ultimately removing them if they continue to violate its policies.
The study is just the latest to illustrate the effect that only a few people can have on the online information ecosystem.
The Facebook study also found significant overlap between users exhibiting anti-vaccine behavior and supporters of QAnon, the conspiracy theory movement.
Sources: Washington Post, Guardian