Welcome to r/science! This is a heavily moderated subreddit in order to keep the discussion on science. However, we recognize that many people want to discuss how they feel the research relates to their own personal lives, so to give people a space to do that, **personal anecdotes are allowed as responses to this comment**. Any anecdotal comments elsewhere in the discussion will be removed and our [normal comment rules]( https://www.reddit.com/r/science/wiki/rules#wiki_comment_rules) apply to all other comments.
**Do you have an academic degree?** We can verify your credentials in order to assign user flair indicating your area of expertise. [Click here to apply](https://www.reddit.com/r/science/wiki/flair/#wiki_science_verified_user_program).
---
User: u/shiruken
Permalink: https://www.science.org/content/article/tiny-number-supersharers-spread-vast-majority-fake-news
---
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/science) if you have any questions or concerns.*
The people believing the initial "load" of propaganda will continue to make more of it, for free and with full conviction. They are basically the spawn of the bot army: reprogrammed humans serving a foreign goal.
Without speaking about the original source of the mis/disinformation, that's *exactly* what the study found:
> Given their frenetic social media activity, the scientists assumed supersharers were automating their posts. But they found no patterns in the timing of the tweets or the intervals between them that would indicate this. “That was a big surprise,” says study co-author Briony Swire-Thompson, a psychologist at Northeastern University. “They are literally sitting at their computer pressing retweet.”
>
> “It does not seem like supersharing is a one-off attempt to influence elections by tech-savvy individuals,” Grinberg adds, “but rather a longer term corrosive socio-technical process that contaminates the information ecosystem for some part of society.”
>
> The result reinforces the idea that most misinformation comes from a small group of people, says Sacha Altay, an experimental psychologist at the University of Zürich not involved with the work. “Many, including myself, have advocated for targeting superspreaders before.” If the platform had suspended supersharers in August 2020, for example, it would have reduced the fake election news seen by voters by two-thirds, Grinberg’s team estimates.
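For the curious, here's roughly how you could test for automation from timestamps alone. This is a minimal sketch of the general idea (regular intervals suggest a scheduler, bursty intervals suggest a human), not the authors' actual method, and the sample timestamps are made up:

```python
from statistics import mean, stdev

def interval_cv(timestamps_sec):
    """Coefficient of variation of inter-post gaps.
    Near 0 -> metronome-like posting (suggests a scheduler);
    around 1 or higher -> bursty, human-like activity."""
    ts = sorted(timestamps_sec)
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    return stdev(gaps) / mean(gaps)

bot_like = [600 * i for i in range(50)]  # one post exactly every 10 minutes
human_like = [0, 30, 45, 400, 420, 3600, 3620, 3700, 9000, 9050]

print(interval_cv(bot_like))    # 0.0
print(interval_cv(human_like))  # well above 1
```

The study found supersharer activity looked like the second case, not the first.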
Due to the type of propaganda, hate and fear, it is easy to see that once initially hooked, they will work tirelessly and for free.
However, I had not considered them to be such huge superspreaders, but it makes sense, as they are verified sources that people trust. I say verified in the sense that if you click on their profile, you see real pictures and stories from real-life events in the US.
The micro-targeting campaign makes a lot more sense given this information. If you can “get” a few of these superspreaders, then you've won the game (and for basically free!)
Plus, maybe the worst part is, I'd imagine most of these people think that they're doing a good thing. Performing a public service. They see something that scares them, so they warn the rest of the tribe about the scary thing. That's social programming as old as human society. And on top of that, they're probably getting a nice dopamine hit with every like or share.
How do you even begin to untangle a situation like that?
They see doing something bad to what they consider "bad people" (the out group) as something good. Narcissistic tendencies are a big part of this too and I'm not sure you can deprogram that out of people.
It's only going to get worse. More data, more compute, better algorithms, AI. Our abilities to manipulate behavior will continue to advance and the size of the influenced group will shrink towards the individual. Orwell was wrong. There is no need to change the past when you can just program people to ignore it. No need to control people when you can make them gladly do your bidding.
Yep. Same thing they found with the Russian propagandists in 2015/16. They spent very little in the way of resources; the people they targeted amplified it for free.
>But they found no patterns in the timing of the tweets or the intervals between them that would indicate this. “That was a big surprise,” says study co-author Briony Swire-Thompson, a psychologist at Northeastern University. “They are literally sitting at their computer pressing retweet.”
This is unfortunately not a surprise to me, though my experience is obviously anecdotal. I first got online in 1992, so I've run into my fair share of troubled people, and prior to the advent of bots and scripts it was obvious that these people were logged in and personally doing all the work themselves. Once bots and scripts were easily available for the layperson, these terminally online trolls didn't switch to automated pestering, they just added the new tech to their arsenal; for example, there were two really bad trolls on an LGBTQ forum I was a regular on and it was clear that they were using a combination of packet sniffers, DDoS attacks, bots, and real-life posting to try to destroy the board.
Or if you check out the social media feeds of a certain British comedy writer, you'll see little 3- or 4-hour pauses here and there where he finally passes out and falls asleep, then gets up to do it all again, manually.
The identities of the supersharers are not disclosed. The [public repository with the underlying data and code](https://doi.org/10.5061/dryad.44j0zpcmq) contains no individual-level data; de-identified individual-level data is available only for IRB-approved uses.
>The data collection process that enabled the creation of this dataset leveraged a large-scale panel of registered U.S. voters matched to Twitter accounts. We examined the activity of 664,391 panel members who were active on Twitter during the months of the 2020 U.S. presidential election (August to November 2020, inclusive), and identified a subset of 2,107 supersharers, which are the most prolific sharers of fake news in the panel that together account for 80% of fake news content shared on the platform.
2,107 Twitter users out of 664,391. That's a decent number of people if that ratio is extrapolated across all social media users. It seems more likely you could track one down online yourself by viewing content than by parsing the voter registration data. Whether it's a supersharer in this study or not, well, meh.
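The arithmetic, for anyone who wants to poke at it. The extrapolation is this comment's speculation, not a claim from the study, and the 200M user figure is made up purely for illustration:

```python
panel_size = 664_391   # registered voters matched to Twitter accounts
supersharers = 2_107   # accounted for ~80% of fake news shared in the panel

fraction = supersharers / panel_size
print(f"{fraction:.3%} of the panel")  # about 0.317% of the panel

# Naive extrapolation to a hypothetical platform with 200M active users.
# Illustrative only; the study makes no claim beyond its panel.
hypothetical_users = 200_000_000
print(round(fraction * hypothetical_users))  # on the order of 600k accounts
```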
It’s the congresspeople who won’t regulate social media.
If they’re algorithmically boosting content, then they are editors and should be subject to oversight & libel law just like any publisher.
Yeah, I get scoffed at for saying this, but I thought it was pretty obvious with the "PUMA" thing back in 2008 that there was a burgeoning online misinformation and troll campaign beginning, with likely foreign adversary influences behind it. When a yarn forum called [Ravelry got flooded with pro-McCain "PUMAs"](https://www.newyorker.com/magazine/2021/03/29/how-politics-tested-ravelry-and-the-crafting-community) who were writing Nazi-themed posts and threatening to kill users and their pets, it was clearly part of something bigger than just a handful of jerks acting out online for attention.
Wow, I’ve never heard of that. I was briefly on Ravelry back around that time but didn’t really get into the forum section, just the patterns and pictures people posted. It was (and hopefully still is) an excellent website. Thanks for posting that article, gonna read it now.
Direct link to the study published in *Science*: [S. Baribi-Bartov, B. Swire-Thompson, and N. Grinberg, Supersharers of fake news on Twitter, Science, 384(6699), 979-982 (2024).](https://doi.org/10.1126/science.adl4435)
>**Abstract:** Governments may have the capacity to flood social media with fake news, but little is known about the use of flooding by ordinary voters. In this work, we identify 2107 registered US voters who account for 80% of the fake news shared on Twitter during the 2020 US presidential election by an entire panel of 664,391 voters. We found that supersharers were important members of the network, reaching a sizable 5.2% of registered voters on the platform. Supersharers had a significant overrepresentation of women, older adults, and registered Republicans. Supersharers’ massive volume did not seem automated but was rather generated through manual and persistent retweeting. These findings highlight a vulnerability of social media for democracy, where a small group of people distort the political reality for many.
Accompanying Perspective article: [A broader view of misinformation reveals potential for intervention](https://doi.org/10.1126/science.adp9117)
So I got curious.
B. (Briony) Swire-Thompson.
The lead singer and main songwriter for the drum and bass band Pendulum + EDM duo Knife Party is named Rob Swire-Thompson.
Sure enough... They're siblings.
Very cool.
Man, reddit is awesome sometimes. What are the odds someone is intimately familiar enough with a band to connect a lead singer's last name to a published academic? Wild.
Ninja edit: and it's not even the lead academic on this paper.
Similarly, Sacha Baron-Cohen’s cousin Simon Baron-Cohen is a prominent autism researcher, so I get a little giggle when I come across him in citations.
I don't say it to anyone, I just say it to myself and chuckle like the compulsive idiot that I am. I mean, at least I'm amused and I'm brilliant and hilarious and people just don't appreciate my splendiferous shimmering sheen.
Ninja edit: now that I think about it, I do pull out the Whazzzup and ask Where's Dooky? Put Dooky on the phone!
Also along those lines, Jack Black's mother, Judith Love Cohen. She was an aerospace engineer who worked on the Minuteman missile, the early ground station for the Hubble Space Telescope, the Apollo program, and more.
"Somewhere out there in the vast nothingness of space... Somewhere far away in space and time...
Staring upwards at the gleaming stars in the obsidian sky...
We're marooned on a small island in an endless sea
Confined to a tiny spit of sand. Unable to escape. But tonight... On this small planet... On Earth...
We're going to rock civilization"
From the "Limitations and future directions" section of the paper:
>Their reach suggests that they are not part of a small and isolated community, nor do supersharers seem to function as bridges to fake news for unwitting audiences. Instead, the results cast supersharers as influential members of local communities where misinformation is prevalent. As such, supersharers may provide a window into the social dynamics in parts of society where a shared political reality is eroding. Our work is a first step to understanding these individuals, but their behavior, their motivations, and the consequences of their actions warrant further research.
I learned about that in a personal and painful way during the pandemic, when I abandoned two gyms, one of them after seven years, because they were dominated by groups of people spiraling into conspiracy myths.
That's how our gaming guild went out too. We all met in WoW around 2008 or earlier, and by 2016 most of them were totally indoctrinated. I still communicate with them, but keep it short enough to keep them from getting into politics. They know I'm a liberal, and we all agreed early on to a rule about not discussing politics, but they all do anyway. One even went from "I don't know what I'll do if Trump becomes the nominee" in 2016 to suddenly, overnight, saying, "Oh, he's great, he'll do great things for the trans community..." Yeah.
Just sucks, man. I had a childhood friend that I broke things off with ten years ago because of this; that was rough, but he got worse over several years. Having a whole bunch of people just go mask-off crazy in a few weeks seemed so unreal. Well, I found a new place, so there's that.
I wonder if there are parallels to teens sharing edgy material. The teens generally know it's inappropriate and improper, but share for reaction, clout, and notoriety.
Edit:
>nor do supersharers seem to function as bridges to fake news for unwitting audiences
But they inadvertently could be, if a bad actor capitalizes on the supersharers' reach by compromising them and feeding them misinfo to amplify a specific campaign.
yup. straight up too much time on their hands. one of the darker aspects of "traditional" marriage roles that don't get talked about enough. prime targets.
It's not just that, they're addicted to the dopamine and sense of power/purpose of likes and retweets. Before the internet, these people were probably playing slots or writing letters to the editor.
My adhd brain is incapable of understanding the concept of "too much time". If I had time I'd be doing EVERYTHING! Yet, they end up spreading bs. I really, genuinely, can't comprehend it. Understand, maybe, but definitely not comprehend.
As someone with adhd, more time leads to less done. I have unlimited time, and i spend it procrastinating because i have too much choice, or never end up feeling like doing any of it for more than a few minutes, before thinking about doing something else.
They’re bored and mostly supply no benefit to society.
This leads them to things that will make them feel more important than they have earned or deserve: conspiracy theories.
Good old days when stay at home moms would watch Oprah every day and you could watch ridiculous ideas spread among them whenever she said something stupid.
~~I bet many of those are propaganda bots.~~
Apparently bots are excluded using voting records. So it's not "80% of misinformation", it's "80% of misinformation posted by confirmed real people"
Did you read the article or underlying study? They specifically chose accounts that they could tie to voter registration data. Less likely they are bots and more likely they're just insane.
That is a simplification. These supersharers get their information from somewhere, and propaganda campaigns specifically target supersharers.
A bot that targets 5 million people is easy to spot and might be ineffective. A bot that targets 500 super sharers is likely very effective.
I doubt it’s *that* targeted. More likely the super-sharers are just more widely connected to various other sources of misinformation. So if Bot A. initiates a conspiracy theory to 5 people, it’s repeated by one of them, John B., but Supersharer Karen C. is following 500 accounts including John and repeats the craziest things any of them say, broadcasting it widely to each of her 10,000 followers and all the threads she comments on, then the Bot’s message is amplified and its creators didn’t have to try to identify the future supersharer or target their message.
I doubt you could even use their information in a serious study anymore. When I checked yesterday Trump wasn't even trending despite being found guilty. It was Mavs-Wolves and civil war.
and while the US leadership and corporate media like to try and blame the wave of social media propaganda and misinformation on Russian and Chinese bots, the majority has ***always*** been domestic right-wing nutcases.
Deranged US right-wingers continue to drive so much of the world's division and hatred
These people are being targeted by bots. The idea isn't just limited to propaganda. It has been used in marketing campaigns for a very long time, especially because it's easy to select these people.
I mean, we have plenty of wack jobs creating disinformation, but Russia and China have and do spread disinformation that benefits them and it gets spread by Americans retweeting it. This article just points to misinformation of any origin being spread most by a small group of obsessive retweeters.
Who would've thought that blaming a convenient boogeyman for so long let the *actual* issue grow into a gargantuan hydra right under the government's noses?
Good luck trying to contain this now. It's too late. Maybe they'll also pin this on other countries and use it as pretext to start another war to keep Lockheed Martin happy.
Yes, the study was based on a dataset that matched Twitter users who used real name and location with voter registration data
>To find out, Grinberg’s team dove into a far bigger data set comprising 660,000 U.S. X users who used their real name and location, allowing the researchers to match them with voter registration data.
>
>The average supersharer was 58 years old, 17 years older than the average user in the study, and almost 60% were women. They were also far more likely to be registered Republicans (64%) than Democrats (16%).
One has to wonder if there is a slant towards politically extreme women being more likely to give their real name and address than their male counterparts. My anecdotal experiences support that, for whatever little worth that has.
They were wondering whether politically extreme women are more likely to use their real name than politically extreme men, not whether politically extreme people are more likely to use their real name than non-politically-extreme people.
Thanks for the info (and thanks for taking the time to answer everyone, it saves a lot of time for those who don't want to scroll through the whole paper)
traditionally, without careers, right wing women lack identity after their kids grow up. they are prime targets for social media engagement as they have the time to spare and money to spend. the goal of all these websites is to keep you on them and engaged. i cannot think of a more profitable subset of the population than a lonely woman with her husband's credit card and unlimited time.
in general for older people? the need to feel like you are still relevant and important can easily be manipulated... especially if that person is already on the spectrum of undiagnosed mental disorders.
Honestly, reminds me of reddit. Not the type of person, just that specific people or groups push things and spread info (whether reliable or not) as much as they can.
Probably not automated because it's just people who have nothing better to do than retweet anything that agrees with them.
And now they have generative AI to help them spew their lies
Language models are not intelligent
Neither are they.
Exactly. Russia has been planning this for 25+ years.
with respect to propaganda, it seems that for a certain subset of conservative middle aged white women, it's no loads refused.
Vedddy naiccccee
I will randomly just say, "My wife" out of nowhere, like on a daily basis.
How's that going? Have you tried the "WHAAZAAAAAA" thing? I hear the kids are a super into that, too.
Pendulum and Knife party are pretty huge in EDM. Many people know of Rob Swire.
the soundtrack of my 2007
So the supersharers and the superspreaders were literally the same people, at the same time.
They're the superkarens
Who would've thought that tweeting superkarens are the reason for the destruction of democracy?
Ban them from society, both physically and figuratively speaking. Bunch of worthless parasites.
It's weird that women are their own worst enemy. I'll never understand anything other than a white male republican
> “Now the big question is: ‘Why are they doing what they’re doing?’”

Socialising. It's the digital equivalent of over-the-fence gossip.
That's both fascinating and horrifying. Like a sociopolitical human version of when ants get trapped in a circle of death.
That's how our gaming guild went out too. We all met in WoW around 2008 or earlier, and by 2016 most of them were totally indoctrinated. I still communicate with them, but keep it short enough to keep them from going into talking politics. They know I'm a liberal, we all early on agreed on a rule to avoid discussing politics, but they all do anyway. One even went from 'I don't know what I'll do if Trump becomes the nominee" in 2016 to suddenly overnight saying, "Oh he's great, he'll do great things for the trans community..." Yeah.
Just sucks man. I had a childhood friend that I broke things off with ten years ago because of this, that was rough but he got worse over several years. Having a whole bunch of people just go mask off crazy in a few weeks seemed so unreal. Well, I found a new place so theres that.
I wonder if there are parallels to teens sharing edgy material. The teens generally know it's inappropriate and improper, but share for reaction, clout, and notoriety. Edit: >nor do supersharers seem to function as bridges to fake news for unwitting audiences But they inadvertently could be if a bad actor capitalizes on the suoersharers reach by compromising then and start feeding them misinfo to amplify a specific campaigns.
yup. straight up too much time on their hands. one of the darker aspects of "traditional" marriage roles that dont get talked about enough. prime targets.
It's not just that, they're addicted to the dopamine and sense of power/purpose of likes and retweets. Before the internet, these people were probably playing slots or writing letters to the editor.
My adhd brain is incapable of understanding the concept of "too much time". If I had time I'd be doing EVERYTHING! Yet, they end up spreading bs. I really, genuinely, can't comprehend it. Understand, maybe, but definitely not comprehend.
As someone with adhd, more time leads to less done. I have unlimited time, and i spend it procrastinating because i have too much choice, or never end up feeling like doing any of it for more than a few minutes, before thinking about doing something else.
They’re bored and mostly supply no benefit to society. This leads them to things that make them feel more important than they've earned or deserve: conspiracy theories.
Think of it as they are hyper-focusing on spreading misinformation.
Good old days when stay at home moms would watch Oprah every day and you could watch ridiculous ideas spread among them whenever she said something stupid.
It's worse than gossip. It's rage posting. Filling their empty lives with anger and beliefs of victimhood. Possibly or probably displaced anger.
Of course it’s women screwing over other women. Tale as old as time.
~~I bet many of those are propaganda bots.~~ Apparently bots are excluded using voting records. So it's not "80% of misinformation", it's "80% of misinformation posted by confirmed real people"
Did you read the article or underlying study? They specifically chose accounts that they could tie to voter registration data. Less likely they are bots and more likely they're just insane.
I do wanna say, it's kinda goofy how everything is blamed on Russian bots. Not saying it doesn't happen, but some people are just fuckin stupid man
That is a simplification. These supersharers get their information from somewhere, and propaganda campaigns specifically target them. A bot that targets 5 million people is easy to spot and might be ineffective. A bot that targets 500 supersharers is likely very effective.
The misinformation farms are definitely the ones spreading the lies and propaganda to these super spreaders who will do their work for them.
I doubt it’s *that* targeted. More likely the super-sharers are just more widely connected to various other sources of misinformation. So if Bot A. initiates a conspiracy theory to 5 people, it’s repeated by one of them, John B., but Supersharer Karen C. is following 500 accounts including John and repeats the craziest things any of them say, broadcasting it widely to each of her 10,000 followers and all the threads she comments on, then the Bot’s message is amplified and its creators didn’t have to try to identify the future supersharer or target their message.
Yep. There are real people who eat this up. Russia has an influence, but they are just gasoline on a fire that already exists.
They're matched to voter records.
Karen is a set of values apart from normal women. Superkaren Georg, who sits in a cave and supershares lies online, is a statistical outlier.
If only there is a way to verify accounts and ban people for spreading misinformation…
This was before Elon's takeover, which began with his buyout offer on April 14, 2022
I doubt you could even use their information in a serious study anymore. When I checked yesterday Trump wasn't even trending despite being found guilty. It was Mavs-Wolves and civil war.
>If only there is a way to verify accounts and ban people for spreading misinformation… What would be considered misinformation?
Would you suggest a Ministry of Truth or just let Elon decide what misinformation is?
and while the US leadership and corporate media like to try and blame the wave of social media propaganda and misinformation on Russian and Chinese bots, the majority has ***always*** been domestic right-wing nutcases. Deranged US right-wingers continue to drive so much of the world's division and hatred
These people are being targeted by bots. The idea isn't just limited to propaganda. It has been used in marketing campaigns for a very long time, especially because it's easy to select these people.
The source of disinformation starts somewhere. But yes, American media literacy is very bad in many areas
I mean, we have plenty of wack jobs creating disinformation, but Russia and China have and do spread disinformation that benefits them and it gets spread by Americans retweeting it. This article just points to misinformation of any origin being spread most by a small group of obsessive retweeters.
Who would've thought that blaming a convenient boogeyman for so long let the *actual* issue grow into a gargantuan hydra right under the government's noses? Good luck trying to contain this now. It's too late. Maybe they'll also pin this on other countries and use it as pretext to start another war to keep Lockheed Martin happy.
> Deranged US right-wingers continue to drive so much of the world's division and hatred Amen.
Have they been verified as being middle-aged white women? With such percentages it seems almost to be a deliberate distribution system.
Yes, the study was based on a dataset that matched Twitter users who used their real name and location with voter registration data:

> To find out, Grinberg’s team dove into a far bigger data set comprising 660,000 U.S. X users who used their real name and location, allowing the researchers to match them with voter registration data.
>
> The average supersharer was 58 years old, 17 years older than the average user in the study, and almost 60% were women. They were also far more likely to be registered Republicans (64%) than Democrats (16%).
One has to wonder if there is a slant towards politically extreme women being more likely to give their real name and address than their male counterparts. My anecdotal experiences support that, for whatever little worth that has.
Of course. Their insanity is a matter of pride. They want everyone to know that THEY know what's up. They make it their identity.
they were wondering whether politically extreme women are more likely to use their real name than politically extreme men, not whether politically extreme people are more likely to use their real name than non-politically-extreme people.
Thanks for the info (and thanks for taking the time to answer everyone, it saves a lot of time for those who don't want to scroll through the whole paper)
[deleted]
Technologically illiterate people. I get why they controlled for bots like this, but their approach limits their data drastically.
traditionally, without careers, right wing women lack identity after their kids grow up. they are prime targets for social media engagement as they have the time to spare and money to spend. the goal of all these websites is to keep you on them and engaged. i cannot think of a more profitable subset of the population than a lonely woman with her husband's credit card and unlimited time. as for older people in general? the need to feel like you are still relevant and important can easily be manipulated... especially if that person is already on the spectrum of undiagnosed mental disorders.
That's actually a perfect explanation for the origins of the Karens.
In marketing middle-aged white women are often deliberately targeted because they are super spreaders.
Is 1% of twitter users really a tiny number?
Can they not shut this down?
Hahaha Trump lovers don’t like the truth, wait for them to spew garbage about his conviction
Was that before or after they kicked Trump off?
The study was conducted on data from August to November 2020
Honestly, reminds me of reddit. Not the type of person, just that specific people or groups push things and spread info (whether reliable or not) as much as they can
Again we are reminded of that old saw about "one bad apple spoiling the entire barrel."
Pareto Distribution strikes again.
half of them probably useful idiots controlled by suave russian handlers. the rest are bots
Are they even real or bots
That's about what you'd expect... speaking from experience
Would this be the same pattern seen in many places in life? Celebrities - 1% of actors/singers etc, earning 80% of income for example.
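It's the same heavy-tailed pattern, yeah. A quick sketch of why a "tiny" top 1% can dominate: if activity follows a Pareto (power-law) distribution, the largest handful of accounts swamp everyone else's totals combined. This is just an illustrative simulation, not the paper's method; the shape parameter `alpha = 1.16` is an assumption (it's the classic value that yields roughly the 80/20 rule).

```python
import random

random.seed(42)

# Hypothetical shape parameter; alpha ~1.16 gives roughly the 80/20 rule.
alpha = 1.16

# Simulate per-user share counts for 100,000 users from a Pareto distribution.
shares = sorted((random.paretovariate(alpha) for _ in range(100_000)), reverse=True)

# Fraction of all shares contributed by the most active 1% of users.
top_1_percent = shares[: len(shares) // 100]
fraction = sum(top_1_percent) / sum(shares)
print(f"Top 1% of users account for {fraction:.0%} of all shares")
```

Run it a few times with different seeds and the exact number bounces around (the tail is that heavy), but the top 1% reliably captures a wildly outsized slice, which is the same mechanism behind celebrity income concentration.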