WISCONSIN — In the aftermath of tragic events, particularly school shootings and mass shootings, the negative aspects of social media become more apparent.
The speed and unvetted nature of these platforms allow misinformation to circulate within seconds, reaching millions of people.
“Social media has created a dopamine hit for us,” said Assistant Professor Patrick Johnson, who teaches journalism at Marquette University. “We have access at our fingertips like we never did before. Anyone can say whatever they want and these platforms, as a result of Section 230, aren’t accountable for that information.”
As technology advances, it is becoming increasingly difficult for the public to discern what is true on social media and what is not.
“One of our most severe challenges is how instantaneous it is, which means for journalists, we don’t get to control the story like we used to,” Johnson said. “Sadly, anybody can tweet, post, re-post anything instantaneously without the ability for someone who’s trained to do that fact-checking to actually do that fact-checking.”
Unconfirmed or false information can also be a detriment to the official investigation into what happened and why.
“When it comes to school shootings, what people don’t realize is that we have to operate with such a degree of care and sensitivity that we do not just want to tweet out what’s happening,” Johnson said. “These are serious situations with minors, with people whose parents maybe have no idea if their child is in that building still.”
Johnson said, “I think journalism does it right in that we slow it down when it comes to these things, but we have to help communities understand the how and why we’re slowing it down.”
As society embraces artificial intelligence and increasingly advanced technology, bots have proliferated. Bots are automated, computer-generated accounts that amplify certain, primarily negative, rhetoric through social media algorithms. They can be created by other countries or by organizations that have trained algorithms to post certain content, which often stokes fear and anger.
“If you’re seeing the same exact language, down to the capitalization, multiple times, but linked to different accounts, that should be a telltale sign some of those, if not all of them, are bots,” Johnson said. “The images that are often used on those social accounts likely are not going to be of an actual person’s face.”
“Even if it’s not a bot, if you’re not seeing a person and someone is a keyboard warrior anonymously, I would say we should discredit the story they’re trying to tell,” he said.
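Johnson’s duplicate-language test can be demonstrated in a few lines of code. The sketch below is a hypothetical illustration, not a tool he described: it assumes posts are available as (account, text) pairs and flags any message that appears verbatim, capitalization included, under more than one account.

```python
from collections import defaultdict

def flag_suspected_bot_posts(posts):
    """Group posts by exact text (capitalization included) and flag any
    text that appears verbatim under more than one account.

    posts: iterable of (account, text) pairs.
    Returns a dict mapping each repeated text to the set of accounts
    that posted it.
    """
    accounts_by_text = defaultdict(set)
    for account, text in posts:
        accounts_by_text[text].add(account)
    # Identical wording across *different* accounts is the telltale sign
    # Johnson describes; one account repeating itself is not.
    return {text: accounts
            for text, accounts in accounts_by_text.items()
            if len(accounts) > 1}

if __name__ == "__main__":
    # Hypothetical sample data for illustration only.
    sample = [
        ("@user_a", "SHARE before they DELETE this!!!"),
        ("@user_b", "SHARE before they DELETE this!!!"),
        ("@user_c", "Thoughts with everyone affected today."),
    ]
    for text, accounts in flag_suspected_bot_posts(sample).items():
        print(f"{len(accounts)} accounts posted verbatim: {text!r}")
```

Real bot detection weighs many more signals, such as account age, posting cadence, and profile images, but exact-match repetition across accounts is the pattern Johnson points to.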
The toll on mental health, especially for kids, is significant during tragic events such as school shootings. Young people have social media access at their fingertips, which means they often see tragic events unfold before their parents do.
“Kids don’t have a filter,” Johnson said. “Sometimes, it’s not just the gruesome imagery of what’s happening, they’re seeing the gruesome ways in which people change the story.”
He also said parents need to be proactive in the kinds of conversations they’re having with their kids.
“To be proactive, parents need to say, ‘I also need you to come and tell me how you feel’ because we have to be conscientious of the mental health crisis that is occurring, and these images and these narratives aren’t helping.”
Disinformation can also be a detriment to marginalized communities, Johnson added.
“People equate one marginalized person to another as if we’re all a monolith,” he said. “That happened with this school shooting. That happened with the Nashville school shooting.”
“Social media doesn’t tend to hinder the privileged. It tends to stoke flames against the marginalized.”