At first glance, social media seems to offer us boundless opportunities to connect with the world like never before. However, a recently published study reports the exact opposite.
Contrary to the common notion, Facebook can make us more narrow-minded. It isolates us by reinforcing biases and, in many cases, erroneous information.
The researchers used data modeling to map the spread of two types of content: scientific information and conspiracy theories.
The paper concluded that most people “tend to select and share content related to a specific narrative and to ignore the rest.” It further explains that social homogeneity is the main driver of content diffusion.
Simply put, you and your friends share the same content, regardless of whether it is bunk, because you think alike. With such tightly defined ideas, there is little room for anything new or challenging.
What of the fake news?
The paper was co-authored by Alessandro Bessi, a postdoctoral researcher with the Information Sciences Institute at the University of Southern California. He explained that the study aimed to explain why there is so much fake news online.
“Our analysis showed that two well-shaped, highly segregated, and mostly non-interacting communities exist around scientific and conspiracy-like topics,” Bessi told CNN.
He added that users tend to search for information that conforms to their pre-existing beliefs, referring to this as a “confirmation bias.”
Thus social media is no longer a platform for challenging or informing but rather one that reinforces ideas already accepted within our social circles. This misinformation is what ultimately generates “fake news.”
“Indeed, we found that conspiracy-like claims spread only inside the echo chambers of users that usually support alternative sources of information and distrust official and mainstream news,” Bessi says.
What’s the way forward?
As much as you may think you are open-minded and that your online conversations are free of misinformation, Bessi warns that anyone can fall into the misinformation trap.
He said that when we come across something that conforms to our ideologies, we are highly likely to share it.
Considering that we have “limited attention, and a limited amount of time,” chances are we will share a piece of information without ever examining it.
“For example, I may share content just because it has been published by a friend that I trust and whose opinions are close to mine,” Bessi says.
Bessi hopes that the future will bring programs or algorithms to help combat misinformation. For the moment, do your own fact-checking before you press the share or like button.