Have you ever had the feeling you are living in a bubble? Everyone around you mostly agrees with you, likes the same movies, has the same dream holiday and buys clothes from the same stores. Until one day you wonder: where did everybody else go?
This pretty much sums up our virtual life on social media. Algorithms make us comfortable.
The term “filter bubble” was coined by Eli Pariser in 2011 (listen to his TED talk here) and has since sparked debate on the ethics of tailoring our search results and news feeds for us implicitly, not explicitly (that is, not through a direct choice we make when setting up our accounts). Are people blinded by their own past choices and clicks? Or is this a matter of self-inflicted isolation?
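To make that “implicit” part concrete, here is a deliberately simplified sketch. It is not any platform's real ranking code, and every name and data point in it is made up; it only shows the basic idea that a feed scored by your past clicks will quietly push unfamiliar topics out of sight.

```python
from collections import Counter

def rank_feed(articles, click_history):
    """Toy illustration: score each article by how often the user
    already clicked its topic, so past clicks quietly shape the feed."""
    topic_counts = Counter(a["topic"] for a in click_history)
    # Articles on familiar topics float to the top; everything else sinks.
    return sorted(articles, key=lambda a: topic_counts[a["topic"]], reverse=True)

# Hypothetical click history and candidate feed
clicks = [{"topic": "politics"}, {"topic": "politics"}, {"topic": "sports"}]
feed = [
    {"title": "Election roundup", "topic": "politics"},
    {"title": "Climate report", "topic": "science"},
    {"title": "Match highlights", "topic": "sports"},
]

for article in rank_feed(feed, clicks):
    print(article["title"])
# The science story the user never clicked on ends up last.
# Nobody chose to hide it; the ranking simply learned to.
```

Real systems are vastly more complex, but the feedback loop is the same: what you clicked yesterday decides what you are shown tomorrow.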
One can argue, with evidence, that algorithms have a minimal effect on what information people are exposed to (see, for example, the study “Burst of the Filter Bubble? Effects of personalization on the diversity of Google News” by Mario Haim, Hans-Bernd Brosius and Andreas Graefe). The fact still remains that what you don’t know doesn’t hurt you (at least not directly). The bottom line is that companies make money and political parties get ahead by using social media and search engines. You can’t really blame them; that’s their goal. The problem arises when they do it with false information and by manipulating the truth.
By default, people like to be in circles of others similar to themselves, and yes, they benefit from a personalized web to some degree. That leaves us with one important task: to open our eyes, practice awareness and run a filtering algorithm of our own.
The ones who manipulate information and data to their benefit are waiting for an audience. Do yourself a favor and skip that show.