Scott DeJong

ECHO, Echo, echo

This is a blog post from my Master's work, conducted in 2018-19. It is a re-post from the original site where the blog was hosted, only slightly adjusted for the new website. If you are interested in the larger project, you can read the thesis here.


Are we trapped in a chamber, consistently repeating what we want to hear? Maybe you've heard this notion referred to as an echo chamber. The term describes how digital (or physical) spaces can become polarized as individuals surround themselves with others who consistently reinforce their opinions (Sunstein, 2001). Focused on the individuals we connect with, echo chambers are curated as users and platforms specifically ignore, remove, or cut out those who disagree with them (Bruns, 2019). Yet, just like filter bubbles, echo chambers have come under scrutiny over whether they actually exist.


Before I go further into critiques of echo chambers, I want to lay out the specific differentiation between echo chambers and filter bubbles. As discussed in the last blog post, filter bubbles involve a collaboration between user and algorithm: we actively engage with content that matches our own ideas and exclude or disengage from content we disagree with (Bruns, 2019). The more active we are in our engagement and disengagement, the stronger our filter bubble becomes (Bruns, 2019). Echo chambers, by contrast, are focused on the individuals we connect with and the active ignoring of other individuals to build ourselves an ‘isolated’ network (Bruns, 2019). In short, as their names suggest, filter bubbles distill the content we see based on our actions and the platform’s study of those actions, while echo chambers are created by the acceptance of some ideas and the active exclusion of others.


While these definitions help, the premise behind these terms might be flawed. People are exposed to content outside of their beliefs and opinions on a regular basis, and many actively choose to research information they hear (Dubois & Blank, 2018). Dubois and Blank (2018) suggest that previous research on the curation of echo chambers focused on specific case studies and ignored the larger digital climate. As users, we engage with an array of platforms while also maintaining connections with individuals we may disagree with, such as extended family. Rather, as Dubois and Blank (2018) discuss, the real concern is the opinion leaders or influencers who curate ideological segregation and mistrust. This claim matches the concerns of Bruns (2019), who argues that the hyper-right has done an excellent job of discrediting the media and pushing other voices forward to serve its agenda. While I think these are valuable claims, I am not as quick to dismiss the existence of echo chambers and filter bubbles, or the role they play in how people perceive their ideas within the public sphere.


In all of this, we arrive at a conversation about trust, something that appears fragmented today. The hyper-capitalization of the news media has caused it to lose some credibility among the public, and this is only furthered by claims of fake news. Digital interference (such as Cambridge Analytica) and misunderstood conversations around free speech emphasize how trust within the digital space (and our lives in general) is something that requires evaluation. Part of our critical analysis is asking ourselves why we trust a source, and doing the work to be able to trust the information we hear.


In previous preliminary work I did on defining digital privacy, trust arose as a valuable keyword. In that work, trust played a dichotomous role: on the one hand, machines, computers, and code were seen as unbiased and trustworthy; on the other, the companies and corporations behind them were viewed skeptically. Despite extensive literature discussing how biased and untrustworthy code, AI, and machine learning can be, our culture has curated a trust in machines to output values and products that are true.


I think that echo chambers and filter bubbles do exist, but not in the all-encompassing manner that early authors suggest (Pariser, Negroponte, and Sunstein). I agree with the critiques of Bruns, Dubois, and Blank; however, aspects of these systems still exist and can influence behaviour, decisions, and opinions. While we are exposed to ideas and opinions outside of our personal beliefs, our current environments still support the majority of our perspectives, which paints outside material in a specific light. For example, I have friends within my social network who share content I disagree with, but it's limited compared to the other content that dominates what I see. We can also find videos that “portray the other side” but retain bias in their delivery, exposing us to content and ideology at the same time. Understanding this changes how we should view filter bubbles and echo chambers. They are more open and fluid systems, but they can still skew our perceptions of content, complementing the objectives of opinion leaders and the narratives that discredit bodies such as the media. We need to talk about both issues: the systems and the content within them.


In any case, the next question is always: so what do we do? Simple answers are to fact-check the claims and stories you read on social media, maintain a critical mindset toward the content you are fed, and seriously consider the arguments and opinions counter to your own. However, those suggestions put a lot of responsibility on you as a user. While digital literacy is an important skill, perhaps we need to include it alongside critiques of these platforms and how they function. Part of what this blog will do is explore ways to highlight these issues in a game, and to use the game to provide a counter-narrative promoting critical thought, the exchange of information, and open dialogue around these themes. I hope to further some of these questions and provide one potential option to help solve this growing digitally apocalyptic scenario (and yes, that is a tad dramatic).


References:

Bruns, A. (2019). Are Filter Bubbles Real? John Wiley & Sons.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656

Negroponte, N. (1995). Being digital. Vintage Books.

Sunstein, C. R. (2001). Echo Chambers: Bush v. Gore, Impeachment, and Beyond. Princeton University Press.

Additional Reading:

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

Rajan, A. (2019, March 4). Do digital echo chambers exist? BBC News. Retrieved from https://www.bbc.com/news/entertainment-arts-47447633

Taylor, A., & Sadowski, J. (2015). Digital Red Lining. Nation, 300(24), 24–27.
