This post first appeared on Thesislink in October 2017.
In one of my thesis drafts, I wrote that most people would disapprove of the exploitation of marginalised groups. I didn’t have any particular source for that claim; I wasn’t citing a survey finding or anything of the sort. I had simply judged it a safe assumption that most people would be generally anti-exploitation.
My supervisor objected. I couldn’t assume anything, he said. Maybe ‘most people’ would disapprove of exploitation… but I couldn’t be certain how universal that opinion really was.
The thing is, I had been certain. My friends and family were all anti-exploitation. My Twitter and Facebook feeds were full of political statements and protest movements that countered various forms of exploitation. Anti-exploitation sentiment was everywhere, as far as I could see.
I had been fooled by the echo chamber effect.
What happened was this: I disapproved of the exploitation of marginalised groups, and I had surrounded myself with friends who felt the same way. I had inadvertently created a social (and, especially, a social media) ‘bubble’. My friends expressed beliefs that (usually) echoed and reinforced my own. This is called the ‘echo chamber’ effect, and it gives us an incomplete view of the state of public opinion.
My echo chamber told me that everyone was anti-exploitation. But that’s not true. People define exploitation differently, and hold different perspectives on it. From inside my echo chamber, I heard only a tiny fraction of what was, realistically, a much wider and more diverse spread of thoughts and opinions.
The echo chamber effect has a massive impact on what we view as ‘normal’, and for researchers it can deepen our biases. Biases are nothing new – everyone, researcher or otherwise, has them, and so there is no such thing as truly objective research. But the echo chamber effect can make it hard to recognise our biases, and to see around them. (This is particularly obvious after elections. If your social media bubble has a left- or right-wing bias, it may start to seem like ‘your side’ is much more popular than the other. Then, if the result doesn’t go that way, it can be quite shocking.*)
Another danger of the echo chamber is that it is full of people who are predisposed to agree with us. If you want to test out an idea – which is often tempting, especially for those of us in the humanities and social sciences – putting it ‘out there’ on social media, or within your existing social networks, is not a true test. Our networks may not be perfectly homogeneous, but they are certainly not representative either.
So how do we manage this, as researchers?
Firstly, we have to recognise the bubbles that we are in. Whose voices fill your echo chamber? What might their biases be? Are your social media behaviours (following particular pages or types of people) amplifying the echo chamber effect? Are you unfollowing people who disagree with you? What does that do to your view of the diversity of thought?
Secondly, we have to take our ideas outside our particular bubbles, and into the wider atmosphere. This goes for social media, but for intellectual bubbles too. For example, conferences – while terrific places to test and disseminate ideas within academia – can act as their own kind of bubble, because they are populated by like-minded intellectuals. Outside our ‘safe’ and agreeable academic networks, there is a much more representative community ready to respond to our work and contribute their diverse thoughts and opinions. This means testing and presenting ideas as broadly as possible: across disciplines, and even outside academia altogether.
Break out of those echo chambers, and discover what you can hear outside!
*This post-election shock is such a common phenomenon that it inspired a browser extension called PolitEcho, which analyses the political bias of your Facebook feed. It’s designed for US users (and was developed after the 2016 presidential election) but can be interesting for non-US users as well.