Platforms such as Google, Facebook, Instagram, and TikTok all have their own closely guarded algorithms that personalise the content they show us – two people entering the exact same search terms or scrolling through the same social media platform are likely to see different content. The results we are presented with tend to reflect the likes and interests that our browsing history and personal data suggest we want to see more of – after all, online platforms want you to use their site and to stay for as long as possible.
With so much content on the internet, these algorithms are used to reduce the volume of information and filter what is displayed to each user. There are benefits, such as making sites we regularly use faster to access or making it easier to find information we are interested in, but it is also important to understand how these algorithms influence the type of content we are likely to see. For example, searching for exercise tips or liking a cute cat video makes it more likely that you will see related content in the future, and if you browse for a pair of trainers, you may see advertisements for those shoes on other websites you visit. All of this information builds up a picture of who you are online.
While this is not necessarily a bad thing, it is important to know that the content pushed to your newsfeed is filtered and tailored according to what a social media network or online platform believes you are interested in – or would like you to become interested in. One of the drawbacks is that we can very quickly get caught in a feedback loop: what we see are variations of the same thing, while alternative views and opinions are filtered out. This is sometimes called a filter bubble. Not seeing alternative points of view can affect our ability to think critically about content, make us less open-minded, and give us a distorted picture of the world. And if there is content or a theme that makes us feel bad or affects our self-esteem, being presented with more of the same will not make us feel better – in fact, it may make us feel worse!
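To make the feedback loop a little more concrete, here is a deliberately simplified sketch in Python. It is not any platform's actual algorithm – every name and number in it is invented for illustration – but it shows how ranking content by past engagement keeps pushing more of the same to the top of a feed.

```python
# Toy sketch of engagement-based ranking (not a real platform's algorithm).
# All names and values here are invented purely for illustration.

interest_profile = {"fitness": 0.0, "cats": 0.0, "politics": 0.0, "cooking": 0.0}

posts = [
    {"id": 1, "topic": "fitness"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "cooking"},
]

def score(post):
    # Posts on topics the user has engaged with before get a higher score.
    return interest_profile[post["topic"]]

def show_feed():
    # Rank posts by predicted interest, most "relevant" first.
    return sorted(posts, key=score, reverse=True)

def record_like(post):
    # Each like strengthens that topic, so similar posts rank higher next time.
    interest_profile[post["topic"]] += 1.0

# Simulate a user who keeps liking whatever appears at the top of the feed.
for day in range(3):
    feed = show_feed()
    record_like(feed[0])
    print(f"Day {day + 1}: feed order = {[p['topic'] for p in feed]}")

# After a few rounds, the same topic dominates the top of the feed –
# a minimal illustration of how a filter bubble can form.
```

In this toy version, whatever the user happens to engage with first is reinforced each day and crowds out everything else, which is the same self-reinforcing pattern the filter bubble describes – just on a much smaller scale.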
What can you do?
What is displayed in your newsfeed or search results is determined by the algorithm of the platform you are using. It is based on several things, such as your interests and how engaging the content is, but the exact details of how these algorithms work are largely unknown.
However, there are steps you can take to broaden your perspective and vary the content you see.
- Keep an open mind – Be aware that what you see online has been tailored to your preferences: online algorithms filter what you see and what you don’t see in order to hold your interest. If an online platform is highlighting a piece of content, why might that be? Is it simply because you are likely to be interested in it?
- Search for new perspectives – Look for new perspectives and opinions on topics you are interested in. That way, you will start to see points of view that differ from what you are used to.
- Vary your sources of information – Find new trusted sources of news and information.
- Seek out the positive – If a topic is bothering you, unfollow or hide it and look for positive, healthier content to fill your newsfeed.
- Refresh your settings – Regularly clear your browsing history and cookies, and consider turning off targeted ads. Find out how on the Webwise website.
With so much information online, it can feel overwhelming, and algorithms help filter down the volume of content – but it is important to think critically about what we do see. Carl Miller, Research Director of the Centre for the Analysis of Social Media, has suggested some useful tips for avoiding online manipulation.
Find out more about the work of the Irish Safer Internet Centre, including its awareness raising, helpline, hotline and youth participation services – or find similar information for Safer Internet Centres throughout Europe.