
Understanding Young Adults' Collaborative Practices and Heuristics for Assessing Online Information


Core Concepts
Young adults employ a range of social and collaborative practices, as well as heuristics, to assess the veracity and meaning of online information, often in collaboration with algorithmic curation.
Abstract
The study explores the decision-making practices, challenges, and heuristics involved in young adults' assessments of information encountered online. The researchers conducted a digital diary study followed by data-informed interviews with 14 young adults (aged 21-34) in the UK.

Key findings:

Information Sensibility Practices: Young adults use a range of 'sufficing' behaviors, such as crowdsourcing credibility and using search engines to cross-reference news, to assess information. Familiarity with the source and alignment with personal worldviews are important factors in accepting the veracity of online information. These practices need to be re-learned and adapted during life transitions that expose participants to new sources and types of information.

Convenience as Emotional Wellbeing: Young adults are aware of the downsides of social media as a news source but still rely on it for convenience and enjoyment. They are willing to accept uncertainty and a loss of control as a price for convenience, enjoyment, and emotional protection from unwelcome information.

The Algorithm as Collaborative Agent in Assessments of Information: Young adults see algorithms that curate information on social media as unnerving and excitable, but also as shielding them from information they do not want to encounter. They try to configure and 'collaborate' with the algorithm to meet their information needs, developing theories about how to 'game' it.

The findings contribute to a deeper understanding of young people's information sensibility practices, challenges, and heuristics, highlighting the social and collaborative nature of their information assessment, as well as the role of algorithms as collaborative agents in this process.
Stats
"I then spent the time watching more reports and videos about where people were getting their source from, because I was like, 'Oh, he seems like quite a nice guy.'"

"Maybe if it was something really serious, I would even look at the comments, if anyone's commented on it being a real thing."

"I think I would put in quite a general term [on a search engine]. Maybe if what I was looking for didn't come up immediately, I would say, 'I'm not that bothered.' Then I wouldn't search for it further."

"If I haven't seen anything in the mainstream media but see something mentioned on Twitter, that's when I think, 'Oh that doesn't sound right, that doesn't match up with other things that I've read.'"

"I don't know why I just assume it's correct. I don't know. I think it's because it's… It's probably because their politics seems to be quite similar to my own. So, because it's backing up what I believe and think, I'm like, 'Oh, this must be right,' which is probably a bit stupid."

"I think, at the moment that's something that does happen to me, quite a bit, because I'm going into a new phase of my life where I'm going to be a parent, and therefore I'm more easily click-baited by stuff I like."

"I quite like it. I think there's a lot of, 'How much are they listening to? How much do they know?' But like at the end of the day it makes my experience more enjoyable."

"I think as I've grown up with social media being more and more used for everything. I think it makes the news quite polarised. It's quite pessimistic, you see most of the bad things. Inciting hate and wanting people to engage in things, so making them as clickbait-y and controversial as possible. I don't like that's where social media has brought news to, but that is still the place I go to for news."

"I think in a way, it's really good and healthy, because I get the kind of news that I'm interested in, and that's the news that I get to read, and I don't- if there's some kind of news that I don't like to consume, which is maybe about entertainment, or news and things I don't like to think about at all. And then I don't get those recommended, which is something I prefer a lot more."

"If I'm doing a lot of searches on food recipes, then I know that my algorithm is going to change to show me lots of other food recipes. That makes me quite happy […] The only time it gets annoying is when I've had enough of that thing, and I can't get rid of it from my algorithm."

"There have definitely been times where I've noticed something and I'm like, 'That is actually really creepy.' I'm not massively into the idea that, like, for example, if a friend shared a link with me and then all of a sudden, I'm getting ads for something, I'm like, 'That is a little bit creepy.'"

"Sometimes, I'll read through the thread, because there are interesting arguments in the comments, but I wouldn't ever click on the news source. Because I don't want to give it the click and give it the validity of having gone into it and looked at the article."

"Try and scroll past it as quickly as, or don't click on it. Yes, just try and not engage with it and hope that it picks up that I've had enough. Which it normally does."
Quotes
"'Everything I've ever learnt about Meghan and Harry [i.e. members of the British Royal Family] has been against my own will,' which I kind of agree with. It's like, I have never sought out information on those people, but I know more about their marriage than I do about my parents' marriage."

"An app or a website, it's just set up differently, or in the way that I think of it. Where it has these top five, top ten, it's more likely to have banners relating to what everyone wants to click on rather than what I want to look at."

Deeper Inquiries

How can the design of social media platforms and information ecosystems better support young people's collaborative and social information assessment practices?

In order to better support young people's collaborative and social information assessment practices, the design of social media platforms and information ecosystems should focus on several key aspects:

Transparency and Control: Providing users with more transparency into how algorithms curate their information feeds and offering explicit controls for users to customize their content preferences. This can help users understand and potentially influence the information they are exposed to, fostering a sense of agency in their information consumption.

Diverse Perspectives: Encouraging exposure to diverse perspectives and sources of information to combat filter bubbles and echo chambers. By promoting a variety of viewpoints, platforms can help users develop a more comprehensive understanding of complex issues and avoid the pitfalls of confirmation bias.

Collaborative Tools: Introducing features that facilitate collaboration and discussion around shared information. This could include built-in tools for fact-checking, annotation, and sharing, allowing users to engage in critical conversations and collectively assess the veracity of information.

Emotional Wellbeing: Prioritizing emotional wellbeing by offering users the ability to customize their content based on emotional triggers. This could involve content warnings, filters for sensitive topics, and tools for managing exposure to potentially distressing information.

Education and Guidance: Providing educational resources and guidance on digital literacy, critical thinking, and information evaluation. By empowering young people with the skills to navigate the information landscape effectively, platforms can help them make informed decisions and resist misinformation.

By incorporating these design principles, social media platforms and information ecosystems can create a more supportive and conducive environment for young people to engage in collaborative and social information assessment practices.

What are the potential risks and downsides of young people's reliance on algorithmic curation for their information needs, and how can these be mitigated?

While algorithmic curation can offer personalized and tailored content experiences, there are several risks and downsides associated with young people's reliance on algorithms for their information needs:

Filter Bubbles: Algorithms may reinforce existing biases and preferences, leading to the creation of filter bubbles where users are only exposed to information that aligns with their beliefs. This can limit exposure to diverse perspectives and contribute to echo chambers.

Manipulation and Misinformation: Algorithms can be manipulated by bad actors to spread misinformation and disinformation. Young people may unknowingly consume false or misleading information that has been amplified by algorithmic recommendations.

Lack of Transparency: Algorithms operate based on complex and often opaque processes, making it challenging for users to understand why certain content is shown to them. This lack of transparency can erode trust and agency in the information ecosystem.

To mitigate these risks, platforms can implement the following strategies:

Algorithmic Transparency: Enhance transparency around how algorithms work and why certain content is recommended to users. Providing clear explanations and controls can help users make more informed decisions about their information consumption.

Diverse Content Promotion: Actively promote diverse and credible sources of information to counteract the effects of filter bubbles. Platforms can prioritize content that offers a range of perspectives and encourages critical thinking.

User Empowerment: Empower users with tools to customize their content preferences, adjust algorithmic recommendations, and actively engage in information assessment. By giving users more control over their information environment, platforms can promote a sense of agency and autonomy.
By addressing these challenges and implementing proactive measures, platforms can mitigate the risks associated with young people's reliance on algorithmic curation for their information needs.

What role can digital literacy education play in helping young people develop a more critical and nuanced understanding of the information landscape, including the role of algorithms, while also preserving their emotional wellbeing?

Digital literacy education plays a crucial role in equipping young people with the skills and knowledge needed to navigate the complex information landscape effectively. Here's how digital literacy education can support young people in developing a more critical and nuanced understanding while preserving their emotional wellbeing:

Critical Thinking Skills: Digital literacy education can teach young people how to critically evaluate information, fact-check sources, and discern between credible and unreliable content. By fostering critical thinking skills, individuals can make informed decisions about the information they encounter.

Algorithm Awareness: Educating young people about how algorithms work, their impact on content curation, and the potential biases they may introduce can help individuals understand the role of algorithms in shaping their information environment. This awareness can empower users to navigate algorithmic recommendations more effectively.

Emotional Resilience: Digital literacy education can also focus on building emotional resilience and coping strategies for dealing with potentially distressing or misleading information. By promoting emotional wellbeing and self-care practices, individuals can protect themselves from the negative effects of misinformation and online content.

Ethical Considerations: Teaching young people about digital ethics, privacy, and responsible online behavior can help them engage with information in a conscientious and ethical manner. Understanding the ethical implications of sharing, consuming, and interacting with online content is essential for fostering a healthy digital environment.

Collaborative Learning: Encouraging collaborative learning and discussion around information assessment can enhance young people's understanding of complex issues and promote collective sense-making. By engaging in collaborative activities, individuals can benefit from diverse perspectives and collective intelligence.
Overall, digital literacy education plays a pivotal role in empowering young people to navigate the digital world with confidence, critical thinking, and emotional resilience. By integrating these principles into educational curricula and online resources, individuals can develop a more holistic understanding of the information landscape while safeguarding their emotional wellbeing.