Humanities, Philosophy

AI and Social Media Literacy

By Sarah Allen
Cohort 2021-2022

INTRODUCTION

“Do you think that the possession of all other things is of any value if we do not possess the good? Or the knowledge of all other things if we have no knowledge of . . . goodness?” (Plato, Republic, 505b)

I introduced this teaching module in the context of my Fall 2021 Humanities, Knowledge and Wisdom course. The goal of the module is to help students reflect on: (i) the effects of digital technology on how we access and define knowledge; and (ii) the best use of digital technology for contributing to a life well-lived. As the title of the module suggests, it focuses on how social media in general, and the use of AI by social media companies in particular, impact our access to, understanding of, and use of what we count as knowledge. Students are invited to engage in critical reflection on social media and AI, as relatively recent forms of technology that dominate our information, communication, and social landscapes. This critical reflection falls within a long Humanities tradition of bringing scientific and technological developments together with ethical reflection on whether and how these developments can be used in ways that benefit (or at least do not harm) human flourishing.

DIGITAL TECHNOLOGY AND THE REDEFINITION OF KNOWLEDGE

Before addressing social media and AI, the unit starts with a broad reflection on how digital technology in general has transformed how we access knowledge, what we define as knowledge, and who gets to decide what counts as knowledge. This reflection is based on David Weinberger’s book Too Big to Know. There are many relevant and interesting chapters in this book that could be useful readings for this part of the module; I focus especially on Chapters 1 and 3. In these chapters, Weinberger distinguishes between what he calls “traditional knowledge” and “networked knowledge.”

  • Traditional knowledge is based mainly in paper media and is subject to curatorial filtering carried out by people in positions of power and expertise: the majority of available information and views get filtered out, and only a small fraction makes it through. Traditional knowledge in this sense is understood as justified beliefs constituting a (more or less) agreed upon body of shared truths. Weinberger argues that, while traditional knowledge still has its place in our society, the dominant view of knowledge has shifted towards networked knowledge.
  • Networked knowledge is accessed through digital media and goes through a filtering process that pushes forward information privileged by one’s online social networks and by algorithmic filters. There is a greater amount and diversity of information available, but the verification and justification processes that lead us to count this information as knowledge have become much blurrier. As such, while networked knowledge can still be defined as made up of beliefs that are justified in at least some sense, there is a fracturing of the traditional body of knowledge and much less agreement on what counts as shared truths.

JOURNALING EXERCISE: STUDENT SOCIAL MEDIA USE

The topic of social media literacy is broached by having students journal on their own social media use. A great resource for coming up with journaling activities on social media is the Youth Toolkit from the Center for Humane Technology. I use Unit 5, especially the questions in Part 2, “What role does social media play in your life?” Beyond this journaling exercise, the whole website of the Center for Humane Technology is worth exploring for ideas and material related to critical thinking and digital technology, especially social media.

CRITICAL THINKING AND SOCIAL MEDIA LITERACY: THE FIVE QUESTIONS

The module then explores social media literacy through five guiding questions:

  1. What are they?
  2. How do they work?
  3. Who is in control?
  4. How are they beneficial and/or harmful?
  5. How can we use and regulate them properly?

Before addressing these questions in class, we watch the documentary film The Social Dilemma (2020, Netflix) together, and students take notes on the kinds of answers the documentary provides to questions 1 through 4. We then go through the questions together over several classes, using the following PowerPoint slide presentation: “AI and Social Media Literacy Slides.” I’m not going to say much here about question 1, as the material is pretty straightforward if you watch The Social Dilemma and look at the slides. I do want to elaborate, however, on how questions 2 through 5 are addressed.

QUESTION 2: AI AND HOW SOCIAL MEDIA WORK

The Social Dilemma emphasizes how the information we see on our social media feeds is determined to some extent by our own personal preferences and our community of contacts, but in a more important and less transparent sense by AI-powered data collection, driven by algorithms designed to optimize user engagement for the business profits of social media companies. The full picture of how this works is not made available to us by social media companies, but here are some resources that explain the role and use of AI in social media in an accessible way:

  • Hodgson, Ashley, “,” YouTube. This is a fun and accessible, though heavily simplified, account of the development of social media algorithms. Hodgson, a professor of Economics at St. Olaf College, also has many other interesting videos related to social media that might be worth exploring on her YouTube channel: “.”
  • Youth Toolkit: The first four units of the Youth Toolkit (“1. The Attention Economy,” “2. Persuasive Technology,” “3. Social Media & the Brain,” and “4. Seeing the Consequences”) address the ways in which social media platforms are designed to be addictive (persuasive technology) and to keep users on the platforms for as long and as often as possible.

The above two sources are ready-made resources, easy to incorporate directly into teaching on this topic. To explore the addictive and persuasive side of social media design in more depth, some further readings that could be interesting are:

  • Alter, Adam, Irresistible: This book gives a general account of behavioural addiction, and then looks at how behavioural addiction can be, and often is, engineered through digital technology. Alter also looks at some interesting solutions for countering the problem of digital addiction. It is a well-written and very readable book. Though I haven’t assigned parts of it to students yet, it would be a good addition to course readings for this unit, especially if you want to focus on the issue of addictive design.
  • Fogg, B. J., Persuasive Technology: This book defines and discusses the behavioural science behind persuasive technology. While it is in some senses quite dated (mobile technology has changed a lot since 2003), its basic ideas are still applicable. The most interesting passages of the book for this module are found in Chapter 8 (“Increasing Persuasion through Mobility and Connectivity”) and Chapter 9 (“The Ethics of Persuasive Technology”), especially the sections on the persuasive advantages of computers over humans and on the ethical ambiguity of behavioural science techniques used in persuasive technology, like operant conditioning and surveillance. The text itself may be a bit dry to assign directly to students, but it is definitely informative for teaching on persuasive technology. Fogg is the main source for the definition of persuasive technology discussed in The Social Dilemma and in the above-mentioned Youth Toolkit units from the Center for Humane Technology.
  • Wall Street Journal, The Facebook Files (2021): This series of articles documents various problems with Facebook’s (now Meta’s) social media platforms, including some of the negative effects of these platforms on teens, how Facebook’s algorithm promotes angry and divisive content, and how far AI can and cannot go in cleaning up content on Facebook. While a subscription is necessary to access the articles themselves, many of the articles have related podcasts (with written transcriptions) that can be accessed for free. Here are links to a few of the podcasts:
    • ““
    • ““

QUESTION 3: WHO IS IN CONTROL?

A question closely related to how social media works is who is actually in control on social media platforms. This is addressed quite well in The Social Dilemma. While user choices and preferences determine what users see on their platforms to a certain extent, it is worth having students reflect on the great asymmetry of power between individual users and the social media AI that collects vast amounts of information about them in order to keep them on platforms and get them to behave in ways that benefit the social media companies and the advertisers who fund them. The topic of addictive design leads into the topic of engineering user behaviour and the loss of user autonomy. Another question worth exploring with students is what happens when social media algorithms have unintended consequences that were not foreseen (or purposely programmed) by companies and their programmers. Is there a sense in which the development of AI technology is pushing the boundaries of human control over that technology itself? (I use the sources mentioned for Question 2 to address Question 3 as well.)

QUESTIONS 4 & 5: HARMS, BENEFITS AND BEST PRACTICES

In the final part of the module, questions 4 (harms and benefits of social media) and 5 (best uses and regulation) are addressed together. For these questions, I use several small group in-class discussion activities described in what follows (and also found in the “AI and Social Media Literacy Slides” above). For each of these activities, the class is broken up into small groups of 4 to 5 students to discuss the assigned questions or tasks for 10 to 15 minutes; we then reconvene as a large group and put the results of the small group discussions together.

We start this last part of the module, then, with such an in-class small group discussion activity where students are asked to complete two tasks:

  1. Reflect on what Tristan Harris in The Social Dilemma means when he describes social media as “simultaneous utopia and dystopia”; and
  2. Make a list of the potential harms and benefits of social media in their own life and in our society as a whole, based on their previous reflections on their own personal social media use (the journaling exercise from the beginning of the module).

Students are then assigned a reading from a last source: Ronald Deibert’s Reset. Chapter 5 (“Retreat, Reform, Restraint”) is of most interest for this module. It addresses some of the harms and benefits of digital technology in general, including social media, and explores various ways the use of digital technology can be modified and controlled to decrease harms and increase benefits. Deibert begins by presenting these ways in a somewhat disparate fashion under the rubrics of: (i) retreat, including practices such as taking breaks from digital technology and creating digital-technology-free spaces; and (ii) reform, including greater governmental regulation, the promotion of ethical design, and greater corporate social responsibility for social media and other Big Tech companies. Ultimately, however, Deibert argues that we need to completely reset our approach to digital technology by creating a guiding ethical and political framework for its use. He thinks the foundational principle of this framework needs to be restraint, both in the sense of self-control in our own personal use and in the sense of limiting the power of the government institutions and companies that control digital technology. The other chapters of the book are also worth mentioning. While I did not use them for this particular module, they address digital technology in relation to related themes like consumer culture and surveillance capitalism (Chapter 1), digital addiction (Chapter 2), abuse of power by police and autocratic governments (Chapter 3), and environmental costs (Chapter 4).

Once students have read Chapter 5 of Deibert, we discuss the harms and benefits of social media, especially for our understanding of and access to knowledge. Important themes that come up are: the way that algorithmic feed personalization leads to different versions of reality (the fracturing of a shared body of truths); the spread of fake news; and if and how social media companies and their algorithms can mitigate these problems.

Students then complete another in-class small group discussion activity where they address the following questions:

  1. What are some ways you can manage your own social media use in order to enhance the benefits and decrease the potential harms of social media?
  2. What solutions does Deibert consider to the problems posed by social media?

In answer to question 2, we explore Deibert’s four Rs of digital technology use mentioned above: retreat, reform, reset, and restraint.

In a final in-class small group discussion activity, students are asked to build on Deibert’s proposed ethical and political framework based on his liberal/republican conception of restraint by addressing the following questions:

  1. What ethical and/or political principles would you like to see at work in a guiding framework for how digital technology is designed and used in our society?
  2. What social values and goals do you think such a framework should promote?

Though we did not have time to do it in the Fall 2021 implementation of this module, a natural place to go from here would be to look at actual guiding frameworks for the ethical design and use of digital technology that already exist in our society and other ethically and politically similar societies.

FINAL PROJECT

In Fall 2021, students wrapped up this module by completing a final essay project for the course on the question: How have social media transformed our understanding of and access to knowledge? Here are the general guidelines students were given: Critical Reflection Essay Guidelines.

FINAL OBSERVATION

Because social media is such a big part of our students’ (and our own) lives, I found that discussing social media was a very accessible way to get students to think about what AI is and how it impacts their daily lives. Students were able to contribute their own personal experience with social media to the discussion of this topic and they showed a fairly high level of awareness of the effects of social media on their lives (e.g. its addictive design) and the ethical issues raised by the dominance of social media in our information, communication and social landscapes.

ACKNOWLEDGEMENTS

Thank you to Robert Stephens for his unobtrusive but always on-point way of leading the Fall 2021 AI Teaching Community of Practice. Thank you to Myriam Dimanche for her humorous contributions and administrative support of the CoP. Thank you to all my fellow participants for their inspiring presentations, source suggestions, comments, and questions: Patricia Campbell, Michel Fournier-Simard, Daniel Goldsmith, Andrew Katz, and Kasia Wolfson.

LIST OF RESOURCES

Alter, Adam. Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. New York: Penguin Books, 2017.

Deibert, Ronald J. Reset: Reclaiming the Internet for Civil Society. CBC Massey Lectures. Canada: Anansi Press, 2020.

Fogg, B. J. Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann, 2003.

Harris, Tristan, et al. Center for Humane Technology, 2021, www.humanetech.com/

Hodgson, Ashley, “,” YouTube, July 11, 2021, www.youtube.com/watch?v=zzorwqPW7yM

Hodgson, Ashley, “,” YouTube, www.youtube.com/channel/UChTaQi88607D84YEg44It_g

Linebaugh, Kate, and Ryan Knutson. The Facebook Files: A Podcast Series. The Wall Street Journal, 2021, www.wsj.com/articles/the-facebook-files-a-podcast-series-11631744702

Orlowski, Jeff. The Social Dilemma. Exposure Labs, 2020. Netflix, netflix.com/title/81254224

Weinberger, David. Too Big to Know: Rethinking Knowledge Now That the Facts Aren’t Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room. New York: Basic Books, 2011.