Main Body

The Attention Economy & Algorithmic Search

“As users engage with technologies such as search engines, they dynamically co-construct content and the technology itself.” – Safiya Umoja Noble

[Image: Screenshot of a Google search recommendation correcting “herself” to “himself.”] Credit: Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism.

Overview

In the prior readings, you explored the theoretical construct of a mediatized communication ecology and its relationship to the human construction of reality.

In this chapter, we focus on the digital systems within our mediatized ecology and the algorithms that determine what you see as you engage with them. If you have studied information literacy in a prior course, you likely focused on the content of information, such as its objective validity and reliability. In this area of study, by contrast, we are interested in how that information appeared in front of you in the first place.

Why does this matter? Because popular search engines like Google are where many users go to confirm “the real truth” instead of relying on other media institutions or experts. To many, a Google search is an operationalized method of “doing your own research.”

The problem with this method is the presumption that the top search results or most popular selections affirm the validity of a given answer. From the user’s perspective, digital systems are presumed to be objective arbiters of fact (Sundin & Carlsson, 2016). However, this is not the case. The results of search queries reflect a combination of factors, including what other people have selected in similar search queries.

“Knowledge is not simply conveyed to users, but co-produced by the search engine’s ranking systems and profiling systems, none of which are open to the rules of transparency, relevance and privacy in a manner known from library scholarship in the public domain” (van Dijck, 2010, p. 575).

When you trace the flow of information in algorithmically optimized systems, you find that “truth” in digital network systems isn’t about the validity of information; it is about what the majority of people believe is true. As more people accept top search results as objective proof of a given body of knowledge, they create a feedback loop that elevates popular content to a higher state of relevance regardless of its quality. In fact, search results are designed to be subjective and are, to a degree, determined by who you are, your interests, where you have recently navigated on the Internet, and where you are physically located.
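
To make the feedback loop concrete, here is a minimal sketch in Python (purely illustrative; no real search engine is this simple, and the click counts are invented). Whichever result starts with a small lead in clicks attracts still more clicks and never loses the top spot, regardless of its quality:

    # Hypothetical click counts for two results competing for the top spot.
    results = {
        "well-researched article": 10,
        "viral but unreliable post": 12,
    }

    def ranked(results):
        # Rank purely by click popularity, descending.
        return sorted(results, key=results.get, reverse=True)

    def simulate_clicks(results, rounds):
        for _ in range(rounds):
            top = ranked(results)[0]  # most users click the first result...
            results[top] += 1         # ...which raises its score further

    simulate_clicks(results, rounds=100)
    print(ranked(results))  # the early leader stays on top, whatever its quality

Popularity is only one signal among many in production systems, but any ranking signal that is fed by the ranking’s own output behaves this way.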

Our study of algorithmic systems suggests that there is more to “knowing” than the traditional analysis of information for its credibility (i.e., information literacy, critical thinking, etc.). We must examine how the information we find is the output of complex calculations that differ for every user, by design.

The Attention Economy: We will also explore the algorithmically driven strategies found in social media systems that serve the Attention Economy. The Attention Economy is (among several definitions) a model of commercial activity designed to sustain the user’s attention for as long as possible in order to capture user data for serving targeted advertisements and content. One strategy for achieving platform “stickiness” is to use algorithms that make users feel as though they belong to a community of like-minded people, which influences their desire to connect, engage, share, and “stick” on the system for as long as possible.

While the idea of a personalized Internet or app experience is a good value proposition for users (especially since they can use these apps at no cost), there is a potential for users to experience an echo chamber effect that affirms and intensifies a singular worldview. When users co-construct online communities based on uniform worldviews, there is a greater risk of tribalistic divisiveness.

Key Terms

Algorithm – A set of mathematical rules that calculate an outcome according to the logic designed in the code. Algorithms are created by humans. There are many variations of this definition according to the context of its use. This article by the MIT Technology Review offers an expanded analysis.
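
For readers who have never seen an algorithm written out, here is a deliberately tiny, hypothetical one in Python. Note that every threshold is a human design choice: change the rules, and the “calculated” outcome changes with them.

    # A fixed set of human-written rules that always turns the same
    # input into the same output. The cutoffs are design choices.
    def letter_grade(score):
        if score >= 90:
            return "A"
        elif score >= 80:
            return "B"
        elif score >= 70:
            return "C"
        else:
            return "F"

    print(letter_grade(84))  # "B" -- an outcome determined entirely by the rules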

Homophily Principle – A term used to describe the psychological tendency of people to gravitate towards other people with whom they share similar worldviews, values, interests, etc. The diagram below shows how social media systems recirculate user engagement data to inform both the construction of affinity groups and the algorithms that predict which content will be most interesting to each group (to foster further engagement and data collection).

The effects of homophily-based design are mixed: there is variation in the design of social media systems (such as reciprocal/non-reciprocal connections) as well as in the degree to which any set of partisan content causes an individual to migrate their worldview in one direction or another.

The more important takeaway from this research is to acknowledge that the content one encounters in all forms of social media is co-constructed according to the Homophily Principle, not drawn from a purely objective landscape of content.

[Image: Diagram of social media use and information source consumption.] From “Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption” (Kitchens, Johnson, & Gray, 2020)

Personalized – In the study of algorithmically optimized systems, personalization refers to the response of digital systems to the characteristics of the user and their past navigation behavior. For example, if you search for an item on Amazon, a digital cookie is saved in your browser; a tracker embedded on the next Web page you visit retrieves that cookie and uses it as a basis for serving you ads in the sidebar. You will often see ads for items you just searched for because the ads are personalized to your (apparent) interests. While no personally identifiable information is transferred in this process, a person who uses the same computer or device for all of their online navigation will accumulate a body of metadata (data about a “person,” not specifically you) that serves as a profile signature to inform how an algorithmically optimized system should respond.
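
A simplified sketch of this flow, in Python: the names here (record_search, pick_ad, the cookie ID) are invented for illustration, and real ad delivery involves many intermediaries, but the basic mechanism is the same. A pseudonymous ID accumulates behavioral metadata, and that profile, not your name, selects what you see.

    # Server-side metadata, keyed by a pseudonymous cookie ID (not a name).
    profiles = {}

    def record_search(cookie_id, query):
        # A tracker on the page logs the search against the cookie's profile.
        profiles.setdefault(cookie_id, []).append(query)

    def pick_ad(cookie_id, ads):
        # Serve the ad whose keyword matches the profile's most recent interest.
        history = profiles.get(cookie_id, [])
        for keyword, ad in ads.items():
            if history and keyword in history[-1]:
                return ad
        return "generic ad"

    record_search("cookie-1234", "hiking boots waterproof")
    print(pick_ad("cookie-1234", {"hiking": "Ad: 20% off trail boots"}))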

If you like, read more about metadata and tracking in this open e-book chapter, Metadata, Tracking and the User’s Experience.

Note: There are many other Key Terms to review in the S.1896 – Algorithmic Justice and Online Platform Transparency Act resource listed below.

What should you be focusing on?

The purpose of this week’s study is to comprehend how personalized content appears in an individual’s channels of communication according to the purposeful designs of commercially driven interests.

Your goal is to be able to explain how it is possible for individuals to construct a certain perception of reality according to the design of various channels of mediated communication they encounter.

Readings & Media

Thematic narrative in this chapter

In the following readings and media, the authors will present the following themes:

  1. Search algorithms are created by humans who bring their cultural biases to their design. As such, algorithms may serve as a codification of bias.
  2. Search algorithms do not produce objective results; Internet content is personalized according to profit-driven strategies.
  3. Algorithmic systems can limit the breadth of information people may encounter.
  4. Transparency in algorithmic design is a double-edged sword, though there are solutions to consider if we determine that social media should operate in the public good.

Required: S.1896 – Algorithmic Justice and Online Platform Transparency Act (2021)

5:00 minute read. This congressional bill describes a set of Findings that justify legislative consideration to address the impact of algorithmic systems on the public. While its focus is primarily on responding to discriminatory harm, the Findings section summarizes the evidence that we will be studying as part of this course. Read:

  • Section 2. Findings
  • Section 3. Definitions

Required: “Personalization Algorithms: Why it matters and how it impacts people” (2021).

10:00 minute read. This article summarizes the various use cases for algorithmically optimized personalization including both the benefits and risks to consumers. The rationale for employing these strategies is fairly obvious: It’s good for business.

Required: Algorithms of Oppression: How Search Engines Reinforce Racism, Chapter 5 (excerpt) – Safiya Umoja Noble (2018).

20:00 minute read. Read the sub-chapter Search as a Source of Reality. You may need to scroll up or down to locate the beginning of this sub-chapter within the e-book.

Noble’s research in this area is centered on the premise that “…information provided to a user is deeply contextualized and stands within a frame of reference” (Noble, 2018, p. 108), which points to the need to deconstruct the historical philosophy that informs how information should be classified (and thus retrieved). Noble goes a step further than merely stating that contemporary search engines are biased towards commercial interests: they are biased towards reflecting a cultural frame of reference. Thus, algorithmic search is more than just personalized for us; it is recursively “culturalized” as well. (For further reading to support this proposition, skim the chapter “Searching for Black Girls.”)

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.
Connect with Safiya Umoja Noble on Twitter: @safiyanoble

Required: “Searching for Alternative Facts: Analyzing Scriptural Inference in Conservative News Practices” – Francesca Tripodi (2018). Excerpt: “Googling for Truth”

15:00 minute read. This is an excerpt from a larger report, which you can access directly from the Data & Society Research Institute website. While the paper as a whole focuses on the news engagement habits of self-identified Conservatives, the section we are interested in generalizes to anyone who uses a search engine to seek information.

This chapter describes how people employ Google search to “do their own research” as a way to circumvent the perceived biases of news media and other institutions.

Pay particular attention to the importance of how an initial search query is phrased. As you will see in later studies, if it is possible to influence how a search query is phrased, it is possible to control the search results (a toy illustration follows the citation below).

Tripodi, F. (2018, May 16). Searching for Alternative Facts: Analyzing Scriptural Inference in Conservative News Practices. Data & Society. https://datasociety.net/library/searching-for-alternative-facts/
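
A toy sketch in Python, with an invented three-document corpus, makes Tripodi’s point mechanical: two sincere phrasings of the “same” question retrieve different top documents, because keyword ranking rewards term overlap, not truth.

    # A made-up corpus standing in for the Web.
    corpus = {
        "doc1": "study finds coffee healthy in moderation",
        "doc2": "coffee health risks linked to excess caffeine",
        "doc3": "how to brew coffee at home",
    }

    def search(query, corpus):
        # Score each document by how many query terms it contains.
        terms = set(query.lower().split())
        scores = {doc: len(terms & set(text.split())) for doc, text in corpus.items()}
        return max(scores, key=scores.get)

    print(search("is coffee healthy", corpus))    # doc1 -- an affirming result
    print(search("coffee health risks", corpus))  # doc2 -- an alarming result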

Required: “Behind the Filter Bubble” – Politicology Podcast, May 25, 2022. (Look in the upper right corner for the red “Play Episode” button.)

45:00 minute listen. This podcast draws on the expertise of Stanford Professors Mehran Sahami and Rob Reich, who discuss the difficulty of reconciling the need for businesses to operate according to their business models with the reality that a small group of private companies serves as the de facto public sphere of discourse.

What does it mean when opaque algorithms are presumed to be serving a public good? What would happen if there were total transparency of the algorithms? Would it make things better or worse? Below is a list of the topics covered.

The first two topics in the podcast repeat subject matter already covered in prior chapter resources, but they are articulated so well that it may help to solidify your understanding of the issues. Note that the player controls on this webpage include speed control if you prefer to play back the content faster than 1X speed.

  • 02:21 – What do social media algorithms do and how do they impact what you see on platforms?
  • 05:01 – The embedded values on social media platforms.
  • 11:07 – How algorithmic transparency could help bad actors.
  • 12:23 – What algorithms look for and what they’re trained to do.
  • 15:05 – Content moderation and algorithmic amplification.
  • 23:53 – Transparency without going open source.
  • 28:01 – Algorithmic choice through “middleware.”
  • 32:27 – Consumer Choice, misinformation, and filter bubbles.
  • 48:43 – Whether Democracy can withstand this.
Reich, R. (Host). (2022, May 25). Behind the Filter Bubble [Audio podcast episode]. In Politicology. https://www.podchaser.com/podcasts/politicology-1230612/episodes/behind-the-filter-bubble-139837562

Optional: “Code-Dependent: Pros and Cons of the Algorithm Age” – By Lee Rainie and Janna Anderson, Pew Research Center, 2017.

5:00 minute read. This article outlines seven themes that have emerged as more algorithmically optimized systems are implemented throughout the Internet. Skim through this article to capture the major concerns expressed by the survey participants.

Optional: “Many Tech Experts Say Digital Disruption Will Hurt Democracy” – By Janna Anderson and Lee Rainie, Pew Research Center, February 21, 2020.

The Pew Research Center conducts ongoing research on public opinion and expert commentary on contemporary issues, many of which center on technology, the Internet, and social media. This article presents survey results about questions related to public trust and civil divisiveness.

“About half predict that humans’ use of technology will weaken democracy between now and 2030 due to the speed and scope of reality distortion, the decline of journalism and the impact of surveillance capitalism. A third expect technology to strengthen democracy as reformers find ways to fight back against info-warriors and chaos” (Anderson & Rainie, 2020, p. 1).

Optional: The Attention Economy – By Lexie Kane on June 30, 2019, Nielsen Norman Group.

This article describes the Attention Economy from the perspective of website and app design: how do advertisers get users to pay attention to ads or content? This may interest you if your studies center on marketing and business.

Optional: Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption – By Kitchens, B., Johnson, S. L., & Gray, P. (2020).

This is the full research study referred to in the Homophily Principle section above. We recommend reading the Literature Review and the Practical Implications sections to get a sense of the research questions and how the results were interpreted. It is critical to note that the findings do not suggest direct causal relationships between the use of social media systems as a whole and the formation of certain psychological dispositions (e.g., liberal, conservative). There is too much diversity in the design of social media systems, and the data in this research are limited to user behavior (not user beliefs).


References

Anderson, J., & Rainie, L. (2020, February 21). Many tech experts say digital disruption will hurt democracy. Pew Research Center: Internet & Technology.

Kitchens, B., Johnson, S. L., & Gray, P. (2020). Understanding echo chambers and filter bubbles: The impact of social media on diversification and partisan shifts in news consumption. MIS Quarterly, 44(4).

Sundin, O., & Carlsson, H. (2016). Outsourcing trust to the information infrastructure in schools: How search engines order knowledge in education practices. Journal of Documentation.

van Dijck, J. (2010). Search engines and the production of academic knowledge. International Journal of Cultural Studies, 13(6), 574–592.

License


Synthetic Media and the Construction of Reality Copyright © 2021 by University of New Hampshire College of Professional Studies (USNH) is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.