Algorithmic culture: What shapes our information consumption
MECO6936 Social Media Communication
Assignment 3 – Online Article and Comments
Student ID: 470191015
Tutor: Kai Soh, Friday 5:00-8:00 p.m.
Algorithms, though often invisible in online environments (Striphas 2015), are widely used across many fields. This essay seeks to analyze algorithmic culture on YouTube, exploring the wide application of algorithms and the potential issues related to the phenomenon.
According to YouTube Internal Data (2017), 1.5 billion logged-in users visit YouTube each month. The YouTube Official Blog (2017) also points out that people around the world spend more than a billion hours a day watching YouTube videos. Such a remarkable achievement has several causes, and one must be algorithmic culture: we really enjoy what the media platform provides.
Algorithmic culture refers to the phenomenon in which human behavior, organization, thought, and expression on social media platforms are integrated into the logic of big data and large-scale computation, changing the ways people used to practice, experience, and understand culture (Striphas 2015). Hutchinson (2017) states that personalized content is recommended according to the consumer's past consumption or search history.
Take YouTube as an example: below are two screenshots of the YouTube homepage taken by my friends Anna and Tina. Both are frequent YouTube users (more than 20 hours a week), but their purposes are quite different. Anna is an ambitious, hard-working woman who wishes to become a brilliant female speaker in the future. She therefore spends her spare time watching TED Talks, speeches by famous women around the world, and academic presentations that she believes will expand her horizons. By contrast, Tina tends to enjoy life and watches videos online purely for entertainment. She prefers variety shows and thrilling or funny content that can make her "forget all problems". As Tina is a Chinese international student with a boyfriend in China, she is also interested in, and searches for, videos about long-distance relationships.
It is obvious that their recommended lists are quite different (see Pic 1 and Pic 2), as their interests and search habits, as computed by YouTube, differ.
Pic1: Anna’s recommended list
Pic2: Tina’s recommended list
Anna’s recommended list contains “Get comfortable with being uncomfortable” by Luvvie Ajayi on TED Talks, “Story That Will Change Your Life – One of the Best” on InspireDiscipline, “Esther Perel on Sexual Desire and Successful Relationships” by Lewis Howes, “Stop Trying So Hard. Achieve More by Doing Less” by Bethany on TEDx Talks, and “How to Be a Great Leader: Inspiring Others to Do” by Marie Forleo. Tina’s recommended list, by contrast, covers a popular Chinese variety show, Fancy Life 2; a short video about a frightening situation on an airplane; a funny video speculating about relationships between men and women; a talk show discussing the problems that dating people from different countries may bring; and another Chinese variety show, Running Man.
The built-in algorithmic systems on YouTube can always filter out unwanted content and show only what we want. This spares us the effort of searching additional keywords and saves time to some extent. For example, having detected Anna’s viewing history, the algorithmic systems recommend more relevant videos and channels (see Pic 3), such as TED channels and speech videos, which Anna is more likely to watch.
Pic3: Anna’s recommended channels and videos
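The kind of history-based filtering described above can be illustrated with a toy sketch. This is only a simplified illustration of content-based recommendation, not YouTube's actual (and proprietary) system; the video titles and tags are invented for the example:

```python
from collections import Counter

def recommend(history, candidates, top_n=3):
    """Toy content-based filter: score each candidate video by how often
    its tags appear among the tags of the user's viewing history."""
    profile = Counter(tag for video in history for tag in video["tags"])

    def score(video):
        return sum(profile[tag] for tag in video["tags"])

    # Highest-scoring candidates come first, mimicking a recommended list.
    return sorted(candidates, key=score, reverse=True)[:top_n]

# A history dominated by talks and speeches, like Anna's.
history = [
    {"title": "Get comfortable with being uncomfortable",
     "tags": ["ted", "speech", "motivation"]},
    {"title": "How to be a great leader",
     "tags": ["speech", "leadership"]},
]
candidates = [
    {"title": "TEDx: Doing less", "tags": ["ted", "speech"]},
    {"title": "Running Man ep. 12", "tags": ["variety", "comedy"]},
    {"title": "Leadership masterclass", "tags": ["leadership", "speech"]},
]
for video in recommend(history, candidates, top_n=2):
    print(video["title"])
```

Even in this crude form, the entertainment video never surfaces for a speech-heavy history, which is exactly the filtering effect the two screenshots show.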
The use of algorithmic systems
Personalized content automatically selected by algorithms satisfies different users’ needs and saves them the time of typing keywords to search for information. For example, after typing only a “t”, the term I needed (“translate”) had already appeared in the search suggestions (see Pic 4).
Pic4: Search result for “translate” with a “t”
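The suggestion behavior in Pic 4 can be sketched as a tiny prefix-matching routine. Again, this is a hypothetical illustration rather than how any real search engine is implemented; the query log below is made up:

```python
from collections import Counter

def suggest(prefix, query_log, top_n=5):
    """Toy autocomplete: among past queries that start with the typed
    prefix, rank by how often each query was searched."""
    counts = Counter(query_log)
    matches = [q for q in counts if q.startswith(prefix)]
    return sorted(matches, key=lambda q: counts[q], reverse=True)[:top_n]

log = ["translate", "translate", "translate", "ted talks", "thriller movies"]
print(suggest("t", log))  # the most-searched "t" query comes first
```

Because “translate” dominates the log, a single keystroke is enough to surface it, which is why the search box can appear to “know” what we want.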
Algorithms can provide precise data on audiences’ preferences and consumption habits, so they are crucially important for marketers and policymakers carrying out marketing campaigns to reach their economic goals. Social media platforms can achieve higher ratings and traffic, and thus more profit, because they can locate audiences’ preferences and push relevant, attractive content that keeps audiences spending more time on their channels. Audiences, for their part, are glad to see things they favor, so they spend more time on the internet.
Algorithmic systems are widely used in many fields, such as finance, research, education, and individual user applications (Argenton 2017). Apart from recommender systems, the driving elements of algorithmic culture include “metadata schemes”, “ontologies and taxonomies”, “semantic search tools”, and “machine based reasoning” (Tracy & Carmichael 2017), which play significant roles on various platforms such as Amazon (Pic 5), Instagram (Pic 6), Twitter (Pic 7), Facebook (Pic 8), eBay (Pic 9), and Taobao (Pic 10). While we are surprised at how intelligently these apps anticipate our needs, we genuinely enjoy the sweet suggestions they provide.
When we were preparing the OWEEK campaign for Assignment 2, we also used algorithms to help us decide which words and phrases were most popular among college students, and the resulting ranking helped us choose the right titles to post on Facebook and Instagram to attract more of our target audience’s attention. Frankly, algorithmic systems are a very helpful tool for making marketing plans as well as advertising campaigns.
However, as information consumers, we should not be complacent about audiences becoming addicted to algorithms. Increasing concerns have been raised alongside the wide use of algorithmic systems. Beer (2017) states that algorithms’ ability to choose and decide with little or no human control is the focus of discussion about the potential social power of algorithms. While Brown and Duguid (2000) emphasize that it is humans who should ultimately decide the meaning and importance of information, Gillespie (2014) points out that our options, thoughts, and opportunities are influenced by algorithms. People are losing autonomy within complicated algorithms (Executive Office of the President 2014). Given the power of algorithmic systems, Turkle (2011) argues that people will become more reliant on technology and less on each other.
Many potential problems come with algorithms. Because “cognitive-affective stickiness” makes algorithms readily imitable yet forbiddingly complex (Mackenzie 2006), Diakopoulos (2015) claims that it is difficult for people to scrutinize the power and influence of algorithms. Introna and Nissenbaum (2000) worry that algorithms may result in bias, while Bozdag (2013) likewise finds bias in “algorithmic filtering”. For example, Tina keeps consuming the entertainment content recommended to her, while Anna focuses on famous speeches and pays no attention to information from other sources. Both of them seem to be “experts” in their areas of interest, yet they cannot communicate well with each other because their knowledge is limited to their own small worlds, the so-called “filter bubble” (Pariser 2011). Some researchers have also found that algorithms are associated with discrimination (Kraemer, Overveld, and Peterson 2010; Gillespie 2013; Edelman and Luca 2014). Accountability (Felten 2012; Schuppli 2014) and surveillance (Introna and Wood 2002) also concern many scholars. Furthermore, Bucher (2012) worries about the distribution of visibility caused by automatic sorting systems, whereas other researchers attend to fairness (Dwork et al. 2011) and the reproduction of power structures (Edelman 2014). We should use algorithm-driven platforms more carefully to avoid these negative effects.
Eli Pariser’s TED talk on the Online Filter Bubble
In the “age of the algorithm” (Kelty 2003), what we see and hear is inevitably influenced by algorithmic systems, which in turn feed our demand for certain information. On the assumption that they “know” us, we trust them, rely on them, and even gradually give up our natural ability to distinguish what is suitable for us. Algorithmic culture is, however, a double-edged sword. We should be clear-eyed when going through content automatically sorted by algorithmic systems. Despite the convenience, our thinking styles and behaviors will be changed little by little by these limited recommended lists. People will be stuck in traps of their own making if they become too dependent on algorithmic systems and lose the critical judgment to decide what kind of information they really need. We are human; it is dangerous simply to keep following algorithms without autonomous control over them.
Argenton, G. (2017). Mind the gaps: Controversies about algorithms, learning and trendy knowledge. E-Learning and Digital Media, 14(3), 183-197. doi:10.1177/2042753017731358
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13.
Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227.
Brown, J. S., & Duguid, P. (2000). The Social Life of Information. Boston, MA: Harvard Business School Press.
Bucher, T. (2012). Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook. New Media & Society 14 (7): 1164–80.
Diakopoulos, N. (2015). Algorithmic Accountability. Digital Journalism 3 (3): 398–415.
Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. (2011). Fairness through awareness. ArXiv e-print 1104.3913. Retrieved from http://arxiv.org/abs/1104.3913
Executive Office of the President. (2014). Big Data: Seizing Opportunities, Preserving Values. Washington, DC: The White House.
Edelman, B. G., & Luca, M. (2014). Digital discrimination: The case of Airbnb.com. Harvard Business School NOM Unit Working Paper No. 14-054.
Edelman, B. G. (2014). Leveraging market power through tying and bundling: Does Google behave anti-competitively? Harvard Business School NOM Unit Working Paper No. 14-112.
Felten, E. (2012). Accountable Algorithms. Freedom to Tinker, September 12. Accessed August 20, 2015. Retrieved from https://freedom-to-tinker.com/blog/felten/accountable-algorithms/.
Gillespie, T. (2013). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies (pp. 167–194). Cambridge, MA: MIT Press.
Hutchinson, J. (2017). Algorithmic culture and cultural intermediation. In Cultural Intermediaries: Audience Participation in Media Organisations (pp. 201–220). Cham: Palgrave Macmillan.
Introna, L. D., & Nissenbaum, H. (2000). Shaping the Web: Why the politics of search engines matters. The Information Society, 16(3), 169–185.
Introna, L. D., & Wood, D. (2002). Picturing algorithmic surveillance: The politics of facial recognition systems. Surveillance & Society, 2(2/3), 177–198.
Kelty, C. (2003). Qualitative research in the age of the algorithm: New challenges in cultural anthropology. Lecture presented at the Research Libraries Group 2003 Annual Meeting, Boston Public Library, Boston.
Kraemer, F., van Overveld, K., & Peterson, M. (2010). Is there an ethics of algorithms? Ethics and Information Technology, 13(3), 251–260.
Mackenzie, A. (2006). Cutting Code: Software and Sociality. New York: Peter Lang International Academic.
Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED. Retrieved from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles
Striphas, T. (2015). Algorithmic culture. European Journal of Cultural Studies, 18(4–5), 395–412.
Tracy, F., & Carmichael, P. (2017). Disrupting the dissertation: Linked data, enhanced publication and algorithmic culture. E-Learning and Digital Media, 14(3), 164-182. doi:10.1177/2042753017731356
Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. New York, NY: Basic Books.
YouTube Internal Data. Global. May 2017.
YouTube Official Blog. (2017, February). You know what’s cool? A billion hours.