
Algorithmic selfhood: My Facebook ad preferences diary

Article reviewed by Dr Irene Fubara-Manuel (August 2023)

Published on Aug 14, 2023

Targeted marketing is an everyday part of most web users’ experiences: if you have a social media or Google account, it is likely that some of your identity characteristics, lifestyle interests and behavioural patterns have been tracked in order to present you with personalised adverts. Marketers use a dizzying number of categories to profile people: gender, age, ethnicity, lifestyle and consumption preferences, political leanings, music and film tastes, credit status, home ownership status, marital status – the list goes on. Privacy advocates such as DuckDuckGo maintain that this tracking persists even if you use a tracker blocker.

Despite – or indeed because of – its monetizable qualities, targeting creates a host of social, cultural and ethical problems. Targeting can threaten individual and collective privacy (Meier and Krämer, 2023), result in the opaque and discriminatory trading of users’ personal data (Skeggs and Yuill, 2019), negatively impact marginalised social groups (Chun et al., 2018) and create ‘algorithmic identities’ that shape users’ behaviours and sense of self (Jarrett, 2014; Bucher, 2016; Kant, 2020).

Platforms such as Facebook give users a glimpse into targeting through Ad Preferences profiles. These pages allow a user to see the interests and topics that Facebook has algorithmically inferred to be ‘personally relevant’ to them. Over the past five years I have recorded my ad preferences on Facebook and in the following diary I present some reflections about who Facebook thinks I am. I conclude by considering why Facebook’s inferences matter for identity constitution.

My Facebook ad preferences diary

22nd July 2019: Today I am presented with a series of images accompanied by categories that, according to the algorithm, I like. I’m presented with around 70 different ‘interests’. It’s true, I do like video games, feminism and rescue dogs (see fig. 1). Things get bizarrely specific – apparently I like the 1996 theatrical release of the film Emma. I like Gambling, Cattle and (embarrassingly) Toilet. I like Perfection. Why does Facebook think I like toilets? What have I clicked on to suggest I like to gamble? And what does perfection look like when advertised back at me? These supposed insights into my algorithmically inferred personality generate more questions than answers.

Figure 1. Screenshots of selected ad interests from Facebook’s ad preferences interface, captured July 2019.

8th October 2020: Things that I like are now grouped into subcategories such as ‘Business and Industry’, ‘Hobbies and Activities’ and ‘News and Entertainment’ (see fig. 2). There are around 50–100 different interests in each category – what a diverse person I am. My business interests include First-Time Buyer (I do like this, but buying my own flat is an impossible dream). I like Asda, Wayfair, Starbucks. My charitable side likes Cats Protection and the British Red Cross. It all starts to get a bit meta (pun not intended) – I like ‘online advertising’ as a business interest. Will Facebook show me ads about ads? University is there – I do ‘like’ Uni (I work in one), but is University a business? According to Facebook, yes.

Figure 2. Screenshot of top ‘business and industry’ interests from the author’s Facebook ad preferences pages, captured 8th October 2020.

Monday 7th June 2021: I can now choose what I don’t want to see. Facebook seems to suggest I don’t want to see ads about Alcohol or Parenting (see fig. 3). Does Facebook think I’m an alcoholic and therefore a potentially bad mother? I find out that no, this time it’s not personal – Facebook suggests these potentially sensitive topics to everybody. I can still see what ad topics I supposedly like, but the pictures are gone (see fig. 4). Hundreds of interests today – I am into both Vegetarianism and Barbecue. Hunting and Sustainability. I like Fear. According to Facebook, I seem to be an anxious mess of contradictions.

Figure 3. Screenshot of ‘Manage ad topics’ from the author’s Facebook preferences, captured June 2021.

Figure 4. Screenshot of selected ad interests from the author’s Facebook ad preferences pages, captured June 2021.

10th November 2022: The topic images are back, but as small icons this time (see fig. 5). I like Date Night and Science, Barbie and Mass Media, Used Good (just the one?). There are so many categories that I keep scrolling, wondering when my interests will end. Finally I count around 640 topics, my algorithmic self both more and less quantifiable as the list gets longer. The coherence of any ‘true’ sense of selfhood seems to diminish with every new interest. I don’t subscribe to ideas of a true or coherent self anyway, so maybe this is a good thing? Yet even as my selfhood expands, it remains packaged neatly for advertising value.

Figure 5. Screenshot of selected ad interests from the author’s Facebook ad preferences pages, captured November 2022.

25th July 2023: My algorithmic self seems more orderly. The cute icons are gone, but now I can ‘sort’ and ‘filter’ my ad topics (see fig. 6). According to my ‘most recent interactions’ on Facebook, I’m into Travel, Stand-up Comedy, Pets (Animals). I’m also interested in Davina McCall, Baby and Toddler Clothing, YouTube, Hardware Fastener and Nails (the manicured type or the hammering type?). My interest in Fear has been replaced by Self-Love and I’m into Nasty Gal, whatever that is. My algorithmic self continues to be gendered, but is perhaps getting more sassy? I know I’m self-selecting topics that pop out at me, but my apparently feminine attributes continue to persist, double down, stand out.

Figure 6. Screenshot of selected ad interests from the author’s Facebook ad preferences pages, captured July 2023.

Reflections

Looking back on five years of ad preferences, I see an algorithmic self that is increasingly both bounded and boundless: boundless because my interests keep expanding, yet bounded because these lists are always-already fixed and compartmentalised to make my identity manageable and profitable. Szulc argues that such systems render social media users both ‘abundant’ – ‘performing identities which are capacious, complex and volatile’ – and yet also ‘anchored’ – ‘singular and coherent’ (2019, 14).

The bigger my list of ‘interests’ becomes, the more contradictory and messy my identity appears. For me, the end result is an algorithmic self so abundant that trying to fix my interests to my identity ceases to have any social or economic value. In a forthcoming paper I argue that ad preference profiles create this illusion of abundance to a) deflect accusations of reductive targeting from the public and industry regulators and b) appear to offer detailed micro-targeting to advertisers. I also argue that ad preference profiles bear little to no resemblance to how Facebook actually delivers ads to users: instead, its targeted advertising mechanisms are repetitive, gender conservative and focused on one or two prime drivers of interest in groups of individuals.

If ad preference profiles create only the illusion of abundance, what can they tell us? Though they only ever afford partial insight, Sophie Bishop and I argue that these ad preferences profiles shed light on targeting in ways that can improve the algorithmic literacy of social media users (Bishop and Kant, 2023). Read more about our ‘algorithmic autobiographies’ project here and our creative methodologies for public engagement with ad preferences here.

References

Bishop, Sophie and Kant, Tanya (2023) ‘Algorithmic autobiographies and fictions: A digital method’, The Sociological Review, 0(0). https://doi.org/10.1177/00380261221146403

Bucher, Taina (2016) ‘The Algorithmic Imaginary: Exploring the Ordinary Affects of Facebook Algorithms’, Information, Communication & Society, 20(1), 30–44.

Chun, Wendy Hui Kyong, Apprich, Clemens, Cramer, Florian and Steyerl, Hito (2018) Pattern Discrimination. Minneapolis: University of Minnesota Press.

Jarrett, Kylie (2014) ‘A database of intention?’, in König, R. and Rausch, M. (eds), Society of the Query Reader: Reflections on Web Search. Instituut voor Netwerkcultuur, 16–29.

Kant, Tanya (2020) Making It Personal: Algorithmic Personalization, Identity, and Everyday Life. Oxford University Press.

Meier, Yannic and Krämer, Nicole C. (2023) ‘A longitudinal examination of Internet users’ privacy protection behaviors in relation to their perceived collective value of privacy and individual privacy concerns’, New Media & Society, 0(0). https://doi.org/10.1177/14614448221142799

Skeggs, Beverly and Yuill, Simon (2019) ‘Subjects of value and digital personas: Reshaping the bourgeois subject, unhinging property from personhood’, Subjectivity, 12(1), 82–99. https://doi.org/10.1057/s41286-018-00063-4

Szulc, Lukasz (2019) ‘Profiles, Identities, Data: Making Abundant and Anchored Selves in a Platform Society’, Communication Theory, 29(3), 257–276. https://doi.org/10.1093
