💻 Week 10 - Class Roadmap (90 min)
2024/25 Autumn Term
AI and the information environment
Welcome to our week 10 seminar/lab class for DS101A.
In this class we will look at the effect of AI on our perception of reality.
When AI is used to convey information to the individual, we consider:
- Identity
- Rights
- Control - a.k.a. “mimicry”
- The Quantified Self - consent, measurement and estimation
There are also considerations for society as a whole:
- Mediating public discourse
- Trust & degradation of the information environment
- Psychology & behaviour
Preparation
To prepare for the class, watch a short clip from each of the two videos below.
First, Sir David Attenborough reflects on the digital re-purposing of his likeness:
Second, Eric Schmidt – former CEO and Chairman of Google – talks about how your identity can be repurposed to configure your world view:
It should take around 10 minutes of your time.
Step 00 - Identity: personal data 🐾 (15m)
👨‍🏫 Teaching moment (5m)
Your tutor will ground this class in a brief discussion of different sources and forms of data.
- “What is data?” – revisited
Discussion (10m)
For the individual, they are their data (their “digital footprint”), but questions of knowledge, consent and control have become much more acute with the arrival of powerful AI.
Generative AI has made it possible to leverage tiny “snapshots” of data to synthesise completely realistic scenes, scenarios and narratives – whole new worlds – to the extent that, within the digital realm, the individual has become a puppet, open and available to whoever holds the rights to “their” data.
“Digital mimicry” becomes possible for whoever holds the rights to personalised data downstream.
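To make the question “how is this possible?” concrete, here is a minimal sketch of text-to-image generation using the open-source diffusers library. It is illustrative only: the model ID is just one example of a publicly hosted text-to-image model, and running it requires the diffusers and torch packages plus a GPU.

```python
# Minimal sketch: a one-line text "seed" becomes a realistic-looking image.
# Assumes the `diffusers` and `torch` packages are installed; the model ID
# below is only an example of a publicly hosted text-to-image model.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # a GPU is needed for reasonable speed

prompt = "a news photograph of a crowded city square during a protest, overcast sky"
image = pipe(prompt).images[0]  # the scene never happened; the pixels are synthesised
image.save("synthetic_scene.png")
```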
“Who owns the rights to your personal profile?”
“How is personalisation made possible?”
“How can personalisation be used for good?”
“How can personalisation be used for harm?”
“Is mimicry ever justified?”
Background:
Peter Benie, Department of Engineering, University of Cambridge, 21 Dec 2018 “The man who helped to preserve Stephen Hawking’s iconic voice” – (Peter Benie 2018)
Peter Hoskins, Business reporter, BBC News Online, 26 September 2024 “Dame Judi Dench and John Cena to voice Meta AI chatbot” – (Peter Hoskins 2024)
Jones, Nicola and Nature Magazine (2024). “Who Owns Your Voice in the Age of AI?”. Scientific American – (Jones and Scientific American 2024)
Manisha Ganguly, Guardian Online, 16 Oct 2024 “‘It’s not me, it’s just my face’: the models who found their likenesses had been used in AI propaganda” – (Manisha Ganguly n.d.) 👈 - focus
Step 01 - Media communications 🎤 📺 (15m)
Where data can be used to create convincing illusions, society must consider the role of the mediator, as well as sophisticated targeting.
AI makes it possible to influence the message in powerful new ways. In this class we will revisit the old debate around free speech, propaganda and censorship, and consider whether and how we should upgrade our thinking.
Discussion (15m)
“Is this the same debate around censorship vs. free speech?”
“What, if anything, has changed?”
Mediation here means control over the message. AI can be used to construct scenarios from “seeds” with no basis in reality, so we have to consider the effect on the individual and on society when the world we inhabit becomes confused with a world that looks like it, but lacks any grounding in actual events.
Background:
Jenny Kleeman, Guardian online, 11 May 2024 “She was accused of faking an incriminating video of teenage cheerleaders. She was arrested, outcast and condemned. The problem? Nothing was fake after all” – crying deepfake – (Jenny Kleeman n.d.) 👈 - focus
Dana Ruiter, Thomas Kleinbauer, Cristina España-Bonet, Josef van Genabith, Dietrich Klakow, ACL Anthology, July 2022 “Exploiting Social Media Content for Self-Supervised Style Transfer” – (Ruiter et al. 2022)
Katherine Atwell, Sabit Hassan, Malihe Alikhani, OpenReview.net, 01 Jan 2022 - updated 08 Jul 2024 “APPDIA: A Discourse-aware Transformer-based Style Transfer Model for Offensive Social Media Conversations” – (Atwell, Hassan, and Alikhani 2022)
🍵 Break (~5 min)
Step 02 - Mimicry & personalised data 🎎 (15m)
In addition to perfect duplication through digitisation of the real-world substrate, an AI model is able to incorporate a style, a tone, a “mental” world model, or even a whole personality. AI can be considered a new form of medium: data can be copied, it can be translated, and it can also be “coloured”.
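To make the “copy, translate, colour” idea concrete, here is a small sketch using the Hugging Face transformers pipeline API. The specific models (t5-small, gpt2) are tiny stand-ins chosen so the code runs on a laptop; convincing mimicry of tone or personality would use far larger, instruction-tuned models and far more personal data.

```python
# Sketch of AI as a "medium": the same text is copied, translated, and "coloured".
# `t5-small` and `gpt2` are tiny stand-in models used purely for illustration.
from transformers import pipeline

original = "The committee met on Tuesday and agreed to delay the decision."

# 1. Copy: digital text duplicates perfectly, at zero cost.
copy = original

# 2. Translate: the content is carried across languages.
translator = pipeline("translation_en_to_fr", model="t5-small")
translated = translator(original)[0]["translation_text"]

# 3. "Colour": the content is re-voiced in a chosen style or tone.
#    (gpt2 will do this poorly; a large instruction-tuned model would not.)
generator = pipeline("text-generation", model="gpt2")
prompt = f"Rewrite in an alarmed, urgent tone: {original}\nRewritten:"
coloured = generator(prompt, max_new_tokens=40)[0]["generated_text"]

print(copy, translated, coloured, sep="\n")
```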
Discussion (15m)
- “How good are deepfakes?”
- “Opportunities for the future: how far can this go?”
Background:
Kyle Wiggers, TechCrunch, 29 Mar 2024 “OpenAI built a voice cloning tool, but you can’t use it… yet” – (Kyle Wiggers n.d.)
Kyle Wiggers, TechCrunch, 19 Nov 2024 “Microsoft will soon let you clone your voice for Teams meetings” – (Wiggers 2024)
BBC News, 17 Nov 2024 “Sir David Attenborough says AI clone of his voice is ‘disturbing’ - BBC News” – (BBC News 2024) – 👈 focus
Kate Berry, BBC News (Radio 5 Live), 24 Nov 2024 “I was scammed out of £75k by Martin Lewis deepfake advert” – (Kate Berry 2024)
Step 03 - Trust & degradation of the information environment 🎳 🚱 (15m)
AI is being used to filter events from the past, to present the news on current events, and also to construct new media for mass consumption.
Discussion (15m)
- “What do you consider to be acceptable use of AI in presenting documentary events?”
Background:
Luke Taylor, Guardian Online, 02 May 2023 “Amnesty International criticised for using AI-generated images” – (Luke Taylor n.d.) 👈 - focus
Mateusz Łabuz & Christopher Nehring, Springer Nature, 26 Apr 2024 “On the way to deep fake democracy? Deep fakes in election campaigns in 2023” – (Łabuz and Nehring 2024) 👈 - focus
Kolorize “Old Times in True Colors. Colorize Every Photo in Shine” – (Kolorize n.d.)
BBC Click, 15 Oct 2018 “Peter Jackson colourises World War One footage - BBC Click” – (BBC Click n.d.)
Mia Galuppo, The Hollywood Reporter, 27 Nov 2023 “Doc Producers Call for Generative AI Guardrails in Open Letter” – (Galuppo 2023)
Archival Producers Alliance, 2023 “Generative AI Initiative” – (Jennifer Baichwal et al. n.d.)
Step 04 - Psychology & behaviour 🎭 (15m)
How will this situation develop, and how will it impact both the psychology of individuals and the behaviour of society?
Discussion (15m)
- “What happens when your world view is taken over by technology?”
Background:
- OpenAI, 15 February 2024 “OpenAI Sora All Example Videos” 👈 - focus – (Magna AI 2024)