⭐ Lab Preparation

2025/26 Autumn Term

Author

The DS101A Team

Published

11 December 2025

🗺️ Context

AI, as a technology, came sharply into public awareness when ChatGPT was publicly released on 30 November 2022, and the hype has only grown since. What kind of future will this technology, which is in fact much older than ChatGPT and far more diverse than LLMs alone, bring about? How will it change our societies?

📖 Lab Preparation

In preparation for this week’s case study on the future of AI, please take a look at the following articles and video footage:

Readings + Videos

Building Stochastic Parrots vs. Machines that Understand

⚖️ Regulations (to come)

State of AI development

AI and the future of jobs

Visions

Dystopia: right now

Mundane

🪖 Military

Dystopia: horizon

Utopia: Medicine

AlphaFold2

References

Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. 2021. “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜.” In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, 610–23. Virtual Event, Canada: ACM. https://doi.org/10.1145/3442188.3445922.
Booth, Robert. 2025a. “Minister Indicates Sympathy for Artists in Debate over AI and Copyright.” The Guardian, November. https://www.theguardian.com/business/2025/nov/23/minister-indicates-sympathy-for-artists-in-debate-over-ai-and-copyright-liz-kendall.
———. 2025b. “‘The Biggest Decision Yet’: Jared Kaplan on Allowing AI to Train Itself.” The Guardian, December. https://www.theguardian.com/technology/ng-interactive/2025/dec/02/jared-kaplan-artificial-intelligence-train-itself.
Booth, Robert, Harry Fischer, Alessia Amitrano, and Tara Herman. 2025. “‘It’s Going Much Too Fast’: The Inside Story of the Race to Create the Ultimate AI.” The Guardian, December. https://www.theguardian.com/technology/ng-interactive/2025/dec/01/its-going-much-too-fast-the-inside-story-of-the-race-to-create-the-ultimate-ai.
Chiang, Ted. 2023. “ChatGPT Is a Blurry JPEG of the Web.” The New Yorker, February. https://www.newyorker.com/tech/annals-of-technology/chatgpt-is-a-blurry-jpeg-of-the-web.
Courea, Eleni, and Kiran Stacey. 2025. “UK Ministers Delay AI Regulation Amid Plans for More ‘Comprehensive’ Bill.” The Guardian, June. https://www.theguardian.com/technology/2025/jun/07/uk-ministers-delay-ai-regulation-amid-plans-for-more-comprehensive-bill.
Dodd, Vikram. 2025. “UK Creating ‘Murder Prediction’ Tool to Identify People Most Likely to Kill: Exclusive: Algorithms Allegedly Being Used to Study Data of Thousands of People, in Project Critics Say Is ‘Chilling and Dystopian’.” The Guardian. April 8, 2025. https://www.theguardian.com/uk-news/2025/apr/08/uk-creating-prediction-tool-to-identify-people-most-likely-to-kill.
Edwards, Benj. 2023. “Biden Issues Sweeping Executive Order That Touches AI Risk, Deepfakes, Privacy.” Ars Technica. https://arstechnica.com/information-technology/2023/10/biden-ai-executive-order-requires-safety-testing-for-ai-that-poses-serious-risk/.
———. 2024a. “Trump Allies Want to Make America First in AI with Sweeping Executive Order.” Ars Technica. https://arstechnica.com/information-technology/2024/07/trump-allies-want-to-make-america-first-in-ai-with-sweeping-executive-order/.
———. 2024b. “Trump Plans to Dismantle Biden AI Safeguards After Victory.” Ars Technica. https://arstechnica.com/ai/2024/11/trump-victory-signals-major-shakeup-for-us-ai-regulations/.
———. 2024c. “Soon, the Tech Behind ChatGPT May Help Drone Operators Decide Which Enemies to Kill.” Ars Technica. https://arstechnica.com/ai/2024/12/openai-and-anduril-team-up-to-build-ai-powered-drone-defense-systems/.
Gibney, Elizabeth. 2025. “‘AI Models Are Capable of Novel Research’: OpenAI’s Chief Scientist on What to Expect.” Nature 641 (May): 830. https://doi.org/10.1038/d41586-025-01485-2.
Judge, Brian, Mark Nitzberg, and Stuart Russell. 2025. “When Code Isn’t Law: Rethinking Regulation for Artificial Intelligence.” Policy and Society 44 (1): 85–97. https://doi.org/10.1093/polsoc/puae020.
Latham & Watkins LLP. 2025. “California Assumes Role as Lead US Regulator of AI.” https://www.lw.com/en/insights/california-assumes-role-as-lead-us-regulator-of-ai.
Lord Holmes of Richmond. 2025. Artificial Intelligence (Regulation) Bill [HL]. Parliamentary Bill. UK Parliament. https://bills.parliament.uk/bills/3942.
Marcus, Gary. 2025. “When Billion-Dollar AIs Break down over Puzzles a Child Can Do, It’s Time to Rethink the Hype.” The Guardian. June 10, 2025. https://www.theguardian.com/commentisfree/2025/jun/10/billion-dollar-ai-puzzle-break-down.
Montgomery, Blake. 2025. “Europe Loosens Reins on AI – and US Takes Them Off.” The Guardian, November. https://www.theguardian.com/technology/2025/nov/24/us-europe-artificial-intelligence-ai.
Rankin, Jennifer. 2025. “European Commission Accused of ‘Massive Rollback’ of Digital Protections.” The Guardian, November. https://www.theguardian.com/world/2025/nov/19/european-commission-accused-of-massive-rollback-of-digital-protections.
Alanoca, Sacha, and Maroussia Lévesque. 2025. “Don’t Be Fooled. The US Is Regulating AI – Just Not the Way You Think.” The Guardian, October. https://www.theguardian.com/commentisfree/2025/oct/23/us-artificial-intelligence-regulations.
Sanders, Bernie. 2025. “AI Poses Unprecedented Threats. Congress Must Act Now.” The Guardian, December. https://www.theguardian.com/commentisfree/2025/dec/02/artificial-intelligence-threats-congress.
Seth, Anil. 2023. “Why Conscious AI Is a Bad, Bad Idea.” Nautilus. https://nautil.us/why-conscious-ai-is-a-bad-bad-idea-302937/.
Sharwood, Simon. 2024. “Palantir and Anduril Form Partnership, as Sauron Funded.” https://www.theregister.com/2024/12/09/palantir_anduril_alliance/.
Wang, Angie. 2023. “Is My Toddler a Stochastic Parrot?” The New Yorker. https://www.newyorker.com/humor/sketchbook/is-my-toddler-a-stochastic-parrot.
White & Case LLP. 2025. “AI Watch: Global Regulatory Tracker.” https://www.whitecase.com/insight-our-thinking/ai-watch-global-regulatory-tracker.
Bengio, Yoshua, et al. 2023. “Pause Giant AI Experiments: An Open Letter.” Future of Life Institute. https://futureoflife.org/open-letter/pause-giant-ai-experiments/.