There are parallel visions of the future of AI and society. In one, AI is embraced and embedded in everything we do, normalised as part of our lives across industries. That’s the vision that is most promoted by governments and big industry. The other vision is one where AI is rejected or used in limited and defined ways.

It’s the 25th anniversary of The Lord of the Rings (LOTR) trilogy. But what does that have to do with AI?

Peter Jackson’s triumphant adaptation of J. R. R. Tolkien’s books is a great case study for reflecting on technology.

When the movies were made, a plethora of CGI tools were readily available and very popular with studio executives. In fact, many a film was butchered to hell and back with ropey special effects (the Scorpion King comes to mind). In the push for the flashiest technology to be used, as a marketing tool and just because it was cool, many directors were influenced to rely more on CGI than practical effects.

Movie still of the Scorpion King from The Mummy Returns (2001), showing a male with long brown hair and a pixelated scorpion body: one of the most lambasted CGI efforts contemporaneous with The Lord of the Rings: The Fellowship of the Ring (2001).

In LOTR, the director had a lot of control and permission from the studio to do things his way. Yes, he used CGI, but he used it sparingly for specific good reason. Mostly, however, he relied on inventive traditional techniques, like forced perspective and large-scale miniature sets. This brought authenticity and depth to his stories and has ensured that the films stand the test of time.

Screenshot from The Lord of the Rings showing forced-perspective practical effects: two seated male figures on a wooden cart, one small, one large with a beard and hat, smoking a pipe.

The real lesson here is that these films could not and would not be made like this again, even with the same director. The movie industry doesn’t work like this anymore. The entire ecosystem now relies on CGI as the norm, which means practical-effects-based productions are near-impossible to get off the ground (Dune is a recent partial exception, owing to the prestige attached to Denis Villeneuve’s directing).

As ever, movie studios are concerned with budgets and bottom lines. Big blockbusters that require years of planning and complex production are riskier than CGI-based ones. None of the studios want to risk betting on practical creativity when they can have digital immediacy.

At least, that’s the idea. Some studios have bet big on AI, with variable results: polarised views in the wake of AI-cloned voices and actors, and controversial director Darren Aronofsky’s AI studio’s productions about the American Revolution.

A recent article from the heart of US filmmaking shares the experiences of LA creatives: “For many of the crew members and craftspeople … AI doesn’t feel like an innovation. It feels like a new way to justify doing more with less, only to end up with work that’s less original or creative”.

AI offers tools to make some parts of filmmaking more efficient, but the shift towards CGI reliance also means the skills for practical visual effects are dying out: “It’s a fundamental change in what it means to create”.

Even James Cameron, perennial pusher of technological limits, is judicious in his use of technology, taking a measured approach that never replaces human creative input with AI-generated content. As he puts it: “We don’t need AI. We’ve got meat-I. And I’m one of the meat-artists that come up with all that stuff. We don’t need a computer. Maybe other people need it. We don’t.”

So what does this mean for us and for education?

There will always be technology optimists and pessimists, and we all have to learn to live together. More importantly, we need to collectively shape the future that is the best for everyone, harnessing the best that technology can offer without compromising education.

I find it lamentable that some aspects of creative skill and expression are tangibly dented by AI. These are real, verifiable and tragic stories. At the same time, I find the applications of AI technologies interesting and full of potential. We are in a similar position in education spheres where the potential attracts us, but the risks make us take a step back.

We are also at very different stages of AI engagement across our education community, which can cause interpersonal sparks to fly. The ecosystem of the film industry changed permanently as a result of CGI, and AI is doing the same to almost every industry right now. How impactful that is, what it means, and which values we hold on to during this seismic change remains up to us.

The case study above also gives us pause to reflect on the realities of graduate employment. The UK is the most negatively impacted nation, with AI leaving graduate job opportunities markedly lower than in other comparable economies. How does this change how we prepare our students for the future?

Altogether, we have a lot of choice in our response and a lot of responsibility lands on our shoulders to chart a course through murky waters, ones muddied by the ethical complexities of AI and the ramifications of technological adoption and application.

Next steps

Thankfully, we do have lots of guidance on how to tackle these thorny issues. Take some time to explore our brand new AI & Education training module available via Develop. This course equips you to respond thoughtfully, ethically, and creatively to AI. The focus of this course is on generative AI (GenAI) and includes an optional section covering prompt engineering. 

This course will help you:

  • Design inclusive and equitable learning experiences in an AI-enabled environment.
  • Confidently review and adapt assessments to ensure academic integrity, fairness and skill development.
  • Reflect on AI literacy and student skills development, grounded in the Bristol Skills Profile.
  • Draw inspiration from real case studies showcasing effective AI-related practice across the university.

It is designed for:

  • Academic staff at all career stages.
  • Educational developers, librarians, learning technologists, and student-facing support teams.
  • Programme directors, assessment leads, and curriculum designers.

There are several downloadable resources associated with the course, including a reflective workbook, prompt examples for everyday tasks, example rubrics and assessment briefs, and an AI use declaration template.

This course forms the first part of the University’s AI & Education training pathway. The second part builds on your reflections from this module and applies them through in‑person, faculty-led training, designed to strengthen the subject-specific application of your AI skills. Please look out for communications from your faculty with details on how to register for the follow‑on training. Take your completed reflective workbook to these in-person sessions. 

Discover more from Bristol Institute for Learning and Teaching
