Documentary producers make guidelines for AI use



Generative video is growing rapidly. OpenAI’s Sora revealed its potential earlier this year, and Adobe is quickly incorporating gen-AI into video editing apps like Premiere Pro. At this point, the creative landscape of the future is looking pretty wild.

However, with that potential comes alarm, as we realise that certain places should be free from AI, or at least have strict rules about when and how it can be used. That's exactly why the Archival Producers Alliance (APA) recently banded together to create a set of guidelines for AI use in documentary filmmaking.

“We recognize that AI is here, and it is here to stay. And we recognize that it brings with it potential for amazing creative opportunities,” APA co-founder Jennifer Petrucelli said at the IDA Getting Real event.

“At the same time, we want to really encourage people to take a collective breath and move forward with thoughtfulness and intention as we begin to navigate this new and rapidly changing landscape,” she added.

A rough guide

The guidelines are a nine-page document, currently in draft form. The group intends to formally publish them in June and describes them as recommendations on ethical AI use, transparency, and preserving truth rather than hard-and-fast rules.

The crux of the guide is that primary sources, meaning original images and video footage, should come first in order to preserve truth and trust.

They believe that AI can be used as a tool to enhance visuals but not to create new ones, drastically alter a primary source, or “change their meaning in ways that could mislead the audience.”

Finally, the guidelines advocate for transparency so that viewers know when what they are watching is AI-generated. It would be similar, I would imagine, to having an actor speak for a documentary subject or re-enact certain scenes. Those practices are always disclosed, and in this respect I don't see why AI use should be treated any differently.

One controversial use of AI might be generating somebody's voice and having them say words they never actually spoke, or realistically depicting 'historical' events that never took place. The Anthony Bourdain documentary "Roadrunner" was cited as an example: an AI-generated version of Bourdain's voice is heard speaking, and viewers are never told that it isn't his real voice.

When it comes to AI, the potential for misuse and even unintentionally misleading viewers is immense. The rapid advancement of the technology has left creators grappling for handholds. Thoughtful guidelines such as these will surely help the industry stay accountable.

[via IndieWire]
