With every passing day, there seems to be yet another headline-grabbing story about artificial intelligence. But whether it's beating the best human players at Go, picking up racist language from social media or simply destroying humanity, it's perhaps not easy to see how AI will have a direct impact on our lives in the future.
So, with the merit of AI’s real-world applications still being debated, when are we likely to see its benefits? The answer is sooner than you think, but perhaps not in the way that you would expect.
Following Wimbledon 2017, IBM Watson generated highlight reels of what it thought were the best shots of the tournament. It's hard to argue with the choices, and anyone who viewed these reels would be none the wiser that there was no human input.
Normally, a large team of editors would need considerable time to go through hundreds of hours of match footage. Factor in multiple camera angles from 18 courts to get the best possible edit, and you realise how big an undertaking this actually is.
However, Watson could do all of this automatically. Using cognitive algorithms to analyse audio and video from the footage, it was able to identify shots and points that were highlight-worthy. For example, if the crowd roared, the system could infer that the last point was important in the context of the match. These highlights were then bundled together in any number of combinations – the best shots of the day, of a particular court or of an individual player.
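To make the idea concrete, here is a minimal toy sketch of crowd-noise-based highlight picking. This is not IBM's actual system – the data, threshold and scoring rule are all illustrative assumptions – but it shows the basic principle: points followed by an unusually loud crowd reaction get flagged for the reel.

```python
# Toy highlight picker (illustrative only, not IBM Watson's method):
# flag a point as highlight-worthy when the crowd-noise level after it
# spikes well above the match's average level.

def pick_highlights(points, noise_factor=1.4):
    """points: list of (clip_id, crowd_noise_level) tuples.

    Returns clip ids whose crowd noise exceeds the match average
    by the given factor.
    """
    if not points:
        return []
    average = sum(level for _, level in points) / len(points)
    return [clip for clip, level in points if level >= average * noise_factor]

# Hypothetical match data: five points with normalised crowd-noise levels.
match = [("p1", 0.4), ("p2", 0.5), ("p3", 0.95), ("p4", 0.45), ("p5", 0.9)]
print(pick_highlights(match))  # → ['p3', 'p5']
```

A real system would of course combine many more signals – commentary, score context, player movement – but the "react to the crowd" heuristic captures the spirit of what the article describes.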
A tennis tournament is an excellent test of AI's capabilities. At any given time there could be up to 18 matches in play – too many for any person to follow. For tennis fans, watching a tournament on TV has meant being at the whim of the directors and editors who decide what we should be watching.
With more live footage available than ever before, viewers still have to choose which tie to watch, and unless they are constantly changing channel, they may not witness the best of the action. Meanwhile, the match of the century could be unfolding on an outside court, and they have to make do with a replay later in the evening.
What if it were possible for an AI system to learn what we liked? Our favourite players, the style of play that excites us, or whether there's a break point in the final set. This could then be delivered as a personalised broadcast, tailor-made to our preferences. No filler, just the bits you want to see.
The truth is that we’re really not that far away from this.
The BBC has been at the forefront of some of broadcasting's most important innovations. Right now, BBC R&D is working on something it calls object-based media. The BBC describes this as a programme whose content is automatically edited to suit the needs of the individual viewer – how much time they have, what device they are viewing on or what they are most interested in.
The ‘objects’ are audio and video content that is broken down into smaller chunks, which could be a single scene, a soundbite or even a still image. Algorithms are then used to piece these objects together into packages that are delivered to viewers based on their preferences.
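The assembly step can be sketched in a few lines. The following is purely illustrative – the tags, durations and greedy selection rule are my assumptions, not the BBC's implementation – but it shows how tagged media objects might be pieced into a package that fits a viewer's interests and time budget.

```python
# Illustrative sketch of object-based media assembly (not the BBC's
# actual system): media "objects" are tagged chunks with durations,
# and a simple selector builds a package for one viewer.

def assemble_package(objects, interests, time_budget):
    """objects: list of dicts with 'id', 'tags' (set), 'duration' (seconds).

    Greedily picks objects matching the viewer's interests, longest
    first, until the time budget is used up.
    """
    package, remaining = [], time_budget
    relevant = [o for o in objects if interests & o["tags"]]
    for obj in sorted(relevant, key=lambda o: o["duration"], reverse=True):
        if obj["duration"] <= remaining:
            package.append(obj["id"])
            remaining -= obj["duration"]
    return package

# Hypothetical objects from a single news bulletin.
clips = [
    {"id": "headline", "tags": {"news"}, "duration": 60},
    {"id": "analysis", "tags": {"news", "economy"}, "duration": 180},
    {"id": "sport", "tags": {"sport"}, "duration": 120},
]
print(assemble_package(clips, interests={"news"}, time_budget=240))
# → ['analysis', 'headline']
```

A production system would weigh far richer signals, but the core idea is the same: the broadcast becomes a query over a pool of tagged objects rather than a fixed edit.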
In an age when video on demand is challenging the dominance of scheduled broadcasting, this is the next stage in its evolution. Not just the content when you want it, but in the format that works best for you. Imagine condensing soap operas to focus on a single storyline, or news reports giving greater levels of analysis when a story directly affects you.
Teaching an old dog new tricks
Traditional broadcasters are losing viewers to entertainment newcomers like Netflix and Amazon. The seemingly limitless budgets of streamers consistently produce appealing, high-quality content, unhampered by the overheads of keeping a TV station running.
Broadcasters would appear to be at a significant disadvantage, but perhaps hold an ace up their sleeve in the form of their archives. AI could be used to breathe new life into old footage – re-cutting TV shows to appeal to modern audiences, and analysing viewing preferences to bring classic productions from yesteryear to new viewers.
There seems to be another must-watch show on Netflix every week, and it would be unreasonable to expect British broadcasters to match this level of output, especially considering how much is invested in news and other public interest broadcasting. Instead, the largely untapped resource of archive shows could help win back eye-time from viewers and also be monetised around the world.
It's true that we're only now discovering the potential for AI in the entertainment space. But one question that is sure to be debated for some time is the creativity conundrum. Yes, a computer can learn how to edit together footage, but can it ever match a human in terms of creative or artistic output?
There are many examples of AI being used for artistic endeavours. However, for the time being at least, any reference points that an algorithm could perceive as ‘artistic’ or ‘creative’ would be based on human output. Therefore, any creative work that a computer system could produce is in some shape or form mimicking what has gone before it.
AI is already being used to write stories, so it's not hard to imagine complete animated features being created without any human input. Whether these will be of high quality remains to be seen, and there will of course be a need for a human to review them before release.
Human oversight is likely to be the dominant theme in relation to how AI is used in the media. For now at least, it’s likely that smart systems will be used to cut out routine tasks like editing, with production teams putting them to work to free up their time for other tasks.
Whichever way you look at it, AI is here to stay. It offers too many possibilities for content producers to ignore. As our viewing habits change, it's the only feasible way to deliver the optimised experience and level of choice that we will demand.
The technology could even open the doors to a new breed of content creators. User-generated content could be given the polish of professional editing or post-production via AI systems – lowering the bar for entry to the media industry and giving viewers even more choice of things to watch.
Now, if only there were a system that could help me choose what to watch.
Daniel Sacchelli, Events Director of BVE