Posted at: Aug 12, 2017, 1:56 AM; last updated: Aug 12, 2017, 1:56 AM (IST)

Blade Runner, autoencoded

The strange film that sums up our fears of AI and the future

Andrew Griffin

Terence Broad’s Blade Runner sometimes looks a lot like the classic 1982 film. Sometimes it looks completely different.

His autoencoded version of Blade Runner is the film as a computer sees it — or, more specifically, as a computer sees it, remembers it, and then regurgitates it.

The film is being shown as part of the Barbican’s science fiction exhibition-meets-festival, Into The Unknown. And it is perhaps the most cutting-edge of all the work featured there: not only is it about science fiction, it was created in a way that sounds like it comes straight out of the work of Philip K Dick.

Broad’s project works by analogy with memory, and uses cutting-edge artificial intelligence to do so. It relies on an autoencoder: a neural network that encodes a big data sample, in this case individual frames of the film, into a tiny representation of itself, from which it can reconstruct the frame later on. A great deal is lost in the shrinking. But strange things can be found in the reconstruction, too, because the technology makes up for what it can’t remember by filling in the gaps.
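The encode-shrink-decode loop can be sketched in a toy example. This is an assumption-laden illustration of the general idea, not Broad’s actual model (his project used a far larger learned network trained on real Blade Runner frames); here the “frames” are just random 8x8 grayscale images, and the encoder and decoder are single linear layers.

```python
# Toy autoencoder in plain NumPy: compress each "frame" to a tiny code,
# then reconstruct the frame from that code alone.
import numpy as np

rng = np.random.default_rng(0)

n_frames, n_pixels, n_code = 64, 64, 8   # each 64-pixel frame shrinks to 8 numbers
frames = rng.random((n_frames, n_pixels))

# Single linear layers for encoder and decoder; real autoencoders stack
# nonlinear layers, but the shrink-and-reconstruct loop is the same.
W_enc = rng.normal(0.0, 0.1, (n_pixels, n_code))
W_dec = rng.normal(0.0, 0.1, (n_code, n_pixels))

def forward(x):
    code = x @ W_enc       # encode: compress each frame to a tiny representation
    recon = code @ W_dec   # decode: reconstruct the frame from the code
    return code, recon

losses = []
lr = 0.01
for step in range(2000):
    code, recon = forward(frames)
    err = recon - frames
    losses.append(float((err ** 2).mean()))
    # Plain gradient descent on mean squared reconstruction error.
    W_dec -= lr * (code.T @ err) / n_frames
    W_enc -= lr * (frames.T @ (err @ W_dec.T)) / n_frames

print(f"reconstruction error: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Because an 8-number code cannot hold everything in a 64-pixel frame, the reconstruction is necessarily lossy, which is exactly the effect the project exploits: what comes back is what the network found worth keeping, averaged and smoothed over everything it has seen.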

“The reconstructions are in no way perfect, but the project was more of a creative exploration of both the capacity and limitations of this approach,” wrote Broad in an introduction to his work that would later go viral.

In that way, it seems remarkably and uncannily similar to human memory. It shrinks everything down and stores it away, so that it can be opened back up and relived with the gaps filled in at a later date.

And the idea was inspired by strange experiments with humans, too. One of the first inspirations was a talk by scientists who had managed to make an MRI machine reconstruct things that people were looking at, simply by looking at the patterns showing in their brains. It could literally see through other people’s eyes, by looking right into their heads.

But all of those human inspirations and influences are taken and turned into a work that is undeniably technological. If the MRI experiment showed us what people are watching from inside their heads, the autoencoded Blade Runner almost allows us to watch how a computer sees, peering inside its own brain in the same way.

It definitely remembers in a different way to how humans do. It is terrible at recalling and reconstituting faces, for instance, and can’t recognise that the same face belongs to the same person and so should move in a continuous line. It also appears to find it impossible to remember a black frame: because there are so few in the film, it doesn’t store the black at all, and instead reconstructs it as an average of all the other parts of the film, throwing out a beautiful but decidedly unblank green image.

For now the limits of the project are where the interest is found, and the imperfections of the reconstruction make it a work of art. But theoretically computers could eventually become perfect at the work — watching, shrinking and then reconstituting the film as it actually is.

It all sounds eerily like a question that would plague the noir world of Blade Runner. But Blade Runner wasn’t always the aim. The film began as a project for a university course, and required learning techniques that are at the very forefront of AI and visual technology.

“Originally it was really just an experiment; the whole thing started out as a research project,” says Broad. “For a long time I was training it on these videos of really long train journeys. After a couple of months I got really bored and thought it would be interesting to do it with Blade Runner.”

The choice of film happened by a kind of happy coincidence, but it fits the project perfectly because the themes mesh so well.

Blade Runner explores the edge of artificial intelligence, the beginnings of humanity and how to know the difference between the two; the autoencoded Blade Runner does the same thing but with the film itself. “I’d always had the idea [of Blade Runner] in the back of my mind. But I didn’t think it would work. But then as soon as we did it we saw that it obviously should be the sole focus for the project.”

Because the computer processes the film over time, and takes a while to do it, the discovery that the approach would work well revealed itself only gradually.

“When you’re training it, you would give it a batch of images of random frames. Then it would start giving you the output. So I was just looking at this output while it was training.

“I saw this image and saw you could recognise some of the scenes. But this was a really small resolution. So we saw this and then it was like, right we need to kind of do this in order and remake the video.

“Then we did a little 10 minute sample. And it was kind of mind blowing, for me and my supervisor. I’ve got the original 10 minute — it’s really noisy and really grainy. You can see what’s going on and it’s kind of mind-blowing. Then I thought — let’s just remake Blade Runner, the whole thing.”

That decision put the film squarely in the realm of science fiction — a decision that would see it sit among the greatest work of the genre in the Barbican exhibition. Not simply because it took such a seminal science fiction film, but also because it was a kind of science fiction itself, using brand new techniques to reprocess a film in a way that would be unimaginable and inexplicable to people even 10 or 20 years ago.

— The Independent
