Tuesday, 23 June 2015

Asking the Big Questions

Can society ever be fixed? That’s the central question in the two TV shows I’m watching: Humans on Channel 4 and Psycho-Pass on Netflix. In terms of format, the two shows have almost nothing in common: Humans is a British drama set in the not-so-distant future, while Psycho-Pass is a Japanese anime set in a dystopian future.


Humans takes place in a world very much like ours, but with one small addition: there exists a type of Artificial Intelligence called Synths. They look human, they act human, and they do your washing, your shopping, and look after your kids. But it’s OK, because they can’t feel. Psycho-Pass is about a future world controlled by the overarching Sibyl System, where everyone has a Psycho-Pass that measures their mental health and gives them a “Criminal Coefficient”. If yours is above a certain value you are a “latent criminal”: you have the potential to commit a crime and are a risk to society. It’s then law enforcement’s job to neutralise latent criminals before they commit a crime, for the protection of society.

But what happens when you get a Synth that can feel? And what happens when you look too closely at a “perfect system”? Both shows examine how our society works, how we interact with other people, and whether society can ever “be fixed”.

If we introduce Artificial Intelligence that can do everything for us, does it give us more time to be useful, or does it take away our uses? In Humans, Synths initially seem like a good idea: they’re basically servants without the pay or the human flaws. But we soon uncover the unsettling effects of their presence, causing rifts in marriages and people bonding with them like parent and child. And as we follow the journey of a set of Synths who can feel, everything takes a new spin. As Niska, a feeling Synth, yells to a Synth brothel owner, “Everything your men do to me, they want to do to you”. Clearly Synths haven’t fixed society; they’ve just alleviated some of the damaging symptoms. Do the blurred lines between what is human and what we perceive as human matter?

In Psycho-Pass, the idea is that society is already fixed. The Sibyl System is optimising everyone’s happiness: it tells you which job is best for you and how to keep your mental health stable, but most importantly it identifies who is a risk to this perfect society. Is it right to enforce controls upon people who are a threat to society even if they haven’t committed a crime? Is that more important than justice? A quandary presented in the first episode involves victims who are so emotionally traumatised by being attacked that their Psycho-Pass identifies them as a risk. Can we use logical machines to evaluate humanity?


I realise I’ve just asked a lot of questions here, which I guess is why I enjoy these shows so much. I doubt we’re going to get artificial intelligence to the point it reaches in either of these shows, but it’s always good to ask “What if?”.
