[quote_box_left author="From 'The Hitchhiker's Guide to the Galaxy' (Film, 2005)"]Marvin the Paranoid Android: You can blame the Sirius Cybernetics Corporation for making androids with GPP…
Arthur: Um… what’s GPP?
Marvin: Genuine People Personalities. I’m a personality prototype. You can tell, can’t you…?[/quote_box_left]
What sci-fi future is complete without interactive robots? Whether it's Arthur C. Clarke's HAL 9000 or Marvin the Paranoid Android, in art we create a world where human intelligence and artificial intelligence walk side by side.
Meanwhile, in real life, scientists and engineers are building machines that we can talk to, ask questions of, and request actions from. The aim has been to create the most perfect artificial companion possible, but it turns out there is such a thing as too perfect when it comes to human-robot relationships.
A study at the University of Lincoln, UK, shows that people prefer interacting with robots that display human-like imperfections – deviations in judgement that give the appearance of individual characteristics or personality, complete with errors and flaws. The findings were presented at the International Conference on Intelligent Robots and Systems in Hamburg this month.
The researchers introduced two flaws to the robots' bank of rules and behaviours – "misattribution of memory" and "empathy gap". They used two interactive robots: ERWIN, which can express five basic emotions, and Keepon, a small yellow robot designed to study social development by interacting with children.
For around half of the interactions with human participants, ERWIN and Keepon were allowed to be the picture of perfection. For the remainder, ERWIN made mistakes when remembering simple facts, and Keepon expressed extreme happiness or sadness.
Participants rated their experiences, and almost all had a more enjoyable interaction with the robots when they made mistakes.
If these companion robots are to make an impact in elderly care or in supporting children with autism, Asperger syndrome, or attachment disorder – the main applications of this technology – then this study shows they need to be friendly, recognise the emotions and needs of their human companions, and act accordingly.
Featured Image courtesy of University of Lincoln.