Amazon Echo And Google Home Should Have Morality Software
Boffins believe that smart assistants like Amazon's Echo and Google's Home should come with a "moral AI" to report their owners for violations.
Academics at the University of Bergen, Norway, touted the idea at the ACM conference on Artificial Intelligence, Ethics and Society in Hawaii.
Marija Slavkovik, associate professor in the department of information science and media studies, led the research.
Leon van der Torre and Beishui Liao, professors at the University of Luxembourg and Zhejiang University respectively, also took part.
Dr Slavkovik suggested that digital assistants should possess an ethical awareness that simultaneously represents both the owner and the authorities - or, in the case of a minor, their parents.
Devices would then have an internal 'discussion' about suspect behaviour, weighing up conflicting demands between the law and personal freedoms, before arriving at the 'best' course of action.
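As a rough illustration of how such an internal 'discussion' might work, here is a minimal sketch in Python. The stakeholder names, candidate actions, and weights are entirely illustrative assumptions of mine, not the researchers' actual model.

```python
# Hypothetical sketch: each stakeholder (owner, authorities, parent)
# renders a verdict on an observed behaviour with some weight, and the
# assistant picks the action backed by the most total weight.
# All names and numbers below are illustrative, not from the paper.

def arbitrate(verdicts):
    """Return the action with the highest summed weight.

    verdicts: list of (stakeholder, action, weight) tuples.
    """
    totals = {}
    for stakeholder, action, weight in verdicts:
        totals[action] = totals.get(action, 0.0) + weight
    return max(totals, key=totals.get)

# Example: the owner's privacy interest conflicts with a legal duty to report.
verdicts = [
    ("owner", "ignore", 0.6),
    ("authorities", "report", 0.8),
    ("parent", "warn", 0.3),
]
print(arbitrate(verdicts))  # -> report
```

In a real system the weighing would presumably be far more nuanced (the paper frames it as an argumentation problem between conflicting norms), but the core idea of aggregating conflicting stakeholder demands into a single 'best' action is the same.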
I'm sure that many of you recall the Morality Device from the 1993 movie Demolition Man starring Sylvester Stallone, Sandra Bullock and Wesley Snipes.
(Verbal morality device from Demolition Man 1993)
This device should be easy as pie to implement, since both Google and Amazon have your credit card information; any lapses could be debited automatically.