The Prey of Gods (2017): In theory, robots can hold grudges, and in this book, boy do they!
Robots: service ‘bots, human-robot interaction, emotions, software architecture
Recommendation: Take a break from reading dark, ponderous sci-fi and enjoy a sci-fi/fantasy hybrid that has as many memorable characters and funny scenes as Avengers: Infinity War. And remember to be nice to robots!
A book about viral infections, genetic engineering, robots, and gender bending sounds like Autonomous, right? What if I told you it was also about South African gods and their descendants in Cape Town? Now it sounds like American Gods or Anansi Boys. Except it isn't Newitz or Gaiman, it's Nicky Drayden and her fun romp The Prey of Gods.
In a near future, everyone except the poorest people has a personal assistant robot: a one-meter-high, multi-legged robot that functions as a smartphone, laptop computer, and backpack. The cheapest versions are the durable alphas. Like the minions in Despicable Me, the mono-camera alphas just want to do a good job for their masters. Of course, teenage boys being teenage boys, they may be either too sentimental about their bots or a bit rough and demanding, either of which could impact saving the world.
One of the many points of view juggled in The Prey of Gods is that of the alpha Clever 4-1. Clever 4-1 is the classic sidekick: it's loyal to a fault, highly ethical, and contributes humor and pathos to the main storyline. Clever 4-1 isn't the first loyal sidekick robot; that honor goes to Robby the Robot in Forbidden Planet (1956). And we've certainly already seen realistic robots who have bonded with their masters: Teddy in AI: Artificial Intelligence and Robot in Robot & Frank.
While Clever 4-1 is the loyal sidekick, Clever 4-1-1 is the “hell hath no fury like a robot scorned” bot, an antagonist to Clever 4-1’s protagonist.
So would a robot keep score and hold grudges? Well, it is plausible. One approach to building intelligent robots that has been explored since the 1980s is to embed emotions into the basic software architecture (see software architectures in my review of Sea of Rust). Emotions in biological systems are simple mechanisms for regulating control without having to think about it (more technically, reacting versus deliberating). If a behavior is not working or achieving its goals, it generates what we'd call frustration, or a negative valence. As the frustration builds, the animal generally tries the same thing harder. It might adjust the gains and parameters on that behavior and on other behaviors, like growling at nearby relatives who could be distracting it. Note that none of this so far has required any reasoning, just reacting to stimuli, which is why emotions are so hard for us to control: they really are primal.

If the frustration continues, the animal may abort the unsuccessful behavior and try something else, an alternative or redundant behavior. This requires a very low level of deliberation, usually just the "implementation" or scheduling function. If the alternative doesn't work, then the animal either has to abandon the goal (e.g., times out) or reason about what to do. In summary, emotions are not something that can be successfully bolted onto a robot at the very end, as part of its interactive layer, to fake being more socially responsive.
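The escalation described above (retry, try harder, switch behaviors, abandon the goal) can be sketched as a tiny control loop. This is a minimal illustration, not any particular architecture from the literature; the class name, thresholds, and gain multiplier are all hypothetical choices:

```python
class FrustrationBehavior:
    """Sketch of emotion-as-regulation: frustration is a scalar that
    builds on failure, first boosting effort (gains), then triggering
    a switch to a redundant behavior, then abandoning the goal.
    All thresholds are hypothetical."""

    TRY_HARDER = 3   # frustration level at which gains are boosted
    SWITCH = 6       # level at which the behavior is aborted and swapped
    GIVE_UP = 9      # level at which the goal is abandoned (times out)

    def __init__(self, behaviors):
        self.behaviors = behaviors   # ordered list of behavior names
        self.active = 0              # index of the currently scheduled behavior
        self.frustration = 0
        self.gain = 1.0

    def step(self, succeeded):
        """React to one success/failure signal; return the resulting state."""
        if succeeded:
            self.frustration = 0     # success dissipates the negative valence
            self.gain = 1.0
            return "content"
        self.frustration += 1        # negative valence accumulates
        if self.frustration >= self.GIVE_UP:
            return "abandon goal"    # hand off to deliberation
        if self.frustration >= self.SWITCH:
            # low-level deliberation: just rescheduling, no reasoning
            self.active = (self.active + 1) % len(self.behaviors)
            return f"switch to {self.behaviors[self.active]}"
        if self.frustration >= self.TRY_HARDER:
            self.gain *= 1.5         # try the same thing, harder
            return "try harder"
        return "retry"
```

Running it with nothing but failure signals walks through exactly the escalation in the paragraph: a few retries, then harder retries, then a behavior switch, then giving up. Note that only the switch step involves any deliberation at all; everything else is pure reaction.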
That explains emotions, but what about grudges? If the robot is programmed to associate success or failure with reaching goals, executing behaviors, and any of the parameters of the behavior used to reach that goal, it may learn that the behavior doesn't work well in the presence of X or at location L. X and L are now something to be avoided; they begin to acquire a negative semantic association. The robot could appear to be scared of areas of the office that have large glass walls, because over time it has learned that the glass interferes with correct interpretation of its sensors. Or, more likely, it has just learned that it doesn't work as well in that situation, without generating a reason why; that would be the purview of an advanced deliberative set of capabilities called fault detection, identification, and recovery.
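That learned aversion can be sketched as a simple valence table keyed by context. Again, this is an illustrative toy, assuming a made-up class and an exponential moving average as the learning rule; nothing here reasons about *why* a location is bad, it just accumulates association:

```python
from collections import defaultdict

class ValenceMap:
    """Sketch: associate a behavior's success/failure with its context
    (here, a location tag). Repeated failures give a location a negative
    valence, and the robot starts avoiding it -- which looks like fear,
    with no diagnosis of the underlying cause. Parameters are hypothetical."""

    def __init__(self, avoid_below=-0.5):
        self.valence = defaultdict(float)   # location -> running valence in [-1, 1]
        self.avoid_below = avoid_below      # threshold for avoidance

    def record(self, location, succeeded, rate=0.2):
        """Nudge the valence toward +1 on success, -1 on failure."""
        target = 1.0 if succeeded else -1.0
        self.valence[location] += rate * (target - self.valence[location])

    def avoids(self, location):
        """The 'grudge': is this location's association bad enough to avoid?"""
        return self.valence[location] < self.avoid_below
```

After a handful of failures near the glass wall, `avoids("glass_wall")` flips to true, while locations where the behavior keeps succeeding stay attractive. That asymmetry between contexts is all a "grudge" needs to be at this level.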
So it is entirely plausible that a robot could prefer certain people and avoid others, just as people prefer some books to others. If you feel you have read one too many books with multiple points of view and threads that seem disconnected but will probably converge, then you probably won't like The Prey of Gods, and you'd be better off reading a more linearly plotted book. If you like imaginative genre mash-ups, then The Prey of Gods is for you. By the way, the Audible version is spectacular, with the narrator capturing the different accents of the characters and even providing a non-annoying robot voice.
A piece of advice: whether or not you read The Prey of Gods, be nice to your robot sidekick!
- Robin
For a video review of 'The Prey of Gods', head over to the official RTSF YouTube channel or simply click below...