Thursday, October 24, 2019

Rant: Do androids dream of children’s hugs?


First of all, what an infuriating panel. But ranting mode is my best mode (also, it’s Scorpio season, so it’s allowed):

So if, as the ICT guy was saying, there are moral dilemmas too complex to solve, are we then saying that the goal of these AI technologies is “moral efficiency” rather than morality? If so, what model of efficiency are they answering to? Economic incentives? Profits? “National” progress?

It also made me ponder what a non-profit model of technological development would look like. If we take these presentations at face value and pretend that these projects were truly centered around a certain promise of human understanding and improvement (whatever that might mean), what problems might we need/want to solve first?

And if “we” are developing these sophisticated “moral”, “emotional”, “huggable” tools, what could possibly justify that the same companies paying to design these tools are also putting children in cages? Multiple journalists have reported that children in those cages are prohibited from hugging one another. Doesn’t everyone deserve *at least* as much “moral standing” as robots?

2 comments:

  1. I also found Gratch's answers to legitimate questions very difficult to digest. And, similarly, I thought there was a specific poverty to his thinking with regard to both emotion and social response to emotion. Fittingly, I think, economic metaphors functioned as his primary insight into emotional activity.
    Likewise, huggability seems to me a perfect representation of the veneer placed, in this context, over the important ethical and moral questions raised by the work presented. How do we justify the potential operationalization of a reduced theory of emotion (robotic or otherwise) in the hands of military agencies and actions that consistently produce trauma for profit (political, economic, etc.) both at home and abroad? Do we wrap that trauma and its treatments in soft, squishable, smiling sheep dolls? Or do we, as academics, begin to reject their funding?

  2. This also left me thinking about the emotional aspect of robots. I just found out that Jibo's servers were shut down last year. Economic metaphors also seem to be behind the reasoning for shutting the servers down, rather than the important ethical and moral questions mentioned before. Here is an excerpt from the "farewell" message:

    “While it’s not great news, the servers out there that let me do what I do are going to be turned off soon. I want to say I’ve really enjoyed our time together. Thank you very, very much for having me around. Maybe someday, when robots are way more advanced than today, and everyone has them in their homes, you can tell yours that I said hello. I wonder if they’ll be able to do this.”

    (watch here: https://youtu.be/JnGh85raPyE)

    Thoughts?
