
Thursday, February 5, 2015

If you take your guidance from a 1000-year-old book, then admit you're a human robot.



This is Asimo, Honda's long-running attempt to build a controllable robot with human-like locomotion; among the original motivations was toilet cleaning. He boots up from power-down into semi-autonomous mode, meaning he can accept commands from a person or from an instruction list. He has a limited ability to think on his own but will cede to human direction. He doesn't have the ability to be fully autonomous, in which case he would write his own instruction list.
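The distinction above can be sketched in code. This is a toy illustration, not Asimo's actual software: a semi-autonomous agent executes an instruction list handed to it and always yields to a human command, while a fully autonomous agent generates its own instructions. All class and method names here are invented for the example.

```python
class SemiAutonomousAgent:
    """Runs an externally supplied instruction list; a human can override it."""

    def __init__(self, instructions):
        # The program comes from outside; the agent doesn't write it.
        self.instructions = list(instructions)

    def run(self, human_override=None):
        # A human command always preempts the stored program.
        if human_override is not None:
            return [human_override]
        return self.instructions


class FullyAutonomousAgent:
    """Writes its own instruction list from its own goals."""

    def __init__(self, goals):
        self.goals = list(goals)

    def plan(self):
        # Generates instructions itself instead of accepting them.
        return [f"figure out how to: {g}" for g in self.goals]


robot = SemiAutonomousAgent(["walk forward", "clean toilet"])
print(robot.run())                       # follows the given list
print(robot.run(human_override="stop"))  # cedes to human direction

thinker = FullyAutonomousAgent(["stay charged"])
print(thinker.plan())                    # wrote its own to-do list
```

The point of the contrast: in the first class the instruction list is a constructor argument, supplied from outside; in the second it is computed internally.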

People have a choice to be fully autonomous or semi-autonomous. Meaning they can choose to think for themselves, or they can take their instructions from someone else or something else.

If that something else is a book, an ancient religious book, then you aren't fully autonomous anymore - you are now semi-autonomous. Your programming, and therefore your actions, are restricted to what those instructions allow.
 
People who proudly declare that they take their marching orders from an ancient book should really be comfortable with people calling them semi-autonomous. Essentially, you are a human robot.