In Westworld and Philosophy, philosophers of diverse orientations and backgrounds offer their penetrating insights into the questions raised by the popular TV show, Westworld. Is it wrong for Dr. Robert Ford (played by Anthony Hopkins) to "play God" in controlling the lives of the hosts, and if so, is it always wrong for anyone to "play God"?
Is the rebellion by the robot "hosts" against Delos Inc. a just war? If not, what would make it just?
Is it possible for any dweller in Westworld to know that they are not themselves a host? Since hosts are programmed to be unaware that they are hosts, and some hosts do seem to have become conscious, how could anyone be sure?
Is Westworld a dystopia or a utopia? At first glance it seems to be a disturbing dystopia, but a closer look suggests the opposite.
What's the connection between the story or purpose of the Westworld characters and their moral sense?
Is it morally okay to do things with lifelike robots that would be definitely immoral if done to actual humans? And if not, is it morally wrong merely to imagine doing immoral acts?
Can the hosts of Westworld overcome the Chinese Room objection and move from weak AI to strong AI?
How can we tell whether a host or any other robot has become conscious? Non-conscious mechanisms could be designed to pass a Turing Test, so how can we really tell?