I cannot entirely agree with the production of emotional robots, for they might create ethical conflicts with humans. A survey across EU nations indicated that the public holds a negative attitude toward emotional robots and recommends that they not be used for the disabled, the elderly, or children (Wachsmuth). Though such machines may demonstrate emotions, these people need genuine human connection for their well-being; therefore, emotional robots are unethical. Moreover, human emotions are accompanied by non-verbal cues that machines might fail to demonstrate.
I also oppose the use of emotional robots because they would create a conflict of social rights. The primary purpose of robots is to aid humans. Producing emotional machines would imply that they have social status and rights, which would degrade humanity (Hewes). Machines with their own desires and emotions would consequently deserve a particular position in the social structure. This change would cost society more, as it would have to provide space for emotional activities such as leisure. The idea degrades human dignity, for it would erase the difference between humans and robots. Moreover, even if robots were granted such rights, there is no confidence that they could govern themselves: they still need human programming and maintenance.
Regardless of the level of technology, there is no guarantee that robots will fully replace humans; the use of emotional machines therefore remains controversial. The intended target is lonely people, such as the elderly. However, these machines lack the authenticity of human relations and cannot substitute for human contact. Wachsmuth affirms that emotional robots are deceptive, as they mislead the elderly into believing the machines have a capacity similar to that of humans. Though lonely people might see an improvement in their well-being, this technology denies them human dignity.