The use of social robots in mental health: What are the ethical issues?


What are social robots?

In simple terms, social robots are Artificial Intelligence (AI) based robotic systems designed to support humans through social interaction, conversation, and mimicry of human behavior. They can also provide emotional companionship to people who feel lonely or depressed.

Additionally, Socially Assistive Robots (SARs) are clinical instruments used in healthcare to assist professionals, enable better treatment of patients, improve functional outcomes, and promote greater social engagement in mental health care. Research has shown that SARs have clinically positive effects on patients' mental health, including stress reduction, reduced aggression, increased patience, and greater feelings of contentment.

A major advantage of SARs is that they can interact with humans both in public settings such as offices, schools, shopping malls, and restaurants, and in personal settings like households and healthcare facilities. Building on this, experienced mental health professionals around the world have run several experiments with SARs. These studies suggest that social robots can readily provide companionship to people, listen to their emotions, and mirror their feelings.

Contributions to mental health support

Social robots can offer substantial support to mental health professionals in many ways. Perhaps the biggest is that they could help reduce the drawbacks of face-to-face therapy, such as the perceived judgment and stigma associated with seeking mental health assistance. Well-designed robots can be free of judgment or bias, creating a more welcoming setting for counseling and conversation.

In addition, these robots can support patients through pre-programmed, self-guided mental health tracking and risk monitoring, providing an alert-based system for both patients and professionals. They also make mental health assistance available to the client 24/7 and give the professional round-the-clock access to the patient. To bridge the gap between patient and counselor, social robots can monitor a patient's progress, suggest courses of action in an emergency, analyze data to help counselors understand their clients at a deeper level, and alert human professionals about any concerns.

AI-powered mental health chatbots are also gaining popularity. These virtual agents provide a cost-effective, 24/7 conversational experience and make it easier for patients to share things they might find difficult to communicate to a therapist during an in-person appointment.

Ethical considerations while using social robots in mental health

Social robots can be a valuable aid in managing mental health professionals' workloads, but there are ethical considerations to take into account when incorporating this technology into mental health support systems. One of the biggest concerns is data privacy, since these systems can create a sense of constant monitoring in people's day-to-day lives. Additionally, because these robots are not regulated by law as medical devices, mental health professionals may face legal concerns when using them with their patients.

There are also further ethical questions, such as: who is in command of, and accountable for, the actions of the robot? Responsibility could fall on many parties: the mental health practitioner, the robot manufacturer, or everyone involved in the process of deploying the robot.

Other considerations that decision-makers should weigh include distinguishing human from non-human interaction during treatment, patients' emotional attachment to the robot, the substitution of human participation by robots, and responsibility for final decision making.

In the end…

The use of social robots in mental health care could prove to be a breakthrough in the fight against a growing global mental health epidemic, and many opportunities lie in this domain. But with few global studies, small sample sizes, limited deployments, narrow target groups, and limited governmental approval for the use of social robots in mental healthcare, they still have a long way to go.

To make social robots more emotionally intelligent and user-friendly, engineers have been working hard to develop innovative social robots that overcome these limitations and address these ethical considerations. One example is MARCo, the Mental Health Assisting Robotic Companion. This robot was developed with the aim of combining emotional intelligence and artificial intelligence to address a range of global mental health issues.

Jacob Boyle serves as CEO and CTO of MARCo Technologies. As CEO, his primary responsibilities include defining the company's direction, managing contractors and employees, and spearheading outreach and customer discovery. As CTO, he has led the development of MARCo: designing and producing the existing MARCo units, coordinating with manufacturers, developing software for the online edition, and overseeing continued R&D. He also served as the Technical Lead in the National Science Foundation's I-Corps accelerator program on behalf of the company.
