The »Care Bot« for the Un-Caring Social Media Landscape

Caroline Sinders / Berlin, Germany — Oct 23, 2019

Caroline Sinders, Engineering Care

Concept Text:

For nearly eight years, I’ve been studying online harassment: how systems are designed in ways that often allow for harassment, what support exists for victims, and how law and policy interpret harassment. Part of this work involves victim care and acting as a human interface who deciphers these systems (social networks) and their often confusing and baroque bureaucracies.

For the Web Residencies, I want to make a care bot: a chat bot designed to respond to questions and care for the victim, accompanied by a series of open-source best practices and guides on what to do when facing different kinds of online harassment. This bot is inspired by my art project Social Media Break Up Coordinator and is, in part, a continuation of that project. Social Media Break Up Coordinator was itself a response to the complex feelings and anxieties that everyday social media users have about the often weird, strange, anxiety-inducing, and traumatic interactions we have online. The online world is as real as the offline world, and the kinds of small, medium, and large traumas we encounter in our day-to-day lives exist online as well.

What of the friend who ghosted you suddenly appearing in Twitter’s algorithmically suggested »who to follow«, or the partner who was abusive reappearing on Facebook, or the strange coworker commenting on your Instagram? These traumas are quite real, even in the online space. Social Media Break Up Coordinator was created as a kind of special listening, emotional Task Rabbit. With healthcare in the US being prohibitively expensive, including mental-health care, and with mental health practitioners perhaps not quite understanding online life, Social Media Break Up Coordinator is part expert on social networks, part listener and advice giver.

The bot isn’t designed to be used in place of a therapist or a security expert; instead, the »Care Bot« is an artistic intervention showing exactly how painful and uncaring the policies and procedures are for victims reporting and mitigating harassment. Part of the Care Bot’s dialogue will reference some of the realities I’ve studied by showing the emotional negotiations victims have to make to try to preserve their safety, which often means removing, pausing, or shutting down their social media accounts. The dialogue of the bot itself is designed to reflect these intense emotional hardships, such as »even in reporting online harassment it still may not be viewed as harassment, but I believe you and I’m sorry this is our system«, or »let’s focus on what you need to feel and be safe«, or »sometimes going private can help lessen the harassment«. Some of the dialogue will be suggestions and snippets of policy to highlight where social media policy fails.

The bot will often state its intentions: it’s a bot, not a therapist; it can try to help answer questions, show links to resources, and acknowledge the predicaments social media has put us in. As the Care Bot would say, »you deserve better than what Facebook has given to you.«
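
To give a rough sense of how a scripted bot like this might work under the hood, here is a minimal, hypothetical sketch in Python of a rule-based dialogue loop. The intent keywords, reply texts, and resource link are illustrative assumptions drawn from the concept above, not the Care Bot’s actual implementation.

# A minimal, hypothetical sketch of a rule-based care-bot dialogue loop.
# The intent keywords, replies, and resource link below are illustrative
# assumptions, not the actual Care Bot implementation.

RESPONSES = {
    "report": (
        "Even in reporting online harassment it still may not be viewed as "
        "harassment, but I believe you and I'm sorry this is our system."
    ),
    "safety": "Let's focus on what you need to feel and be safe.",
    "privacy": "Sometimes going private can help lessen the harassment.",
}

# Example resource; a real bot would point to vetted guides and platform policies.
RESOURCES = {
    "report": ["https://onlineharassmentfieldmanual.pen.org/"],
}

INTENT_KEYWORDS = {
    "report": ["report", "reported", "flag"],
    "safety": ["safe", "scared", "afraid", "threat"],
    "privacy": ["private", "lock", "account"],
}


def classify(message: str) -> str:
    """Very simple keyword matching; a real bot would use better intent detection."""
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(word in text for word in keywords):
            return intent
    return "default"


def reply(message: str) -> str:
    intent = classify(message)
    if intent == "default":
        # The bot states what it is and is not.
        return (
            "I'm a bot, not a therapist. I can try to answer questions, "
            "share resources, and acknowledge the predicament social media "
            "has put us in. You deserve better than what Facebook has given to you."
        )
    lines = [RESPONSES[intent]]
    lines.extend(RESOURCES.get(intent, []))
    return "\n".join(lines)


if __name__ == "__main__":
    print(reply("I reported the account but nothing happened"))

Run with a message like »I reported the account but nothing happened«, this sketch would return the reporting line above plus a resource link; the point is the scripted, policy-aware tone, not the (deliberately simple) keyword matching.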
