About Care B0t

This project was written and designed by Caroline Sinders and programmed by Alex Fefegha of Comuzi.

This project came out of a specific place of studying harassment online, but also of thinking about how far technology will take us.

The artistic call that produced this work was "Engineering Care", and it focused on what the future of outsourcing or automating care could look like.

The Care B0t was created to serve a purpose: a quick and easy way to provide actionable feedback when platforms offer so little support for victims.

The project itself is a bit tongue-in-cheek, though: an automated care bot or chatbot should not be the entity a platform creates to interface with victims of online harassment. It really should be humans, or a human team, alongside better digital tools that help create safety for victims.

Those tools could include better ways to report harassment (such as grouping reports together, adding to a report, or being able to contest the outcome of a report), better privacy modes (beyond just going private on your account), and ways to moderate your own content (such as allowing a user to moderate or remove comments in replies to their posts, or to turn off comments entirely).

There are so many more kinds of tools platforms could create to help victims. This bot exists to be helpful, but as the artist, I want to highlight that this bot shouldn't have to exist.

Help should be easy for victims to find; it should be accessible and provided by platforms. Help shouldn't be outsourced or automated.

But in a time when help is expensive or unscalable, the future of victim support may be mechanized; it may be thought of only in terms of scale and cost. Thus help, which is so necessary, may only be created if it's cheap and scalable.

The drive toward cheapness, affordability, and scalability of online content is what put platforms in this predicament in the first place.

We see this directly with content moderation: it's cheap, and it must be quick in order to scale to the volume of content being reviewed.

But this is precisely the problem. Scaling up help, like victim support, or moderating and analyzing content, shouldn't necessarily be cheap. Human problems need human solutions.

In the words of Care B0t, "these systems are broken, and it's not users' faults."


If you're facing online harassment, please check out this guide put together by Amnesty International on staying safe.

Talk with Care B0t