How might a CUI assist with coping mechanisms for users with mental disabilities such as depression?

This was a small study I did to explore the connection between a human and an AI. My goal was to prioritize the inhuman aspects of an AI.

For people experiencing depression, reaching out can be hard even when they want to. What if a conversational user interface stepped in to help them break out of that cycle?

To create an extreme hypothetical situation, the AI’s presence is indicated only by the LED on the device. The system is designed to be as far from human as possible. It has no “form” and does not reside in any single device; it is ominous. It has no feelings, only a goal to accomplish.

But the question remains: can we, as users, stop humanizing the AI?

Audio trouble with the video; it will be uploaded soon.

CUI States