Welcome to the tenth edition of Reality Test! Woohoo! I’m out in California pretending it’s still summer, so today’s newsletter is a quickie.
This week, via Casey Lewis’s After School, I learned about Manifest, one of the many tech companies trying to “combat loneliness.” How it works, per TechCrunch:
“When you open the Manifest app, you’ll see a pastel gradient orb in the center of the screen. You can hold the button to talk, or tap it to type, in response to a number of prompts: “What’s on your mind?,” “What are you worried about?,” or “What would be useful for us to talk about?”
Then, the app’s AI will mirror your language and turn it into an affirmation, which you can turn into a personalized audio meditation.”
Nothing like an AI-generated affirmation to cure the old mental health. Per the founder, a 20-something Stanford grad:
“‘Gen Z is hanging out way less in person,’ she said. ‘So it’s like, what do you give a generation that we’ve already done this to? Like, the idea that you tell that person to go outside and hang with their friend is an astronomical leap for them, so how do you go and give them something where they’re already at?’”
Meeting people where they are is a core tenet of social work. But this logic strikes me as justification for making tech the solution to a problem that is deeply human.
I’m reminded of Kate Lindsay’s excellent piece in Embedded earlier this summer about a similar AI-enabled piece of wearable tech designed to treat the “loneliness epidemic”:
“But the most annoying thing about Friend, ultimately, is that a tool like that simply cannot make us less lonely. Let’s say everyone buys one. Okay, cool—we now all have another personal device that’s with us at all times, and that we pay more attention to than the people around us. It does nothing for our relationships with other people, besides isolating us further. And people are exhausted with this. Case in point: Meta just shut down its celebrity AI chatbots due to lack of use. With any of these supposedly revolutionary tools, no matter how familiar the character they’re replicating or how colloquially the bot speaks, the human on the other end remains the same: alone.”
I’d offer one more thought here: Loneliness is an inescapable part of the human condition. Not necessarily so with chronic loneliness or isolation. But we all feel alone sometimes. It sucks, and it is hard. I see it a lot in my clinical work.
My worry is that, with every quick hit of validation from an app like Manifest, we deprive ourselves of the opportunity to build some tolerance for the uncomfortable feeling of loneliness. We can’t abolish loneliness, but maybe we can change our relationship to it. For example, it’s fine to need some quick reassurance from outside yourself when you undermine your own accomplishment (we all do!), but you will likely also feel inadequate about something else later. That feeling of inadequacy, more often than not, is part of a pattern, not a one-time response to a particular situation. And trying to app that feeling away every time is a losing battle.
I don’t want to understate the structural issues that contribute to the chronic loneliness so many of us feel at this challenging point in our history, or the negative impacts that can result: One in three Americans reports feeling lonely once a week, and chronic loneliness is linked to a number of mental and physical health issues. We need social interaction to thrive.
At the same time, therapy isn’t about getting rid of negative emotions entirely. It’s about learning to live with them and work through them without allowing them to steal our power and agency.
Sure, telling young people to go outside and hang out with their friends may feel like an “astronomical leap.” Telling anyone to do anything often does. But if we just hand them an orb to tap when they feel down, without understanding why that leap is so big for them as individuals, we don’t solve anything. It’s just another flimsy band-aid.
The warlike language of “combating” loneliness casts this feeling as an enemy, something to be eliminated through force. But perhaps we can find a more sustainable and forgiving paradigm if we learn to work with and through our loneliness, accepting that the feeling comes and goes, but it doesn’t have to last forever. And inside of that loneliness, there might even be some patience, self-knowledge, or compassion to be found.
…
This week in toeing the ethical (on)line
I’ve been thinking a lot since launching this newsletter about being Online while being a therapist, and how to manage it all responsibly while maintaining the freedom to express myself as a regular person, too. One thing I know: Posting details of sessions for Content gives me the ethical ick. Clinicians who are reading, tell me your thoughts.
One more thought on that: I’ve been thinking a lot about the line between sharing clinical vignettes to illustrate a point and posting details from sessions for content. I find the former genuinely helpful in the clinical literature I read and would like to write that way, too; but I wouldn’t want to veer into the latter, because ick. Where exactly is the line, though? Maybe what’s off-putting about that kind of post is the disdainful tone toward the session content, and how hurtful it would be to the client if they saw it. It has a bit of a mean girl vibe. No one wants to imagine their therapist having that reaction to their session (although, personally, I think gallows humor and clinicians griping privately to each other in supervision about annoying things their clients do is a necessary release valve against burnout).

Do you use any mental health apps? What do you think? Do you have an Online Therapist?? As always, hit me up with your thoughts and suggestions. Thanks for reading! I appreciate you!