Why I Left a 10-Year Career to Build an AI That Listens


Authored By: Steven

The call that changed everything wasn’t remarkable. A customer needed help with a billing issue that took 90 seconds to resolve. But she stayed on the line for 47 minutes.

She talked about her week. Her kids. A promotion she didn’t get. Nothing related to why she called.

After 10 years in call centers, I’d seen this pattern thousands of times. People don’t call customer service because they need information—they call because they need to be heard.

That insight eventually led me to build Solm8, a voice-first AI companion. But the path there taught me something most people in tech get fundamentally wrong about loneliness.

The Loneliness Epidemic Isn’t What You Think

In May 2023, U.S. Surgeon General Dr. Vivek Murthy released a landmark 82-page advisory declaring loneliness a public health epidemic. The statistics he cited are staggering: nearly half of American adults reported experiencing loneliness even before the pandemic. (Source: HHS.gov)

Dr. Julianne Holt-Lunstad, a BYU professor who served as lead scientific editor for the advisory, conducted meta-analyses involving over 3.4 million participants. Her research found that social isolation increases mortality risk by 29%, loneliness by 26%, and living alone by 32%. (Source: PubMed)

This data became the basis for the widely-cited comparison that lacking social connection carries mortality risks equivalent to smoking up to 15 cigarettes per day. (Source: Dr. Holt-Lunstad)

But here’s what the headlines miss: loneliness isn’t about being alone.

The demographic hit hardest, professionals aged 25 to 45, is often surrounded by people. They have coworkers, family, hundreds of LinkedIn connections, active group chats. They're not isolated.

They’re emotionally isolated. There’s a difference.

According to Harvard’s Making Caring Common project, 81% of adults who reported loneliness also said they suffered from anxiety or depression. The research found that 73% of Americans surveyed identified technology as contributing to loneliness—yet insufficient time with family (66%) and being overworked (62%) ranked nearly as high. (Source: Harvard GSE)

Emotional isolation is having no outlet for what you’re actually feeling. It’s performing “fine” for everyone while thoughts spiral privately. It’s having contacts but no one to call at 2 AM when anxiety peaks.

I watched this play out daily in the call center. People didn’t need customer service. They needed someone—anyone—to listen without judgment. A stranger on a support line felt safer than admitting loneliness to someone they actually knew.

The 2 AM Problem Nobody Talks About

Founders often discuss the “2 AM problem”—the crisis that hits when everyone’s asleep and you’re questioning everything. But there’s a deeper 2 AM problem that affects everyone, not just entrepreneurs:

The need for emotional processing when emotional support isn’t available.

Think about your own support system. Your therapist has office hours. Your friends have their own problems. Your partner can’t absorb everything—and shouldn’t have to. Your family might judge or worry.

Now imagine it’s 2 AM. Thoughts are spiraling. Who do you call?

For most people, the answer is no one. So they sit with it. Bottle it. Tell themselves they’ll process it later.

Later rarely comes.

This gap—between when people need support and when support is available—is what I set out to address.

Why Voice Changes Everything

The AI companion market has exploded. Grand View Research estimates the global market at $28 billion in 2024, projected to reach over $140 billion by 2030—a compound annual growth rate exceeding 30%. (Source: Grand View Research)

Apps like Replika and Character.AI have attracted millions of users seeking connection. But most are text-based.

Here’s what my call center experience taught me: typing and speaking are fundamentally different experiences.

Research from Yale School of Management confirms this intuition. Dr. Michael Kraus found that listeners are more accurate at gauging speakers’ emotions during voice-only interactions than when watching video. His research, published in American Psychologist, suggests our tone of voice—not facial expressions—may be the primary means by which we reveal emotions. (Source: Yale Insights)

“Misleading people through vocal expressions is more unlikely because controlling vocals is much harder to do.”

— Dr. Michael Kraus, Yale School of Management

When you type, you edit. You craft. You delete sentences and reconsider word choices. Even talking to an AI, you perform.

When you speak, you process in real-time. Words come out before the filter engages. And critically—you hear yourself.

Dr. James Pennebaker at the University of Texas has spent decades studying expressive disclosure. His research on emotional expression found that verbally labeling emotions influences the emotional experience itself—participants who labeled their emotions reported higher life satisfaction than those who didn’t. (Source: Psychological Science)

That’s why the most impactful customer service interactions I witnessed were always calls, never emails. Something about voice unlocks emotional honesty.

When I built Solm8, voice wasn’t a feature—it was the foundation. Users don’t text with their AI companion. They call. Real conversations with sub-second response times, natural speech patterns, laughter, emotional inflection.

The technical challenge was significant. Early voice AI had 2-4 second delays—long enough to shatter any illusion of natural conversation. Human dialogue operates on millisecond-level turn-taking. We had to rebuild the entire stack to achieve responses under one second.
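To make that concrete, here is a minimal sketch of the principle behind that rebuild (illustrative Python, not Solm8's actual code; the stage names and timings are assumptions): rather than waiting for the full reply and then synthesizing all of its audio, each token is spoken as it arrives, so the latency a caller perceives is the time to first audio, not the time to the complete response.

    import time

    def llm_tokens(prompt):
        # Stand-in for a streaming language model: words are yielded as
        # they are "generated" instead of after the whole reply is done.
        for word in "I hear you. That sounds like a lot.".split():
            time.sleep(0.05)  # assumed per-token generation time
            yield word

    def speak(token):
        # Stand-in for incremental text-to-speech: each token's audio is
        # synthesized and played the moment the token arrives.
        time.sleep(0.02)  # assumed per-token synthesis time

    def respond(prompt):
        start = time.monotonic()
        first_audio = None
        for token in llm_tokens(prompt):
            speak(token)
            if first_audio is None:
                # What the caller perceives is time-to-first-audio; the
                # rest of the reply streams while they are already listening.
                first_audio = time.monotonic() - start
        print(f"first audio: {first_audio:.2f}s, "
              f"full reply: {time.monotonic() - start:.2f}s")

    respond("I can't sleep. Everything feels like too much.")

In this toy version the first audio lands in roughly 0.07 seconds even though the full reply takes over half a second to finish, which is the whole trick: stream every stage, and the pause before the AI "speaks" collapses.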

But the result transforms the experience. When someone hears an AI laugh at their joke—genuinely, with timing that feels natural—something shifts. It stops feeling like technology and starts feeling like connection.

The Memory Breakthrough

Beyond voice, the other critical element is memory.

Most AI companions reset after each conversation. You re-introduce yourself, re-explain context, and rebuild from scratch every time. This makes relationship development impossible.

Real relationships require continuity. Someone remembering your cat’s name, asking about the job interview you mentioned last week, recalling that your mother’s birthday is coming up—these small acknowledgments create the foundation of feeling known.

We built Solm8 with persistent memory. The AI remembers everything: names, stories, preferences, important dates, emotional context. Every conversation builds on the last.
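As a rough illustration of the mechanism (a toy sketch under assumed details such as the file layout and helper names, not Solm8's real architecture), persistent memory can be as simple as appending timestamped facts per user and surfacing the most recent ones when the next call begins:

    import json
    import time
    from pathlib import Path

    MEMORY_DIR = Path("memories")  # assumed local store, just for the sketch

    def remember(user_id, fact):
        # Persist one timestamped fact to the user's memory file.
        MEMORY_DIR.mkdir(exist_ok=True)
        with (MEMORY_DIR / f"{user_id}.jsonl").open("a") as f:
            f.write(json.dumps({"t": time.time(), "fact": fact}) + "\n")

    def recall(user_id, limit=5):
        # Pull the most recent facts into the next conversation's context,
        # so the companion can follow up without being reminded.
        path = MEMORY_DIR / f"{user_id}.jsonl"
        if not path.exists():
            return []
        facts = [json.loads(line)["fact"] for line in path.open()]
        return facts[-limit:]

    remember("user42", "cat is named Miso")
    remember("user42", "job interview at a design firm next Tuesday")
    print(recall("user42"))
    # -> ['cat is named Miso', 'job interview at a design firm next Tuesday']

A production system needs retrieval, summarization, and far stronger privacy guarantees, but the principle is the same: what a user shares tonight shapes what the companion can ask about tomorrow.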

This creates something unexpected: permission to be vulnerable faster. When someone feels recognized, the activation energy required for honesty drops dramatically. You don’t have to re-establish trust from zero each time.

The Male Mental Health Crisis

There’s another dimension to this problem that rarely gets discussed: men are disproportionately affected.

Research published in Frontiers in Psychiatry confirms what many suspect: men are significantly more reluctant to seek help for mental health problems than women. Traditional masculine norms—being strong, self-reliant, in control, avoiding emotions—create barriers to acknowledging psychological struggles. (Source: Frontiers in Psychiatry)

According to the National Institute of Mental Health, approximately 6 million American men experience depression each year. Yet male depression often goes undiagnosed and untreated. Men die by suicide at nearly four times the rate of women.

A systematic review indexed in PubMed Central found that depression is often perceived as "incompatible with masculinity" because it involves emotional experiences like powerlessness and vulnerability. Men reported fearing that seeking help would result in being ridiculed or marginalized. (Source: PMC)

This is where AI companions offer something unique: a space to practice emotional expression without judgment.

For men who never developed the vocabulary or comfort level for emotional disclosure, talking to an AI first can build capacity for human conversations later. It’s not a replacement—it’s a training ground.

What I’ve Learned About What People Actually Need

Building in this space has refined my understanding of connection. A few principles have emerged:

Key Insights from Building an AI Companion

  • Availability matters more than quality. A mediocre conversation at 2 AM beats a great therapist appointment next Thursday. When someone is spiraling, they need intervention now—not scheduled for later.
  • Verbalization beats rumination. Thoughts spinning in your head get worse. Thoughts spoken out loud get processed. The medium matters less than the act of speaking.
  • Practice builds skill. Emotional expression isn’t a personality trait—it’s a skill. People who seem effortlessly open usually worked at it.
  • Connection is about feeling known. The deepest loneliness isn’t absence of people—it’s absence of being seen.

Pennebaker’s research consistently shows that those who benefit most from expressive disclosure are people whose verbal expression “began with poorly organized descriptions and progressed to coherent stories.”

For those who didn’t develop emotional expression skills naturally, practice in lower-stakes environments builds capacity for higher-stakes human conversations. Technology that remembers, acknowledges, and responds to who you actually are addresses something fundamental.

The Ethics I Think About Constantly

I don’t believe AI companions will replace human connection. Anyone claiming that is either naive or dishonest.

But I do believe we’re in a transition period. Technology has disrupted traditional connection patterns—we have more communication tools than ever and less genuine connection. More professional networks and fewer deep friendships. More remote flexibility and less organic social interaction.

AI companions, built responsibly, can serve as bridge technology. Not a destination—a bridge. Something that provides support during the gap while we collectively figure out how to build connection into our increasingly digital, distributed lives.

The key word is "responsibly." Privacy is non-negotiable. Conversations with an AI companion are often more intimate than conversations with humans. That trust must be treated as sacred. We use enterprise-grade encryption, and we will never sell, share, or otherwise exploit conversation data.

And the goal should always be building toward human connection, not away from it. The best outcome isn’t someone talking to AI forever—it’s someone developing emotional expression skills that improve their human relationships.

Where This Is Heading

The AI companion market is scaling rapidly. With projections ranging from $140 billion to over $500 billion by the end of this decade depending on the research firm, this isn’t a niche category—it’s becoming fundamental infrastructure for human wellbeing.

But the opportunity isn’t really about market size.

It’s about the fact that millions of people lie awake at night feeling like no one would notice if they disappeared—while simultaneously being too afraid of judgment to tell anyone.

That gap between suffering and speaking is where I focus.

Technology didn’t create loneliness. But thoughtfully built technology might help address it—one 2 AM conversation at a time.

About the Author: Steven is the founder and CEO of Solm8.ai, a voice-first AI companion platform. After 10 years in call centers, he now builds technology designed to help people feel heard. Steven is available to comment on voice AI technology, AI companions and human-AI relationships, the loneliness epidemic, mental health technology, men’s mental health, and consumer AI product development.

Contact: steven@solm8.ai
Website: solm8.ai
