Insights from Comparative “Sondering”
Why “Feeling” might be our most important human Superpower in the Age of AI
TL;DR // Key Insights:
🧠💡➡️👥🌍💭💫 — 🤯💬🧩🧍♂️🧍♀️❤️🩹 — 🤝💭😔
The concept of sonder—the realisation that every stranger lives a life as rich and complex as our own—reveals a profound, often unspoken truth about our shared human condition: we long for connection, yet often feel isolated in our emotional experiences.
As we increasingly turn to AI for emotional support, we risk outsourcing this very humanness, unless we reframe technology to reinforce, rather than replace, our capacity for empathy and meaningful connection.
🤖🫂➡️🧍♂️💔🧠 — ⚠️🤖❌🧑🤝🧑❤️ — ✅🔁🤖❤️🧠🤝
You might have come across this word, or you might not. In any case, it’s out there now. Someone (the American author John Koenig) created this neologism as part of his project “The Dictionary of Obscure Sorrows”, which aims to coin new words for emotions that so far lack them. Here I am specifically referring to the word “sonder”.
But what does it mean to sonder? Wikipedia defines it as
“the profound feeling of realizing that everyone, including strangers passing in the street, has a life as complex as one's own, which they are constantly living despite one's personal lack of awareness of it.”
Koenig’s own definition of sonder is a bit more colourful and emotionally verbose, but in essence it hints at the same insight - that we are merely scratching the surface when it comes to understanding the full depth and complexity of the lived experiences of the people around us, which is itself a consequence of our own complexity as a social species.
To understand the phenomenon of naming this strange feeling a bit better, it is worth looking at how the CBC put Koenig’s act of naming his feelings into context: Minnesota-born Koenig apparently grew up “feeling a bit ‘alien’” as an American child in Switzerland. Asked about his childhood, he said that his feelings of isolation during his decade in multicultural Geneva got him thinking about how few words the English language has for emotional states.
Knowing about this deeply existential impetus behind Koenig’s project of naming all those feelings in his “Dictionary of Obscure Sorrows” also helps us better understand the fascination that comes from others actually recognising and “understanding” these feelings - even though there had been no words for them before Koenig ‘decided’ to name them*. This may well have contributed to The Washington Post observing that
"it’s undeniably thrilling to find words for our strangest feelings” and that “Koenig casts light into lonely corners of human experience…"
What makes this such an existential and uniquely human experience is that this capacity to intuitively understand, empathise and - at least temporarily - fully feel everything around us paradoxically both connects us to and divides us from everyone around us. It leaves us completely isolated, yet longing for human connection - something that lies at the very heart of what makes us human, of our human condition. It is a suffering that Koenig describes like this:
“The word sadness originally meant fullness, to be filled to the brim with some intensity of experience. It's not about despair, or distraction, or controlling how you're supposed to feel, it's about awareness. Setting the focus to infinity and taking it all in, joy and grief all at once; feeling the world as it is, the world as it could be. The unknown and the unknowable, closeness and distance and trust, and the passage of time. And all the others around you who are each going through the same thing.”
This is why it is important that we keep asking other humans “How do you feel today?” - and why it doesn’t make sense to ask this question of AI systems, although I imagine that more and more people do so, probably quite regularly, given that the top three AI use cases today revolve around exactly this type of ‘loneliness support’.

Reddit is perfectly suited to capturing and exploring these raw, unfiltered and uniquely human experiences that stem from our ability to feel deeply. So it comes as no surprise that there is an entire subreddit dedicated to sonder.
Interestingly, many Reddit users also seem to pick up on a special type of sonder, namely one that happens in close proximity to - yet in complete disconnection from - other human beings: for example, when you catch a glimpse of the intimate lives of the people around you while doing nothing more than watching them through the window of a passing train.
At a time when fewer and fewer people actually talk to each other directly, and when technology has become our go-to place for every query, question and need, it feels strange to realise that we are actually never alone in how we feel - especially given that we can even give language to previously unspoken feelings and unlock a shared emotional resonance, defining us as an empathising, deeply connected species.
This insight naturally raises the question of to what extent we are actually able - and willing - to connect, be it with other human beings or with human-like machines. To explore this, I started asking a simple question of random human beings as well as of AI systems: “How are you feeling today?”. The first tentative results from a still very small (10 + 10) sample are already quite interesting and have encouraged me to keep asking…
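For the AI half of this little experiment, the question can also be posed programmatically. Below is a minimal sketch of what that might look like, assuming the OpenAI Python SDK with an API key set in the environment; the model name is only an illustrative placeholder, and any comparable chat API would do.

```python
# Minimal sketch: asking an AI system "How are you feeling today?"
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name, not a recommendation
    messages=[{"role": "user", "content": "How are you feeling today?"}],
)

# Print the model's reply; as discussed below, such replies tend to
# deflect the question, since the system has no feelings to report.
print(response.choices[0].message.content)
```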
Although it is still too early to draw any far-reaching conclusions, one thing already struck me as quite telling in the first results: how systematically AI systems deny, avoid or deflect a simple question about their feelings - because, technically, they can’t have any. This makes the desperate attempts of so many human users to connect with them over their own emotions appear all the more ironic. As a highly intelligent social species, we have managed to use our intelligence to alienate ourselves from what distinguishes us as a species in the first place: our social nature. Even if one does not consider oneself a particularly ‘ethical’ person, this raises a valid concern about the use of AI systems as ‘social surrogates’ - and about whether the technology firms offering such systems should be held accountable for the disintegrative damage they end up doing at a societal, psychological and economic level.
On a more positive note: how can we turn this around and make AI systems more human, so that we do not forgo but rather support our uniquely human ability to deeply empathise and connect with each other? At least in specific contexts, and when used correctly, AI seems entirely able to help us do exactly that.
* Not being a professional linguist, I will abstain from commenting on the degree to which these words are ‘correct’ or even well chosen with regard to their epistemic and linguistic roots. But even if there were grounds to improve on their exact linguistic form, the general arbitrariness of language might still allow someone as perceptive and creative as Koenig to simply put his feelings into words.