With LLMs’ abilities to reason, do math, and write growing every day, one area that remains underdeveloped is emotional recognition and digital empathy. A machine’s ability not only to understand human emotions but also to express or mimic them in effective ways is still a work in progress.
The ability to express and recognize emotions is incredibly important, especially in activities that are not based purely on facts, such as psychology, coaching, or simply human-to-human interaction.
But we are slowly moving toward more developed digital empathy and expression of emotions by machines themselves. Part of this shift is happening through a change of interface: the evolution runs from text-to-text, to voice-to-voice, and finally to a digital avatar facing a person.
One of the services offering AI consultants and AI assistants is Soul Machines, a company betting on this last approach: a computer avatar facing a human. I tried their avatars and was genuinely impressed by the experience. The speech recognition is fast, and the emotions the AI avatar produces are mostly appropriate, though still a little clunky. Nonetheless, it’s incredibly impressive.
Yet I’m still waiting to see a digital agent that I feel truly connected with emotionally. Weirdly enough, I think this moment may have already passed with the invention of the Tamagotchi, but that was still quite a limited interaction.