As businesses rush to adopt AI, they are discovering an unexpected truth: even the most rational enterprise buyers don't make purely rational decisions. Their unconscious requirements go far beyond conventional software evaluation criteria.
Let me share an anecdote. It's November 2024, and I'm in a New York City skyscraper, working with a fashion brand on its first AI assistant. The avatar, Nora, is a 25-year-old digital assistant displayed on a six-foot kiosk. She has sleek brown hair, a chic black suit and a charming smile. She waves hello when she recognizes a client's face, nods as they speak, and answers questions about the company's history and tech news. I came prepared with a standard technical checklist: response accuracy, conversation latency, facial recognition precision, and more.
But my client barely glanced at the checklist. Instead, they asked, "Why doesn't she have her own personality? I asked about her favorite handbag and she didn't have an answer!"
A fundamental change in how technology is evaluated
It's striking how quickly people forget that these avatars aren't human. While many worry that AI will blur the line between humans and machines, I see a more immediate challenge for businesses: a fundamental change in how technology should be evaluated.
When software begins to look and act human, users stop evaluating it as a tool and start judging it as a person. This phenomenon, judging non-human entities by human standards, is known as anthropomorphism. It has been well studied in human-pet relationships and is now emerging in human-AI relationships.
When it comes to procuring AI products, enterprise decisions are less rational than you might think, because the decision-makers are still human. Research shows that unconscious perceptions shape most human interactions, and enterprise buyers are no exception.
So when companies sign AI contracts, they are not simply entering "utility contracts" in pursuit of cost reduction or revenue growth. They are entering an implicit "emotional contract," often without even admitting it to themselves.
Making the "AI baby" perfect?
Every software product has always had an emotional element, but when the product closely resembles a real human, that element becomes far more prominent and far more unconscious.
These unconscious responses shape how employees and customers engage with AI, and my experience shows just how widespread they are: they are deeply human. Consider these four examples and the psychological concepts behind them.
When my New York client asked about Nora's favorite handbag, craving a personality for her, they were tapping into social presence theory: treating AI as a social entity that needs to feel present and real.
Another client fixated on his avatar's smile: "The mouth shows too many teeth. It's unsettling." This reaction reflects the uncanny valley effect, in which almost-human characteristics provoke discomfort.
Conversely, AI agents that are visually attractive but less functional benefit from the aesthetic-usability effect: the idea that appeal can outweigh performance issues.
Yet another client, a meticulous business owner, kept delaying the project launch. "We need to make our AI baby perfect," he repeated at every meeting. "It has to be flawless before the world sees it." This obsession with creating an idealized AI entity reflects the ideal self we project onto our AI work, as if we were building digital entities that embody our highest aspirations and standards.
What is most important to your business?
So, how can you account for these hidden emotional contracts and lead the market against competitors stacked with flashy AI solutions?
The key is to decide what matters most for your business's unique needs, then set up a testing process. This not only helps identify your top priorities; more importantly, it keeps you from getting sidetracked by small details, no matter how emotionally persuasive they are. This sector is so new that there are few ready-made playbooks. But by establishing your own way of determining what's best for your business, you can become a first mover.
For example, my client's questions about the AI avatar's personality were validated by testing with internal users. By contrast, most people could not tell the difference between the several versions the business owner agonized over for his "perfect AI baby."
To recognize these patterns more easily, consider hiring a team member or consultant with a background in psychology. All four examples above are not one-offs; they are well-studied psychological effects that originate in human-to-human interactions.
Your relationship with technology vendors must also change: they should be partners navigating this experience with you. After signing an agreement, set up weekly meetings and share takeaways from your testing to build a better product together. If budget is tight, at least buffer extra time to compare products and test them with users, so the hidden "emotional contracts" can surface.
We are at the forefront of defining how humans and AI interact. Successful business leaders will embrace the emotional contract, set up processes to honor it, and navigate the ambiguity that helps them win the market.
Joy Liu leads enterprise products at Microsoft, working on AI startups and cloud and AI initiatives.