Artificial intelligence chatbots have been billed as productivity tools for consumers — they can help you plan a trip, for example, or give advice on writing a confrontational email to your landlord. But they often sound stilted or oddly stubborn or just downright weird.
And despite the proliferation of chatbots and other AI tools, many people still struggle to trust them and haven’t made them part of their daily routines.
Now, Microsoft is trying to fix that by focusing on its chatbot’s “personality” and how it makes users feel, not just what it can do for them.
Microsoft on Tuesday announced a major update to Copilot, its AI system, that it says marks the first step toward creating an “AI companion” for users.
The updated Copilot has new capabilities, including real-time voice interactions and the ability to interpret images and text on users’ screens. Microsoft also says it’s one of the fastest AI models on the market. But the most important innovation, according to the company, is that the chatbot will now interact with users in a “warm tone and a distinct style, providing not only information but encouragement, feedback, and advice as you navigate life’s everyday challenges.”
The changes could help Microsoft’s Copilot stand out in a growing sea of general-purpose AI chatbots. When Microsoft launched Copilot, then called Bing, early last year, it was seen as a leader among its big tech peers in the AI arms race. But in the intervening 18 months, it’s been leapfrogged by competitors with new features, like bots that can have voice conversations, and easily accessible (albeit imperfect) AI integrations with tools people already use regularly, like Google Search. With the update, Copilot is catching up with some of those capabilities.
When I tried out the new Copilot Voice feature at Microsoft’s launch event Tuesday, I asked for advice on how to support a friend who is about to have her first baby. The bot responded with practical tips, like providing meals and running errands, but it also provided more touchy-feely advice.
“That’s exciting news!” the tool said in an upbeat male voice that the company calls Canyon. (Copilot is designed to subtly mirror users’ tone.) “Being there for her emotionally is a big one. Listen to her, reassure her and be her cheerleader … Don’t forget to celebrate this moment with her.”
An AI companion
Copilot’s update reflects Microsoft’s vision for how everyday people will use AI as the technology develops. Microsoft AI CEO Mustafa Suleyman contends that people need AI to be more than a productivity tool; they need it to be a kind of digital friend.
“I think in the future, the first thought you’re going to have is, ‘Hey, Copilot,’” Suleyman told CNN in an interview ahead of Tuesday’s announcement.
“You’re going to ask your AI companion to remember it, or to buy it, or to book it, or to help me plan it, or to teach me it … It’s going to be a confidence boost, it’s going to be there to back you up, it’s going to be your hype man, you know?” he said. “It’s going to be present across many, many surfaces, like all of your devices, in your car, in your home, and it really will start to live life alongside you.”
The earlier iteration of the Microsoft AI chatbot received some backlash for unexpected changes in tone and sometimes downright concerning responses. The bot would start off an interaction sounding empathetic but could turn sassy or rude during long exchanges. In one instance, the bot told a New York Times reporter he should leave his wife because “I just want to love you and be loved by you.” (Microsoft later limited the number of messages users can exchange with the chatbot in any one session, to prevent such responses.)
Some experts have also raised broader concerns about people forming emotional attachments to bots that sound too human at the expense of their real-world relationships.
To address those concerns while still developing Copilot’s personality, Microsoft has assembled a team of dozens of creative directors, language specialists, psychologists and other non-technical workers who interact with the model and give it feedback on the ideal ways to respond.
“We’ve really crafted an AI model that is designed for conversation, so it feels more fluent, it’s more friendly,” Suleyman told CNN. “It’s got, you know, real energy … Like, it’s got character. It pushes back occasionally, it can be a little bit funny, and it’s really optimizing for this long-term conversational exchange, rather than a question-answer thing.”
Suleyman added that if you tell the new Copilot that you love it and would like to get married, “it’s going to know that that isn’t something it should be talking to you about. It will remind you, politely and respectfully, that that’s not what it’s here for.”
And to avoid the kinds of criticisms that dogged OpenAI over a chatbot voice that resembled actor Scarlett Johansson, Microsoft paid voice actors to provide training data for four voice options that are intentionally designed not to imitate well-known figures.
“Imitation is confusing. These things aren’t human and they shouldn’t try to be human,” Suleyman said. “They should give us enough of a sense that they’re comfortable and fun and familiar to talk to, while still being separate and distant … that boundary is how we form trust.”
More new Copilot features
Building on the voice feature, the new Copilot will have a “daily” feature that reads users the weather and a summary of news updates each day, thanks to partnerships with news outlets including Reuters and the Financial Times.
Microsoft has also built Copilot into its Microsoft Edge browser — when users need a question answered or text translated, they can type @copilot into the address bar to chat with the tool.
Power users who want to experiment with features still in development will have access to what Microsoft is calling “Copilot Labs.” They can test new features like “Think Deeper,” which the company says can reason through more complex questions, and “Copilot Vision,” which can see what’s on your computer screen and answer questions or suggest next steps.
After some backlash over privacy risks with Recall, a similar AI tool it introduced for Windows earlier this year, Microsoft says Copilot Vision sessions are entirely opt-in and that none of the content the tool sees is stored or used for training.