Microsoft’s digital assistant has had a tough time recently. After starting off life on Windows Phone, Cortana appeared on Windows 10 PCs and looked set to launch on a range of smart speakers, fridges, toasters, thermostats, and more. While Alexa and Google Assistant have cornered this part of the market, Cortana has been left behind, and Microsoft CEO Satya Nadella admitted earlier this year that Cortana no longer competes directly with Google Assistant or Alexa. Microsoft has a new vision for Cortana, involving conversational interactions for workers who are organizing their days. I sat down with the new Cortana chief, Andrew Shuman, this week at Microsoft’s Build conference to get an idea of how the company is approaching Cortana in 2019 and beyond.
“I think one of the challenges we’ve had over the last couple of years is finding those places where Microsoft can really add a lot of value,” explains Shuman. “I think that what we’ve been really working on over the last year is how we can better embed Cortana across Microsoft 365 experiences and really delight users, especially those users who really are on board, so we have to understand their calendar, their tasks, their work documents, their interfacing with their close collaborators.”
Microsoft is repositioning Cortana to focus on integrating the digital assistant into routines and the parts of the company’s software and services where it makes sense, with a heavily business-focused slant. Cortana is going to start showing up more and more in these scenarios at first, but there’s still room for the consumer elements of Cortana to continue, too. “You have a whole life product and we a hundred percent believe in that,” says Shuman. “We just think it makes sense to start with customers who are invested in Microsoft at work and think about how they come through their lives.”
That initial investment comes in the form of Microsoft’s acquisition of Semantic Machines last year. The company has been working to improve conversational computing so that a digital assistant like Cortana can understand conversations and not just commands. Microsoft showed off a scenario this week where you could use Cortana on a phone to manage meeting invites, locations, weather, and much more all in the context of an individual and with natural speech rather than dedicated commands.
A lot of this conversational AI is multi-step and doesn’t involve using the “Hey, Cortana” wake word every time. Google showed off its own similar approach at I/O today, and both companies seem to be working toward a future where digital assistants respond more naturally rather than requiring a wake word every time. “There’s still a lot of room for innovation around how you can kind of carefully invoke the assistant in the right way,” says Shuman. “I do think that there’s interesting opportunities as you start to use a camera or other signals, for example, when the phone has been picked up or that you’re glancing at a screen. Those things are very exciting, and it just points to the unrealized potential in a lot of these spaces.”
Getting Cortana to the point where it understands the context of the questions you’re asking is still a ways off, but Microsoft is trying to break the current approach of siloed skills that digital assistants use to get things done. In the video demonstration, there’s calendaring, weather, locations, restaurant information, and contact details. “That would be shattered across dozens of different skills in the standard approach,” explains Dan Klein, a Microsoft technical fellow who helped co-found Semantic Machines. “The key thing here is context, and the idea that everything you say and everything you do is bridging together.”
Microsoft is drawing on this context, along with everything it knows about how its users get work done in Windows, Office, and other Microsoft services. It’s building a platform that uses machine learning to understand all of this and then connects up the necessary skills in the background to enable conversations in Cortana. This is key because, while Microsoft also operates its Bing search engine, it’s not as widely used as Google, so the company needs to focus on areas where it has enough data and the ability to improve Cortana in meaningful ways.
“I think the thing that people maybe never quite get their heads around is the number of users we have who are doing their most important work with our tools and services,” says Shuman. Whether that’s a billion people using Office, or even the millions using the cloud-connected versions, Microsoft is leveraging that to improve Cortana. “That means that every interaction they have with us in a very compliant and trusted way becomes an opportunity for us to learn,” explains Shuman. “We can learn from the files you’re sharing or the titles even of your files, or the subject of your emails. And I don’t mean this in a creepy way, I mean this in a data resource way that these algorithms can learn, eyes off.”
That means Cortana will be able to learn your important projects, upcoming deadlines, meetings, and more to have the context when you ask questions. Where this ultimately ends up in Microsoft products is still a little unclear. Fridges, toasters, and thermostats might be off the menu now, but Microsoft is being more careful about when and where to implement Cortana. We’ve seen that recently with Windows 10, as Cortana is being separated from the search experience. Microsoft has launched Surface Headphones with Cortana integration, and rumors suggest we’ll see Surface earbuds take on Apple’s AirPods.
“The headphones have been great,” says Shuman. “It’s been a great place for us to learn about how we can kind of think about a full Office experience through voice.” Microsoft now looks at hardware as the stage for software, so it’s likely we’ll see Cortana show up in more interesting Microsoft hardware in the future.
For now, Microsoft is very much focused on making sure its digital assistant becomes a lot more useful to the people using its products and services the most.