Yesterday, Adobe released a new set of analytics tools, Adobe Sensei for Voice, to help brands take advantage of conversational data to improve targeting and, ideally, conversions.

Adobe claims it can consume data from Alexa, Siri, Google Assistant, Cortana and Bixby (lol one day). The company captures both user intent and contextual data, context that brands can put to use when targeting customers across other channels like social and email.

This implies that Adobe can track the actions users most often take with their conversational AI of choice and the things they regularly interact with — think calling an Uber versus listening to the latest Portugal. The Man album.

In practice, developers and brands will have access to the frequency of user interactions and the sequence of actions users take after engaging with a particular service. Amazon offers its own tool for gathering and aggregating similar metrics, like total customer interactions, usage time and frequency of intent.

Adobe has been paying close attention to the market potential for conversational AI embedded in speaker hardware. The company released metrics yesterday that painted the Amazon Echo Dot as the leader in the growing space.

Regardless of how this market shakes out, Adobe has a reasonable competitive edge thanks to the relationships it already has with brands. And unlike tools built by companies with skin in the game, like Amazon, Adobe is a relatively market-agnostic player.
