
TheRoad

Product Management. Hands On Consulting.

Yoel Frischoff

Emerging Value Chains in Conversational AI

Updated: Mar 19





In a previous post, I described a possible ecosystem in which value propagates from Large Language Models to users through third-party vendors.

At the time, just a few weeks back, only OpenAI played a significant role in the textual landscape, while other competitors reigned supreme in the GenArt scene. Since then, in a matter of days, both Google and Baidu have announced Bard and Ernie, their own AI chatbots, and Microsoft has integrated ChatGPT into its Bing search service and into Edge, the default browser for the Windows operating system.


While OpenAI has yet to find a viable business model, Google has long had one, and that model, regardless of the new kid on the block's ability to turn a profit, has just suffered a significant blow. After all, no one forces billions of users to search the internet, let alone to use Google's products.


 

Value Chain for LLM based products

But first, let us revisit the value chain I posted just 10 days ago:

Value chain: from LLMs through intermediate layers to front-end applications
Closing the gap between LLMs and users

In this somewhat speculative description of an OpenAI-dominated ecosystem, I cast OpenAI as an infrastructure provider. In that position, it largely avoids the tricky task of reaching the market with a value proposition targeting billions of users.


The diagram above describes an infrastructure-as-a-service layer, offered to product builders through a monetized API, which I labeled I/O Modifiers:

  • Query modifiers: Replacing human prompt engineers, these will modulate the natural-language queries keyed in by users, parsing them into context-specific form and limiting admissible responses for coherence, safety, and reliability.

  • Output pre-processors: These will modify the generated result to fit front-end formats and business-case requirements.
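As an illustration, the two modifier layers can be sketched as thin wrappers around the model call. This is a minimal sketch, not any vendor's actual API: `call_llm`, `modify_query`, and `preprocess_output` are hypothetical names standing in for the monetized service.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for the vendor's monetized LLM API call."""
    return f"[model output for: {prompt}]"

def modify_query(user_query: str, context: str) -> str:
    """Query modifier: wrap the raw user query in a context-specific
    template that constrains the admissible responses."""
    return (
        f"Context: {context}\n"
        f"Answer only with information relevant to this context.\n"
        f"User question: {user_query.strip()}"
    )

def preprocess_output(raw_output: str, max_chars: int = 500) -> str:
    """Output pre-processor: trim and normalize the result to fit
    front-end format requirements."""
    return raw_output.strip()[:max_chars]

def answer(user_query: str, context: str) -> str:
    """Full pipeline: modify the query, call the model, shape the output."""
    prompt = modify_query(user_query, context)
    return preprocess_output(call_llm(prompt))
```

The point of the layering is that the model call itself stays generic, while all business-case specificity lives in the two modifier functions.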

The diagram shows how it falls to third-party vendors to tackle the thorny task of tailoring the process to each and every business case, each very different in its technical details, performance and security requirements, and income potential.

 

Go To Market: Search

Why are third party vendors so important?


Despite OpenAI's astonishing feat of acquiring 100 million users in just three months, the question remains whether it has genuine access to users.


In this context, it is Microsoft, with its Azure services, ChatGPT-enhanced Bing, and Bing-integrated Edge browser, that comes roaring back into contention in the Search market.


Search (with a capital S) is a unique, though hugely popular, user experience.


In an enlightening interview, Satya Nadella, Microsoft's CEO, describes Search as the place where users come to obtain information, while businesses allow the search platform to crawl their data in exchange for traffic.


The implicit tension between content publishers and the search platform, which compete for users' attention and ad exposure, is now intensified as conversational AI capabilities are inserted into the search flow, collapsing it into a single discussion thread.


This prompt-answer-refinement sequence tends to marginalize less prominent (and less authoritative) sources: users engage in a single-threaded conversation, find it satisfying enough, and stop searching.


Crucially, it reduces monetization opportunities, as paid search will change its nature in the new experience: paid results become less likely, and lower-ranked organic results are all but dead in this scenario.


What monetization can be maintained in the new "Conversational Search" experience?

  • The search sidebar lives on, for both Google and Bing: these are related ads, probably with a relatively low CPM (Cost Per Mille) as the monetization strategy.

  • Paid search will have to change, either by limiting it to commercially motivated, high-intent queries ("Where can I buy X?", "How much does X cost?"), or by restricting the number of ads allowed to appear on screen, separated from the generated content.

I tend to think that paid search will yield lower CTRs (Click-Through Rates) but higher conversion rates, owing to the higher intent and to the new capabilities that will evolve in ad optimization.
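The high-intent gating in the first option could be sketched, very roughly, as a keyword filter deciding how many ads may accompany a generated answer. The marker list and the `ads_allowed` policy are illustrative assumptions, not anyone's production ranking logic.

```python
# Illustrative markers of commercial intent; a real system would use a
# trained classifier, not a keyword list.
HIGH_INTENT_MARKERS = (
    "where can i buy", "how much does", "price of",
    "best deal on", "discount", "cheapest",
)

def is_high_intent(query: str) -> bool:
    """Crude check: does the query contain a commercial-intent marker?"""
    q = query.lower()
    return any(marker in q for marker in HIGH_INTENT_MARKERS)

def ads_allowed(query: str, max_ads: int = 2) -> int:
    """Policy sketch: show a capped number of ads only on high-intent
    queries; informational queries get none."""
    return max_ads if is_high_intent(query) else 0
```

The cap on `max_ads` reflects the second option above: a restricted number of ads, kept separate from the generated content.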


So, a lot to think about...

 

Go To Market: The Rest


Search, huge as it is, is not and will not be the sole business case; this is where third-party vendors will carve out their niches.


You may find solution companies working on:

  • Security analysis: adding the power of conversational AI to search, behind the scenes, so that complementary information is integrated and gradually refined into a coherent text.

  • Coding: coding assistants are now offered both to the business and enterprise community, via GitHub Copilot, and to the general public, through Bing.

These are just two examples in which the I/O modifiers I mentioned before come into play, starting with input parsing:

  • Proprietary data training

  • Contextualization: Programming languages and IDEs, proprietary software libraries, different problem statements

  • Paid and free sources

And continuing with output pre-processing:

  • Syntax

  • Format

  • Language register, tone of voice
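To make the pipeline concrete for the coding case, here is a hedged sketch of contextualized input building and output pre-processing. `CodingContext`, `build_coding_prompt`, and `format_code_output` are hypothetical names; the fields simply mirror the contextualization items listed above.

```python
from dataclasses import dataclass, field

@dataclass
class CodingContext:
    """Contextualization inputs: language, IDE, proprietary libraries."""
    language: str
    ide: str
    libraries: list[str] = field(default_factory=list)

def build_coding_prompt(ctx: CodingContext, problem: str) -> str:
    """Input parsing: fold the business context into the prompt."""
    libs = ", ".join(ctx.libraries) or "standard library only"
    return (
        f"Target language: {ctx.language}\n"
        f"Editor: {ctx.ide}\n"
        f"Available libraries: {libs}\n"
        f"Task: {problem}"
    )

def format_code_output(raw: str) -> str:
    """Output pre-processing: enforce a fenced-code format so the
    front end can render the result consistently."""
    body = raw.strip().strip("`")
    return f"```\n{body}\n```"
```

Each vertical (security analysis, coding, and so on) would swap in its own context dataclass and formatting rules while reusing the same underlying model.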

Can Google, OpenAI, or even Microsoft reach the myriad business cases and audiences? Can they price correctly and market the value in so many different business environments? I seriously doubt it.

 

Conversational AI as A Service


So what is to be done? Luckily, there is a readily available alternative, one that will bring a surge of exciting new products to the market:


LLMaaS: Large Language Models can be offered as an infrastructure service to third-party vendors, monetized by API calls.
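A minimal sketch of what monetizing API calls might look like on the vendor's side: meter each request and each generated token, then bill for the period. The per-call and per-token prices here are made-up placeholders, not any provider's actual rates.

```python
PRICE_PER_CALL = 0.002      # USD per request (assumed placeholder)
PRICE_PER_1K_TOKENS = 0.02  # USD per 1,000 generated tokens (assumed)

class UsageMeter:
    """Tracks a customer's API usage for usage-based billing."""

    def __init__(self) -> None:
        self.calls = 0
        self.tokens = 0

    def record(self, tokens_generated: int) -> None:
        """Record one API call and the tokens it generated."""
        self.calls += 1
        self.tokens += tokens_generated

    def invoice(self) -> float:
        """Total charge for the billing period, in USD."""
        return (self.calls * PRICE_PER_CALL
                + self.tokens / 1000 * PRICE_PER_1K_TOKENS)
```

The business point is that the infrastructure provider earns on every call, while the third-party vendor owns pricing toward the end customer.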


And how do we know? Because it has already been done in the speech-synthesis arena.


All major TTS vendors (IBM, Amazon, Baidu, and Google) offer both TTS (Text-to-Speech) and STT (Speech-to-Text) as APIs, for anyone to use.
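The pattern is easy to sketch: a vendor-agnostic interface with interchangeable backends. `TTSEngine` and `StubEngine` are hypothetical names; a real backend would call, say, Amazon Polly's `synthesize_speech` instead of returning a placeholder, and that swappability is exactly what a commoditized infrastructure layer buys the third-party vendor.

```python
from abc import ABC, abstractmethod

class TTSEngine(ABC):
    """Vendor-agnostic interface over TTS-as-a-service providers."""

    @abstractmethod
    def synthesize(self, text: str, voice: str) -> bytes:
        """Return synthesized audio bytes for the given text."""

class StubEngine(TTSEngine):
    """Stand-in backend; a real one would call a cloud TTS API."""

    def synthesize(self, text: str, voice: str) -> bytes:
        # Placeholder payload: a real engine returns encoded audio.
        return f"<audio voice={voice}>{text}</audio>".encode()

def narrate(engine: TTSEngine, text: str, voice: str = "en-US-1") -> bytes:
    """Product-side entry point: the product code never knows which
    vendor sits behind the interface."""
    return engine.synthesize(text, voice)
```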


This commoditized infrastructure has already brought to life a good number of innovative companies, such as LOVO.AI, Alforithmic, and audiodots (of which I am a co-founder), among many others.


These vendors identify a need, tailor an offering, build customer relationships, and price and monetize, all while relying heavily on the AI capabilities available as infrastructure. My prediction is that we will see many similar use cases and companies emerge from the eruption of innovation we are in the midst of.

 
Are you interested in Conversational AI product strategy?


