After more than half a year of rapid growth, ChatGPT’s traffic has peaked.
According to the monitoring data of the third-party website SimilarWeb, in June this year, the global traffic (PV) of ChatGPT’s website and mobile client decreased by 9.7% month-on-month, and the traffic in the United States decreased by 10.3% month-on-month. At the same time, the number of unique visitors (UV) of ChatGPT decreased by 5.7%, and the time spent by visitors on the website also decreased by 8.5%. [1]
This is the first time ChatGPT has seen negative traffic growth since its release on November 30, 2022. Signs of slowing growth had already appeared in May, when month-on-month growth was only 2.8%.
It now looks as if ChatGPT will not become the world’s most-visited website, and the threat it poses to Google, the current traffic leader, may stop here.
The stall in ChatGPT’s growth comes at a delicate moment. There is now widespread anxiety about a bubble inflated by generative AI, and the secondary market has been selling off the ChatGPT concept stocks that rose sharply this year. Michael Hartnett, chief investment strategist at Bank of America, likened the stock market’s AI mania to a “baby version” of the dot-com bubble of the early 2000s, saying the bubble would soon “mature”.
There will be no broad consensus on this until the bubble actually bursts. But it is at least certain that the negative growth in ChatGPT’s traffic does not, by itself, indicate any real problem.
SimilarWeb’s traffic data only counts user-side visits, but does not include the number of people using ChatGPT through the application programming interface (API). In fact, the latter is the strategic focus of OpenAI.
In the seven months since its release, OpenAI has been actively commercializing ChatGPT even as its traffic soared. For a product billed as a “productivity revolution”, the commercialization process deserves more attention than raw traffic, and holds more lessons for the industry.
1. Even the most successful product faces stagnant growth
Despite the stagnation of growth, there is no doubt that ChatGPT is still one of the most successful to C products ever.
In the past we have seen products that caught fire quickly, but most were short-lived and struggled to sustain momentum. Clubhouse, for example, became popular overnight in February 2021, saw downloads plunge by 90% within two months, and is barely talked about today.
But ChatGPT is different. In the seven months since its release, ChatGPT has maintained a dizzying growth.
In just 5 days after its release, ChatGPT gained 1 million users; two months later, ChatGPT passed the 100 million user mark. ChatGPT is the fastest product ever to achieve these two indicators.
Image via Demand Sage
Although ChatGPT’s traffic fell about 10% month-on-month in June, total visits still reached roughly 1.6 billion. In SimilarWeb’s rankings, ChatGPT is the 17th most-visited website in the world.
The head of an AI Lab at a technology company told “Jiazi Guangnian”: “ChatGPT and products like it are generative chatbots driven by large models, opened directly to the general public. The rapid growth is essentially driven by novelty: users have never seen anything like it, so at first curiosity is high and they try it out.”
The decline in traffic growth today represents the fading of this novelty.
The decline in ChatGPT’s traffic is not unique. SimilarWeb also monitors Google Bard, Microsoft Bing, and Character.AI, the hot Silicon Valley company founded by former Google engineers whose chatbot is second only to ChatGPT in popularity. Character.AI raised funding from a16z in March this year at a $1 billion valuation.
SimilarWeb’s data show that ChatGPT’s global traffic remains higher than that of Microsoft’s search engine Bing, Character.AI, and Google Bard, but all three also declined to varying degrees. Character.AI’s global visits fell more sharply than ChatGPT’s, down 32% month-on-month.
Image via SimilarWeb
The reason for the decline in traffic is actually not complicated. Although ChatGPT is already very smart and even shows reasoning ability, for the vast majority of ordinary people, no matter how interesting ChatGPT is, it is still just a “chat robot”.
The head of the same AI Lab told “Jiazi Guangnian”: “Most people, myself included, registered just to see how powerful ChatGPT is. But that doesn’t mean they will use it every day. Beyond answering simple general-knowledge questions or writing essays for their kids, most people don’t actually know how to use it in daily life.”
Every product eventually hits a day when growth stalls; that is an objective law of the product life cycle, and ChatGPT has sounded the alarm for the market. From now on, any consumer-facing (to C) chatbot must seriously consider whether it can provide more value beyond satisfying “early adopters”.
2. The real influence lies in the to B market
While growth stalling is a major watershed for chatbots, it’s not all bad news for OpenAI. **OpenAI’s commercialization path does not depend on consumer (to C) traffic, but rather on the business-facing (to B) API, and that usage is not counted in the traffic figures.**
ChatGPT’s commercialization is a closely watched issue, because it concerns not only how OpenAI itself makes money, but also points the way, effective or not, for the many large-model startups in the industry.
When ChatGPT was released to the public on November 30, 2022, OpenAI actually didn’t expect to achieve such a sensational effect. OpenAI CEO Sam Altman said in an interaction with fans on Twitter: “OpenAI will have to commercialize this at some point in time, because the computational cost is staggering.”
When Elon Musk, an early investor in OpenAI, asked about ChatGPT’s cost, Altman said the average cost per conversation was a few cents. In April this year, an overseas analyst estimated the daily operating cost at as much as US$700,000. Microsoft is trying to reduce costs through self-developed chips and other means. [2]
**In order of release, OpenAI’s first commercialization move was the paid ChatGPT Plus subscription, launched on February 1 this year.**
ChatGPT Plus is priced at $20 per month; its benefits include access even during peak hours, faster response times, and priority access to new features. When OpenAI launched the more powerful GPT-4 model on March 14, paying ChatGPT Plus subscribers could use it directly.
**One month after the launch of ChatGPT Plus, OpenAI officially opened the ChatGPT API (gpt-3.5-turbo) on March 1. Through the API, developers can integrate OpenAI’s large language models into third-party applications and products.**
This was not OpenAI’s first open API. It had previously released APIs for different scenarios and functions, including text generation, question answering, translation, language detection, and multi-turn dialogue. The newly opened API, and the GPT-4 API that followed soon after, are the latest and most capable versions.
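As an illustration, here is a minimal sketch of what integrating the gpt-3.5-turbo API into a third-party product looked like with the OpenAI Python SDK of that period; the API key, system prompt, and user message are placeholders, not part of any real product.

```python
# Minimal sketch of calling the ChatGPT API (gpt-3.5-turbo) from a third-party app,
# using the openai Python SDK as it existed in mid-2023. Key and prompts are placeholders.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes the key is set in the environment

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a customer-support assistant for an online store."},
        {"role": "user", "content": "Where is my order #12345?"},
    ],
    temperature=0.2,  # lower temperature for more deterministic answers
)

print(response["choices"][0]["message"]["content"])
```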
Microsoft is OpenAI’s exclusive cloud partner. On March 9, Microsoft announced that ChatGPT was available in its Azure OpenAI Service. Through the service, more than 1,000 customers are applying state-of-the-art AI models, including DALL-E 2, GPT-3.5, Codex, and other large language models, backed by Azure’s supercomputing and enterprise capabilities.
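For enterprises consuming these models through Azure rather than OpenAI directly, requests go to the company’s own Azure resource and deployment. A rough sketch with the same mid-2023 Python SDK, assuming a hypothetical deployment named “my-gpt35” has been created in the Azure portal:

```python
# Rough sketch of calling a model through the Azure OpenAI Service with the
# mid-2023 openai Python SDK. Resource name, deployment name, and API version
# are illustrative placeholders.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # your Azure OpenAI resource endpoint
openai.api_version = "2023-05-15"
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

response = openai.ChatCompletion.create(
    engine="my-gpt35",  # Azure uses the deployment name instead of a model name
    messages=[{"role": "user", "content": "Summarize this support ticket in one sentence."}],
)
print(response["choices"][0]["message"]["content"])
```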
A Credit Suisse report in March estimated that ChatGPT could provide Microsoft with an additional $40 billion in revenue over the next five years or more.
The open API is how OpenAI truly exerts influence on business (to B) users. More and more companies have connected to ChatGPT and launched their own AI applications, including software giants such as Microsoft, Salesforce, and Adobe, as well as well-known products such as Snapchat and Shopify.
On June 14, OpenAI released a series of API updates, including more powerful models, a new function-calling capability, and longer context windows. At the same time, it announced price cuts to attract more enterprise developers. According to SimilarWeb, traffic to OpenAI’s developer-facing site (platform.openai.com) rose 3.1% month-on-month in June.
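A simplified sketch of how that function-calling capability works: the developer describes a function in JSON Schema, the model decides whether to call it and returns the arguments as JSON, and the application then executes the real call. The weather function below is a hypothetical application-side stand-in, not an OpenAI or third-party API.

```python
# Simplified sketch of the function-calling feature added to the Chat Completions
# API in June 2023. get_current_weather is a hypothetical application-side function.
import json
import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. 'Beijing'"},
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",  # the June snapshot that supports function calling
    messages=[{"role": "user", "content": "What's the weather like in Beijing?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    args = json.loads(message["function_call"]["arguments"])
    print("Model wants to call:", message["function_call"]["name"], "with", args)
    # The application would now call its real weather service and send the result
    # back to the model in a follow-up "function" role message.
```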
So far, OpenAI has not disclosed the revenue generated by ChatGPT. In December 2022, OpenAI predicted that the company’s revenue in 2023 would reach US$200 million, and its revenue in 2024 was expected to exceed US$1 billion.
3. The ambition of “AI operating system” has not been realized
After opening the API, OpenAI has not stopped its commercial exploration. This time, OpenAI showed its ambition to build an “AI operating system”.
**On March 23, OpenAI launched plugins, which let ChatGPT access up-to-date information, run computations, or use third-party services.**
If the API lets third-party applications call ChatGPT, plugins let ChatGPT call third-party applications. Through plugins, users can interact with third-party services directly inside ChatGPT, for example making a quick reservation or buying a product.
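Concretely, a ChatGPT plugin is little more than a small manifest plus an OpenAPI description of an existing web API; the model reads the descriptions and decides when to call the endpoints. Below is a rough sketch of such a manifest written out as a Python dictionary; the service, names, and URLs are invented for illustration.

```python
# Rough sketch of the ai-plugin.json manifest a third-party ChatGPT plugin exposes,
# written as a Python dict. All names and URLs are placeholders for a hypothetical
# restaurant-booking service.
plugin_manifest = {
    "schema_version": "v1",
    "name_for_human": "Table Booker",
    "name_for_model": "table_booker",
    "description_for_human": "Find and book restaurant tables.",
    # The model reads this description when deciding whether to call the plugin.
    "description_for_model": (
        "Use this plugin to search restaurants and reserve tables. "
        "Call it whenever the user asks to book a meal."
    ),
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml",  # OpenAPI spec of the callable endpoints
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}
```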
OpenAI developed two plugins itself, a web browser and a code interpreter, while the rest come from third-party developers. At launch, OpenAI showcased 11 third-party plugins; according to counts by Reddit users, the total had exceeded 430 as of June 15.
Third-party plugins showcased by OpenAI
When plugins first launched, many people described them as the “operating system of the AI era” to capture their potential influence.
Minsheng Securities said in a report that third-party plugins hold strategic significance OpenAI cannot ignore, and that OpenAI is moving toward a strategic position above the operating system, following an Apple-like “terminal + platform + ecosystem” model: “The introduction of plugins marks ChatGPT’s path toward building an ecosystem. The unified platform-plus-plugin model is expected to create a prosperous ecosystem similar to Apple plus the App Store.”
However, the subsequent development of the plug-in did not meet expectations.
On May 29, two months after the launch of the plug-in, Altman talked about the future of OpenAI in an interview with Raza Habib, CEO of artificial intelligence company Humanloop, and mentioned an important message:
"ChatGPT will not release a follow-up plugin soon, because from the actual market situation, the plugin has not yet reached the product/market fit (Product/Marketing Fit). Besides browsing, the usage of the plugin shows that they have not yet reached the The market has hit the sweet spot.” [3]
The next day, Humanloop took the post down at OpenAI’s request.
As for why plugins have not reached PMF, Altman’s own explanation was: **“A lot of people think they want their apps to be inside ChatGPT, but what they really want is ChatGPT in their apps.”**
On this point, the head of the aforementioned technology company’s AI Lab explained to “Jiazi Guangnian”: “A plugin essentially calls a third-party application’s API directly from ChatGPT; at its core it is a data query. That works fine for simple things like looking up movies, music, or the weather. The difficulty is that many third-party applications involve complex business logic and data structures, with long business chains that are hard to standardize into an API for ChatGPT to call. Even when made into a plugin, the user experience is not friendly enough.”
Therefore, although ChatGPT has greatly improved the ability to understand and generate text, it is not good enough to support specific business scenarios. Once it involves medical, financial, legal and other professional knowledge, its accuracy rate will be very poor.
Plugins don’t appear to be the answer to this problem; specialized models are. In finance, for example, Bloomberg, the global provider of business and financial data and news, released BloombergGPT, a large model built for the financial industry.
OpenAI is unlikely to pass up positioning itself in vertical-industry large models.
4. A move into vertical large models?
Although the development of plug-ins has not met expectations, OpenAI has not stopped pursuing the ambition of “AI operating system”, but in a different way.
According to foreign media reports on June 21, OpenAI is planning to launch a large-model store similar to an “App Store”. [4]
Unlike plugins, which are integrated directly into the ChatGPT chat page, the new model store would let developers list products built on OpenAI technology, such as chatbots or custom models for various vertical fields. Under this model, enterprise customers would not need to train a vertical model from scratch; they could directly call a fine-tuned, industry-specific version from the store.
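If such a store materializes, calling a store-listed vertical model would presumably look no different from calling any other model, just with a different identifier. A purely hypothetical sketch; the model name below is invented, and OpenAI has announced no such identifiers or store API.

```python
# Purely hypothetical sketch: calling a fine-tuned vertical model obtained from a
# model store instead of training one from scratch. The model identifier is invented
# for illustration; no such store or model name has been announced.
import openai

response = openai.ChatCompletion.create(
    model="vertical-finance-model-v1",  # hypothetical store-listed vertical model
    messages=[{"role": "user", "content": "Summarize the credit risk in this loan application: ..."}],
)
print(response["choices"][0]["message"]["content"])
```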
OpenAI has not officially announced the plan. It was revealed by Altman during a meeting with developers in London in May, where he mentioned that two of OpenAI’s customers, the repair-software company Aquant and the online education provider Khan Academy, are interested in joining the store and offering their ChatGPT-based AI models.
**If the report is accurate, it means OpenAI will formally move into vertical-domain large models on top of its GPT series of general-purpose models, working with industry partners to provide solutions for retail, finance, healthcare, and other industries.**
As OpenAI’s investor and strategic partner Microsoft is also providing Azure-based OpenAI services to enterprise customers, the two will inevitably encounter competition in the enterprise customer market. This will be a delicate situation.
One ideal arrangement would be for Microsoft to serve the large-enterprise market, since big customers often need customized services and models tailored to their own needs, which standardized vertical models on a store shelf can hardly satisfy, while OpenAI serves the SME market, where low-cost, out-of-the-box vertical models are enough for most users.
However, just as plugins have not won over the market, it remains to be seen whether the model store’s “AI marketplace” approach will work. An OpenAI spokesperson said the company is not actively pursuing a model store and declined to comment on reports of Altman’s meeting with developers.
The model-store idea can be seen as the “middle layer” of the large-model ecosystem, sitting between the underlying general-purpose models (such as GPT-4, Baidu’s Wenxin Yiyan, and Alibaba’s Tongyi Qianwen) and the vast layer of applications above (such as Notion and Salesforce).
As early as last fall, before ChatGPT was released, Altman shared his thinking on this middle layer at an AI summit hosted by the investment firm Greylock. He was skeptical of startups all trying to train their own models, and predicted a new batch of startups that train on top of existing base models to create a model for each vertical. Such middle-layer businesses, he said, have their own data flywheels and “will be very successful and different.” [5]
In the past half a year or so, the value of the middle layer, or the vertical large model, seems to have been ignored by the industry. More funds, talents, and spotlights have flowed to the basic model, and everyone wants to be China’s OpenAI.
When Baidu Wenxin was released, Baidu Chairman Robin Li said: “China’s OpenAI is not an opportunity for startups, and there is no need to reinvent the wheel again.” This is consistent with Altman’s point of view.
But judging by their actions, startups in the industry clearly disagree. Even excluding the big tech companies, China alone has at least ten makers of foundation models. New foundation models keep emerging abroad as well: Google revealed at this year’s I/O developer conference that it is building a large language model called Gemini, and Google DeepMind CEO Demis Hassabis recently said it will be more capable than OpenAI’s GPT-4.
More than one investor has said the industry does not need so many general-purpose models. Chen Yu, a partner at Yunqi Capital, told “Jiazi Guangnian”: “Just like the ‘hundred regiments war’ before, only a few will be left in the end, the big players included.”
Now, more and more people are beginning to recognize the opportunity for large models in vertical fields. How will this contest of vertical large models unfold?
END.
References:
[1]
[2]
[3]
[4]
[5]