This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
Operator
Hello, and thank you for standing by for Baidu's fourth quarter and fiscal year 2023 earnings conference call. At this time, all participants are in listen-only mode. After management's prepared remarks, there will be a question and answer session. Today's conference is being recorded. If you have any objections, you may disconnect at this time. I would now like to turn the meeting over to your host for today's conference, Zhou Lin, Baidu's Director of Investor Relations.
Zhou Lin
Hello, everyone, and welcome to Baidu's fourth quarter and fiscal year 2023 earnings conference call. Baidu's earnings release was distributed earlier today, and you can find a copy on our website, as well as on newswire services. On the call today, we have our co-founder and CEO, Robin; our CFO, Rong; and our EVP in charge of Baidu AI Cloud Group, ACG. After our prepared remarks, we will hold a Q&A session. Please note that the discussion today will contain forward-looking statements made under the safe harbor provisions of the U.S. Private Securities Litigation Reform Act of 1995. Forward-looking statements are subject to risks and uncertainties that may cause actual results to differ materially from our current expectations. For detailed discussions of these risks and uncertainties, please refer to our latest annual report and other documents filed with the SEC and the Hong Kong Stock Exchange. Baidu does not undertake any obligation to update any forward-looking statements except as required under applicable law. Our earnings press release and this call include discussions of certain unaudited non-GAAP financial measures. Our press release contains a reconciliation of these unaudited non-GAAP measures to the most directly comparable GAAP measures, and it is available on our IR website at ir.baidu.com. As a reminder, this conference is being recorded. In addition, a webcast of this conference call will be available on Baidu's IR website. I will now turn the call over to our CEO, Robin.
Robin
Hi, everyone. We concluded 2023 with solid results. Baidu Core's total revenue for the fiscal year increased by 8% year-over-year, and the non-GAAP operating margin expanded from 22% to 24%. This improvement highlighted the resilience of our core business, which serves as a strong foundation for our ventures in Ernie and Ernie Bot. 2023 marked a year of GenAI and foundation models, a massive technology shift that will re-engineer business processes and bring about a real renaissance in every sector of the economy. Baidu is well prepared to benefit greatly from this groundbreaking economic revolution. We progressed from talking about the opportunities of GenAI and foundation models to actively applying Ernie and Ernie Bot at scale. As a frontrunner in AI, Baidu proudly became the first public company globally to launch a GPT model, with our EB 4.0 standing out as the most powerful foundation model in China. Ernie continues to gain market recognition, as evidenced by Ernie API calls from multiple well-known companies. Notably, Samsung uses the Ernie API on its Galaxy S24 5G series. Honor uses the Ernie API on its MagicOS 8.0. And Autohome uses the Ernie API to power its multiple AIGC apps. These partnerships further solidify our leadership in GenAI. Furthermore, we continued improving the efficiency of Ernie. For example, the inference cost of EB 3.5 is only about 1% of the March 2023 version. With lower inference costs, Ernie will become increasingly accessible to users and enterprises. We are able to do that primarily because of our unique four-layer AI architecture and our strong ability in end-to-end optimization. Additionally, we are offering LLMs in smaller sizes. These small models can balance performance and efficiency, better serving various customer needs. Since Q2 2023, we have actively utilized Ernie to revolutionize our products and services, creating AI-native experiences. We believe real applications are essential to unleashing the full business potential of Ernie and Ernie Bot. Recently, we began to generate incremental revenues from Ernie and Ernie Bot. In the fourth quarter, we earned several hundred million RMB, primarily from ad technology improvement and helping enterprises build their own models. I'll provide a more detailed explanation in the business review section. Looking into 2024, we believe this incremental revenue will multiply to several billion RMB, primarily from the advertising and AI cloud businesses. While we are beginning to commercialize GenAI and foundation models, we see enormous monetization potential in this groundbreaking technology. We envision Ernie as the future foundation, serving as the base for millions of AI-native applications developed by third parties and Baidu. This paradigm will enable us to create an ecosystem around Ernie, which opens up various revenue sources. Now, the key highlights of Baidu's AI-native applications. Equipped with an AI co-pilot, Baidu Wenku has become a platform for users to create content in a wide range of formats, helping them express their ideas. In addition to summarizing, creating, and expanding content, users particularly appreciate Baidu Wenku's features for assisting them in automatically turning their inspiration into PowerPoint presentations. We have been consistently improving and enriching its AI features. For example, we recently introduced new ones, such as mind maps for organizing information. We are reconstructing Baidu Search using Ernie.
Last quarter, we talked about how we used Ernie to generate search results, improving the one-shot experience. In Q4, Baidu Search introduced newsfeed-like information alongside the generated search results at scale, to offer users useful and relevant content to explore. In Q4, we also began facilitating certain search queries with multi-round conversations to delve deeper into understanding users' intentions. Overall, GenAI-enabled search complements traditional search, expanding the TAM of Baidu Search by serving a wider range of information needs and answering content creation related questions. We are actively encouraging SMEs in service industries to build AI chatbots as new landing pages for Baidu Search. AI chatbots enable them to better serve potential buyers directed to them by us, driving transactions on our platform. At the same time, our AI chatbots can help merchants increase productivity. Currently, about 4,000 merchants are using our AI chatbots, particularly in healthcare, education, travel, legal, B2B, and auto, where users often interact with merchants to learn about their offerings before making purchasing decisions. We believe GenAI-enabled search will help increase user retention on Baidu, reinforcing our position as the primary entry point for users seeking information and services. We are also expanding the reach of Ernie by enabling enterprises to easily develop AI-native applications on Ernie. In December, we introduced AppBuilder, a comprehensive set of tools on our public cloud to support enterprises in swiftly building AI-native applications from the ground up. These tools enable enterprises to incorporate capabilities such as organizing incoming messages for prioritization, creating drafts, summarizing meeting notes, and facilitating GenAI-enabled data analysis. Additionally, we have included several ready-made templates, such as AI Agent, to enable enterprises to promptly create applications for validating their ideas. Such progress is in large part based on our own pioneering experience in developing AI-native applications. In addition to the tools for app development, enterprises partner with us because we have the most powerful and most efficient foundation models and the most cost-effective cloud platform for running apps. In December, about 26,000 enterprises were actively using Ernie through the API on a monthly basis, increasing 150% quarter over quarter. And Ernie is now handling more than 50 million queries every day. That's up 190% quarter over quarter, a significant rise in third-party calls. As the number of AI-native applications hosted on our public cloud continues to rise, we are poised to accelerate the enhancement of our model performance by incorporating feedback from our users and customers. This will further widen the gap between Ernie and our domestic peers. Moreover, as more and more AI-native applications on Ernie become popular, Ernie will likewise be successful and support a robust ecosystem around it, steering our multi-year growth forward. In conclusion, we find ourselves amidst a tremendous opportunity in GenAI and foundation models. With Ernie and Ernie Bot, we have started generating incremental revenues. In 2024, we expect the AI revenue contribution to become more meaningful, while our core business will remain resilient. Moreover, we are preparing to serve the next wave in the development of Ernie.
In the future, we believe that there will be millions of AI-native applications on Ernie, with a vast majority of them devised by our cloud customers. The expanding Ernie ecosystem will then unlock numerous monetization potentials for Baidu. Now, let me recap the key highlights for each business for the fourth quarter of 2023. Our mobile ecosystem exhibited a solid performance across revenue, margin, and cash flow. In the fourth quarter, Baidu Core's online marketing revenue increased by 6% year-over-year, driven by verticals in travel, healthcare, business services, and others. In Q4, we generated several hundred million RMB in incremental ad revenue due to improvements in ad tech. We continued to use Ernie to enhance our monetization system, including targeting capabilities and the bidding system, in the quarter. Starting from January, our monetization system has been able to generate real-time text-based ads for search queries, demonstrating significant technology improvement in both targeting and ad creation. As of today, our new marketing platform has attracted about 10,000 advertisers. An advertiser in the medical aesthetics industry began to adopt our new marketing platform in Q4. With the AI co-pilot, the advertiser can articulate its requirements comprehensively through natural language and multi-round conversations for further clarification and explanation. The conversational experience has assisted in building optimized search and feed campaigns by generating relevant ad content. Our platform has also helped them reach highly targeted audiences while dynamically allocating budget to drive conversions. The advertiser achieved an increase of 22% in conversion rate and a reduction of 5% in cost for sales lead acquisition after using this enhanced platform and its capabilities. Currently, only a small portion of our entire advertiser base has adopted this new marketing platform, which means there is a significant opportunity for future growth. All these initiatives have helped advertisers improve their conversion, leading to an increase in their ad budget on our platform in the quarter. We expect continuous growth in ad revenue from GenAI and foundation models in 2024 and beyond. While we enhanced our ad technologies, we also introduced AI chatbots for brands, an innovative ad product built on top of Ernie Bot, in Q4. The AI chatbot has further enriched our ad product portfolio. It can enhance user engagement and customer service while also capturing customer attention and driving customer demand. Since last October, China Feihe, a local infant formula company, has adopted the AI chatbot to promote its brand. The AI chatbot has enabled Feihe to have multi-round interactions with its customers, increase brand recognition, and gain valuable insights into its potential consumers. These profound insights have also helped Feihe understand customers' views of its brand and products, leading to an enhanced marketing strategy. AI chatbots for brands have garnered significant interest from brand advertisers, and we expect more and more businesses to embrace the AI chatbot in the future. As I previously mentioned, we're using Ernie to build our apps. With continuous enhancement of our AI-native product offerings, we believe more and more users will come to Baidu's platform. This, in turn, will assist us in gaining more market share in both user engagement and online advertising revenues. Looking into 2024, the mobile ecosystem should continue to generate steady profit and cash flow.
AI cloud revenue grew by 11% year over year to RMB 5.7 billion and continued to improve profitability in the fourth quarter. Revenue from GenAI and foundation models represented 4.8% of our AI cloud revenue in Q4. The increasing demand for model building played a significant role in this accelerated revenue growth, along with increasing contributions from inference. We have seen a growing number of enterprises, in particular tech companies, turning to our public cloud to build their models. Additionally, the AI cloud revenue generated by Baidu Core's other business groups, such as the mobile ecosystem group and the intelligent driving group, was about RMB 2.7 billion in Q4. Within the Q4 internal cloud revenue, GenAI and foundation models accounted for about 14%. On a combined basis, the total internal and external AI cloud revenue was RMB 8.4 billion in Q4, with GenAI and foundation models contributing around RMB 656 million. GenAI and LLMs have become pivotal considerations for many enterprises, driving a shift in the cloud industry from general-purpose computing to AI computing. This evolution is reshaping the competitive dynamics within the cloud industry, strengthening our lead in AI, and expanding our total addressable market. Enterprises choose us because we possess the most powerful and cost-effective AI infrastructure for both model training and inference in China, thanks to our unique four-layer AI architecture, which drives consistent efficiency gains, and our years of experience in AI. Last quarter, we explained that 98% of LLM training time on our AI infrastructure was effective. Moreover, our GPU network is running at 95% utilization for training LLMs. Both of these metrics have set industry benchmarks. Within our MaaS, we offer ModelBuilder and AppBuilder, which are two sets of tools that help enterprises effortlessly build models and develop apps. As of now, enterprises have built about 10,000 models on our MaaS. Since its inception, AppBuilder has facilitated the creation of thousands of AI-native apps. As more and more applications are being built on our MaaS, we will have greater revenue potential going forward. Looking into 2024, AI cloud should maintain strong growth in revenue and generate profits at the non-GAAP operating level. Our intelligent driving business continues to focus on achieving UE breakeven for Apollo Go. In Wuhan, Apollo Go's largest operation, about 45% of our orders were provided by fully driverless vehicles in Q4. This metric surpassed 50% in January. The increase is because we intensified operations during peak hours in areas with complex traffic conditions and further expanded our operating area in the past few months. This development resulted from our ongoing efforts to improve technology through safely operating Apollo Go on public roads. In China, Apollo Go provided about 839,000 rides in Q4, marking a 49% year-over-year increase. In early January, the cumulative rides offered by Apollo Go exceeded 5 million. The substantial data collected from operations will further help us enhance the efficiency of safe operations. Looking into 2024, we will remain focused on getting closer to Apollo Go's UE breakeven target and managing our costs and expenses to reduce losses in intelligent driving. Upon reaching UE breakeven, we plan to swiftly replicate our successful operations in Wuhan to other regions. In summary, we are facing tremendous opportunities in GenAI and foundation models. We will continue to invest in these opportunities.
And at the same time, we will strive to optimize our cost and expense structure for each business line to improve operational efficiency. With that, let me turn the call over to Rong to go through the financial results.
Rong
Thank you, Robin. Now let me walk through the details of our fourth quarter and full year 2023 financial results. We closed 2023 with solid financial results. Total revenues in the fourth quarter were RMB 35 billion, increasing 6% year-over-year. Total revenues for the full year 2023 were RMB 134.6 billion, increasing 9% year-over-year. Baidu Core's Q4 revenue was RMB 27.5 billion, increasing 7% year-over-year. In 2023, Baidu Core generated RMB 103.5 billion in revenue, increasing 8% year-over-year. Baidu Core's online marketing revenue increased 6% year-over-year to RMB 19.2 billion in Q4, accounting for about 70% of Baidu Core's revenue. Baidu Core's online marketing revenue was up 8% year-over-year in 2023. Baidu Core's non-online marketing revenue was RMB 8.3 billion, up 9% year-over-year. For the full year 2023, the non-online marketing business increased 9% year-over-year. The increase in our non-online marketing business was mainly driven by the AI cloud business. Revenue from iQIYI was RMB 7.7 billion in Q4, increasing 2% year-over-year. Revenue from iQIYI was RMB 31.9 billion in 2023, increasing 10% year-over-year. Cost of revenue was RMB 17.4 billion in Q4, increasing 3% year-over-year, primarily due to an increase in costs related to the AI cloud business, partially offset by a decrease in content costs. Cost of revenue was RMB 65 billion in 2023, increasing 2% year-over-year, primarily due to an increase in traffic acquisition costs, partially offset by a decrease in content costs and costs related to the AI cloud business. Operating expenses were RMB 12.1 billion in Q4, increasing 5% year-over-year, primarily due to an increase in server depreciation expenses and server custody fees, which support GenAI research and development inputs, as well as channel spending and promotional marketing expenses. Operating expenses were RMB 47.7 billion in 2023, increasing 9% year-over-year, primarily due to an increase in channel spending and promotional marketing expenses, as well as server depreciation expenses and server custody fees, which support GenAI research and development inputs. Operating income was RMB 5.4 billion in Q4. Baidu Core operating income was RMB 4.7 billion, and Baidu Core operating margin was 17% in Q4. Operating income was RMB 21.9 billion in 2023. Baidu Core operating income was RMB 18.8 billion, and Baidu Core operating margin was 18% in 2023. Non-GAAP operating income was RMB 7.1 billion in Q4. Non-GAAP Baidu Core operating income was RMB 6.2 billion, and non-GAAP Baidu Core operating margin was 23% in Q4. Non-GAAP operating income was RMB 28.4 billion in 2023. Non-GAAP Baidu Core operating income was RMB 24.7 billion, and non-GAAP Baidu Core operating margin was 24% in 2023. In Q4, total other loss, net was RMB 2.5 billion, compared to total other income, net of RMB 1.8 billion for the same period last year, mainly due to a pickup of losses from equity method investments as a result of the modification of the terms of underlying preferred shares. Income tax benefit was RMB 96 million. In 2023, total other income, net was RMB 3.3 billion, compared to total other loss, net of RMB 5.8 billion last year, mainly due to a fair value gain of RMB 198 million from long-term investments this year, compared to a fair value loss of RMB 3.9 billion last year, and a decrease of RMB 2.2 billion in impairment of long-term investments. Income tax expense was RMB 3.6 billion. In Q4, net income attributable to Baidu was RMB 2.6 billion, and diluted earnings per ADS were RMB 6.07.
Net income attributable to Baidu Core was RMB 2.4 billion, and net margin for Baidu Core was 9%. Non-GAAP net income attributable to Baidu was RMB 7.8 billion. Non-GAAP diluted earnings per ADS were RMB 21.86. Non-GAAP net income attributable to Baidu Core was RMB 7.5 billion, and non-GAAP net margin for Baidu Core was 27%. In 2023, net income attributable to Baidu was RMB 20.3 billion, and diluted earnings per ADS were RMB 55.08. Net income attributable to Baidu Core was RMB 19.4 billion, and net margin for Baidu Core was 19%. Non-GAAP net income attributable to Baidu was RMB 28.7 billion. Non-GAAP diluted earnings per ADS were RMB 80.85. Non-GAAP net income attributable to Baidu Core was RMB 27.4 billion, and non-GAAP net margin for Baidu Core was 26%. As of December 31, 2023, cash, cash equivalents, restricted cash, and short-term investments were RMB 205.4 billion, and cash, cash equivalents, restricted cash, and short-term investments, excluding iQIYI, were RMB 200 billion. Free cash flow was RMB 25.4 billion, and free cash flow, excluding iQIYI, was RMB 22.1 billion in 2023. Baidu Core had approximately 35,000 employees as of December 31, 2023. With that, operator, let's now open the call to questions.
Operator
Thank you. We will now begin the question and answer session. To ask a question, you may press star, then 1 on your touch-tone phone. If you are using a speakerphone, please pick up your handset before pressing the keys. If you would like to withdraw your question, please press star, then 2. Our first question today will come from Alicia Yap of Citi. Please go ahead.
Alicia Yap
Hi, thank you. Good evening, management. Thanks for taking my questions. My question is on the outlook. How does management think about the macroeconomic landscape in China for 2024? What's management's view on the 2024 growth outlook for Baidu as a whole? And also, what percentage of Baidu's total revenues will be contributed by AI-related revenue in 2024? Thank you.
Robin
Hi Alicia, this is Robin. Before we dive into the outlook, let's take a quick look back at last year. We had a very challenging macroeconomic environment, but our business demonstrated very solid performance. We invested very aggressively in GenAI, but our non-GAAP operating margin expanded year over year, and revenue experienced solid growth. More importantly, we began to generate incremental revenue from GenAI and foundation models. And for this year, we have noticed that both the central and local governments are putting in efforts to grow the economy. And during the eight-day Chinese New Year holiday, we saw growth in consumption, particularly in the travel sector. But we're still operating in a macro environment with a lot of uncertainty. We are closely monitoring significant economic stimulus plans, which we think are essential for achieving this year's goals. But that being said, Baidu is facing a lot of opportunities. Our core business remains solid, and the incremental revenue from GenAI and foundation models will increase to several billion RMB in 2024. This will contribute to the growth of our total revenue. More specifically, thanks to our leading position in LLMs and GenAI, enterprises are increasingly building models and developing apps on Baidu's cloud. And for our mobile ecosystem, we have already accumulated a huge user base, and we keep renovating our products and enhancing our monetization capabilities through AI innovation. So when we combine cloud and mobile, I think we will be able to sustain our long-term growth, which we think will be faster than China's GDP growth.
Robin
Thank you.
Alex Yao
Thank you.
Operator
Okay, and our next question today will come from Alex Yao of J.P. Morgan. Please go ahead. Hi, Alex. Your line is open. Are you muted?
Alex
Yeah. Sorry, I was muted. Good evening, management, and thank you for taking my questions. I have a couple of questions on cost structure and margin trends. First of all, how much more room do we see in cost-cutting and optimization? How should we look at AI-related investments? In the past, you guys discussed that there will be a lag between chip investments and the AI revenue contribution. How should we look at the margin trend into 2024 if you intend to expand the investments?
Rong
Hi, Alex. Thank you so much for your questions. This is Rong. I think alongside the investments in our GenAI businesses, it's pretty sure that there is still room for us to manage the costs and expenses in our legacy businesses. As we look into 2024, we will continue to focus on our core businesses, and we will also work to reduce our resource allocation to non-strategic businesses. Additionally, we are continuously enhancing our overall organizational efficiency by removing layers, simplifying execution, and flattening the organizational structure. So heading into this year, we are very committed to the ongoing optimization of our operations, ensuring that we have a more productive team. And with all these measures in place, we aim to keep Baidu Core's earnings solid, with our mobile ecosystem continuing to deliver strong margins and generate very strong cash flows, while the AI cloud business continues its profitability. We have managed to maintain solid operating profit margins despite our investments in AI. If you still remember, from the year 2023 on, we started to invest in generative AI and large language models. This investment is primarily reflected in our CapEx and mainly related to purchasing chips and servers for model training and inference. As you know, CapEx will be depreciated over several years. So despite a 38% year-over-year increase in Baidu Core's CapEx in 2023, our non-GAAP operating margin still expanded by 2 percentage points on a year-over-year basis. Looking forward, in the process of developing our new AI business, it's inevitable to make new investments. However, these investments are not expected to significantly affect our margins or profits. During the early stage of the market development, we also will not overly prioritize margins for our AI cloud business. We believe that in the long run, this business is expected to yield much better margins. Additionally, there may be some promotional activities for the AI-native 2C products, but we will carefully manage and closely monitor the ROI to balance the investments and growth. We are happy to see that our efforts in investing in new initiatives have begun to yield early results. As Robin has mentioned just now, the incremental revenue generated from the AI-native 2C products and the ad tech improvements reached several hundred million RMB in Q4. And the incremental AI cloud revenue generated from GenAI and foundation models also contributed 4.8% of the total AI cloud revenue. I think all of these promising top-line achievements have strengthened our confidence in our AI strategy. So going forward, we will remain steadfast in our commitment to the development of GenAI and large language models.
Robin
Thank you, Alex.
Operator
Our next question today will come from Gary Yu of Morgan Stanley. Please go ahead.
Gary Yu
Hi, good evening, and thank you for taking my question. I have a question regarding the AI contribution. For the AI-related revenue contribution, is there a way you can quantify or prove that the ad revenues Baidu will be generating are purely incremental contributions from AIGC and not cannibalizing your existing search business? And if AI is purely incremental, should we be expecting faster than average growth? And excluding AI, how should we think about the core search growth rate for 2024? Thank you.
Robin
Hi, Gary. This is Robin. You know, we are the largest search engine in China. We have close to 700 million monthly active users, and we have established a very robust brand presence among Chinese internet and mobile users. They rely on us for comprehensive and reliable information. So we have a strong and stable base of revenue and profits. But we are also very sensitive to macro because our advertising business has a very broad coverage of different verticals. I mentioned earlier that there are still uncertainties with the macro environment. But GenAI and LLMs are unlocking new opportunities on both the monetization front and the user engagement front. I think it's easier to quantify the incremental revenue on the monetization front. I mentioned earlier that GenAI has already helped increase advertising eCPM. And our upgraded monetization system has allowed us to improve targeting capabilities, thereby generating and presenting more relevant ads. We earned several hundred million RMB in Q4 from this kind of initiative, and the incremental revenue will grow to several billion this year. It's harder to quantify the user engagement part. GenAI is helping us improve the user experience. We have seen initial outcomes in search and Wenku, and we will continue introducing new features going forward. This initiative will help us improve user mind share and time spent over time and bring us even larger potential. So I think the purely incremental revenue will come from both the monetization side and the user engagement side.
Robin
Thank you.
Operator
Our next question today will come from Lincoln Kong of Goldman Sachs. Please go ahead.
Lincoln Kong
Thank you, management, for taking my question. My question is regarding the cloud business. How should we look at the incremental revenue growth that's driven by GenAI? What is the product mix for the GenAI cloud? And what exactly are the offerings that are the primary growth drivers here? And when we're looking at 2024, how should we expect the overall AI cloud revenue to grow this year? And what will be the margin trend this year as well? Thank you.
AI
Thank you for the question, Lincoln. As Robin just mentioned, the total revenue from GenAI and foundation model related businesses, including both internal and external revenue, already reached RMB 656 million in Q4, and this number should grow to several billion RMB for the full year 2024. We have seen increasing interest from enterprises in using GenAI and LLMs to develop new applications and features. To achieve this goal, they are actively building models to power their products and solutions. So this is how we generate the majority of the revenue from external customers. Meanwhile, we are seeing a significant increase in model inferencing revenue from external customers. While the revenue from inferencing is still small, we believe that over the long term, it will become a significant and sustainable revenue driver. We think revenue generated by internal customers is also quite important because a significant portion of such revenue is for model inferencing. Baidu is the first company to use GenAI and LLMs to reconstruct all its businesses and products. As the number of products and features powered by GenAI and LLMs continues to increase, Ernie API calls from internal customers have been increasing rapidly and have reached a significant magnitude. Such development has proven that Ernie and Ernie Bot can well enhance productivity and efficiency in real-world applications. And we also believe a growing number of external customers will use Ernie to develop their own apps and drive our external revenue in the future. Regarding your question about our product offerings, we have the most powerful AI infrastructure for model training and inferencing in China. This infrastructure helps our customers build and run models cost-effectively. Additionally, as Robin mentioned earlier, our MaaS offers various models and a full set of tools, namely ModelBuilder and AppBuilder, for model building and application development. In addition to that, we have developed our own AI-native solutions, such as GBI, that is, Generative Business Intelligence. Those applications are helping enterprises increase productivity and efficiency. In addition to the incremental opportunity related to AI, GenAI and foundation models also bring new opportunities to our legacy cloud businesses. Firstly, we continue to win customers and projects for the CPU-centric cloud because we are highly recognized for our AI infrastructure and our CPU-centric cloud offerings, especially in the internet, tech, and education sectors. Secondly, GenAI and foundation models have allowed us to build AI solutions for our customers more efficiently than before, facilitating digital and intelligent transformations for traditional industries. Both of these two factors are driving growth for our cloud revenue. And overall, we should see cloud revenue growth accelerate in 2024 over last year. Additionally, we are pretty confident in maintaining profitability for our AI cloud. For the enterprise cloud, we should be able to consistently improve gross margins for the legacy cloud businesses. As for the GenAI and LLM businesses, the market is still at a very early stage of development, so we will hold a pretty dynamic pricing strategy to quickly educate the market and expand our penetration into more enterprise customers. We believe over the long term, the new business should have higher normalized margins than the traditional cloud businesses. Thank you.
Operator
Our next question today will come from Thomas Chong of Jefferies. Please go ahead.
Thomas Chong
Hi, good evening. Thanks, management, for taking my questions. May I ask about the pace of our AI 2C product development? How has the traffic growth been, and are there any key metrics that can be shared on the new generative search? How does AI benefit search traffic and time spent, and when should we see the explosion of traffic or a super app? Thank you.
Robin
Yeah, we are reconstructing all of our 2C products using generative AI. I think GenAI and foundation models are making all of our products more powerful. Over the past few months, we've made significant strides in this kind of effort, and the initial user feedback has been encouraging. For search, the introduction of GenAI has enabled Baidu to answer a broader range of questions, including more complex, open-ended, and comparative queries. With Ernie Bot, we can provide a direct and clear answer in a more interactive way. During the past few months, instead of just listing some content and providing some links, more and more search results were generated by Ernie Bot. As a result, users are engaging with Baidu more frequently and asking new questions. For example, more and more users are coming to Baidu for content creation, be it text or images. During the Chinese New Year holiday, Baidu helped users create New Year greeting messages and generate personalized e-cards for their loved ones. This is not a typical use case for search engines, but we see a large number of users relying on Baidu for this kind of use. Going forward, we will increasingly use Ernie Bot to generate answers for search queries and then use multi-round conversations to clarify user intent so that complicated user needs can be addressed through natural language. While these initiatives have resulted in an enhanced search experience, we are still in the early stages of using Ernie Bot to reconstruct Baidu Search. We will continue to test and iterate GenAI-enabled features based on user feedback. And we will follow our typical playbook for testing and fine-tuning new experiences before upgrades are ready for a larger-scale rollout. Overall, we believe GenAI will complement traditional search, ultimately increasing user retention, engagement, and time spent on Baidu. In addition to search, with Ernie Bot acting as a co-pilot, Wenku has transformed from an application for users to find templates and documents into a one-stop shop for users to create content in a wide range of formats. Year-to-date, I think about 18% of new paying users were attracted by the GenAI features of Wenku. I want to emphasize that we are in the early stages of using Ernie Bot to reconstruct our apps and build new ones. And at the same time, we are attracting and helping enterprises build apps on Ernie. We believe the success of Ernie depends on its wide and active adoption, whether through Baidu apps or through third-party apps.
Robin
Thank you.
Operator
Our next question today will come from James Lee of Mizuho. Please go ahead.
James Lee
Great. Thanks for taking my questions. Can you guys maybe talk about Ernie's technology roadmap for 2024? Does it include, you know, multimodal features, maybe similar to Sora, or maybe opening up an AI store, or potentially launching an AI agent? Is there any sort of milestone or key metrics you can speak to? And the second part to that question is on the cost of running GenAI. How should we think about the puts and takes on managing inferencing costs going forward? You know, obviously you talked about some ways to make it more efficient. Are there any additional levers to optimize this process? Thanks.
Robin
Yeah, the chips we have on hand should enable us to advance EB4 to the next level. As I mentioned earlier, we will take an application-driven approach to improving Ernie and let our users and customers tell us where we should improve and adjust our models. It could be building multi-modalities, agents, increasing reliability, and so on. It's important to note that we are focusing on using Ernie to bring real value to users and customers, rather than simply achieving high rankings in research publications. And you're right, price is a very important issue. Making high-performance foundation models affordable is critical for large-scale adoption. I mentioned earlier that we have been continuously reducing model inference cost. Now the inference cost of EB 3.5 is about 1% of the March 2023 version. By doing so, more and more enterprises are increasingly willing to test, develop, and iterate their applications on Ernie. We understand that many customers tend to balance efficiency with cost and speed, so we also offer smaller language models and help our customers leverage MoE, that is, mixture of experts, for the best performance. With our end-to-end approach, we believe there's still ample room to reduce the cost of our most powerful models and make them increasingly affordable to our customers. And this will further drive the adoption of Ernie. Internally, we are closely monitoring the number of apps developed on Ernie. As I mentioned previously, Ernie is now handling over 50 million queries per day. And right now, Ernie API calls from internal applications are still larger than external app calls. But calls for Ernie models in different sizes from external apps have been increasing rapidly. We're just at the beginning of this journey. Ernie will only become more powerful, smarter, and more useful as more and more end users use it, be it through Baidu apps or through third-party apps. This will enable us to cultivate an ecosystem around Ernie. As these apps and models are actively used by end users, they will also generate significant inferencing revenue for us.
Robin
Thank you.
Operator
Our next question today will come from Charlene Liu of HSBC. Please go ahead.
Charlene Liu
Thank you. Good evening, management, and thank you again for the opportunity. I have a question related to Ernie. How does Ernie's enterprise adoption compare to its peers? Can you please kindly share with us the latest number of enterprises that are using Ernie to build models and applications, and help us understand how that has grown versus last quarter and what the underlying drivers may be? Lastly, can you help us also understand whether we can assume that enterprises that adopt the Ernie API integration will be very unlikely to use other LLMs? Thank you so much.
AI
Great questions, Charlene. So as Robin just mentioned, about 26,000 enterprises of different sizes, spreading over different verticals, called our Ernie API from our cloud in December, marking a 150% increase quarter over quarter. Ernie API calls have exceeded 50 million on a daily basis, and we believe no one else in China has gained so many customers and received such a high volume of API requests. So enterprises chose us primarily for the following reasons. Firstly, we have the most cost-efficient AI infrastructure for model training and inferencing in China, primarily because of our strong ability in end-to-end optimization. As I mentioned before, GenAI and large language models are reshaping the competitive landscape of China's public cloud industry and enhancing our competitive advantage. Our strong ability in managing an extensive GPU-centric cloud with a very high GPU utilization rate has continuously enhanced our AI infrastructure. As a result, we can help enterprises build and run their models and develop AI-native apps at low cost on our cloud. Secondly, the EB family of models has attracted many customers to our cloud. Over the past few months, we have consistently enhanced Ernie's performance, receiving positive feedback from customers. We also offer Ernie models in different sizes to better accommodate customers' needs regarding cost structures. Thirdly, we were the first company in China to launch a model-as-a-service offering, which is a one-stop shop for LLM and AI-native application development. So our MaaS makes it easy for enterprises to use LLMs. We also provide a toolkit to help enterprises easily train or fine-tune their models and develop applications on our cloud. With the toolkit, customers can cost-effectively train purpose-built models by incorporating their proprietary data, and they can directly use the Ernie API to power their own applications as well. We can also help them support different product features using different models, adopting the MoE approach in app development. As a result, enterprises can focus on identifying customers' pain points rather than expending their efforts on programming. All of these initiatives have helped us establish a first-mover advantage in GenAI and LLMs. For your last question, as more customers use our MaaS platform to develop AI-native applications aimed at attracting users, substantial user and customer insights will be generated and accumulated on our cloud. These insights will also allow us to further refine the toolkits. As our tools become increasingly customer-friendly and help enterprises effortlessly fine-tune models and create apps, they will be more inclined to stay with us. Additionally, it is worth noting that at the current stage of employing large language models, it is crucial for customers to create suitable prompts for their chosen models. Since they have to invest considerable effort in building and accumulating their best prompts for using large language models, it becomes challenging for them to switch to another model, because they would have to re-establish their prompt portfolio. So as a result, with increasing adoption and active usage of our platform, customer satisfaction and switching costs will help us increase customer retention.
Alex Yao
Thank you.
Operator
Our next question today will come from Miranda Zhang of Bank of America Securities. Please go ahead.
Miranda Zhang
Good evening. Thank you for taking my question, which is about AI chips. Wondering what the impact is on your AI development after the recent further chip restrictions from the U.S. Is there any update on alternative chips? And given the chip constraints, how is Baidu developing its AI models, products, and monetization differently versus overseas peers? What can be achieved and what may become difficult? And what will the company do to keep up with overseas peers in the next few years? Thank you.
Robin
Hi. In the near term, the impact is minimal for our model development, product reinventions, or monetization. As I mentioned last quarter, we already have the most powerful foundation model in China, and our AI chip reserve enables us to continue enhancing Ernie for the next one or two years. And for model inference, it requires less powerful chips. Our reserve and the chips available on the market are sufficient for us to power many AI-native applications for end users and customers. And in the long run, we may not have access to the most cutting-edge GPUs, but with the most efficient homegrown software stack, the user experience will not be compromised. There's ample room for innovations in the application layer, the model layer, and the framework layer. Our end-to-end, self-developed four-layer AI architecture, along with our strong R&D team, will support us in using less advanced chips for efficient model training and inferencing. This provides Baidu with a unique competitive advantage over our domestic peers. And for enterprises and developers, building applications on Ernie will be the best and most efficient way to embrace AI.
Robin
Thank you.
Operator
Our next question today will come from Sean Wan of UBS. Please go ahead.
Sean Wan
Good evening, management. This is Sean on behalf of Canis. Thanks for taking my question. My question is, in recent days we have seen numerous developments in text-to-video or video generation technology. How do you envision this technology impacting the broader AI industry development in China, and what implications does it hold for Ernie? Could you elaborate on your strategic roadmap for Ernie moving forward? Furthermore, how does Ernie currently perform in text generation, text-to-image, and text-to-video generation tasks, and what improvements do you foresee in these areas? Thank you.
Robin
This is Robin. First of all, multimodality, or the integration of multiple modalities such as text, audio, and video, is an important direction for future foundation model development. It is a must-have for AGI, and Baidu has already invested in this area and will continue to do so in the future. Secondly, if we look at the development of foundation models, the market for large language models is huge and still at a very early stage. Even the most powerful language models in the world are still not good enough for a lot of applications. There's plenty of room for innovation. Smaller-sized models, MoE, and agents are all evolving very quickly. We strive to make these offerings more accessible to all types of enterprises and solve real-world problems in various scenarios. And thirdly, in the realm of visual foundation models, one notably significant application with vast market potential is autonomous driving, in which Baidu is a pioneer and a global leader. We have been using diffusion and transformers to train our video generation models for self-driving purposes. We have also consistently made strides in object classification, detection, and segmentation, thereby better understanding the physical world and the rules of the physical world. This has enabled us to translate images and videos captured on the road into specific tasks, resulting in more intelligent, adaptable, and safe autonomous driving technology. Overall, our strategy is to develop the most powerful foundation models to solve real-world problems, and we will continue to invest in this area to ensure our leadership position.
Robin
Thank you.
Alex Yao
Thank you.
Operator
And ladies and gentlemen, at this time, we will conclude the question and answer session and conclude Baidu's fourth quarter and fiscal year 2023 earnings conference call. We do thank you for attending today's presentation. You may now disconnect your lines.