Baidu, Inc.

Q4 2023 Earnings Conference Call

2/28/2024

spk13: Hello and thank you for standing by for Baidu's fourth quarter and fiscal year 2023 earnings conference call. At this time, all participants are in listen-only mode. After management's prepared remarks, there will be a question-and-answer session. Today's conference is being recorded. If you have any objections, you may disconnect at this time. I would now like to turn the meeting over to your host for today's conference, Juan Lin, Baidu's Director of Investor Relations.
spk21: Hello everyone, and welcome to Baidu's fourth quarter and fiscal year 2023 earnings conference call. Baidu's earnings release was distributed earlier today, and you can find a copy on our website as well as on newswire services. On the call today we have Robin Li, our co-founder and CEO; Rong Luo, our CFO; and Dou Shen, our EVP in charge of Baidu's AI Cloud Group, ACG. After our prepared remarks, we will hold a Q&A session. Please note that the discussion today will contain forward-looking statements made under the safe harbor provisions of the U.S. Private Securities Litigation Reform Act of 1995. Forward-looking statements are subject to risks and uncertainties that may cause actual results to differ materially from our current expectations. For detailed discussions of these risks and uncertainties, please refer to our latest annual report and other documents filed with the SEC and the Hong Kong Stock Exchange. Baidu does not undertake any obligation to update any forward-looking statement except as required under applicable law. Our earnings press release and this call include discussions of certain unaudited non-GAAP financial measures. Our press release contains a reconciliation of the unaudited non-GAAP measures to the unaudited most directly comparable GAAP measures and is available on our IR website at ir.baidu.com. As a reminder, this conference is being recorded. In addition, a webcast of this conference call will be available on Baidu's IR website. I will now turn the call over to our CEO, Robin.
spk11: Hi everyone. We concluded 2023 with a solid quarter. Baidu Core's total revenue for the fiscal year increased by 8 percent year over year, and the non-GAAP operating margin expanded from 22 percent to 24 percent. This improvement highlighted the resilience of our core business, which serves as a strong foundation for our ventures in Ernie and ErnieBot. 2023 marked the year of Gen-AI and foundation models, a massive technology shift that will re-engineer business processes and bring about a real renaissance in every sector of the economy. Baidu is well prepared to benefit greatly from this groundbreaking economic revolution. We progressed from talking about the opportunities of Gen-AI and foundation models to actively applying Ernie and ErnieBot at scale. As the front runner in AI, Baidu proudly became the first public company globally to launch a GPT model, with our Ernie 4.0 standing tall as the most powerful foundation model in China. Ernie continues to gain market recognition, as evidenced by Ernie API calls from multiple well-known companies. Notably, Samsung uses the Ernie API on its Galaxy S24 5G series, Honor uses the Ernie API on its MagicOS 8.0, and AutoHome uses the Ernie API to power its multiple AIGC apps. These partnerships further solidify our leadership in Gen-AI. Furthermore, we continued improving the efficiency of Ernie. For example, the inference cost of the Ernie API is now only about 1% of that of the March 2023 version. With lower inference costs, Ernie will become increasingly accessible to users and enterprises. We are able to do that primarily because of our unique four-layer AI architecture and our strong capability in end-to-end optimization. Additionally, we are offering LLMs in smaller sizes. These smaller models can balance performance and efficiency, better serving various customer needs. Since Q2 2023, we have actively utilized Ernie to revolutionize our products and services, creating AI-native experiences. 
We believe real applications are essential to unleashing the full business potential of Ernie and ErnieBot. Recently, we began to generate incremental revenues from Ernie and ErnieBot. In the fourth quarter, we earned several hundred million RMB, primarily from ad technology improvements and from helping enterprises build their own models. I'll provide a more detailed explanation in the business review section. Looking into 2024, we believe this incremental revenue will multiply to several billion RMB, primarily from the advertising and AI Cloud businesses. While we are just beginning to commercialize Gen-AI and foundation models, we see enormous monetization potential in this groundbreaking technology. We envision Ernie as the foundation system of the future, serving as the base for millions of AI-native applications developed by third parties and Baidu. This paradigm will enable us to create an ecosystem around Ernie, which opens up various revenue sources. Now, the key highlights in Baidu's AI-native applications. Equipped with an AI copilot, Baidu Wenku has become a platform for users to create content in a wide range of formats, helping them express their ideas. In addition to summarizing, creating, and expanding content, users particularly appreciate Baidu Wenku's features for automatically turning their inspiration into PowerPoint presentations. We have been consistently improving and enriching its AI features. For example, we recently introduced new ones, such as mind maps for organizing information. We are also reconstructing Baidu Search using Ernie. Last quarter, we talked about how we used Ernie to generate search results, improving the one-shot experience. In Q4, Baidu Search introduced related information alongside the generated search results at scale to offer users useful and relevant content to explore. In Q4, we also began facilitating certain search queries with multi-round conversations to delve deeper into understanding users' intentions. 
Overall, Gen-AI-enabled search complements traditional search, expanding the TAM of Baidu Search by serving a wider range of information needs and answering content-creation-related questions. We are actively encouraging SMEs in service industries to build AI chatbots as new landing pages for Baidu Search. AI chatbots enable them to better serve potential buyers directed to them by us, driving transactions on our platform. At the same time, our AI chatbots can help merchants increase productivity. Currently, about 4,000 merchants are using our AI chatbots, particularly in healthcare, education, travel, legal, B2B, and auto, where users often interact with merchants to learn about their offerings before making purchasing decisions. We believe Gen-AI-enabled search will help increase user retention on Baidu, reinforcing our position as the primary entry point for users seeking information and services. We are also expanding the reach of Ernie by enabling enterprises to easily develop AI-native applications on Ernie. In December, we introduced AppBuilder, a comprehensive set of tools on our public cloud to support enterprises in swiftly building AI-native applications from the ground up. These tools enable enterprises to incorporate capabilities such as organizing online messages for prioritization, creating drafts, summarizing meeting notes, and facilitating Gen-AI-enabled data analysis. Additionally, we have included several ready-made templates, such as AI Agent, to enable enterprises to promptly create applications for validating their ideas. Such progress is in large part based on our own pioneering experience in developing AI-native applications. In addition to the tools for app development, enterprises partner with us because we have the most powerful and most efficient foundation models and the most cost-effective cloud platform for running apps. 
In December, about 26,000 enterprises were actively using Ernie through our API on a monthly basis, increasing 150% quarter over quarter. And Ernie is now handling more than 50 million queries every day, up 190% quarter over quarter, a significant rise in third-party API calls. As the number of AI-native applications hosted on our public cloud continues to rise, we are poised to accelerate the enhancement of our model performance by incorporating feedback from our users and customers. This will further widen the gap between Ernie and our domestic peers. Moreover, as more and more AI-native applications on Ernie become popular, Ernie will likewise be successful and support a robust ecosystem around it, steering our multi-year growth forward. In conclusion, we find ourselves amidst a tremendous opportunity in Gen-AI and foundation models. With Ernie and ErnieBot, we have started generating incremental revenue. In 2024, we expect AI revenue's contribution to become more meaningful, while our core business will remain resilient. Moreover, we are preparing to serve the next wave in the development of Ernie. In the future, we believe there will be millions of AI-native applications on Ernie, with the vast majority of them developed by our cloud customers. The expanding Ernie ecosystem will then unlock numerous monetization potentials for Baidu. Now, let me recap the key highlights for each business for the fourth quarter of 2023. Our mobile ecosystem exhibited solid performance across revenue, margin, and cash flow. In the fourth quarter, Baidu Core's online marketing revenue increased by 6% year over year, driven by verticals in travel, healthcare, financial services, and others. In Q4, we generated several hundred million RMB in incremental ad revenue due to improvements in ad tech. We continued to use Ernie to enhance our monetization system, including targeting capabilities and the bidding system, in the quarter. 
Starting from January, our monetization system has been able to generate real-time text-based ads for search queries, demonstrating significant technology improvements in both targeting and ad creation. As of today, our new marketing platform has attracted about 10,000 advertisers. An advertiser in the medical aesthetics industry began to adopt our new marketing platform in Q4. With the AI copilot, the advertiser can articulate its requirements comprehensively through natural language, with multi-round conversations for further clarification and explanation. The conversational experience has assisted in building optimized search and feed campaigns by generating relevant ad content. Our platform has also helped them reach highly targeted audiences while dynamically allocating budget to drive conversions. The advertiser achieved a 22% increase in conversion rate and a 5% reduction in the cost of sales lead acquisition after using this enhanced platform and its capabilities. Currently, only a small portion of our entire advertiser base has adopted this new marketing platform, which means there is a significant opportunity for future growth. All these initiatives have helped advertisers improve their conversions, leading to an increase in their ad budgets on our platform in the quarter. We expect continued growth in ad revenue from Gen-AI and foundation models in 2024 and beyond. While we enhance our ad technologies, we also introduced AI Chatbot for brands, an innovative ad product built on top of ErnieBot, in Q4. AI Chatbot has further enriched our ad product portfolio. It can enhance user engagement and customer service while also capturing customer attention and driving customer demand. Since last October, China Feihe, a local infant formula company, has adopted AI Chatbot to promote its brand. The AI Chatbot has enabled Feihe to have multi-round interactions with its customers, increase brand recognition, and gain valuable insights into its potential consumers. 
This profound insight has also helped Feihe understand customers' views of its brand, leading to an enhanced marketing strategy. AI Chatbot for brands has garnered significant interest from brand advertisers, and we expect more and more businesses to embrace the AI Chatbot in the future. As I previously mentioned, we're using Ernie to rebuild our apps. With continuous enhancement of our AI-native product offerings, we believe more and more users will come to Baidu's platform. This, in turn, will assist us in gaining more market share in both user engagement and online advertising revenues. Looking into 2024, the mobile ecosystem should continue to generate steady profits and cash flow. AI Cloud revenue grew by 11% year over year to RMB 5.7 billion and continued to improve profitability in the fourth quarter. Revenue from Gen-AI and foundation models represented 4.8% of our AI Cloud revenue in Q4. The increasing demand for model building played a significant role in this accelerated revenue growth, along with increasing contributions from inference. We have seen a growing number of enterprises, in particular tech companies, turning to our public cloud to build their models. Additionally, the AI Cloud revenue generated by Baidu Core's other business groups, such as the Mobile Ecosystem Group and the Intelligent Driving Group, was about RMB 2.7 billion in Q4. Within the Q4 internal cloud revenue, Gen-AI and foundation models accounted for about 14%. On a combined basis, the total internal and external AI Cloud revenue was RMB 8.4 billion in Q4, with Gen-AI and foundation models contributing around RMB 656 million. Gen-AI and LLMs have become pivotal considerations for many enterprises, driving a shift in the cloud industry from general-purpose computing to AI computing. This evolution is reshaping the competitive dynamics within the cloud industry, strengthening our lead in AI and expanding our total addressable market. 
Enterprises have chosen us because we possess the most powerful and cost-effective AI infrastructure for both model training and inference in China, thanks to our unique four-layer AI architecture, which drives consistent efficiency gains, and our years of experience in AI. Last quarter, we explained that 98% of LLM training time on our AI infrastructure was effective. Moreover, our GPU network is running at 95% utilization for training LLMs. Both of these metrics have set industry benchmarks. Within our MaaS platform, we offer Model Builder and App Builder, two sets of tools that help enterprises effortlessly build models and develop apps. As of now, enterprises have built about 10,000 models on our MaaS platform. Since its inception, App Builder has facilitated the creation of thousands of AI-native apps. As more and more applications are built on our MaaS platform, we will have greater revenue potential going forward. Looking into 2024, AI Cloud should maintain strong growth in revenue and generate profits at the non-GAAP operating level. Our intelligent driving business continues to focus on achieving UE (unit economics) breakeven for Apollo Go. In Wuhan, Apollo Go's largest operation, about 45% of orders were provided by fully driverless vehicles in Q4. This metric surpassed 50% in January. The increase came because we intensified operations during peak hours in areas with complex traffic conditions and further expanded our operating area in the past few months. This development resulted from our ongoing efforts to improve technology through safely operating Apollo Go on public roads. In China, Apollo Go provided about 839,000 rides to the public in Q4, marking a 49% year-over-year increase. In early January, the cumulative rides offered by Apollo Go exceeded 5 million. The substantial data collected from operations will further help enhance the efficiency of safe operations. 
Looking into 2024, we will remain focused on getting closer to Apollo Go's UE breakeven target and managing our costs and expenses to reduce losses in intelligent driving. Upon reaching UE breakeven, we plan to swiftly replicate our successful operations in Wuhan in other regions. In summary, we are facing tremendous opportunities in Gen-AI and foundation models. We will continue to invest in these opportunities, and at the same time, we will strive to optimize our cost and expense structure for each business line to improve operational efficiency. With that, let me turn the call over to Rong to go through the financial results.
spk06: Thank you, Robin. Now let me walk through the details of our fourth quarter and full year 2023 financial results. We closed 2023 with solid financial results. Total revenues in the fourth quarter were RMB 35 billion, increasing 6% year over year. Total revenues for the full year 2023 were RMB 134.6 billion, increasing 9% year over year. Baidu Core's Q4 revenue was RMB 27.5 billion, increasing 7% year over year. In 2023, Baidu Core generated RMB 103.5 billion in revenue, increasing 8% year over year. Baidu Core's online marketing revenue increased 6% year over year to RMB 19.2 billion in Q4, accounting for 70% of Baidu Core's total revenue. Baidu Core's online marketing revenue was up 8% year over year in 2023. Baidu Core's non-online marketing revenue was RMB 8.3 billion, up 9% year over year. For the full year 2023, the non-online marketing business increased 9% year over year. The increase in the non-online marketing business was mainly driven by the AI Cloud business. Revenue from iQIYI was RMB 7.7 billion in Q4, increasing 2% year over year. Revenue from iQIYI was RMB 31.9 billion in 2023, increasing 10% year over year. Cost of revenue was RMB 17.4 billion in Q4, increasing 3% year over year, primarily due to an increase in costs related to the AI Cloud business, partially offset by a decrease in content costs. Cost of revenue was RMB 65 billion in 2023, increasing 2% year over year, primarily due to an increase in traffic acquisition costs, partially offset by a decrease in content costs and costs related to the AI Cloud business. Operating expenses were RMB 12.1 billion in Q4, increasing 5% year over year, primarily due to an increase in server depreciation expenses and server custody fees, which support generative AI research and development, as well as channel spending and promotional marketing expenses. 
Operating expenses were RMB 47.7 billion in 2023, increasing 9% year over year, primarily due to an increase in channel spending and promotional marketing expenses, as well as server depreciation expenses and server custody fees, which support generative AI research and development. Operating income was RMB 5.4 billion in Q4; Baidu Core operating income was RMB 4.7 billion, and Baidu Core operating margin was 17% in Q4. Operating income was RMB 21.9 billion in 2023; Baidu Core operating income was RMB 18.8 billion, and Baidu Core operating margin was 18.2% in 2023. Non-GAAP operating income was RMB 7.1 billion in Q4. Non-GAAP Baidu Core operating income was RMB 6.2 billion, and non-GAAP Baidu Core operating margin was 23% in Q4. Non-GAAP operating income was RMB 28.4 billion in 2023; non-GAAP Baidu Core operating income was RMB 24.7 billion, and non-GAAP Baidu Core operating margin was 24% in 2023. In Q4, total other loss, net, was RMB 2.5 billion, compared to total other income, net, of RMB 1.8 billion for the same period last year, mainly due to a pick-up of losses from equity-method investments as a result of a modification of the conversion feature of the underlying preferred shares. Income tax benefit was RMB 96 million. In 2023, total other income, net, was RMB 3.3 billion, compared to total other loss, net, of RMB 5.8 billion last year, mainly due to a fair value gain of RMB 198 million from long-term investments this year, compared to a fair value loss of RMB 3.9 billion last year, and a decrease of RMB 2.2 billion in impairment of long-term investments. Income tax expense was RMB 3.6 billion. In Q4, net income attributable to Baidu was RMB 2.6 billion, and diluted earnings per ADS were RMB 6.77. Net income attributable to Baidu Core was RMB 2.4 billion, and net margin for Baidu Core was 9%. Non-GAAP net income attributable to Baidu was RMB 7.8 billion. Non-GAAP diluted earnings per ADS were RMB 21.86. 
Non-GAAP net income attributable to Baidu Core was RMB 7.5 billion, and non-GAAP net margin for Baidu Core was 27%. In 2023, net income attributable to Baidu was RMB 20.3 billion, and diluted earnings per ADS were RMB 55.08. Net income attributable to Baidu Core was RMB 19.4 billion, and net margin for Baidu Core was 19%. Non-GAAP net income attributable to Baidu was RMB 28.7 billion. Non-GAAP diluted earnings per ADS were RMB 80.85. Non-GAAP net income attributable to Baidu Core was RMB 27.4 billion, and non-GAAP net margin for Baidu Core was 26%. As of December 31, 2023, cash, cash equivalents, restricted cash, and short-term investments were RMB 205.4 billion, and cash, cash equivalents, restricted cash, and short-term investments, excluding iQIYI, were RMB 200 billion. Free cash flow was RMB 25.4 billion, and free cash flow, excluding iQIYI, was RMB 22.1 billion in 2023. Baidu Core had approximately 35,000 employees as of December 31, 2023. With that, we will now open the call to questions.
spk13: Thank you. We will now begin the question and answer session. To ask a question, you may press star, then 1, on your touchtone phone. If you are using a speakerphone, please pick up your handset before pressing the keys. If you would like to withdraw your question, please press star, then 2. Our first question today will come from Alicia Yap of Citi. Please go ahead.
spk05: Hi, thank you. Good evening, management. Thanks for taking my questions. My question is on the outlook. How does management think about the macroeconomic landscape in China for 2024? What is management's view on the 2024 growth outlook for Baidu as a whole? And what percentage of Baidu's total revenue will be contributed by AI-related revenue in 2024? Thank you.
spk18: Hi, Alicia. This is Robin.
spk11: Before we dive into the outlook, let's take a quick look back at last year. We had a very challenging macroeconomic environment, but our business demonstrated very solid performance. We invested very aggressively in Gen-AI, but our non-GAAP operating margin expanded year over year, and revenue experienced solid growth. More importantly, we began to generate incremental revenue from Gen-AI and foundation models. For this year, we have noticed that both the central and local governments are making efforts to grow the economy. During the eight-day Chinese New Year holiday, we saw growth in consumption, particularly in the travel sector. But we're still operating in a macro environment with a lot of uncertainty. We are closely monitoring significant economic stimulus plans, which we think are essential for achieving this year's goals. That being said, Baidu is facing a lot of opportunities. Our core business remains solid, and incremental revenue from Gen-AI and foundation models will increase to several billion RMB in 2024. This will contribute to the growth of our total revenue. More specifically, thanks to our leading position in LLMs and Gen-AI, enterprises are increasingly building models and developing apps on Baidu's cloud. For our mobile ecosystem, we have already accumulated a huge user base, and we keep innovating our products and enhancing our monetization capabilities through AI innovation. So when we combine cloud and mobile, I think we will be able to sustain our long-term growth, which we think will be faster than China's GDP growth.
spk19: Thank you.
spk13: Okay, and our next question today will come from Alex Yao of JP Morgan. Please go ahead.
spk19: Hi Alex, your line is open. Are you muted?
spk27: Yes, sorry, I was muted. Good evening, management, and thank you for taking my questions. I have a couple of questions on cost structure and margin trends. First of all, how much more room do we see in cost cutting and optimization? How should we look at AI-related investments? In the past, you discussed that there will be a lag between chip investments and AI revenue contribution. How should we look at the margin trend into 2024 if you intend to expand these investments?
spk18: Hi Alex,
spk06: Thank you so much for your questions. I think alongside the investments in our Gen-AI businesses, it's pretty clear that there is still room for cost and expense optimization in our legacy businesses. As we look into 2024, we will continue to focus on our core businesses, and we will also resolutely reduce resource allocation to non-strategic businesses. Additionally, we are continuously enhancing our overall organizational efficiency by removing layers, simplifying execution, and flattening the organizational structure. So heading into this year, we are very committed to the ongoing optimization of our operations, ensuring we have a more productive workforce. With all these measures in place, we aim to keep Baidu Core's earnings solid, with our mobile ecosystem continuing to deliver strong margins and generate very strong cash flow, while the AI Cloud business continues its profitability. We have managed to maintain solid operating profit margins despite our investments in AI. If you still remember, in 2023 we started to invest in generative AI and large language models. This investment is primarily reflected in our CAPEX and mainly related to purchasing chips and servers for model training and inference. As you know, CAPEX is depreciated over several years, so despite a 58% year-over-year increase in Baidu Core's CAPEX in 2023, our non-GAAP operating profit margin still expanded by 2 percentage points year over year. Looking forward, in the process of developing our new AI business, we have made new investments. However, these investments are not expected to significantly affect our margins or profits. During the early stage of market development, we will also not overly prioritize margins for our AI Cloud business. We believe that in the long run, this business is expected to yield much better margins. Additionally, there may be some promotional activities for the AI-native 2C products, but we will carefully manage and closely monitor ROI to balance investment and growth. We are happy to see that our efforts in investing in new initiatives have begun to yield early results. As Robin mentioned just now, the incremental revenue generated from the ad tech improvements has reached several hundred million RMB. The incremental AI Cloud revenue generated from Gen-AI and foundation models also contributed 4.8% of the total AI Cloud revenue. All of these promising achievements have strengthened our confidence in our strategy. Going forward, we will remain steadfast in our commitment to the development of Gen-AI and large language models. Thank you, Alex.
spk13: Our next question today will come from Gary Yu of Morgan Stanley. Please go ahead.
spk28: Hi, good evening, and thank you for taking my question. I have a question regarding AI contribution. For the AI-related revenue contribution, is there a way you can quantify or prove that the ad revenue Baidu will generate is a purely incremental contribution from AIGC and not cannibalizing your existing search business? And if AI is purely incremental, should we be expecting faster-than-average growth for the business excluding AI? How should we think about the core's growth rate for 2024? Thank you.
spk11: Hi, Gary. This is Robin. We are the largest search engine in China. We have close to 700 million monthly active users, and we have established a very robust brand presence among Chinese internet and mobile users. They rely on us for comprehensive and reliable information. So we have a strong and stable base of revenue and profits. But we are also very sensitive to the macro environment, because our advertising business has very broad coverage of different verticals. I mentioned earlier that there are still uncertainties in the macro environment. But Gen-AI and LLMs are unlocking new opportunities on both the monetization front and the user engagement front. I think it's easier to quantify the incremental revenue on the monetization front. I mentioned earlier that Gen-AI has already helped increase advertising eCPM. And our upgraded monetization system has allowed us to improve targeting capabilities, thereby generating and presenting more relevant ads. We earned several hundred million RMB in Q4 from this kind of initiative, and the incremental revenue will grow to several billion this year. It's harder to quantify the user engagement part. Gen-AI is helping us improve the user experience. We have seen initial outcomes in Search and Wenku, and we will continue introducing new features going forward. These initiatives will help us improve user mindshare and time spent over time and bring us even larger potential. So I think the purely incremental revenue will come from both the monetization side and the user engagement side.
spk19: Thank you.
spk13: Our next question today will come from Lincoln Kong of Goldman Sachs. Please go ahead.
spk09: Thank you, management, for taking my question. So my question is regarding the cloud business. How should we look at the incremental revenue growth driven by Gen-AI? What are the drivers here? And looking at 2024, how should we expect overall AI Cloud revenue growth this year, and what will be the margin trend this year? Thank you.
spk07: Thank you for the question. This is Dou Shen. As Robin just mentioned, the total revenue from Gen-AI and foundation model related businesses, including both internal and external revenue, already reached RMB 656 million in Q4, and this number should grow to several billion RMB for the full year 2024. We have seen increasing interest from enterprises in using Gen-AI and LLMs to develop new applications and features. To achieve this goal, they are actively building models to power their products and solutions. This is how we generate the majority of the revenue from external customers. Meanwhile, we are seeing a significant increase in model inference revenue from external customers. While the revenue from inference is still small, we believe that over the long term, it will become a significant and sustainable revenue driver. We think revenue generated by internal customers is also quite important, because a significant portion of such revenue is for model inference. Baidu is the first company to use Gen-AI and LLMs to reconstruct all of its businesses and products. With the number of products and features powered by Gen-AI and LLMs continuing to increase, Ernie API calls from internal customers have been increasing rapidly and have reached a significant magnitude. Such development has proven that Ernie and ErnieBot can well enhance productivity and efficiency in real-world applications. And we also believe a growing number of external customers will use Ernie to develop their own apps and drive our external revenue in the future. Regarding your question about our product offerings, we have the most powerful AI infrastructure for model training and inference in China. This infrastructure helps our customers build and run models cost-effectively. Additionally, as Robin mentioned earlier, our MaaS platform offers various models and a full set of tools, Model Builder and App Builder, for model building and application development. 
In addition to that, we have developed our own AI-native solutions such as GBI, that is, Generative Business Intelligence. These applications are helping enterprises increase productivity and efficiency. Beyond the incremental opportunities related to AI, Gen-AI and foundation models also bring new opportunities to our legacy cloud businesses. Firstly, we continue to win customers and projects for the CPU-centric cloud, because we are highly recognized for our AI infrastructure and CPU-centric cloud offerings, especially in the internet, tech, and education sectors. Secondly, Gen-AI and foundation models have allowed us to build AI solutions for our customers more efficiently than before, facilitating digital and intelligent transformation for traditional industries. Both of these factors are driving growth for our cloud revenue. Overall, we should see cloud revenue growth accelerate in 2024 over last year. Additionally, we are pretty confident in maintaining profitability for our AI Cloud. For the enterprise cloud, we should be able to consistently improve margins for the legacy cloud businesses. As for the Gen-AI and LLM businesses, the market is still at a very early stage of development, so we will hold a pretty dynamic pricing strategy to quickly educate the market and expand our penetration into more enterprise customers. We believe over the long term, the new business should have higher normalized margins than the traditional cloud businesses. Thank you.
spk13: Our next question today will come from Thomas Tong of Jefferies. Please go ahead.
spk02: Hi, good evening. Thanks, management, for taking my questions. May I ask about the plans for your AI to-C product development? How has the traffic growth been, and are there any key metrics which can be shared on the new generative search? How does AI benefit search traffic and time spent, and when should we see the explosion of traffic or super apps? Thank you.
spk11: Yeah, we are reconstructing all of our to-C products using generative AI. I think GenAI and foundation models are making all of our products more powerful. Over the past few months, we've made significant strides in this effort, and the initial user feedback has been very encouraging. For search, the introduction of GenAI has enabled Baidu to answer a broader range of questions, including more complex, open-ended, and comparative queries. With Ernie Bot, we can provide direct and clear answers in a more interactive way. During the past few months, instead of just presenting some content and providing links, more and more search results were generated by Ernie Bot. As a result, users are engaging with Baidu more frequently and asking new questions. For example, more and more users are coming to Baidu for content creation, be it text or images. During the Chinese New Year holiday, Baidu helped users create New Year greeting messages and generate personalized e-cards for their loved ones. This is not a typical use case for search engines, but we see a large number of users relying on Baidu for this kind of use. Going forward, we will increasingly use Ernie Bot to generate answers for search queries and then use multi-round conversations to clarify user intent, so that complicated user needs can be addressed through natural language. While these initiatives have resulted in an enhanced search experience, we are still in the early stages of using Ernie Bot to reconstruct Baidu Search. We will continue to test and iterate on GenAI-enabled features based on user feedback, and we will follow our typical playbook for testing and fine-tuning new experiences before upgrades are ready for a larger-scale rollout. Overall, we believe GenAI will complement traditional search, ultimately increasing user retention, engagement, and time spent on Baidu.
In addition to search, with Ernie Bot acting as a co-pilot, Wenku has transformed from an application for users to find templates and documents into a one-stop shop for users to create content in a wide range of formats. Year-to-date, I think about 18% of new paying users were attracted by GenAI features in Wenku. I want to emphasize that we are in the early stages of using Ernie Bot to reconstruct our apps and build new ones. At the same time, we are attracting and helping enterprises build apps on Ernie. We believe the success of Ernie depends on its wide and active adoption, whether through Baidu apps or through third-party apps.
spk19: Thank you.
spk13: Our next question today will come from James Lee of Mizuho. Please go ahead.
spk08: Great. Thanks for taking my questions. Can you maybe talk about Ernie's technology roadmap for 2024? Does it include multimodal features, maybe similar to Sora, or maybe opening up an AI store or potentially launching an AI agent? Is there any sort of milestone or key metrics you can speak to? The second part of the question is on the cost of running GenAI. How should we think about the puts and takes of managing inferencing costs going forward? Obviously, you talked about some ways to make it more efficient. Are there any additional levers to optimize this process?
spk11: Thanks. Yeah, the chips we have on hand should enable us to advance EB4 to the next level. As I mentioned earlier, we will take an application-driven approach to enhance Ernie and let our users and customers tell us where we should improve and adjust our models. It could be building multimodalities, agents, increasing reliability, and so on. It's important to note that we are focusing on using Ernie to bring real value to users and customers, rather than simply achieving high rankings in research publications. And you're right, price is a very important issue. Making high-performance foundation models affordable is critical for large-scale operations. I mentioned earlier that we have been continuously reducing model inference costs. Now the inference cost of EB3.5 is about 1% of the March 2023 version. By doing so, more and more enterprises are increasingly willing to test, develop, and iterate their applications on Ernie. We understand that many customers tend to balance efficiency with cost and speed, so we also offer smaller language models and help our customers leverage MoE, that is, mixture of experts, for the best performance. With our end-to-end approach, we believe there is still ample room to reduce the cost of our most powerful models and make them increasingly affordable to our customers, and this will further drive the adoption of Ernie. Internally, we are closely monitoring the number of apps developed on Ernie. As I mentioned previously, Ernie is now handling over 50 million queries per day. Right now, Ernie API calls from internal applications are still larger than external API calls, but calls to Ernie models in different sizes from external apps have been increasing rapidly. We are just at the beginning of this journey. Ernie will only become more powerful, smarter, and more useful as more and more end users use it, be it through Baidu apps or through third-party apps. This will enable us to cultivate an ecosystem around Ernie.
As these apps and models are actively used by end users, they will also generate significant revenue for us.
spk19: Thank you.
spk13: Our next question today will come from Charlene Lu of HSBC. Please go ahead.
spk20: Thank you. Good evening, management, and thank you again for the opportunity. I have a question related to Ernie. How does Ernie's enterprise adoption compare to its peers? Can you please share with us the latest number of enterprises that are using Ernie to build models and applications, and help us understand how that has grown versus last quarter and what the underlying drivers may be? Lastly, can you also help us understand whether we can assume that enterprises who adopt the Ernie API integration will be very unlikely to use other LLMs? Thank you so much.
spk07: Great questions, Charlene. As Robin just mentioned, about 26,000 enterprises of different sizes, spread over different verticals, called our Ernie API from our cloud in December, marking a 150% increase quarter over quarter. Ernie API calls have exceeded 50 million on a daily basis, and we believe no one else in China has gained so many customers and received such a high volume of API requests. Enterprises chose us primarily for the following reasons. Firstly, we have the most cost-efficient AI infrastructure for model training and inferencing in China, primarily because of our strong ability in end-to-end optimization. As mentioned before, GenAI and large language models are reshaping the competitive landscape of China's public cloud industry and enhancing our competitive advantage. Our strong ability in managing an extensive GPU-centric cloud with very high GPU utilization has continuously enhanced our AI infrastructure. As a result, we can help enterprises build and run their models and develop AI-native apps at low cost on our cloud. Secondly, the EB family of models has attracted many customers to our cloud. Over the past few months, we have consistently enhanced Ernie's performance, receiving positive feedback from customers. We also offer Ernie models in different sizes to better accommodate customers' needs regarding cost structures. Thirdly, we were the first company in China to launch a Model-as-a-Service offering, which is a one-stop shop for LLM and AI-native application development. Our MaaS makes it easy for enterprises to use LLMs. We are also providing a toolkit to help enterprises easily train or fine-tune their models and develop applications on our cloud. With the toolkit, customers can train purpose-built models cost-effectively by incorporating their proprietary data, and they can directly use the Ernie API to power their own applications as well.
We can also help them support different product features using different models, adopting the MoE approach in app development. As a result, enterprises can focus on identifying customers' pain points rather than expending their efforts on programming. All of these initiatives have helped us establish a first-mover advantage in GenAI and LLMs. For your last question, as more customers use our MaaS platform to develop AI-native applications aimed at attracting users, substantial user and customer insights will be generated and accumulated on our cloud. These insights will also allow us to further refine the toolkit. As our tools become increasingly customer-friendly and help enterprises effortlessly fine-tune models and create apps, they will be more inclined to stay with us. Additionally, it is worth noting that at the current stage of employing large language models, it is crucial for customers to create suitable prompts for their chosen models. Since they have to invest considerable effort in building and accumulating their best prompts for using large language models, it becomes challenging for them to switch to another model, because they would have to re-establish their prompt portfolio. As a result, with increasing adoption and active usage of our platform, customer satisfaction and switching costs will help us increase customer retention.
spk17: Thank you.
spk13: Our next question today will come from Miranda Zhang of Bank of America Securities. Please go ahead.
spk03: Good evening. Thank you for taking my question, which is about AI chips. I am wondering what the impact on AI development is after the recent further chip restrictions from the US. Is there any update on alternative chips? Given the chip constraints, how is Baidu developing its AI models and product monetization differently versus overseas peers? What can be achieved and what may become difficult? What will the company do to keep up with overseas peers in the next few years? Thank you.
spk11: In the near term, the impact is minimal for our model development, product reinvention, or monetization. As I mentioned last quarter, we already have the most powerful foundation model in China, and our AI chip reserve enables us to continue enhancing Ernie for the next one or two years. Model inference requires less powerful chips, and our reserve and the chips available on the market are sufficient for us to power many AI-native applications for end users and customers. In the long run, we may not have access to the most cutting-edge GPUs, but with the most efficient homegrown software stack, the user experience will not be compromised. There is ample room for innovation in the application layer, the model layer, and the framework layer. Our end-to-end, self-developed four-layer AI architecture, along with our strong R&D team, will support us in using less advanced chips for efficient model training and inference. This provides Baidu a unique competitive advantage over our domestic peers. For enterprises and developers, building applications on Ernie will be the best and most efficient way to embrace AI. Thank you.
spk13: Our next question today will come from Sean Wan of UBS. Please go ahead.
spk24: Good evening, management. This is Sean Wan on behalf of Canon. Thanks for taking my questions. My question is that in recent days we have seen numerous developments in text-to-video, or video generation, technology. How do you envision this technology impacting AI industry development in China, and what implications does it hold for Ernie? Could you elaborate on your strategic roadmap for Ernie moving forward? Furthermore, how does Ernie currently perform in text generation and in text-to-image and text-to-video generation tasks, and what improvements do you foresee in these areas? Thank you.
spk11: This is Robin. First of all, multimodal, or the integration of multiple modalities such as text, audio, and video, is an important direction for future foundation model development. It is a must-have for AGI. Baidu has already invested in this area and will continue to do so in the future. Secondly, if we look at the development of foundation models, the market for large language models is huge and still at a very early stage. Even the most powerful language models in the world are still not good enough for a lot of applications. There is plenty of room for innovation. Smaller-sized models, MoE, and agents are all evolving very quickly. We strive to make these offerings more accessible to all types of enterprises and to solve real-world problems in various scenarios. And thirdly, in the realm of visual foundation models, one notably significant application with vast market potential is autonomous driving, in which Baidu is a pioneer and a global leader. We have been using diffusion and transformer models to train our video generation models for self-driving purposes. We have also consistently made strides in object classification, detection, and segmentation, thereby better understanding the physical world and its rules. This has enabled us to translate images and videos captured on the road into specific tasks, resulting in more intelligent, adaptable, and safe autonomous driving technology. Overall, our strategy is to develop the most powerful foundation models to solve real-world problems, and we continue to invest in this area to ensure our leadership position.
spk19: Thank you.
spk13: Ladies and gentlemen, at this time we will conclude the question and answer session and conclude Baidu's fourth quarter and fiscal year 2023 earnings conference call. We do thank you for attending today's presentation. You may now disconnect your lines.
spk13: Hello, and thank you for standing by for Baidu's fourth quarter and fiscal year 2023 earnings conference call. At this time, all participants are in listen-only mode. After management's prepared remarks, there will be a question and answer session. Today's conference is being recorded. If you have any objections, you may disconnect at this time. I would now like to turn the meeting over to your host for today's conference, Zhou Lin, Baidu's Director of Investor Relations.
spk21: Hello, everyone, and welcome to Baidu's fourth quarter and fiscal year 2023 earnings conference call. Baidu's earnings release was distributed earlier today, and you can find a copy on our website as well as on newswire services. On the call today, we have Robin Li, our co-founder and CEO, Rong Luo, our CFO, and Dou Shen, our EVP in charge of Baidu AI Cloud Group, ACG. After our prepared remarks, we will hold a Q&A session. Please note that the discussion today will contain forward-looking statements made under the safe harbor provisions of the U.S. Private Securities Litigation Reform Act of 1995. Forward-looking statements are subject to risks and uncertainties that may cause actual results to differ materially from our current expectations. For detailed discussions of these risks and uncertainties, please refer to our latest annual report and other documents filed with the SEC and the Hong Kong Stock Exchange. Baidu does not undertake any obligation to update any forward-looking statements, except as required under applicable law. Our earnings press release and this call include discussions of certain unaudited non-GAAP financial measures. Our press release contains a reconciliation of the unaudited non-GAAP measures to the most directly comparable GAAP measures and is available on our IR website at ir.baidu.com. As a reminder, this conference is being recorded. In addition, a webcast of this conference call will be available on Baidu's IR website. I will now turn the call over to our CEO, Robin.
spk11: Hi, everyone. We concluded 2023 with a solid quarter. Total revenue for the fiscal year increased by 8% year-over-year, and the non-GAAP operating margin expanded from 22% to 24%. This improvement highlighted the resilience of our core business, which serves as a strong foundation for our ventures in Ernie and Ernie Bot. 2023 marked the year of GenAI and foundation models, a massive technology shift that will re-engineer business processes and bring about a real renaissance in every sector of the economy. Baidu is well prepared to benefit greatly from this groundbreaking economic revolution. We progressed from talking about the opportunities of GenAI and foundation models to actively applying Ernie and Ernie Bot at scale. As the frontrunner in AI, Baidu proudly became the first public company globally to launch a GPT model, with our EB 4.0 standing out as the most powerful foundation model in China. Ernie continues to gain market recognition, as evidenced by Ernie API calls from multiple well-known companies. Notably, Samsung uses the Ernie API on its Galaxy S24 5G series, Honor uses the Ernie API on its MagicOS 8.0, and AutoHome uses the Ernie API to power its multiple AIGC apps. These partnerships further solidify our leadership in GenAI. Furthermore, we continued improving the efficiency of Ernie. For example, the inference cost of EB 3.5 is only about 1% of the March 2023 version. With lower inference costs, Ernie will become increasingly accessible to users and enterprises. We are able to do that primarily because of our unique four-layer AI architecture and our strong ability in end-to-end optimization. Additionally, we are offering LLMs in smaller sizes. These small models can balance performance and efficiency, better serving various customer needs. Since Q2 2023, we have actively utilized Ernie to revolutionize our products and services, creating AI-native experiences.
We believe real applications are essential to unleashing the full business potential of Ernie and Ernie Bot. Recently, we began to generate incremental revenues from Ernie and Ernie Bot. In the fourth quarter, we earned several hundred million RMB, primarily from ad technology improvements and from helping enterprises build their own models. I'll provide a more detailed explanation in the business review section. Looking into 2024, we believe this incremental revenue will multiply to several billion RMB, primarily from the advertising and AI Cloud businesses. While we are just beginning to commercialize GenAI and foundation models, we see enormous monetization potential in this groundbreaking technology. We envision Ernie as a future foundational system, serving as the foundation for millions of AI-native applications developed by third parties and by Baidu. This paradigm will enable us to create an ecosystem around Ernie, which opens up various revenue sources. Here are the key highlights of Baidu's AI-native applications. Equipped with an AI co-pilot, Baidu Wenku has become a platform for users to create content in a wide range of formats, helping them express their ideas. In addition to summarizing, creating, and expanding content, users particularly appreciate Baidu Wenku's features for assisting them in automatically turning their inspiration into PowerPoint presentations. We have been consistently improving and enriching the AI features. For example, we recently introduced new ones, such as mind maps for organizing information. We are also reconstructing Baidu Search using Ernie. Last quarter, we talked about how we used Ernie to generate search results, improving the one-shot experience. In Q4, Baidu Search introduced newsfeed-like information alongside the generated search results at scale, to offer users useful and relevant content to explore. In Q4, we also began facilitating certain search queries with multi-round conversations to delve deeper into understanding users' intentions.
Overall, GenAI-enabled search complements traditional search, expanding the reach of Baidu Search by serving a wider range of information needs and answering content-creation-related questions. We are actively encouraging SMEs in service industries to build AI chatbots as new landing pages for Baidu Search. AI chatbots enable them to better serve potential buyers directed to them by us, driving transactions on our platform. At the same time, our AI chatbots can help merchants increase productivity. Currently, about 4,000 merchants are using our AI chatbots, particularly in healthcare, education, travel, legal, B2B, and auto, where users often interact with merchants to learn about their offerings before making purchasing decisions. We believe GenAI-enabled search will help increase user retention on Baidu, reinforcing our position as the primary entry point for users seeking information and services. We are also expanding the range of Ernie by enabling enterprises to easily develop AI-native applications on Ernie. In December, we introduced AppBuilder, a comprehensive set of tools on our public cloud to support enterprises in swiftly building AI-native applications from the ground up. These tools enable enterprises to incorporate capabilities such as organizing online messages for prioritization, creating drafts, summarizing meeting notes, and facilitating GenAI-enabled data analysis. Additionally, we have included several ready-made templates, such as AI agents, to enable enterprises to promptly create applications for validating their ideas. Such progress is in large part based on our own pioneering experience in developing AI-native applications. In addition to the tools for app development, enterprises partner with us because we have the most powerful and most efficient foundation models and the most cost-effective cloud platform for running apps.
In December, about 26,000 enterprises were actively using Ernie through the API on a monthly basis, increasing 150% quarter-over-quarter. Ernie is now handling more than 50 million queries every day, up 190% quarter-over-quarter, with a significant rise in third-party calls. As the number of AI-native applications hosted on our public cloud continues to rise, we are poised to accelerate the enhancement of our model performance by incorporating feedback from our users and customers. This will further widen the gap between Ernie and our domestic peers. Moreover, as more and more AI-native applications on Ernie become popular, Ernie will likewise be successful and support a robust ecosystem around it, steering our multi-year growth forward. In conclusion, we find ourselves amidst a tremendous opportunity in GenAI and foundation models. With Ernie and Ernie Bot, we have started generating incremental revenues. In 2024, we expect the AI revenue contribution to become more meaningful, while our core business will remain resilient. Moreover, we are preparing to surf the next wave in the development of Ernie. In the future, we believe there will be millions of AI-native applications on Ernie, with a vast majority of them developed by our cloud customers. The expanding Ernie ecosystem will then unlock numerous monetization potentials for Baidu. Now, let me recap the key highlights for each business for the fourth quarter of 2023. Our mobile ecosystem exhibited solid performance across revenue, margin, and cash flow. In the fourth quarter, Baidu's online marketing revenue increased by 6% year over year, driven by verticals in travel, healthcare, financial services, and others. In Q4, we generated several hundred million RMB of incremental ad revenue due to improvements in ad tech. We continued to use Ernie to enhance our monetization system, including targeting capabilities and bidding systems, in the quarter.
Starting from January, our monetization system has been able to generate real-time, text-based ads for search queries, demonstrating significant technology improvements in both targeting and ad creation. As of today, our new marketing platform has attracted about 10,000 advertisers. An advertiser in the medical aesthetics industry began to adopt our new marketing platform in Q4. With the AI co-pilot, the advertiser can articulate its requirements comprehensively through natural language, with multi-round conversations for further clarification and explanation. The conversational experience has assisted it in building optimized search and feed campaigns by generating relevant ad content. Our platform has also helped it reach highly targeted audiences while dynamically allocating budget to drive conversions. The advertiser achieved a 22% increase in conversion rate and a 5% reduction in the cost of sales lead acquisition after using the enhanced platform and capabilities. Currently, only a small portion of our entire advertiser base has adopted this new marketing platform, which means there is a significant opportunity for future growth. All these initiatives have helped advertisers improve their conversion, leading to an increase in their ad budgets on our platform in the quarter. We expect continued growth in ad revenue from GenAI and foundation models in 2024 and beyond. While we enhance our ad technologies, we also introduced AI Chatbot for brands, an innovative ad product built on top of Ernie Bot, in Q4. AI Chatbot has further enriched our ad portfolio. It can enhance user engagement and customer service while also capturing customer attention and driving customer demand. Since last October, China Feihe, a local infant formula company, has adopted AI Chatbot to promote its brand. The AI Chatbot has enabled Feihe to have multiple interactions with its customers, increase brand recognition, and gain valuable insights into its potential consumers.
This profound insight has also helped Feihe understand customers' views of its brand and products, leading to an enhanced marketing strategy. AI Chatbot for brands has garnered significant interest from brand advertisers, and we expect more and more advertisers to embrace the AI Chatbot in the future. As I previously mentioned, we are using Ernie to rebuild our apps. With the continuous enhancement of our AI-native product offerings, we believe more and more users will come to Baidu's platform. This in turn will assist us in gaining more market share in both user engagement and online advertising revenues. Looking into 2024, the mobile ecosystem should continue to generate steady profits and cash flow. AI Cloud revenue grew by 11% year over year to RMB 5.7 billion, and we continued to improve profitability in the fourth quarter. Revenue from GenAI and foundation models represented 4.8% of our AI Cloud revenue in Q4. The increasing demand for model building played a significant role in this accelerated revenue growth, along with increasing contributions from inference. We have seen a growing number of enterprises, in particular tech companies, turning to our public cloud to build their models. Additionally, the AI Cloud revenue generated by Baidu Core's other business groups, such as the Mobile Ecosystem Group and the Intelligent Driving Group, was about RMB 2.7 billion in Q4. Within the Q4 internal cloud revenue, GenAI and foundation models accounted for about 14%. On a combined basis, the total internal and external AI Cloud revenue was RMB 8.4 billion in Q4, with GenAI and foundation models contributing around 656 million RMB. GenAI and LLMs have become pivotal considerations for many enterprises, driving a shift in the cloud industry from general-purpose computing to AI computing. This evolution is reshaping the competitive dynamics within the cloud industry, strengthening our lead in AI and expanding our total addressable market.
Enterprises have chosen us thanks to our possession of the most powerful and cost-effective AI infrastructure for both model training and inference in China, thanks to our unique four-layer AI infrastructure that drives consistent efficiency gains, and to our years of experience in AI. Last quarter, we explained that 98% of LLM training time on our AI infrastructure was effective. Moreover, our GPU network is running at 95% utilization for training LLMs. Both of these metrics have set industry benchmarks. Within our MaaS, we offer Model Builder and App Builder, two sets of tools that help enterprises effortlessly build models and develop apps. As of now, enterprises have built about 10,000 models on our MaaS. Since its inception, App Builder has facilitated the creation of thousands of AI apps. As more and more applications are built on our MaaS, we will have greater revenue potential going forward. Looking into 2024, AI Cloud should maintain strong growth in revenue and generate profits at the non-GAAP operating level. Our intelligent driving business continues to focus on achieving UE breakeven for Apollo Go. In Wuhan, Apollo Go's largest operation, about 45% of our orders were provided by fully driverless vehicles in Q4, and this metric surpassed 50% in January. The increase came because we intensified operations during peak hours in areas with complex traffic conditions and further expanded our operating area in the past few months. This development resulted from our ongoing efforts to improve technology through safely operating Apollo Go on public roads. In China, Apollo Go provided about 839,000 rides to the public in Q4, marking a 49% year-over-year increase. In early January, the cumulative rides offered by Apollo Go exceeded 5 million. The substantial data collected from operations will further help us enhance the efficiency of safe operations.
Looking into 2024, we will remain focused on getting closer to Apollo Go's UE breakeven target and on managing our costs and expenses to reduce losses in intelligent driving. Upon reaching UE breakeven, we plan to swiftly replicate our successful operations in Wuhan in other regions. In summary, we are facing tremendous opportunities in GenAI and foundation models, and we will continue to invest in these opportunities. At the same time, we will strive to optimize the cost and expense structure of each business line to improve operational efficiency. With that, let me turn the call over to Rong to go through the financial results.
spk06: Thank you, Robin. Now let me walk through the details of our fourth quarter and full year 2023 financial results. We closed 2023 with solid financial results. Total revenues in the fourth quarter were RMB 35 billion, increasing 6% year-over-year. Total revenues for the full year 2023 were RMB 134.6 billion, increasing 9% year-over-year. Baidu Core's Q4 revenue was RMB 27.5 billion, increasing 7% year-over-year. In 2023, Baidu Core generated RMB 103.5 billion in revenue, increasing 8% year-over-year. Baidu Core's online marketing revenue increased 6% year-over-year to RMB 19.2 billion in Q4, accounting for 70% of Baidu Core's total revenue. Baidu Core's online marketing revenue was up 8% year-over-year in 2023. Baidu Core's non-online marketing revenue was RMB 8.3 billion, up 9% year-over-year. For the full year 2023, the non-online marketing business increased 9% year-over-year. The increase in the non-online marketing business was mainly driven by the AI Cloud business. Revenue from iQIYI was RMB 7.7 billion in Q4, increasing 2% year-over-year. Revenue from iQIYI was RMB 31.9 billion in 2023, increasing 10% year-over-year. Cost of revenues was RMB 17.4 billion in Q4, increasing 3% year-over-year, primarily due to an increase in costs related to the AI Cloud business, partially offset by a decrease in content costs. Cost of revenues was RMB 65 billion in 2023, increasing 2% year-over-year, primarily due to an increase in traffic acquisition costs, partially offset by a decrease in content costs and in costs related to the AI Cloud business. Operating expenses were RMB 12.1 billion in Q4, increasing 5% year-over-year, primarily due to an increase in server depreciation expenses and server custody fees, which support GenAI research and development, and an increase in channel spending and promotional marketing expenses.
Operating expenses were RMB 47.7 billion in 2023, increasing 9% year over year, primarily due to an increase in channel spending and promotional marketing expenses, and an increase in server depreciation expenses and server custody fees, which support GenAI research and development inputs. Operating income was RMB 5.4 billion in Q4; Baidu Core operating income was RMB 4.7 billion, and Baidu Core operating margin was 17% in Q4. Operating income was RMB 21.9 billion in 2023; Baidu Core operating income was RMB 18.8 billion, and Baidu Core operating margin was 18% in 2023. Non-GAAP operating income was RMB 7.1 billion in Q4; non-GAAP Baidu Core operating income was RMB 6.2 billion, and non-GAAP Baidu Core operating margin was 23% in Q4. Non-GAAP operating income was RMB 28.4 billion in 2023; non-GAAP Baidu Core operating income was RMB 24.7 billion, and non-GAAP Baidu Core operating margin was 24% in 2023. In Q4, total other loss, net, was RMB 2.5 billion, compared to total other income, net, of RMB 1.8 billion for the same period last year, mainly due to a pickup of losses from equity method investments as a result of the modification of the conversion terms of the underlying preferred shares. Income tax expense was RMB 96 million in Q4. In 2023, total other income, net, was RMB 3.3 billion, compared to total other loss, net, of RMB 5.8 billion last year, mainly due to a fair value gain of RMB 198 million from long-term investments this year, compared to a fair value loss of RMB 3.9 billion last year, and a decrease of RMB 2.2 billion in impairment of long-term investments. Income tax expense was RMB 3.6 billion. In Q4, net income attributable to Baidu was RMB 2.6 billion, and diluted earnings per ADS were RMB 6.77. Net income attributable to Baidu Core was RMB 2.4 billion, and net margin for Baidu Core was 9%. Non-GAAP net income attributable to Baidu was RMB 7.8 billion. Non-GAAP diluted earnings per ADS were RMB 21.86. 
Non-GAAP net income attributable to Baidu Core was RMB 7.5 billion, and non-GAAP net margin for Baidu Core was 27%. In 2023, net income attributable to Baidu was RMB 20.3 billion, and diluted earnings per ADS were RMB 55.08. Net income attributable to Baidu Core was RMB 19.4 billion, and net margin for Baidu Core was 19%. Non-GAAP net income attributable to Baidu was RMB 28.7 billion. Non-GAAP diluted earnings per ADS were RMB 80.85. Non-GAAP net income attributable to Baidu Core was RMB 27.4 billion, and non-GAAP net margin for Baidu Core was 26%. As of December 31, 2023, cash, cash equivalents, restricted cash, and short-term investments were RMB 205.4 billion, and cash, cash equivalents, restricted cash, and short-term investments excluding iQIYI were RMB 200 billion. Free cash flow was RMB 25.4 billion, and free cash flow excluding iQIYI was RMB 22.1 billion in 2023. Baidu Core had approximately 35,000 employees as of December 31, 2023. With that, let's now open the call to questions.
spk13: Thank you. We will now begin the question and answer session. To ask a question, you may press star, then one, on your touchtone phone. If you are using a speakerphone, please pick up your handset before pressing the keys. If you would like to withdraw your question, please press star, then two. Our first question today will come from Alicia Yap of Citi. Please go ahead.
spk05: Hi, thank you. Good evening, management. Thanks for taking my questions. My question is on the outlook. How does management think about the macroeconomic landscape in China for 2024? What is management's view on the 2024 growth outlook for Baidu as a whole? And also, what percentage of Baidu's total revenue will be contributed by AI-related revenue in 2024? Thank you.
spk18: Hi, Alicia. This is Robin.
spk11: Before we dive into the outlook, let's take a quick look back at last year. We had a very challenging macroeconomic environment, but our business demonstrated very solid performance. We invested very aggressively in GenAI, but our non-GAAP operating margin expanded year over year, and revenue experienced solid growth. More importantly, we began to generate incremental revenue from GenAI and foundation models. For this year, we have noticed that both the central and local governments are putting in efforts to grow the economy. During the eight-day Chinese New Year holiday, we saw growth in consumption, particularly in the travel sector. But we are still operating in a macro environment with a lot of uncertainty, and we are closely monitoring significant economic stimulus plans, which we think are essential for achieving this year's goals. That being said, Baidu is facing a lot of opportunities. Our core business remains solid, and incremental revenue from GenAI and foundation models will increase to several billion RMB in 2024. This will contribute to the growth of our total revenue. More specifically, thanks to our leading position in LLMs and GenAI, enterprises are increasingly building models and developing apps on Baidu AI Cloud. And for our mobile ecosystem, we have already accumulated a huge user base, and we keep innovating our products and enhancing our monetization capabilities through AI innovation. So when we combine cloud and mobile, I think we will be able to sustain long-term growth that we expect to be faster than China's GDP growth.
spk19: Thank you.
spk13: Thank you. Okay, and our next question today will come from Alex Xiao of JP Morgan. Please go ahead.
spk19: Hi Alex, your line is open. Are you muted?
spk27: Yeah, sorry, I was muted. Good evening, management, and thank you for taking my questions. I have a couple of questions on cost structure and margin trends. First of all, how much more room do we see for cost cutting and optimization? How should we look at AI-related investments? In the past, you discussed that there would be a lag between chip investments and AI revenue contribution. How should we look at the margin trend into 2024 if you intend to expand investments?
spk06: Hi Alex, thank you so much for your questions. This is Rong. I think alongside the investments in our GenAI businesses, it is pretty clear that there is still room to manage the costs and expenses in our legacy businesses. As we look into 2024, we will continue to focus on our core businesses, and we will also be willing to reduce resource allocation to non-strategic businesses. Additionally, we are continuously enhancing our overall organizational efficiency by removing layers, simplifying execution, and flattening the organizational structure. So heading into this year, we are very committed to the ongoing optimization of our operations, ensuring we have a more productive workforce. With all these measures in place, we aim to keep Baidu Core's earnings solid, with our mobile ecosystem continuing to deliver strong margins and generate very strong cash flows, while the AI Cloud business continues its profitability. We have managed to maintain solid operating profit margins despite our investments in AI. If you still remember, in 2023 we started to invest in generative AI and large language models. This investment is primarily reflected in our CAPEX, mainly related to purchasing chips and servers for model training and inference. As you know, CAPEX is depreciated over several years, so despite a 58% year-over-year increase in Baidu Core's CAPEX in 2023, our non-GAAP operating profit margin still grew by 2 percentage points on a year-over-year basis. Looking forward, in the process of developing our new AI businesses, new investments are inevitable. However, these investments are not expected to significantly affect our profit margins. During the early stage of market development, we will also not overly prioritize margins for our AI Cloud business. We believe that in the long run, this business is expected to yield much better margins. 
Additionally, there may be some promotional activities for the AI-native 2C products, but we will carefully manage and closely monitor our ROI to balance investment and growth. We are happy to see that our efforts in investing in new initiatives have begun to yield early results. As Robin mentioned just now, the incremental revenue generated from the ad tech improvements reached several hundred million RMB in Q4. The incremental AI Cloud revenue generated from GenAI and foundation models also contributed 4.8% of total AI Cloud revenue. All of these promising achievements have strengthened our confidence in our strategy. So going forward, we will remain steadfast in our commitment to the development of GenAI and large language models.
spk18: Thank you, Alex.
spk13: Our next question today will come from Gary Yu of Morgan Stanley. Please go ahead.
spk28: Hi, good evening, and thank you for taking my question. I have a question regarding AI contribution. For the AI-related revenue contribution, is there a way you can quantify or prove that the ad revenue Baidu will generate is purely an incremental contribution from AIGC and not cannibalizing your existing search business? And if AI is purely incremental, should we be expecting faster-than-average growth? And excluding AI, how should we think about the core search growth rate for 2024? Thank you.
spk11: Hi, Gary. This is Robin. We are the largest search engine in China. We have close to 700 million monthly active users, and we have established a very robust brand presence among Chinese internet and mobile users. They rely on us for comprehensive and reliable information, so we have a strong and stable base of revenue and profits. But we are also very sensitive to the macro environment, because our advertising business has very broad coverage across different verticals. I mentioned earlier that there are still uncertainties in macro development. But GenAI and LLMs are unlocking new opportunities on both the monetization front and the user engagement front. I think it is easier to quantify the incremental revenue on the monetization front. I mentioned earlier that GenAI has already helped increase advertising eCPM, and our upgraded monetization system has allowed us to improve targeting capabilities, thereby generating and presenting more relevant ads. We earned several hundred million RMB in Q4 from these kinds of initiatives, and the incremental revenue will grow to several billion this year. It is harder to quantify the user engagement part. GenAI is helping us improve the user experience. We have seen initial outcomes in search, and we will continue introducing new features going forward. These initiatives will help us improve user mindshare and time spent over time and bring us even larger potential. So I think the purely incremental revenue will come from both the monetization side and the user engagement side.
spk19: Thank you.
spk13: Our next question today will come from Lincoln Kong of Goldman Sachs. Please go ahead.
spk09: Hi, thank you, management, for taking my question. My question is regarding the cloud business. How should we look at the incremental revenue growth driven by GenAI? What is the revenue mix for GenAI cloud, and which offerings are the primary growth drivers here? And when we look at 2024, how should we expect overall AI Cloud revenue to grow this year, and what will be the margin trend this year? Thank you.
spk07: Thank you for the question. This is Dou. As Robin just mentioned, the total revenue from GenAI and foundation model related businesses, including both internal and external revenue, already reached RMB 656 million in Q4, and this number should grow to several billion RMB for the full year 2024. We have seen increasing interest from enterprises in using GenAI and LLMs to develop new applications and features. To achieve this goal, they are actively building models to power their products and solutions. This is how we generate the majority of our revenue from external customers. In the meanwhile, we are seeing a significant increase in model inference revenue from external customers. While the revenue from inference is still small, we believe that over the long term it will become a significant and sustainable revenue driver. We think revenue generated by internal customers is also quite important, because a significant portion of such revenue is for model inference. Baidu is the first company to use GenAI and LLMs to reconstruct all of its businesses and products. With the number of products and features powered by GenAI and LLMs continuing to increase, Ernie API calls from internal customers have been increasing rapidly and have reached a significant magnitude. Such development has proven that Ernie and Ernie Bot can well enhance productivity and efficiency in real-world applications. And we also believe a growing number of external customers will use Ernie to develop their own apps and drive our external revenue in the future. Regarding your question about our product offerings, we have the most powerful AI infrastructure for model training and inference in China. This infrastructure helps our customers build and run models cost-effectively. Additionally, as Robin mentioned earlier, our MaaS offers various models and a full set of tools, namely model builder and app builder, for model building and application development. 
In addition to that, we have developed our own AI-native solutions such as GBI, that is, Generative Business Intelligence. These applications are helping enterprises increase productivity and efficiency. In addition to the incremental opportunities related to AI, GenAI and foundation models also bring new opportunities to our legacy cloud businesses. Firstly, we continue to win customers and projects for the CPU-centric cloud, because we are highly recognized for our AI infrastructure and GPU-centric cloud offerings, especially in the internet, tech, and education sectors. Secondly, GenAI and foundation models have allowed us to build AI solutions for our customers more efficiently than before, facilitating digital and intelligent transformation for traditional industries. Both of these factors are driving growth for our cloud revenue, and overall we should see cloud revenue growth accelerate in 2024 over last year. Additionally, we are pretty confident in maintaining profitability for our AI Cloud. For the enterprise cloud, we should be able to consistently improve gross margins for the legacy cloud businesses. As for the GenAI and LLM businesses, the market is still at a very early stage of development, so we will hold a pretty dynamic pricing strategy to quickly educate the market and expand our penetration into more enterprise customers. We believe that over the long term, the new business should have higher normalized margins than the traditional cloud businesses. Thank you.
spk13: Our next question today will come from Thomas Tong of Jefferies. Please go ahead.
spk02: Hi, good evening. Thanks, management, for taking my questions. May I ask about the plans for your AI 2C product development? How has the traffic growth been, and are there any key metrics that can be shared on the new generative search? How does AI benefit search traffic and time spent, and when should we see an explosion of traffic for super apps? Thank you.
spk11: Yeah, we are reconstructing all of our 2C products using generative AI. I think GenAI and foundation models are making all of our products more powerful. Over the past few months, we have made significant strides in this effort, and the initial user feedback has been encouraging. For search, the introduction of GenAI has enabled Baidu to answer a broader range of questions, including more complex, open-ended, and comparative queries. With Ernie Bot, we can provide direct and clear answers in a more interactive way. During the past few months, instead of just presenting some content and providing some links, more and more search results were generated by Ernie Bot. As a result, users are engaging with Baidu more frequently and asking new questions. For example, more and more users are coming to Baidu for content creation, be it text or images. During the Chinese New Year holiday, Baidu helped users create New Year greeting messages and generate personalized e-cards for their loved ones. This is not a typical use case for search engines, but we see a large number of users relying on Baidu for this kind of use. Going forward, we will increasingly use Ernie Bot to generate answers for search queries and then use multi-round conversations to clarify user intent, so that complicated user needs can be addressed through natural language. While these initiatives have resulted in an enhanced search experience, we are still in the early stages of using Ernie Bot to reconstruct Baidu search. We will continue to test and iterate on GenAI-enabled features based on user feedback, and we will follow our typical playbook of testing and fine-tuning new experiences before upgrades are ready for a larger-scale rollout. Overall, we believe GenAI will complement traditional search, ultimately increasing user retention, engagement, and time spent on Baidu. 
In addition to search, with Ernie Bot acting as a co-pilot, WenKu has transformed from an application for users to find templates and documents into a one-stop shop for users to create content in a wide range of formats. Year-to-date, I think about 18% of WenKu's new paying users were attracted by its GenAI features. I want to emphasize that we are in the early stages of using Ernie Bot to reconstruct our apps and build new ones. At the same time, we are attracting and helping enterprises build apps on Ernie. We believe the success of Ernie depends on its wide and active adoption, whether through Baidu apps or through third-party apps.
spk19: Thank you.
spk13: Our next question today will come from James Lee of Mizuho. Please go ahead.
spk08: Great. Thanks for taking my questions. Can you maybe talk about Ernie's technology roadmap for 2024? Does it include multi-modal features, maybe similar to Sora, or maybe opening up an AI store, or potentially launching an AI agent? Are there any milestones or key metrics you can speak to? The second part of the question is on the cost of running GenAI: how should we think about the puts and takes of managing inference costs going forward? Obviously, you talked about some ways to make it more efficient. Are there any additional levers to optimize this process? Thanks.
spk11: Yes. The chips we have on hand should enable us to advance EB4 to the next level. As I mentioned earlier, we will take an application-driven approach to enhance Ernie and let our users and customers tell us where we should improve and adjust our models. It could be building multi-modalities, agents, increasing reliability, and so on. It's important to note that we are focusing on using Ernie to bring real value to users and customers, rather than simply achieving high rankings in research publications. You're right, price is a very important issue. Making high-performance foundation models affordable is critical for large-scale adoption. I mentioned earlier that we have been continuously reducing model inference costs. Now the inference cost of EB3.5 is about 1% of that of the March 2023 version. By doing so, more and more enterprises are increasingly willing to test, develop, and iterate their applications on Ernie. We understand that many customers tend to balance efficiency with cost and speed, so we also offer smaller language models and help our customers leverage MoE, that is, mixture of experts, for the best performance. With our end-to-end approach, we believe there is still ample room to reduce the cost of our most powerful models and make them increasingly affordable to our customers. This will further drive the adoption of Ernie. Internally, we are closely monitoring the number of apps developed on Ernie. As I mentioned previously, Ernie is now handling over 50 million queries per day. Right now, Ernie API calls from internal applications are still larger than external ones, but calls to Ernie models of different sizes from external apps have been increasing rapidly. We are just at the beginning of this journey. Ernie will only become more powerful, smarter, and more useful as more and more end users use it, be it through Baidu apps or through third-party apps. This will enable us to cultivate an ecosystem around Ernie. 
As these apps and models are actively used by end users, they will also generate significant inference revenue for us.
spk19: Thank you.
spk13: Our next question today will come from Charlene Lu of HSBC. Please go ahead.
spk20: Thank you. Good evening, management, and thank you again for the opportunity. I have a question related to Ernie. How does Ernie's enterprise adoption compare to its peers'? Can you please share with us the latest number of enterprises that are using Ernie to build models and applications, and help us understand how that has grown versus last quarter and what the underlying drivers may be? Lastly, can you help us also understand whether we can assume that enterprises who adopt the Ernie API integration will be very unlikely to use other LLMs? Thank you so much.
spk07: Great questions, Charlene. As Robin just mentioned, about 26,000 enterprises of different sizes, spread over different verticals, called our Ernie API from our cloud in December, marking a 150% increase quarter over quarter. Ernie API calls have exceeded 50 million on a daily basis, and we believe no one else in China has gained so many customers and received such a high volume of API requests. Enterprises chose us primarily for the following reasons. Firstly, we have the most cost-efficient AI infrastructure for model training and inference in China, primarily because of our strong ability in end-to-end optimization. As mentioned before, GenAI and large language models are reshaping the competitive landscape of China's public cloud industry and enhancing our competitive advantage. Our strong ability in managing an extensive GPU-centric cloud with very high GPU utilization has continuously enhanced our AI infrastructure. As a result, we can help enterprises build and run their models and develop AI-native apps at low cost on our cloud. Secondly, the EB family of models has attracted many customers to our cloud. Over the past few months, we have consistently enhanced Ernie's performance, receiving positive feedback from customers. We also offer Ernie models in different sizes to better accommodate customers' needs regarding cost structures. Thirdly, we were the first company in China to launch a model-as-a-service offering, which is a one-stop shop for LLM and AI-native application development. Our MaaS makes it easy for enterprises to use LLMs. We are also providing a toolkit to help enterprises easily train or fine-tune their models and develop applications on our cloud. With the toolkit, customers can train purpose-built models cost-effectively by incorporating their proprietary data, and they can directly use Ernie's API to power their own applications as well. We can also help them support different product features using different models, adopting the MoE approach in app development. As a result, enterprises can focus on identifying customers' pain points rather than expending their efforts on programming. All of these initiatives have helped us establish a first-mover advantage in GenAI and LLMs. For your last question, as more customers use our MaaS platform to develop AI-native applications aimed at attracting users, substantial user and customer insights will be generated and accumulated on our cloud. These insights will also allow us to refine the toolkit. As our tools become increasingly user-friendly and help enterprises effortlessly fine-tune models and create apps, they will be more inclined to stay with us. Additionally, it is worth noting that at the current stage of employing large language models, it is crucial for customers to create suitable prompts for their chosen models. 
Since they have to invest considerable effort in building and accumulating their best prompts for a given large language model, it becomes challenging for them to switch to another model, because they would have to re-establish their prompt portfolio. As a result, with increasing adoption and active usage of our platform, customer satisfaction and switching costs will help us increase customer retention.
spk17: Thank you. Next question, please.
spk13: Our next question today will come from Miranda Zhang of Bank of America Securities. Please go ahead.
spk03: Good evening. Thank you for taking my question, which is about AI chips. I'm wondering what the impact on your AI development is after the recent further chip restrictions from the US. Is there any update on alternative chips? Given the chip constraints, how is Baidu developing its AI models, products, and monetization differently versus overseas peers? What can be achieved, and what may become difficult? And what will the company do to keep up with overseas peers in the next few years? Thank you.
spk11: In the near term, the impact is minimal for our model development, product reinvention, or monetization. As I mentioned last quarter, we already have the most powerful foundation model in China, and our AI chip reserve enables us to continue enhancing Ernie for the next one or two years. And model inference requires less powerful chips. Our reserve and the chips available on the market are sufficient for us to power many AI-native applications for end users and customers. In the long run, we may not have access to the most cutting-edge GPUs, but with the most efficient homegrown software stack, the user experience will not be compromised. There is ample room for innovation in the application layer, the model layer, and the framework layer. Our end-to-end, self-developed four-layer AI architecture, along with our strong R&D team, will support us in using less advanced chips for efficient model training and inference. This provides Baidu with a unique competitive advantage over our domestic peers. And for enterprises and developers, building applications on Ernie will be the best and most efficient way to embrace AI. Thank you.
spk13: Our next question today will come from Sean Wan of UBS. Please go ahead.
spk24: Good evening, management. This is Sean Wan on behalf of Canon. Thanks for taking my question. My question is: in recent days we have seen numerous developments in text-to-video, or video generation, technology. How do you envision this technology impacting broader AI industry development in China, and what implications does it hold for Ernie? Could you elaborate on your strategic roadmap for Ernie moving forward? Furthermore, how does Ernie currently perform in text generation and in text-to-image and text-to-video generation tasks, and what improvements do you foresee in these areas? Thank you.
spk11: This is Robin. First of all, multi-modal, or the integration of multiple modalities such as text, audio, and video, is an important direction for future foundation model development. It is a must-have for AGI. Baidu has already invested in this area and will continue to do so in the future. Secondly, if we look at the development of foundation models, the market for large language models is huge and still at a very early stage. Even the most powerful language models in the world are still not good enough for a lot of applications. There is plenty of room for innovation. Smaller-sized models, MoE, and agents are all evolving very quickly. We strive to make these offerings more accessible to all types of enterprises and to solve real-world problems in various scenarios. And thirdly, in the realm of visual foundation models, one notably significant application with vast market potential is autonomous driving, in which Baidu is a pioneer and global leader. We have been using diffusion and transformer techniques to train our video generation models for self-driving purposes. We have also consistently made strides in object classification, detection, and segmentation, thereby better understanding the physical world and the rules of the physical world. This has enabled us to translate images and videos captured on the road into specific tasks, resulting in more intelligent, adaptable, and safe autonomous driving technology.
spk11: Overall, our strategy is to devise the most powerful foundation models to solve real-world problems, and we will continue to invest in this area to ensure our leadership position.
spk19: Thank you.
spk13: Thank you. And ladies and gentlemen, at this time we will conclude the question and answer session, and conclude BIDU's fourth quarter and fiscal year 2023 earnings conference call. We do thank you for attending today's presentation. You may now disconnect your lines.
Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
