2/11/2025

speaker
Shane Xie
Investor Relations

Welcome to the Confluent Q4 and Fiscal Year 2024 Earnings Conference Call. I'm Shane Xie from Investor Relations, and I'm joined by Jay Kreps, co-founder and CEO, and Rohan Sivaram, CFO. During today's call, management will make forward-looking statements regarding our business, operations, sales strategy, market and product positioning, strategic partnerships, financial performance, and future prospects, including statements regarding our financial guidance for the fiscal first quarter of 2025 and fiscal year 2025. These forward-looking statements are subject to risks and uncertainties which could cause actual results to differ materially from those anticipated by these statements. Further information on risk factors that could cause actual results to differ is included in our most recent Form 10-Q filed with the SEC. We assume no obligation to update these statements after today's call except as required by law. Unless stated otherwise, certain financial measures used on today's call are expressed on a non-GAAP basis, and all comparisons are made on a year-over-year basis. We use these non-GAAP financial measures internally to facilitate analysis of financial and business trends and for internal planning and forecasting purposes. These non-GAAP financial measures have limitations and should not be considered in isolation from or as a substitute for financial information prepared in accordance with GAAP. A reconciliation between these GAAP and non-GAAP financial measures is included in our earnings press release and supplemental financials, which can be found on our IR website at investors.confluent.io. References to profitability on today's call refer to non-GAAP operating margin unless stated otherwise. With that, I'll hand it over to Jay.

speaker
Jay Kreps
Co-Founder and CEO

Thanks, Shane. Good afternoon, everyone, and welcome to our fourth quarter earnings call. I'm happy to report we exceeded all guided metrics. Subscription revenue grew 24% to $251 million, Confluent Cloud revenue grew 38% to $138 million, and non-GAAP operating margin was 5%, our third consecutive positive quarter. These results highlight our customers' need for a complete data streaming platform, our world-class innovation engine, and our successful transformation to a consumption-driven go-to-market model. I'm also thrilled to announce a major expansion of our strategic partnership with Databricks. This collaboration brings together Confluent's complete data streaming platform and Databricks' data intelligence platform to empower enterprises with real-time data for AI-driven decision-making. The bidirectional integration of Confluent's TableFlow with Delta Lake and Databricks Unity Catalog will provide consistent real-time data across operational and analytical systems that is discoverable, secure, and trustworthy. Built on open standards, these integrations ensure real-time interoperability and flexibility across diverse ecosystems, seamlessly working with the tools and data infrastructure that companies already use. Operational data from Confluent becomes a first-class citizen in Databricks, and the insights created within Databricks can be driven back into any processor in the enterprise via TableFlow and Confluent. This is an incredibly exciting prospect for companies worldwide. This partnership extends TableFlow's reach across the analytics ecosystem, positioning Confluent as the leading vendor for delivering data across the enterprise. This is crucial for scaling AI innovation by giving application developers, data engineers, data analysts, and data scientists a single real-time source of truth to power advanced analytics and next-gen AI-driven applications. Additionally, our partnership with Databricks includes comprehensive go-to-market efforts encompassing field and partner enablement, solution architectures, co-marketing, and co-selling initiatives. Together, we will enable businesses to harness the power of real-time data to build sophisticated AI-driven applications for their most critical use cases. TableFlow continues to be one of the most exciting and powerful additions to our data streaming platform. TableFlow exposes Confluent's data streams as continuously updating tables in cloud object storage using open standards like Apache Iceberg and now Delta Lake. This allows Confluent to act as the bridge between the operational systems that run the business and the analytical platforms like Athena, BigQuery, Databricks, and Snowflake that extract valuable insights. Historically, operational and analytical systems operated in separate silos, managed by different teams and composed of different technologies and workflows. To compensate, teams built point-to-point, batch-oriented pipelines to move operational data to the analytical estate, resulting in insights based on outdated and incomplete data. In the era of real-time data and AI, stale data simply isn't enough for modern businesses. Data streaming has emerged as a major data platform, putting Confluent in a unique position to replace the brittle connections between operations and analytics with something that is much more robust and much more scalable. In fact, our mission and business model are built on creating continuous, trustworthy, and discoverable streams of data that can be consumed anywhere in the organization.
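To make the pattern described above more concrete, here is a minimal, hypothetical sketch of the operational side of that bridge: an application publishes events to a Kafka topic with the open-source confluent-kafka Python client, and TableFlow, as described on the call, would keep an Iceberg or Delta Lake representation of that topic up to date for analytical engines. The topic name, event fields, and credentials are illustrative assumptions, not details from the call.

```python
# Illustrative sketch only: produce operational events to a Kafka topic on
# Confluent Cloud using the open-source confluent-kafka Python client. Per the
# remarks above, TableFlow would then expose this same topic as a continuously
# updating Apache Iceberg or Delta Lake table in object storage for engines like
# Athena, BigQuery, Databricks, or Snowflake. Topic name, event shape, and
# credentials below are hypothetical placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<confluent-cloud-bootstrap>",  # placeholder endpoint
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

# One operational event from the "run the business" side of the house.
event = {"ride_id": "r-1042", "pickup": "SFO", "dropoff": "Downtown", "price_usd": 38.5}

producer.produce("ride_bookings", key=event["ride_id"], value=json.dumps(event))
producer.flush()  # block until the event is delivered to the cluster
```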
TableFlow further realizes our mission by making real-time, consistent, and secure data available as structured tables in cloud object storage using an open table format. With TableFlow, data is defined and stored once but accessible across multiple processors, breaking down silos and enabling faster, more accurate insights. TableFlow was met with excitement across our customer base and is already delivering real benefits to early customers. Take a U.S. digital native customer that simplifies bookings and operations for ground transportation companies across three continents as an example. Their customers require real-time data for a seamless online booking experience with accurate pricing and precise arrival and drop-off times. Previously, they did this by loading data into Snowflake with batch jobs for analytics. However, this caused delays and increased data processing costs. So the digital native customer implemented TableFlow as part of our early access program to bridge their operational data with their analytical systems using Apache Iceberg. With TableFlow, Confluent Cloud automatically performs the required data pre-processing and ingests it into Snowflake as high-quality data. The customer can now derive business insights directly from the data faster and at a much lower cost, significantly speeding up analysis and decision-making for its transportation customers. TableFlow is also a great example of what we discussed last quarter: our third wave of growth will come from being a complete data streaming platform. Over the last year, we continued to see strong DSP momentum in our cloud. Connect, process, and govern accounted for approximately 13% of our cloud business, with consumption growing substantially faster than overall cloud. No other vendor is as intensely focused as we are on building and delivering a complete data streaming platform that connects streams, processes, and governs data continuously in motion. This puts Confluent in a highly advantageous position because over time, as more parts of our platform are used, a virtuous cycle of adoption drives stickiness and more growth. Above all else, we believe a complete data streaming platform solves our customers' hardest problems, reduces complexity, and provides substantial ROI. Let me walk through a few customer examples. Zazzle is a global online marketplace and platform for creating and customizing unique designs. The company is built on a foundation of technological innovation that connects customers, creators, and makers, powering the creation of almost anything. As a leading global marketplace, Zazzle needs to efficiently process massive amounts of clickstream data to deliver personalized experiences across its unparalleled range of products and designs. With hundreds of thousands of independent creators making a living through the Zazzle platform and customers in every country worldwide, maintaining platform performance while eliminating redundant data became critical. So Zazzle implemented Confluent's fully managed Flink offering to transform their largest data pipeline. By shifting stream processing earlier in the pipeline, before writing to Google BigQuery, Zazzle reduced storage and computation costs while delivering more relevant product recommendations, directly impacting revenue. But this is just the start. With this foundation in place, Zazzle plans to expand their Flink usage to additional data streams to further enhance real-time personalization capabilities.
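As a rough illustration of the kind of "shift processing left" pattern the Zazzle example describes, the sketch below filters and aggregates raw clickstream in stream processing so only compact, high-value results flow downstream toward the warehouse. It uses the open-source Apache Flink Table API for Python (PyFlink) with its Kafka SQL connectors rather than Confluent's managed Flink service; the topic names, schema, and broker address are assumptions for illustration.

```python
# Hypothetical PyFlink sketch: keep only product-view events and count them per
# product before anything is written to the analytics warehouse. Assumes the
# Flink Kafka SQL connector jars are available on the classpath; topic names,
# fields, and the broker address are illustrative placeholders.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Raw clickstream events read from a Kafka topic.
t_env.execute_sql("""
    CREATE TABLE clickstream (
        user_id STRING,
        product_id STRING,
        event_type STRING,
        ts TIMESTAMP(3)
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'clickstream',
        'properties.bootstrap.servers' = '<broker>',
        'properties.group.id' = 'clickstream-agg',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# Compacted per-product view counts written back to Kafka as an upsert stream;
# this small, pre-aggregated result is what a warehouse sink would consume.
t_env.execute_sql("""
    CREATE TABLE product_view_counts (
        product_id STRING,
        views BIGINT,
        PRIMARY KEY (product_id) NOT ENFORCED
    ) WITH (
        'connector' = 'upsert-kafka',
        'topic' = 'product_view_counts',
        'properties.bootstrap.servers' = '<broker>',
        'key.format' = 'json',
        'value.format' = 'json'
    )
""")

# Continuously filter to view events and maintain a running count per product.
t_env.execute_sql("""
    INSERT INTO product_view_counts
    SELECT product_id, COUNT(*) AS views
    FROM clickstream
    WHERE event_type = 'view'
    GROUP BY product_id
""")
```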
A leading European grocery delivery service has also built its data architecture on Confluent's data streaming platform and leverages our Apache Flink offering to optimize real-time order management. This digital-first company manages the entire grocery delivery process, overseeing daily inventory updates for thousands of products across local warehouses. To meet their ambitious promise of delivery within 20 minutes, they require a robust real-time stream processing solution to monitor and manage orders end-to-end, from warehouse to driver to customer. Previously, their reliance on REST API calls and event-driven architectures using open-source Kafka led to inefficiencies, delays, and fragmented data sources. By transitioning to Confluent Cloud with Apache Flink, they now seamlessly process key data streams, including grocery orders and warehouse locations, into a unified order topic that is easy to manage and consume. Flink further enables them to aggregate orders, assign them to delivery drivers, and track driver speed and location to provide accurate delivery time estimates for customers. This streamlined solution has significantly enhanced their order management capabilities, resulting in reduced development costs, minimized downtime, and a faster, more seamless experience for both drivers and customers. A top-three Fortune Global 100 telecom customer shows the value of a complete DSP. As consumers make calls, send texts, and access the internet, a constant flow of data from mobile devices is sent to cell towers across the network. This data then needs to be shared with partners and subsidiaries in real time so they can optimize network coverage and analyze customer usage patterns. The telco's previous legacy streaming service was unable to deliver the scale and speed required for the mission-critical cell tower workloads. So they turned to Confluent's complete DSP for real-time visibility and improved scalability across their wireless network. With Confluent's data streaming platform, the telecom provider streams mobile data from 70,000 U.S. cell towers with connectors feeding data into lakes and warehouses. Stream processing provides real-time insights into network reliability and the performance of mobile devices and wireless plans, which is then shared with partners and subsidiaries. This allows the telecom provider to unlock new revenue streams by making faster, smarter decisions, ranging from where to invest in 5G capabilities to where to lay down fiber optic cables. Citizens Bank, one of America's oldest and largest financial institutions, transitioned from open-source Kafka to Confluent's DSP to strengthen its digital banking offerings. Today's banking customers expect real-time experiences like immediate fraud alerts and instant access to deposited funds. But with an expanding data footprint, Citizens struggled with self-managing open-source Kafka, which required 20 full-time employees just to support. So they turned to Confluent's complete data streaming platform. Using Confluent Cloud, Citizens Bank connects data from sources like consumer checking accounts, credit cards, and FICO fraud scores, generating real-time actionable insights. Stream governance ensures the bank's data is high quality and trustworthy by using schemas to reduce data inconsistencies and ensure compliance. With Confluent, Citizens Bank has reduced IT costs by 30% and saved $1.2 million annually by reducing fraud false positives by 15%.
Additionally, the bank has seen a 20% increase in customer engagement, a 40% faster loan processing time, and a 10-point increase in net promoter score from improved customer interactions. And finally, I'd like to close with an update on the innovation and momentum in our Kafka business, including wins and early WarpStream success. At Confluent, our strategy starts with capturing the vast opportunity presented by Apache Kafka, which serves as the real-time backbone for more than 150,000 organizations worldwide. Doing this requires having the right offering to meet the performance, TCO, and reliability requirements of an incredibly wide variety of use cases. In 2024, we invested in offerings to better capture the entirety of this market. One area we targeted was the high-volume, cost-sensitive workloads common in observability, analytics, security, and IoT. Now generally available, Freight clusters are a great offering for this segment. They're designed to attract cost-conscious customers with high-throughput, latency-tolerant workloads. We paired this with WarpStream, which serves a similar customer base but allows running directly in the customer's account, adding a low-friction midpoint between self-managed and SaaS. We've seen great traction with both of these offerings in helping us penetrate new accounts, particularly in the large-scale digital native segment, with offerings that address use cases that demand zero-access security and have tolerance for relaxed latency. This is playing out already. In fact, every WarpStream deal closed since the acquisition was from our digital native cohort, including companies like Elastic, the search AI company used by over half the Fortune 500, and Cursor, the AI code editor that's become one of the hottest names in AI. And the vast majority are net new customers to Confluent. We're excited for the future of WarpStream and continue to invest in its innovation, including a whole host of new features released in January. One expansion with WarpStream this quarter was with a digital native customer that helps developers access real-time data. Their large scale and high storage requirements made this a significant cost driver. Previously, they used Redpanda, but as their usage increased, so did their bill. The unsustainable costs combined with performance issues with Redpanda led to their decision to adopt WarpStream. Switching from Redpanda to WarpStream delivered immediate benefits for the customer, including nearly 10x cost savings and the ability to handle much higher data volumes on their platform. This has unlocked entirely new use cases for the customer that weren't possible before. In closing, we're pleased with our strong performance this year. The progress we made and the innovation we delivered to customers in 2024 set us up to capture more of the significant market opportunity ahead. With that, I'll turn it over to Rohan.

speaker
Rohan Sivaram
CFO

Thanks, Jay. Good afternoon, everyone, and thank you for joining our earnings call. We had a strong finish to fiscal year 2024, delivering durable growth, significant operating leverage, and positive free cash flow. In fiscal year 2024, subscription revenue grew 26% to $922.1 million, non-GAAP operating margin improved 10 percentage points to 2.9%, and free cash flow margin improved 17 percentage points to 1%, making fiscal year 2024 our first non-GAAP profitable year in the company's history. These results well exceeded our initial expectations across all guided metrics entering the year, reflecting the power of our data streaming platform and our team's excellent execution against our large market opportunity. Turning to the Q4 results, subscription revenue grew 24% to $250.6 million, exceeding the high end of our guidance and representing 96% of total revenue. Confluent Platform revenue grew 10% to $112.7 million and accounted for 45% of subscription revenue. Demand for enterprise-grade data streaming in regulated industries remains a key growth driver for Confluent Platform. Confluent Cloud revenue grew 38% to $137.9 million and accounted for 55% of subscription revenue, compared to 49% a year ago. During the quarter, we saw stable consumption and continued use case expansion across our large customer base, driving robust growth in our core streaming business. Additionally, we saw continued adoption of new components of our data streaming platform. DSP cloud consumption, which includes connect, process, and govern, grew substantially faster than overall cloud and accounted for approximately 13% of our cloud business. Turning to the geographical mix of total revenue, revenue from the U.S. grew 20% to $153.7 million. Revenue growth from outside the U.S. accelerated to 26% and was $107.5 million. Earlier today, we announced a multi-year strategic partnership with Jio Platforms Limited, an Indian multinational technology company and a subsidiary of Reliance Industries Limited. By making Confluent Cloud available on Jio Cloud services and Confluent Platform available as a managed service, the partnership is expected to accelerate India's development of gen AI and next-gen applications, delivering the power of real-time data to more businesses in the country. Moving on to the rest of the income statement, I'll be referring to non-GAAP results unless stated otherwise. While continuing to drive top-line growth at scale, we once again demonstrated significant operating leverage in our model. In Q4, subscription gross margin increased 90 basis points to 82%, primarily driven by the economies of scale in Confluent Cloud. Operating margin was 5.2%, exceeding our guidance of approximately 2%, primarily driven by revenue and gross margin outperformance. Free cash flow margin expanded approximately 8 percentage points to reach a record high of 11.1%. Net income per share was $0.09, using 362.1 million diluted weighted average shares outstanding. Fully diluted share count under the treasury stock method was approximately 370.1 million. And our balance sheet remains strong: we ended the fourth quarter with $1.91 billion in cash, cash equivalents, and marketable securities. Turning now to other business metrics, Q4 win rate for new business once again saw a notable increase both year over year and sequentially.
Our win rate against the CSP offerings and smaller startups remained well above 90%. This underscores the strength of our complete data streaming platform, providing outstanding performance with unparalleled reliability and flexibility and favorable TCO and ROI for our customers. This, coupled with our consumption transformation, has driven a year of high-velocity land and expand. We ended fiscal year 2024 with approximately 5,800 customers, representing an increase of 840 customers, nearly double the total increase from the previous year. New customers in the quarter include a top five video gaming company, one of the world's largest sports media outlets, a Fortune 100 pharmaceuticals company, a global cruise operator, a leading European airline, and many more. We also drove robust expansion in our large customer base. We grew our $100K-plus ARR customer count to 1,381, an increase of 12% from a year ago. This represents approximately 24% of our total customers, a key success indicator of our expansion strategy after landing a customer. These $100K-plus ARR customers represented approximately 90% of our revenue. Our $1 million-plus ARR customers grew even faster, accelerating to 23% growth and ending the quarter at 194 customers. New $1 million-plus ARR customers come from a wide variety of industries spanning financial services, healthcare, manufacturing and logistics, retail, technology, and more. Q4 NRR was 117%, while GRR remained above 90%. Our stabilized NRR in recent quarters, coupled with continued strength in GRR, provides a solid foundation for delivering on our growth target this year. Before turning to our guidance, I would like to discuss Confluent's positioning for 2025 and beyond. 2024 was in many ways a consequential year. First, we optimized our pricing and packaging with the introduction of Enterprise clusters and Freight clusters and the acquisition of WarpStream. This has significantly increased our serviceable addressable market, as we are positioned to deliver best-in-class TCO for a broad range of use cases across self-managed, fully managed, and BYOC deployment models. Second, we extended our technology lead by expanding our DSP capabilities with more than 200 features and capabilities across stream, connect, process, and govern. With the upcoming TableFlow GA release, we will expand our growth vector by unifying the operational and analytical estates in data management. Finally, we have successfully transitioned to the next-generation go-to-market model, focusing our team on consumption-based selling. By increasing consumption of our data streaming platform, we help customers realize substantial ROI for powering their mission-critical and real-time AI workloads. As we drive ROI-based expansions throughout our customers' data streaming journey, we expect our growth and profitability profile to strengthen over time. Following a year of substantial transformation, we have established a major data platform for the enterprise, unlocking the power of data streaming for thousands of customers and operating at a $1 billion-plus revenue run rate. Given the strong foundation we set last year, we expect to begin reaping the benefits in 2025. Our objective is to continue soaking up the world's Kafka and to establish 2025 as the year of DSP.
We will support these initiatives with a resource allocation strategy focused on efficient growth, prioritizing our investments in expanding our DSP capabilities, hiring and enabling our team to sell DSP, and forming strategic partnerships and alliances. We look forward to driving durable and efficient growth in 2025 as we execute against our large and growing market opportunity. Now let's turn to our guidance. We are providing a Q1 and fiscal year 2025 subscription revenue outlook ahead of expectations, in addition to guiding fiscal year 2025 non-GAAP operating margin within our midterm target set at the time of our IPO. For the first fiscal quarter of 2025, we expect subscription revenue to be in the range of $253 million to $254 million, representing growth of approximately 22% to 23%; non-GAAP operating margin to be approximately 3%; and non-GAAP net income per diluted share to be in the range of $0.06 to $0.07. For fiscal year 2025, we expect subscription revenue to be in the range of $1.117 billion to $1.121 billion, representing growth of approximately 21% to 22%; non-GAAP operating margin to be approximately 6%; and non-GAAP net income per diluted share to be approximately $0.35. I'd also like to provide a few modeling points. For subscription revenue seasonality, at the midpoint of our guidance, we expect the first half of fiscal year 2025 to be approximately 46.5% of the full year, in line with the average first-half seasonality of the last two years. For cloud revenue, we are comfortable with the current consensus dollar estimate for fiscal year 2025, and we expect to see approximately one point of increase in cloud subscription revenue mix each quarter, with a Q4 2025 exit of approximately 59% to 60%. For free cash flow margin, we expect a one-time negative impact of approximately 15 points to Q1 2025, or approximately three to four points to fiscal year 2025, resulting from a change to the timing of cash compensation payments for most of our non-go-to-market employees. Excluding this one-time impact, we expect adjusted free cash flow margin for fiscal year 2025 to be approximately 6%. In closing, a strong finish to 2024 is a testament to our large TAM, the market leadership of our technology platform, and our world-class team. Powered by the secular tailwinds of cloud, data, and AI, we are incredibly excited to take advantage of the market opportunity ahead. Before turning to Q&A, a reminder that we will host Investor Day 2025 on March 6 in San Francisco. If you are interested in attending in person, please contact the IR team at investors@confluent.io. Now, Jay and I will take your questions.
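As a rough illustration of that seasonality point (back-of-the-envelope arithmetic from the guidance midpoint above, not figures stated on the call): the fiscal year 2025 subscription revenue midpoint is about $1.119 billion, so a 46.5% first-half weighting implies roughly $520 million of subscription revenue in the first half and roughly $599 million in the second half.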

speaker
Shane Xie
Investor Relations

Thanks, Rohan. To join the Q&A, please click on the raise hand icon. When selected for Q&A, we ask that you limit yourself to one question and one follow-up. And today, our first question will come from Matt Hedberg with RBC, followed by Wells Fargo. Matt, please go ahead.

speaker
Matt Hedberg
Analyst at RBC

All right, thanks a lot, Shane. Congrats, team, on the results; a really strong end to the year. I guess, Jay, I want to start with you. There was news a couple of weeks ago about Snowflake potentially looking at acquiring Redpanda, and today you announced a really exciting expanded relationship with Databricks. I guess I'm wondering, could you talk more at a high level about how customers think about using Confluent for streaming and processing outside of analytic engines, and then maybe where that processing makes sense within the data lake or data warehousing layer?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah, it's a great question. So there are really two key points that are driving this. One is the rise of these AI applications, which increasingly, you know, it's not just like a report that has yesterday's data or kind of a warehouse where people can do ad hoc analysis periodically. You know, this is actually operating as part of the business. And so it needs data that's kind of up to speed with what's happening in the business. And, you know, that's driving the need for real-time data, you know, in the analytics world. The second is the rise of these open data formats. So, you know, Iceberg or Delta, these are kind of a new way of laying out all the data in the analytics realm in kind of open object storage. And, you know, it may not be obvious, but this TableFlow offering that we talked about is, we think, the best way to get this. So it's a real-time feed of all the data that's available in the streaming world that's then projected out as Iceberg or Delta tables. And so, you know, when we talk about the partnership with Databricks, it's both integration into Delta, their format, and Unity, their catalog, which opens up anything in Confluent for use in the Databricks ecosystem, including those types of applications. And then, you know, going along with that are go-to-market activities to take that out to customers. And back to the question of, hey, where is this useful to customers? There's a very clear ecosystem of analytics use cases and applications. There's a very clear ecosystem of operational, kind of run-the-business applications. And then there's a whole murky middle of all the AI stuff and things that are happening all the way in between. Our goal is to unify that around streaming data and make that available through all these different form factors to the different use cases that need it. So, yeah, we see it as incredibly complementary, and we're excited to be working with the Databricks folks.

speaker
Matt Hedberg
Analyst at RBC

Got it. That's a great answer. And then maybe just one quick follow-up. Rohan mentioned that 2025 is the year of DSP, which is, I think, great to see, especially given it's increasing in the mix of cloud revenue. What gives you the confidence that those trends continue? Are you starting to see some anecdotal evidence of just increased pipeline? What are some of the tea leaves that you're reading that make you suggest that to be the case?

speaker
Jay Kreps
Co-Founder and CEO

Yeah. It's always a mixture of qualitative and quantitative. Qualitatively, you know, look, some of these things, the connectors, they have massive open-source options, so we just know that it's out there. We know that customers are excited about this; they're talking about it. They want it as part of a managed cloud offering, and that comes up in more or less every conversation we have with our customers. There's really been an evolution of what's expected in the streaming world, from just kind of the pure stream of data to a whole set of real-time data capabilities, and we think we're kind of at the forefront of delivering that. The second bit is a little more quantitative, which is, yeah, we look at the actual ramp and growth of consumption in these areas and the kind of forward-looking pipeline we think we have for customers. A lot of the changes we've made in terms of our consumption transformation make us much better at tracking these workloads, driving, kind of product area by product area, the growth of these specific items. Thanks a lot, Jay. Congrats.

speaker
Shane Xie
Investor Relations

All right. Thanks, Matt. We'll take our next question from Michael Turrin with Wells Fargo, followed by Deutsche Bank. Michael.

speaker
Michael Turrin
Analyst at Wells Fargo

Hey, thanks very much. There's a lot of useful commentary throughout the prepared remarks. Appreciate that, Jay. I wanted to spend some time with you just speaking through what you're seeing in terms of AI-related use cases for streaming and/or DSP technology. Is there commonality in terms of customer use cases you're seeing, and maybe how you'd expect that to progress over the coming year?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah, I'm happy to do that. So, you know, if we think about the first wave of use cases, there were a lot of chatbots, and the usage pattern there was, hey, you know, get the data to the AI. And so to kind of boil it down, it was basically real-time ETL: you know, capture the data, be able to transform it into the right format, be able to store it in a way that's accessible and usable for a RAG application, which is combining the stored data with an LLM, or large language model. And we've seen a number of those kind of progress through to things that are either internally facing in customers or externally facing. And so that was the first usage pattern: get the data. The second usage pattern we're seeing is this ability to apply language models kind of on the stream of data directly, right? Carry out some actual use cases. Something like, say, an insurance customer may have a claim filed. There was a set of digital work that would happen, but there was also a set of manual work that would happen to actually prep, analyze, look at different documents. Being able to orchestrate that and take it further in the digital realm is a huge cost savings. It's a big quality improvement. And I think that's very much the goal that people have in the adoption of AI: not just more chatbots, but actually being able to plug this into parts of the business, have it take over some of the work that would otherwise be done by humans or augment that work in some way. And so I think that's the second pattern that we're starting to see rise up in customers.
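To illustrate the first usage pattern described above, here is a deliberately simplified, hypothetical sketch of retrieval-augmented generation: business-specific context is retrieved at inference time and combined with a pre-built general-purpose model. The `VectorStore` interface and `llm_generate` callable are stand-in assumptions, not any particular product's API.

```python
# Simplified RAG sketch: retrieve business context at inference time and pass it
# to a generic, pre-built model. VectorStore and llm_generate are hypothetical
# stand-ins, not a specific vendor API.
from typing import Callable, List, Protocol


class VectorStore(Protocol):
    def search(self, query: str, k: int) -> List[str]:
        """Return the k stored documents most similar to the query."""
        ...


def answer_with_rag(store: VectorStore, llm_generate: Callable[[str], str], question: str) -> str:
    # 1. Retrieve the freshest, most relevant context; in the streaming setup
    #    described above, this store would be fed continuously from data streams.
    context = "\n".join(store.search(question, k=3))

    # 2. Combine the retrieved context with the question. The model itself stays
    #    general-purpose; the business-specific knowledge arrives via the prompt.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    return llm_generate(prompt)
```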

speaker
Michael Turrin
Analyst at Wells Fargo

Great. And then just as a follow-up, if I may, Rohan, you mentioned some differences between '24 and '25. You're calling for pretty consistent growth rates when we look at the Q1 subscription revenue guide and the rest of the year. Maybe just frame for us what you're assuming with that initial guide in terms of some of the changes that you're laying out; just more context there I think would be helpful. Thank you.

speaker
Rohan Sivaram
CFO

Thanks for your question, Michael. I will absolutely do that. First of all, we're very pleased with our guide for Q1 and fiscal year 2025 for subscription revenue. It is above the consensus estimates that were out there. And when you really look at the underlying drivers, I'll call out maybe three of them, right? First, for the cloud business, we called out that we are comfortable with the estimates that are out there, which is roughly a greater-than-30% growth rate, give or take, at a very high scale. And that's kind of underpinned by just the stability in the NRR that we spoke about in the second half of the year. The second aspect I'll call out is some of the growth drivers. The first driver is DSP, which we spoke about. The DSP products are in the earliest stages of their growth curve, so obviously we have a lot of runway out there. So that's an opportunity. The second piece around the growth areas is some of the newer products that we spoke about: we just GA'd Freight clusters, we're going to be GA'ing TableFlow, and there's WarpStream momentum. So that's category number two. And category number three is the partnerships that we spoke about, Jio and Databricks. And these partnerships are not only on the technology side but also on the go-to-market side. So the revenue contribution from the partnerships is probably not as material, but these are all kind of drivers of growth as we look ahead into next year. What I'm telling you is, when I take a step back, I think it's helpful to have multiple growth drivers and, more importantly, multiple paths to get to your objective. So that's how I would categorize it.

speaker
Michael Turrin
Analyst at Wells Fargo

Great. Thanks very much.

speaker
Shane Xie
Investor Relations

Thanks, Michael. We'll take our next question from Brad Zelnick with Deutsche Bank, followed by Morgan Stanley.

speaker
Brad Zelnick
Analyst at Deutsche Bank

Thanks very much. Great to see everybody, and congrats. Jay, maybe just for starters, you know, this past year you made a transition in your go-to-market and changed up the incentives. As we look at 2025, what's the big message coming out of sales kickoff? What adjustments, if any, are you making to the model that we should keep in mind that'll perhaps change incentives and behavior? Thanks.

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah, it's a great question. So obviously last year was a bigger transformation. We were changing a lot of the fundamental incentives, but that trickles down to even how we would track workloads with customers, our definition of pipeline, a number of organizational changes to support it. You know, I think one of the really positive things heading into this year is we feel like, broadly, those changes have been effective. You know, it was a big adjustment, but heading into this year, it's a much more minor set of tune-ups, which are kind of the equivalent of what you would do heading into any new year, where you tune this aspect or that aspect of the comp plan. One of the big advantages of what we've done is we can now track and drive the specific workloads and the amount of each DSP component that's part of them; it's not just some generic commit to some raw number of dollars. It really is a set of use cases which are going to drive consumption of these individual new product areas, and that allows us to drive it. So, you know, we actually just had our sales kickoff this last week. A lot of fun, really positive energy, I think, from the team. I came back pretty excited and, you know, with a little bit of a sore throat from talking so much. But, yeah, I think it resonated really well, and people are excited to go take this out to all their customers.

speaker
Brad Zelnick
Analyst at Deutsche Bank

Awesome. And maybe just a quick follow-up, Rohan. You talked about packaging changes that you made that expand the serviceable addressable market. We just hope that you can expand on that, because that sounded like a really important point, and I want to make sure that we understand it. Thank you.

speaker
Rohan Sivaram
CFO

That's right. Brad, the best way to think about it is, when I look at, say, 18 months back, the pricing and packaging options that were out there were primarily our Dedicated clusters and the Standard and Basic clusters, which were probably the two bookends. In the last 18-odd months, we've added Enterprise clusters, which are essentially multi-tenant, you know, and provide private networking and the right amount of security. It's like, instead of flying private, you can fly commercial now with Enterprise. And then we also came out with our Freight clusters, which essentially let you run high-throughput, latency-insensitive workloads at a very attractive TCO and ROI. And then we acquired WarpStream, which is pretty similar to Freight clusters, but running in your own infrastructure. So when you kind of look at all of this, it is all around Confluent providing our customers with the right amount of ROI and TCO for all their workloads, not just selective workloads. So that's the commentary around us being able to expand our serviceable addressable market and essentially take a step forward toward our mission of soaking up the world's Kafka. Hope that helps.

speaker
Brad Zelnick
Analyst at Deutsche Bank

It does. Thank you so much.

speaker
Shane Xie
Investor Relations

Thank you. We'll take our next question from Sanjit Singh with Morgan Stanley, followed by Barclays.

speaker
Sanjit Singh
Analyst at Morgan Stanley

Thank you for taking the questions, and congrats on Q4. Jay, you mentioned that 2024 was a pretty transformational year across go-to-market changes and broadening out the product portfolio, and you made some changes on pricing over the last 12 to 15 months as well. I guess the last piece of the puzzle is kind of the broader spending environment. And so how would you characterize Q4 across, you know, executing against its pipeline, consumption trends going into the holidays, and then consumption trends coming out of the holidays into January and early February?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah. So, you know, I would say overall, certainly, you know, we've seen kind of stability in the market in our different customer segments. You know, we feel like that's continued into what we've seen so far in Q1, you know, as reflected in the results, right? You know, we felt very strong results, not some step change from, you know, Q3 or what came before, but overall strength in demand, a little bit more certainty in you know, kind of budgets and spending. And so, yeah, we're expecting that to, you know, continue through to this year. Understood.

speaker
Sanjit Singh
Analyst at Morgan Stanley

And then on the sort of go-to-market side, I imagine partnerships will be a little bit more in focus. Last year was a more consequential change on the go-to-market side. But what, if any, changes are left going into 2025?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, you know, certainly in terms of the consumption-related changes, you know, it's more of a kind of tuning here and there, how you incentivize the DSP multiplier and other smaller aspects of the comp plans. It's less revolutionary. We are putting more effort into the support for selling use cases that really consume this whole DSP, being able to land that directly, being able to support some of these use cases to integrate into the analytics realm, a little bit of what we talked about with the partnership with Databricks. So those are definitely some areas of particular focus for us. Appreciate it. Thank you.

speaker
Shane Xie
Investor Relations

Thanks. We'll take our next question from Raimo with Barclays, followed by JP Morgan.

speaker
Raimo Lenschow
Analyst at Barclays

Hey, thank you. Jay, the last few quarters we talked, or this time last year, we talked a lot more about Flink. And this time, you know, because you added so many more products, it's kind of come a little bit out of the limelight. Can you talk a little bit about what you're seeing there and what you can maybe do to drive momentum there in the coming year, if you have any plans here? And then one follow-up.

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah. We've been pleased with the progress. So, you know, it's a fairly large effort to get a new cloud data processing layer up and operational. We've seen a lot of maturity in the product offering, including opening up across a lot of the private networking types in the different clouds, which then allows us to begin to service more production workloads. We've seen good broad-based adoption in the customer base. It takes time for customers to kind of rebuild applications that they have, but overall, you know, good strength in the build there. And we're pleased with the progress and basically feel equally bullish as we would have in Q1 of last year.

speaker
Raimo Lenschow
Analyst at Barclays

And then on the Databricks relationship, obviously there's been a big discussion: do they need, kind of, you know, streaming, you guys, in that journey, or can Databricks do it themselves? This is obviously a very, very strong confirmation, you know, of your role in that ecosystem. What are you seeing there in terms of, like, how you would fit in versus other vendors? How exclusive is Databricks? What's the opportunity to broaden this out? Thank you. Congrats from me as well.

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah. You know, look, our role is as this broker of streaming data across, you know, these different platforms within the analytics realm. That's increasingly very relevant as they look at trying to participate in some of the AI workloads that are more real-time, that are more about running the business. We think we're in a very good position in that ecosystem and with some of the product functionality we're bringing to market, and we're really pleased to be working with them. We certainly view them as a leader in that space, and I think it's going to be a very productive partnership. Okay, perfect. Thank you.

speaker
Shane Xie
Investor Relations

Thanks, Raimo. We'll take our next question from Pinjalim Bora with JP Morgan, followed by Piper Sandler.

speaker
Pinjalim Bora
Analyst at JP Morgan

Oh, great. Thank you, guys. Congrats on the quarter. Jay, I just want to ask you one thing. In our conversations with partners and customers, a notion has emerged that for agentic architectures, it's not only real-time data flows but also real-time data processing that is becoming important for agents to make decisions. Is that broadly true? Are you seeing Flink attach rates higher for agentic workflows? And then one for Rohan: any way to understand what is assumed within the guidance for 2025 around DSP mix and Flink mix?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, it's a really insightful question. This is actually the topic of my keynote at the last Current conference that we held in the U.S. for our customer base, and I talked about some of these use cases. Exactly as you say, I talked in the previous question about these agents that are actually reacting to what's happening in the business, right? So you can see the streams of data as being a little bit like the sensory system that's saying, hey, what's happening, being able to take bits of work as they come in, act on them, and produce whatever output to the rest of the business. That is fundamentally a stream processing problem, however you tackle it. And so, yeah, we feel like the technology we have in that space is particularly relevant. The advantage of using this kind of streaming technology is very much that you can take the input data, run it against it, look at the output, run it again, tweak your model, try a different model, add new context information, bring together other information you have. So, yeah, I think it's very applicable. And if anything, you know, one of the interesting things that's happening in the AI world relative to the older machine learning is that, in some sense, the action is kind of moving into real time. You know, prior to Confluent, I worked at LinkedIn. One of the teams that we had was a big data science team that would do model building, right? And so the role for streaming relative to that team was to feed a bunch of stuff into some lake, and they would go offline and kind of build some models that would eventually be shipped out. But each model was very specific to the problem being solved. If you think about what's happening in the new world, companies aren't building as many bespoke models. You're mostly getting your model pre-built off of a very large general-purpose data set. Mostly what you're doing is applying your specific context data with the model at inference time, like at runtime. That's kind of the RAG architecture. So if you think about, well, what's the resulting data infrastructure from that, it's going from this kind of offline processing to something that's very much online, that's in sync with the business. And that's obviously advantageous, I think, you know, for folks in the streaming world.

speaker
Rohan Sivaram
CFO

Yeah, and Pinjalim, to answer the second part of your question, when I think about DSP consumption, it grew substantially faster than the overall cloud business in Q4 2024. And as I mentioned, it accounted for roughly 13% of our cloud business, which we're obviously very pleased with. And as I think ahead, all the key components of the DSP are in the earlier stages of their growth curve, and we expect them to be material drivers of growth for the next couple of years. And specifically for 2025, while I'm not going to get into exact numbers, we expect that mix to increase as we go through the year.

speaker
Shane Xie
Investor Relations

Thank you. All right, thanks, Pinjalim. We'll take our next question from Rob Owens with Piper Sandler, followed by Goldman Sachs.

speaker
Rob Owens
Analyst at Piper Sandler

Thanks, Shane, and good afternoon, everyone. One question, but multiple parts, so you guys can both answer. I guess I'd like to focus a little bit on your million-dollar customers and your success upmarket. What's driving that? Are customers getting to that million-dollar bogey quicker, especially as you move to the DSP platform? And for Rohan, the $20 million sequential change in long-term deferred revenue kind of sticks out, especially given the move to consumption. So maybe you can address that. Thank you.

speaker
Jay Kreps
Co-Founder and CEO

Yeah, I'll take the first part of that. Yeah, obviously a large part of what we're trying to do is build the journey from that first use case to a large cross-company platform that has a significant portion of the data and application workloads in the company. And that's a big project in each one of these customers. And everything we do is about how can we accelerate it. So when we think about services offerings, partnerships, features, it's all about how can we build that journey and make that easier for customers to achieve. In particular, a lot of the DSP functionality is very much that, right? So the connectors make it really easy: instead of having to come in and build a bunch of custom integration, you can just plug this into the things that you have and have streams of data flowing. Flink makes it much easier to build real-time applications that are fault-tolerant and scalable and correct and work with the languages you know, like SQL, which is kind of the lingua franca of the data world. And similar with the kind of governance functionality that makes this actually usable across the organization. And we do see that flywheel, right? Indeed, as customers adopt these things, we see greater usage of the core streams of data. And so we feel very much that as these things spin up, they all kind of feed off of each other. And that should very much take customers more quickly to larger scale within the organization. And I think that we're starting to see realization of that. So for CIOs, CTOs, engineering leaders, as they're thinking about an overall data strategy, as they're thinking about what's needed for AI, I do think this is a very important piece that they're thinking about and starting to contemplate the role that data streaming plays. And that kind of broad understanding of our space hasn't always been there; you know, in the early days of the company, it was not. But I do think that that's a key accelerant when you think about getting customers to buy in, to build around something, and to take it to a large scale quickly.

speaker
Rohan Sivaram
CFO

Rob, I'll add a couple of quick points to what Jay just said, and then I will go to the deferred revenue piece. On the large customer ecosystem, I also called out that when you look at our $100K-plus ARR customers, that cohort now basically contributes approximately 90% of our subscription revenue. And then when you look at our $1 million-plus customers and you look at the last 12 months, the momentum has been great. So I'll give you a different lens: when I think about consumption, there are probably three drivers of consumption, right? You have an existing use case and there is more data flowing through it; that's number one. The second area is our ability to unlock net new use cases, and with AI, you know, that's obviously a lever of growth. And the third, albeit in early days, is us selling DSP into our existing customer base. So the opportunity with our existing customers is still pretty large, just to add to what Jay said. Now, going back to your second question around deferred revenue: honestly, there's not a whole lot to read into it. Like I said, when you look at deferred revenue or RPO, that's probably not the only indicator of the organic momentum of the business, purely because of how we think about cloud: we are consumption first, and we're truly not leading with the largest commit possible. It's all around unlocking the consumption and working with the customer around it. So changes in deferred revenue are primarily driven by the timing of large Confluent Platform multi-year deals, but there's not a whole lot to read into it.

speaker
Rob Owens
Analyst at Piper Sandler

All right. Thank you.

speaker
Shane Xie
Investor Relations

Thank you. We'll take our next question from Kash Rangan with Goldman Sachs, followed by Guggenheim.

speaker
Kash Rangan
Analyst at Goldman Sachs

Hey, thank you very much, Jay, Rohan, and Shane. Good to see you guys. Jay, I was wondering, since we've had the benefit of a full year of the rollout of the new model and we brought Flink into the fold, how satisfied are you with the tilt towards compensating salespeople through consumption? Because I think a year back, the hope was that we'd come out of this with the ability to grow even faster and allow customers to consume even more freely and faster. And what are the things we should be watching for to ensure that that is happening? Maybe the net expansion rates are not there yet; the cloud growth rate could be hitting an inflection point; maybe that is still ahead. So how satisfied are you that we've seen, or are beginning to see, the fruits of all the actions that were put in place last year? And also, second, finally, if you have the time: if AI were to be a tailwind for the company, where would it show up in a way that people like us can read in the financial statements? Thank you so much once again.

speaker
Jay Kreps
Co-Founder and CEO

Yeah, so to the first question, there are really two things we're trying to achieve. One is to really match the way our customers are buying and thinking about the type of product that we offer: make it easier for them to land, make it easier for them to expand and add workloads. The second was to be in a position to really drive adoption of DSP around our cloud. And I think we exit this last year and enter this year in a position to do both of those, and that was very much the objective. You know, I think that's a much better position to be in relative to what we're doing with customers. I think it means we're tracking the kind of individual workloads that they're bringing out to production. And so, yeah, I do see that as a, you know, growth driver for us over time and as a necessary step to be in sync with the peer companies and in sync with the new offerings we want to bring to market. To the latter question, on AI use cases, yeah, I think it's really about the broader set of these coming into production usage. That's where we're going to see this happen. I think you would see it both in the kind of customer references, and we've shared some of the ones that we've seen so far, as well as in the, you know, kind of overall growth numbers for the company. But there's obviously not a broken-out category. You know, I don't think it's necessarily disproportionately, you know, cloud or CP; we see use cases across both. So, yeah, I don't think you would see it, you know, in a single customer stat that we produce.

speaker
Shane Xie
Investor Relations

All right. Thanks, Kash. We'll take our next question from Howard Ma with Guggenheim, followed by Mizuho.

speaker
Howard Ma
Analyst at Guggenheim

Great. Thanks. It's great to see everyone. And congrats on a strong finish to the year. Jay, in your prepared remarks, you mentioned a couple of customer examples where I believe they're replacing traditional data integration vendors with Confluent. I believe to date, though, most of Confluent's new business has come from replacing open-source Kafka for real-time use cases. But it seems like now you're going after maybe more direct rip-and-replace of commercial data integration vendors for batch processing. And I'm not sure if this has always been the case and what the historical mix has been, but can you confirm if that is true, that you're winning more of these larger commercial batch workloads? And if that is the case, is that changing your go-to-market at all, the types of prospects you're targeting and how you're targeting them?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, it's a really good question. So I would say it's always been the case that we've been replacing other legacy integration technologies; that's not entirely new. However, often our first land was a conversion of open-source Kafka. Now, of course, the open-source Kafka may have replaced some legacy data integration, or it may have been a net new project. But one of the things we've always found is, you know, whatever that initial land is, as customers get to scale, they start to think about the portfolio of technology and what it is that they want to bet on going forward. And they start to have a plan of what they're divesting and what they're investing in. And I do think it's very much the case in our large customers that they see data streaming as the future of how data flows across the organization. And I think that that does displace a number of the existing vendors in that space, and I think they are looking at replacing that. Now, can we land directly against those use cases? I think the answer is increasingly often yes, right? One of the things that's happened, as there's more awareness of the data streaming space overall and also as our product is more complete, you know, having an out-of-the-box set of connectors that plug in, having transformation capabilities, is that it is more of an apples-to-apples replacement of some of these things with something that's net better, that's more scalable, that's real-time, that's an open application platform that has a lot of benefits, but actually can just kind of displace a lot of what's there. And so, yeah, I do think we're seeing more of that. Great. Thanks so much. Yeah, of course.

speaker
Shane Xie
Investor Relations

All right. Thanks, Howard. We'll take our next question from Gregg Moskowitz with Mizuho, followed by DA Davidson.

speaker
Gregg Moskowitz
Analyst at Mizuho

Thanks, and congrats on the results and the partnership announcements. Jay, it sounds like WarpStream got off to a very good start, and so you now have three types of deployment mechanisms, each of them valuable for different reasons, in addition to things like Freight clusters. That being said, could this potentially lead to a little more customer deliberation, a slight lengthening of sales cycles as part of that decision-making process, or is that really not a concern from your perspective?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, it's not a huge concern. Obviously, anytime you're adding more packages or options, you want to be cognizant of that. But we found that actually, when you look at customer workloads, there's a fair amount of diversity. And so the first-order thing to do is make sure that you have a really excellent TCO story, deployment story, and set of capabilities for each of those. And you can kind of walk into a customer and say, hey, tell me what you're doing; we've got the right solution for all of these things kind of out of the box. And, you know, one of the things I mentioned, we had our sales kickoff. One of the things that I talked with our team about, and that a lot of our product leaders talked about, was the fact that entering this year, we really felt like we were in this position where across every use case, we were dramatically, you know, superior to whatever the customer was doing in the streaming space. And then it could be kind of a no-brainer both on functionality but also on price, you know, from the very large scale to the kind of early starter stuff to the, you know, very high SLA but maybe not as big. You know, really across the board, we had a very strong start. And, you know, that was a deliberate path over the course of last year. Now, it does mean we've got more cluster types, but, you know, I think in any of these areas that succeed, like if you look at the AWS product offerings, sure, you know, they start with, you know, three EC2 instances. As this gets more use, they do kind of fill out the matrix a little bit more to be able to really, you know, cover all the different compute needs that a customer may have. And I think that that's actually kind of a very important part of maturity in this kind of cloud infrastructure. So, you know, net-net, we feel very good about the kind of full portfolio.

speaker
Gregg Moskowitz
Analyst at Mizuho

Terrific. Thank you.

speaker
Shane Xie
Investor Relations

Thanks, Gregg. We'll take our next question from Rudy Kessinger with DA Davidson, followed by Canaccord Genuity.

speaker
Rudy Kessinger
Analyst at DA Davidson

Hey, thanks, guys. I know you talked about going into year two of this go-to-market change, and really just how that's gone pretty smoothly, with just kind of some tune-ups at this point. I am curious, though, Rohan, within the guide, are you assuming higher sales productivity levels this year, just given you're going into year two, or just what's implied from that standpoint?

speaker
Rohan Sivaram
CFO

That's a great question. Again, I'll start with 2024, Rudy. When I look at 2024 and I look at just the broad efficiency on the go-to-market side of the world, in fact, we improved our sales and marketing as a percentage of revenue in 2024 by roughly six percentage points. That essentially tells you that as we've gone through the year, we've gotten more and more efficient. So that's one aspect of it. And given our resource allocation philosophy, we're not going to lose sight of that. That's going to be a continued focus for us, getting more and more efficient as we try to deliver durable growth. When I think about where we are from a capacity standpoint at the beginning of the year, we feel good about the capacity we have. We had a good hiring year toward the latter half of 2024, so we have the capacity on board to deliver on the plans that we have for 2025.

speaker
Shane Xie
Investor Relations

Thanks, Rudy. We'll take our final question today from Kingsley Crane with Canaccord Genuity.

speaker
Kingsley Crane
Analyst at Canaccord Genuity

Thanks. So, data gravity: it's really important. When you look at all the strategic moves and acquisitions made by these large data platforms, a lot of them tie back to data gravity. Data mobility has also never been higher. TableFlow fits perfectly into that narrative. Just taking a step back, does all of this give you more confidence that stream processing will be a much bigger market over time? And then what inning do you think we're in, in that market?

speaker
Jay Kreps
Co-Founder and CEO

Yeah. So, you know, if you look at the kind of broad trends that we're seeing: the rise of AI and the push of a lot of the analytics world more into real time, the continued investment in a set of different data platforms, the rise of the public cloud, but the endurance of some of the on-premise stuff. Net-net, the connectivity between all of this is really important, and more of the workloads are moving into this kind of operational, continuous processing world. So yeah, it's a phenomenal setup for streaming. What inning are we in? We're in a very early inning. So if you add up the dollars spent on batch processing, it is a large pile of dollars. If you say, hey, what percentage of those dollars have moved to streaming? Well, a good percent. We're excited to talk about it every quarter. But there's a lot more dollars to go. And I think that's an exciting thing for us. And I think, as we talked about before, what this area encompasses is both a set of data integration workloads, kind of bespoke, half-worked-out data integration verticals, but also a set of application workloads that were maybe running offline at the end of the day and are moving into the operation of the business. I think both of those are very exciting opportunities. Both of those are being done by our customers today. And the scope of those is expanding, both with the kind of DSP capabilities we're bringing as well as the continued expansion in these customers as this becomes a really critical data platform for them.

speaker
Kingsley Crane
Analyst at Canaccord Genuity

Great. And a quick follow-up. So great to see diverse streaming adoption, video gaming, sports media, pharma, airlines. Are you seeing any particular vertical strength with respect to newer AI-related projects, or is that similarly diverse?

speaker
Jay Kreps
Co-Founder and CEO

Yeah, yeah. We've done really well in the kind of dedicated AI companies, and then, of course, you know, the AI projects inside of existing industries. The AI companies are obviously wonderful customers to have. We've talked about OpenAI in the past. We talked about Cursor on the call today, which is a really cool tool that helps software engineers write code faster. So we're very excited about what's happening there. Those are a very fast-growing set of customers that we're pleased to be working with.

speaker
Shane Xie
Investor Relations

All right, so this concludes our earnings call today. Thanks again for joining us. We look forward to seeing many of you at our March Investor Day. Have a nice evening and take care.

speaker
Raimo Lenschow
Analyst at Barclays

Thanks, everyone.

speaker
Shane Xie
Investor Relations

Thank you.

Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
