Confluent, Inc.

Q1 2024 Earnings Conference Call

5/7/2024

spk05: Hello, everyone. Welcome to the Confluent Q1 2024 Earnings Conference Call. I'm Shane Xie from Investor Relations, and I'm joined by Jay Kreps, co-founder and CEO, and Rohan Sivaram, CFO. During today's call, management will make forward-looking statements regarding our business, operations, sales strategy, market and product positioning, financial performance, and future prospects, including statements regarding our financial guidance for the fiscal second quarter of 2024 and fiscal year 2024. These forward-looking statements are subject to risks and uncertainties, which could cause actual results to differ materially from those anticipated by these statements. Further information on risk factors that could cause actual results to differ is included in our most recent Form 10-K filed with the SEC. We assume no obligation to update these statements after today's call except as required by law. Unless stated otherwise, certain financial measures used on today's call are expressed on a non-GAAP basis, and all comparisons are made on a year-over-year basis. We use these non-GAAP financial measures internally to facilitate analysis of our financial and business trends and for internal planning and forecasting purposes. These non-GAAP financial measures have limitations and should not be considered in isolation from or as a substitute for financial information prepared in accordance with GAAP. A reconciliation between these GAAP and non-GAAP financial measures is included in our earnings press release and supplemental financials, which can be found on our IR website at investors.confluent.io. And finally, once we have concluded our earnings call, we will post the Confluent earnings report to our IR website. The report is a single PDF which contains our earnings infographic, one-pagers on our technology, our prepared remarks, and slides from today's earnings call. Going forward, we plan on publishing the report at the end of our quarterly earnings call. 
With that, I'll turn the call over to Jay.
spk10: Thanks, Shane. Good afternoon, everyone, and welcome to our first quarter earnings call. I'm happy to report we had a strong start to the year, exceeding the high end of all guided metrics. Total revenue grew 25% to $217 million. Confluent Cloud revenue grew 45% to $107 million; it now accounts for the majority of our subscription revenue and remains our fastest growing offering. Non-GAAP operating margin improved 22 percentage points, our fourth consecutive quarter of more than 20 points of improvement. These results reflect our team's strong execution amid a still uncertain but stabilizing macro environment. In Q1, we launched our consumption transformation. We oriented our sales compensation for cloud toward incremental consumption and new logo acquisition. We rolled out new systems, metrics, and measures and made pricing adjustments to reduce friction in landing new customers. It remains early days, but we are encouraged by the promising signals of our consumption orientation, particularly around new customer acquisition and the stabilization of consumption trends. With an increased focus on new logo growth, we added 160 customers to our total customer count, our largest sequential growth since Q1 2023. We not only increased the volume of customer additions but also better targeted high-potential customers, increasing the quality of our customer adds as well. We recently hosted Kafka Summit in London and Bangalore. Kafka Summit Bangalore was the first ever Kafka Summit in APAC. These events are a great illustration of the tremendous growth and innovation happening within the data streaming category. Between the two events, we had more than 7,000 people joining us in person or registered virtually, spanning startups, enterprises, and everything in between, including organizations like Apple, Bloomberg, CERN, ING, Stripe, Uber, and many others. 
Our relentless pace of product innovation was on full display, with 15 major customer-facing features and price and performance optimizations announced across both events, including the general availability of Flink and early availability of a powerful new feature we call TableFlow. TableFlow makes all the data streams that flow through Confluent Cloud available as structured tables in cloud object storage using an open table format called Apache Iceberg. Let me provide a little background on what this means and why it's so powerful. Historically, data in the analytics and data warehousing world has existed in closed systems that trapped their data inside a walled garden. As the complexity of the analytics world has grown, this has led to a mishmash of data warehouses, data lakes, AI products, and reporting systems. This created lots of value for technology vendors but yet another data silo for the end user. However, over the last five years, a trend has emerged of standardizing around open data formats and metadata on top of cloud object storage. The rise of cheap cloud object storage like S3 means another path is possible. Instead of fragmenting data across various analytical systems, the tables of data can be shared across systems. Apache Iceberg has arisen as the de facto standard for these open analytic tables on top of cloud object storage. Iceberg is an open source project with near universal support across open source systems like Apache Spark and Flink, as well as the data warehousing and data lakehouse world, including products like AWS Athena, Redshift, Google's BigQuery, and Snowflake. TableFlow is more than just a connector. Kafka and Confluent are already one of the most common feeds of data into analytics systems. But with TableFlow, we can make that integration far deeper. Kora, our cloud-native Kafka implementation, already relies heavily on cloud object storage for storing the streams of data in Confluent Cloud. 
TableFlow means we can open up these same streams directly as Iceberg tables with the click of a button. This means data is defined a single time, stored a single time, and no complex mappings or translations are needed. TableFlow is in early access now and taking on its first users. For some vendors, the rise of open data formats like Iceberg is perceived as a threat, as it opens up data that was locked in a silo to an ecosystem of processing and analytics layers, letting vendors compete on a level playing field based on cost, performance, and features, rather than any new entrant having to overcome significant data gravity. However, we believe Confluent is uniquely positioned to benefit from this trend, as our goal, and indeed our business model, is built around the sharing of data. So the rise of Iceberg creates a very important destination for data that can increase the value of the streams in our platform. This makes TableFlow central to Confluent's vision of opening up and connecting all the data in an organization. We've heard overwhelmingly positive feedback from customers on this announcement and look forward to making this a significant part of our business over time. Last quarter, we discussed the world of stream processing and why our Flink offering is uniquely positioned to win this market. And we've been seeing incredible interest and excitement for our Flink offering. Nearly 600 prospects and customers have tried Flink since its preview. At Kafka Summit London, we announced the general availability of Confluent Cloud for Apache Flink. Early customer feedback has been strong. We see many of these customers starting the ramp towards production applications that will drive significant consumption over time. We announced another exciting Flink development at Kafka Summit Bangalore: we'll be adding Flink to our on-premise software offering, Confluent Platform. 
This helps our on-premise and hybrid customers adopt Flink for critical workloads running in their data centers. These are very exciting steps for Confluent, cementing our position as the only complete data streaming platform. TableFlow and Flink are new capabilities beyond Kafka and represent significant progress towards building what we believe will be the most important data platform in a modern company. GenAI continues to be top of mind for many companies, but most are coming to realize that LLMs don't stand alone. RAG, or retrieval-augmented generation, has emerged as the common pattern for GenAI to extend powerful LLM models to domain-specific datasets in a way that avoids hallucination and allows granular access control. Data streaming platforms play a pivotal role in enriching RAG-enabled workloads with contextual and trustworthy data. They enable companies to tap into a continuous stream of real-time data from the systems that power the business and transform it into the right format to be used by vector databases for AI applications. Another announcement from Kafka Summit Bangalore that helps make this kind of RAG architecture easier was support for AI models and remote inference in Flink SQL. This capability is designed to simplify the development and deployment of AI applications by enabling software developers to integrate inference and embedding computation directly into their data processing, making it easier than ever to bring AI to real-time apps. We're seeing particularly strong traction with GenAI in the digital native segment, with companies like OpenAI, Notion, and Motive leveraging GenAI to reimagine customer experiences in nearly every industry. One such customer is an AI-powered customer intelligence platform used to manage contact centers and customer engagements. 
A powerful communications AI is central to its platform and is used for a variety of use cases, including surfacing real-time insights for call center managers and identifying when agents need immediate assistance or intervention in handling problematic situations. Their existing architecture was unable to handle the demands of real-time, with latency sometimes exceeding a minute. This sluggishness was unacceptable for an AI application that requires access to fresh and continuously updated data. So this customer turned to Confluent Cloud for fast and scalable data streaming. By integrating Confluent with other components of its architecture, the customer was able to significantly reduce latency, bringing response times from over a minute down to as low as 10 milliseconds. With faster, fresher data and more real-time insights available, the customer is better equipped to meet the needs of its customers and provide them with valuable tools and analytics for managing their contact centers and customer engagements. But it's not just digital natives who are putting GenAI into action. Another great example is GEP Worldwide, a global leader in supply chain and procurement solutions. This billion-dollar-revenue company provides software, consultancy, and managed services to some of the world's biggest multinationals. Its software offerings are infused with GenAI to support chatbots and decision support tools. Previously, the team was an open source Kafka shop, but operating and maintaining open source Kafka became too burdensome, ultimately stifling their ability to iterate and innovate quickly. So they turned to Confluent. With Confluent serving as the central nervous system of its software, the company is able to more quickly connect data across hundreds of applications, including both custom apps and the operational and analytical estates, to provide contextual, relevant, and real-time insight into its AI platform. 
Confluent continues to innovate across our products and partner ecosystem to make it easier for organizations to quickly scale and build AI-enabled applications using trusted data streams. In closing, I'm incredibly proud of our team. Our rapid pace of innovation is phenomenal, and our field and go-to-market teams are leaning into our consumption transformation with early positive results. I've never been more excited or confident in Confluent's ability to capture the lion's share of the data streaming platform market. With that, I'll turn things over to Rohan.
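Jay's description of TableFlow, in which a stream is defined once, stored once, and then read as an open-format table with no extra copies or mappings, can be illustrated with a minimal sketch. This is not Confluent's implementation: the class names and fields below are hypothetical, and a real TableFlow table would live in cloud object storage in Apache Iceberg format rather than in Python objects.

```python
from dataclasses import dataclass, field

@dataclass
class StreamEvent:
    """A single record on a Kafka-style topic."""
    key: str
    value: dict

@dataclass
class TableView:
    """An append-only, Iceberg-style table view over a stream.

    In TableFlow the stream and the table share the same underlying
    cloud object storage, so exposing the table copies nothing; here
    that is modeled by holding references to the same event objects.
    """
    rows: list = field(default_factory=list)

    def append(self, event: StreamEvent) -> None:
        # Same object as in the stream: defined once, stored once.
        self.rows.append(event)

    def scan(self, predicate=lambda e: True) -> list:
        """Batch-style read over data that arrived as a stream."""
        return [e.value for e in self.rows if predicate(e)]

# A stream of order events flows in; the table view tracks it.
topic = [
    StreamEvent("o1", {"item": "widget", "qty": 2}),
    StreamEvent("o2", {"item": "gadget", "qty": 5}),
]
table = TableView()
for event in topic:
    table.append(event)

# An analytical engine would now query the table directly.
print(table.scan(lambda e: e.value["qty"] > 3))  # [{'item': 'gadget', 'qty': 5}]
```

The point of the sketch is the identity check: the table rows are the same objects as the stream events, mirroring the "no complex mappings or translations" claim.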
spk07: Thanks, Jay. Good afternoon, everyone. We delivered solid first quarter results, beating all our guided metrics in a still uncertain macro environment. Key highlights include robust top-line growth and bottom-line improvements, the largest sequential customer growth since Q1 2023, and great momentum in multi-product adoption. These results reflect our team's strong execution on our consumption transformation and our expanding multi-product platform leadership in data streaming. Turning to the Q1 results, total revenue grew 25% to $217.2 million. Subscription revenue grew 29% to $206.9 million. Within subscription, Confluent Platform revenue grew 15% to $100.1 million, representing 46% of total revenue. Customers rely on Confluent Platform to harness data streaming on-prem, on the edge, and to bridge to the cloud. We continue to see healthy demand for Confluent Platform, as most organizations are still early in their move to the cloud. Confluent Cloud revenue grew 45% to $106.8 million, exceeding our guidance of $105 million, and ended the quarter at 49% of total revenue, compared to 42% a year ago. Our cloud performance was driven by the ramp in consumption from select customers added in recent quarters. And we started seeing stabilization of new use case expansion in our existing customer base, including our digital native segment. Turning to the geographical mix of revenue, revenue from the US grew 23% to $127.4 million. Revenue from outside the US grew 28% to $89.8 million. Moving on to the rest of the income statement, I'll be referring to non-GAAP results unless stated otherwise. Total gross margin was 76.9%, up 470 basis points. Subscription gross margin was 80.7%, up 320 basis points. We're pleased to be operating above our long-term target level of 75% for total gross margin, even with a continued revenue mix shift to cloud. 
Our cloud offering has significant architectural advantages in multi-tenancy, elasticity, data balancing, networking, and data replication. Combined with continual optimizations at every layer of the stack, we have driven a significant cost advantage in operations while delivering industry-leading innovations to our customers at a lower TCO. Turning to profitability and cash flow, operating margin improved 22 percentage points to negative 1.5%, representing our seventh consecutive quarter of more than 10 points and fourth consecutive quarter of more than 20 points of margin improvement. Operating margin performance was driven by our gross margin performance and our continued focus on driving efficient growth across the company, with the most pronounced progress made in sales and marketing. The improvements in sales and marketing demonstrate focused efforts in driving operating leverage and improving unit economics. Net income per share was $0.05 for Q1, using 350.2 million diluted weighted average shares outstanding. Fully diluted share count under the treasury stock method was approximately 362.4 million. Free cash flow margin improved 33 percentage points to negative 14.6%. And we ended the first quarter with $1.91 billion in cash, cash equivalents, and marketable securities. Turning now to other business metrics, total customer count was approximately 5,120, representing an increase of 160 customers sequentially. This is our largest sequential growth in total customers since Q1 2023, reflecting the early signs of success from our consumption transformation. Customers with $100K+ in ARR grew 17% to 1,260 and continue to account for greater than 85% of our revenue. Customers with $1 million+ in ARR grew 24% to 168, reflecting the power of our network effect and customers' continued standardization on our data streaming platform. NRR was healthy and in line with our target range of 120% to 125% for this year. 
Gross retention rate remained strong and was above 90%. As discussed last quarter, we expect NRR to exceed our midterm target threshold of 125% starting fiscal year 2025 as we exit the consumption transformation. RPO was $840.2 million, up 13%. Current RPO was estimated to be $570.6 million, up 20%. As discussed in prior quarters, RPO-related metrics are now less relevant given our greater focus on driving consumption for our cloud business. Starting this quarter, investors can access our RPO metrics in our Supplemental Financials document on our IR website. Now, I would like to discuss our long-term opportunity with our data streaming platform, or DSP. We have driven success with our cloud-native streaming product, with Kafka accounting for the substantial majority of our cloud revenue. Over the last few years, we have added Connect, Process, and Govern to complete our multi-product platform. As Jay mentioned, early customer reception of our stream processing product, Flink, has been strong. As our customers start building and ramping their streaming applications, we expect Flink will contribute to revenue meaningfully in 2025. But Confluent isn't just about streaming and stream processing. Our growth potential with Connect and Govern is often underestimated. Legacy integration companies have a massive installed base around connectors, and this is a significant opportunity for our Connect portfolio. Connect is our first and largest DSP product after streaming, and its revenue growth trajectory has been robust. For Govern, the increasing complexity around data, security, and regulation, coupled with the rise of GenAI, is driving demand for our governance products. In fact, revenue growth for Stream Governance has been the fastest of any product we have launched to date. The multi-product aspect of our unified platform adds to our growth vectors and extends our runway to drive durable and efficient growth. Let me put it into more context. 
First, each pillar of our platform has the potential to become a large, independent business on its own. The three DSP products, Connect, Process, and Govern, are earlier in their S-curve of maturity and adoption than Kafka. But over time, we think their growth potential will be larger than that of Kafka itself. Second, our opportunity with our DSP products remains in its very early days. In Q1 2024, the three DSP products accounted for approximately 10% of cloud revenue, but with a substantially faster growth rate than our overall cloud business. We expect the three DSP products to remain the fastest growing part of our business and to account for a much larger portion of our cloud revenue over time. And third, multi-product customers have a higher NRR profile. In Q1 2024, customers using three or more products in our $100K+ customer cohort increased 47% year over year. These multi-product customers had an NRR substantially higher than the company average. This underscores the strong network effects of our unified platform, where the success of one product drives additional success in the others. As our data streaming platform matures and multi-product adoption continues to increase, we believe we will be in a stronger position to address the $60 billion market opportunity ahead. Before turning to our financial outlook, I'd like to note that our guidance philosophy is consistent with prior quarters, with the overall objective of setting prudent and achievable targets. We don't forecast a better or worse macro environment in our guidance. And as a reminder, beginning with the third quarter of 2024, we will fully transition to providing total subscription revenue guidance only. Now let's turn to guidance. For the second quarter of 2024, we expect total revenue to be in the range of $229 to $230 million, representing growth of 21% to 22%. Subscription revenue to be in the range of $217 to $218 million, representing growth of 23% to 24%. 
Cloud revenue to be approximately $116 million, representing growth of approximately 39%. Non-GAAP operating margin to be approximately negative 1%, representing improvement of approximately 8 percentage points. And non-GAAP net income per diluted share to be in the range of $0.04 to $0.05. For the full year 2024, we now expect total revenue to be approximately $957 million, representing growth of approximately 23%. Subscription revenue to be approximately $910 million, representing growth of approximately 25%. Non-GAAP operating margin to break even, representing improvement of approximately 7 percentage points. Free cash flow margin to break even, representing improvement of approximately 16 percentage points. And non-GAAP net income per diluted share to be in the range of $0.19 to $0.20. Finally, we expect net dilution for fiscal year 2024 to be approximately 3%, in line with our midterm target. Our long-term target is to bring net dilution down to under 2%, which we expect will drive SBC as a percentage of total revenue down to the mid-teens over time. In closing, we're pleased with our strong top-line and bottom-line performance in the first quarter. Our consumption transformation has shown early signs of success. The value proposition of our multi-product platform is resonating with customers. We will stay focused on delivering innovation and value to our customers while continuing to fine-tune our go-to-market efforts, which we believe will put us in a stronger position to capture our market opportunity ahead. Now, Jay and I will take your questions.
spk05: All right. Thanks, Rohan. To join the Q&A, please raise your hand on Zoom. Today, our first question will come from Sanjit Singh with Morgan Stanley, followed by RBC. Sanjit, please go ahead.
spk00: Yeah, thank you for taking the question. Congrats on a solid start to the year. Jay, I want to go back to the big macro environment in terms of just the pace of new software development projects. I remember last year, that had definitely slowed down quite a bit. What are you seeing now in terms of just new software development initiatives? And maybe you can sort of tie that into some of the sales transformation efforts that you have going on in your organization.
spk10: Yeah, I think we've seen overall a stabilization. I would say the focus for a lot of our customers over the last year was really a heavy focus on cost optimization with some amount of new development, but really only the most necessary things. And I do think that's picked up a little bit. That's probably most pronounced in the digital native segment, where they were probably the hardest hit last year. And, you know, they probably have the biggest bounce back in terms of focus on AI-related initiatives and other developments. So I would say that's positive. And then, on the consumption transformation, I think that's gone really well. We had to execute a large number of changes in a pretty short period of time, and I think we significantly de-risked that set of changes by rolling out a bunch of system changes. They were well adopted by our field team and have actually proven themselves out with customers. And I think that's shown up in the higher rate of new customer acquisition. One of the nice things is that in addition to just getting more customers, we're actually targeting and getting higher-propensity customers. So it's more volume and higher quality. So yeah, we felt like that was overall a really good result.
spk00: That's great to hear. I really appreciate the breakout on some of the new product contribution to Q1 2024. In terms of the monetization strategy across the pillars of DSP, could you sort of outline that for us? And how does TableFlow potentially get monetized over time?
spk10: Yeah. Yeah. So each of those represents kind of a distinct monetization opportunity. So the connectors, you know, we charge for each of these connectors. You know, there's a couple pricing levers, but it roughly correlates with how many instances of the connector and the amount of volume of data flowing. For Flink, it's kind of the compute hours, very similar to the models you'd see for other processing services like Snowflake. For governance, it's an uplift that comes as kind of a flat fee as you move to our advanced governance package, as well as something that scales up with your usage of the product. And TableFlow is new, so it's just in early access now. We haven't announced any pricing, but that will also have monetization opportunities to come along with it.
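The pricing levers Jay lists, per-connector and per-volume charges for Connect, compute hours for Flink, and a flat fee plus usage-based charges for Govern, can be sketched as a simple cost model. All of the rates below are hypothetical placeholders, not Confluent's published pricing; the sketch only illustrates the shape of the model he describes.

```python
def estimated_monthly_cost(
    connector_instances: int,
    connector_gb: float,
    flink_compute_hours: float,
    governance_flat_fee: float,
    governance_usage_units: float,
    # The rates below are hypothetical placeholders, not Confluent's
    # published pricing; they only illustrate the shape of the model.
    rate_per_connector: float = 100.0,
    rate_per_gb: float = 0.05,
    rate_per_compute_hour: float = 0.20,
    rate_per_governance_unit: float = 0.01,
) -> float:
    # Connect: per-instance charge plus a charge on data volume flowing.
    connect = connector_instances * rate_per_connector + connector_gb * rate_per_gb
    # Flink: billed on compute hours, like other processing services.
    flink = flink_compute_hours * rate_per_compute_hour
    # Govern: flat uplift for the advanced package plus a usage charge.
    govern = governance_flat_fee + governance_usage_units * rate_per_governance_unit
    return connect + flink + govern

print(estimated_monthly_cost(2, 1000.0, 500.0, 1000.0, 20000.0))
```

Each term maps to one DSP pillar, which is why each pillar can ramp revenue independently as its usage grows.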
spk00: I appreciate the color. Thank you.
spk05: All right. Thanks, Sanjit. We'll take our next question from Matt Hedberg with RBC, followed by Barclays.
spk15: Hey, thanks, guys. Congrats on the results, really nice to see. Maybe as a follow-up to Sanjit's question, you guys have a lot of company-specific drivers that certainly seem apparent in your numbers. I'm just curious, though: could you help us think about how important the improving hyperscaler trends and growth rates we're seeing are to your success as well? I'm just trying to get a sense for how much of it is the environment versus Confluent-specific.
spk10: Yeah, it's a good question. I mean, I don't know that the specific performance of other companies directly drives us, but there's obviously some amount of correlation in all spend in the cloud. If we were breaking out the different things, I would say the success of our consumption transformation thus far is an important factor. The DSP components that Rohan outlined are early contributors: Connect is probably the furthest along, followed by Governance, and Flink just went GA, so that's just starting to ramp to revenue contribution and will contribute more coming into next year.
spk15: If I could, just one quick follow-up; actually, that's a nice tie-in. Jay, you mentioned 600 prospects have tried Flink, which is great to hear. I mean, we're starting to hear it show up in partner conversations as well. It sounds like a '25 thing. I'm just wondering, Rohan, when you think about it in guidance, are you considering any Flink contribution in the second half of 2024?
spk07: Thanks for your question. Well, we've said this before: obviously, Flink is a big opportunity for us, and 2024 is all about adoption while 2025 is all about monetization. So in terms of what's included in guidance, we're basically assuming that the material contribution from Flink will happen in fiscal year 2025. Got it.
spk14: Thanks, guys.
spk05: All right. Thanks, Matt. We'll take our next question from Raimo Lenschow with Barclays, followed by William Blair.
spk03: Hey, thank you. Thanks for taking my question, and congrats from me as well. Jay, on the Flink side, now that you're having proper early customer conversations, what are you seeing in terms of the adoption curve there? And you've obviously seen Kafka before. What are the early customer conversations like, and what do they drive you to think about the addressable market coming out of that one?
spk10: Yeah, I would say it's been very positive. There's incredible enthusiasm in our customer base, really across the broad set of customers, from the digital natives to large enterprises. It's early in the adoption for any of these cloud offerings; nobody wants to build production workloads against a pre-GA product. So this milestone of going GA is really the starting line. And then it's really about the build of production workloads. Each workload adds a little bit of continuous revenue production, and as those build up within customers, that's where it starts to contribute meaningfully. So I would say overall, both the development of that product and the market reception have gone about as well as we could possibly have expected when we initiated the development of the Flink offering. Now it's really on us to go execute it as a business and make customers successful with it, which is obviously a very important next step.
spk03: Yeah, and then one follow-up, Rohan. I get a lot of questions at the moment from the financial community on RPO and CRPO. Maybe it's worth a reminder of how that number comes together and how it should be seen in the overall context of results. Thank you.
spk07: Absolutely, Raimo. We called this out last quarter. When we think about the consumption transformation, one of the important changes we are driving is making sure we're incentivizing and focusing on consumption, not the commitment from the customer. RPO is nothing but the commitment from the customer, and that's not a huge focus for us, because what drives Confluent Cloud is the next unit of consumption. As a result, we said that when you think about the forward-looking indicators of our business, consumption revenue and subscription revenue are the true indicators of organic growth for us, and they will be more leading indicators than RPO and CRPO.
spk03: Very clear. Thank you. Appreciate that.
spk05: Thanks, Raimo. We'll take our next question from Jason Ader with William Blair, followed by Wells Fargo.
spk09: Yeah, thanks, Shane. Good afternoon, guys. Just wanted to ask about Gen AI. You gave some customer examples where folks are using your technology as part of Gen AI projects. Can you talk a little bit more about the timing of actual impact to the revenue? And then what specific products are you selling? Is it just the streaming or is it some of the other elements of the DSP?
spk11: Yeah.
spk10: Yeah. You know, as we described, I would say that's ramping now. We're seeing customers adopting this, and usually the digital natives are a little further ahead in their use cases. This is one of a number of use cases for us. It's not the only thing happening, but it's an important one, and I think a strategic one for customers. So yeah, I think that as these initiatives evolve and hit production, we'll see an increasing ramp of contribution from them heading into next year. What customers are using is really the full platform. Our role in this is to be the supply chain of data. So that involves our connectors, it involves Kora, our Kafka engine, and it will increasingly involve Flink and the integration into the LLMs that we just announced at Kafka Summit Bangalore. So yeah, I think all of those will be driven by these use cases.
spk09: Yeah, and one quick follow-up for Rohan. Rohan, can you talk about hiring right now? You seem to be doing a good job managing expenses, but I assume that with things stabilizing, you guys are ramping up some of your hiring, and that's one of the reasons why the op margins are going to be flat this year, but just maybe some thoughts on the hiring.
spk07: Yeah, Jason, when we think about our resource allocation philosophy, it is obviously about driving durable and efficient growth. And when I say durable growth, it's essentially extending our runway for growth over a long period of time. As we think about that, of course, investment in headcount is a key part of that goal. For example, in Q1 we had one of our strongest hiring quarters for the go-to-market organization, which is great. So yeah, I think we feel pretty good with respect to where we are and how we are thinking about a balanced approach on growth and profitability. On your question on margins, as you know, over the last 24-odd months we've improved our efficiency by over 40 percentage points. Heading into this year, we're on track to deliver the seven-percentage-point improvement, which is going to get us to break even for the full year.
spk09: Good luck.
spk07: Thanks.
spk05: Thanks, Jason. We'll take our next question from Michael Turrin with Wells Fargo, followed by Mizuho.
spk06: Hey, great. Thanks. I appreciate you taking the question. Jay, back to Flink. Is there a way for us to think about the customers you see as best suited to take advantage of the offering? I'm wondering if the addition of platform is important from that perspective. And then any early sense you can provide us around how getting newer products to GA can help as the go-to-market conversations are shifting more towards consumption profile and away from bookings?
spk10: Yeah. Yeah, I'm happy to do that. So, you know, we've certainly seen interest across our customer base. One of the things that we will see is a slightly different dynamic between the Confluent Platform Flink offering and the cloud offering. With the cloud offering, the early versions tend to be best suited to new use cases just beginning development, whereas there's more of a lift-and-shift opportunity on premise, as well as, you know, suiting new use cases. Over time, as that cloud offering reaches feature completeness and proves itself out with customers, there will be more of a shift of existing Flink workloads. That's the behavior we saw with Kafka, where the early adoption was the incremental use case and the lift and shift of kind of large install bases took more time. So I think we'll see a similar behavior here. And that's what we've seen out of customers. And nonetheless, across the full set of customers, there's a ton of eagerness. So people are kind of lining up, even if there's some feature they're waiting on; they're eagerly awaiting the delivery of that feature. So yeah, I think that we'll see growth on both dimensions. In terms of why we added it to the software offering, Confluent Platform, that was by popular demand. You know, originally the intention was just to do it in cloud. Ultimately, we have a set of customers that have pretty extensive on-premise operations, some of whom are very big Flink users. And they were very eager to have an offering for them as well. And for us, you know, our goal is to serve customers everywhere. And so as soon as we had capacity to kind of take on the development of that, we added plans for that and built it up.
spk06: Rohan, if I may, just if you can comment from your perspective on how the go-to-market changes you've made are progressing and how that impacts your confidence around the initial fiscal year guide you framed alongside Q3. It's encouraging to see the numbers move up, but just any additional context is useful. Thanks.
spk07: Absolutely. Well, Jay touched on it. I mean, the early indicators from the consumption transformation have been very positive. And we've gone through and made a lot of changes with respect to processes and systems. And, you know, some of the early data points: for example, if you look at the customer adds that we had in Q1, it's the highest we've had in five quarters. Obviously early, but very positive. And then when you look at our Q1 performance in general, we're very pleased with our total revenue performance and particularly our cloud revenue performance and the growth we saw, 45% year over year. And that momentum has actually continued into month one of the second quarter as well. So when you kind of put this into context, Michael, for the full year, we've increased our full year guidance from 22% to 23%. And what has also happened is we've delivered strong Q1 results with a strong guide for Q2. So that has somewhat de-risked the second half of the year in a manner which I'm candidly very happy about. And more importantly, when you look at the growth rates, first half versus second half, that was obviously a point of discussion same time last quarter. Now we're looking at flattish because of the de-risked nature of our first half performance. So overall, I would say early indication is very positive. We feel good about our full year guide and obviously happy about our Q1 performance.
spk06: Very clear. Thank you.
spk05: Thanks, Michael. We'll take our next question from Gregg Moskowitz with Mizuho, followed by Needham.
spk02: Hey, thank you for taking the questions. Jay, obviously there's a lot of buzz in the industry around Apache Iceberg. So once TableFlow becomes GA, what are your expectations on the adoption curve among your install base? Also, do you think that it will help you land new logos as well?
spk11: Yeah, yeah, I think it will.
spk10: We were actually expecting a fair amount of enthusiasm around this. As you said, there's a great deal of buzz around Iceberg. Despite that, I think we were actually surprised by how widespread the interest was. We felt like, well, in many ways, sometimes the analytics environment is kind of, you know, a little bit to the side of the team that we naturally serve. We weren't sure if they would have a direct interest in that, but in fact, it's been a huge draw and topic of discussion in almost every conversation that I've had with customers. So now it's on us to, you know, deliver a GA product. This is just the, you know, first step in that journey. So, you know, it'd be too early to project the rate of adoption or, you know, revenue contribution or whatever, but we feel that it has a ton of potential as it comes out onto the market. And as I said, it does kind of align with our business in, I think, a really fantastic way. Confluent is very much about opening up data and sharing it across an organization. And this is a fantastic mechanism for us to do that. In many ways, the fragmentation of the analytics market made it hard to deliver data in the volume that we would like across all the different systems there. And this really helps with that. And our ability to integrate that directly into Kora and offer that data in a very natural, low-friction, low-overhead way, I think, is a great competitive differentiation. And I think a huge boon to that area as well, where one of the challenges in that environment is always getting access to high-quality, reliable data that's up to date. So yeah, I think we're very excited about it. It's still early and we have to go finish the delivery of the product and make all the customers successful.
spk02: Very, very helpful. And then either for you or for Rohan. So we've spoken before about the potential to do a lot more business with SIs going forward. You know, the new Accelerate with Confluent program, you know, will that or can that be a difference maker in your view? And if so, why?
spk10: Yeah, Rohan, you want to take that?
spk07: Well, I would be happy to. Gregg, I mean, we've called it out before as well. When we think about the broader partner ecosystem, I'm kind of up-leveling a little bit. That's an opportunity which is in early innings. Most of the opportunity is ahead of us. So specifically around SIs and the Accelerate program that you called out, absolutely, that's an opportunity for us to drive revenue. But again, it's early days, so it's not something you're going to see next quarter or next month. It is a huge opportunity for us, and we're working very hard to make sure that we're taking advantage of that opportunity. So long story short, I think SIs and in general GSIs and the partner ecosystem will continue to be a focus for us as we look ahead. Thank you.
spk05: Great. We'll take our next question from Mike Cikos with Needham, followed by TD Cowen.
spk08: Hey, thanks Shane, and thanks Jay, thanks Rohan. I wanted to come back to the multi-product adoption that you guys are citing today. And what I was thinking through, and I just wanted to stress test this: is it fair to think that your shift in this go-to-market to prioritize consumption over commitments is actually driving faster adoption across the Confluent platform? Or is it still too early to start seeing this in the numbers? Has this been a slow go, with more to come as a result of this go-to-market transformation you guys have put in place?
spk10: Yeah, it's actually a very good observation. So this is a subtle point. But previously, the field team really sold commitments, which was just dollars. And so the incentive to drive adoption of these DSP components was much less, right? Of course, if the customer adopts Flink, they'll consume more. As it comes time for them to renew next year, they might commit to more. But that payoff could be a year out. It's somewhat delayed. In a consumption world, of course, as soon as the consumption ramps higher, the compensation arrives immediately. And so that payoff is much more immediate. And so the consumption transformation was actually quite important for driving adoption of these additional components around Kafka. So in terms of, you know, have we seen that effect? Yeah, I think we've seen an increased focus from our field team on these components. You know, the use cases around that, it's not like it just materializes overnight all in one quarter; that will build. But, you know, getting that model right to be set up for multi-product delivery was actually a substantial motivation for us in doing this more quickly, because we felt we had actually very good offerings now around Kafka, and we wanted to make sure that we were set up to sell them.
spk08: Awesome. And then just to follow up on the go-to-market, a bit of a two-parter here. I guess to start with, Jay, it's interesting. One of the things that I think you guys are calling out is this attach you're seeing from even higher quality customers, despite the fact that you're not pressing on commitments, right? And I would have thought, or the presumption would be, that if you're not pressing on commitments, you might get some lower quality customers. So can you kind of tease that out for folks? I think that would be beneficial. And then I guess the second piece for Rohan: if you could just articulate, I know there was a lot of angst into the first half of this year given the go-to-market changes, but what more is there to do on your front now that you guys are kind of clicking along here with four months and change behind you?
spk10: Yeah, it's a great question. So I'll start with a bit on customers. Yeah, so the change we made on the field side was to directly incentivize the land as part of the comp plan, but not only that, to actually target a set of high-potential customers that we felt were particularly important to land, and compensate even more highly for those, because those would be worth a lot more to Confluent as they ramp to large consumption. And so, you know, what was exciting to us was not only did the volume of customers go up, but then as you said, those customers were actually better targeted into that set of high-propensity spenders than they had been previously. And, you know, I think that's just the direct result of the incentives. And, you know, we had that differentiation because, as you said, we wanted to make sure these are high-quality customer additions that we're picking up. And I'll let you take the second half, Rohan.
spk07: Yeah, Mike, on the second half, this was obviously the first quarter of the transformation. And as Jay called out earlier, the early indications have been very positive, which showed up in our cloud performance for the quarter. And more importantly, when we look at our month one and how we are entering Q2, we feel good that some of the momentum has actually carried on to Q2. So that's good. But in general, we still have a couple of quarters of execution that we need to focus on. But if you ask me how I am feeling, I'd point to what's evident in our Q1 performance and our Q2 guide, which are strong and which we feel are ahead of our expectations. And what that does is the strength in the first half is also de-risking our second half from an overall guidance perspective. So overall, we feel really good with the transformation. But at the same time, we still need to execute a couple more quarters.
spk08: Great. Thank you for that.
spk05: Thanks, Mike. We'll take our next question from Derek Wood with TD Cowen, followed by Bernstein.
spk12: Great. Thanks, guys. Nice to chat with you. Jay, you mentioned having gone through some pricing changes recently. Can you remind us what changes you made and what kind of dividends you're expecting to see as this gets absorbed in the market?
spk10: Yeah, there's a set of changes. Some of these are actually product offerings, which effectively allow better TCO and incentivize the use of our multi-tenant offerings, which are more efficient for us. So we announced Freight clusters at Kafka Summit Bangalore. We announced the Enterprise cluster type, which is a high-performance multi-tenant offering with private networking. We made adjustments to some of the throughput-oriented pricing. So there were a number of changes that came out. All of these were meant to reduce friction in the land-and-expand process. When we thought about this consumption transformation, a big part of it was changes on the field team, changes in our systems, changes in compensation. But I think going along with that, we felt it was very important that there not be a ton of product or pricing friction in that land process. So if we're trying to tell the team to go sell in a way that gets customers up and going, it can't be the case that to get to a reasonable price, there's a six-month negotiation at the very front door of the process. And so those changes have lined up with that. Why do that? It's ultimately because there's a ton of open source Kafka and we want to go soak that up with our cloud offering. We feel that's very important. So kind of growing the breadth of that customer base sets us up for all the growth in those customers over time. And, you know, we do feel like these kinds of changes and new offerings unlock workloads that would have been harder to access. And that comes out of the TCO of the offering, right? We've talked in the past about how Kora is able to really offer a better TCO for customers. And it's important that we make sure we have offerings that cut across all the different workloads they have, so that it's a bit of a no-brainer across everything they do, not just a certain workload type or a certain use case. So that was our goal.
spk12: Yeah, that's a helpful color. And I don't know, for Jay or Rohan, you guys talked about rebound and digital native consumption trends. Wanted to ask about financial services vertical, which is obviously important for you. Just curious what you're seeing there around demand conditions and deal sizes and whether you're seeing much composition change in platform versus cloud.
spk10: Yeah, that's continued to be a strong segment for us. And over the last few years, we have seen a pretty significant ramp-up in Confluent Cloud adoption. And I would say that happened first in the smaller banks, and then over time, that's spread to some of the largest financial institutions. They tend to be a little bit slower to start with a new cloud offering. There's actually very substantial security and reliability scrutiny that goes into the adoption of any part of their stack. But increasingly, we're really a great fit for their use cases, and actually allow them to meet the requirements that they have faster than if they were trying to build this out themselves. And so we've started to see great adoption of cloud in financial services. And I think that's a very promising thing as these very large institutions open up something that is very low friction to consume across their very broad set of use cases. So we're really excited about kind of getting in the front door in a lot of these very large banks.
spk12: Great. Thanks. Congratulations.
spk05: Thank you. Our next question will come from Peter Weed with Bernstein, followed by Guggenheim.
spk01: Thank you very much. Obviously, great to see the continued momentum on the cloud side and the transition to consumption working out kind of as you planned. But I may have missed it, but I feel like we haven't talked very much on the platform side, where I think we saw a sequential step down in revenue. And I wonder, you know, how we should think about some of that, you know, a little bit more weakness there, and whether or not some of that's cannibalization of people moving to cloud, and so it's just some underlying share shifting, or whether or not we should think about slower growth going forward on that side of the business, given that it's an important part of the revenue stream.
spk07: Yeah, you want to speak to that, Rohan? Yeah, I'll be happy to take it. Thanks for your question, Peter. Well, when we look at our Confluent Platform performance, we're very pleased actually. We grew 15% year over year. And when you generally think about the platform business, as a reminder, what happens is about 20% of total contract value is recognized as license revenue upfront. So what that can do is add a little bit of lumpiness in the revenue, purely based on the timing of large deals or the timing of renewals for large deals; those have an impact. But when I take a step back and I look at, say, the last 12 months for this business, we've been very pleased with the overall momentum. And Jay also called out, with respect to product innovation, you know, we launched Flink on-prem, which is obviously going to help this part of the business as well. So, yeah, I mean, you know, listen, we've said that Confluent needs to be wherever our data and applications reside. If it's on-prem, we need to be on-prem. If it's in the cloud, we need to be in the cloud. Just keeping that in mind, you know, we do feel that this is going to be an important part of the business as we look ahead.
spk01: Thank you.
spk05: All right. We'll take our next question from Howard Ma with Guggenheim, followed by J.P. Morgan.
spk14: Great. Thanks for taking the question. Jay, can you talk about some of the alternative options that you're aware of for the transport layer in RAG architectures? I don't believe there's a standard yet. And on that point, do you have plans to establish a more formalized reference architecture program for RAG implementations, and perhaps broader inference use cases too? Really aimed at making Confluent the standard for transport, and transformation as well.
spk10: Yeah. It has been a focus for us, evangelizing this architecture, because as you say, it is something that's just coming into formation now. The reality is I don't think that there are great alternatives for real-time data movement outside of Confluent. I would say we have a strong status as the de facto standard for real-time movement of data across the enterprise. There are opportunities for customers to just try and build it in batch; there's plenty of batch ETL products. The reality is that for a lot of these use cases, they're answering questions about the business, and batch is really just not good enough. For a lot of these use cases, it's something that's, you know, customer support related, or in other words, you're driving some aspect of the business where answering with out-of-date information is very likely to be wrong relative to what the customer was just doing. And so we are seeing a real push towards real time, and yeah, it's on us to make sure that, as that stack solidifies, we have a permanent position in it.
spk14: That's great. And maybe I can slip in one more, just on the topic of open source Kafka conversions. Can you talk about any progress that you're seeing with the Confluent Migration Accelerator tool, I believe it's called? And is that increasing your wallet share among the Fortune 500? And to what extent are partners using that tool?
spk10: Yeah, we're just ramping that up. So, you know, somewhat surprisingly, we haven't had a really focused effort on these migrations. It's been, you know, somewhat more one-off, customer by customer. And so both in terms of tooling and with our partners, we're creating a focused effort to move customers over. As you can imagine, in any of these situations where there's kind of a better TCO alternative, but some effort that's required to make the switch, you want to reduce that effort as much as possible and make it really easy for customers to get from point A to point B. So I think that's just coming into being now. We believe that'll contribute over the next few years. Great. Thanks so much.
spk05: Thank you. We'll take our next question from Pinjalim Bora with J.P. Morgan. Pinjalim.
spk04: Hey, thanks for squeezing me in. Congrats, everyone, for the quarter. One clarification. So help me understand how broad-based the cloud consumption ramp was. I heard it was driven by a select set of customers, so I wanted to clarify. And I wanted to understand if some of the new AI customers that you recently added materially contributed in the quarter.
spk07: Yeah, you want to take it, Rohan? Yeah, happy to. Hey, Pinjalim, thanks for your question. So, you know, when you look at our cloud performance, I'd put it in maybe two categories if I had to call out the performance for Q1. The first one is, when you look at our broad base of customers, we did see stabilization in consumption and in net new use cases. And the digital native segment is included in there. So that's good. That's the broad base of our customers. And the second call-out was some of our newer customers. We've seen the ramp-up of these newer customers, which is something we're very pleased about. And the Gen AI customer that you spoke about is probably in that cohort of customers. A few of them have ramped, and the ramp schedule looks in line, and we're pretty happy with that. And that's for Q1. And as we enter Q2, most of these trends have continued into month one of Q2, which has informed our guidance for Q2 as well. So that's the overall context around the consumption patterns.
spk04: Yeah, thank you for that. One question for you, Jay. We have been picking up on this notion that Flink SQL being SQL, which is understood by almost every developer, kind of opens up the aperture versus a skilled set of Java developers or something else — bringing in more developers to do more Flink, and then Flink additionally driving more Kafka. That kind of creates a little bit of a flywheel. Are you starting to see some of that?
spk10: I like this question. I mean, this question sounds like my answer already. Go ahead, go ahead. No, no, no. Please answer. Are we starting to see that? Yeah, we are. Yeah, I mean, our goal is to open up the full set of APIs. So the first thing we launched was SQL. Our intention is to bring out Java and Python APIs as well. We think they serve different use cases. There's a set of kind of core applications that will probably always be in these, you know, more application-oriented programming languages like Java. There's a set of more dynamic use cases and transformations which are well suited to SQL. One of the powerful things about Flink is kind of opening up that broad set of tools all on top of a core engine. You know, I think that's one of the things that's made it the leader in stream processing. And as we do that, yeah, our goal is very much to make this easier and easier to use. You know, for a long time, I think it's been the case that customers would prefer real-time data. They would rather work with apps that updated in real time, that reacted in real time. They would rather be able to connect things in real time. Nobody wants the data to be slow. It's actually just been difficult to do that. So making this really easy is kind of a core way of enabling this. There's an obvious benefit if you can make it not more costly and not more complicated for customers. So when you see us kind of focusing on both this ease of development and TCO-oriented things, that really is the kind of core thing that drives this. And as we do that, we think there's a huge opportunity for this whole set of batch data movement and batch processing that really needs to move, and will move, as the alternative becomes appealing because of that ease of use and TCO.
spk05: Got it. Thank you. All right, thanks. So as a reminder, the Confluent Earnings Report is now on our IR website. The report contains our earnings infographic, our one-pagers on our technology, the prepared remarks, and earnings slides from today's call. We encourage you to go take a look. And today, our final question will come from Miller Jump with Truist Securities.
spk13: Right. Great. Thank you for taking the question. And I will echo my congrats on the strong start. So, you talked about the strength in governance, and I'm just curious: is the need to get your data estate ready for AI driving more conversations there? And then maybe if you could just remind us what that opportunity looks like, maybe on a unit economics level — if you're spending a dollar on streaming, you know, what does that look like for governance?
spk10: Yeah, yeah, it's a great question. So yeah, AI is definitely one of the drivers. I would say that there's a whole set of forces that have driven interest in data governance. You know, one of those is just the kind of rising compliance regime around data. You know, GDPR is the start, but there's a long list of things that organizations have to do. The second is around just the safety of data. The third is actually around opening it up. Those first two are maybe things you have to do. But in order to really take advantage of data, it has to be the case that the right team can find the right data set and know what it means at the right time. That kind of discovery process and documentation is actually really critical to the integrity of data as something that customers can build around and against. And then as you say, you know, all of that, I think, has been supercharged by AI, where you have a set of applications that are much more data-rich, you know, draw on many more data sources across an organization than a traditional enterprise app might. But in order for that to work well, you have to know what's going where. Is it up to date? Is it getting there in the right way? Is it supposed to be there at all? And, you know, managing all of that has just gotten harder and harder. And managing it on top of some crusty set of old bespoke pipelines is trending towards impossible. And I think that's one of the things that has driven the rise of data streaming. And the nice thing for us is the ability to bring these governance capabilities kind of right there with the platform, so there's not extra effort to go and adopt this use case by use case. The data is naturally tracked as it flows. You have the lineage of what goes where. You have strong schemas that allow the creation of these data products that are shared across an organization.
This is a really powerful thing for customers as they think about how they use this technology in the large, and how they really take advantage of the data they have to better serve customers and be more efficient. And on the unit economics, yeah, this will change over time as that product line develops. Right now, it is kind of a step up, with some additional usage as you use it more broadly. I think we're adding more and more functionality around the encryption of fields of data, you know, around other aspects of how you use and analyze data. And I think that will increase the monetization over time. I think it's too early to call the, you know, kind of final end-state ratio, probably for any of these offerings. But we do think that this will be a sizable business for us.
spk13: That is helpful. Thank you. And if I could squeeze in one quick one for Rohan, any gross margin changes to consider as these use cases outside of streaming start to scale?
spk07: Yeah, from a gross margin perspective, what we've said, Miller, is essentially that our long-term target is 75-plus percent gross margin. We are operating well above that, and it's been consistently above that. So as we look ahead for, say, the rest of the year, we expect to be in the same zip code on gross margins. So not a whole lot to call out there with respect to any impact one way or the other on gross margins.
spk05: All right. Thanks for all the questions. This concludes our earnings call today. Thanks again for joining us. Bye, everyone. Thanks, everyone.
spk07: Thank you.
Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.