Confluent, Inc.

Q3 2022 Earnings Conference Call

11/2/2022

spk03: Hi everyone, welcome to the Confluent Q3 2022 Earnings Conference Call. I'm Shane Xie from Investor Relations, and I'm joined by Jay Kreps, Co-Founder and CEO, and Steffan Tomlinson, CFO. During today's call, management will make forward-looking statements regarding our business, operations, financial performance, and future prospects, including statements regarding our financial outlook for the fiscal fourth quarter of 2022, fiscal year 2022, and fiscal year 2023. These forward-looking statements are subject to risks and uncertainties, which could cause actual results to differ materially from those anticipated by these statements. Further information on risk factors that could cause actual results to differ is included in the most recent Form 10-Q filed with the SEC. We assume no obligation to update these statements after today's call except as required by law. As a reminder, certain financial measures used on today's call are expressed on a non-GAAP basis. We use these non-GAAP financial measures internally to facilitate analysis of financial and business trends and for internal planning and forecasting purposes. These non-GAAP financial measures have limitations and should not be considered in isolation from or as a substitute for financial information prepared in accordance with GAAP. A reconciliation between these GAAP and non-GAAP financial measures is included in our earnings press release and supplemental financials, which can be found on our investor relations website at investors.confluent.io. And with that, I'll hand the call over to Jay.
spk10: Thanks, Shane. Welcome everyone to our third quarter earnings call. I'm pleased to say that Confluent delivered another strong quarter with results exceeding the high end of all our guided metrics. Total revenue grew 48% year over year to $152 million. Confluent Cloud continued to be the fastest growing area of our business with revenue up 112% year over year to $57 million. We are also continuing to increase our operating leverage with a 14 point year over year improvement in non-GAAP operating margin. Despite the pressure from a more difficult macroeconomic backdrop, we think these strong and consistent results are a testament to our ability to drive durable and efficient growth. The rise of data streaming is one of the most fundamental shifts in the world of data. Soon it will be hard to imagine a time when companies didn't have ubiquitous access to real-time data and the ability to react to it instantaneously. Apache Kafka has emerged as the de facto standard of this movement. Hundreds of thousands of organizations, including more than 75% of the Fortune 500, use it every day for critical use cases across their business. But there are also incredible things happening within the larger data streaming ecosystem, including an extraordinary number of new use cases and technologies. In October, we hosted Current 2022, the first ever industry event for data streaming. It put Confluent at the center of the ecosystem and brought together over 4,000 attendees, hundreds of ecosystem partners, and more than 50 sponsors to learn, network, and explore the future of data streaming. In today's call, I want to give a view of this emerging data streaming category and explain how it relates to some of the legacy technologies it replaces. We believe data streaming represents a major new data platform that has the potential to be as broad in scope as databases have been. However, many of the legacy products in the space have been quite limited in adoption and scope. How can we square the skyrocketing adoption of Kafka, the rapid success of Confluent, and the expansive view most technologists have about streaming with the more limited success previous tools in the space have had? To answer that requires a brief excursion through the legacy technologies that Confluent displaces. Today, we can broadly think of three major estates of software and data. First, custom applications that businesses build from scratch and the operational databases that support them. Second, the hundreds of SaaS applications like Salesforce and Workday that address common yet critical needs for business functions. And finally, analytic systems that improve decision-making. Each of these areas has grown a set of ad hoc, fragmented tools for integration and a primitive kind of data in motion. Let's review this previous generation of tools and then discuss how it is being displaced by data streaming. Custom applications communicate using message queues, database change capture products, ad hoc APIs, and enterprise service buses. These technologies fit the real-time requirements of application workloads, but were impossible to scale, low level and labor intensive to work with, and limited in their application. SaaS applications, meanwhile, grew their own set of tools, including proprietary application integration platforms, business process management tools, and bulk file transfers.
These tools achieved some success in their domains, but were again limited in scalability, unable to handle complex data transformations or logic, fragile to work with, and disconnected from modern development platforms. Finally, businesses drive data into analytical systems primarily through a combination of legacy ETL or ELT tools, as well as pre-processing and data lakes. These tools support rich transformations but are stuck with slow batch processing that makes the data hours or days late by the time it arrives. These tools are all flawed in different ways. They are either slow and batch-oriented, non-scalable, require significant maintenance from centralized teams, or are unable to work with more sophisticated data and processing. But more importantly, the critical problem is that in a modern company, all three of the major estates of data must be highly connected. The custom applications must interconnect with the off-the-shelf applications and the analytics applications. Consider a simple example of a modern retailer. Data about what is selling is needed by dozens or hundreds of custom applications, SaaS applications, and analytics platforms. Having to create point-to-point plumbing across a dozen different tools for each use case is simply not a feasible or scalable approach. Data streaming works by rethinking this problem from the ground up. Whereas the previous generation of tools were ad hoc and limited to a narrow domain, data streaming starts with a broad foundational approach. It takes the core architectural concepts of a database, such as a ledger of changes, transactional guarantees, horizontal scalability, and easy dynamic data transformations, and translates them from the world of data at rest to data in motion. We believe the result is something vastly more general and powerful than any of the previous solutions. It consolidates the fragmented ecosystem of integration tools with a solution that can achieve all the capabilities each of the previous tools could not. It's real time, it's horizontally scalable, it's transactionally correct, and it provides an open, programmable platform. This provides a solution that is better in each of the domains than the previous generation of systems. But its key strength is that it treats data in a reusable manner. A single stream of data can feed all use cases, whether custom applications, SaaS applications, or analytics stores. This allows vastly more simplicity and reuse than previous solutions. But the power of data streaming goes well beyond the integration technologies of yesterday. Because it starts with an open, programmable foundation, it is not limited to building pipelines. The stream processing capabilities in Kafka allow any application logic, whether for data transformation or applying smart business rules. This is what has made Confluent a foundation for developing real-time applications that react to the stream of business events continuously. The real-time applications that organizations can build with Confluent are limitless. They include fraud detection, fleet management, customer 360 platforms, real-time inventory management, and many more. Thus, while each of the legacy technologies was limited in scope and adoption, data streaming has much broader potential, as it both consolidates this landscape and expands well beyond it.
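[Editor's note: to make the stream processing point above concrete, the following is a minimal sketch, not taken from the call, of what an application-level business rule on a Kafka stream could look like using the Kafka Streams Java DSL. The topic names, application id, broker address, and flagging rule are all hypothetical placeholders rather than Confluent-provided examples.]

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class PaymentFlagger {
    public static void main(String[] args) {
        // Basic configuration; the application id and broker address are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-flagger");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // One shared stream of payment events (hypothetical topic name).
        KStream<String, String> payments = builder.stream("payments");

        // Operational use case: apply a simple business rule in real time and route
        // matching events to a downstream topic. The rule here is a stand-in for
        // real fraud-detection logic.
        payments
                .filter((accountId, eventJson) -> eventJson.contains("\"suspected_fraud\":true"))
                .to("flagged-payments");

        // The same stream can also feed analytics untouched, illustrating the reuse
        // of a single stream across multiple use cases described above.
        payments.to("payments-for-analytics");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

[In a managed setting this logic would point at a hosted cluster rather than a local broker, but the programming model is the same.]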
Indeed, considering the three estates of data I mentioned before, the custom applications, SaaS applications, and analytic systems, it's worth noting that each of these is a repository of stored data, that is, data at rest. However, equally important is the data in motion for which Confluent is providing the underlying platform and foundation. We believe this data in motion represents a fourth estate of data which will be equally critical to the operation of a modern business. This background provides a good framework for contextualizing a few exciting product releases from Confluent. While Confluent has revolutionized the underlying infrastructure for integration with data streaming, some of the legacy tools still had one advantage. Whereas Confluent was primarily a programmatic tool, many of the legacy tools were low-code or no-code GUIs, which, while limited in power, were easy to learn and use. This is what makes me so excited about our recent announcement of Stream Designer. Stream Designer brings a dead simple UI for building pipelines, familiar from other integration tools, but it does so on top of our modern data streaming platform. Stream Designer is the first drag-and-drop visual builder to rapidly build and deploy streaming pipelines natively on Kafka. It integrates with the core capabilities of our platform, Kafka, Connect, stream processing, and governance, to make building mission-critical data pipelines simple. Stream Designer also makes deploying streaming data pipelines accessible to more people throughout an organization, including people less familiar with Kafka. Users with different skill sets don't have to give up the power of the underlying infrastructure either. They can seamlessly switch between the UI, code editor, and a command line interface to quickly and declaratively build data pipelines. Back in 2021, we mentioned the wide variety of up-the-stack use cases we are uniquely positioned to address. Stream Designer represents our first step in that direction and lets us serve the set of use cases broadly characterized as data pipelines. This isn't the end of the story, though. Having one layer where data flows throughout the organization enables Confluent to add additional value. As data streaming use cases grow and real-time data flows more freely across the business, it's critical that this data can easily be discovered, understood, and governed in real time. Stream Governance Advanced does exactly that. The newest capability in our Stream Governance suite makes governing mission-critical workloads at any scale more reliable with 99.95% uptime. And the ability to add user-generated business context makes it easier to find the data that's needed to power new use cases. Now customers can more easily scale the power of data streaming from individual projects to central nervous systems for their business. Taken together, Stream Designer and Stream Governance Advanced are powerful examples of the fundamental paradigm shift occurring with data streaming. They also show our unique ability to build out-of-the-box solutions on top of our platform that reach a broader set of customers, accelerate their adoption, and grow our addressable market over time. What were once thought of as different software categories are today consolidating into one much more general, powerful, and valuable market: data streaming platforms. Next, I'd like to touch on something I mentioned in my opening remarks. Confluent is set up to drive durable and efficient growth.
We've seen fantastic momentum to date, but what's most exciting is our approximately $60 billion market opportunity in front of us. We've started to demonstrate both the breadth and depth of this opportunity. The breadth is captured by the massive adoption of Kafka, which provides a large installed base for new customer acquisition through self-service on Confluent Cloud. This is foundational to our strategy of converting open source users and landing greenfield customers in high volume. We've also shown the depth of the value of these opportunities with our ability to expand rapidly after we land a customer. This is evident in our best-in-class large customer ratio, where 22% of our customers have an ARR of $100K or more. But at the same time, we believe there are still strong expansion opportunities with our largest customers, including those with ARR of $10 million or more, particularly as we make it easier to connect and consume data throughout the platform. We also continue to benefit from the secular move to the public cloud, particularly in an environment where there's increased pressure for organizations to run their businesses more efficiently. Our cloud-native platform significantly simplifies operational complexities and reduces total cost of ownership, saving valuable engineering resources allocated to manually building and managing lower level infrastructure tools like Kafka. A new Q3 customer is a great example of the cost savings of Confluent Cloud. Armis is a leading cybersecurity platform for connected devices that enables its customers to discover and secure their IT, cloud, IoT, and edge assets in real time. Today, Armis tracks over 3 billion devices for its customers, from printers, laptops, and mobile devices to connected medical devices and factory equipment. Kafka is a central part of their business, responsible for ingesting data bidirectionally from billions of devices to provide real-time protection and policy enforcement. But with the rapid growth of its business and the proliferation of connected devices, the cost and overhead of managing open-source Kafka was unwieldy. This quarter, Armis turned to Confluent Cloud for a cloud-native Kafka service that can scale alongside its business. Confluent will be the central nervous system for Armis' data streaming platform, managing data from billions of devices in real time, all while enabling them to reassign 70% of the expensive engineering talent previously focused on Kafka to projects that move the needle for the business. The durability of our growth is also reflected in our ability to rapidly expand once we land a customer. A great illustration of this dynamic is one of our largest Confluent Cloud customers. As one of the highest trafficked job websites in the world, this customer sends more than 4.5 gigabytes of data per second through Confluent Cloud. Kafka was a no-brainer choice to start their data in motion journey, but they soon found themselves spending too much developer time managing Kafka. Our commercial relationship started with a small deal in 2020 for a single use case in a single business unit. As that pilot proved successful, we landed a $1 million plus deal that expanded Confluent Cloud to more business units across the company. Inspired by our platform's extensive capabilities and an accelerated move to the cloud, our customer reimagined their data architecture in 2021, leading to our first multi-million dollar deal with this customer.
As Confluent became a critical unified data layer across the organization, their annual spend surpassed $8 million. As you can see, what often starts as a small land for a single use case can rapidly expand to a large customer in just two years' time. But we believe we are still at the beginning of a great partnership as use cases and streaming data become more pervasive throughout the organization. Turning to efficiency. On a year-over-year basis, we improved non-GAAP operating margin by 14 points in Q3 and 8 points in Q2. We are pleased with the substantial margin improvements we've driven this quarter and excited that there are significant opportunities to continue these improvements. We're making substantial progress in creating strong connective tissue between our product-led and enterprise sales motions to help accelerate our customers' time to value. And we are still early in leveraging our partner ecosystem and bringing to bear a solution and industry focus. As an eight-year-old company, we believe our go-to-market model will drive differentiation and separation from our competitors, which will generate greater leverage and efficiency in our model over time. Looking forward, we remain confident in our ability to achieve positive non-GAAP operating margin when we exit Q4 2024, Confluent's 10-year mark as a company, a profitability timeline comparable to many of our successful high-growth peers. And finally, we're pleased to announce that Ray Perez has joined Confluent as our Chief Customer Officer. Ray joins us from New Relic, where he most recently held the role of CCO, leading the Solutions Engineering, Solutions Architecture, Enablement, and Expert Services teams. We'd also like to thank Roger Scott for his leadership and impact while at Confluent and wish him the best in his next chapter. To summarize, we have entered the data streaming era. Kafka is at the center of this movement, but represents just the foundation of the emerging platform. Our recent releases of Stream Designer and Stream Governance are great examples of this and show how Confluent is moving up the stack to help our customers connect, process, store, govern, and share data from across their businesses. We believe this model will be the basis to drive continued durable and efficient growth for our business and allow us to capture the lion's share of our large market opportunity ahead. With that, I'll turn the call over to Steffan to walk through the financials.
spk11: Thanks, Jay. In Q3, we beat our revenue and bottom line guidance as we've done in each quarter since going public. Key highlights include strong top-line growth, including the largest sequential revenue add for Confluent Cloud to date, robust expansion of large customer cohorts, which translated to another quarter of greater than 130% net retention rate, and a 14-point year-over-year improvement in non-GAAP operating margin. These strong results reflect that our market-leading data streaming platform continues to resonate with customers despite a cost-conscious environment, and we continue to demonstrate our ability to drive durable growth and improve efficiency and profitability. Turning to the detailed results, I'd like to note all comparisons are on a year-over-year basis unless otherwise noted. RPO in the third quarter grew 72% to $663.5 million. Current RPO, estimated to be 62% of RPO, was approximately $408.2 million, up 59%. Total revenue grew 48% to $151.7 million. Subscription revenue grew 50% to $138.7 million and accounted for 91% of total revenue. Within subscription, Confluent Platform revenue was $81.8 million, up 25%, and accounted for 54% of total revenue. Confluent Platform remains an important component of building a central nervous system for our customers and continues to drive upsell and cross-sell opportunities for Confluent Cloud. And we saw another record quarter of sequential revenue add for Confluent Cloud, up $9.9 million sequentially and up 112% to $56.9 million, representing 38% of total revenue compared to 26% of revenue a year ago. Cloud accounted for more than 60% of new ACV bookings, and Q3 marks the fourth consecutive quarter where Cloud accounted for greater than 50% of new ACV bookings. As cloud accounts for a larger share of new ACV bookings, Confluent Platform will have lower ACV and less upfront revenue. Confluent Cloud momentum was driven by our strong product advantage, our continued focus on decreasing time to value, and use case expansion, leading to robust consumption across a broad spectrum of verticals. Additionally, customers run their operational workloads on Confluent, and these workloads are directly responsible for driving the core operations of our customers' business, which reflects the mission criticality and resiliency of our data streaming platform. Turning to the geographic mix of revenue: revenue from the US grew 44% to $95.1 million. Revenue from outside the US grew 56% to $56.6 million. On our last earnings call, we called out some deals that were taking longer to close in Q2 due to additional scrutiny in pockets across geographies. While this dynamic has continued, I'm pleased to report that the vast majority of those deals were closed as expected in Q3. Turning to customers, we added 120 net new customers, ending the quarter with approximately 4,240 total customers, up 40%. We're pleased with the improved sequential growth despite the continued impact of paywall removal. This strategic move removed the payment friction for developers to test drive Confluent, and it continues to have a positive effect on new signups. As a reminder, we expect the impact on total customer count will take a few quarters to work through, and new pay-as-you-go customers do not have a material contribution to our revenue in any given quarter. The growth in our large customer base continues to be robust.
Customers with $100K or more in ARR grew 39% to 921, representing 22% of our total customer count, and these customers contributed more than 85% of total revenue in the quarter. Customers with $1 million or more in ARR grew 53% to 113. As discussed at Current, we have a growing number of customers with $5 million plus and $10 million plus in ARR, and we see strong expansion opportunities across our customer base, including these two largest customer cohorts. Dollar-based net retention rate in the quarter remained above 130% for the sixth consecutive quarter, driven by 90%-plus gross retention and strong expansion across Platform and Cloud. Importantly, NRR for Cloud continued to be higher than the company average, and NRR for hybrid customers remained the highest. A period of tough economic times is when the real durability of demand for our product is tested, and we think our consistent and strong NRR is a testament to our TCO advantage and the mission criticality of our use cases. Moving on to gross margins, I'd like to note that I'll be referring to non-GAAP results unless stated otherwise. Total gross margin was 71% and subscription gross margin was 76.9%. Our focus on improving the unit economics of our cloud offering continued to pay off, driving another quarter of healthy gross margins despite a continued revenue mix shift to Confluent Cloud. In the near term, we anticipate total gross margin to fluctuate near our mid-term target of approximately 70%. Turning to profitability and cash flow, operating margin improved 14 percentage points to negative 28%. The improvement was primarily driven by improved sales and marketing efficiency and our balanced approach of investing in the highest ROI projects while continuing to proactively manage spend across the organization. Net loss per share was negative 13 cents using 282.3 million basic and diluted weighted average shares outstanding. Free cash flow margin was negative 30%, in line with our expectations. As discussed last quarter, we changed our annual bonus structure by moving a $13.5 million payout into Q3 '22 from Q1 '23. The bonus payment had a negative impact of approximately 9 percentage points on free cash flow margin in the quarter. And we ended the third quarter with $1.94 billion in cash, cash equivalents, and marketable securities. Now I'll turn to our outlook. We are raising our revenue and bottom line guidance for Q4 and FY22. The magnitude of the raise incorporates what we've experienced since June, where deal cycles are elongated due to the additional scrutiny on budget approvals. Our forecast assumes that this macro dynamic persists in Q4. For the fourth quarter of 2022, we expect revenue to be in the range of $161 to $163 million, representing growth of 34% to 36%; Confluent Cloud sequential revenue add to be in the range of $9.8 to $10 million; non-GAAP operating margin to be approximately negative 28%; and non-GAAP net loss per share to be in the range of negative 16 to negative 14 cents, using approximately 286 million weighted average shares outstanding. For the full year 2022, we expect revenue to be in the range of $578 to $580 million, representing growth of 49% to 50%; non-GAAP operating margin to be approximately negative 32%; and non-GAAP net loss per share in the range of negative 65 to negative 63 cents, using approximately 280 million weighted average shares outstanding. Looking ahead, while we're still in the midst of annual planning, I'd like to provide a preliminary outlook for next year.
For the full year of 2023, we expect revenue to be in the range of $760 to $770 million. Incorporated in the preliminary revenue outlook is a negative impact of $12 to $17 million stemming from the increased scrutiny on deal approvals, and we're assuming that the overall macro dynamic that we see today will continue to persist throughout next year. And we expect non-GAAP operating margin for the full year 2023 will be approximately negative 21%. We will continue to invest with discipline, focusing on the highest ROI segments of our business to drive efficient, high growth. We'll monitor and course correct if the macro conditions change materially and action is warranted to ensure we meet our margin targets. In closing, our strong Q3 results are another proof point of our ability to drive high growth with increased efficiencies. The demand environment for data streaming remains strong despite the macro dynamics around deal approvals. Our Confluent Cloud momentum reflects our TCO value proposition, the differentiation of our use case driven consumption model, and the mission critical nature of our cloud native platform. Looking forward, we're well positioned to drive efficient digital transformation with high ROI for our customers. Now, Jay and I will take your questions.
spk03: All right, thank you, Steffan. To join the Q&A, please raise your hand on Zoom and when you're selected, please be sure to unmute and turn on your camera. We will now pause a few seconds to assemble the Q&A roster. And today, our first question will come from Phil Winslow with Credit Suisse, followed by William Blair. Phil?
spk05: Great. Thanks for taking my question, and a really strong bookings quarter, but also delivering pretty impressive operating leverage. Jay, one of our takeaways from Current 22 is that for large tech companies, open source DIY simply wasn't really scaling for them. And they were therefore increasingly looking at Confluent Cloud. Now, I also appreciate your customer example on this call today. So with IT budgets as a whole coming under pressure due to just macro, how is this changing conversations with those open source Apache Kafka users in terms of TCO and potentially moving those to Confluent Cloud? And then I just have one follow-up for Steffan.
spk10: Great, great observation. That's exactly right. So, you know, the TCO angle of our cloud offering has been part of our sales process for a while now, but it has kind of come to the forefront. And, you know, it's one of a number of advantages of moving to, you know, a true cloud offering, right? You get a more complete product, you get something that's really cloud-native and elastic and expands as you need it, and then it's actually just a better deal. We think that's compelling. We see those, call them large tech customers, that have a big installed base of Apache Kafka as a huge opportunity for us. They're already using the open source. You know, in many cases, they started before there was a Confluent or before there was a Confluent Cloud. But now that we have a cloud offering that really works well at scale and is cost effective, it kind of meets the needs of even very demanding customers. We're very optimistic about that segment.
spk05: Awesome. And then Steffan, it was great to see that sequential increase in Confluent Cloud. And I remember you flagged that two quarters ago. Could you just remind us of the seasonality that you see there, kind of Q4, Q1, just the trend there? Because obviously the bookings were strong again at a high level this quarter, but just maybe help us think through, call it, the seasonality in terms of translating that into Confluent Cloud revenue.
spk11: From Q3 to Q4 or from Q4 to Q1?
spk05: Both, actually.
spk11: Yeah, so sure. So we were very pleased with the fact that we were able to deliver the highest sequential net add in Q3 of $9.9 million for Confluent Cloud. So that came in above our expectations. And as we think about Q4, we factored a couple of things into our guide for the $9.8 to $10 million sequential increase. Two main factors: one is we have a very robust consumption modeling system that we use to help drive that, and then we also learned from last year that at the end of a calendar year, some companies aren't as aggressive in terms of deploying new workloads in Q4 as they look to lock down systems for the end of the year. So we took that into consideration. We're very pleased, though, with the $9.8 to $10 million guidance range for the quarter. And then Q4 to Q1, typically most companies that have a hybrid revenue model would see a sequential net add from Q4 to Q1, but not at the same rate as the prior quarter. And while it's a little bit too premature to talk about specific quarterization of what FY23 looks like, we wanted to share that dynamic with you as well, because when you look at the overall 760 to 770 range and the growth rates that we're posting, Confluent Cloud is going to be a continued driver and it's going to continue to account for probably two to three points of additional revenue share per quarter throughout next year. Awesome. Keep up the great work. Thank you. Thanks, Phil.
spk03: Thanks, Phil. We'll have Jason Ader from William Blair next, followed by Morgan Stanley.
spk07: Thanks, Shane. Hey, guys. I guess my question, well, first I want to say, great job on explaining the power of Kafka. I thought you guys did really an amazing explanation, really simple and articulated the history, I think, extremely well. My question is on consumption and whoever wants to take it, but we've seen some other companies, obviously, Azure, AWS, Mongo, get hit by this sort of
spk10: consumption impact related to macro. It doesn't seem like it's affected you guys as much. Maybe can you talk through that and why? Yeah, yeah, you know, it's an interesting one. I do think this is more about the individual product or products than it is about just consumption models. You know, I think no matter what the business model is, when there's tight economic times, customers look at, hey, are we really getting the value out of this? Is there a way to use less seats, less instances, less servers, or in a consumption model, can we dial down overall usage in some way, right? I think the reason that you see less of that with Confluent, you know, a few things. Like one, we tend to serve mission-critical use cases, and there's a lot of use cases, right? So the opportunity to expand is very large, and that counterbalances any headwind from optimization that we've seen. Beyond that, I think these mission critical use cases are important. You're not going to turn it off in one quarter. And they usually come out more or less pre-optimized. And so we haven't seen a huge amount of that in our customer base. Obviously, economic pressure does create a headwind. You know, I think the tailwind in this area is strong enough that it just doesn't show up, and that's kind of shown in the overall NRR, you know, that's remained strong, and I think shown in the kind of sequential net add in cloud. And a quick follow-up, Shane, for Steffan, just on percentage of ARR through partners, and in particular cloud partners. Can you give us those numbers, Steffan?
spk11: Well, we don't break those out specifically, but what I can tell you is we saw tremendous growth from all the three main cloud service providers through the marketplaces. We saw the best growth that we've seen in a long time from all three. So when we look at the overall mix of business, primarily where we sell direct to our customers, and oftentimes we fulfill through the channel, but the marketplace business that we do with the three cloud service providers is definitely an accelerant, and it's been a bright spot for us specifically this past quarter.
spk07: And it has better unit economics for you?
spk11: It does, for sure. Thank you.
spk03: Great. Thanks, Jason. We'll take our next question from Sanjit Singh with Morgan Stanley, followed by DA Davidson.
spk06: Thank you for taking the questions, guys. Congrats on a very, very solid Q3. Jay, I think over the past couple of quarters you and I have talked about different use cases. I have sort of another take on that. If you sort of aggregate it up in your mind, what percentage of the business, Confluent Cloud or overall, is coming from supporting operational use cases, whether it's just-in-time inventory or dynamic pricing, something that's driving a more operational use case, versus Confluent being used to support more analytical use cases? Is there any way to think about that and how that maybe evolves over time?
spk10: Yeah, yeah, I don't have something more scientific, and there's obviously a set of use cases that kind of blur the lines, where, okay, it's some machine learning app, it's kind of analytics-y, but it does have some feedback loop into production. But I would say the vast majority are kind of operational in nature, right? They tend to be things built by software engineers rather than data scientists. You know, we do have an interaction with the analytics world. We're often kind of a very significant feed into the Snowflakes and Databricks and BigQuerys of the world, but we're not really competing with those technologies or replacing them. I think to them, we would just be an upstream feed, even though obviously in our domain, we're the exciting thing doing data processing.
spk06: Is that part of the, in terms of just the position that you just laid out, is that part of the reason why we're seeing relative resiliency versus some of the other data platforms that are seeing customers optimize and ration more?
spk10: Yeah, I think it's one aspect. I mean, there's a number of things. I think when you peer behind the curtains in any of these companies, I couldn't speculate, you know, company by company, but I do think if you think about operational use cases, this is some chunk of software a software engineer has spent probably the better part of a year, maybe a team of people the better part of a year, building. It doesn't just get turned off and on. So it's not, you know, like when we look at our spend on, you know, reporting, for example, we realize, oh look, we have a bunch of reports nobody looks at, let's delete them; you know, it's pretty easy to do that. That's generally not the case with these kind of more core applications. But, you know, if you look at what happens with consumption, there's a number of things that impact it, including just the particular product and how much traction it has in that space. You know, you can't fold all that into just the space, but I do think that's one important variable. And, you know, I think it's a positive thing to serve these mission critical use cases in this environment. I think these things are typically, like, the very important big bet for the company. It's the thing that's likely to kind of go forward in a purchasing decision. And it's the thing that's, you know, unlikely to get cut when you're looking to optimize. And I think that shows up, you know, both in the continued expansion, you know, the sequential add in cloud, as well as, as we shared, the gross retention number. You know, I think all of it kind of supports that story.
spk06: Makes sense, Jay. And then one for Steffan. Steffan, thank you so much for the color on seasonality and Confluent Cloud as it relates to fiscal year 23. The 12 to 17 million headwind, which I think is primarily from longer sales cycles, any way to walk through how you sort of arrived at that number, and guidance more broadly in terms of setting expectations this early, just given the dynamics of your model? How do you guys sort of, you know, what are the underlying assumptions that got you to 2023 guidance?
spk11: Well, it starts with the track record of us delivering on our commitments throughout FY22, and the basis of the FY22 numbers was the FY22 plan. And right now we are in the midst of doing our FY23 plan. We have very good visibility into multiple streams of future revenue. And then we also took a look at a number of drivers around productivity per head. We look at pipeline, we look at a number of other factors, and then we layer on the macro dynamics that we've seen around sales cycle elongation, and through all that process, we arrived at the $12 to $17 million headwind for FY23 that we're calling out. And look, you know, the environment has been tough. We've been able to go through and execute against what we've committed to folks. And our level of confidence and enthusiasm around FY23 is there. We take things into consideration like the macro and everything else, but at the end of the day, we felt good about the 760 to 770 and the improvement in operating margin. We are definitely focused on balancing growth and profitability, and that has shown up not only this year, but in how we're guiding next year.
spk06: I really appreciate that, Steffan. Thank you so much.
spk03: All right. Thanks, Sanjit. We'll take our next question from Rudy Kessinger with DA Davidson, followed by Cowen.
spk12: Hey guys, thanks for taking my questions. I know you talked about, you know, in the macro, you know, extended deal cycles, etc. I guess I'm curious, and you also talked about, you know, current customers that can't really dial you down. But I guess as you look at your current customers and the pace that they're expanding to new use cases, how has that pace changed over the last six months, given the macro?
spk10: Yeah. You know, when we kind of give that, you know, two to three million of pressure for Q4, as we did for Q3, I think that, you know, takes into account both new use cases, like net new customers, new use cases, as well as expansion. And I think it's relatively similar between both. There's obviously a little more scrutiny on a new customer than on an expansion. The reality is there's a lot of use cases to go after. And so despite that, I think we've shown strong performance on NRR, which is kind of the best roll-up stat that shows what's actually happening. And so I think that is the takeaway, as we're kind of seeing that continue. And then we did talk about, hey, some of these deals that slipped, we felt confident they would close last quarter. And that was borne out. The vast majority of those did close, took a little longer, but we were pleased to see that. And I think that's credit to the strength of the TCO argument, that these things are getting a lot of scrutiny, but they're going through even with that scrutiny.
spk12: Yeah, got it. Steffan, quick follow up for you. Given the commentary on 23, I know you're not giving too granular details yet, but just with cloud expected to increase two to three points as a percentage of the mix, is it right to think that platform growth is really going to start to level off? I think that forces you into about, you know, high single digits to low double digits growth on platform. Does that sound about right?
spk11: We'll be providing more color commentary around guidance and mix on our Q4 call, but by definition, given the mix shift that we've already seen this year with cloud becoming a greater portion of new ACV, that means that Confluent Platform is reducing in terms of percentage of ACV, and we see that trend continuing. So growth will be moderating for Confluent Platform. With that said, Confluent Platform is incredibly important, and it's important for the solution that we're providing. Customers need to be running their data in motion solution everywhere. That means on-prem, hybrid cloud, multi-cloud. And so Confluent Platform is going to continue to be a very important part of the story, but growth will be moderating a little bit as we've seen this mix shift. And by the way, the cloud business continues to overperform our expectations, and it's really being customer led. And we feel like we're still in the very early stages of the growth curve with cloud. Got it. Thanks, guys.
spk03: Thanks, Rudy. We'll take our next question from Derek Wood with Cowen, followed by Wells Fargo.
spk09: Great. Thank you for taking my questions. Great job on a solid quarter. I guess, Jay, just to start with you, going back to the topic of this mindset shift from DIY to cloud, that was certainly something we heard loud and clear at your conference. Two questions about this. One, as this shift happens, how does this change the competitive landscape and kind of the shortlist of vendors that companies may look at? And two, how are you changing your go-to-market strategy from a direct and indirect sales standpoint as you really try to lean more into the cloud?
spk10: Yeah, both are excellent questions. I think this is a very significant thing that's happening. It's happening in the big tech companies where they have heavy investment. It's happening in mainstream enterprise. Companies are just looking at where resources are going and looking for a way to do the same thing better, faster, cheaper. That's what this kind of time tends to encourage. And there's a really obvious one with these managed services, where you're putting a lot of effort into a very difficult problem that you're not doing well, and there is a better way to do it. So yeah, we're absolutely seeing it. What does it impact in terms of the competitive landscape? I don't think it's a huge change. I would say, net net, the folks who are focused in an on-premise world become less relevant. And there are a set of companies that provide, you know, something around Kafka support or some of these legacy technologies that primarily exist in data centers. I mean, I do think there's a shift away from that stuff. So, you know, that's a change. I don't think it's a dramatic change, because we didn't view those as kind of compelling competitors to begin with, because this shift was already underway. On the partner side, I don't think it's a major revelation. We work very closely with the cloud providers, and they're obviously a very important partner, as Steffan alluded to; each, in their own environment, has a significant sales force. And they're looking for ways to bridge between these environments. When I talk to any kind of large enterprise, they are all in on the transition to the cloud. They have a set of systems in data centers that are going to persist for a very long time. That connectivity and the fabric for data that spans both, that's incredibly strategic and important to them. And I do think that's a part of the conversation around data streaming that they view as very strategic, because it's what enables them to actually make this move. And so it's both about, you know, the shift of open source stuff like Kafka to a managed service, as well as the overall story of migration to the cloud and how that plugs into an existing business. And so I think both of those are kind of important aspects of what we're doing.
spk09: That is great. Thank you. Steffan, one for you. I know billings is not a number you closely watch, but just wanted to ask about this because there's been a bit more deceleration in deferred revenue than in the RPO or CRPO. CRPO numbers have been holding up quite high, even against a tougher comp. So I guess, as you shift more to the cloud, does that have some sort of impact on deferred, especially if there's pay as you go, or are there some things around invoice duration that have changed because of the macro? Just trying to get an understanding of the potential dynamics around the deferred revenue side.
spk11: Well, both billings and deferred revenue are impacted by mix. And as we have more Confluent Cloud business, specifically on billings, we have a mixture of billing practices, but usually it's monthly or quarterly billings for Confluent Cloud. And one of the reasons why we do focus on RPO and CRPO is that's more consistent with a consumption type model, as opposed to more of a legacy model that would be more focused on traditional billings. So the more that cloud becomes a bigger percentage of our new ACV and revenue, there will be more of a disparity between billings and short-term deferred revenue versus RPO and CRPO. And I guess the last thing I would say is, on billings practices in general, we haven't seen very much change in billings requests or anything along those lines. Things have been relatively stable.
spk03: Great. Thanks, Derek. Thank you. We'll take our next question from Michael Turrin with Wells Fargo, followed by Scotiabank.
spk08: Hey there, thanks. Appreciate you taking the question, and nice job to everyone on the execution here. I mean, look, you're managing your business through a challenging backdrop. Maybe, Jay, you can speak to how you keep the go-to-market and engineering teams all focused and moving in the right direction, given just all the noise and volatility you've seen since going public. And then Steffan, can you maybe expand on how that informs your approach to the growth and margin framework and trade-offs, and just what underpins the confidence and the consistent approach to making sure you're taking advantage of the growth opportunity first and foremost here.
spk10: Yeah, hey, Michael. Yeah, you know, look, I would say there's no silver bullet. The thing I think does help us is, as a company, we've done some hard things before. So, you know, we went through a very significant kind of cloud transition. You know, you see a lot of that kind of shift happening now, but the work was done earlier on, and doing that in a small company is a lot of work and just difficult. You're asking people to, you know, build and launch a very difficult second product in the middle of the startup, take it out to market. You know, I think just working through hard things kind of builds the muscle. And then if you're in more difficult times, people who have been through, you know, something hard before, you know, you've kind of built a little camaraderie around how you go do it. There's some resilience, there's some, you know, people are used to it. And so, you know, I think that helps. And I think that kind of built a bit of a core of a team that is ready for some challenges. Obviously, it doesn't help if there's kind of friction in the world. It doesn't make it easier. I think we're lucky to be in a space that just has potential that can outpace that as long as we execute well.
spk11: And on the growth and profitability side of the house, we've been very prescriptive ever since we've been public: we have a growth and profitability framework. It bears mentioning we've never been a company where it's been growth at all costs. We've always been about growth and improving operating margin, and we've constructed our internal plan and what we've communicated with Wall Street through the following lens. It is all about making sure that we are adequately invested to tackle the $60 billion market that's out there, while we're also increasing the fundamental drivers of efficiency. Think about productivity per head. Think about unit economics of cloud. These are projects that are going to be continuing on for probably forever. And as we've constructed this model, showing the year-over-year improvement in operating margin is critical for us to demonstrate that we're driving that efficiency and productivity while still in high growth mode. And we define high growth mode as greater than 30%. So this is a framework that we put in place. We've been operating against it, and we're committed to driving high growth and improving profitability towards the break-even target of exiting Q4 '24, which, as Jay mentioned, is the 10-year mark of the company's age.
spk08: It's been a great journey. Congrats on the results. Thank you.
spk03: Thanks, Michael. Thanks, Michael. We'll take our next question from Nick Altman with Scotiabank, followed by Barclays.
spk01: Yeah, great. Thanks, guys. Maybe one for Jay to start. You guys outlined your TAM at the analyst day, and the numbers kind of insinuated that half of the market is kind of at the commercial end. And just given you guys don't talk about the lower end of the market as much, I was wondering if you could maybe elaborate on the strategy there, how meaningful the commercial segment is to revenue and to bookings, and how much of a focus is that portion of the market in the medium term versus the larger end of the market?
spk10: Yeah, it's a fantastic area for us. You know, a few years back that was absolutely the worst part of the business and had very low focus internally. And as we started to have a low friction cloud product, we realized, hey, we could build really a cloud native motion that would help land companies in that segment, and we really kind of reimagined that team. And they got better on every metric, right, to the point that now it's, you know, one of the highest performing areas in the company. You know, that's newer, right? But it's definitely an area where we are doubling down and, you know, kind of disproportionately adding capacity. You know, they're doing great against their targets, and we think that's really important. To us, it's important to have a mechanism to capture both breadth and depth, right? So depth is definitely the larger customers. We do tend to talk about those because people always have the question of, like, hey, what is the depth? What could a customer be in this space? How much might they spend? And so I think there's value in that. We don't talk as much about how we're going after the broad market. I think there it's more about the mechanisms. How does the self-service kind of feed into a more transactional early stage sale? How does that kind of grow and expand? It turns out a lot of these commercial customers can actually become quite sizable, and they move pretty quickly. And so what starts small actually can turn into something pretty reasonable. So we're very excited about that area. And it's probably good feedback that we should talk about it more.
spk01: Great. And then just one more. You guys have talked about, you know, how the cloud marketplaces have been very successful from a go-to-market perspective, from a lead gen perspective. I guess, as we're sort of entering a choppier macro environment, what are sort of the puts and takes there? Because I guess on one side of the equation, you know, if there's consolidation of spend from a procurement perspective, maybe that's a tailwind as people are using AWS credits to sort of burn down commits and allocating those to Confluent. And on the flip side of the equation, maybe it's a little bit easier to slow consumption or reallocate those credits. So is a choppier macro, and sort of your relationship, you know, with these cloud marketplaces, is that more of a tailwind, or is there a little bit of a headwind there potentially too?
spk10: I think it's a tailwind. So the way the marketplaces come into play, it's a little bit of a lead gen thing, but the bigger aspect is that companies have significant committed spend with the cloud providers, and it's possible to buy products through that, through your marketplace agreement. And so you could think of that committed spend as kind of a shadow budget that's flexible across different things you could purchase. And so if you're in tighter times, you're probably optimizing everything in the cloud. And it's a really nice way to be able to fit in a new thing like Confluent. And so, yeah, we see it as something that reduces friction in the sale for sure. It also provides a great mechanism for cooperation with the cloud providers. What motivates them is both that marketplace consumption as well as the flow of data into all of their other services, which kind of helps drive utilization and consumption in their cloud. And so, you know, both of those are the things that are going to activate their kind of go-to-market teams to help. And so, you know, we think these are great programs. We've leaned heavily into them and been quite successful.
spk03: Great. Thanks, Nick. We'll take our next question from Raimo Lenschow with Barclays, followed by JMP Securities.
spk04: Hey, thank you. Two quick questions for me. Jay, the interesting thing at the conference was to see the excitement, like how messaging and what you're offering is changing the world. In light of that, how do you see this playing out in customer conversations at the moment? Because as macro is getting tougher, in the past there tended to be a tendency to stay with the old for now, because you're just keeping the lights on, versus also having the pressure to become more real-time and actually do something. So the question is more, how are the customer conversations changing now as macro is getting a little bit more choppy? And then I had one follow-up for Steffan.
spk10: Yeah, I mean, I think the critical question is, hey, does that excitement persist in times where there's more economic pressure? And what we've seen is, yeah, it absolutely has. And on both sides of it, we're continuing to grow through more difficult times. I think, if anything, the big change is how prevalent that TCO side of it is. Is there a really strong story of how you can displace legacy technologies, how you can get out of this internal operations of open source that can be very costly, and then how you can just get more efficient use out of your data and enable the things that you have to do as a business, but now with a little bit of a tighter overall management of budget? And I think that we actually have a very compelling story in that area. And I think that goes along well with the overall excitement about the area. I think that now you kind of have to have both of those things be true about any product area for it to continue to succeed in the current environment.
spk04: And then Steffan, one more question on the investment side. So as you guys are doing more cloud, what does it kind of do in terms of investment levels in sales and marketing, et cetera, going forward? Because, in a way, one beauty of cloud is it increases visibility for you, in terms of you see what a customer is doing, you see how much they're doing. Does that kind of trigger, like,
spk11: a different investment in sales and marketing? Is it, can you do more with less? Like, how do I have to think about that dynamic? Thank you. As we look at the investment profile for sales and marketing, we're definitely leaning into the mix shift that we're seeing, which is being driven more by our customers, where it's more cloud oriented. We have a customer growth go-to-market journey where we have these five stages, ultimately getting to the central nervous system. And so along the way of that journey, what we're doing from a sales and marketing standpoint is making sure that we have the right model in place around the number of account executives and systems engineers and customer success people. When we segment the business and look at the commercial business, the commercial business is 100% cloud. That business is growing very healthily. And then we look at the enterprise business, which is growing healthily too. But we see more hybrid opportunities in the enterprise business, where they have existing Confluent Platform deals, and we're looking to upsell and cross-sell into those existing customers where they could be adopting Confluent Cloud. So when you net it all out, what we're looking at is a more efficient go-to-market structure, sales and marketing, with the shift to cloud, because a lot of it is expansion selling. And we feel like it's a good setup as we look to get more leverage out of sales and marketing over time.
spk10: A big part of that, just to tag on, is the ability to drive product-led growth and adoption and then expansion with software, in addition to an excellent field sales team. That is just not that easy to do on-premise. It is hard to get net new data systems stood up on-premise. You've got to hire people, you have to order servers. There's no way to just do it with software. You know, in the cloud that's totally different. So our ability to kind of augment what that team does, make our field force more efficient in everything they do and how they work with customers, is just so much better, as well as our data about how that's progressing and our ability to be smart and targeted in what we do. And so, yeah, that's kind of where that efficiency comes from. Now, obviously there's some investment to get all those parts built and strung together. We've talked a little bit about how we're trying to orchestrate that journey, but yeah, I think it's a huge ability to amplify what you're doing with humans and really be intelligent and direct it well to drive efficient growth. Perfect. Congrats. Thank you.
spk03: All right, our next question will come from Patrick Walravens with JMP Securities.
spk02: Oh, great. Thanks, Shane. And let me add my congratulations. So, Jay, I saw that about an hour ago you announced a new chief customer officer, Ray Perez, I guess.
spk00: Yep.
spk02: And I was looking at his background, this actually seems really similar to Roger Scott's. So I'm just wondering, yeah, really similar. What's the rationale there?
spk10: Yeah, they knew each other. This is a planned transition, so we feel really good about it. Basically, Roger's stepping back, and we were looking for who would be good to step into it, and we're really excited about Ray. So yeah, I think it's an awesome evolution of the team, and we're excited about what he adds.
spk02: All right. And then also, my follow-up is, and lots of people have this issue, but with the stock having gone from the 90s to 22, that makes retention harder. What are you guys doing? What's your message to people? And is there anything you can do to offset the pressure that comes from having the equity do that?
spk10: Yeah, well, I mean, look, it was a turbulent time for us to go public. There was a big run up and then some fall off. I think the employee base we have has been through a number of things that were difficult. We're kind of in it for the long haul. There's a huge opportunity here. People are excited about it. We did do some small things to try and help. So Steffan talked about the fact that we kind of split the bonus payout. We felt like that was a cost-efficient way of trying to give people something a little earlier. So what would have been a payout in Q1, we split into two payments and gave the first one up front, which kind of cushioned some of it. And I think that helps; that was extremely well received. So, you know, I think we're in a good state with our employees. And, you know, it's also gotten much easier to hire new people and grow because it's a more difficult economy, so I guess that's a silver lining in it. The other thing I'd tag on is, as you think about our
spk11: demonstrated ability to deliver on our commitments. That is very important as all the employees, including everyone on this call, we all have goals and objectives that we're trying to accomplish each quarter. And those get boiled up into how we guide the street, how we execute. And the level of execution since we've been public has been a big bright spot for employees because people are delivering on their commitments in a very tough environment. And a lot of things are going on in the macro, in the volatility in the stock market. There are some things that are outside of our control, but what we can control is our execution. And that has been really stellar in a very volatile environment. And I think that provides some comfort to the employee base as well.
spk10: Yeah, I couldn't have said it better.
spk03: All right, that concludes today's earnings call. Thanks again, everyone, for joining us. We'll talk soon. Take care. Thank you all.
Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
