This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.

Datadog, Inc.
8/7/2025
Good day and thank you for standing by. Welcome to the Q2 2025 Datadog Earnings Conference Call. At this time, all participants are in a listen-only mode. After the speakers' presentation, there will be a question and answer session. To ask a question during the session, you will need to press star 1-1 on your telephone. You will then hear an automated message advising your hand is raised. To withdraw your question, please press star 1-1 again. Please be advised that today's conference is being recorded. I would now like to hand the conference over to your speaker today, Yuka Broderick, SVP of Investor Relations. Please go ahead.
Thank you, Dee Dee. Good morning and thank you for joining us to review Datadog's second quarter 2025 financial results, which we announced in our press release issued this morning. Joining me on the call today are Olivier Pomel, Datadog's co-founder and CEO, and David Obstler, Datadog's CFO. During this call, we will make forward-looking statements, including statements related to our future financial performance, our outlook for the third quarter and the fiscal year 2025 and related notes and assumptions, our gross margins and operating margins, our product capabilities, and our ability to capitalize on market opportunities. The words anticipate, believe, continue, estimate, expect, intend, will, and similar expressions are intended to identify forward-looking statements or similar indications of future expectations. These statements reflect our views only as of today and are subject to a variety of risks and uncertainties that could cause actual results to differ materially. For a discussion of the material risks and other important factors that could affect our actual results, please refer to our Form 10-Q for the quarter ended March 31, 2025. Additional information will be made available in our upcoming Form 10-Q for the fiscal quarter and in other filings we make with the SEC. This information is also available on the Investor Relations section of our website, along with a replay of this call. We will discuss non-GAAP financial measures, which are reconciled to their most directly comparable GAAP financial measures in the tables in our earnings release, which is available at datadoghq.com. With that, I'd like to turn the call over to Olivier.
Thanks, Yuka, and thank you all for joining us this morning to go through our results for Q2. Let me begin with this quarter's business drivers. Overall, we saw trends for usage growth from existing customers in Q2 that were higher than our expectations. We experienced strong growth in our AI native cohort. Our AI native customers are growing meaningfully with us as they see rapid usage growth with their products. Meanwhile, we saw consistent and steady usage growth in the rest of the business. We continue to see the overall demand environment as solid, with an ongoing healthy pace of cloud migration and digital transformation. And churn has remained low, with gross revenue retention stable in the mid to high 90s, highlighting the mission-critical nature of our platform for our customers. Regarding our Q2 financial performance and key metrics, revenue was $827 million, an increase of 28% year over year and above the high end of our guidance range. We ended Q2 with about 31,400 customers, up from about 28,700 a year ago. This includes about 150 new customers from our Eppo and Metaplane acquisitions. We ended Q2 with about 3,850 customers with an ARR of $100,000 or more, up from about 3,390 a year ago. And these customers generated about 89% of our ARR. And we generated free cash flow of $165 million, with a free cash flow margin of 20%. Turning to platform adoption, our platform strategy continues to resonate in the market. At the end of Q2, 83% of customers were using two or more products, the same as last year. 52% of customers were using four or more products, up from 49% a year ago. 29% of our customers were using six or more products, up from 25% a year ago. And 14% of our customers were using eight or more products, up from 11% a year ago. So our customers continue to adopt more products, including our security offerings. As a reminder, our security customers can identify and manage vulnerabilities with Code Security, Cloud Security, and Sensitive Data Scanner. And they can detect and protect from attacks with App and API Protection, Workload Protection, and Cloud SIEM. We are pleased that our security suite of products now generates over $100 million in ARR and is growing mid-40s percent year over year. While we are pleased to achieve this milestone, we're still just getting started in solving customer problems in this area, with new innovations such as our Bits AI Security Analyst. Moving on to R&D, we held our Dash user conference in June, where we announced over 125 exciting new products and features for our users. So let's go through some of the announcements. First, we launched fully autonomous AI agents, including Bits AI SRE agent to investigate alerts and coordinate incident response, Bits AI Dev agent, an AI-powered coding assistant to proactively fix production issues, and Bits AI Security Analyst to triage Cloud SIEM signals. To further accelerate our users' incident response, we announced an AI voice agent for incident response, so users can quickly get up to speed and start taking action on their phones. We also announced handoff notifications that make it easy to jump straight into the relevant context and quickly communicate with other responders, and status pages to enable automatic updates for customers during incidents. Second, we delivered a series of products to help customers ship better software with confidence.
With the Datadog internal developer portal, developers can ship better and faster by gaining a real-time view into their software systems and APIs with the software catalog, by provisioning infrastructure, scaffolding new services, and managing code changes and deployments with self-service actions, and by following engineering standards with scorecards. We launched a Datadog MCP server to enable AI agents to access telemetry from Datadog and to act as a bridge between Datadog and MCP-compatible AI agents like OpenAI Codex, Cursor, and Claude Code from Anthropic. We worked together with OpenAI to integrate our MCP server within the OpenAI Codex CLI, and the Datadog Cursor extension now gives developers access to Datadog tools and observability data directly within the Cursor IDE. Third, we are reimagining observability to meet our customers' increasingly complex needs. Our APM latency investigator formulates and explores hypotheses in the background, helping teams to quickly isolate root causes and understand impact without combing through large amounts of data. Proactive app recommendations help users stay ahead of growing system complexity by analyzing APM data to detect issues and propose fixes before they become problems. We announced a Flex Frozen tier so customers can keep logs in fully managed storage for up to seven years and be able to search without data movement or rehydration. Archive Search now enables teams to query archived logs directly in cloud storage like Amazon S3 buckets or in the Flex Frozen tier. And Datadog now supports advanced analysis features within notebooks. Fourth, our security products cover new AI attack vectors across the application, model, and data layers. At the AI data layer, Sensitive Data Scanner can now prevent the leakage of sensitive data in training data as well as LLM prompts and responses. At the model layer, we help secure against supply chain attacks in open source models and prevent model hijacking attacks. At the application layer, we help prevent prompt injection attacks and data poisoning at runtime. And finally, we showcased our new end-to-end AI and data observability capabilities. Engineers and machine learning teams can use GPU monitoring to gain visibility into GPU fleets across cloud, on-prem, and GPU-as-a-service platforms such as CoreWeave and Lambda Labs. With AI Agents Console, enterprises can monitor the behavior and interactions of any AI agent used by their teams. We now offer LLM observability experiments to help understand how changes to prompts, models, or AI providers influence application outcomes. We added a new agentic flows visualization to LLM observability to capture and understand the decision path of AI agents. And last but not least, accelerated by our recent acquisition of Metaplane, Datadog now offers a complete approach to data observability across the entire data life cycle, from ingestion to transformation to downstream usage. So we continue to relentlessly innovate to solve more problems for our customers. In doing so, we are being rightfully recognized by independent research. And we are pleased that, for the fifth year in a row, Datadog has been named a leader in the 2025 Gartner Magic Quadrant for observability platforms. We believe this validates our approach to deliver a unified platform which breaks down silos across teams. Now let's move on to sales and marketing. We had a number of great new logo wins and customer expansions this quarter. So let's go through a few of those.
First, we signed a seven-figure annualized expansion in a three-year contract worth more than $60 million with one of the world's largest banks. This company believes getting to the cloud is essential so they can use AI on their extremely rich data set to improve how they manage risk and serve their customers. They are using Datadog as their strategic cloud observability platform, and they continue to migrate more applications to the cloud. This customer is expanding to 21 Datadog products, with thousands of users who log into the Datadog platform every month. Next, we signed a seven-figure expansion to an eight-figure annualized contract with a leading U.S. insurance company. Datadog is supporting this customer's efforts to consolidate observability tools and expand their cloud-based products. By adopting Datadog, they are experiencing fewer and less severe incidents, with estimated savings of over $9 million per year in incident response costs, and protecting more than 100,000 customer transactions that would otherwise be impacted every year. With this expansion, this customer will adopt 19 Datadog products and will consolidate a couple dozen tools across multiple business units. Next, we signed a nearly seven-figure annualized expansion with a leading American media company. This customer has about 100 observability tools across more than 300 business units, and this tool fragmentation has resulted in inefficiencies, extra costs, and lost engineering time. They are expanding to 21 Datadog products, including all of our security products, and replacing their paging solution with Datadog On-Call and incident management. Next, we landed a seven-figure annualized deal with a leading Brazilian e-commerce company. This customer's previous observability vendor was unable to support them as they moved to newer software platforms and modern cloud infrastructure. By replacing this tool with Datadog, the company was able to gain full visibility into its cloud-based app and saw significant improvements in application stability and incident resolution times. This customer will start with seven Datadog products, including Flex Logs. Next, we landed a seven-figure annualized deal with the delivery app of a major American retailer. This customer found our RUM and error tracking products to be immediately valuable, finding an issue on the first day of their Datadog trial that they hadn't identified after months of searching with their old tool. By adopting Datadog with seven products to start, this customer will consolidate half a dozen tools while meeting their PCI compliance requirements. Finally, we welcomed back a leading U.S. mortgage company in a nearly seven-figure annualized deal. This customer had moved to using a dozen disconnected open source tools, which led to fragmented visibility, alert fatigue, and poor customer experience. In returning to Datadog, they plan to adopt six products, including replacing their paging system with Datadog On-Call. And that's it for another productive quarter for our go-to-market teams, who are now very hard at work on Q3. Before I turn it over to David for a financial review, I want to say a few words on our longer-term outlook. There's no change to our overall view that digital transformation and cloud migration are long-term secular growth drivers of our business. As we think about AI, we are incredibly excited about the opportunities. First, AI is a tailwind for Datadog as increased cloud consumption drives more usage of our platform.
Today, we see this primarily in our AI-native group of customers, who are monitoring their AI applications with us. There are hundreds of customers in this group. They include more than a dozen that are spending over a million dollars a year with us and more than 80 who are spending more than $100,000. And they include eight of the top 10 leading AI companies. While we know there's a lot of attention on this cohort, we primarily see it as an indication of what's to come as companies of every size and in every single industry incorporate AI into their cloud applications. And we continue to see rising customer interest in next-gen AI observability and analysis. Today, over 4,500 customers use one or more Datadog AI integrations. Second, next-gen AI introduces new complexity and new observability challenges. Our AI observability products help our customers gain visibility and deploy with confidence across their entire AI stack, including GPU monitoring, LLM observability, AI agent observability, and data observability. And we will, of course, keep innovating as the AI landscape develops further. Third, we are incorporating AI into the Datadog platform to deliver more value to our customers. As I discussed earlier, we launched the Bits AI SRE agent, Dev agent, and security agent. We are seeing very good results with those, with more improvements and new capabilities to come. Finally, as a SaaS platform focused on our customers' critical workflows, we have a large volume of rich, clean, and detailed data, which allows us to conduct groundbreaking research. A great example of that is our Toto foundation model for time series forecasting, which shows state-of-the-art performance on benchmarks, even going well beyond specialized observability use cases. And you should expect to see more from us on that front in the future, as well as bringing novel research approaches and models straight into the market. So, we are extremely excited about our progress so far, against what we expect to be a generational growth opportunity. In other words, we're just getting started. And with that, I will turn it over to our CFO. David?
Thanks, Olivier. Q2 revenue was $827 million, up 28% year-over-year and up 9% quarter-over-quarter. Now, to dive into some of the drivers of this Q2 revenue growth, first, overall, we saw trends for usage growth from existing customers in Q2 that were higher than our expectations. This included strong growth in our AI native cohort, as well as usage growth from the rest of the business that was consistent with recent quarters, amidst a healthy and steady cloud migration environment. We saw a continued rise in contribution from AI native customers in the quarter, who represented about 11% of Q2 revenues, up from 8% of revenues in the last quarter, and about 4% of revenues in the year-ago quarter. The AI native customers contributed about 10 points of year-over-year revenue growth in Q2, versus about 6 points last quarter, and about 2 points in the year-ago quarter. Now, as previously discussed, we do see revenue concentration in this cohort in recent quarters. But if we look at our revenue without the largest customer in the AI native cohort, our year-over-year revenue growth in Q2 was stable relative to Q1. We remain mindful that we may see volatility in our revenue growth against the backdrop of long-term volume growth in this cohort, as customers renew with us on different terms, and as they may choose to optimize cloud and observability usage over time. As you heard from Oli, we continue to believe that adoption of AI will benefit Datadog in the long term, and we believe that the growth of this AI native customer group is an indication of the opportunity to come, as AI is adopted more broadly and customers outside the AI native group begin to operate AI workloads in production. Now, regarding usage growth by customer segments, in Q2, our year-over-year usage growth was fairly similar across segments relative to previous quarters, as SMB and mid-market usage growth improved in Q2, while enterprise customer usage growth remained roughly stable. Note that we are excluding the AI native cohort for the purposes of this commentary. And as a reminder, we define enterprise as customers with 5,000 or more employees, mid-market as customers with 1,000 to 5,000 employees, and SMB as customers with less than 1,000 employees. Regarding our retention metrics, our trailing 12-month net retention percentage was about 120%, higher than the high 110s last quarter, and our trailing 12-month gross revenue retention percentage remains in the mid to high 90s. Now, moving on to our financial results, first, billings were $852 million, up 20% year-over-year, and remaining performance obligations, or RPO, was $2.43 billion, up 35% year-over-year. Our current RPO growth was in the 30s year-over-year, and our RPO duration was up slightly year-over-year. As previously mentioned, we continue to believe that revenue is a better indication of our business trends than billings and RPO, as those can fluctuate relative to revenue based on the timing of invoicing and the duration of customer contracts. And now let's review some of the key income statement results. Unless otherwise noted, all metrics are non-GAAP. We have provided a reconciliation of GAAP to non-GAAP financials in our earnings release. First, gross profit in the quarter was $669 million, for a gross margin of 80.9%. This compares to a gross margin of .3% last quarter and .1% in the year-ago quarter. As we discussed on the last call, we saw an increasing impact from our engineers' cost savings throughout this quarter as they delivered on cloud efficiency projects.
And we are continuing our focus on cloud efficiency and believe that we have further opportunity for gross margin improvement in the second half of the year. Our Q2 OPEX grew 30% year-over-year, up from 29% last quarter. As we've communicated over the past year, we plan to grow our investments to pursue our long-term growth opportunities, and this OPEX growth is an indication of our execution on our hiring plans. Q2 operating income was $164 million, for a 20% operating margin, compared to 22% last quarter and 24% in the year-ago quarter. Within that, as we've noted, we held our Dash user conference in June, and as expected, the event cost $13 million. We also experienced a rising impact from the weaker dollar and absorbed $6 million of negative FX impact during Q2. Excluding those expenses, operating margin would have been 22% in Q2, or 200 basis points higher. And now turning to the balance sheet and cash flow statements, we ended the quarter with $3.9 billion in cash, cash equivalents, and marketable securities. And our cash flow from operations was $200 million in the quarter. After taking into consideration capital expenditures and capitalized software, free cash flow was $165 million, for a free cash flow margin of 20%. And now for our outlook for the third quarter and the remainder of fiscal 2025. First, our guidance philosophy overall remains unchanged. As a reminder, we base our guidance on recent trends observed and apply conservatism on these growth trends. For the third quarter, we expect revenues to be in the range of $847 to $851 million, which represents 23% year-over-year growth. Non-GAAP operating income is expected to be in the range of $176 to $180 million, which implies an operating margin of 21%. And non-GAAP net income per share is expected to be $0.44 to $0.46 per share, based on approximately 364 million weighted average diluted shares outstanding. For fiscal 2025, we expect revenue to be in the range of $3.312 to $3.322 billion, which represents 23% to 24% year-over-year growth. Non-GAAP operating income is expected to be in the range of $684 to $694 million, which implies an operating margin of 21%. And non-GAAP net income per share is expected to be in the range of $1.80 to $1.83 per share, based on approximately 364 million weighted average diluted shares. Some additional notes on our guidance. We expect net interest and other income for fiscal 2025 to be approximately $150 million. And due to the impact of the recent federal tax legislation, we now expect cash taxes for 2025 to be about $10 to $20 million. We continue to apply a 21% non-GAAP tax rate for 2025 and going forward. And finally, we expect capital expenditures and capitalized software together to be 4% to 5% of revenues in fiscal year 2025. To summarize, we are pleased with our execution in Q2, including the many products and features we launched at Dash. We are well positioned to help our existing and prospective customers with their cloud migration and digital transformation journeys, including their adoption of AI. I want to thank all Datadogs worldwide for their efforts. And with that, we'll open the call for questions. Operator, let's begin our Q&A.
Thank you. As a reminder, to ask a question, please press star 1-1 on your telephone and wait for your name to be announced. To withdraw your question, please press star 1-1 again. Please stand by while we compile the Q&A roster. And our first question comes from Raimo Lenschow of Barclays. Your line is open.
Perfect. Thank you. Two quick questions from me. Olivier, you talked about the AI contribution and it slowly broadening out. How should we think about it in terms of when this goes much broader into inference, et cetera? Does that mean everyone like Barclays, JP Morgan, et cetera, will all need to do more around observability because they're going to do more inference, et cetera? So in a way, OpenAI, et cetera, is just setting the scene for the future. What do you think about the market opportunity there? And then David, in the second half of last year, you hired a lot of extra sales guys. Can you talk a little bit about that ramp and where they are in their productivity curve? Thank you.
Yeah, on the AI opportunity, there are really multiple layers to it. The first layer is largely what we see today, which is companies that are running their inference stack and the application around it in cloud environments. So that's the case of the model makers, or think of the companies that are doing coding agents, things like that. That is what we see today. And it looks a lot like normal compute. So you have normal machines, CPUs, some GPUs, quite a few other components, databases, web servers, things like that. So that's the bulk of what we see today. And there's going to be more of it as the applications come into production. There are more specialized inference workloads, and even training workloads in some situations, that rely on instrumenting GPUs. And for that, we have a new product out there that does GPU monitoring, which we announced at Dash. But all of that is what I would call the infrastructure layer of AI. Then on top of that, there are new problems in terms of understanding what the applications themselves are doing. And the applications are not deterministic anymore. They either are run by a model that is non-deterministic by nature, or they're running code that was not as carefully written as it used to be. It's not completely written by humans; it's largely written by AI agents. And as a result, you also need to spend a lot more time understanding how that is working, and that largely happens in production. So that's a whole brand new area of observability, which is how do you deal with applications that have not been completely defined in development and that have to be evaluated in production. And what we think is the whole market is going there, not just the AI natives. The AI natives are definitely doing that today. Their applications are running on models and code that has been largely written by agents. But the leading indicator we see of that for the rest of the market is the very, very broad adoption today, both of the API-gated AI models and of the coding agents, which you see in every single large enterprise today.
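To make the point above about evaluating non-deterministic applications in production a bit more concrete, here is a minimal, hypothetical sketch of the kind of telemetry an LLM observability tool typically captures around a model call: latency, rough token counts, and the prompt/response pair for later evaluation. It is written in plain Python with invented names and is not Datadog's SDK or API.

```python
# Illustrative sketch only: not Datadog's SDK or API. Shows the kind of
# signals LLM observability collects around a single model call.
import time
import uuid
from dataclasses import dataclass, field


@dataclass
class LLMSpan:
    trace_id: str
    model: str
    prompt: str
    response: str = ""
    latency_ms: float = 0.0
    input_tokens: int = 0
    output_tokens: int = 0
    metadata: dict = field(default_factory=dict)


def record_llm_call(model: str, prompt: str, call_fn) -> LLMSpan:
    """Wrap a model call and capture basic observability signals."""
    span = LLMSpan(trace_id=str(uuid.uuid4()), model=model, prompt=prompt)
    start = time.perf_counter()
    span.response = call_fn(prompt)
    span.latency_ms = (time.perf_counter() - start) * 1000
    # Crude whitespace token proxies for the sketch; real tooling would use
    # the provider's reported token counts.
    span.input_tokens = len(prompt.split())
    span.output_tokens = len(span.response.split())
    return span


def fake_model(prompt: str) -> str:
    """Stand-in for a real, non-deterministic model call."""
    return "The checkout service error rate spiked at 09:14 UTC."


if __name__ == "__main__":
    span = record_llm_call("example-model", "Summarize today's incidents.", fake_model)
    print(span.latency_ms, span.input_tokens, span.output_tokens)
```

In a real deployment, spans like this would be exported to an observability backend and scored by evaluations, which is the production-side feedback loop described above.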
Thank you.
Yeah, and as to sales capacity, we have been successful in increasing both our salespeople and our ramped sales capacity.
We started that, as you said, in the last part of 2024. And we are seeing evidence of that through our new logo production and our pipeline. We need to, as we talked about previously, go through the ramping of that, but in looking at size and productivity and performance, we see some good signs that that quota capacity is becoming productive.
Thank you.
Our next question comes from Sanjit Singh of Morgan Stanley. Your line is open.
Thank you for taking the questions. Congrats on the really stellar results this quarter. David, when I look at the guide, this is probably one of the more impressive guides coming out of a Q2 that I've seen in a couple of years. If I square that against the commentary that you guys made on the AI native cohort, that there could be volatility from this cohort, when I try to put those two together, the guidance is really strong. And so when I think about that potential risk, is it fair to assume that it's not something that you're seeing right now and may come into play later on down the road? Because the guidance seems really strong. It doesn't seem, at least on the face of it, to anticipate that much volatility from the AI native cohort.
Yeah. I think we gave metrics indicating that, based on what we saw in the quarter and what we're seeing now, the AI cohort continues to grow quite rapidly, and we're winning good market share in that. And so how we incorporate that into the guidance is, as we discussed previously, we know that there might be volatility in usage, or as we negotiate contracts, in unit rates. And so therefore, we adopt conservative assumptions as to that performance in the remainder of the year. It's not something, as you can tell from the growth metrics, that we've seen yet in our results. But as we learned in the previous cycle with cloud natives, there can be volatility, and we want to make sure we incorporate that in our guidance.
Perfect. And then Olivier, with the new security disclosures, congrats on crossing the $100 million threshold. Is there any sort of change in the buying behavior? There's been consolidation in the industry. You guys have been advancing your portfolio quite significantly. You guys have fully autonomous security agents. What's your prospect for this part of the business to drive growth for the balance of the year and going into 2026?
Yeah. So we have a very good product set, and we mentioned three different products in there. There are a couple of those products that are really, I would say, reaching an inflection point in terms of what they're doing on the customer side. When I think of where we're very successful, it's at getting broad adoption, like a large number of customers, and a few customers that are spending a million plus on security with us. So we're happy with the proof points we have there. What we haven't done very well yet is getting standardized adoption wall to wall in large enterprises. And that's the next focus for us on the security side. And some of that is product work, but a lot of it is also a few customizations to the go-to-market there, so we get better at selling enterprise-wide security top down, which is not something we have done a lot of in the past. So that's sort of where we are as a product. So, happy with where we are. A lot of groundwork has been done on the product side, but there's quite a bit more work to be done and a ton more opportunity in front of us. So that's what we're focusing on.
Appreciate the thoughts, Olivier. Thank you.
Thank you.
And our next question comes from Kash Rangan of Goldman Sachs. Your line is open.
Hey, thanks for taking my question. This is Matt Martino on for Kash Rangan. David, you called out enterprise consumption volatility last quarter. It sounds like that may have been consistent this time around, while SMB continues to improve. So could you perhaps characterize any discernible trends between these two customer demographics? What went right relative to your expectations heading into 2Q, and really how does that inform your second half guide? Thanks a lot. That's it for me.
Yeah, I think broadly we're calling out that the usage trends across the segments were roughly consistent with the previous quarters. We said last quarter that we did see some more concentrated cases, and this is not a comment about AI, this is a comment about enterprise, of customers taking down consumption relative to a spike, but we saw that stabilize, and we've seen small but gradual improvement in SMB as a result of the usage of our products.
Thank you.
And our next question comes from Mark Murphy of JP Morgan. Your line is open.
Thank you. My congrats as well. So Olivier, I actually wanted to ask you about Toto and BOOM. Those announcements, it looks like you're bringing very serious AI research to a space where it is applicable and opening it up very broadly. The size of the data set is vast. I'm curious what type of response you expect to see here. And just help us understand maybe how that can sustain growth in future years. And I have a quick follow-up for David.
Yeah, look, we think there's so much opportunity in automation with autonomous AI agents. We really broke it out in three different categories so far. One is the SRE side, responding to alerts, investigating alerts, and maybe remediating those issues. The second one is coding, fixing issues that we find in the code that happen in production and verifying the fixes ourselves. And the last one is security, investigating security signals on our own so that customers don't have to do that themselves. There's so much that can happen there. A lot of it is going to depend on great research, which is why we built a research team and which is why we developed and released open-weights research models already. Of course, the next step after releasing these research models is to incorporate them into the product. So that's also one of the things we're working on right now. But there's just so much opportunity in front of us there that at this moment we're happy. We got a great start. We got fantastic results. I mean, our first release as a research output is really a state-of-the-art model that beats every single other model in a category that has seen quite a bit of action over the years. Time series forecasting has very wide applicability in a lot of different domains. So I think it shows that we can perform at the highest level there. And I think it's a great sign of things to come in terms of AI automation and AI agents.
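For readers less familiar with the time series forecasting task discussed above, here is a minimal sketch of the kind of problem such models are benchmarked on: forecasting an observability-style metric from its history. It uses a naive seasonal baseline on synthetic data and NumPy only; it does not use, and makes no claims about, Datadog's Toto model or its API.

```python
# Illustrative sketch: a naive seasonal baseline on a synthetic metric.
# Pretrained forecasting models are typically compared against simple
# baselines like this on held-out history. Not Datadog's model or API.
import numpy as np


def seasonal_naive_forecast(history: np.ndarray, horizon: int, period: int) -> np.ndarray:
    """Repeat the last observed seasonal cycle to produce a forecast."""
    last_cycle = history[-period:]
    repeats = int(np.ceil(horizon / period))
    return np.tile(last_cycle, repeats)[:horizon]


# Synthetic "requests per minute" metric: daily seasonality plus noise.
rng = np.random.default_rng(0)
minutes = np.arange(7 * 1440)  # one week at one-minute resolution
signal = 100 + 30 * np.sin(2 * np.pi * minutes / 1440) + rng.normal(0, 5, minutes.size)

history, actual = signal[:-1440], signal[-1440:]  # hold out the final day
forecast = seasonal_naive_forecast(history, horizon=1440, period=1440)

mae = np.mean(np.abs(forecast - actual))
print(f"Seasonal-naive MAE over the held-out day: {mae:.2f}")
```

A pretrained, zero-shot forecaster would be evaluated in the same way, by comparing its error on the held-out window against baselines like this one.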
Okay, thank you. And then David, we keep pointing out that Datadog is one of the only software companies that's investing seriously in headcount growth, and it feels like that is paying top-line dividends pretty tremendously today. We noticed the R&D spending is up noticeably in Q2. Just wondering, what are the mechanics that are driving that on the R&D line? And then the flip side is, what's allowing you to guide operating income so much higher in Q3 than you had guided for Q2?
Yeah, in R&D, as we talked about, we had an aggressive investment plan and we've been able to execute. And I think, with credit to our recruitment team, we've been able to get people in the door, the right people, earlier in the year. There are some things within that around FX that weigh a little bit on it, because as you know, we do have a significant R&D center in Paris. But I think the overall trend is the execution in recruiting. We talked about some of the factors in Q2 that caused the operating income to not increase at a greater rate, and some of those are things like the timing of Dash, the $13 million we talked about, and the FX. And I think that we have good line of sight on the drivers in R&D, as we talked about, and some of the operating expenses have some seasonality in them.
The one thing I would add is that we also are spending more on AI training and inference in R&D if you compare to past years. The output of that is things such as Toto, or the next versions of it that we're training right now, and experiments we're running to train agents, run simulations with different agents, things like that. You shouldn't expect the overall picture of R&D investment to change in the future, though. I think we expect the same envelope to be what we use moving forward.
I'll add to that, and really call out our R&D team and our FinOps team: we said last quarter that we were going to focus on how we use the cloud. That applies to both the gross margin and, as you know, we dogfood, we use a lot of our applications internally. And we were quite successful in Q2. We expect that run rate to continue going forward as we optimize our cloud usage, which will have an effect on the margins and the OPEX growth rates as we proceed through the year. Thank you very much.
Thank you. And our next question comes from Koji Ikeda of Bank of America. Your line is open.
Yeah, hey guys. Thanks so much for taking the questions. We all see that the second quarter was really, really strong. Guidance for 2025 looks really, really great. And so I wanted to ask you about contract visibility. How are you feeling about contract visibility, specifically with your large AI native customers? I have to imagine you're very close to these customers and having lots of conversations with them. And I know there is some concern out there. And David, you mentioned potential volatility. So I really want to ask about how you're feeling about contract visibility.
Thanks. I mean, look, we can't really speak about any specific customers. As a reminder, any individual customer can do whatever they want. They're the heroes of their own stories. We can't really speak for them. I would say we have strong product engagement from our top customers in general. We're working on making this the very best platform for every company at any scale, including scale that has never been seen before, and companies with high growth. And I would say that's about it. When you look at the way we forecast the business, remember that ours is overall an extremely high retention product. For most customers, it's not rational to do it themselves and build their own solutions. We have had customers who churned to build it themselves who came back afterwards, and we named one on the call today. So we feel confident about the way we forecast the business and the long-term growth. Of course, as we renegotiate with customers, as they increase volume, et cetera, what typically happens is we see short-term drops and long-term growth in the revenue that is associated with them. And that's where we're always at.
Thanks so much. And I did have a follow-up on security. It sounds, I mean, great to hear about the milestones, $100 million, growing around 40%. And so thinking about the product set, how are you thinking about expanding the capabilities from here? Are you focused more on organic or inorganic, and maybe an update to your M&A philosophy? I guess the question here is, are you willing to go much bigger to supplement your security strategy? Thank you.
Look, there are a number of people thinking about security. There are a lot of companies out there. There are a lot of product areas we cover already and a lot more product areas we can cover. It's also a space where, you know, you need to cover a lot of the, as we call them, boring, must-have, table-stakes features on one end, but also there's quite a bit of investment in the future with the way the whole field is being disrupted by AI. So there's quite a bit of work to be done there. You should expect us to do more M&A around that, as we do in the rest of the business, as there are a lot of assets out there and there are a lot of opportunities to grow.
Thank you so much.
Thank you.
And our next question comes from Karl Keirstead of UBS. Your line is open.
Okay, great. Thanks. Maybe I'll direct this to David and link the AI native exposure to margins. So, David, now that the AI natives are 11% of Datadog's revenue mix, I think it's fair to ask whether the revenues from that cohort are coming at similar margins as the rest of the business, or do you think that this could be even short-term a modest source of margin pressure? Thank you.
Yeah, I would say, like we talked about last quarter, this isn't about AI margins versus non-AI margins, the AI cohort versus the non-AI cohort. We price based on volume and on term. So to the extent you would have an AI customer who's doing much the same things as our other customers in the use of the product, and has similar volumes and similar terms to the non-AI customers, it would be similar margins. To the extent that we have a larger customer in there, given our price grids, that customer would get better discounts. That's the way we've always priced. So it really is related to customer size rather than AI native or non-AI native.
Yep. And I will double down with a bit of an infomercial. So we did see, as we mentioned last quarter, gross margins going down further than we would like them to. So what happened is we tasked our engineering teams with optimizing our cloud usage, which goes across all of our customer base. What we did is we turned to our own products, largely our cloud cost management and profiling products. And in a matter of months, we really turned up substantial savings on our bills and improvements in the performance and efficiency of our systems, while we were still shipping new features. And that's something that we're working right now to bring to all of our customers, so they can get the same effect and they can see their margins go up as well.
Got it. And maybe the natural follow-up there is, David, you mentioned that you're optimistic about gross margins in the second half. Is that because of what Olivier just mentioned, or are there some other drivers you have in mind?
No, it's because of what Olivier mentioned. So we said we were engaging in these efforts, and as we were successful in the quarter, we will be carrying that run rate forward, which wasn't fully reflected in Q2, as well as using, as Olivier mentioned, cloud cost management and our projects to find further opportunities going forward. So it's really about our progress and pace, which have been successful, in our cloud efficiency efforts going forward.
Got it. Thank you both.
Thank you. And the next question comes from Mike Cikos of Needham. Your line is open.
Hey, guys. Thanks for taking the question here. I just wanted to double back on the enterprise segment, and sorry if I'm off base here, but if I'm thinking about it, I know that we have the enterprise demonstrating this stable growth. Is it fair to assume, is the analogy for enterprises, who are more traditional and using CPUs, versus the AI native companies, who are growing investment in GPUs, analogous to 15 years ago, where we saw, hey, on-prem continues to see investment, but maybe more dollars are going towards cloud? Is that a fair analogy when we think about what sort of behavior is exhibited by these different customers and where Datadog is headed?
I don't know if you can say it exactly this way, because at the time, with on-prem versus cloud, they tended to be the same customers, whereas today the AI natives and the enterprises are different companies altogether. I think the main difference is the AI natives have businesses that are growing very, very fast and infrastructure that is growing very, very fast, whereas the enterprises are still going through a controlled migration from on-prem into the cloud, and the rate there is more limited by their bandwidth to undergo that migration, as opposed to being driven by an explosion of traffic on the demand side for them. If I look at our enterprise segment in general, we see great trends in terms of the bookings, in terms of new products attached, new customers, things that these customers are buying from us that are new, but we see that the usage growth is a bit more moderate than that at this point. I think that speaks to the bandwidth on their end just to move the workloads and to go faster there. That relates in part to the fact that a lot of their attention is spent on figuring out what AI technologies they're going to adopt and how they're going to ship these AI applications into production. Overall, we see that rate as stable, so we think this is healthy, but we think we will see more growth from these enterprise customers as they actually get into production with their AI applications in the future.
Understood. Thank you for that, and congrats on the security milestone; I didn't want to leave that hanging. I don't know if we got commentary on it, but could we please get an update on Flex Logs? I know it was a shining star if I go back a quarter ago, but just how is progress tracking on the Flex Logs side of the house?
All of the big deals with enterprise customers now involve Flex Logs in some form, and that's a story that resonates very well, especially when we have customers that want to migrate from legacy solutions for logs. There are a number of things that we're working on with them, in particular making sure the migration is painless for them; there are a number of things that we're investing in on that side. Flex Logs is a big draw for them, as it really changes the picture economically and the predictability of the observability cost for them, which is a major concern for data-intensive parts of observability such as logs.
Thank you.
Thank you. Our next question comes from Jake Roberge of William Blair. Your line is open.
Thanks for taking questions. There's obviously been a lot of talk about AI natives around the business. I know you've talked about the potential for optimization for several quarters, but we continue to see really strong growth in that segment. If you were to see optimization, when would you expect that to happen? As you get a wider swath of customers in that AI native cohort, do you think you're at the place where you could actually digest an optimization by one or two of those customers? Well, if I knew when it
was going to happen, I would tell you. The nature of our customers is they grow, they have their own businesses to run, they have their own constraints. We have to help them deliver their services, and that's what we work on every single day. Every now and then there's a renegotiation, a renewal, an occasion for our customers to figure out what they need to optimize and what they need to do for the future. We never know whether it's going to happen this quarter, next quarter, in three quarters, next year, or never. That's really hard to tell.
Okay, that's helpful. Could you also talk about the uptake and feedback that you're getting for your own AI solutions like Bits AI, the new observability agents, and when you think those could really start layering into the model? The initial
response to the AI agents is really, really positive. The AI SRE actually works surprisingly well, if you think of how far the technology has come in just a couple of years. Right now, we're busy basically shipping it to as many customers as we can and enabling the customers with it. That's a big area of focus in the business as well. The actual product that we ship was developed by a fairly small team, and now we're busy scaling that up as fast as we can so we can serve those customers. That's the core focus of the business today. The initial response is very positive. We've had customers purchase and get value from it pretty quickly in their trial. We feel very good about it.
Very helpful. Thanks for taking the questions and congrats on the great results.
Thank you.
Our next question comes from Brent Thill of Jefferies. Your line is open.
Good morning, David. Just on the quota carrying rep capacity, and I know you've been investing aggressively ahead of the curve, but when you think about 2025, are you accelerating that count based on the great results you've seen? Are you digesting that count given those reps are on board? Just give us a sense and flavor of what that quota rep count looks like through the rest of the year and if you can shape how that looks versus 2024.
Yeah.
What we're doing is we're executing the plan we entered the year with. We knew, I think we said, that we had under-invested in go-to-market, and we looked at that with the white space, et cetera. And I would say we're successfully executing that. The plan was a little more front-weighted given our appetite for taking advantage of that opportunity, but we're executing that. We will look, towards the end of the year as we plan for next year, at the metrics around that and try to calibrate how we look at that growth next year.
Okay. And Olivier, I'm just curious, many CEOs are either holding headcounts flat or down. We've seen Meta headcount down from two years ago, Microsoft headcount flat, others like Palantir saying they're going to shrink headcount and 10x revenue. Do you believe you can become more efficient with fewer people, or do you think that model that you're seeing at other software companies doesn't apply to you?
I mean, look, the spend is definitely shifting a little bit on the engineering side. As I said, we consume more compute for AI training and AI inference. And so that's definitely changing a bit of the balance between what you have humans do and what you offload to GPUs. That being said, we're still completely constrained by the amount of product we can put out there. There's a ton of opportunity in every single direction we look, whether that's in automation, whether it's on the security side, whether that's in the new areas of observability or experimentation that we're going after. And so for us, there's very strong ROI in the adds that we're making at the moment. Great. Thanks.
Thank you.
And our next question comes from Andrew DeGasperi of BNP Paribas. Your line is open.
Thanks for taking my question. First, on the ramp up in terms of sales capacity, would you say that's been broad based in terms of the productivity across both international and domestic?
As we talked about previously, we have a less developed international footprint, and so our growth rate internationally is running higher. We have markets we've talked about before, like Brazil and India and parts of APJ and the Middle East, where we have opportunities to grow our footprint. So we are executing in that way. We're doing it bottoms up as always. We're looking at the accounts, we're looking at the TAM, and we're looking at how much we're covering it. So that produces, as a result, a little more investment intensity internationally versus in North America. But there are lots of opportunities in North America as well.
Thanks. That's helpful. And then on the enterprise side, given some of these reps are obviously on the ground, should we expect the attach rates, in terms of three or four or more products per customer, to accelerate from this level? I know they've been ticking up about a point every quarter. Just wondering if that's something we should be seeing.
Well, I think broadly we expect the trends that we've seen, of landing with some of the core products in the pillars and then expanding, to continue. As the platform has expanded, we've tended to land with more products, and those trends that we evidenced in the script we expect to continue across the geographies. And keep in mind,
when you're in the field, it's always easier to upsell an existing customer than to land a new customer. And a lot of the work we do in territory management and in comp planning for the sales team is really to make sure that there's enough of an incentive to go and look for new customers, so we keep driving a number of new customer additions as well. So there's always this balance between directing the sales force at upselling existing customers or landing new customers. Thank you.
Thank you. Okay. And our next question comes from Patrick Colville of Scotiabank. Your line is open.
All right. Thank you for squeezing me in. And I guess I just wanted to say, before I ask my question, congrats on the S&P 500 index inclusion. That's a really nice milestone for you guys. The question we get consistently from investors is on competition. You referred to competition kind of tangentially in other answers, but maybe more specifically, what are you seeing competitively in observability? And the one we get asked about a lot is versus Grafana and Chronosphere.
Yeah. I mean, look, there's always been competition in the field. As I like to say, when I first fundraised for Datadog, the word that was coming back to me every single time, with every single no I was getting from VCs, was crowded space. And so throughout the life of the company, there have been not only incumbents, which we've now mostly beaten in the market, but also a steady stream of new entrants that we also, year after year, have beaten in the market. There are always new companies, always new folks that are building new things in observability. I think it's very attractive for engineers to build that; I would know something about it. Generally speaking, the competitive landscape hasn't changed much in the past 10 to 15 years; it's about the same. The way we win, and will keep winning, is by offering an integrated platform that solves as many problems as possible for our customers end to end. So we don't just focus on one data store, one specific brick that our customers might want to use. We solve the whole problem for them end to end. And then in the long run, we win by being more innovative, by having an economic model that lets us invest more in R&D, develop more products, build the existing products into the future faster than anybody else can, and cover more adjacencies faster than anybody else can, so we can have the broader platform. So that's the reason we win. And if you look at all of the companies you mentioned, none of them are in a position to do the same. And so that's where we're going to end up in the end. And I think that's the end of the call; that was the last question. So just to close out, I want to thank our customers for working with us to bring all of those great new products to market. We had a lot on our plate this year; you've seen that at Dash. It was amazing, by the way, to see all these customers and meet with them at Dash and see the reception we got for all the new products. And so I want to thank them. And I know we're working with many of them on how these products are going to be adopted and what's going to happen in Q3 and Q4. So again, thank you, and I will see you next quarter.
This concludes today's conference call. Thank you for participating, and you may now disconnect.