2/4/2026

speaker
Rene Haas
Chief Executive Officer

around three business units: Edge AI, Physical AI, and Cloud AI. Edge AI comprises the smartphone and IoT businesses, Physical AI includes automotive and robotics, and Cloud AI encompasses data center and networking. A key driver of our royalty momentum is compute subsystems, or CSS. We launched CSS nearly two and a half years ago, and demand continues to exceed expectations. This quarter, we signed two additional CSS licenses for Agile tablets and smartphones, bringing us to 21 CSS licenses across 12 companies. Five customers are now shipping CSS-based chips, including two shipping a second-generation platform. And the top four Android smartphone vendors are shipping CSS-powered devices. CSS helps customers get to market faster by lowering integration risk and complexity. As demand scales, it increases the value that Arm delivers per chip, creating a significant tailwind to royalties.

In Cloud AI, the shift toward inference is reshaping data center design, and increasingly that inference is agent-based. These workloads are persistent, always on, and power constrained. This is a fundamental change in how AI systems operate, because agent-based AI requires coordination across many agents running continuously, and that coordination can only be done by the CPU. As this model scales, customers need CPU chips with higher core counts and better power efficiency to operate continuously within tight power and cost constraints. This trend directly benefits Arm. Arm-based CPU chips deliver industry-leading performance per watt, enabling customers to scale core counts and run always-on AI workloads. We are now seeing this trend play out in the market, where Neoverse CPUs have surpassed 1 billion cores deployed and Arm's share amongst the top hyperscalers is expected to reach 50%. Leading hyperscalers are launching new products with increased core counts to address this opportunity. AWS launched its fifth-generation Graviton processor with 192 cores, doubling the core count from Graviton4 and delivering 25% higher performance and up to 33% lower latency versus Graviton4. NVIDIA's next-generation Vera CPU features 88 Arm-based cores, up from 72 cores in the Grace CPU generation. Microsoft introduced Cobalt 200, built on the higher-performance Arm Neoverse CSS V3 with 132 cores, up from 128 cores in Cobalt 100, which was based on the prior Neoverse N2 platform. And Google previewed its second Arm-based server processor, with Axion-powered N4A instances delivering up to 2x better price performance and 80% better performance per watt than comparable x86 offerings. Google has now migrated over 30,000 applications to the Arm instruction set. We are also seeing more integrated platform designs to improve system efficiency, often translating to more AI output, or more tokens per watt, within the same power envelope. AWS integrates Graviton with Arm-based Nitro DPUs and training accelerators, and NVIDIA pairs GPUs with Arm-based Grace CPUs and Arm-based BlueField DPUs, which, in the transition to Vera, deliver a 6x increase in DPU compute capability over the prior generation. Together, these trends make it clear that as AI inference becomes more agent-based, the importance of CPUs is only increasing. And as a result, Arm's role at the center of the modern data center architecture continues to grow rapidly. Outside the data center, AI is now moving to everyday devices. The edge and physical AI markets are opening up new growth opportunities.
These systems operate in real time under strict power, safety, and reliability constraints, where efficient and predictable general-purpose compute is essential. Arm's strengths in power efficiency, predictable latency, and always-on operation are best suited to on-device agents that continually monitor inputs, characterize tasks, and invoke models when needed to preserve battery life. Our common software foundation across devices, vehicles, and robotics lets customers scale deployments without rebuilding software stacks. We now see that momentum in customer innovation. Rivian announced its third-generation autonomy computer based on the Arm-based Rivian autonomy processor, the first production vehicle based on a custom Arm chip and the first to deploy Armv9 in a production car. Tesla's upcoming Optimus humanoid robot is also powered by a custom Arm-based AI processor, and platforms from leading silicon providers, like NVIDIA's Jetson Thor and Qualcomm's Dragonwing platforms, are scaling Arm-based solutions across robotics and autonomous systems. To close, AI is moving to every environment and every power envelope. Arm provides the foundation for that shift: a platform that spans milliwatts to gigawatts, and a developer ecosystem of over 22 million developers, more than 80% of the global total. We are now seeing the results of strategies we put in place years ago, focusing on the data center, power efficiency, and compute subsystems. As a result, as more and more applications move to agentic AI, Arm will be the compute platform connecting cloud, edge, and physical AI use cases. And with that, I'll now hand it over to Jason.

speaker
Jason Child
Chief Financial Officer

Thank you, Rene. We have delivered another strong quarter. Total revenue grew 26% year-on-year to a record $1.24 billion, marking our fourth consecutive quarter above a billion dollars. Royalty revenue exceeded our expectations, growing 27% year-on-year to a record $737 million. The biggest growth contributors were smartphones, with higher royalty rates per chip, and the data center, where our revenues continue to grow triple digits year-on-year as we see ongoing share gains from custom hyperscaler chips. Royalty revenue from edge AI devices such as smartphones continues to grow much faster than the market. All the major Android OEMs are now ramping smartphones with chips based on both Armv9 and CSS. In Cloud AI, data center royalty revenue continues to double year-on-year with the ramp of Arm-based chips by all major hyperscaler companies. We are getting a further benefit as the build-out of these new AI data centers is driving increased deployment of networking chips, particularly DPUs and SmartNICs, where Arm has a very high market share. In Physical AI, the automotive market grew double digits year-on-year and contributed to our strong royalty performance. Overall royalty revenue growth continues to reflect Arm's increasing royalty per chip and rising market share.

Turning now to licensing. License and other revenue was $505 million, up 25% year-on-year. Growth was driven by strong demand for next-generation architectures and deeper strategic engagements with key customers. We signed two new Arm Total Access, or ATA, agreements during the quarter and two new CSS licenses, both with leading smartphone handset OEMs. These agreements reflect the continued investment by our customers in our next-generation Arm technology. Of the $505 million of license revenue, our agreement with SoftBank for technology, licensing, and design services contributed $200 million. SoftBank has become an increasingly important customer as they build out their AI compute strategy, including their recent acquisitions such as Ampere and Graphcore. We believe that the revenues we are receiving from SoftBank are durable, as they relate to current generations and will continue as SoftBank executes on its roadmap. As always, licensing revenue varies quarter to quarter due to the timing and size of high-value deals, so we will continue to focus on annualized contract value, or ACV, as a key indicator of the underlying licensing trend. ACV grew 28% year-on-year, maintaining strong momentum following the 28% year-on-year growth we reported in Q2 and Q1. This continues to be above our long-term expectation of mid- to high-single-digit growth for license revenue.

Turning to operating expenses and profits. Non-GAAP operating expenses were $716 million, up 37% year-on-year due to strong R&D investment. These investments in R&D reflect ongoing engineering headcount expansion to support customer demand for more Arm technology, including innovation in next-generation architectures, compute subsystems, and our exploration of chiplets and complete SoCs. Non-GAAP operating income was $505 million, up 14% year-on-year. This resulted in a non-GAAP operating margin of about 41%. Non-GAAP EPS was 43 cents, close to the high end of our guidance range, driven by both higher revenue and slightly lower OPEX than expected. Turning now to guidance. Our guidance reflects our current view of our end markets and our licensing pipeline. For Q4, we expect revenue of $1.47 billion, plus or minus $50 million.
At the midpoint, this represents revenue growth of about 18% year-on-year. We expect royalties to be up low teens year-on-year and licensing to be up high teens year-on-year. We expect our non-GAAP operating expense to be approximately $745 million and our non-GAAP EPS to be 58 cents, plus or minus 4 cents. The strength of customer demand we are seeing today, combined with a growing base of long-duration contracts at structurally higher royalty rates, provides increasing confidence in our future revenue profile. This confidence allows us today to invest in the next-generation architectures, compute subsystems, and silicon that are needed to enable higher performance, greater efficiency, and more AI use cases. We believe this virtuous cycle of customer demand and ambitious investment positions Arm for sustained growth over the long term. Just before we get into the Q&A portion of the call: as you will have seen, Arm is hosting an event on March 24th, and I'm sure there will be interest about what we are planning to announce. There will be a million ways of asking what we may or may not be announcing. Please be patient, as we won't be providing any details ahead of the event. With that, I'll turn the call back to the operator for the Q&A portion of the call.

speaker
Operator
Conference Operator

Thank you. To ask a question, you will need to press star 1 and 1 on your telephone and wait for your name to be announced. To withdraw your question, please press star 1 and 1 again. We will now take the first question. One moment, please. And your first question today comes from the line of Joe Quatrochi from Wells Fargo. Please go ahead.

speaker
Joe Quatrochi
Analyst, Wells Fargo Securities

Yeah, thanks for taking the question. Rene, you touched upon it in the prepared remarks, so I was kind of curious if you could just maybe give us a little more detail on just how you view Arm's role and the role of the CPU in AI and cloud data centers, and just how does that change as we start to see more proliferation of AI agents?

speaker
Rene Haas
Chief Executive Officer

Yeah, thank you for the question. There are a number of shifts taking place in the data center, as I mentioned in the opening remarks. First off, as the shift moves away from exclusively training to predominantly inference, that is a workload that launches a number of different solution paths. One of them that we're seeing is around agentic AI, and the agents that are actually talking to other agents, or having to control workflows such as service tickets or other workstreams, are very, very well suited for CPUs, because CPUs are very power efficient, always on, and very low latency. And what we are seeing is already an increased deployment of CPUs to address that problem. Now, it's not just CPUs that are good for that problem. It's the number of CPUs you have and, obviously, given the power constraints inside the data center, the efficiency of those CPUs. So for all those reasons, that's a very positive tailwind for Arm. And in particular, we're seeing those proof points now, as I mentioned, where the latest generation of CPU chips from the hyperscaler providers and also NVIDIA have increased the number of cores. And we think that only continues.

speaker
Joe Quatrochi
Analyst, Wells Fargo Securities

Thanks for that. And just as a follow-up, let me go to Jason. You know, I know you're not giving fiscal '27 commentary today, but how do we think about the puts and takes of this royalty revenue growth and the risks that are associated with the potential, like, demand destruction that we're seeing, you know, in consumer electronics, potentially from memory?

speaker
Jason Child
Chief Financial Officer

Yeah. Yeah, that's a great question and something we spend a lot of time looking at. So, in particular, you know, I think MediaTek last night talked about something like around a 15% reduction in unit volume for next year. And that's pretty consistent with what we've heard from other smartphone and handset providers around what they think the memory supply chain constraints could mean. So we've done our own kind of analysis of it. And what's interesting is we're hearing from our various partners that they're really trying to make sure that they protect the high end of the market, so the premium and flagship portion of the market, which is great for us because that's where all of our CSS and v9 royalties are, which are the highest by a significant margin. And then on the very bottom end of the segment, that's where most of the supply chain constraints will probably be felt. For us, that's v8 and even older generations, which carry dramatically smaller royalties. So I think if you were to say, what if there's a 20% reduction in volumes next year? For us, that would translate to probably somewhere around a 2% to 4%, at worst, impact on smartphone royalties. If you then project that across the whole business, it'd be a 1%, maybe 2%, negative impact on total royalties. The good news is that, as Rene mentioned, the cloud AI or infrastructure business has been continuing to grow ahead of our expectations. It's actually growing at a level that's more than compensating for those kinds of risks on the memory and mobile side. So I think we have a very good setup for next year and are not too concerned about, at least, the royalty revenue impact that we might see from these unit volume and supply chain constraints.
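
As a rough, illustrative sketch of the sensitivity Jason describes above: the 20% unit decline and the framing that the cuts fall on low-royalty, entry-level chips come from his remarks, while the smartphone share of royalties and the low-end mix below are assumed numbers for illustration only, not company disclosures.

# Illustrative sensitivity check only; mix assumptions are hypothetical.
total_royalties = 100.0      # index total royalty revenue to 100
smartphone_share = 0.425     # assumed: smartphones roughly 40-45% of royalties
low_end_share = 0.20         # assumed: share of smartphone royalty dollars from v8/entry-level chips
unit_decline = 0.20          # the hypothetical 20% unit reduction discussed above

smartphone_royalties = total_royalties * smartphone_share
low_end_royalties = smartphone_royalties * low_end_share

# Assume the unit decline lands almost entirely on the low end,
# where royalties per chip are smallest.
lost = low_end_royalties * unit_decline

print(f"smartphone royalty impact: {lost / smartphone_royalties:.1%}")  # ~4%, within the 2%-4% cited
print(f"total royalty impact:      {lost / total_royalties:.1%}")       # ~1.7%, within the 1%-2% cited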

speaker
Joe Quatrochi
Analyst, Wells Fargo Securities

Helpful. Thank you.

speaker
Jason Child
Chief Financial Officer

You bet.

speaker
Operator
Conference Operator

Thank you. Your next question today comes from the line of Simon Leopold from Raymond James. Please go ahead.

speaker
Simon Leopold
Analyst, Raymond James

Great. Thank you. Appreciate you taking the question. First one is, I'm hoping you're able to shed some light on this, but wondering what your thoughts are on whether or not SoftBank will potentially need to sell some of the Arm stock that it holds to finance some of the investments it has talked about making, and how we should think about the implications for your shares. Then I've got a quick follow-up.

speaker
Rene Haas
Chief Executive Officer

Sure. Yeah, thank you for the question. You know, that's one that we read a lot about, and there's a lot of speculation on chat boards and whatnot about that. I can tell you from talking to Masa about this, and I would quote him directly, he is not interested in selling one share of Arm stock. And that doesn't mean two shares or three shares. That means any shares. He's very long on the company. He's very, very bullish, as am I, about our long-term prospects. And he has no interest in selling. There's been a lot of writing about it. But I can tell you from a direct conversation, and direct conversations, plural, that I've had with him, that's just not the case.

speaker
Simon Leopold
Analyst, Raymond James

And then just as a follow-up, you've provided a forecast for deceleration in the royalty revenue growth. I'm just wondering if you could elaborate on the trend. Is it more difficult comps, or is there something else shifting that we should be considering?

speaker
Jason Child
Chief Financial Officer

Yeah, this is Jason. I'll take that. I would say the royalty trends for next year are pretty consistent in absolute dollars, maybe a little bit lighter just because of what you're now seeing on the memory shortage side. Like I said, maybe a 1% or 2% impact, largely due to that. The growth percentage is down a bit because of the overperformance that we saw last quarter and expect to see again this quarter. So we are, you know, coming off of a stronger comp. Now, you know, the obvious question then is: because you've had stronger growth in Q3 (we thought we'd grow about 20%, we grew 27%, so, you know, a $30 million beat or more), and now we're seeing some of that flow through into Q4, will that flow into next year as well? Right now, I'd say it's too hard to say. There's a lot of talk about memory and even wafer shortages, and that stuff doesn't affect us as much as many of the full fabless semiconductor companies. So I'd say right now, we'll give you updates as we learn more. But overall, the absolute magnitude of royalties for next year I expect to be pretty close to what we were thinking and what we said earlier this year. But, you know, we'll see if this recent strength continues and allows us to take things up as we proceed into next year.

speaker
Simon Leopold
Analyst, Raymond James

Very helpful. Thank you. You bet.

speaker
Operator
Conference Operator

Thank you. In the interest of time, please limit yourselves to one question only and rejoin the queue for any follow-up questions. We will now go to the next question. And your next question today comes from the line of Vivek Arya from Bank of America. Please go ahead.

speaker
Vivek Arya
Analyst, Bank of America

Thanks for taking my question. I actually just had two clarifications. One is I was hoping you could quantify the exact amount of data center revenue. I know you said that it doubled, but how much is it so we can get a sense for, right, what the magnitude is versus the overall company sales? And then the other clarification I had was, I think you mentioned SoftBank contributed $200 million. I somehow recall the original expectation was about $178, $180 million. And if you could clarify that, and what are you embedding for March and onwards from that contribution? Thank you.

speaker
Jason Child
Chief Financial Officer

Yeah, well, on the $178 million last quarter: no new deals were signed. It's just the deals from last quarter; it was $178 million for that quarter, and the full-quarter impact now is about $200 million. So nothing new, just a full-quarter impact. I would expect that $200 million is the right run rate going forward.

speaker
Vivek Arya
Analyst, Bank of America

And the data center revenue?

speaker
Jason Child
Chief Financial Officer

Yeah, data center revenue, we provide the details on that once a year. I think at the beginning of this year, we said it had just hit double digits. And because it's growing so much faster than the rest, assume it's going to be, you know, somewhere in the teens to probably getting closer to 20%. As Rene said, over the next two to three years, you should expect to see it get similar to, or maybe even larger than, the smartphone business, which is, you know, around 40% to 45% of total business. Thank you.

speaker
Operator
Conference Operator

Thank you. Your next question comes from the line of Mehdi Husseini from Susquehanna Financial. Please go ahead.

speaker
Mehdi Husseini
Analyst, Susquehanna Financial

Thank you. Thank you for the question. Just as a follow-up to the smartphone topic, how should I think about how the migration to v9, with its higher royalty, is going to help offset lower smartphone units?

speaker
Rene Haas
Chief Executive Officer

Yeah, so I'll let Jason provide the detail, but again, as a reminder of the way that we handle v9 for smartphones, particularly v9 CSSes: every smartphone cycle, we deliver a brand-new CSS. Each time we deliver the brand-new CSS, the royalty rates are generally increased year on year. So when we think about v9 in smartphones, the appropriate way to think about it is that it's all moving to CSS now, and as a result of that, we get repriced every year with a royalty increase year on year.

speaker
Jason Child
Chief Financial Officer

Yeah, and in terms of the guidance that I just gave, if there's a minus 20% unit impact, there's at most a 4% to 6% revenue impact specifically within smartphones. That would be incorporating the higher royalty rate per unit that's already been contractually agreed to and that we assume will be shipping later in the year. Okay, thank you.

speaker
Operator
Conference Operator

Thank you. Your next question comes from the line of Vijay Rakesh from Mizuho. Please go ahead.

speaker
Vijay Rakesh
Analyst, Mizuho Securities

Yeah, hi, Rene and Jason. Just a quick question on your partnerships. As your partner SoftBank executes on its AI roadmap, should we be expecting, like, an Arm custom ASIC down the road, given the substantial partnership that you have with them and the $200 million a quarter NRE that you're getting? How should we look at that, the timing, and how that will impact fiscal '27, let's say?

speaker
Rene Haas
Chief Executive Officer

Yeah, nothing we can say specific about any products that you're asking about. So, unfortunately, not much more we can say there.

speaker
Operator
Conference Operator

Got it. Thank you. Thank you. Your next question today comes from the line of Krish Sankar from TD Cowen. Please go ahead.

speaker
Krish Sankar
Analyst, TD Cowen

Hi, thanks for taking my question. Really, I just want to find out a little bit about how to think about Arm's IP penetration rate, or percentage rate, in AI data centers as it stands today, and where do you think that evolves over the next three to five years?

speaker
Rene Haas
Chief Executive Officer

That's a wonderful question. I think what we're going to see in the next three years is an evolution of how these data center chips are built out. And what do I mean by that? You know, today you've got a classic architecture where you've got a CPU which connects into an accelerator; the CPU does some work, the GPU does some work. I think what we're going to start to see over time is a morphing of the workloads, with the CPU taking on work that the GPU used to do. And as I mentioned, as you go to agentic inference, that's going to mean more CPUs, which could mean more different custom chips that are CPU-based. In addition, the inference workloads are dominated by two areas of work, specifically prefill and decode, and you could see some specific solutions around that that continue to extend. Things like what Groq has done, for example; you could still see more kind of innovation across that area. I also think, you know, you asked about the data center, but I think we're going to start to see a lot of that migrate to the smaller form factors, where different combinations of IP and solutions are going to be needed to address areas where power is much more constrained, particularly around physical AI and then the lower edge devices. So I think there's a lot of innovation to come in solving the AI problems, because one thing that's clear is that these AI workloads are going to be running on every single piece of hardware that has compute. And because the vast majority of the compute platforms out there today are already Arm-based, it gives us a gigantic opportunity to mold where that goes.

speaker
Krish Sankar
Analyst, TD Cowen

Got it. Thanks, Rene. Thank you.

speaker
Operator
Conference Operator

Your next question comes from the line of Harlan Sur from JP Morgan. Please go ahead.

speaker
Harlan Sur
Analyst, JP Morgan

Good afternoon. Thanks for taking my question. On compute subsystems, obviously, you continue to drive solid momentum with two more licenses added in the quarter. The value out of CSS that we hear from your customers is resonating extremely well, right? It improves their productivity. It improves their overall system performance. They're willing to pay a higher licensing fee and higher royalty fee for that value added you mentioned. I'm curious to know what percentage of the royalty mix is CSS today, and what proportion of the royalty revenue could it become over the next two to three years?

speaker
Rene Haas
Chief Executive Officer

Yeah, thank you, Harlan. I'll let Jason take that.

speaker
Jason Child
Chief Financial Officer

Yeah, so, Harlan, yeah, a lot of progress on CSS, with, you know, the five CSSes that have actually already been turned into silicon and are actually something we're receiving royalties on. It's had a material impact. Think of CSS last year; I think it was just kind of approaching double digits. And this year it's well into double digits; think of it as being into the teens. And then I would say, you know, over the next couple of years, I expect it could be upwards of 50%. But, you know, we'll have to see. I think, you know, the primary drivers for the acceleration of CSS have really been mostly around our customers needing to, you know, shorten the cycle time, and CSS typically cuts that cycle time about in half. And so, you know, stay tuned, but I would expect to continue to see that acceleration occur. And I think right now, every CSS customer that's had a chance to sign up for the next version, or kind of renew for the next generation, has done that. So that's certainly a really key indicator of the value that, as you said, customers are seeing from it.

speaker
Harlan Sur
Analyst, JP Morgan

Yeah, absolutely. Thank you.

speaker
Jason Child
Chief Financial Officer

Thank you.

speaker
Operator
Conference Operator

Thank you. Your next question comes from the line of Charles Shi from Needham & Company. Please go ahead.

speaker
Charles Shi
Analyst, Needham & Company

Yeah, thanks for taking my question. I think going back maybe one year, you guys kind of soft-guided FY '26 and FY '27 growth; it should be around 20%. You are definitely delivering that in FY '26, and we'll see how you think about FY '27 in about a quarter. But is there any early view you guys can provide on FY '28? I know I'm asking two years out here, but you guys did do that going back about a year, and I was hoping you could provide any early view into the outer year. Thank you.

speaker
Jason Child
Chief Financial Officer

Yeah, I would say for '26, as you said, we'd said at least 20%, and I think now we're guiding 22% at the midpoint. So, as you said, exceeding that target. For '27, we're not guiding on the full year, but in terms of, you know, kind of a high level, the 20% growth rate I think certainly is very reasonable and not anything that we'd back away from. In terms of '28, we haven't thrown anything out there yet. I'd say maybe stay tuned. You know, there are opportunities as we contemplate other possible offerings, and what that could do to our numbers is still something we're working through. So we'll give you an update on '28 sometime down the road.

speaker
Charles Shi
Analyst, Needham & Company

Thank you. I appreciate that.

speaker
Operator
Conference Operator

Thank you. Your next question today comes from the line of Swinney Paduri from RBC. Please go ahead.

speaker
Swinney Paduri
Analyst, RBC Capital Markets

Thank you. A couple of clarifications, guys, on the memory impact. I guess you talked about quantifying that impact, but, Jason, on the outlook for the next quarter, with royalties being up low teens, do you think memory is already having an impact on the smartphone side? Is that why it's only up low teens? And then, to add to that, you talked about CSS accelerating. I'm just curious, given the pressure on the bill of materials, do you anticipate, or are you seeing, any impact in terms of the adoption of CSS and v9, I guess, as you look into the next few quarters, given the bill of materials challenges? Thank you.

speaker
Rene Haas
Chief Executive Officer

Yeah, thanks for the question. I'll take the second part first, and then Jason will take the first part on memory. The question was regarding CSS pricing impacting the bill of materials. No, we're not seeing any of that at all. What we are seeing is that the value gained by accelerating time to market outweighs anything else that customers are considering, given the complexity of building these chips. The increased cycle times through the fabs, going from 5 nanometer to 3 nanometer to 2 nanometer, mean that the design windows are really short, and missing the first few months of shipment, or having any kind of delay, would be critical to profits. So, based on that, we've really not had many discussions with anyone regarding the BOM impact; the value that we create relative to the profits gained by the customer is what really drives the decision point. And then, regarding the memory impact on the next quarter, I'll let Jason address that.

speaker
Jason Child
Chief Financial Officer

Yeah, the memory impact is very minimal, I would say, and that's not really the driver of the guidance on the growth. The absolute growth in royalties has much more to do with seasonality; our Q4, or calendar Q1, is typically one of the slower quarters. And the one thing that happened a year ago is we did have a MediaTek chip come out in our Q4, calendar Q1, of a year ago, which was unusual timing, so we are lapping that. So it's much more about what we're comping and, to some extent, seasonality. But overall, full-year royalties I would expect to be in that north-of-20% range, which is kind of what we were expecting early in the year, and I still expect Q4, or calendar Q1, to be stronger than what we previously expected. So the year-on-year growth piece is really more about seasonality and comping against kind of an unusual one-time release from a year ago.

speaker
Swinney Paduri
Analyst, RBC Capital Markets

Thank you.

speaker
Operator
Conference Operator

We will now go to the next question. And your next question comes from the line of Andrew Gardner from Citi. Please go ahead.

speaker
Andrew Gardner
Analyst, Citi

Good afternoon. Thanks for taking my question as well. Jason, perhaps one for you on the OPEX side. You know, we've clearly seen significant investment in the business, particularly in R&D, given everything that you guys are doing. You've given us a bit of a steer on fiscal '27 revenue growth; clearly, R&D has been growing at a faster rate than revenue in the current period. Is that something we can expect to continue into fiscal '27, given everything that you guys have got in front of you, or will we actually start to see R&D growth slow relative to revenue? Thank you.

speaker
Jason Child
Chief Financial Officer

Sure. So, a little early to talk about the full year. I can tell you right now our expectation is that the Q4-to-Q1 step-up will be similar to last year. I think last year it was, you know, low double-digit sequential growth, and you should see the same kind of sequential growth as a year ago. Right now, I would say the growth after Q1 is probably going to moderate more so than it did this year. We did see pretty significant step-ups throughout the year; I don't expect there to be quite as significant step-ups next year. But as we progress more into next year, we'll give you a little more color. That's the high-level, I'd say, modeling approach I would take right now. Thank you.

speaker
Operator
Conference Operator

Thank you. We will now take the next question, and the question comes from the line of John DiFucci from Guggenheim Securities. Please go ahead.

speaker
John DiFucci
Analyst, Guggenheim Securities

Thank you. Rene, you've seen a lot in technology over the years, so I'm going to ask a question that's kind of a little bit self-serving here. I'm curious how you'd characterize what's happening in the stock market recently as it pertains to the software sector. And if you might, since you're at least partially a software company, how does AI affect your business other than driving demand? In other words, how should we think of how you'll leverage AI in the design of chips and systems? Thank you.

speaker
Rene Haas
Chief Executive Officer

Yeah, well, regarding the stock market's reaction to software companies, if I had a great answer for that, I'd probably be in a different position than the one that I have. I'm not sure I'm in a great position to discuss what the near-term impacts are to the stock market. But what I can say, after, you know, watching and being in technology my entire career, is that we do see these kinds of things from time to time, where investors or the market get jittery around what the broad impacts are when we're in the midst of fairly significant technology disruptions. I can say for our business, given the fact that we are an intellectual property provider that goes into physical things, chips, AI is not going to replace a physical chip anytime soon. They're kind of linked at the hip, if you will; you need the hardware to run the software. I think there's still enormous opportunity for growth in the overall sector, because when I think about where AI is actually operating truly inside the enterprise, when I think about our own company and things like our payroll systems or purchase order systems or our SAP systems, there's some AI going on there, but not nearly enough to be massively transformative yet. And I think part of that is just the complexity of integrating these large systems and changing software workloads. So I think we're in super early days, to be quite frank. Having been in technology my entire career and having seen lots of technology disruptions, this one feels a little bit like the final frontier in terms of the amount of productivity and change that AI can bring. And we're still all trying to get our arms around it. If you even just look at the numbers on spend, I heard earlier today Google, or Alphabet, announcing $180 billion of CapEx spend. That used to be what semiconductor companies would spend in a year on fabs, times a few. So we're in uncharted waters, and maybe that's why you're seeing some jitters relative to how the market reacts. But from where we sit, there's just huge demand for compute, and that's what Arm does. And so I think, in the long game, I'm super excited about the opportunity for us.

speaker
John DiFucci
Analyst, Guggenheim Securities

Really appreciate your thoughts, Rene. Thank you.

speaker
Operator
Conference Operator

Thank you. We will now take our final question for today. And the final question comes from the line of Timm Schulze-Melander from Rothschild & Co. Please go ahead.

speaker
Timm Schulze-Melander
Analyst, Rothschild & Co.

Yeah, hi there. Thanks for taking my question. It's a two-parter for Rene, please. You've talked a lot about inference in the AI future, and you just referenced the Groq architecture. I really wanted to ask what your thoughts are, or how we should think about, SRAM, SRAM at the edge, and some of these different memory structures, and what they could mean for your business. And then the second part is just on the cadence of power efficiency for Arm. Is there something that we should think about in terms of the average annual, or per-generation v8-to-v9, energy or compute efficiency gain that you see going forward? Thank you so much.

speaker
Rene Haas
Chief Executive Officer

I'll take the latter part first, because it kind of bridges into the first. We look at how to address power efficiency 24/7. And the reason for that is, increasingly, as you get into these smaller form factors, the one thing that you don't get much liberty on is battery life and space. So, as a result, we have to always think about operating in a constrained environment where you're adding more and more compute demand. When you add AI onto something that already has to drive a display, or open an app, or recognize a voice, it's a constant thing that we think about and worry about. I think we're very well positioned to address it because we are the incumbent in many of these platforms. So it is something we spend a lot of time and energy on. To the first part of your question, on SRAM and different memory technologies: absolutely, that's something we're highly involved in. To oversimplify a computer, a CPU needs memory and memory needs a CPU, period and stop. So when you're designing a piece of hardware, the two go very much hand in hand. And there is a lot of work and research being done on not just SRAM, but alternative memory technologies and solutions that can address these increasing demands of AI. So again, to the question prior to yours about the overall broad opportunity: what people in our space tend to worry about is whether there are hard problems to go think about, work on, and develop new technologies for. We don't have that problem. Every single end application is going to be impacted by AI, and we believe every end application will run AI through Arm. So we're spending a lot of time and energy, as you can see by our investments, to come up with innovative ways to address that. Great. Thank you so much.

speaker
Operator
Conference Operator

Thank you. I will now hand the call back to Rene for closing remarks.

speaker
Rene Haas
Chief Executive Officer

Yeah, thank you, and thank you for all the thoughtful questions. As you could tell by the range of the questions, we were talking about memory prices inside the quarter and then about what alternative memory technologies could look like years from now. I think that's a very good way to describe the current quarter, but also why we're very, very bullish about Arm long-term. We delivered the best quarter in our history, and we delivered the best quarter in our history on royalties, which is really an indicator of the strategies we have going forward. And we have a huge number of customers shifting to Arm in a big way, with higher CPU core counts. That being said, the quarters that we're most excited about are the ones ahead of us. We think we have a huge opportunity, as I mentioned, in the new areas of physical AI, cloud AI, and edge AI. And we intend to do everything we can to make Arm the compute platform of choice for all AI workloads. Thank you.

speaker
Operator
Conference Operator

Thank you. This concludes today's conference call. Thank you for participating. You may now disconnect.

Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
