7/31/2023

speaker
Operator
Operator

Welcome to the second quarter 2023 Arista Networks Financial Results Earnings Conference Call. During the call, all participants will be in a listen-only mode. After the presentation, we will conduct a question and answer session. Instructions will be provided at that time. If at any time during the conference you need to reach an operator, please press star followed by zero. As a reminder, this conference is being recorded and will be available for replay from the Investor Relations section at the Arista website following this call. Ms. Liz Stein, Arista's Director of Investor Relations, you may begin.

speaker
Liz Stein
Director of Investor Relations

Thank you, operator. Good afternoon, everyone, and thank you for joining us. With me on today's call are Jayshree Ullal, Arista Networks' President and Chief Executive Officer, and Ita Brennan, Arista's Chief Financial Officer. This afternoon, Arista Networks issued a press release announcing the results for its fiscal second quarter ended June 30, 2023. If you would like a copy of the release, you can access it online at our website. During the course of this conference call, Arista Networks' management will make forward-looking statements, including those relating to our financial outlook for the third quarter of the 2023 fiscal year, longer-term financial outlooks for 2023 and beyond, our total addressable market and strategy for addressing these market opportunities, including AI, customer demand trends, supply chain constraints, component costs, manufacturing output, inventory management, and inflationary pressures on our business, lead times, product innovation, working capital optimization, and the benefits of acquisitions, which are subject to the risks and uncertainties that we discuss in detail in our documents filed with the SEC, specifically in our most recent Form 10-Q and Form 10-K, and which could cause actual results to differ materially from those anticipated by these statements. These forward-looking statements apply as of today, and you should not rely on them as representing our views in the future. We undertake no obligation to update these statements after this call. Also, please note that certain financial measures we use on this call are expressed on a non-GAAP basis and have been adjusted to exclude certain charges. We have provided reconciliations of these non-GAAP financial measures to GAAP financial measures in our earnings press release. With that, I will turn the call over to Jayshree.

speaker
Jayshree Ullal
President and CEO

Thank you, Liz, and happy last day of July, everyone. We delivered revenues of $1.46 billion for the quarter with a non-GAAP earnings per share of $1.58. Services and software support renewals contributed approximately 15.2% of revenue. Our non-GAAP gross margin of 61.3% was influenced by improving supply chain overheads and higher enterprise contributions. We do expect gross margins to consistently improve every quarter this year and stabilize in 2024. International contribution registered at 21%, with the Americas at 79%. As we surpass 75 million cumulative cloud networking ports, we are experiencing three refresh cycles with our customers: 100 gigabit migration in the enterprise, 200 and 400 gigabit migration in the cloud, and 400 going to 800 gigabit for AI workloads. During the past couple of years, we have enjoyed a significant increase in cloud CapEx to support our Cloud Titan customers for their ever-growing needs, tech refresh, and expanded offerings. Each customer brings a different business and mix of AI networking and classic cloud networking for their compute and storage clusters. One specific Cloud Titan customer has signaled a slowdown in CapEx from previously elevated levels. Therefore, we expect near-term Cloud Titan demand to moderate, with spend favoring their AI investments. We do project, however, that we will grow in excess of 30% annually in 2023, versus our prior analyst day forecast of 25%. The AI opportunity is exciting. As our largest cloud customers review their classic cloud and AI networking plans, Arista is adapting to these changes, thereby doubling down on our investments in AI. Arista is a proud founding member of the Ultra Ethernet Consortium, which is on a mission to build open, multi-vendor AI networking at scale based on proven Ethernet and IP. There are a lot of software and EOS considerations for AI. AI traffic and performance demands are different, as the traffic comprises a small number of synchronized, high-bandwidth flows, making them prone to collisions that slow down the job completion time of AI clusters. As we connect thousands of GPUs generating billions of parameters for petascale clusters, Arista's EOS capabilities must also scale along with our AI spine and leaf platforms to achieve that consistent performance and throughput. Arista has been developing EOS features such as intelligent load balancing and advanced analyzers to report and rebalance flows to achieve predictable performance. Customers can now pick and choose programmable packet header fields for better entropy and efficient load balancing of their AI workloads. Network visibility is also important in the training phase for large data sets to improve the accuracy of large language models. Arista's new AI analyzer monitors and reports traffic counters at microsecond-level windows to detect and address microbursts. Our AI strategy and platforms are resonating well with our early customers. Presently, in 2023, we are in the middle of trials for back-end AI networks, leading to pilots in 2024. We expect larger clusters and production deployments in 2025 and beyond. In the decade ahead, AI networking will become an extension of cloud networking to form a cohesive and seamless front-end and back-end network. In the non-cloud enterprise category, we continue to experience good momentum in both data center and campus. Let me illustrate with a few customer wins. The first is a new international transportation win where the customer was seeking to modernize their legacy campus.
Their endpoints included large and small campus locations, internal communication devices, various IoT devices, CCTV, display boards, and much more. The customer mandated a fully automated workflow. Arista presented a highly optimized, best-in-class cognitive campus. With Arista's single binary EOS image across all campus platforms, complete with a universal API and built-in automation features, the customer was set on a path to continued campus modernization. The next enterprise win involves both data center and campus with advanced EVPN L3 VPN over VXLAN routing architectures instead of the traditional Layer 2 extension. Distributed AVA sensors were strategically positioned within the network to capture and analyze traffic at critical points. This zero trust approach emphasizes threat mitigation throughout the network as opposed to relying solely on siloed security. The integration of real-time streaming telemetry and visibility capabilities proved to be paramount in gaining operational acceptance. The final win was in a large public sector organization, connecting redundant data centers to hundreds of campus locations with a large routing environment. They were challenged with complex MPLS routing that was hard to operate across the WAN and campus network. An upgrade of any magnitude implied several million dollars and impactful change controls touching all of their sites. Arista demonstrated that the customer could use a single platform for both LAN and WAN to dramatically simplify and automate the whole environment within 30 days. This 80% reduction in total cost of ownership was made possible with Arista's modern cloud operating model. You can see a recurring theme here across all these customer wins, which is the power of our platform innovation, quality, and support with a low TCO and a single CloudVision and EOS software stack. Arista is diversifying its business to transform the enterprise to a modern network operating model. Before I hand over to Ita, I would like to share with you that Ita is planning to retire sometime next year in 2024. She has had a stellar career at Arista as our chief financial officer. Ita has been our business partner and friend for the past eight years. She has displayed the Arista way, always prioritizing our customers, employees, and shareholders. Ita has demonstrated and delivered both growth and profitability with a very, very small G&A investment, often only 1.5% of revenues. Such pristine financials are so rare in a fast-growing tech company and only possible with a shared vision between the CFO and CEO. Ita, thank you for your steady leadership and contribution. Undoubtedly, we will miss you next year when you retire. Over to you for financial metrics.

speaker
Ita Brennan
Chief Financial Officer

Thanks, Jayshree. That's very kind. It's been an amazing experience working with you and the whole Arista team over the last eight years. Now back to the numbers. This analysis of our Q2 results and our guidance for Q3 is based on non-GAAP and excludes all non-cash stock-based compensation impacts, certain acquisition-related charges, and other non-recurring items. A full reconciliation of our selected GAAP to non-GAAP results is provided in our earnings release. Total revenues in Q2 were $1.46 billion, up 38.7% year-over-year, and well above the upper end of our guidance of $1.35 to $1.4 billion. Services and subscription software contributed approximately 15.2% of revenue in the quarter, up from 14.9% in Q1. International revenues for the quarter came in at $304.4 million, or 20.9% of total revenue, up from 17.5% last quarter. This quarter-over-quarter increase largely reflected a healthy contribution from our enterprise customers in EMEA and APAC, and some reduction in domestic shipments to our Cloud Titan customers, which were unusually robust in the prior quarter. Overall gross margin in Q2 was 61.3%, in line with our guidance of approximately 61%, and up from 60.3% last quarter. We continue to see incremental improvements in gross margin quarter over quarter, with higher enterprise shipments and better supply chain costs, somewhat offset by the need for some additional inventory reserves as customers refine their forecasted product mix. Operating expenses for the quarter were $287.3 million, or 19.7% of revenue, up from last quarter at $257.5 million. R&D spending came in at $188.5 million, or 12.9% of revenue, up from $164.8 million last quarter. This primarily reflected increased headcount and higher new product introduction costs in the period. Sales and marketing expense was $79.6 million, or 5.5% of revenue, compared to $75.9 million last quarter, with increased headcount and product demo costs. Our G&A costs came in at $19.1 million, or 1.3% of revenue, consistent with last quarter. Our operating income for the quarter was approximately $607 million, or 41.6% of revenue. Other income and expense for the quarter was a favorable $31.6 million, and our effective tax rate was 21.4%. This resulted in net income for the quarter of $501.2 million, or 34.4% of revenue. Our diluted share number was 316.5 million shares, resulting in a diluted earnings per share number for the quarter of $1.58, up 46% from the prior year. Now turning to the balance sheet. Cash, cash equivalents, and investments ended the quarter at approximately $3.7 billion. In the quarter, we repurchased $30 million of our common stock at an average price of $137.20 per share. We've now repurchased $855.5 million, or 8 million shares, at an average price of $107 per share under our current $1 billion board authorization. This leaves $145 million available for repurchase in future quarters. The actual timing and amount of future repurchases will be dependent on market and business conditions, stock price, and other factors. Now turning to operating cash performance for the second quarter. We generated approximately $434.1 million of cash from operations in the period, reflecting strong earnings performance, partially offset by ongoing investments in working capital. DSOs came in at 49 days, down from 57 days in Q1, reflecting a strong collections quarter with good linearity of billings. Inventory turns were 1.2 times, down from 1.3 last quarter.
Inventory increased to $1.9 billion in the quarter, up from $1.7 billion in the prior period, reflecting the receipt of components from our purchase commitments and an increase in switch-related finished goods. Our purchase commitments at the end of the quarter were $2.2 billion, down from $2.9 billion at the end of Q1. We expect this number to continue to decline in future quarters as component lead times improve and we work to optimize our supply positions. Our total deferred revenue balance was $1.085 billion, down from $1.092 billion in Q1. The majority of the deferred revenue balance is services-related and directly linked to the timing and term of service contracts, which can vary on a quarter-by-quarter basis. Our product deferred revenue balance declined approximately $33 million from last quarter. Accounts payable days were 57 days, up from 55 days in Q1, reflecting the timing of inventory receipts and payments. Capital expenditures for the quarter were $11.6 million. Now turning to our outlook for the third quarter and beyond. To recap, global supply chain disruptions over the last couple of years necessitated elongated planning horizons and customer demand signals. The corollary is also true: improving lead times are now driving shorter planning horizons and demand signals, delaying when customers need to place new orders. This is particularly true of our Cloud Titan customers, who, following a year of elevated purchases, must now grapple with changing technology roadmaps and priorities before providing visibility to future demand later in the year. On the supply side, we expect to continue to ship against previously committed deployment plans for some time, targeting supply improvements where most needed, but also being careful not to create redundant customer inventory. In spite of the return to shorter lead times and reduced visibility, we are executing well, with gradual incremental improvements to our 2023 outlook, which now calls for year-over-year growth in excess of 30%. On the gross margin front, we expect continued progress through the end of the year, reflecting supply chain and manufacturing benefits while maintaining a reasonably healthy cloud contribution. Now turning to spending and investments, we continue to monitor the overall macro environment carefully and will prioritize our investments as we move through the year. This will include a focus on targeted hires in R&D and go-to-market as the team sees the opportunity to add talent. On the cash front, while we'll continue to focus on supply chain and working capital optimization, you should expect some continued growth in inventory through the end of the year. Also, as a reminder, our 2023 tax payments have been deferred to October and will represent a significant use of cash in that quarter. With all of this as a backdrop, our guidance for the third quarter is based on non-GAAP results, excludes any non-cash stock-based compensation impacts and other non-recurring items, and is as follows: revenues of approximately $1.45 to $1.5 billion, gross margin of approximately 62%, and operating margin at approximately 41%. Our effective tax rate is expected to be approximately 21.5%, with diluted shares of approximately 318 million shares. I will now turn the call back to Liz. Liz?

speaker
Liz Stein
Director of Investor Relations

Thank you, Ida. We will now move to the Q&A portion of the Arista earnings call. To allow for greater participation, I'd like to request that everyone please limit themselves to a single question. Thank you for your understanding. Operator, take it away.

speaker
Operator
Operator

Thank you. We will now begin the Q&A portion of the Arista earnings call. In order to ask a question during this time, simply press star, then the number one on your telephone keypad. If you would like to withdraw your question, it is star one again. We ask that you pick up your handset before asking questions in order to ensure optimal sound quality. Your first question comes from the line of Mike Ng with Goldman Sachs.

speaker
Mike Ng
Analyst at Goldman Sachs

Hey, this is Mike Ng from Goldman Sachs. Thanks for the question. I was just wondering if you could talk a little bit more about the outlook for an excess of 30% year-over-year growth this year on revenue. What's gotten better relative to last earnings call? If you could talk about it in the context of cloud titans versus enterprise, that'd be helpful. I'm just trying to reconcile the revenue upgrade versus the commentary about Near-term cloud Titan growth moderating.

speaker
Jayshree Ullal
President and CEO

Yeah, thanks, Mike. I think it's pretty clear that quarter over quarter our enterprise momentum continues to get stronger and better, and our cloud is strong. However, it's got two components now: there's the classic cloud networking and then the AI. And so we're reconciling how we double down more on AI, which we are feeling stronger and stronger about. And even on the cloud, you know that the last two years have just been out of this world and phenomenal. So while it's moderating, it's still pretty good.

speaker
Operator
Operator

Thanks, Mike.

speaker
Moderator
Moderator

Thanks, Jayshree.

speaker
Operator
Operator

We'll take our next question from Tim Long with Barclays.

speaker
Tim Long
Analyst at Barclays

Thank you. Jayshree, I was hoping you could dig more into some of the comments around AI. It sounds like there's a large pipeline there and you talked about kind of the stages with 2025 being the big growth area. I'm curious if you can just talk a little bit about a few things related to that. One, do you see that the move to AI expanding or diversifying more your Cloud Titan or your Cloud customers? And second, Can you talk about kind of the next year or two, the entire, you know, the InfiniBand versus Ethernet debate? I think you guys have been trialing some Ethernet inside clusters. Can you just give us an update on how you think that, you know, competition between those two technologies, how you think that's going to play out? Thank you.

speaker
Jayshree Ullal
President and CEO

Okay, thank you, Tim. Maybe my answer will be shorter than your question. But I think the gist of what I'd like to first of all say is, the majority of Arista's participation has been in the front end of the network, right? And we're getting a chance for the first time ever to play in the back end. So when we think AI, there's clearly some ramification of bandwidth on the front end of the network, but we're not counting that. So we're truly thinking of something that's incremental, brand new, with a lot of work to do in testing, proving, pilots, and trials before we get into production. Today, I would say in the back end of the network, there are basically three classes of networks. One is very, very small networks that are within a server, where customers use PCIe, CXL, and proprietary NVIDIA-specific technologies like NVLink, in which Arista does not participate. Then there are more medium clusters, think generative AI, more for inference, which may well get built on Ethernet. For the extremely large clusters with large language training models, especially with the advent of ChatGPT 3 and 4, you're talking about not just billions of parameters, but an aggregate of trillions of parameters. And this is where Ethernet will shine. But today, the only technology that is available to customers is InfiniBand. So obviously, InfiniBand, with 10, 15 years of familiarity in an HPC environment, is often being bundled with the GPUs. But the right long-term technology is Ethernet, which is why I'm so proud of what the Ultra Ethernet Consortium and a number of vendors are doing to make that happen. So near term, there's going to be a lot of InfiniBand, and Arista will be watching that outside in. But longer term, Arista will be participating in an Ethernet AI network. And, you know, neither technology, I want to say, was perfectly designed for AI. InfiniBand was more focused on HPC, and Ethernet was more focused on general purpose networking. So I think the work we are doing with the UEC to improve Ethernet for AI is very important.

speaker
Moderator
Moderator

Okay. Thank you. That's very helpful.

speaker
Operator
Operator

We'll take our next question from Nita Marshall with Morgan Stanley.

speaker
Meta Marshall
Analyst at Morgan Stanley

Great. Thanks. I mean, just revisiting kind of the Cloud Titan commentary, you know, does the change that you're seeing mean that they're completing kind of one upgrade cycle and there might just be time before the next upgrade cycle, or are there real changes to kind of current deployment plans or kind of deployments of the current upgrades? Thanks.

speaker
Operator
Operator

Sorry. Meta, were you addressing the question to Ita?

speaker
Meta Marshall
Analyst at Morgan Stanley

I guess I was addressing it to whoever wants to answer, about the commentary on the changes in the Cloud Titan orders.

speaker
Anshul Sadana
Chief Operating Officer

Go for it, Anshul. Sure. Meta, you know, when you look at the cloud customers in the last few quarters, especially since the advent of ChatGPT, there's been a rotation toward AI. It's not that they're done with the upgrades or that there's a pause on the upgrades, but they had to reprioritize their business and their deployments for AI. You've seen the competitive battle between the largest of the largest titans in the world trying to get ahead. But we see signs of that continuing. And in the future, we believe they'll be back to adding and refreshing the standard compute infrastructure as well.

speaker
Jayshree Ullal
President and CEO

You know, I always like to say, you can only do this for so long, eventually you have to eat. So I think we will see a nice mix of AI and classic cloud networking over time.

speaker
Ita Brennan
Chief Financial Officer

Yeah, and I think that the lead time improvements have kind of facilitated them waiting for a little bit longer than what we've gotten used to over the last couple of years. But I think, again, that's kind of, we're going to start coming within lead time here pretty soon, then we'll see. Yeah.

speaker
Moderator
Moderator

Thank you.

speaker
Operator
Operator

Thanks, Meta. We'll take our next question from Ben Bolin with Cleveland Research.

speaker
Ben Bolin
Analyst at Cleveland Research

Good evening, everyone. Thanks for taking the question. Ita, congrats. I had a question for you. I was hoping you could speak to where you see lead times presently, and you talked about taking a little bit more of a managed approach to inventory levels at customers. Could you talk about some strategies that you employ to manage that, and where you think inventory levels are within those accounts? Thank you.

speaker
Ita Brennan
Chief Financial Officer

Yeah, I think, look, the lead times are, you know, mixed across products. I mean, our goal certainly is to try to get back to like a six-month lead time here, you know, maybe the end of the year, certainly early next year. But it is currently mixed across products. The commentary around customer inventory and stuff, we've been very diligent all the way through this process, the supply chain process, trying to make sure we understood demand when it showed up and that it was being put into reasonable deployment schedules and deployment plans. And we just want to continue to do that as we come through the other side of really that whole supply chain disruption. So it's really more understanding kind of what customers need, when they need it, And again, being able to prioritize and make sure that we understand that. So it's really a continuation of what we were doing, honestly, on the other side of the supply chain when you have these, you know, when you have this kind of accelerated demand. And then we were very focused on deployment schedules and timing.

speaker
Ita

And this is just the other side of that. Again, making sure we understand what's happening.

speaker
Moderator
Moderator

Thank you.

speaker
Operator
Operator

We'll take our next question from Antoine Chabon with New Street Research.

speaker
Antoine Chabon
Analyst at New Street Research

Hi, thanks a lot for taking my question. This is maybe a bit of a longer-term question, but can you please provide an update on the opportunity at hyperscalers beyond your two largest customers? Does the accelerated deployment of AI clusters potentially open the door to business with the other two hyperscalers as the complexity of the networks is increasing rapidly?

speaker
Anshul Sadana
Chief Operating Officer

So, you know, this gets asked very often, and we continue to do well with them. As I mentioned before, not all Titans are the same in terms of size. Some are small, and we do very well with them, but they're just not as big as our two largest customers. And others who have the potential, we're still doing very well technologically with them, but we haven't seen the opportunity materialize. It's not that we're losing to anybody. It's just that nothing has changed. And we continue to invest with them, and we believe the opportunity is still ahead of us.

speaker
Jayshree Ullal
President and CEO

Exactly, Anshul. I think the way to look at our AI opportunity is it's 10 years ahead of us. And we'll have early customers in the cloud with very large data sets trialing our Ethernet now. And then we will have more cloud customers, not only Titans, but other high-end tier two cloud providers and enterprises with large data sets that would also trial us over time. In 2025, I expect to have a large list of customers, of which Cloud Titans will still end up being some of the biggest, but not the only ones.

speaker
Operator
Operator

Thanks, Antoine. Our next question comes from Amit Daryanani with Evercore.

speaker
Amit Daryanani
Analyst at Evercore

Thanks, and congrats on a nice quarter here. You know, I guess my question is really, you know, there's been a fair bit of debate among investors on what does calendar 24 look like for Arista? And the fear, I think, always is it could look like calendar 20, when you had some cloud digestion. I realize it's really early for you to guide 24, but if you could just sort of think about the puts and takes into next year, that would be helpful. And maybe, Jayshree, you could talk about how you think Arista is different today versus in the calendar 19, calendar 20 timeframe? That would be helpful.

speaker
Jayshree Ullal
President and CEO

Yeah, no, that's a really good question. Stay tuned for our 2024 guide when we have our annual analyst day sometime in November. But qualitatively speaking, we're a very different company today than three years ago. Clearly, we've doubled down on our cloud titans, and you know that they're getting stronger and stronger. But even in the cloud titans, you know, Anshul and the team have worked to have a number of use cases. It isn't just one. And the addition of AI to those use cases just gave us a whole lot of broad opportunity from front end to back end, right? So to me, the holistic and seamless cohesion between the front end and back end will get even more important as time goes on with the cloud titans. We also see that we're stronger in tier two providers, and of course the broader enterprise. Both of these were not as strong for us three years ago, and they also represent AI opportunities, but as you know, they represent campus, routing, and classic data center opportunities and allow us to go target a much larger TAM. Again, three years ago, it was probably $30 billion. Three years later, it's well north of $50 billion. So I feel we are much more diversified, and while we deeply appreciate M&M, we've got a lot more candy beyond that.

speaker
Moderator
Moderator

Thank you, Amit. Perfect. Thank you.

speaker
Operator
Operator

We'll take our next question from Tal Liani with Bank of America.

speaker
Tal Liani

Hi, Ita. I have to ask you a tough one before you go, so you have a good taste for the next few years. How much of the growth this time is coming from backlog drawdown? Can you give us some information about the order trends rather than revenues? And the reason why I'm asking is because your guidance for 3Q is 25% growth. When I look at 4Q, the implied growth is 11%. So there is a sharp deceleration in growth in 4Q, and I'm wondering if it's a function of backlog, and of drawing down an elevated backlog. Thanks.

speaker
Ita Brennan
Chief Financial Officer

I mean, look, we haven't talked about backlog and orders. I think we've talked more just in terms of deployments and deployment slots. And if you think back to my commentary, I mean, we do believe that there are, you know, ongoing deployments that will go well into 2024, right? So again, I don't necessarily sign up to the terminology of the backlog and the drawdown, et cetera, because given the patterns of the orders, it's very difficult to talk in that language, right? But in terms of deployments, you will have deployments that are already planned and scheduled into 2024. I think, you know, we're taking it quarter by quarter through the end of the year, but I'd still go back to my kind of incremental view: look at it kind of incrementally quarter over quarter and, you know, continue to show some improvement. We've guided Q3, so for Q4, you know, take some similar kind of incremental improvement into Q4. And I think that's the way to think about it for now. But again, our commentary on kind of demand and lead times stands, right? I mean, as lead times shorten, you will see some period of time where customers don't need to place orders until you get back into lead time. And that dynamic is certainly there. And as we get closer to the end of the year, we'll get more visibility into next year.

speaker
Jayshree Ullal
President and CEO

I know you asked a difficult question. Look, we'll know more as time goes on. And we think the business is strong, and whether it comes in strongly in 24 or 25 or somewhere in between, we'll see. And the reality is it'll be difficult to repeat the last two years of exceptional cloud CapEx for cloud networking. So as they go through that deployment, and as they look at AI, and as we bring in the enterprise and tier two cloud, we've got a nice mix of things. And I urge everyone to think of our business, as Ita has always alluded to, not in one quarter or even one year, but really as a three-year CAGR. And I think our three-year CAGR will continue to be in double digits and good numbers.

speaker
Moderator
Moderator

Great. Thank you.

speaker
Operator
Operator

Thanks, Tal. We'll take our next question from Sebastian Najim with William Blair.

speaker
Sebastian Najim
Analyst at William Blair

Great. Thanks for taking the question. Can you maybe just update us on the visibility in your customer base? Is it still around six months or are we now down closer to three months? Maybe just longer term, do you think that generative AI could help improve that visibility from where it's historically been, just given that many of these hyperscalers have what seems like decent visibility into a pretty robust pipeline over the next few years?

speaker
Jayshree Ullal
President and CEO

Yeah, that's a really good question. You know, since we have so many products in the mix, I have to break your question into visibility across multiple areas. Enterprise, I would say, is six to 12 months, generally speaking. In the cloud, given the reduced lead times on classic cloud networking, it's less than six months. However, on AI, it is greater since it's an early cycle and we have to do a lot more joint development. So you can think of it as, you know, three migrations going on with different visibility patterns.

speaker
Operator
Operator

Great, thank you. We'll take our next question from Samik Chatterjee with JP Morgan.

speaker
Samik Chatterjee
Analyst at JP Morgan

Hi, thanks for taking my question. Maybe if I can shift gears here a bit to enterprise, Jayshree. Obviously you're talking about the slowdown on the cloud side here a bit going into 2024, but when you look at enterprise, how do you think about sustaining the growth rate, or a slowdown in that growth rate, into 2024? What are you seeing in terms of orders on that front to sort of give you visibility into 2024? Thank you.

speaker
Jayshree Ullal
President and CEO

Look, I think, Samik, this is an area that we feel pretty good about, and it's an area of great execution from Anshul, Chris Schmidt, Ashwin, and the entire team, where we have really diversified our business globally in the enterprise. We're not just in the high-end financials. We're in just about every major vertical: healthcare, transportation, public sector, education, banks, insurance. So I feel good about enterprise, barring any macro issue, which is the thing we were always worried about for 2024. So if macro doesn't let us down and we don't have to worry about the economy, we will have a strong year in enterprise. Thank you.

speaker
Operator
Operator

We'll take our next question from Aaron Rakers with Wells Fargo.

speaker
Aaron Rakers

Yeah, thank you for taking the question. I guess I wanted to ask just on product cycle cadence. You know, there's a lot of focus from one of your key component suppliers on the merchant silicon side around 51.2 terabit silicon and obviously supporting the 800 gig cycle. I'm curious, how do you think about the timing of that? When do we start to see the materializing deployments of 800 gig? And, you know, maybe that's tied to AI, maybe it's not, but just curious to when that cycle you believe really starts to kick in.

speaker
Anshul Sadana
Chief Operating Officer

Aaron, you know, we had the same discussion when the world went to 400 gig, about switching from 100 to 400. The reality was the customers continued to buy both 100 and 400 for different use cases. 51.2T and 800 gig especially are being pulled by AI clusters; the AI teams are very anxious to get their hands on it, move the data as quickly as possible, and reduce their job completion times. So you'll see early traction there. You'll see, as Jayshree mentioned, trials really in 24 going into volume in 25. And that should be the ramp we'll follow for 800 gig. But that does not mean everything they just bought in the last few years at 400 gig for DCI or the spines and so on for classic clusters is going to get upgraded to 800 gig. I think that's going to be a longer cycle. So you will see 100, 200, 400, and 800 get deployed in parallel as we enter that cycle in 24, 25. Thank you.

speaker
Operator
Operator

We'll take our next question from Matthew Niknam with Deutsche Bank.

speaker
Matthew Niknam
Analyst at Deutsche Bank

Hey, thanks for taking the question. I'm just wondering, on the supply chain, if you could talk about how that's evolved over the last quarter. And as it relates to gross margins, I think you're messaging incremental improvements in 3Q and 4Q. Is that purely a function of easing supply chain or is there also maybe greater relative contribution from enterprise relative to Cloud Titans envisioned in the second half of the year as well? Thanks.

speaker
Ita Brennan
Chief Financial Officer

I mean, we're definitely seeing, you know, improvement on the supply chain side. We're seeing improvements with freight, improvements with, you know, just some of the expedite costs and the things that we were dealing with that were kind of inventoried and now we're releasing them. So I think we're coming out from underneath that. You know, there is some small shift in mix as well, but it's still a good, strong cloud mix this quarter and this year. So it's not like we're back to, you know, a heavy enterprise mix with cloud playing a much smaller part. There's still a very healthy kind of cloud mix in this year. So it's more that we're working back out the supply chain costs that we'd incurred in the past.

speaker
Jayshree Ullal
President and CEO

Yeah, no, I want to give a shout out to Mark Berlhardt, our new Senior VP of Manufacturing, and John McCool. They have done a fantastic job of optimizing the supply chain, so those improvements are really playing a role in our quarter-to-quarter gross margins.

speaker
Moderator
Moderator

Thank you. Thank you.

speaker
Operator
Operator

We'll take our next question from James Fish with Piper Sandler.

speaker
James Fish
Analyst at Piper Sandler

Hey, thanks for the question. I just wanted to follow up around some of the prior questions asked, as many might have been asked already. I know you guys aren't talking about visibility and don't discuss backlog, but is it still fair to assume that we should think about you guys returning to a normal environment from a supply perspective in the early part of next year? And I believe, Ita, you've talked about an underlying assumption that hyperscalers, or your cloud titans, grow double digits for this year. Is it still fair to think about that kind of level for 2023?

speaker
Ita

Yes, I think absolutely, right?

speaker
Ita Brennan
Chief Financial Officer

And I think that, you know, we kind of forget that cloud is still an important part of 2023, right? We're still executing on deployments and planning that we did some time back, right, all the way through this year. So cloud is still a significant piece of the business in 2023.

speaker
Jayshree Ullal
President and CEO

Yes, and James, just to confirm, we expect a more normal setting in 2024 in terms of lead time.

speaker
Moderator
Moderator

You're right to assume that.

speaker
Operator
Operator

Thanks, James. We'll take our next question from Simon Leopold with Raymond James.

speaker
Simon Leopold
Analyst at Raymond James

Thanks for taking the question. I wanted to see if you could maybe do a little bit of unpacking in terms of what's driving your enterprise business, in that I think the conventional wisdom is that enterprises are challenged by recessionary forces on the cyclical side, and then the secular challenge around public cloud adoption means slowing demand. So what do you see happening? How much of this success is related to market share gains? How much to general cycles, products, et cetera? If we could unpack the enterprise traction. Thank you.

speaker
Jayshree Ullal
President and CEO

Sure, Simon. Well, of course we have market share gains. That is the result of our enterprise traction, I would say. But if you ask me, why are we winning in the enterprise? I would say, number one, from an alternative perspective, our customers haven't had one for a very long time. They haven't had a high-quality, high-support, friendly software experience, a common spine architecture across their data center, campus, and routing, in a long, long time. So I think the architectural shift in the enterprise to move to a modern cloud operating model is the number one reason that Arista has been chosen. They are seeking our architecture for that quality of experience. In fact, Anshul and I were just talking about this. You know, we use the word cloudify a lot, and it's quite clear right now that our high-end enterprises are really looking for cloud principles, but on their own premises. In terms of the shift between, you know, workloads on the cloud and workloads in the enterprise, it depends on the customer. You still see some of the mid-market customers wanting to move their e-commerce workloads to the cloud, but a lot of their mission-critical applications stay on premises. So a hybrid strategy continues to dominate the enterprise decisions for the data center. Secondly, our entry into the campus and routing, as well as zero trust security, observability, et cetera, is adding more layers to the cake. So our product depth and breadth is getting better and better. So the cloud operating model, the product depth, and now actually we've been at it for, what do you say, Anshul, three to five years maybe, especially in the United States; we've got more work to do internationally. I would say we've been engaging with these customers. I remember when Ita and I had a discussion, I want to say five years ago, where she was right and I was wrong, and she persuaded me to invest more in the enterprise. So I think all these things have gone into really making us who we are in the enterprise, and today we are a gold standard, and we have a seat at the table there.

speaker
Operator
Operator

Thank you. Thank you, Simon. We'll take our next question from David Vogt with UBS.

speaker
David Vogt
Analyst at UBS

Great. Thanks, guys, for taking the question, and congratulations, Ita. So I just want to go back to the point and maybe help bridge the 23 to 24 to 25 commentary that Jayshree mentioned, sort of strong double-digit growth. I think in the past you've talked about 15% growth across cycles. And I'm just trying to think through, you know, is there enough in trials and pilots in 24 to kind of get you to that kind of mid-teens growth over the next couple of years? And if not, does that mean that your enterprise business has to remain incredibly robust in 24, you know, upwards of high teens to low 20% growth next year? I know you're not giving guidance, but trying to kind of walk the bridge to get from where we are today to 25, where you're going to start to see more widespread AI deployments from a revenue recognition perspective. Thanks.

speaker
Ita Brennan
Chief Financial Officer

And now you want us to go to 25 as well. I don't think we're ready to do that. That's a really good conversation for the analyst day, honestly. I think, you know, obviously we're very focused internally. As Jayshree reiterated earlier on, the business is a lot more robust with many different drivers. As you go through that period, cloud will ebb and flow, but it's still a healthy business. It has been a healthy business through those cycles. I think we've got a lot of the building blocks; how we're going to assemble them, maybe we'll save for the analyst day.

speaker
Jayshree Ullal
President and CEO

We'll share the Lego plan more. But David, rest assured that we are aiming for at least double digits next year. And so we'll go from there.

speaker
David Vogt
Analyst at UBS

Great. Thanks, guys. And congrats again. Thank you.

speaker
Operator
Operator

We'll take our next question from Erik Suppiger with JMP Securities.

speaker
Erik Suppiger
Analyst at JMP Securities

Yeah, thanks for taking the question. Maybe this is for Anshul. Can you just walk us through kind of how the Cloud Titans work? We hear a lot about them buying volumes of GPUs right now. At what point does their purchasing of GPUs translate into their demand for switches? How does it work with the trials and so on and so forth?

speaker
Anshul Sadana
Chief Operating Officer

Sure. You know, there's no uniform recipe. But in general, when they're buying GPUs, they also need the network to connect them. That could be a few quarters apart, depending on the timing of their deployments, before they build the network. These are very large builds. Then it takes them a couple of months, sometimes a quarter or more, to fine-tune the cluster and benchmark and test everything before it is actually released to production. So you can think of that as sort of the baseline: a couple of months to a couple of quarters minimum before you can get there. Sometimes it adds up to about a year before you really ramp into production.

speaker
Moderator
Moderator

Great.

speaker
Operator
Operator

Thanks, Eric. We'll take our next question from George Notter with Jefferies.

speaker
George Notter
Analyst at Jefferies

Hi, guys. Thanks a lot. I guess I wanted to ask about your comments about 2025 participation in AI. Can you walk us through sort of the milestones that you see between now and then in terms of increasing Arista's participation, certainly there's new product development, there's market acceptance, I presume. And then also, I assume that you participate today with inferencing applications, and that's by and large done on Ethernet. I think what we're really talking about is training, correct? So any more color there would be great. Thanks.

speaker
Jayshree Ullal
President and CEO

Yes, George. So I think you can look at 2023 as really a year of planning for AI, because, as I said, there are tons and tons of GPUs being purchased. And then the question is, how are they being connected? So depending on whether they're small, medium, or large, there are different technologies. But I'm going to stay focused on the large because that's the biggest problem. You are right to say some of them may be Ethernet or even a non-networking technology, just an I/O or a bus for smaller networks. But generally speaking, we're focusing on things that are much larger than, you know, 200 or even 1,000 GPUs. So that's the first thing. So a lot of planning is going into that. And the planning basically is: how do they get the GPUs? What is their application? What is the size of the cluster? What is the timing? What are the large language model data sets, et cetera? And what is their network foundation? In some cases, where they just need to go quick and fast, as I explained before, it would not be uncommon to just bundle their GPUs with an existing technology like InfiniBand. But where they're really rolling out into 2025, they're doing more trials and pilots with us to see what the performance is, to see what the drop is, to see how many they can connect, what's the latency, what's the better entropy, what's the efficiency, et cetera. That's where we are today. Now we expect next year this will translate to some, what I call, pilots, because the majority of them will happen in 25. But in 24, you'll start seeing, what do you say, Anshul, maybe, you know, 4,000 to 8,000 GPUs, something in that range. Okay, 4,000 to 8,000 GPUs in 400-gig type clusters, but we'll actually put some production workloads on them. So I call them smaller pilots, but the real test of why you buy these expensive GPUs is in 2025, when you want to go to not just four to eight thousand, but thirty thousand, fifty thousand, maybe even a hundred thousand GPUs. This is why 2025 is so critical, and testing and taking all the kinks out of the GPUs and networks is important, because a good network is so pivotal to getting the most out of your GPUs. If you have idling cycles on those GPUs, you've wasted thousands, if not millions, of dollars. And so I think these next two years are crucial to getting the most out of these expensive GPUs, and that's where the network really comes in. Anshul?

speaker
Anshul Sadana
Chief Operating Officer

If I can add one more thing here: what are the milestones to get to these 2025 large-scale GPU deployments? There is one key milestone that has nothing to do with GPUs or our switches, which is: does the customer have enough power and the site ready to deploy that many megawatts or gigawatts of capacity? And as you know, getting a 1,500 megawatt site takes a couple of years, which is why this is a slow ramp. This is not a case of suddenly turning the key and having thousands of GPUs.

speaker
Jayshree Ullal
President and CEO

Yeah, really good point.

speaker
Operator
Operator

Simple things like power and space are still vital. Thanks, George. We'll take our next question from Carl Ackerman with BNP Paribas.

speaker
Carl Ackerman
Analyst at BNP Paribas

Yes, thank you. There's been some investor concern that hyperscale customers may focus more on white box solutions for 800 gig than in the 400 gig cycle. We're aware that some of your customers continue to adopt a dual sourcing strategy, but if you could just comment on the potential for an upgrade cycle as well as reuse risk on the transition to 800 gig, it would be very helpful. Thank you.

speaker
Jayshree Ullal
President and CEO

Sure. As you're probably well aware, the white box question has remained with Arista as one of the most popular questions asked right from the time of our IPO, whether it's 10 gig, 40 gig, 100 gig, 400 gig, or now you ask it at 800 gig. I think there will always be an element of white box if somebody is just looking to build something and throw in some quick traffic. But for some of these most mission-critical networks, it's less about the box and more about the software stack and how much performance, availability, and power you really get out of it. So the cost of putting in the box, if you even save something, is far dwarfed by the total OPEX you need to make that box work. So we continue to believe that we will coexist with white box in some of our Cloud Titan customers. We will continue to run both SONiC and FBOSS, in the case of Microsoft and Meta, along with our EOS. But at the end of the day, whether it's a white box or a blue box, it's the software stack that really wins.

speaker
Liz Stein
Director of Investor Relations

Thanks, Carl. Operator, we have time for one last question.

speaker
Operator
Operator

Thank you. We'll take our last question from Ben Reitzes with Melius Research.

speaker
Ben Reitzes
Analyst at Melius Research

Hey, thanks a lot for sneaking me in there. Congratulations, Jayshree and team. I wanted to ask about enterprise again. I think the comments you made around cloud titans were all things that people were able to detect, but the enterprise just seems so much better in terms of the performance and the guide. So you mentioned that you gained share, but did the market pick up as well? And do you see that market pick up in demand and the enterprise sustaining into 24? It's just kind of more color around enterprise and whether, you know, the market picked up in addition to you gaining share.

speaker
Jayshree Ullal
President and CEO

Hey, Ben, thank you. What do you mean by the market pick up? I don't follow the question.

speaker
Ben Reitzes
Analyst at Melius Research

Did demand pick up? Because the enterprise outperformance was quite a surprise, and clearly the Cloud Titan commentary was, you know, subdued, as everybody was able to predict after the last conference calls this week. I mean, was it all market share, or is the market picking up? Is demand picking up across the board?

speaker
Jayshree Ullal
President and CEO

I would say to you that our enterprise demand has always been strong and not subdued, far from that. However, it was dwarfed by the excellence of our cloud performance, so you didn't notice it, and now you're noticing it.

speaker
Liz Stein
Director of Investor Relations

Thanks, Ben. This concludes the Arista Networks second quarter 2023 earnings call. We have posted a presentation which provides additional information on our results, which you can access in the investor section of our website. Thank you for joining us today, and thank you for your interest in Arista.

speaker
Operator
Operator

Thank you for joining, ladies and gentlemen. This concludes today's call. You may now disconnect.

Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
