3/2/2023

speaker
Kobi Marenko

Thank you everyone for joining us today. Welcome to Arbe's fourth quarter and full year 2022 financial results webcast. My name is Kobi Marenko and I'm the co-founder and CEO of Arbe. I'm very excited to share the developments made by Arbe in 2022, especially in Q4, where we progressed from the proof-of-concept phase to the production and commercial deployment stage. After my presentation, Karin Pinto-Flumenboim, Arbe's CFO, will share a review of our financials and outlook. Next, Ram Machnes, our Chief Business Officer, will discuss the market forecast and the business opportunities that we are pursuing. Finally, Noam Arkind, our CTO and my co-founder, will conclude the presentations and speak about Arbe's latest innovations and our vision for the future. We value your input and questions, so we will reserve time for a question-and-answer session. Please take a minute to review the safe harbor statement.

Throughout the year, Arbe was dedicated to maturing a cutting-edge perception technology that significantly improves vehicle safety and accelerates the realization of Level 2+ advanced driver-assistance vehicles. Our company's progress has paralleled the vehicle market's advancement to Level 2+, which we anticipate will become the standard in the automotive industry by 2025. In 2022, we actively collaborated with leading automakers' perception teams to ensure that Arbe's innovations are designed into their groundbreaking next-generation platforms. We received clear indications that our perception radar technology will be a crucial enabler and integral component of Level 2+. As the mass market increasingly adopts autonomous systems at Level 2+, Arbe will become a vital element of the overall solution with a cost-effective sensor. We have been securing global Tier 1 partnerships and commitments to accelerate the transition to advanced vehicle safety systems in the near future. As we look ahead, we remain on track to achieve full production by the fourth quarter of 2023, thanks to the hard work and dedication of our team and our business partners.

In 2022, we collaborated with multiple global Tier 1 suppliers who will position Arbe for success in 2023 and beyond. Veoneer, a world leader in automotive safety, selected Arbe's chipset for their next-generation radars. Veoneer is an attractive partner for Arbe, as it currently produces more than 50 million radars per year and forecasts growing to roughly 250 million per year by the end of the decade. In addition, Arbe received its first mass production preliminary order, for 340,000 chipsets, from HiRain Technologies, a leading Chinese ADAS Tier 1 supplier. In Q3, we announced that HiRain was selected to provide perception radars based on the Arbe chipset for an autonomous truck project in the ports of China. These announcements validate Arbe's leadership, as trucks require the highest standards of safety and have the biggest need for advanced sensing. The HiRain business partnership will allow us to quickly scale revenue, deploying Arbe products in China, the largest and fastest-growing EV market in the world. Part of our success today is a direct result of the strong relationships we've built with industry leaders such as Valeo, Veoneer, HiRain, and Weifu, suppliers to leading global automakers.
This quarter marks a significant milestone for our Tier 1 business partners, as these companies have made considerable financial investments and have deployed dedicated and sizable teams to develop radar systems which utilize our cutting-edge radar chipset. The Tier 1s are also developing software utilizing the proprietary data generated by Arbe radars to enhance safety for OEMs. We are confident that these advancements will make a significant impact, not only for Arbe, but for the entire industry. We want to thank our Tier 1 partners for choosing and trusting Arbe and for working hard to secure customer wins. We look forward to updating you on the specific wins.

In addition to our Tier 1 partners, we are very pleased with our collaborations with passenger and commercial vehicle makers and robotaxi companies. In fact, today we can report that we are actively engaged with 12 of the top 15 automakers worldwide, conducting field trials with them and participating in RFI and RFQ bids. Our goal for 2023 is to achieve two design-ins in the rapidly expanding Chinese market and two design-ins with OEMs from Europe and the United States. These design-in wins will lead to multi-year revenue contracts. In the following presentation, Ram Machnes, our Chief Business Officer, will provide details about our forecast for 2023.

As some of you may know, the automotive industry in Japan is known for its high standards and regulations, subjecting all automotive technologies to a strict certification process before allowing commercial evaluation in the region. I am pleased to announce that Arbe has obtained the Japanese telecommunication and radio certification for our mass production RF chipset. With the certification process finalized, a leading Japanese automotive company has begun a development project based on our chips. We are also conducting pilot programs with leading OEMs and Tier 1s in the Japanese market, one of the top automotive markets in the world. We believe this very important certification will open a new and large opportunity for our company.

As mentioned, our focus in 2022 and 2023 is on the transition from proof of concept to mass production. We have completed our dedicated chipset production line with GlobalFoundries in Vermont, USA. On this mass production line, our chips are now going through the AEC-Q100 qualification process, the required automotive-grade qualification representing the highest safety standard; for example, it tests the chipset's durability from minus 40 to 125 degrees Celsius. This is an important step, keeping us on track with our plan to begin mass production in the fourth quarter of 2023.

At the end of 2022, we launched our groundbreaking 360-degree radar-based perception solution, which provides a comprehensive analysis of a vehicle's surroundings over a long range. We take pride in the fact that we are the first company to offer an integrated 360-degree radar-based perception solution. Our suite of perception radars captures surround data, utilizing advanced AI technology to identify, classify, and track objects within the entire field of view. The data is processed in real time to create a full free-space map around the vehicle and provide an analysis of the evolving hazards detected by the radars. We are excited to offer this innovative and cost-effective solution to the market, and we believe it will set a new industry standard for vehicle perception systems.
Arbe experienced great success at the CES trade show in Las Vegas, one of the industry's most important events. We had the pleasure of meeting with most of the automakers we are actively engaged with, who reassured us they have secured the necessary budgets to work on products based on our cutting-edge chipset throughout 2023. These projects are focused on driving significant advancements in vehicle safety, and we are proud to partner with our customers to bring these innovations to the market. In addition, CES was a fantastic opportunity to meet with potential new customers. We were pleased to learn that Arbe's leading innovations and value proposition resonate with the needs of these customers. We are also proud to announce that Arbe has once again been awarded the prestigious CES Innovation Award, this time for the 360-degree radar-based perception solution. We are thrilled to be recognized for our pioneering work in this field that will advance automotive safety in the near future.

Looking forward to 2023, our goal is clear. We expect to enter full production with our cutting-edge chips in the fourth quarter. In fact, we have already received orders that meet our 2023 production capacity. In addition to our production goals, we continue to focus on ambitious sales and business development objectives of OEM design-ins for the year ahead. Our disruptive solutions have quickly gained recognition and endorsements from industry leaders, and we believe that our strong offering will enable us to gain market share and drive innovation forward. At this point, I would like to hand over to our CFO, Karin, who will provide a detailed overview of our financial performance. Thank you, Karin.

speaker
Karin Pinto-Flumenboim

Thank you, Kobi, and hello, everyone. Let me review our financial results for the fourth quarter and the full year of 2022 in more detail. As Kobi said earlier, Arbe is a company in transition from development to production, and our financial results in Q4 reflect this. As we progress with our strategy, we are shifting our focus to production-intent chips. As a result, we decreased engineering sample sales during Q4. This transition will streamline our operations and provide cost savings as we adjust our processes and ramp up production accordingly. We believe that these decisions will enable us to better serve our customers and drive innovation forward.

Total revenue for the fourth quarter was $0.15 million, compared to $0.5 million in the fourth quarter of 2021. For the full year of 2022, total revenue was $3.5 million, within our guidance and an increase of 56% compared to $2.2 million in 2021. Q4 had a negative gross margin of 45.8%, compared to a positive gross margin of 37.7% in Q4 2021. This negative margin is another consequence of our reduced quarterly revenue as we transition to mass production. Gross margin for the full year of 2022 increased to 63.5%, compared to 36% in 2021. The 2022 gross margin improvement was driven mainly by economies of scale, revenue mix, and lower cost per unit as we progressed toward production.

Moving on to expenses. In Q4 2022, we reported total operating expenses of $14 million, compared to $14.2 million in Q4 2021. A decrease in our pre-production-related costs and a favorable impact of foreign currency exchange rates were offset by an increase in labor costs and non-cash share-based compensation expenses. Operating expenses for the full year totaled $50 million, compared to $34.1 million in 2021. The increase was in line with our expectations, as we continue to grow the company and add to our employee base to support our future growth. The company continues strengthening its research and development investment, with R&D expenses totaling $10.8 million in Q4 2022, compared to $11.6 million in Q4 2021, the decrease related to lower pre-production costs. R&D expenses for the full year were $36.7 million, compared to $28.6 million in 2021. Operating loss for the full year of 2022 was $47.7 million, compared to a loss of $33.3 million in 2021. The operating loss reflected our growing investment, mainly in research and development, in our people, and in costs associated with being a publicly traded corporation, all toward our progress to production.

Adjusted EBITDA in Q4 2022, a non-GAAP measurement which excludes expenses for non-cash share-based compensation and for non-recurring items, was a loss of $11.5 million, also within our guidance, compared to a loss of $11.9 million in the fourth quarter of 2021. Adjusted EBITDA for the full year 2022 amounted to a loss of $38 million, compared to a loss of $30.4 million in 2021. Net loss in the fourth quarter of 2022 decreased to $11.1 million, including $3 million of financial income, compared to a net loss of $15.8 million in the fourth quarter of 2021. Net loss for the full year of 2022 was $40.5 million, compared to $58.1 million in 2021.

Moving to our balance sheet: as of December 31st, 2022, Arbe had $54.2 million in cash and cash equivalents, with no debt. With respect to our guidance for 2023, we would like to provide an outlook for the full year ending December 31st, 2023. Our goal for 2023 is to achieve four design-ins with OEMs.
Revenues are expected to be in the range of $5 million to $7 million, reflecting our expectation of full production in Q4 2023, together with our decision to focus exclusively on production-intent chips. Adjusted EBITDA is expected to be in the range of a $32 million to $35 million loss, primarily reflecting year-over-year revenue growth, as well as a decrease in initial production costs and cost-efficiency efforts. I'm now pleased to hand over the floor to our Chief Business Officer, Ram Machnes, who will share insights into our market forecast and the business opportunity that lies ahead.
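For context on the non-GAAP measure used above: adjusted EBITDA, as described earlier in the call, adds back non-cash share-based compensation and non-recurring items (plus depreciation and amortization, per the standard EBITDA definition) to the operating result. A minimal sketch follows; all numbers are invented for illustration and are not Arbe's reported figures.

```python
# Adjusted-EBITDA bridge of the kind described in the call. The add-backs
# for share-based compensation and non-recurring items come from the call;
# depreciation/amortization is the standard EBITDA add-back. All numbers
# below are hypothetical, NOT Arbe's reported figures.

def adjusted_ebitda(operating_result_m: float,
                    depreciation_amortization_m: float,
                    share_based_comp_m: float,
                    non_recurring_m: float) -> float:
    """All inputs in $ millions; losses are negative."""
    return (operating_result_m + depreciation_amortization_m
            + share_based_comp_m + non_recurring_m)

# Hypothetical quarter: a $12.0M operating loss with $0.3M D&A,
# $2.0M share-based compensation, and $0.5M non-recurring costs.
print(adjusted_ebitda(-12.0, 0.3, 2.0, 0.5))  # -9.2
```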

speaker
Ram Machnes

Thank you, Karin. I am Ram Machnes, the Chief Business Officer at Arbe. We would like to provide you some outlook for 2023 and summarize 2022. We'll talk a bit about the market trends that we've seen in 2022, about our business model, about what's next, where we see the main trends, and where we see our product winning in the market.

When we look at the market and the trends for 2022, we see a lot of changes in the autonomous driving arena. We saw a lot of buzz around autonomous driving, but the whole-solution approach of providing full autonomous driving is a trend that is diminishing, and instead we are seeing features and scenarios being developed. So instead of providing a full solution that works in any scenario, in any road conditions, in any weather conditions, fully unsupervised, we are seeing a trend of developing smaller features: capabilities built for specific scenarios, that work in specific conditions, are sometimes supervised, and sometimes just alert the driver to something that is happening around the vehicle. If, for example, in the past we wanted the car to drive everywhere, all the time, now we're talking about scenarios with very clear entry and exit criteria: for example, during my drive on a highway or in a traffic jam, in specific weather conditions, in daylight, at night, or in fog; and the feature can be partially supervised by the driver, or, for example, just alert the driver about something dangerous that is about to happen. So the focus of the industry moved from solving the entire problem to solving pieces of the puzzle one at a time, and providing the means for the algorithms and the developers to solve those problems. We are seeing many of the OEMs, the car manufacturers, looking at providing the bits and pieces to build these solutions. So it's a very gradual solution.

We also saw a strong software-defined vehicle trend. That means software that evolves during the ownership of the car: when you buy it, it doesn't necessarily do all the features, but gradually you can upgrade. The car manufacturers want to make sure that the hardware they are putting in the car is capable of doing the advanced features later on. So they are actually making sure that the hardware itself is an enabler for future technology, and not an obstacle that prevents the car manufacturer from providing the full features later on. Some of the features are not ready yet, but the car manufacturer wants to make sure that the car is ready to include those features in the future. So we are seeing selections by OEMs of hardware platforms, whether the compute platform or the sensing platform itself, that are future-ready for the software and algorithms that are coming down the road.

What we also see across all the OEMs is the understanding that imaging radar, the capability to have an independent source of information for the autonomous features and the autonomous driving capabilities, is a fundamental element that the car manufacturers are looking to add to their vehicles. We see it not only coming from us; we also see other players in the market, like Mobileye, saying that imaging radar is a crucial element in being able to provide those features.
And we saw even some other players that in the past said the radar's role is not that significant now placing a much bigger emphasis on imaging radar, saying that imaging radar is necessary in order to provide those advanced features coming down the road. If we look at the market from Arbe's perspective, we are seeing headway from the different OEMs in 2023. This year they are selecting a lot of their hardware platforms and sensing platforms, making sure these are ready for those features. They are looking to make sure that, from a regulatory perspective, they are ready to meet NCAP and all the other regulatory requirements. And we are today in contact with 12 out of the top 15 global OEM organizations, looking at integration of Arbe-based radars into their next platforms. We also announced our penetration of the Japanese market: we passed the Japanese regulation and certification for a radar based on our chipset, so we can start working in that market as well.

When we look at the radars and what Arbe is bringing to the table, we are talking about perception radar, which is very different from the regular radars used until now. If we look at the low-end, traditional radars that are out there today, take the example of a bus and a pedestrian walking just next to the bus. With a traditional radar, you won't be able to see two objects; you will just see that something is going on there. You might be able to see the velocity of these objects, but you won't be able to really separate them, and you will be able to detect them only when they are moving. If they are stationary, you won't be able to detect them. With the basic imaging radars that are coming now to the OEMs and the car manufacturers, you might be able to see that there is a bigger object, and maybe sometimes identify stationary objects, but not really separate them into two distinct objects, even if they are stationary. And that's what the perception radar really brings to the table. It brings an image of the outside world to the perception algorithms, which need an independent source of information. They have the camera today, but the camera by itself is not enough, because it's not bulletproof: it's not ready for all weather, it's not ready at night, and sometimes it doesn't bring the right analysis of the scene around it. So you need another independent source of information, and only perception radars, which are high-end imaging radars, can provide the details that the perception algorithms need.

If we look at the current generation of radars, many of them are good for adaptive cruise control, maybe for some emergency braking, because they can see if there is something in front of you. But they won't be able to understand exactly its size, they won't be able to tell if it's stationary, and they won't be able to identify it if it is stationary. The perception radar, by contrast, can clearly show the boundaries of these objects and clearly show the free space: whether I can squeeze in between two different vehicles, whether the pedestrian is walking next to a truck, so I'll be able to see the truck and, separately, the pedestrian.
If there is a flat tire on the autobahn and someone is changing the tire, you will be able to see it with a perception radar, and you won't be able to see it with the basic imaging radars or the low-end, traditional radars that are out there today. So this is a fundamental revolution in the sensing capabilities that the perception teams are getting with the imaging radar. And as we and other players are now saying, you need a few thousand channels to really process this information and get a detailed image, in order to really leverage it.

The automotive market works with OEMs, the familiar players with the big car-manufacturer brands, and with Tier 1s, which are the ones providing the hardware to these OEMs. So for us at Arbe, as a chipset manufacturer, our main customers in the automotive market are the Tier 1s. And we have four Tier 1s that publicly announced they are basing their next-generation radars on our chipset: Veoneer, a global leader in radar; Valeo, a leader in ADAS; and Weifu and HiRain, Chinese leaders in their respective markets. We also have a Tier 1 addressing the non-retail automotive market, taking the technology to commercial vehicles like trucks, delivery robots, agriculture, and security use cases beyond automotive. Overall, from our perspective, the main customers are these big Tier 1s, and we were able to announce these four Tier 1s that have already based their next generation on our chipset. We are seeing these Tier 1s putting tens of engineers, sometimes even 100 engineers, to work developing their next-generation radar on our chipset, and we see a lot of momentum and a lot of mutual work between us and these Tier 1s. All five of these Tier 1s already have an A-sample radar based on our chipset. Two Tier 1s already have a B-sample, two others will have a B-sample really soon, and over the rest of the year all of them will have a B-sample.

We also publicly announced AutoX and BAIC using our chipset as OEMs. Unfortunately, I cannot talk right now about other OEMs or their selections, but we do have active projects with four OEMs, and overall there are right now 10 active RFIs and RFQs with different global OEMs through the Tier 1s that are using our chipset. Going into 2023, we are expecting two more significant OEMs to sign a deal for production with our chipset, and we are expecting two Chinese OEMs to sign deals for production with our chipset. In the non-retail automotive market, we're expecting five more deals coming down the path. We will be in automotive-grade full production in Q4 of this year, 2023. Until then, we're expecting the revenue to come mainly from evaluation systems, from samples, and from projects that we are doing together with the OEMs and with the different Tier 1s. Overall, we are expecting revenue of between $5 million and $7 million in 2023.

I would now like to introduce our Chief Technology Officer, Noam Arkind, who will provide insights into the groundbreaking innovations that Arbe has developed and what lies ahead for our company.

speaker
Noam Arkind

Thank you, Ram. Hello, everybody. I'm Noam, co-founder and CTO at Arbe, and today I'm going to show you our technology. A radar, in general, is the perfect sensor for automotive technology. We all know that. It has great features: it's active day and night, it's not affected by weather conditions, it's very sensitive, so you can see targets very far away, it has a high refresh rate, and it directly measures four dimensions: azimuth, elevation, range, and Doppler. And it's extremely reliable and affordable. This is why this technology was introduced a long time ago into the automotive industry, and it's already very mature, with over 20 years in the industry.

What we do at Arbe is take this technology and bring it to the edge in terms of performance. The technology that Arbe provides is the best radar that is possible within the limits of the automotive industry. We achieved this using a massive MIMO concept, which means we have a lot of channels inside the radar that we can use and process. Our main radar, which we aim at the front of the car, and in some applications the back, uses 48 transmit channels and 48 receive channels. We also have a scaled-down version, aimed at the sides of the car, with 24 transmit channels and 12 receive channels. Now, in radar, the number of channels means performance; it's really equivalent. You can think of it as the equivalent of camera pixels: imagine that you get a camera with 2,000 pixels and compare it to the radars that exist on the market with 10 or 100 pixels. We get 10 times better performance than the state of the art available today in the market, and this is because of the breakthrough we were able to achieve.

This was a very hard goal to get to, and the way we got there is that we had to innovate in every aspect of the radar system design and design the core components of the radar system ourselves. This is why we designed our own radar chipset, which includes a receiver chip, a transmitter chip, and a digital chip for the processing. In this digital chip we have what we call a radar processing unit, a dedicated IP embedded in the silicon, which really puts all of our best ideas and innovation into a cost-efficient and high-performing processing core. We also had to innovate on the way the radar works from the physical point of view, so we had to invent a new type of modulation. All of this is part of our core IP, which is the basis for how we achieve this high performance.

This sensor is critical for a wide range of driving scenarios, starting from the most basic application of obstacle avoidance. The radar is very good at mapping the environment and understanding all the cars around your own car, as well as the obstacles, the stationary objects on the road: maybe something fell off the truck in front of you, and things like that. The radar is very sensitive and very good at detecting these, mapping them, and allowing the car to avoid them. But this is not the case for all types of radars. If you have a low-resolution radar, it's very hard to really understand the environment and create this map. So this is an innovation enabled only because we have this technology, with this high resolution and high channel count.
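To make the channel arithmetic concrete, here is a quick sketch using the figures quoted in the call. It relies on the standard MIMO assumption that each transmit/receive pair forms one virtual channel, so the virtual-array size is the product of the two counts; that rule is textbook MIMO practice, not an Arbe-specific disclosure.

```python
# Virtual-array sizes implied by the channel counts quoted in the call.
# Standard MIMO assumption: each TX/RX pair forms one virtual channel,
# so the count is n_tx * n_rx.

def virtual_channels(n_tx: int, n_rx: int) -> int:
    return n_tx * n_rx

front = virtual_channels(48, 48)  # 2304 -- matches the "2,300 channels"
                                  # figure quoted later in the call
side = virtual_channels(24, 12)   # 288
basic = virtual_channels(12, 16)  # 192 -- the 12x16 radars Ram mentions

print(front, side, basic)         # 2304 288 192
print(front // basic)             # 12 -- roughly the "10x" factor cited
```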
Similarly, when getting into an intersection, you have to see long range and high resolution in order to understand if there's a car coming in your lane or the lane you're trying to get into. It's a very hard task for radar, and this is something that we are enabling with our technology. But we're also looking at edge cases. One of the most problematic edge cases for radar is the truck stuck under a bridge, because the bridge and the truck get mixed together. For a normal radar, it's very hard to understand whether the road is clear and you can drive there, or whether the road is blocked. Because of our technology, this use case is also solved, and we can provide a map with higher accuracy in the elevation dimension as well. Another very important use case, which is very hard, and we see cars today fail at it all the time, is VRU detection. VRU stands for vulnerable road users; specifically, we're talking about pedestrians and motorbikes, which are harder to detect. Their radar signature is low, and they move at very slow speed, so we can't really use motion as a detection principle. You have to get this high resolution, and this is something, again, that our technology solves, allowing the car to really have another sensor, like the camera, that can see those road users.

Here is a demo of how the technology really works. You can see that we are able to map the entire environment, with 360-degree coverage around the car, and in real time we are able to create this high-resolution map to allow the car to navigate the road and the world. This application is called simultaneous localization and mapping, and it is considered one of the most difficult applications to do on a radar-only stack. We can now prove that this is possible with our technology, and this can be the basis for really planning the trajectory of the car, for autonomous driving but also for simpler ADAS functions like emergency braking and lane changing. Once you can get this high-resolution map with depth perception, and of course velocity perception, Doppler perception, then you can really make conscious and smart decisions while you drive.

At Arbe, we have created a very strong patent portfolio that protects us and really covers the core technology, and we're using the time we have now, from the delays in the market, to further extend it, to improve our technology and our competitive advantage, and to look to the future. We have nine granted patents around our core technology, which is around the modulation and the processing. How exactly we do the processing of the radar is a very new approach; it's not straightforward, not something you take out of the radar book. We had to innovate on how you design the system, how you design the antenna array, and how you design the package for the RF chips, because in order to get the cost down, we had to integrate many channels together in a small silicon area and a small package area. The way to do that is not trivial, it's quite hard, and we have some core IP in that regard as well. Another problem which is very fundamental to the radar industry is mutual interference. Radars always operate in the same frequency band, and they can interfere with each other. We have invested a lot of effort and a lot of time in investigating how these interferences behave, how they look, and how they affect the system.
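As a toy illustration of why mutual interference matters: in a generic FMCW receiver, the radar's own echo de-ramps to a narrow beat tone, while a second radar sweeping at a different chirp slope smears energy across many range bins, raising the noise floor. Arbe's actual waveforms and mitigation are proprietary; every parameter below is an invented assumption for illustration only.

```python
import numpy as np

# Toy mutual-interference model: our own echo appears as a clean beat
# tone, while a second radar's chirp (different slope) sweeps through
# the receiver and spreads energy across the spectrum. All parameters
# are illustrative assumptions, not Arbe design values.

fs, n = 20e6, 2048                           # sample rate (Hz), samples
t = np.arange(n) / fs

target_beat = np.cos(2 * np.pi * 2e6 * t)    # own echo: 2 MHz beat tone
k = 7.8e10                                   # interferer slope mismatch, Hz/s
interferer = 0.5 * np.cos(np.pi * k * t**2)  # sweeps ~0-8 MHz over the frame

spectrum = np.abs(np.fft.rfft(target_beat + interferer))
peak_bin = int(np.argmax(spectrum))
print(f"target peak still at {peak_bin * fs / n / 1e6:.1f} MHz")  # ~2.0 MHz
print(f"interference floor (median bin): {np.median(spectrum):.0f}"
      f" vs target peak: {spectrum.max():.0f}")
```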
On that interference problem, too, we generated very strong IP, because interference is going to be one of the limiting factors of the technology, and we think we have a very good solution for how to cope with it.

Now, looking forward at the milestones ahead of us in terms of technology and innovation is very exciting for me. Already today we have the richest point cloud, and I'm talking about the physical representation of the world: we are using the basic physical principles of the radar to generate the image. But we know that this is not all we can do. We can take this information, and we know there are more secrets inside it, and we can really create more insights into the scene and the map around us from it. The data we have today is rich, and it's a great basis for the next stage, which is creating a super-resolution point cloud from the data we have today. This is something we're working on; we already have a prototype of it, and it will go into our production solution this year, probably around the middle of the year.

What we're really trying to get to next is a very accurate representation of the world. If you compare it to other sensors, we know that autonomous driving stacks today rely a lot on LiDAR, which is a very expensive sensor and has its own problems. With these techniques, with this approach that we are now introducing, we are trying to get something similar to a LiDAR. Of course, the radar has its own advantages, which I mentioned before, and it's also almost ten times cheaper than LiDAR solutions. So we believe that if we can get to a resolution and a detailed image close to what you can get from a LiDAR, then we can offer this high-resolution radar as an alternative and reduce the cost of the autonomous driving stack for future systems. This is the key to getting this autonomous driving technology into mass adoption in the market, so we see this as a very critical stage.

We're also innovating on the perception stack that runs on the radar. We know that AI revolutionized how people do digital image processing, and we believe the same thing will happen in radar. Today, most radar processing stacks are based on a model-based approach, but there is much more you can get from the data if you employ AI techniques on the radar data. We also understand that these techniques, which were developed for different applications, will have to be adapted. We're working on it, and we have many internal projects trying to understand the best way to run AI techniques and AI infrastructures on radar data. We see ourselves as the future leaders in AI for radar. With that approach, we want to get the best out of the radar: the best radar perception, the best radar-only stack, I would say.

But beyond taking the radar to the edge of what's possible, we want to see what it does for the overall system. So we are also working on a fusion application. The full sensor stack of the car includes a radar and cameras, that's for sure, and we want to see how much the radar really adds to the camera. We know that in poor weather and poor lighting conditions, it's going to be critical.
So we are now working on this application of fusing our radar data into a camera stack, and showing that if you use the radar plus the camera, you solve the problem: you basically solve the perception problem and can make highly reliable decisions in all scenarios.

Finally, to really make all of this a real product, we have to embed it into our next-generation chipset. That is always a goal for us, and in this next generation we will further improve the physical resolution, the native resolution, and we're going to take an AI-on-the-edge approach to radar processing. We're going to add all the necessary IPs to run AI algorithms on the radar point cloud, on the radar data, already on our processor, so people can run the algorithms there and not have to put everything in a central computer, which can increase the cost of the central computer and be a non-scalable approach. By putting these AI cores on the edge, we create the system flexibility and the scalability to improve performance in the future.

We also want to enhance our software-defined radar capabilities. Today we can update software and change the way the radar operates on the fly, and this is an ability we see as very important in the market for making the sensor future-proof. Imagine that a car manufacturer took a radar which is 10 times better than any other radar on the market and put it in the car, but still needs experience in order to use and leverage this information: it can make the car future-proof by updating software and exploiting the radar information better. With this approach, we are also aligning ourselves with the market and better supporting our Tier 2 model by improving our software-defined radar capabilities.

The most important thing, of course, is reducing cost, because we know the automotive industry is very cost-sensitive, and in the next generation we are looking for ways to further reduce cost, achieve better integration, put more channels on the same silicon, and reduce cost and power even further. One of the ways we're going to do that is by introducing a high-channel-count transceiver. Today we have separate transmitter and receiver chips, which was a good technical decision at the time. In the next generation, we're already looking at how to put them together and provide a high-channel-count transceiver, and this will also help improve performance and reduce cost in the system. So this is how we see the road ahead of us, and we're very excited about the future. Thank you, everybody, for listening. And now, Kobi, back to you.

speaker
Kobi Marenko

Now we will be happy to take your questions. The first question will come from analyst Gary Mobley of Wells Fargo. Hi, Gary.

speaker
Gary Mobley

Hey, guys. Thanks for taking my question. I wanted to pick up where we just left off in your prepared remarks and talk about your future architecture and the competitive environment. Do you see a situation where automotive OEMs, Tesla for example, may have the capability to develop high-definition radar solutions internally? And do you see a path toward some of what your competition is doing, that is, centralizing, fusing camera imaging sensing and radar sensing in a centralized processor or domain controller? Or do you still believe in edge processing for radar specifically?

speaker
Kobi Marenko

Thank you for this question. We believe that the amount of data that is generated from a 4D high-resolution imaging radar cannot be processed anywhere but on the edge. Just taking this data, the theoretical data of one tera, and moving it to the central compute is unrealistic. The same goes for OEMs developing their own chipset: this doesn't make sense, and there's no economy of scale in that. We have our own processor, and we believe that our core IP is our processor. In our next generation of the processor, we will give the OEMs, and of course the Tier 1s, much better abilities to develop their own stack, their own kind of radar, on top of it. But we don't see the OEMs developing chipsets; we haven't heard even one OEM say they want to develop their own chipset. They might want to go directly to the Tier 2 to make sure the supply will be there, but not for development.
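The terabit-scale figure cited here is easy to sanity-check with rough arithmetic. In the sketch below, the channel count comes from the call, while the sample rate and sample width are assumptions for illustration only, not published Arbe specifications.

```python
# Rough order-of-magnitude check of the terabit-scale figure cited for a
# 4D imaging radar. The virtual-channel count comes from the call; the
# sample rate and sample width are illustrative assumptions only.

n_virtual = 48 * 48        # virtual channels on the front radar (per the call)
sample_rate = 50e6         # complex samples/s per virtual channel (assumed)
bits_per_sample = 2 * 16   # 16-bit I and Q components (assumed)

raw_bps = n_virtual * sample_rate * bits_per_sample
print(f"internal data rate: ~{raw_bps / 1e12:.1f} Tb/s")  # ~3.7 Tb/s

# Even at a fraction of this rate, shipping raw samples to a central
# computer is impractical, which is the argument for edge processing.
```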

speaker
Gary Mobley

Thank you, Kobi.

speaker
Karin Pinto-Flumenboim

We'll take questions from Josh from Cowen.

speaker
Josh

Hey, guys. Good afternoon. Thanks for taking my question, and I appreciate all the technical detail. But I did want to ask more about the financials. Compared to the original expectations, 2023 is coming in materially lighter than your original forecast. I think 2023 was originally supposed to be driven by ramps at Chinese customers, but also robotaxis. Can you speak to what's driving the softer 2023 outlook versus your original expectations? And then, perhaps more importantly, what are the implications for the 2025-2026 timeline? How does that compare to what you were originally communicating when you first went public? Thank you.

speaker
Kobi Marenko

So basically, as we see it, we are shifted around three quarters from our original plan of two years ago. This shift was caused mainly by the supply chain problems, which slowed down our production timeline on the one hand and, on the other hand, also slowed down the decisions on the OEM side. Also, robotaxis and full autonomous driving are slower than expected, and the forecasts we got from our customers two quarters ago, or even one quarter ago, are basically slowing down. But I think the good news is that the supply chain problem is behind us and behind our customers, and China is trying to close the gap rapidly. We believe that in 2024 we're going to be more or less where we wanted to be and to narrow this gap, and of course in 2025 and 2026 as well. We already have a preliminary order for the end of 2023 and early 2024, and this is just the first order, worth around $30 million. We believe that in the next two quarters we will have a few more orders that will build our 2024 forecast and expected revenues.

speaker
Karin Pinto-Flumenboim

To add to Kobi: Josh, it also goes together with the guidance language we wrote. We wanted to focus the coming quarters on serial production and aim our revenue at that pace, which will actually give us more potential recurring revenue in the future, grow our customer base more solidly, and give us customers that, as we see it now, endorse our technology.

speaker
Josh

Thanks, guys. I'll hop back in the queue in the interest of time.

speaker
Matthew

Hey, Matthew here. Can we touch on where the OEMs are positioning perception radar in their stack? In the presentation, you talked about OEMs thinking about evolving the software of the stack and getting components in that will become more functional over time. So is radar going in today with the idea that it's going to be front and center, a primary sensor for vehicles in 2025, or are they just expecting it to be there? And the second part of the question: is radar-and-camera fusion expected to displace LiDAR in some prior plans as we look out two or three years? Again, how are the OEMs positioning your radar today?

speaker
Ram Machnes

Yeah, I think this is a great question. What we see from the OEMs is a very strong desire to make their hardware ready for the features coming with autonomous driving, Level 2 and Level 2+, going toward Level 3, and to make sure that their hardware is capable of processing and sensing the environment. For that, they do need the imaging radar as an independent sensor alongside the camera. You have the camera, you will always have the camera, but in order to have safe features, for example driving on the highway or in a traffic jam, you need another independent source of imaging, and for that you must have a high-quality, high-definition imaging radar like the one we provide the chipset for.

speaker
Matthew

Okay, if I could just ask a quick follow-up. You mentioned 12 of the top 15 automakers you've been engaged with. Are the others engaged with advanced radar competitors, or are they radar skeptics at this point? Why isn't it 15 of the top 15 today?

speaker
Ram Machnes

Yeah, so the current radars are doing 4x3 or 6x8, and you're right that some of the basic imaging radars are going toward 12x16, but this doesn't give enough information for the perception stack to really understand the image around us. Only when you go to 48 by 48, or a much higher channel count, like the 2,300 channels and few-thousand-channel counts that some of our competitors have also announced, do you really get the information that the perception team needs in order to drive safely with those features.

speaker
Kobi Marenko

But to answer the question: we believe the other three are working with what we call low-end imaging radar.

speaker
Matthew

Thank you.

speaker
Jamie

Jamie? Hey, good day, everybody. Thanks for taking my question. I have more of a technical question. In your presentation, you mentioned that you can have multiple frequencies. Is that multiple frequencies against other radar systems, or frequencies that the car emits from other components? I'm just curious how that's progressing.

speaker
Kobi Marenko

The multiple frequencies, yes, are for making sure that we won't suffer interference from other radars, and also from our own radars, because sometimes there is more than one radar per car. For that, we are changing the frequency, hopping between frequencies, in order to make sure that everything in the environment can live together without interfering with each other.
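A toy sketch of the hopping idea described here follows. The eight-sub-band plan and the uniform random hop policy are invented for illustration and are not Arbe's actual scheme; real systems also randomize timing and coding.

```python
import random

# Toy frequency-hopping model: each radar re-picks a sub-band every
# frame, so two radars may occasionally collide on a single frame but
# never sit persistently on the same band. The 8-sub-band plan and the
# uniform random hop policy are illustrative assumptions only.

SUB_BANDS = range(8)

def hop(current: int) -> int:
    """Move to a randomly chosen different sub-band."""
    return random.choice([b for b in SUB_BANDS if b != current])

radar_a = radar_b = 0
collisions = 0
frames = 100_000
for _ in range(frames):
    radar_a, radar_b = hop(radar_a), hop(radar_b)
    collisions += radar_a == radar_b

# Roughly one frame in eight collides in this toy model, but hopping
# guarantees the two radars never stay locked onto the same band.
print(f"per-frame collision rate: {collisions / frames:.1%}")
```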

speaker
Jamie

And my second follow-up is more of a financial question. I know you mentioned you're going to go into production in the fourth quarter. What are the components of revenues this year? Is it going to be evaluation sales or testing sales?

speaker
Karin Pinto-Flumenboim

So we are focusing, of course, as I said, on chipset production, but it will be back-end loaded toward the end of the year. During the year, we will still have small amounts of chipsets for non-automotive customers and also additional small volumes for our existing customers.

speaker
Kobi Marenko

But the major part is the production of our chips, the first production chips, for which we have a preliminary order.

speaker
Jamie

Okay, that's all the questions I have for now.

speaker
Karin Pinto-Flumenboim

Thank you, Jamie.

speaker
Kobi Marenko

Okay, we will have some more questions from the audience.

speaker
Karin Pinto-Flumenboim

How is Arbe different from, or complementary to, Mobileye?

speaker
Ram Machnes

So both Arbe and Mobileye have announced that the direction is high-channel-count radars based on the technology called FMCW. In that sense, we are very similar, trying to solve the problem of having good sensing for perception. Arbe is sampling the chipset right now. Arbe is going to be in production this year, in the fourth quarter, and that's ahead of the competition. We are already working today with the four Tier 1s we announced, and in touch with others as well, as well as other OEMs that are working right now to integrate this technology into their vehicles.

speaker
Karin Pinto-Flumenboim

The next question is from Billy: how close is the next best competitor to your product performance levels?

speaker
Ram Machnes

So if we put aside the very high channel counts, like what Mobileye is doing and Arbe is doing, the next ones are the 12x16s, which are basic imaging radars. They are far behind in terms of channel count: it's 192 channels versus 2,300, roughly a 10x factor between these two kinds of solutions, and the results scale accordingly. The level of false alarms that will cause phantom braking and the level of misdetections you get with the low channel count are dramatically higher, while the level of detail and the performance you get with the perception radar are dramatically better, and are what's required to get an image of the surroundings and the stationary objects, and to define and find the obstacles around you.

speaker
Kobi Marenko

Just to add to what Ram said: basically, we are providing 10x more performance at the same price as the next best radar. Next.

speaker
Karin Pinto-Flumenboim

Several investors had questions. What is the Tesla buzz all about?

speaker
Ram Machnes

So we can only refer to OEMs that have allowed us to say their names. But we are hearing a lot of requirements and demands, a global trend of OEMs going to imaging radars and perception radars with high channel counts, and that's across the board, across the market, across most of the OEMs today.

speaker
Karin Pinto-Flumenboim

Thank you. A question from Robert: are other Chinese EV manufacturers testing or using your solutions?

speaker
Ram Machnes

Yes, the answer is absolutely yes. There are others that are right now working to integrate radars based on our chipset into their solutions, their vehicles.

speaker
Karin Pinto-Flumenboim

Last question. Kobi, what keeps you up at night?

speaker
Kobi Marenko

So, of course, apart from the fact that we are in the final days, sorry, final months before production, what really keeps me up at night is how we stay motivated for innovation: making sure that side by side with taking those chips to production, supporting our customers, and generating hundreds of millions of dollars from the current product, we are still innovating like an early-stage startup on our next-generation product. Thank you.

Okay, we are grateful for your participation today, and we appreciate your ongoing support as we strive to push the boundaries of innovation in the industry and become the leader in Level 2+ and Level 3 advanced perception. To our valued employees and partners, we extend our sincerest thanks for your commitment to Arbe. It is your hard work and dedication that propels us forward to our goals. We are excited about the opportunities that lie ahead, and we are committed to keeping you updated on our progress. Should you have any questions or want to discuss potential collaborations, please do not hesitate to reach us at investors@arbe.com or visit our website to schedule a meeting. We look forward to hearing from you. Thank you all. Thank you.

Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
