2/23/2023

speaker
Operator

Good afternoon, ladies and gentlemen. Thank you for standing by. Today's conference call will begin soon. Once again, thank you for your patience.

speaker
Craig

Greetings.

speaker
Operator

Welcome to InnoData's fourth quarter and fiscal year 2022 earnings call. At this time, all participants are in a listen-only mode. A question and answer session will follow the formal presentation. If anyone should require operator assistance during the conference, please press star zero on your telephone keypad. Please note, this conference is being recorded. I will now turn the conference over to your host, Amy Agress. You may begin.

speaker
Amy Agress

Thank you, John. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of InnoData, and Marissa Espineli, Interim CFO. We'll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the fourth quarter and the 12 months ended December 31, 2022. We'll then take your questions.

First, let me qualify the forward-looking statements that are made during the call. These statements are being made pursuant to the safe harbor provisions of Section 21E of the Securities Exchange Act of 1934, as amended, and Section 27A of the Securities Act of 1933, as amended. Forward-looking statements include, without limitation, any statement that may predict, forecast, indicate, or imply future results, performance, or achievements. These statements are based on management's current expectations, assumptions, and estimates and are subject to a number of risks and uncertainties, including, without limitation: the expected or potential effects of the novel coronavirus (COVID-19) pandemic and the responses of governments, the general global population, our customers, and the company thereto; impacts resulting from the rapidly evolving conflict between Russia and Ukraine; investments in large language models; that contracts may be terminated by customers; that projected or committed volumes of work may not materialize; pipeline opportunities and customer discussions which may not materialize into work or expected volumes of work; acceptance of our new capabilities; continuing Digital Data Solutions segment reliance on project-based work and the primarily at-will nature of such contracts, and the ability of these customers to reduce, delay, or cancel projects; the likelihood of continued development of the markets, particularly the new and emerging markets, that our services and solutions support; continuing Digital Data Solutions segment revenue concentration in a limited number of customers; potential inability to replace projects that are completed, canceled, or reduced; our dependency on content providers in our Agility segment; a continued downturn in or depressed market conditions, whether as a result of the COVID-19 pandemic or otherwise; changes in external market factors; the ability and willingness of our customers and prospective customers to execute business plans that give rise to requirements for our services and solutions; difficulty in integrating and deriving synergies from acquisitions, joint ventures, and strategic investments; potential undiscovered liabilities of companies and businesses that we may acquire; potential impairment of the carrying value of goodwill and other acquired intangible assets of companies and businesses that we may acquire; changes in our business or growth strategy; the emergence of new or growth in existing competitors; our use of and reliance on information technology systems, including potential security breaches, cyber attacks, privacy breaches, or data breaches that result in the unauthorized disclosure of consumer, customer, employee, or company information, or service interruptions; and various other competitive and technological factors and other risks and uncertainties indicated from time to time in our filings with the Securities and Exchange Commission, including our most recent reports on Form 10-K, 10-Q, and 8-K, and any amendments thereto.

We undertake no obligation to update forward-looking information or to announce revisions to any forward-looking statements except as required by the federal securities laws, and actual results could differ materially from our current expectations. Thank you. I will now turn the call over to Jack.

speaker
Jack

Thank you, Amy. Good afternoon, everybody. Thank you for joining our call. Today I'm going to talk briefly about our Q4 and year-end results, and then I'm going to spend some time discussing the recent acceleration in AI investment by large technology companies in large language models, coinciding with OpenAI's fourth quarter release of its large language model called ChatGPT, and how we believe InnoData is quite well positioned to capitalize on this increased investment.

So first, our results. Q4 revenue was $19.4 million, a 5 percent increase over the prior quarter, which annualizes roughly to a 22 percent growth rate. We posted positive adjusted EBITDA of approximately $250,000 in the quarter, which was a positive swing of $1.5 million from Q3. This significant improvement resulted primarily from our September-October cost containment and efficiency initiatives. The benefits of these initiatives will be fully reflected in our first quarter 2023 results. We ended the year with a healthy balance sheet: no appreciable debt and $10.3 million in cash and short-term investments. In 2022 overall, we grew revenues 13 percent, despite the significant revenue decline from our large social media customer that underwent significant internal disruption in the second half of the year, but that we believe may normalize this year.

Let's now shift to the recent substantial uptick we are seeing in our market activity. As most everyone now knows, in late Q4, OpenAI unveiled ChatGPT. This AI large language model has since gone viral, capturing the popular imagination for its ability to write, to generate computer code, and to converse at what seems like human or even superhuman levels of intelligence. We believe the release of ChatGPT has been broadly seen as a watershed event, potentially heralding a fundamental advancement in the way AI can drive changes in business communication processes and productivity. Our market intelligence indicates that many large tech companies are accelerating their AI investments as they compete for dominance in building and commercializing large language models, and that an arms race of sorts is now forming. We believe that the significant investment that will likely result from this competition could dramatically accelerate the performance of these large language models. As a result of this dramatic increase in performance, we expect almost every industry will face fundamental reinvention.

We believe that the opportunity for InnoData in all of this is significant and that it is now upon us. We believe our opportunity is actually threefold. First, to help large technology companies, both existing customers and new customers, compete in this large language model arms race. Second, to help businesses incorporate large language models into their products and operations. And third, to integrate these technologies into our own platforms.

Let's take each of these in turn, starting with what I just laid out as our first opportunity: helping technology companies, both existing customers and new customers, compete in the large language model arms race. Well, ChatGPT and a host of lesser-known but equally impressive large language models are for sure amazing. Our view is that it's still early days. We believe these large language models have room for significant improvements in output quality, in the languages they serve, in the domains they support, and in terms of safety. These are all challenges that we believe we can help with.
We expect to help by collecting large-scale real-world data for training, by creating high-quality synthetic data when real-world training data is hard to come by, by annotating training data, and by providing reinforcement learning from human feedback, or RLHF, to fine-tune model performance and eliminate hallucinations, the tendency of these models to make things up on the fly. In addition, we expect to help by minimizing the risk that models generate unsafe or biased results, and we expect to help by hyper-training generalized models for specialized domains. High-quality data is at the root of addressing all of these challenges, and this is, and has been, InnoData's bread-and-butter specialty for 30 years.

We believe that the arms race to which I'm referring has likely already begun. In just the past few weeks, it seems, activity for us has dramatically surged. We are now either expanding work with, beginning work with, or discussing working with four of the five largest technology companies in the world. I'm going to share some examples of the surge in activity we've seen in just the past few weeks. A major cloud provider whose AI needs we began serving 24 months ago engaged us to help them build a new large-scale generative AI model for images. We started the initial phase of this just this week. In addition, the customer asked us just last week to kick off a pilot to support their generative AI large language model development. We started the pilot this week. With the same customer, also in the last few weeks, we expanded our synthetic data program to support its large language model development.

We believe high-quality synthetic data is likely to be a key ingredient to high-performing large language models of the future. Synthetic data is entirely new data that we generate through a machine-assisted process to match real-world data and maintain all of the statistical properties of real-world data, which is especially useful for capturing rare cohorts and outliers of interest. Synthetic data is also helpful to correct for data bias, to improve algorithmic fairness, and to avoid having to train on proprietary or confidential data. We started working with synthetic data back in 2022, and we've been continually improving our capabilities and technologies for synthetic data creation since then.
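
To illustrate the idea of synthetic data that preserves the statistical properties of real-world data, here is a deliberately simplified sketch. It is not InnoData's tooling, and the record fields are invented for the example; production approaches model joint distributions, rare cohorts, and privacy constraints far more carefully.

    import random
    import statistics

    # Toy "real" records. The goal is to generate brand-new rows whose statistical
    # properties (spread of ages, claim-type frequencies) match the originals,
    # without reusing any original record.
    real = [
        {"age": 34, "claim_type": "disability"},
        {"age": 51, "claim_type": "long_term_care"},
        {"age": 47, "claim_type": "disability"},
        {"age": 29, "claim_type": "personal_injury"},
    ]

    ages = [r["age"] for r in real]
    age_mu, age_sigma = statistics.mean(ages), statistics.stdev(ages)
    claim_types = [r["claim_type"] for r in real]

    def synthetic_record() -> dict:
        # Sample each field from the fitted marginal distributions.
        return {
            "age": max(18, round(random.gauss(age_mu, age_sigma))),
            "claim_type": random.choice(claim_types),  # preserves category frequencies
        }

    synthetic = [synthetic_record() for _ in range(10)]
    print(synthetic[:3])
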
With this customer, we've gone from serving just one of their product lines to now being firmly engaged with three product lines, and we are on pilots with three additional product lines.

Also, in the past couple of weeks, with another of the world's largest tech companies, this one a company that would be a new customer for us, we've gotten a verbal commitment to assist them on projects relating to large language models. They have told us that they are in the final stages of putting in place a statement of work. While there can be no assurance that the SOW will be put in place, based on our current estimates and assumptions, the value of this program could potentially approach an approximately $1.8 million per year run rate in its initial phases and could ramp up to approximately $6 million per year as it gains momentum.

In addition, two weeks ago, one of the world's largest social media companies, another potential new customer for us, reached out to discuss how we might potentially support its large-scale model development. They had been referred to us by one of our existing customers, who apparently said that we could be helpful in unlocking value, unlocking scale, and bringing a consultative approach to a partnership.

We believe that the opportunities I've just mentioned, individually and in the aggregate, are potentially very large. I want to underscore that several of these are pipeline opportunities at various stages of the pipeline, from early stage to late stage. Pipeline opportunities are inherently difficult to forecast and often do not close. That said, I've offered them here in support of two beliefs. The first is the belief that there is building momentum among big tech companies for AI innovation generally and large language models specifically. The second is the belief that InnoData's reputation for high-quality work with high-quality outcomes is becoming firmly established in a dynamic market that views us as a potential partner in one of our generation's greatest innovations.

Now let's shift to our second significant market opportunity. We believe that our second significant market opportunity is to help businesses harness the power of these foundational generative AI models. Most enterprises have tasks that generative AI can make easier. As the technology improves, and we expect it will, we believe that businesses will see incorporating the technology as a must-have rather than a nice-to-have. Analysts are predicting that this year the most forward-thinking business leaders will be actively putting time and money into reimagining their products, service delivery, and operations based on what AI can do for them, leading to widespread deployments over 2024 and 2025. What we're also hearing, especially from CTOs, is that their biggest roadblock to deploying AI is finding the right engineers and data scientists to help them get there. We believe our opportunity will be to do just that, to help them get there. We anticipate that this will take the form of fine-tuning existing pre-trained large language models on specific tasks within specific domains, bringing expertise in prompt engineering, the art of prompting large language models to produce the appropriate results, and helping with large language model application integration.

Early in the first quarter of 2023, a large financial technology company expanded scope with us to leverage our proprietary AI models more fully and re-engineer their technology for the cloud to drive operational efficiencies. Our proprietary AI engine, GoldenGate, uses the same underlying encoder-decoder transformer neural network architecture as GPT. While GPT is trained broadly, GoldenGate is trained narrowly on specific tasks and domains. We have experimented with coupling GPT and GoldenGate, and this seems to result in even higher orders of performance. This is the third scope expansion we've had with this company over the course of the past six months, again providing further validation of our land-and-expand strategy.
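
The coupling described here, a broadly trained generative model paired with a narrowly trained domain model, can be pictured with a deliberately simplified sketch. This is not the GoldenGate implementation, which has not been published; both functions below are illustrative stubs, and the candidate texts and keywords are invented.

    # One common coupling pattern: the general model proposes candidates and a
    # narrow domain model scores them, keeping the best.

    def general_model_generate(prompt: str) -> list[str]:
        # Stand-in for candidates sampled from a broadly trained LLM (e.g., via an API).
        return [
            "The patient was seen for a routine check-up.",
            "The claimant was diagnosed with a lumbar strain by the attending physician.",
            "Thanks for asking! Here is some general information.",
        ]

    def domain_model_score(text: str) -> float:
        # Stand-in for a narrowly trained domain scorer; here it simply rewards
        # in-domain vocabulary, where a real scorer would be a trained model.
        keywords = ("claimant", "diagnosed", "physician")
        return sum(word in text.lower() for word in keywords)

    def coupled_generate(prompt: str) -> str:
        candidates = general_model_generate(prompt)
        return max(candidates, key=domain_model_score)

    print(coupled_generate("Summarize the medical record"))
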
We believe our third opportunity is to harness GPT and other large language models in our own AI industry platforms. Just last month, we announced PR Copilot, a new module within our Agility PR platform that combines proprietary InnoData technology and GPT to enable communications professionals to generate first drafts of press releases and media outreach in record time. With our release of PR Copilot, we became, we believe, the PR industry's first integrated platform to incorporate large language model technology. The implementation was significant for InnoData, and we received a supportive write-up in PRWeek for it. A startup named Jasper vaulted to unicorn status when it implemented something very similar to PR Copilot for creating blogs and social media postings. Their efforts got them a $125 million Series A round on a healthy $1.5 billion valuation.

With respect to our Agility platform, we're seeing positive momentum in key performance indicators, which we think PR Copilot and our newly integrated social media listening product will help to further accelerate. In Q4, Agility platform sales grew 6% over Q3, which annualizes to a roughly 26% growth rate. In 2022 overall, our direct sales new logo bookings increased by 83% year-over-year, and our direct sales net retention increased to 100%. In 2022 overall, approximately 83% of our Agility revenue came from direct sales, and 17% of our revenue came from channel partners. In Q4, our demo-to-win conversion in direct sales increased to 33%, up from approximately 18% at the beginning of the year.

We believe the notion that customers who use us love us is also very much apparent in our Synodex platform. Synodex grew by 71% in 2022, with net retention of 168%. We announced in Q4 that one of our large Synodex customers had expanded its recurring revenue program with us. In the announcement, we stated that the expansion was valued at approximately $600,000, but we now believe the value of the expansion is actually closer to $1.2 million. This is now our second-largest Synodex customer, with an estimated annual recurring revenue base of $2.3 million. This year, we will be focused on product development to expand our addressable market for medical data extraction. We've got new products currently being evaluated by charter customers in disability claims processing, personal injury claims processing, and long-term care claims processing, as well as in clinical medical data annotation and fully automated life underwriting. Integrated AI will be a feature in all of these products.

We are more enthusiastic than ever about our market opportunity and the intrinsic value of our business. In our last call, we said we anticipated expanding our adjusted EBITDA to $10 million or more in 2023, while at the same time capturing significant growth opportunities. We believe the activity we are now seeing in our markets will likely enable us to achieve this and potentially more. I'll now turn the call over to Marissa to go over the numbers, and then we'll open the line for some questions.
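
For reference on the annualization arithmetic cited above (5% sequential growth quoted as roughly 22% annualized, and 6% as roughly 26%), a minimal illustration of the compounding:

    def annualized(quarterly_rate: float) -> float:
        # Compound a constant quarter-over-quarter rate across four quarters.
        return (1.0 + quarterly_rate) ** 4 - 1.0

    print(f"5% sequential growth -> {annualized(0.05):.1%} annualized")  # ~21.6%, roughly 22%
    print(f"6% sequential growth -> {annualized(0.06):.1%} annualized")  # ~26.2%, roughly 26%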

speaker
Marissa Espineli

Thank you, Jack. Good afternoon, everyone. Let me recap the fourth quarter and fiscal year 2022 financial results. Our revenue for the quarter ended December 31, 2022 was $19.4 million, compared to revenue of $19.3 million in the same period last year. Our net loss for the quarter ended December 31, 2022 was $2 million, or $0.07 per basic and diluted share, compared to a net loss of $1.2 million, or $0.04 per basic and diluted share, in the same period last year. Total revenue for the year ended December 31, 2022 was $79.0 million, up 13% from revenue of $69.8 million in 2021. Net loss for the year ended December 31, 2022 was $12.0 million, or $0.44 per basic and diluted share, compared to a net loss of $1.7 million, or $0.06 per basic and diluted share, in 2021. Adjusted EBITDA was $0.2 million in the fourth quarter of 2022, compared to adjusted EBITDA of $0.3 million in the same period last year. Adjusted EBITDA loss was $3.3 million for the year ended December 31, 2022, compared to adjusted EBITDA of $3 million in 2021. Our cash, cash equivalents, and short-term investments were $10.3 million at December 31, 2022, consisting of cash and cash equivalents of $9.8 million and short-term investments of $0.5 million, compared to $18.9 million at December 31, 2021. So thanks, everyone. John, we are now ready for questions.

speaker
Jack

Thank you.

speaker
Operator

At this time, we will be conducting a question and answer session. If you would like to ask a question, please press star 1 on your telephone keypad. A confirmation tone will indicate your line is in the question queue. You may press star 2 if you would like to remove your question from the queue. For participants using speaker equipment, it may be necessary to pick up your handset before pressing the star keys. One moment while we poll for questions.

speaker
Operator

The first question comes from Tim Clarkson with Van Clemens. Tim, please proceed.

speaker
Tim

Hi, Jack. I apologize if you hear noise in the background. We had a major snowstorm in Minneapolis, and the snow is coming off our buildings. So anyhow, the first question I have is, what exactly is the technology experience that makes InnoData successful at moving artificial intelligence forward in these chatbots?

speaker
Jack

Sure, Tim. Well, it's not limited to chatbots. Artificial intelligence, people believe, and I firmly believe, is at a kind of fundamental inflection point. We're now seeing the kinds of technologies that people have dreamt about probably since the 1950s. And when you think about building these technologies and what goes into them, it's not programming in the traditional sense. It's data. It's high-quality data, and data that can help to address some of the fundamental problems that these technologies have. They need to improve their output quality. They need to improve the languages they're supporting. They need to be customized for particular domains. They need to improve what we think of as safety, the kinds of responses, the kinds of things that they tell us. So what needs to be done for that? The things that need to be done are fundamentally the things that we've done for a very long time, very successfully, for some of the largest companies out there. In the large engagements we've had in the past, things like Apple, what we were doing was fundamentally building large-scale, high-quality data for products and for publishing. Here we're building it because data is the programming language of AI and the programming language of large language models.
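
As a concrete, invented illustration of what "data as the programming language of AI" tends to look like in practice, training data is produced as labeled examples whose quality is measured before it ships, for instance by checking agreement between independent annotators. The labels, annotator IDs, and example record below are hypothetical.

    # Two annotators label the same five items; agreement is one simple quality signal.
    annotator_a = ["positive", "negative", "neutral", "positive", "negative"]
    annotator_b = ["positive", "negative", "positive", "positive", "negative"]

    agreement = sum(a == b for a, b in zip(annotator_a, annotator_b)) / len(annotator_a)
    print(f"Raw inter-annotator agreement: {agreement:.0%}")  # 80% in this toy case

    # A single labeled training example might then be shipped as a record like this.
    training_example = {
        "text": "The claimant reports persistent lower-back pain.",
        "label": "medical_condition",
        "annotator_ids": ["a_017", "a_042"],  # hypothetical IDs for illustration
        "adjudicated": True,                  # disagreements resolved by a senior reviewer
    }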

speaker
Tim

Right, right. And what typically are the gross margins on the revenues you're getting from this? Are these high gross margin products?

speaker
Jack

I think from a gross margin perspective, I would continue to expect a range of gross margins from our different capabilities. You know, in the services sector, I think, you know, mid-30s to mid-40s, you know, gross margins are achievable, and they'll get better over time. As we introduce, you know, automations and technology, they tend to drift higher. When we're starting up new projects, they tend to drift a little bit lower. On the platform side, they're higher than that. You know, incremental gross margins especially can be, you know, very substantial. And as we, you know, build scale and start to scale on our fixed costs, we can start to see the kinds of gross margins that will emerge from those kinds of business models.

speaker
Tim

Right, right. Well, just one last comment. I've been all excited about InnoData lately. And the analogy I use is, what's happened with artificial intelligence is, before, it was sort of like da Vinci seeing a picture of an airplane. But it's one thing to see a picture of an airplane. It's another thing to see one fly by you and go from Minneapolis to New York. And when you actually see this artificial intelligence stuff work, you no longer have to be sold on the value. I mean, it's magical. So for me, that's the difference: people are really excited about the end product. And emotion is what ultimately drives decision making. And there's real excitement behind this. So with that, I'm done. Thanks.

speaker
Operator

Thank you, Tim. The next question comes from Dana Busca with Feltl and Company. Please proceed.

speaker
Dana Busca

Hi, Jack. How are you today?

speaker
Jack

Dana, I'm doing great. Thank you. Welcome to the call.

speaker
Dana Busca

You're welcome. Yeah, thank you for taking my questions. My first question is about your new PR Copilot. You talk about your own technology, and I was just wondering, is your technology more than just accessing an API at OpenAI?

speaker
Jack

So it is. I mean, that's basically what we're doing, and we've got a very exciting roadmap. We really believe we've only just gotten started with ChatGPT. What we're doing is combining an API with prompt engineering that we've done behind our UI and behind the scenes. And in the future, what we're going to be doing is enriching the training data to more specifically perform within the domain. We've also done other things in terms of that, and we will be doing other things in order to further enrich the experience. So when you take the two fundamental use cases we're addressing, one for writing press releases and one for writing media outreach, we're looking at, well, what goes on in someone's head when they're looking to do either one of those things? What connections are they making? Where do they have to go and research? Built into our product and built into our underlying data model are lots of connections that we're able to harvest and bring into the prompts in order to create a more precise level of output. So it's very much a combination of these things. Now, I'm excited for several reasons, but one of the things this is enabling us to do is to create a lot of value for PR, and at the same time learn a lot about how you integrate these technologies to create a superior customer experience. And we're able to bring that experience, in turn, to the work that we're doing for other customers. So it's great fun, and it really is a new frontier.
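
The general pattern described here, assembling a domain-aware prompt behind the UI and sending it to a hosted large language model, can be sketched roughly as follows. The template wording and the call_llm() placeholder are assumptions for illustration, not PR Copilot's actual prompts or stack.

    # Minimal sketch: structured inputs are folded into a prompt template, then
    # handed to a hosted model behind the scenes.
    PRESS_RELEASE_TEMPLATE = """You are an experienced public-relations writer.
    Write a first-draft press release.

    Company: {company}
    Announcement: {announcement}
    Key facts: {facts}
    Quote from spokesperson: "{quote}"

    Use an inverted-pyramid structure and an objective, newsroom tone."""

    def build_prompt(company: str, announcement: str, facts: list[str], quote: str) -> str:
        return PRESS_RELEASE_TEMPLATE.format(
            company=company,
            announcement=announcement,
            facts="; ".join(facts),
            quote=quote,
        )

    def call_llm(prompt: str) -> str:
        # Placeholder for an API call to a hosted model; swap in a real client here.
        return "[model-generated draft would appear here]"

    draft = call_llm(build_prompt(
        company="Example Corp",
        announcement="launch of a new analytics product",
        facts=["available Q3", "pricing starts at $99/month"],
        quote="We built this for busy communications teams.",
    ))
    print(draft)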

speaker
Dana Busca

That sounds wonderful. I was wondering if you were going to be able to apply your co-pilot technology to other industries or other companies or other applications.

speaker
Jack

So we're certainly talking to other companies about this. We're looking at some opportunities to apply large language models within our Synodex platform. That's kind of early days, though, so I wouldn't, you know, I'd encourage you not to expect anything out of that very quickly. We think the roadmap, though, for, you know, PR Copilot is fairly extensive, and we've really only begun down that path. So we're very excited about that.

speaker
Dana Busca

Excellent. With Agility, where are you projecting breakeven is going to be?

speaker
Jack

So I don't think we've put out a number relative to that. But what we've said is that we think we are going to be getting that business to become an adjusted-EBITDA-positive business in the first half of the year.

speaker
Dana Busca

Okay. Excellent. And in the past, you've given out projections for the first quarter, and I don't know if I missed it or not, but do you have any type of projection in revenue growth that you're thinking for the first quarter?

speaker
Jack

We've decided that this year, for the most part, we're probably not going to provide forward-looking guidance, only because of the level of activity that's going on in the business. It's so substantial and coming on so strong that being able to reduce these things to forecasts, and to know when something's going to close and what it's going to look like and how it's going to ramp up, is almost impossible. So the likelihood that we would be wrong, and maybe even significantly wrong, is pretty substantial. Therefore, we've gone the other direction, which I think is materially disclosing: here's what's going on in the business, here's what's coming our way. We're now working with four of the five largest technology companies, who are fundamentally driving what will probably be the innovation of our lifetimes. We'll no doubt look back on it as that. And we're partnering with them. We're involved, and that opportunity is coming our way. So, long story short, we're going to stay out of the guidance business right now, but we're going to try to disclose what we're doing and the level of activity that we're seeing.

speaker
Dana Busca

Okay. Can we anticipate that it'll be a growth year?

speaker
Jack

Yes, we're very much focused on growth. So yes, is the easy answer to that.

speaker
Dana Busca

Okay, excellent. All right, thank you.

speaker
Operator

Once again, if you have a question or a comment, please press star 1 on your touchtone phone. Once again, that's star 1 if you have a question or a comment. The next question comes from Marco Petroni with NG Capital Management. Please proceed.

speaker
Marco Petroni

Hey, Jack. How are you?

speaker
Jack

Marco, hi. How are you?

speaker
Marco Petroni

Good. A couple of questions. One, everybody has AI and machine learning algorithms, and they put them to different uses. But is there any company out there that you know of that combines that with the ability that you have on the data side, with regards to organizing, collecting, and overlaying synthetic data on top of that? Is there anybody out there, including the big guys, that can do that?

speaker
Jack

Well, I think there are. I think there are a couple of companies that are doing some things that are similar to us, though not very many. We've kind of got a view of the world that we can do three things well, and we think that there's a virtuous circle that forms when we do the three things well. First is AI data preparation. We're helping large companies accelerate their ability to innovate in AI by doing the things that we do on the data side. The second thing is we're then helping deploy those models and integrate them into people's businesses. So we're helping build the models, and then we're helping integrate the models. And then thirdly, we have our own platforms, and we're learning the hard way. We're eating our own dog food, and we're figuring out how to do it for ourselves first. So we then develop the expertise to bring to both the data collection, well, how do you collect data in a way that results in high-performing models, and the model deployment, how do you best deploy models in legacy workflows and legacy systems? What are the opportunities for reinvention that you can bring to bear?

speaker
Marco Petroni

ChatGPT is great, but I've had experiences where I put the same data in and get different answers. Obviously, that can't be used within a company. Do you guys have the same capabilities as OpenAI in terms of creating those types of ChatGPTs within an organization, specifically for an organization? So, for example, for their call centers, or internally, to be able to use that to interact amongst employees as well as customers.

speaker
Jack

Yeah, so the essential architecture behind GPT is an architecture that is also behind our proprietary GoldenGate technology. Now, do we have the same ability to stand up something that performs the way that one does in a generalized way? Absolutely not. We don't have the budgets. Hundreds of millions of dollars were likely spent on getting it trained to the level that it's been trained to. There's a tremendous amount of data that gets poured in to create the billions and hundreds of billions of parameters that drive that model. A tremendous amount of cloud processing went into that. We cannot do that. But what we can do, and what's the future of the way these things will work, is we can build on those. We can train them. We can customize them. We can use what's called reinforcement learning from human feedback in order to train customized, domain-customized models with private data to enable them to perform better. That's really going to be the future of this. So we'll see. The big companies with large language models proprietary to them, we will help them build those, but we won't be able to fund those ourselves. But then what we'll be able to do is customize them and build upon them in order to create business outcomes for people.
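
A deliberately simplified picture of the RLHF idea referenced here: human preferences are distilled into a reward model, which then steers a base model's outputs, shown below as simple best-of-n selection rather than full reinforcement learning. The reward_model() stub and the candidate texts are invented for illustration; a production reward model is a neural network trained on many human preference comparisons.

    def reward_model(response: str) -> float:
        # Stub: reward shorter, hedge-free answers. A real reward model is trained
        # so that responses humans prefer receive higher scores.
        penalty = response.lower().count("i think")
        return -len(response) / 100.0 - penalty

    candidates = [
        "I think the claim is maybe about a knee injury, possibly, I think.",
        "The claim concerns a knee injury sustained on 2021-03-04.",
    ]

    # Best-of-n selection: keep the candidate the reward model scores highest.
    best = max(candidates, key=reward_model)
    print("Selected response:", best)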

speaker
Marco Petroni

And just one last question. You guys are trading at $200 million, roughly two times revenue. What is the company going to do going forward? I mean, obviously AI is everywhere now, so what is the company going to do to market our stock, basically, and get out there? I know earnings and revenue are great, but to get out there and do that, what do you have planned in the coming months and quarters?

speaker
Jack

So I think the most important thing we can do for our shareholders, and of course I'm prominent among our shareholders, is to continue to do the things that we're doing. The fact that we're in four out of five of the largest technology companies, helping to develop what will be a transformative technology that's still in its infancy and will need a lot of work over the next several years, I think is huge. And that we've gotten there and are doing that without having to go out into the markets and dilute our equity or raise a ton of debt, I think is an impressive feat. Now, how do we better promote that? I think the first thing starts with execution, and then what follows from that is lots of conversations with people who I'm hoping will be attracted to the execution that we're bringing to bear. And, of course, looking at some of the techniques that people use: conferences and outreach and talking to analysts and all of those things. But fundamentally, we're going to be about execution.

speaker
Marco Petroni

No, absolutely. But, I mean, I've been a shareholder for two years, and nobody really knows about us. Every other AI company is trading at, you know, a 5, 10, 15 multiple on revenue. We're trading at two times. And going forward, we have the potential of growing at 30-plus percent. It seems like we're pretty undervalued here compared to the sample. But thank you. Thanks, Marco.

speaker
Operator

Okay, the next question comes from Craig Samuels with Samuels Capital Management. Please proceed.

speaker
Craig

Hey, Jack. How are you?

speaker
Jack

Hey, Craig. How are you?

speaker
Craig

I'm well, thanks.

speaker
Jack

Pretty good, thank you.

speaker
Craig

Back several quarters ago, you talked about your total number of sales reps, both on the Agility side and on the services and solutions side. Where do you stand today with the number of sales reps?

speaker
Jack

So on the services and solutions side, on the AI side, we're at about the same number that we last shared. We've taken down the number quite a bit on the Agility side. A couple of things went into that decision, the first of which was that in the beginning of the year we were having a hard time retaining people, very specifically in one of the sales offices that we put up. There was a labor shortage that was pretty well known. We were in Austin, Texas, where a lot of SaaS companies were, and they were overpaying, as far as I'm concerned, for talent. And we didn't want to play that game. So what we decided instead was, let's be good stewards of capital. Let's not play the game of overpaying for talent. Let's instead work with a smaller number. We've had great success retaining very talented salespeople in our other offices. Let's retain them. Let's work on it. Let's build a sales organization that has very much a data-driven approach to sales and to optimizing our customer experience, and build from there. And I think that's proving to be the right decision. As we look out at Agility and see what's going on now, we see new logo bookings up 83% year over year, our net retention going from the 90s up to 100% now, and significant performance improvement in the number of demos that we do that end up in closed sales, going from like 18% at the beginning of the year to 33% now. So with that, we will, I believe, see acceleration in growth, and it's always easier to throw logs on a fire that's burning strongly. So that's kind of where we are.

speaker
Craig

So I don't remember the last numbers that you had. Can you share them for services? Going back in time, I seem to remember it was like six or seven, and on the Agility side, you had a target of 110, and the last numbers I have are in the 67 zone. It's been a little while. Can you actually tell us?

speaker
Jack

Sure. I think what we said was we had nine folks, quota-bearing executives, in the services and solutions area. And then we had a combination of about 90 people in Agility: 42 quota-bearing people and another 37 BDRs. In Agility, we've reshuffled those numbers pretty considerably in terms of the mix and the workflows, and we brought that number down quite a bit. I don't have the current number to share with you; happy to do that off the call when I go get it. But we've brought that down, and we're getting the performance off of that smaller cohort, which, at the end of the day, is what it's all about.

speaker
Craig

Right. And then on the nine service sales reps, I recall from, again, this is probably two years ago.

speaker
Jack

Craig, sorry, I think you dropped off. One moment. I'll reconnect Craig. Okay. Craig, can you hear us?

speaker
Craig

I can hear you. Are you there?

speaker
Operator

Okay. Your line is live.

speaker
Jack

I'm not sure what happened. Yeah, hi, Craig.

speaker
Craig

I'm not sure what happened. I had asked about productivity, and I seem to remember about a million-and-a-half-dollar quota per services AI sales rep. Is that still consistent with where you guys are today, or has that number gone up or down?

speaker
Jack

Yeah, so I think we put that out there as an average. In the services and solutions area, the quotas are actually derived from the account assignments. So depending upon the accounts that people are working on, they can be fairly significantly different from that. They can be much higher than that, and an entry-level person who's kind of building his account base can be lower than that.

speaker
Craig

Got it. And then also, NVIDIA had some news regarding their data centers providing computing power for AI, and I'm just wondering if that helps you or if that's competitive.

speaker
Jack

No, I think it's very much supportive of the value proposition. I was on their call, and I read what was said on the call, and I think the way they're viewing the opportunity is very much the way we're viewing the opportunity. They're looking at it from a different perspective. They're looking at it from the perspective of enabling it from the processor side. We're looking at it from the perspective of enabling it with data. And these are two sides of the same coin as far as I'm concerned.

speaker
Craig

Yep, that's exactly what I thought. I just wanted to hear you confirm that. And then lastly, Would you expect your gross margins to increase over the next 12 to 36 months? Will there be a greater software component or will it still be, you know, heavily weighted towards services?

speaker
Jack

I think it depends on what happens. Given the very significant activity that we're now seeing, if we're successful at closing the opportunities that are before us, I think we're going to continue to see a very heavy weighting toward solutions and services from a consolidated margin perspective. I'm willing to live with that problem, though.

speaker
Craig

So that means still below 40%?

speaker
Jack

Not necessarily. I think that as we start to execute the plan, we will be able to move above 40% over time, but probably still below 50%. Again, it remains to be seen what we're able to deliver on the platform side, but if the opportunity is as large as we're hoping it is on the solutions side, I think it will weight toward that. Yeah.

speaker
Craig

Sounds good. Keep up the good work and look forward to continuing to monitor your progress over time. Thanks. Thank you.

speaker
Operator

We have reached the end of the question and answer session, and I will now turn the call over to Jack for closing remarks.

speaker
Jack

Thank you, operator. So, yeah, I'll quickly recap. We're seeing a very recent acceleration in AI investment by large tech companies, which seems to be coinciding with OpenAI's release of ChatGPT. We're now either expanding work with, beginning work with, or discussing starting work with four of the five largest tech companies in the world. And much of what is under discussion has to do with building and improving large language models. We're very excited about where we are with these companies, and excited about where we are with a host of similarly impressive companies across other domains. Even though forecasting exact close dates remains challenging, we think we're in the right place at the right time to ride this wave. We're seeing positive trends across our other business segments as well. Synodex growth last year was huge, and it's well positioned to expand in its market this year. In Agility, supported by what we believe was a very successful release of PR Copilot, we continue to make great strides in win rate, net retention, and bookings, all of which are, of course, leading indicators of accelerating growth. So we're very excited to be here today, very excited about the news that we're sharing today, and thank you all for participating in this call. We look forward to our next call with you.

speaker
Operator

This concludes today's conference, and you may disconnect your lines at this time. Thank you for your participation.

Disclaimer

This conference call transcript was computer generated and almost certainly contains errors. This transcript is provided for information purposes only. EarningsCall, LLC makes no representation about the accuracy of the aforementioned transcript, and you are cautioned not to place undue reliance on the information provided by the transcript.
