Flybridge AI Index: May 2024 Update
Summary:
In May, the Flybridge AI Index returned 2%. Over the prior 12 months, the Index returned 35%, bringing its cumulative return since January 2023 to 104%.
Of the 28 companies in the Index, 19 gained in May and 9 declined. Significant gainers include NVIDIA (+32%), C3.AI (+29%), and ARM (+26%). Significant decliners include MongoDB (-37%), UiPath (-34%), and Nice (-17%).
Innovation in the AI market continues, with several companies reporting new capabilities and models. The infrastructure companies are predominantly driving growth, with both NVIDIA and Micron reporting strong results, but many of the case studies from early application-level deployments are encouraging.
We added two companies to the Flybridge AI Index: Qualcomm and Micron. Both companies are included in the Index from its inception, and the Index's performance has been adjusted accordingly.
Our Thoughts
Several companies in the Flybridge AI Index reported quarterly earnings in May. The results reflect a common theme: the hardware and infrastructure companies saw significant growth, while the more application-layer companies discussed tightening budgets and longer sales cycles. This is consistent with our perspective that the initial wave of growth in the AI sector is being driven by capability building (i.e., training). At the same time, we and many of the reporting companies are optimistic about how AI will drive growth “up the stack”. As noted in the MongoDB earnings call, most of their customers are not yet in production with GenAI use cases, and it will take several months to see significant revenue coming from AI.
AI is both an opportunity and a threat for some companies in the Index. For example, Salesforce saw a massive decline in its growth rate in the quarter. AI may represent a significant opportunity to leverage its petabytes of data, but it could also empower newer players to take market share.
We look forward to seeing how these AI strategies will unfold and impact the performance of our index in the coming months.
AI Insights and News:
Here are some noteworthy AI-related insights and news from companies within our index:
NVIDIA had a blowout quarter in Q1-24. Revenue grew 262% YoY, led by data center revenue, which grew 427% YoY on strong demand for the Hopper GPU platform. Other insights from the NVIDIA earnings call included:
For every $1 spent on NVIDIA AI infrastructure, cloud providers have an opportunity to earn $5 in revenue over 4 years.
NVIDIA supported Tesla's expansion of its training AI cluster to 35,000 H100 GPUs.
Inference is estimated to have driven about 40% of Data Center revenue in the trailing 4 quarters, with both training and inference growing significantly.
UiPath’s CEO, Daniel Dines, participated in the MAD podcast and shared part of the company's strategy around AI. Dines highlighted how GenAI was the missing piece of their automation strategy, as it allows the company to leverage unstructured data. The company has an LLM based on T5 (Dines shared that encoder/decoder models are better suited for semi-structured documents). This interview is worth listening to, as the UiPath journey is both amazing and insightful.
IBM provided open access to its “Granite” family of models, which allows customers to customize the models (and then run them in Watsonx). (Source)
C3.AI reported their Q1-24 earnings. Some interesting insights included:
In fiscal year '24, 88% of C3 bookings were driven by AI application sales and 12% by the C3 AI platform.
Baker Hughes' sourcing optimization, using C3 AI, is deployed across 855 sites, offering potential savings of $100 million a year.
C3 AI launched 30 generative AI products in fiscal year '24, with almost 50,000 inquiries from 3,000 businesses in Q4 alone.
HubSpot reported their Q1-24 earnings. Some interesting insights included:
HubSpot's Spring Spotlight featured over 70 new AI feature releases.
Over 50% of enterprise portals and more than 25% of Pro portals are utilizing AI for personalized content generation, call summarization, and automating go-to-market motions.
Marvell Technology reported its Q1-24 earnings. Some interesting insights included:
AI-driven data center revenue reached $816 million, showing 87% year-over-year growth.
AI-driven custom compute programs are expected to increase significantly, with a projected AI revenue floor of $1.5 billion for this fiscal year.
The data center and AI markets are expected to be major growth drivers, with the AI-driven data center market expected to grow from $21 billion in 2023 to $75 billion by 2028, a 29% CAGR (a quick arithmetic check follows below).
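As a sanity check on that implied growth rate (our own back-of-the-envelope arithmetic from the stated endpoints, assuming a five-year compounding horizon from 2023 to 2028, not a figure from the call):

\[
\left(\frac{\$75\text{B}}{\$21\text{B}}\right)^{1/5} - 1 \approx 1.29 - 1 \approx 29\% \ \text{CAGR}
\]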
MongoDB reported their Q1-24 earnings. Some interesting insights included:
MongoDB released the MongoDB AI Applications Program (MAAP), an initiative to help organizations rapidly build and deploy generative AI-enhanced applications at enterprise scale through strategic advisory, professional services, and an integrated technology stack. MAAP includes partnerships with leading players in the AI stack such as Anthropic, Anyscale, AWS, Cohere, Credal, LangChain, LlamaIndex, and Nomic, among others.
MongoDB is leveraging GenAI to significantly reduce the time, cost, and risk of modernizing legacy relational applications, potentially decreasing the effort required by approximately 50%.
Palantir reported their Q1-24 earnings. Some interesting insights included:
General Mills reported saving $14 million annually due to their deployment of Palantir’s AI solutions.
Salesforce reported their Q1-24 earnings. Some interesting insights included:
Salesforce launched Einstein Copilot, Prompt Builder, and Einstein Studio in Q1, with hundreds of Copilot deals closed since launch, indicating strong customer adoption of generative AI tools.
Einstein is generating hundreds of billions of predictions daily and trillions weekly.
Internally, Salesforce integrates AI in Slack, answering 370,000 employee queries in a quarter and saving developers 20,000 hours of coding per month.
Snowflake reported their Q1-24 earnings. Some interesting insights included:
About 40% of Snowflake customers are processing unstructured data on the platform. Over 1,000 customers have been added to this category in the last six months.
Snowflake announced that their AI layer, Cortex, is generally available, with over 750 customers already using it.
Note: These insights are focused on AI developments and priorities as discussed in management publications, earnings calls, and other company announcements.
Performance Overview:
Inception (January 2023) to date returns for the Flybridge AI Index are 102%. In comparison, over the same time period the Bessemer Cloud Index returned 23%, the S&P 500 returned 36%, the Nasdaq returned 60%, and the F-Prime Fintech Index returned 68%.
The median NTM revenue multiple decreased from 10.9x to 9.9x.
The median quarterly YoY revenue growth rate was 16%.
The median LTM net income margin was 13%.
Additions/Deletions:
We received amazing feedback from the community after releasing the Index. Thanks to a contribution from George Lin, we realized we had missed two companies, which we now welcome to the Index. Congratulations to Qualcomm and Micron on their inclusion.
Qualcomm: Qualcomm's AI platforms are designed to enhance on-device AI capabilities, providing efficient and powerful processing for machine learning and inference tasks. The company recently released the Snapdragon X Elite, an ARM-based processor capable of running generative AI LLMs with over 13B parameters on-device at blazing-fast speeds. As on-device AI grows, Qualcomm will play a central role. In a recent earnings call, they also mentioned:
They launched the Qualcomm AI Hub, a gateway for developers to enable at-scale commercialization of on-device AI applications. It features a library of approximately 100 pre-optimized AI models.
Next-generation Windows AI PCs powered by Snapdragon X Elite are optimally positioned to lead the transition to true AI PCs.
In XR, Ray-Ban Meta glasses powered by Qualcomm's Snapdragon AR1 Gen 1 platform continue gaining consumer traction.
Micron: Micron primarily manufactures memory and storage solutions, which are critical components for various computing systems, including those used in AI applications. The company shared in their recent earnings call that:
AI server demand is driving rapid growth in HBM (High Bandwidth Memory), DDR5, and data center SSDs, tightening leading-edge supply availability for DRAM and NAND.
Micron views itself as one of the biggest beneficiaries in the semiconductor industry of the multiyear growth opportunity driven by AI.
New AI systems, such as Nvidia's next-generation Blackwell GPU architecture, feature significant increases in HBM content, which is supported by Micron’s HBM3E product.
Micron is developing advanced server DRAM modules and high-capacity DRAM products to support AI workloads.