This article was originally published on Synnada on April 5, 2023.
The AI/ML landscape is reminiscent of a gold rush, where everyone is vying for a stake in the game. Like any gold rush, it’s not just about discovering gold, but also about providing the shovels. In the next phase of AI/ML, as David Aronchick highlights, this involves supplying “specialized electricity” in the form of compute, storage and networking resources. Numerous startups are emerging in this space, each purporting to have the best shovel(s) on the market. To create scalable, enduring systems and avoid vendor lock-in, engineers must scrutinize the tools they employ and concentrate on discovering the most cost-effective, interoperable, and sustainable solutions.
Naturally, providers of specialized electricity are incentivized to promote resource consumption, as greater consumption translates to increased revenue for them. This unfortunate dynamic fuels innovations that amplify resource consumption while neglecting the utility derived from those resources. We often encounter solutions that promise utility but end up encouraging consumption, ultimately leading to a cycle of waste. Examples include:
- Data movement solutions like ELT/ETL that quickly turn into cost centers,
- Telemetry/monitoring tools that encourage data replication and ingestion at the expense of efficient resource utilization,
- Resource-intensive ML solutions like AutoML tools and hyperparameter optimizers that increase resource consumption without delivering commensurate performance gains,
- Tool-complexity solutions like xOps that layer further complexity onto already complex systems, aggravating resource consumption and vendor lock-in issues.
Numerous industries, both within and outside the tech sector, face analogous challenges and exhibit similar behavioral patterns. This resembles a pendulum swing: initially, participants gravitate towards eager adoption of innovations with little regard for cost. However, as the adverse effects of these trends become evident, participants start seeking optimization and sustainability, causing the pendulum to swing back towards utility. For the AI space, we predict the following pendulum swing trajectory:
- Abundance: We expect a surge in specialized electricity providers catering to the next phase of the AI/ML transformation, with headlines boasting rapid revenue growth for various companies. This explosion will trigger a wave of innovation and activity, attracting numerous newcomers to the market.
- Crunch: As participants recognize their inability to extract the desired marginal utility from these specialized electricity providers, they will begin scrutinizing their resource consumption and trying to reduce “waste”. Many providers will fail during this phase, with only those offering genuine utility surviving.
- Normalization: The market will stabilize and enter a gestation period in anticipation of the next innovation wave. During this phase, participants will emphasize sustainability, cost-effectiveness and efficiency, seeking tools and solutions that provide long-term value over short-term gains.
Given that the AI/ML-infra space is ultimately a subset of the B2B data-infra space, and each vertical’s pendulum is influenced by its parent vertical, the crunch phase might occur earlier in this domain than in others. The broader tech world is already moving towards a utility phase as companies cut costs and seek ways to optimize their infrastructure spending. Consequently, we are witnessing the emergence of utility-first players (e.g. Bluesky vs. Snowflake, Edgedelta vs. Splunk and Mezmo vs. Datadog) that are well-positioned to excel in a concentrated market with distinct winners. In this context, we predict that the AI/ML-infra space will transition into the crunch regime sooner than one might expect, as the market consolidates and prioritizes sustainable, cost-effective solutions very early on.