Web3-AI is one of crypto’s hottest spaces, combining plenty of hype with genuine potential. The number of Web3-AI projects that command multi-billion-dollar market caps yet have no practical applications is staggering, and most are propelled by narratives borrowed from the traditional AI industry rather than by real usage. Meanwhile, the gap between Web2’s and Web3’s AI capabilities continues to widen at an alarming rate. Web3-AI, however, is not just hype: recent developments in generative AI highlight a genuine value proposition for more decentralized approaches.

Put these factors together and we have a market that is overhyped, overfunded and disconnected from the current state of the generative AI sector, yet one that could unlock tremendous value for the next wave of generative AI. It is understandable to feel confused. But if we take a step back and look at the Web3-AI market through the lens of what AI needs today, it becomes clear where Web3 can deliver significant value. Getting there requires cutting through a dense reality distortion field.


Web3-AI Reality Distortion

As crypto natives, we tend to see decentralization as a positive in all things. AI, however, is becoming increasingly centralized in terms of computation and data, so the value proposition for decentralized AI must start by addressing that natural centralization.

There is a growing mismatch between the value we believe Web3 can create and what the AI market actually needs. It is a worrying reality that the gap between Web2 AI and Web3 AI continues to widen rather than narrow, primarily due to three factors:

Limited AI Research Talent
The number of AI researchers working in Web3 is vanishingly small, which is discouraging news for anyone who believes Web3 will be the future of AI.

Constrained Infrastructure
Web3 infrastructure imposes computational constraints that make it impractical to support the full lifecycle of generative AI solutions, from pretraining and fine-tuning to inference.

Limited Models, Data and Computational Resources
Models, data and compute are the three main pillars of generative AI. No large foundation model can run on today’s Web3 infrastructures, and there is no viable way to host or access the large datasets such models require.

The result is that Web3 is building an inferior version of AI in an attempt to match Web2’s capabilities. This reality contrasts starkly with the enormous value that decentralization could bring to several areas of AI.

Rather than keeping this analysis abstract, let’s look at specific decentralized AI trends and assess them against their potential in the AI market.

Read more: Jesus Rodriguez, Funding Open Source Generative AI with Crypto

Web3-AI’s reality distortion has caused the initial wave of projects to be dominated by value propositions that seem disconnected from what the AI market needs. At the same time, other areas of Web3-AI are gaining momentum on far more solid ground.


Some Overhyped Web3-AI Trends

Decentralized GPU Infrastructure for Training and Fine-Tuning
Over the past few years we have seen a boom in decentralized GPU infrastructures that promise to democratize the pretraining and fine-tuning of foundation models, offering an alternative to the GPU monopoly held by the established AI labs. In reality, pretraining and fine-tuning large foundation models requires massive GPU clusters connected by ultra-fast communication buses. Pretraining a 50B-100B-parameter foundation model on a decentralized AI network could take close to a year, assuming it succeeds at all.

ZK AI Frameworks
Combining AI with zero-knowledge (ZK) computation is an interesting way to build privacy mechanisms into foundation models, and the prominence of Web3’s ZK infrastructure has produced several frameworks that promise to embed ZK computing in foundation models. Despite being appealing in theory, ZK computation is prohibitively expensive when applied to large models. It also limits aspects such as interpretability, one of the most promising areas of generative AI research.

Proof-Of-Inference
Crypto is all about cryptographic proof, but proofs are sometimes attached to things that do not require them. In the Web3-AI space we see frameworks that issue cryptographic proofs attesting that a specific model produced a specific output. The challenges here are not technical but market-related: as a general rule, proof-of-inference is a solution looking for a problem, with no meaningful real-world applications today.
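
To make the concept concrete, here is a minimal sketch of what a proof-of-inference attestation might look like: the prover binds a model identifier, a prompt and an output into a single signed record that anyone holding the public key can check. The `attest_inference` function, the record format and the model hash are illustrative assumptions, not the API of any existing framework.

```python
# Minimal sketch of a proof-of-inference attestation (illustrative only).
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def attest_inference(signing_key: Ed25519PrivateKey,
                     model_weights_hash: str,
                     prompt: str,
                     output: str) -> dict:
    """Bind (model, prompt, output) into a single signed record."""
    record = {"model": model_weights_hash, "prompt": prompt, "output": output}
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).digest()
    return {"record": record, "signature": signing_key.sign(digest).hex()}


key = Ed25519PrivateKey.generate()
proof = attest_inference(key, "sha256:abc123...", "What is 2+2?", "4")

# Anyone with the public key can verify the attestation; tampering with the
# record makes verification fail. Note this only proves who signed the record,
# not that the model was actually executed honestly.
digest = hashlib.sha256(json.dumps(proof["record"], sort_keys=True).encode()).digest()
key.public_key().verify(bytes.fromhex(proof["signature"]), digest)
```

That last caveat is precisely the gap between the cryptographic guarantee and the market need described above.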


Web3-AI Trends With High Potential

Agents With Wallets
Agentic workflows represent one of the most exciting trends in generative AI, and they hold a lot of potential for crypto. Agents are AI programs that not only answer questions passively based on inputs but can also perform actions in a given environment. Most autonomous agents today are designed for isolated use cases, but multi-agent environments are rising fast.
Crypto can be a game changer in this area. Imagine a scenario in which one agent must hire another agent to complete a job, or stake some funds to guarantee the quality of its output. Providing agents with crypto rails unlocks many collaboration scenarios, as the sketch below illustrates.
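
Here is a minimal, hypothetical sketch of the idea, assuming nothing more than simple `Wallet` and `Agent` abstractions of our own invention: one agent hires another, the worker stakes collateral against the quality of its output, and payment settles when the job completes. No real crypto network or agent framework is referenced.

```python
# Hypothetical "agents with wallets": hiring, staking and payment between agents.
from dataclasses import dataclass, field


@dataclass
class Wallet:
    balance: float = 0.0

    def transfer(self, other: "Wallet", amount: float) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount


@dataclass
class Agent:
    name: str
    wallet: Wallet = field(default_factory=Wallet)

    def perform_task(self, task: str) -> str:
        # In a real system this would call a model or a tool; here it is stubbed.
        return f"{self.name} completed: {task}"


def hire(client: Agent, worker: Agent, task: str, fee: float, stake: float) -> str:
    """Client pays a fee; worker stakes collateral that is returned on success."""
    escrow = Wallet()
    worker.wallet.transfer(escrow, stake)       # worker backs the quality of its output
    client.wallet.transfer(worker.wallet, fee)  # client pays for the job
    result = worker.perform_task(task)
    escrow.transfer(worker.wallet, stake)       # stake returned once the job is accepted
    return result


planner = Agent("planner", Wallet(balance=10.0))
researcher = Agent("researcher", Wallet(balance=5.0))
print(hire(planner, researcher, "summarize today's funding news", fee=1.0, stake=2.0))
```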

AI Crypto Funding
One of the worst-kept secrets in generative AI is that the open-source AI community is experiencing a severe funding crunch. Most open-source AI labs can no longer afford to work with large models and have shifted to areas that do not require massive amounts of data and compute. Crypto, meanwhile, is a very efficient way to raise capital through mechanisms such as airdrops, incentives and even points. Building crypto funding rails for open-source generative AI is one of the most promising intersections of the two trends.

Small Foundation Models
Microsoft coined the term small language model (SLM) last year with the release of its Phi model, which, with fewer than 2B parameters, outperformed much larger LLMs on computer science and mathematics tasks. Small foundation models in the 1B-5B parameter range are essential for decentralized AI to be viable, and they also open promising scenarios in on-device AI. Decentralizing multi-hundred-billion-parameter models is nearly impossible today and will remain so for a while, but small foundation models can run on most of the Web3 infrastructures available today. Pushing the SLM agenda is crucial to building real value at the intersection of Web3 and AI.
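
As a rough illustration of why this matters, a model in that size range can be loaded and queried on a single commodity node with the standard Hugging Face transformers API. The sketch below assumes the publicly available microsoft/phi-1_5 checkpoint (roughly 1.3B parameters); any similarly sized open model would do.

```python
# Loading and querying a small foundation model on a single node (sketch).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"  # ~1.3B parameters, small enough for one machine
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Write a Python function that reverses a linked list."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```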

Synthetic Data Generation
Data scarcity is one of the greatest challenges facing the new generation of foundation models. In response, research is increasingly focused on using foundation models themselves to generate synthetic data that complements real-world datasets. Token incentives and crypto networks are well suited to coordinating many independent parties to produce and curate new synthetic datasets.
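
A minimal sketch of the generation side, again assuming the microsoft/phi-1_5 checkpoint via Hugging Face transformers: a small open model produces candidate training records, and each record is tagged with a contributor identifier so a crypto network could, in principle, reward the work. The contributor field and any reward logic are illustrative assumptions rather than part of an existing protocol.

```python
# Generating synthetic training records with a small open model (sketch).
import json

from transformers import pipeline

generator = pipeline("text-generation", model="microsoft/phi-1_5")

seed_topics = ["binary search", "hash tables", "dynamic programming"]
contributor = "0xYourAddressHere"  # hypothetical: who gets credited for the records

with open("synthetic_dataset.jsonl", "w") as f:
    for topic in seed_topics:
        prompt = f"Write a short coding exercise about {topic}, then solve it.\n"
        text = generator(prompt, max_new_tokens=200)[0]["generated_text"]
        record = {"topic": topic, "text": text, "contributor": contributor}
        f.write(json.dumps(record) + "\n")
```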


Web3-AI Trends Relevant to Other Sectors

Other Web3-AI trends also hold significant potential. The need for proof-of-humanity for content becomes more relevant as AI-generated material grows more sophisticated. Web3’s trust and transparency properties can be a real asset in model evaluation and benchmarking. Web3 networks could also coordinate human-in-the-loop processes such as reinforcement learning from human feedback (RLHF). As generative AI evolves and Web3-AI capabilities mature, other scenarios will likely emerge.

The need for more decentralized AI is very real. Web3 may not be able to compete with the mega-models of the large AI labs, but it can bring genuine value to the generative AI sector. The biggest obstacle to Web3-AI’s evolution may be its own reality distortion field. Web3-AI has real value to offer; the focus now should be on building real products.