Embracing the Era of Decentralized AI

In late 2024, a roundtable discussion with five founders from CoinFund's portfolio highlighted the potential of a decentralized approach to AI's technology stack. This stack encompasses GPU compute aggregation, decentralized AI training, cost-effective inference, and data acquisition, each of which poses significant challenges. It became clear that these decentralized AI founders are not merely integrating AI into Web3; they are pushing the boundaries of AI itself.

It also dawned on me that for retail investors seeking financial exposure to cutting-edge AI, digital assets might offer a better opportunity than public stocks or private placements. That realization led to a prediction of an explosion of growth in the months ahead, dubbed "deAI Summer 2025."

Eight months later, that growth is a reality. Proof-of-concept models have been successfully pre-trained and post-trained on decentralized networks, and pre-training a 100-billion-parameter model on a decentralized network this year now looks likely. Some companies are even exploring frontier intelligence models developed on decentralized networks through post-training and reinforcement learning. Notably, Jack Clark, co-founder of Anthropic, has written about decentralized training on his personal blog.

The development of AI is at a critical juncture. As large corporations like OpenAI, Microsoft, and Google race to capture consumer markets, AI tools are spreading through personal and professional workflows, and with centralized control comes a growing risk of private data exposure. This is why decentralized AI (deAI) has emerged as a crucial category for Web3, with the potential to revolutionize how models are built and owned. According to data from the analytics platform Kaito, deAI has captured over 30% of mindshare in crypto over the past 12 months.
Companies developing distributed compute protocols, agent frameworks, and decentralized marketplaces dominate this space. Because few public equities offer direct exposure to frontier AI technology, and most retail investors lack access to private rounds at frontier labs, deAI crypto assets are becoming a highly sought-after way to gain investment exposure to AI.

Amid the hype surrounding deAI, CoinFund has focused on identifying teams solving the most impactful and challenging problems, starting with whether large, competitive models can be trained on decentralized networks at all. Historically, training such models has required high-end GPUs with fast communication bandwidth, leading many to believe that decentralized training is not viable: previous attempts to compress the communication were lossy, and models failed to converge. We disagreed with this assessment.

Our thesis at CoinFund is that Web3 networks will enable AI models to be crowd-sourced, trained, owned, and maintained as open, valuable public goods. These networks will aggregate significant compute power, compete at the frontier of AI, and drive massive innovation by solving problems such as bandwidth optimization, error detection, fault tolerance, and model sharding. They will also have real business models: users pay for inference on models, and contributors are incentivized and aligned by being compensated for funding and training models, giving them a stake in the network.

In our portfolio, we have backed companies like GenSyn, Prime Intellect, and Pluralis, which have made remarkable progress in decentralized pre-training and post-training. For instance, GenSyn has aggregated over 85,000 participants in its Swarm RL testnet, while Prime Intellect has trained a 32-billion-parameter model using distributed compute.
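To make the bandwidth problem concrete: workers in distributed training exchange gradient updates every step, and over commodity internet links the dense gradients of a large model are far too big to send. One generic compression approach is top-k sparsification, where each worker transmits only its largest-magnitude gradient entries. The sketch below is purely illustrative, assuming a NumPy array as the gradient; it is not the method of any company named here, and the function names are my own:

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of a gradient tensor.

    Returns (indices, values): the compact representation a worker
    would transmit instead of the full dense gradient.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def densify(indices, values, shape):
    """Rebuild a dense (mostly zero) gradient from the sparse form."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[indices] = values
    return flat.reshape(shape)

# A worker compresses its gradient before sending it over a slow link.
rng = np.random.default_rng(0)
grad = rng.normal(size=(1000,))
idx, vals = topk_sparsify(grad, k=10)      # 100x fewer floats on the wire
recovered = densify(idx, vals, grad.shape)  # what the receiver reconstructs
```

Naive top-k discards most of the gradient each step, which is exactly the kind of loss that historically broke convergence; practical schemes layer tricks like error feedback (accumulating the dropped residual locally) on top of it.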
Pluralis, for its part, has demonstrated that lossless compression can enable large-scale model pre-training on decentralized networks. We have also backed the data management platform Perle, which has published a paper on human annotation in machine learning, and the model fine-tuning platform Bagel, which has introduced a framework for zero-knowledge fine-tuning. Our portfolio company Gizmo has debuted ARMA, an AI agent gaining traction in DeFi by helping users optimize stablecoin yields across different protocols.

As the intersection of Web3 and AI continues to evolve, founders should focus on onboarding real customers and achieving product-market fit in the coming months. Companies focused on inference should sell it competitively into Web 2.0 and generate revenue, and early-mover AI training networks will aggregate more customers. By the end of 2025, we anticipate multi-hundred-billion-parameter models trained in a decentralized manner, a feat that would have been impossible just 18 months ago. If this trend continues, open-source AI companies may eventually surpass the current dominant players in machine learning, helping keep our digital experiences from being controlled by a few corporations.