DePINs: How blockchain technology can help AI developers access the hardware they need
2023 has been the year of the artificial intelligence craze. Many investors are hoping to get in early on the next Artificial Intelligence (“AI”) winner. Crypto and Web3, conversely, have been in somewhat of a bear market. However, there are two points investors might consider: 1) bear markets are often great times to look for the next winner, and 2) AI can actually be complemented and even enhanced by Web3 and blockchain technology.
This is the first of multiple reports where Ruceto examines some of the most interesting Web3 projects that can potentially contribute to the advancement of AI. In this report, we are specifically focusing on projects that use blockchain technology to help people and enterprises overcome the massive hardware expenses that are associated with AI. This is an exciting use case because it has the potential to greatly increase adoption of Web3. Additionally, some of the tokens for these projects may be interesting from an investment perspective given the massive potential in AI.
In this report, we will cover the following projects:
The Render Network – provides a marketplace for GPU-based rendering capabilities that can support VR, AR, gaming, media, and various industrial uses
exaBITS – is creating an ecosystem where users can be compensated for providing computing resources, training Machine Learning (“ML”) models, generating content, and sharing data
Akash – created a marketplace where general computing power can be rented, and is running a testnet so its marketplace can effectively support AI models
Flux – created a marketplace where general computing power can be rented, and provides access to multiple blockchains through parallel assets
We will release a part 2 to this report very soon that will cover a new set of projects. After releasing part 2, we will examine some other use cases where Web3 technology is complementary to the advancement of AI.
AI’s Hardware Problem
AI has become the hottest topic in technology since OpenAI released ChatGPT, a chatbot built on a Large Language Model (“LLM”), on November 30th, 2022. ChatGPT crossed 1 million users less than a week after its release, and according to the research firm Similarweb Ltd., ChatGPT had already hit 100 million users by January 2023.
Institutional Capital is Flowing into AI
Unsurprisingly, AI is attracting a large amount of institutional capital. According to PitchBook, VC investment in generative AI grew from $408 million in 2018 to $4.8 billion in 2021, before a slight pullback to $4.5 billion in 2022. VC investment in 2023 is showing signs of further acceleration. PitchBook said the following about VC investments in generative AI in 2023: “Roughly $1.7 billion was generated across 46 deals in Q1 2023, according to PitchBook data. An additional $10.68 billion worth of deals were announced in the quarter but have not completed.”
The current market sentiment indicates that AI has replaced crypto as the hot trend in technology. Per Crunchbase, funding to VC-backed Web3 projects dropped 82% year-over-year, falling from $9.1 billion in Q1’22 to only $1.7 billion in Q1’23.
AI Development is Expensive
The AI space has a problem that Web3 may be able to solve – extremely high hardware costs. The computational complexity of AI systems is doubling every three months, and existing solutions to provide compute supply cannot keep up. Cloud providers such as AWS, Microsoft, and Google are running into shortages of their own, and are making some customers wait months to rent hardware.
Massive Wealth Transfer to Nvidia
Most GPU chips in the AI industry are provided by Nvidia, whose primary data center workhorse chip costs $10,000. Their graphics processors power ChatGPT, and Wall Street clearly expects Nvidia to continue to power other transformative AI projects as evidenced by Nvidia’s stock rising nearly 180% year to date as of May 2023, and reaching a $1 trillion market cap. New Street Research estimated that the ChatGPT model inside Bing’s search function could require 8 GPUs to deliver a response to a question in less than one second. Per Antoine Chkaiban, a technology analyst at New Street Research, “If you’re from Microsoft, and you want to scale that, at the scale of Bing, that’s maybe $4 billion. If you want to scale at the scale of Google, which serves 8 or 9 billion queries every day, you actually need to spend $80 billion on DGXs.” It’s clear that a lot of capital will go to anybody who can supply compute power to the AI industry.
Big Innovation from Small Players
The high hardware costs associated with AI raise the threat that the AI industry will be dominated by Big Tech. To date, there has actually been a wave of innovation from smaller developers, because a substantial amount of what Big Tech has built in AI is open source. However, a leaked memo believed to have come from inside Google indicates that the tech giant is aware of the danger that open-source AI models pose to its market share. Due to obvious economic incentives, it may only be a matter of time before Big Tech becomes much less open with its work in AI.
Web3 Bringing Compute Power to the Masses
Web3 may be able to substantially lower the costs for an AI startup to develop and train a machine learning (“ML”) model through Decentralized Physical Infrastructure Networks (“DePINs”). DePINs utilize blockchain technology to build and operate physical infrastructure and hardware networks. This emerging Web3 technology enables supply-side providers to earn tokens in return for renting out idle hardware capacity (e.g., GPU compute power) to end users. Essentially, hardware from across the globe can power a cloud-like network.
A major reason why this market opportunity exists for Web3 is the abundance of idle hardware distributed across the globe. Matt Hawkins, the CEO & Founder of CUDOS, sold his U.K.-based aggregator of 55 data centers before founding CUDOS to address the inefficiencies he saw in the physical data center market. During a recent AMA, Matt said, “For example, if you look at AWS—which is one of the most efficient cloud platforms in the world—they only run at 65% efficiency. We were seeing up to 80, 90% spare capacity. And then, you end up with a lot of hardware because you only keep the last one or two generations of infrastructure running on your network. Everything else goes into what we used to call an eBay room. What we realized is we can build a global marketplace that connects suppliers and buyers of global infrastructure and overcome every single one of those problems. We can make it more ecological and sustainable. And, if we tokenize that infrastructure, we can enable the financing of that infrastructure so providers can scale. That’s essentially what we’ve done.”
The Render Network
Market Cap (USD mn): 752.46
Fully Diluted Market Cap (USD mn): 1,095.22
Source: Coinmarketcap, Data as of 6/25/2023
The Render Network provides a marketplace for decentralized GPU-based rendering capabilities on Ethereum (with a move to Solana planned after RNP 002 is implemented). Use cases that can be powered by this marketplace include VR, AR, gaming, media, various industrial uses, and more. Rendering, or image synthesis, refers to the process of using a computer program to generate a photorealistic or non-photorealistic image from a 2D or 3D model. An example of a render frame is below.
Source: The Render Network
The Render Network was founded by its CEO, Jules Urbach, who is also the CEO of OTOY Inc. OTOY provides GPU-based software solutions including OctaneRender, the only GPU render engine that The Render Network currently supports.
Users who need to perform render jobs (“Creators”) are connected to people who have idle GPUs capable of powering those render jobs (“Node Operators”). GPU owners connect their GPUs to the Render Network, complete jobs using OctaneRender, and receive RNDR as compensation, net of a small percentage paid to OTOY as a fee.
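The payment flow above can be sketched as a simple fee split. Note this is an illustration only: the fee rate below is a hypothetical placeholder, not the actual percentage OTOY charges, and the function name is invented for this sketch.

```python
# Hypothetical sketch of the Render Network payment flow: a Creator pays in
# RNDR, OTOY takes a small fee, and the Node Operator receives the remainder.
# The 5% fee rate is an assumption for illustration, not the real protocol fee.

OTOY_FEE_RATE = 0.05  # assumed for illustration only


def settle_render_job(payment_rndr: float, fee_rate: float = OTOY_FEE_RATE) -> dict:
    """Split a Creator's RNDR payment between OTOY and the Node Operator."""
    fee = payment_rndr * fee_rate
    return {
        "otoy_fee": fee,
        "node_operator_payout": payment_rndr - fee,
    }


result = settle_render_job(100.0)
print(result)  # {'otoy_fee': 5.0, 'node_operator_payout': 95.0}
```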
Source: The Render Network
The end result is that Creators can tap into affordable GPU power that was previously out of reach, and Node Operators can monetize idle GPU power. Pricing is based on a multitiered system that Creators can choose from based on their preferred rendering speed, security, and price. Tier 1 is an enterprise tier that handles high profile IPs requiring very high security. It is currently reserved for a closed group of customers and nodes. Existing Tier 1 GPU providers include Google and Microsoft.
Source: The Render Network
Creators can choose to pay for projects in either RNDR tokens (an ERC-20 utility token that can be purchased and sold on exchanges) or RNDR Credits (these are backed by RNDR and can be purchased via PayPal or Stripe on the Render Network Portal). In either case, Node Operators will receive RNDR tokens as payment.
Why The Render Network is Interesting
The company is working to bring Stable Diffusion to the Render Network, and a beta went live in the first half of 2023. After fully integrating Stable Diffusion onto decentralized nodes, the company will potentially consider more AI-related integrations, such as LLMs like ChatGPT, and enabling users to allow their work to be used to train AI models in return for royalty payments.
There is a possibility that The Render Network will partner with Apple after Octane X, a GPU rendering product by OTOY Inc. that is available on the App Store, was shown being used at an Apple event. Although no crypto functionality or direct partnership with Apple was discussed, this has fueled speculation that Apple may use the Render Network to power its Vision Pro headset, Apple’s AR and VR product.
exaBITS

exaBITS believes that existing internet architecture is inadequate for the compute needs of AI and other computationally intensive applications, so it is building a Computing-Oriented Network Architecture (“CONA”). CONA will enable the pooling of distributed computing resources, which the company claims will lower computing costs by over 80%. exaBITS’ blockchain-powered decentralized infrastructure will facilitate a marketplace where users can offer, and be compensated for, a variety of AI-related services including GPU services, data storage, and expertise.
While this project targets general “computationally intensive applications” as potential use cases, it is clearly being designed with the needs of AI in mind. exaBITS made this very clear in a February 2023 article about the high cost of training and deploying AI solutions, and where exaBITS fits in. The company said, “for those who need to train large models, e.g., GNNs on very large graphs, exaBITS provides a scalable solution that supports fast and accurate AI training on billion-parameter models.”
Why exaBITS is Interesting
This project is still in its blueprint stage, but it’s noteworthy because of the substantial amount of mainstream support and accolades it has already received from institutions such as Google and Harvard. exaBITS received support from the Google for Startups Program, which includes finance, expert guidance, Google Cloud services, technical training, and business expansion opportunities. Other Web3 projects that received similar support from Google include Polygon, Solana, Nansen, and Aptos. exaBITS was also selected for the Harvard Innovation Lab incubation program, something that no other Web3 project has ever been selected for. Finally, exaBITS won first place in the 2023 Web3 Global Startup Competition: Singapore HAPathon.
While there is no news on a token yet, this project is worth keeping an eye on because of the substantial institutional support it is receiving. As development advances, this project may hit the ground running with strong enterprise partnerships.
Flux

Market Cap (USD mn): 138.49
Fully Diluted Market Cap (USD mn): 204.29
Source: Coinmarketcap, Data as of 6/25/2023
Flux, a hybrid L1 and L2 blockchain using PoW consensus, offers a suite of decentralized services including FluxNodes. FluxNodes enable users with sufficient available hardware and FLUX collateral to earn yield in return for offering computing power to the decentralized Flux Computational Network. Multiple node tiers are available, and every time somebody sets up a node, the network grows. As of 6/28/23, there were 12,615 Flux nodes (a live dashboard that includes potential yields is here). There is also a Titan staking node that allows users to pool their resources to run enterprise-level hardware.
Nodes are ranked by their “enterprise score,” which takes into consideration how long a node has been active on Flux, node tier, the identity of the operator, and collateralization.
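The four factors listed above could be combined into a single score along the following lines. To be clear, Flux’s actual formula and weights are not disclosed in this report, so every weight, field name, and number below is an assumption made purely for illustration.

```python
# Illustrative-only weighted "enterprise score" combining the four factors the
# report lists (time active, node tier, operator identity, collateralization).
# All weights and caps are hypothetical; Flux's real formula is not public here.

from dataclasses import dataclass


@dataclass
class FluxNode:
    days_active: int          # how long the node has been on the network
    tier: int                 # assumed 1-3, higher tier = more resources
    operator_verified: bool   # whether the operator's identity is verified
    collateral_ratio: float   # collateral posted vs. required (1.0 = fully met)


def enterprise_score(node: FluxNode) -> float:
    """Hypothetical 0-100 score built from the four factors above."""
    uptime = min(node.days_active / 365, 1.0) * 40        # up to 40 points
    tier = node.tier * 10                                 # up to 30 points
    identity = 20 if node.operator_verified else 0        # 20 points
    collateral = min(node.collateral_ratio, 1.0) * 10     # up to 10 points
    return uptime + tier + identity + collateral


print(enterprise_score(FluxNode(365, 3, True, 1.0)))  # 100.0
```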
FLUX is the native utility token that powers FluxNodes. It can be obtained by GPU mining, as a reward for operating a node, or from trading on exchanges. While the team plans to accept major cryptocurrencies and fiat, using FLUX to purchase distributed computing power will enable access to substantial discounts. Flux runs on its own blockchain, and it provides access to other blockchains via parallel assets. Seven parallel assets have been created (KDA, ETH, BSC, SOL, TRX, AVAX, and ERG), and three more are coming (Algorand, Polygon, and TBD).
Why Flux is Interesting
Flux will launch the proof of concept for “Proof of Useful Work” (“PoUW”) in Q3 2023. PoUW is designed to optimize the energy-intensive Proof of Work (“PoW”) model, and it is being developed specifically so miners can make their rigs available for energy-intensive use cases such as ML training, rendering, and deepfake detection.
Flux is also working on AI-specific projects such as “Project Mayhem,” a proof of concept that will identify deepfakes or work generated by ChatGPT or a similar program, even if the verbiage was changed by a mixer. Another long-term goal of Flux is to create something like a decentralized ChatGPT, where the decentralized nature would prevent it from having the kinds of biases that a centralized LLM could have.
Akash

Market Cap (USD mn): 71.29
Fully Diluted Market Cap (USD mn): 243.26
Source: Coinmarketcap, Data as of 6/25/2023
Akash developed an L1 protocol using the Cosmos SDK and PoS consensus, a supercloud network, and a marketplace where users (“Tenants”) lease computing resources from cloud providers (“Providers”). The marketplace can utilize idle and underutilized resources in the estimated 8.4 million data centers globally. Through reverse auctions, Tenants submit their desired attributes and price, and Providers compete for the business.
Per the company, “The cost of hosting your application using Akash is about one-third the cost of Amazon AWS, Google Cloud Platform (GCP), and Microsoft Azure. You can check the prices live using the Cloudmos.io price comparison tool.”
The native utility token of the Akash network is AKT. AKT is used for governance, to incentivize Providers to lower prices, and to reward stakers. The company provides up to date token metrics and info here.
Akash is competing with Flux as another Web3 marketplace for general compute power where anyone can rent out compute power from their idle hardware. Look forward to upcoming project-specific deep-dive reports from Ruceto comparing the tokenomics and technology of these two projects.
Why Akash is Interesting
Akash is clearly aware of the opportunity that AI presents for its marketplace, and compared GPUs to oil in a recent blog post about the AI landscape.
Akash is running a testnet from 6/20/23 – 7/11/23 for “the first open-source cloud for open-source GPUs and AI.” This is specifically for high density GPUs so Akash can build an “AI Supercloud.” The testnet will include the deployment of various AI models onto each GPU type so that AI models can be matched with the appropriate GPU type.
Web3 builders have a massive opportunity to increase adoption of Web3 if they can successfully solve the hardware accessibility problem in the creation and training of ML models. Ruceto will continue to monitor developments in the ML training space, and we will also cover projects addressing other use cases that intersect with AI, such as data storage and deepfake detection. As a reminder, Ruceto is a community-driven research platform, so if you have any questions about this report, or if you would like to see one of the above tokens covered in a deep-dive report, please reach out to email@example.com.
Disclaimer: The content presented is for informational purposes only and does not constitute financial, investment, tax, legal, or professional advice. Nothing contained in this report is a direct or indirect recommendation or suggestion to buy, sell, make, or hold any investment, loan, commodity, security, or token, or to undertake any investment or trading strategy with respect to any investment, loan, commodity, security, token, or any issuer. Ruceto does not guarantee the accuracy, completeness, sequence, or timeliness of any of this content. Please see our Terms of Service for more information.
Artificial Intelligence (“AI”): Per MIT, “Artificial intelligence is the ability for computers to imitate cognitive human functions such as learning and problem-solving. Through AI, a computer system uses math and logic to simulate the reasoning that people use to learn from new information and make decisions.”
Machine Learning (“ML”): Per MIT, “Machine learning is when we teach computers to extract patterns from collected data and apply them to new tasks that they may not have completed before.” ML is a subset of AI, and it is used to improve AI.
Neural Networks: Per MIT, “A neural network is a type of machine-learning model that is loosely based on the human brain. Many layers of interconnected nodes, or neurons, process data. Researchers train a network to complete a task by showing it millions of examples from a dataset.”
Deep Learning: Per MIT, “Deep learning is machine learning on steroids: it uses a technique that gives machines an enhanced ability to find—and amplify—even the smallest patterns. This technique is called a deep neural network—deep because it has many, many layers of simple computational nodes that work together to munch through data and deliver a final result in the form of the prediction.”
Large Language Models (“LLMs”): Per MIT, “Large language models like OpenAI’s GPT-3 are massive neural networks that can generate human-like text, from poetry to programming code. Trained using troves of internet data, these machine-learning models take a small bit of input text and then predict the text that is likely to come next.”