The Index Podcast
March 22, 2024

Ritual: Web3's Next Generation of AI & Crypto Infrastructure with Co-founder Niraj Pant


This week on The Index, host Alex Kehaya welcomes Niraj Pant, Co-founder of Ritual, the network for open AI infrastructure. Join them as they take a deep dive into Ritual's groundbreaking architecture, built on a crowdsourced governance layer aimed at ensuring safety. Explore Ritual's mission to become the focal point of AI in the web3 space by evolving Infernet into a modular suite of execution layers that interoperate with other base layer infrastructure.

Host - Alex Kehaya

Producer - Shawn Nova

Chapters

00:06 - Crypto AI

13:29 - Decentralized AI Inference Network Overview

26:04 - Decentralized AI Node Expansion Strategy

36:19 - Rapid Technological Advancements in AI

Transcript
Alex Kehaya:

Welcome to the Index Podcast hosted by Alex Kehaya. Plug in as we explore new frontiers with entrepreneurs, builders and investors shaping the future of the Internet.


Alex Kehaya:

Hey everyone and welcome to the Index. I'm your host, Alex Kehaya, and today I'm excited to introduce Niraj Pant, co-founder of Ritual, an open AI infrastructure network that develops innovative architecture with crowdsourced governance for safety, funding, alignment and model evolution. Niraj, I'm really excited to have you on the show. People who've listened to my last couple of episodes know I'm becoming obsessed with AI, so I'm really excited to learn a little more about Ritual. But before we get into that, maybe just for people who haven't heard of you before or don't know your background, give us a brief introduction to how you got to building Ritual.


Niraj Pant:

Sure, thanks for having me. I'm originally a cryptographer by background and discovered the crypto space fairly early through mining. That was my first introduction to the space. I was a researcher when I was in school, working on privacy for payments and smart contracts. Then, in 2017, I joined Polychain as one of the early investors on the team, and I was there for six years. I worked across a number of different investment categories, leading deals in areas like privacy, computational integrity, bridging and smart contract systems. Anything that you can name, I had probably invested in that category.


Niraj Pant:

Around the launch of ChatGPT a few years ago and some of the later iterations of the GPT models, I started spending a lot of my own time in AI and saw that there were a lot of really interesting use cases and really cool research to be done. AI, I think, brings about this future in which computers can very effectively and efficiently talk back to us in human language, which is at least one of the many factors that make it great. This led me to start thinking, naturally, about the intersections between AI and crypto. I saw that AI had a lot of the same interesting problems that we have seen with previous tech platforms, namely centralization and a lack of native incentives. Ritual really is a response to where AI is today, which is a super exciting new technology category with some issues, and also opportunities that arise from them, that we think crypto AI can address. That's where we're at today.


Alex Kehaya:

What does crypto AI mean?


Niraj Pant:

Crypto AI has been a term thrown around for a long time. Maybe I should say why crypto AI hasn't worked in the past, and why I think now is the time that it can work. Crypto AI at a high level makes sense in that crypto is a technology category that brings sovereignty back to the users. The original design of the internet was a very decentralized future in which you would keep your own data. You had a relatively good understanding of where your data was being used, and ostensibly, at some point, you might even have gotten compensated for it. But, as we've seen, given that much of the major internet real estate is owned by for-profit corporations based mostly in the United States, we've tended towards increased centralization. Now the platforms largely exploit you rather than work for you. That's a big challenge that we've seen. With AI, we're starting to see a very similar centralization: it's a couple of model providers, a couple of cloud providers, a very small group of researchers, mostly in San Francisco and a couple of other big US cities. What does crypto provide AI? It can provide new guarantees for AI. Because AI tends to move towards centralization, crypto provides you self-sovereignty guarantees that you typically don't have. Examples of this could be privacy, computational integrity, censorship resistance and many other things, including being able to attach direct incentives on top of the utilization and creation of intelligence.


Niraj Pant:

To me, the AI crypto intersection is one in which each needs the other in many ways. AI needs crypto; AI needs the guarantees you get with crypto. Then the other piece is that crypto needs AI. Some of the largest products in this category, for example ChatGPT, got hundreds of millions of users in a matter of months. A hundred million users is more than all the on-chain users in crypto across every blockchain. I immediately saw that and found an opportunity: we can have something similar happen in crypto and onboard a ton of users through this new paradigm of AI being able to do these new generative tasks. That, to me, is the opportunity set here: they can intersect and interact with each other in many, many different ways.


Alex Kehaya:

You really bring up a great point. I had not thought about just trying to copy the onboarding event that happened with OpenAI and leveraging that to drive a mass adoption opportunity for crypto. I've also been exploring the problem spaces and opportunities in AI. I don't claim to have any level of expertise in it, but it has been a passion project of mine for the last couple of months, mainly because I've been looking into the cloud computing market and bare metal hardware a lot in the past couple of years. I've realized that that market is really a race to the bottom. Unless you're a hyperscaler like AWS or GCP, you're really going to have a tough time competing. I started looking at what kinds of things you can bundle together to create value and compete on value instead of price, and I landed on AI. AI needs tons of hardware, needs a lot of compute, and it's a super valuable service. But it has a really big problem for enterprises, and I think for normal people too, they're just not quite as aware of it: when you use ChatGPT or OpenAI, you're just giving them your data and it's training their AI. You talk about sovereignty and crypto. Why should an enterprise give up their most valuable asset, which is their data? They should hold on to that data, they should own it, and it should have value. There are some very interesting technologies coming that are going to enable that. It's closer than I thought, though I don't think it's quite here yet.


Alex Kehaya:

You understand cryptography at a much deeper level than I do, but there are things like fully homomorphic encryption and zero-knowledge proofs. I just came across Nillion Network. I don't know if you've heard of that company, but I read their white paper and thought, if this stuff checks out, this is part of the solution I've been looking for. Maybe you can tell me, if you've read it, whether it actually works the way they say it will, because I'm not educated enough or expert enough to judge that. There's a chart in that white paper that compares zero-knowledge proofs, fully homomorphic encryption and MPC and talks about the differences, which was super helpful for my understanding. With zero-knowledge proof technology, I guess the data still has to be decrypted at some point to be used.


Alex Kehaya:

It's not fully encrypted the entire time. But just to bring it back to earth here for people who don't know about encryption: in order for consumers and enterprises to maintain sovereignty over their data, you need privacy, and in order to have privacy, you need cryptography. I find that very exciting, and I think if you combine cryptography with decentralized compute, all of a sudden you unlock these crazy use cases that people probably don't imagine are possible. And I also think there's another thing: most companies think that GPUs are hard to come by.


Alex Kehaya:

Most people who are building AI say, man, GPUs like H100s and A100s are very hard to come by. Actually, there are a lot of them; you just have to know how to get to them. A lot of people are accessing them through these open source, permissionless marketplaces like Akash or Render Network. I even saw a company that's not a blockchain-based marketplace but more of a traditional Web 2 one, called Vast.ai, that a couple of guys I used to work with at Orchid spun out.


Alex Kehaya:

There's this interesting category that people now call DePIN, decentralized physical infrastructure. These are essentially DePIN projects that are bringing access to GPUs at scale, and I'm so excited about the potential. How does what you're building fit into this picture? I really feel like we need to draw an actual diagram of the different parts that make all this work so people can visualize it. I'm kind of a visual learner. How does what you guys are doing fit into that landscape?


Niraj Pant:

The big opportunity set that we see is this Web3 AI market. Today it's not massive, it's small, but it's growing really, really fast, and we're building for that market. There's certainly a very big Web 2 market that we'll potentially enter at some point, but there are a lot of people doing that, and we're really interested in the crypto-native use cases, because I think those will bring a lot of usage to blockchains. It's something that I think we can uniquely onboard. So where do we fit in? What Ritual does is act as a conduit between different blockchains and AI models. dApp developers are able, in a few lines of code, to take a model that they may want to use in their dApp, let's say Llama 2. They can say, I want Llama 2, these are the features I want. Perhaps they do some fine-tuning on it for some special results. Then they take the user input and feed that off to our network. Infernet, like "internet" with an F, is the first version of what we've built. Infernet is an off-chain set of compute nodes that allows you to dispatch these requests onto this off-chain network. The off-chain network does the work, optionally provides a proof of varying types depending on the use case, and then returns the result back to the dApp. So that's the workflow we're focused on. On the GPU side of things, we have our own that'll be native to the network, and we also partner with other GPU services that are bringing tons of GPUs from different countries to come in and fulfill these requests.
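The flow Niraj describes, a dApp dispatching a request to an off-chain node that runs the model, optionally attaches a proof, and returns the result, can be sketched roughly in Python. This is an illustrative toy, not Ritual's actual SDK; the names `InferenceRequest`, `run_node`, and the proof string are all made up for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class InferenceRequest:
    model: str           # e.g. a fine-tuned Llama 2 variant the dApp chose
    prompt: str          # the user input the dApp forwards
    require_proof: bool  # whether the node should attach an integrity proof

@dataclass
class InferenceResult:
    output: str
    proof: Optional[str]  # proof type varies by use case; None when not requested

def run_node(request: InferenceRequest,
             model_fn: Callable[[str], str]) -> InferenceResult:
    """An off-chain node runs the model and optionally attaches a proof."""
    output = model_fn(request.prompt)
    proof = f"proof({request.model})" if request.require_proof else None
    return InferenceResult(output, proof)

# dApp side: dispatch the request, get the result (and proof) back.
echo_model = lambda prompt: prompt.upper()  # stand-in for a real model
result = run_node(InferenceRequest("llama-2-7b", "hello", require_proof=True),
                  echo_model)
```

The point of the sketch is the shape of the round trip: the dApp never runs the model itself, and the proof is optional per use case.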


Niraj Pant:

On the privacy side, privacy is a pretty important thing to us.


Niraj Pant:

As you mentioned, a corporation can say that they're keeping the data private, but you never know; there are always tampering issues, or, from the perspective of security, an external attacker can come in and find the data. So in some ways it's good if the data isn't even there, or it's just held with the user, which I think is a little bit of a better security model. So we're developing a series of new privacy techniques, in some of the classes that you mentioned, depending on the type of model that we service, and it's a long process. Some of these things, like FHE, are becoming more efficient, but they're still quite early, and so doing private inference on, say, a GPT-3.5-style model is still relatively slow. But I think we'll get to the point where that's efficient to do. All of these pieces are within the realm of possibility, and we're building them with the application, the user, and ultimately the developer that's creating that use case in mind.


Alex Kehaya:

You guys are really focused more on the inference side of things, where an app developer wants to leverage something like Llama 2 to enable AI in their apps, so the data gets sent to the network, the network runs the inference on it, and a result comes back. It's like asking ChatGPT a question and getting the answer back. Again, for people listening, because I'm still learning here, and maybe you are too: you have to train a model, and it learns how to do things, and that's a separate process. And then there's inference, which is actually using the model. It took me forever to figure that out. That sounds stupid to say, but it really did; it was just jargon getting thrown around for me for a long time. Am I correct in saying that's what Ritual does, at least in that first step that you talked about with the app integrations?


Niraj Pant:

That's right. I mean, to put some more color to it, we treat models as first-class citizens in our network. So really a lot of what we've built is around servicing AI models, and we support a variety of different tasks on those models, the most important one in our case being inference. But we also support fine-tuning, which you can think of as a very specialized, narrow form of training.


Niraj Pant:

So just taking some new examples and updating the model for a certain specific use case. Model quantization is another use case. We don't focus on training today; there are other teams in the space that are really interested in that. Not to go too much on a tangent, but there's also understanding the provenance of the data that goes into the model, because we don't really have a lot of transparency into that, even in the centralized open source models today. I think a great future is one in which we have a full understanding of what data is going into the model, which we can then track all the way through the training process and to the end inference process. I think that's important for provenance of data and, ultimately, for having some sort of understanding of how the results are actually being generated.


Alex Kehaya:

One thing I really like about what you're doing, and I noticed it's similar to Nillion Network, because again I just read their white paper, is that it's not a blockchain. It's a separate overlay network, right? You're not building your own blockchain; you just have a network of compute that enables the inference to happen, so it can work with anything, any chain, and any dApp developer could use it. Who runs the nodes? Who's going to run this network? Is it a bunch of individuals? What's the ideal there? What kind of hardware do they need? Can you tell me more about how that works?


Niraj Pant:

It's largely us and other kind of large groups that have access to a lot of GPUs, but there are also individuals that are running nodes on this network.


Niraj Pant:

It's a fairly simple and painless process to set up; at least we hope it should take 15 minutes. Ultimately, this is a network that is getting a heterogeneous set of compute tasks.


Niraj Pant:

Maybe it's inference on a really tiny model, maybe it's inference on five different models and they're being composed together, and so what we are set up to do from a network standpoint is we do want a lot of people to be able to participate in the network.


Niraj Pant:

So, for, say, inference on Llama 2 7B, you don't really need an H100. You could do it with a less capable GPU, or you could use a cluster of a few GPUs to do the inference. And so we want, at a point, to make this network as open as possible, while also maintaining the flexibility of the SLAs that you may want in your applications. So if you care a lot about performance and privacy and all this stuff, you'll probably need the beefiest hardware on the network, and we have the capability to route you into that part of the network, whereas some people are fine with some performance degradation and can go to the weaker hardware, which would be a more open group of people.
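The routing idea Niraj outlines, strict SLAs go to datacenter-class hardware while relaxed requests can be served by cheaper GPUs, can be sketched as a few lines of Python. This is a hypothetical illustration of the concept, not Ritual's actual scheduler; the node list, tiers, and selection rules are invented for the example.

```python
# Toy node registry: one consumer rig, one datacenter-class cluster.
NODES = [
    {"id": "home-rig", "tier": "consumer", "vram_gb": 24},      # e.g. a single consumer GPU
    {"id": "dc-cluster", "tier": "datacenter", "vram_gb": 80},  # e.g. H100-class hardware
]

def route(request_sla: str) -> str:
    """Return the id of the node chosen for a request with the given SLA."""
    if request_sla == "strict":
        # performance/privacy-sensitive work: only datacenter hardware qualifies;
        # pick the largest GPU among the eligible nodes
        eligible = [n for n in NODES if n["tier"] == "datacenter"]
        return max(eligible, key=lambda n: n["vram_gb"])["id"]
    # relaxed SLA: any node can serve; prefer the smallest (cheapest) adequate one,
    # which is what opens the network up to a broader group of operators
    return min(NODES, key=lambda n: n["vram_gb"])["id"]
```

The design point is that the SLA, not the model alone, decides which slice of the network a request lands on.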


Alex Kehaya:

What does that look like in practice? Can I go to your site and see the various nodes that are available for me to do my inference on, with some details there that help me assess them, like hardware specs or the provider or something like that? How does that work?


Niraj Pant:

Yeah, exactly. So what would happen is a request would be dispatched from a network. Let's say it comes from Base: an application on Base would say, hey, I have this request, and sends it to the network. The network goes and pairs you with the appropriate node, which will then get the result and spit it back out to the end app. So it's really a chain-driven process: the request goes from the chain to the network, the network does the work, and then it spits the result back out. It's meant to be something that plugs into many different places, and GPU providers on our network are able to pursue opportunities across many different networks.


Alex Kehaya:

How does a GPU, like the server, communicate to the app that it's got the right chops to do the job, for lack of a better word, the right resume? You can think of it as a resume: I've performed really well over the last six months, I've got great hardware available, those kinds of things. Does it work like that? How do you know you're going to get quality service? Because you mentioned that's a concern, right? So how does that actually work in the process of Base connecting to the network?


Niraj Pant:

This is something we'll be releasing a blog post on, hopefully in the next couple of weeks. We can borrow a lot of things from the proof-of-stake world and the proof-of-work world. The proof-of-stake world is good in the sense that, for example, Cosmos has historically had uptime slashing. Networks have different things that they score their validators on; sometimes it's uptime, sometimes it's reliability. And then there are also the social elements of it as well. Some validators have a brand that they use consistently across different networks, and over time they develop this weak reputation that people start to associate with them. Those are the more cryptoeconomic, softer forms of guarantees that we can definitely leverage.


Niraj Pant:

In something like this, we can also do things like in networks such as Filecoin or Aleo, which are able to issue challenges to network providers to, in some ways, test their capacity. Filecoin has a process where they're constantly testing whether you're storing all the pieces of a file; there was a concern for a long time: what if you're only storing one eighth of the file and not the entire thing? So it's early for us, given there's a lot more to build, but this is a flavor of the types of things that we're exploring for reputation on the network.
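The two ingredients Niraj mentions, uptime-style scoring from the proof-of-stake world and Filecoin-style spot-check challenges, can be combined into a toy reputation score. The function and its 50/50 weighting are invented for illustration; they are not a scheme Ritual has published.

```python
# Toy reputation score: blend a node's uptime with its pass rate on random
# spot-check challenges (in the Filecoin spirit, where challenges test that a
# provider actually holds the data it claims to store).
def reputation(uptime: float, passed: int, issued: int) -> float:
    """uptime is in [0, 1]; passed/issued count spot-check challenges."""
    if issued == 0:
        return 0.5 * uptime  # an unchallenged node only earns the uptime half
    return 0.5 * uptime + 0.5 * (passed / issued)

# A provider storing only one eighth of a file fails most random challenges,
# so its score collapses even with perfect uptime:
honest = reputation(uptime=0.99, passed=98, issued=100)
cheater = reputation(uptime=0.99, passed=12, issued=100)
```

Random challenges matter because uptime alone can't distinguish an honest node from one doing a fraction of the promised work.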


Alex Kehaya:

Your explanation really highlights how complex these solutions are. The interface should be pretty simple, right? I'm an app developer, I send a request and I get a response back. It sounds pretty simple, but the actual execution of that, providing quality service in a decentralized fashion, is very hard. It sounds like right now it's you plus a couple of other groups providing GPUs to the network, and you're in this beginning stage, maybe a beta or an alpha of the network. But what's your end state? What does censorship resistance look like? What does decentralization mean for you guys? How are you defining success? A year, 12 months, 18 months from now, whatever it is, you're talking to your team and you have this victory speech: we are the most censorship-resistant AI network. What's in the speech?


Niraj Pant:

There are a couple of different cuts on how you can think about censorship resistance. One is, when we first started the company, ChatGPT was not available in, I think, 50 or 60 countries or something like that, and not all of them were sanctioned countries either; some just didn't have support. So our view is building an open, common layer on a blockchain and then being able to remove ourselves with the network still able to operate. The network needs to be set up in a way that all of the actors are able to continue performing their functions, and are well set up to do that, such that if I were to go away for five years, I could come back and reliably say that the network is still running. So that's one of the criteria that we care about a lot. Another one is broadly building up a big ecosystem of dApps and model creators that are using this thing. There's a timeline of decentralization that we need to hit. You've seen this with the layer twos: a lot of them start fairly centralized, get a bunch of usage, and then slowly decentralize over time. That's the tack that we're taking, and it's basically the same framework: where are the pieces that we can start picking off and fitting in decentralized solutions?


Niraj Pant:

So there's a piece of this now which is, you know, building in model storage capabilities into all of the nodes or being able to offer different types of integrity proofs.


Niraj Pant:

So, depending again on the type of guarantees you may need for your dApp: if you're a DeFi app and you're using an ML controller to adjust the parameters of the network or to generate yield or something, you're probably going to want pretty high levels of integrity, because it's touching a real system and it has money in it. Whereas if you're building something like Instagram on chain, you may not need the strongest proof in the world to say, hey, this recommendation system output is the thing that I wanted.


Niraj Pant:

So those are, I think, the metrics of what we're looking for. Ultimately, I think you want to follow the model of groups that have decentralized very effectively. I think Ethereum has done a great job: a broad set of node decentralization, and also mind-share decentralization, which I think is important too. New groups bringing new ideas and updating the code and things of that nature are very important. So it's an active topic of discussion for us internally, on how best and how effectively we can decentralize what we're building.


Alex Kehaya:

The way I think about it for Solana, for example, is: how many nodes can we get, in as many individual data center locations and ASNs on the planet as possible? That's the physical hardware side of decentralization and censorship resistance. And then there's stake distribution. Those are the two things that I think weight decentralization. Is there a number of nodes that you want to hit, with individual operators running those nodes, running the network, providing services? That's question number one.


Alex Kehaya:

My next question has more to do with accessing the network. I like Akash, because Greg (you probably know Greg from Akash) is a super hardcore open source guy, right? They have really leaned into that, and everything they build, all the way to the front end that is like a SaaS product for accessing the network, is open source. Why is that awesome? Well, if they went away for five years, or their company blew up or something like that, all the contracts are on chain and that interface is open source. So it just takes one person to figure out how to spin it up and, boom, you can access the marketplace. And I bet you can access it via command line. So I guess I'm wondering: how many nodes, and does decentralization and physical location matter for you? And then access to those nodes. And then my last question is the token, but we can get to that in a second; I'm assuming there's some kind of token involved here. But let's get to the first two first.


Niraj Pant:

The number of nodes is a really good question, because there's also a piece of it where some of the nodes are going to need to be beefier than others. Those tend towards bigger entities running them. I really like what geohot is doing with his company. What's geohot's company called? I think it's called the tiny corp.


Niraj Pant:

What they're building in the AI world: the way that people in the AI world think about decentralization is they think about a lot of these problems too, but their approach to it is, I'd say, less cryptography focused and more based on being able to do AI things locally. So you can almost think of it like edge computing. The dream of every AI decentralization person that maybe doesn't care about crypto is that everyone has a home box that has GPUs, and they can run anything they want. I think that's a really cool mission. It's going to be tricky for a lot of people, but I think we should go towards that, and so we would really love a future in which that happens, with those people also contributing capacity in their off cycles to our network. So, number of nodes: it's hard to put a number on it, but certainly thousands, maybe even tens of thousands at least, and obviously scaling up depending on how much traction this space gets over time.


Alex Kehaya:

I find this conversation so interesting, because I'm starting to see there are layers here for the kinds of services you can provide, layers for how you can get nodes on your network, and composability across multiple other kinds of networks that are providing different value, so that Ritual users who want to run a node can contribute via some of these other networks. I'm thinking about Koii Network, which is one that's coming out. I don't know if you've seen those guys; I can send you information on them, but they're essentially a compute marketplace, sort of. It's an SVM fork, and you can basically list tasks. You download their app on your computer, and if you've got GPUs there and the task requires GPUs, it will just run the task. And then there are Render and Akash, right? They have a ton of GPUs in those networks that obviously would work. They might not be beefy ones; the Akash ones could be pretty beefy, because they have tier one data centers available as providers through their interface. I find the composability between these networks to be very fascinating. I'm just listening to you talk, and my brain is going off on these different protocols that I've been getting to know over the last several years that fit in, and you're not even competitive; none of this is competing against the other thing.


Alex Kehaya:

Two days ago I talked to the founder of Bagel, whom I met through CoinFund, because they were the lead investor in my last company. Have you heard of Bagel? Have you heard of these guys? You have, yeah. So Bidhan, the founder, and I got on a phone call, and I've been looking for this: he's possibly building one of the solutions to the privacy angle for both the LLM and the data, the kind of thing that actually would enable the use case you were talking about, where you could run on a local machine at your house and use personal data, or at an enterprise, on-prem, use sensitive business data on an LLM that's totally proprietary. It doesn't even have to be an open source LLM. All of a sudden that unlocks all these use cases that you can't do right now, because we don't have the privacy guarantees.


Alex Kehaya:

I haven't seen their white paper. Again, I'm not technical enough; you would be able to look at their tech and probably say, yeah, this makes sense or not. Please tell me later if it doesn't, if you've read the white paper. But I want to let you answer the second part of the question about access. How do you deal with access? Because, again, if Ritual goes away, if your company evaporates, you want the network to still run without you, right? So how do you deal with accessing the network in a permissionless, censorship-resistant way? What ideas do you guys have for that?


Niraj Pant:

The thing that we're working on today is, broadly, how do we increase the number of people running nodes and make this a really good opportunity for them? In some ways that comes downstream from distribution and usage. We just want a ton of dApps constantly doing inference requests and fine-tuning, such that there's a very active network of tasks that the GPU networks or the individual GPUs can go and pull down. Like you said, there's a very big range of the types of people that'll be running nodes. It could be myself with my home rig, plugging into this network whenever I'm not using it. It could be a Bitcoin miner, well, not Bitcoin miners themselves, but someone with access to GPUs that use the same power setup that they have.


Niraj Pant:

The first piece is increasing the number of nodes and, broadly, distribution to those things, and then it's starting to bring in other people from the community to start building new solutions. So maybe in some of these networks, they run the Ritual node client alongside their own node implementation, and you're able to run these two things together. Now you're building a system where it's not just reliant on Ritual pushing out a message and getting people to do things; demand comes from other places, supply comes from other places, other people are contributing to the network. There are a lot of pieces that we really want to have in place and make sure are ready over the next couple of years to solve a lot of these issues.


Alex Kehaya:

I find this so fascinating. I thought about something similar for Helium, for example. Helium is essentially a mobile virtual network operator, right? That's what the company is operating, and the network itself is all the people running the towers. But because it's all open-source software, anybody could launch their own virtual network operator that just uses Helium as the backbone. Right now, Helium has this partnership with T-Mobile where, if you're not covered by a Helium node in your area, if there's not enough bandwidth around, you're just on T-Mobile. That's how these mobile virtual network operators work.


Alex Kehaya:

To get a subscription, you go to a website and you buy a mobile phone subscription.


Alex Kehaya:

That's just an interface. It's just an interface that you pay for, and then it allows your phone to connect to this network. I think what you're talking about is kind of similar. You could just have an open-source template that people can host to provide services, to create demand for the GPUs. So you're saying node operators can also be the vendors of the value-added services on top of whatever the node is providing. I could run a couple of different models in my corner of the network and use my interface to drive people to the services that I'm providing along with other people in the network. If I understood you correctly, node operators will eventually be, in your opinion, full-stack businesses offering compute with services. Those could be SaaS services, or they could be enterprise-level offerings, like helping people do it for them, something like that.


Niraj Pant:

Yeah, exactly. It would be an amazing future if people are able to operate their own businesses running nodes on this network, and I think Helium is a really apt analogy. Helium's mobile network was not possible 10 years ago. It took a series of innovations to allow that to happen, one being the creation of what's called lightly licensed spectrum, which is spectrum the US government has opened up for people to create their own cell services. Beyond Helium, there's a company called Really that's doing this in Austin.


Niraj Pant:

That opening up is what allowed all these new businesses to flourish and create this kind of decentralized future, where you can run a cell tower, or you can even be a user and access this network at a way cheaper rate, because the utilities today take a very fat rake for something that is, in many ways, kind of broken. And the opportunity with AI and crypto is that today, you as an individual cannot really participate in the growth of this industry. Some of the groups you mentioned that are doing GPU as a service, bare metal as a service, they're just running all the GPUs themselves or sourcing the GPUs themselves. Whereas with a network like Ritual, and many of these other GPU-as-a-service networks, you can plug your GPU in and do the work yourself, kind of like what you can do with the Bitcoin network or the Filecoin network. So you're exactly right. I think it's a really exciting new opportunity that's been opened up, largely by the timing of what's going on.


Alex Kehaya:

How does the token work? Is there a token?


Niraj Pant:

We're not sharing information on anything there yet. I think what we're really focused on is getting distribution and broadly growing the pie of crypto-AI applications. We really want to see AI be the biggest driver of transaction volume in the crypto space. We talked earlier on the podcast about ChatGPT and how it got so many users so fast, and that's inspired me and the team to ask: what are the ways we can bring that to crypto? How do we expand the TAM and bring people from outside of crypto into this space? AI could be that thing. OpenAI just released something really interesting today.


Alex Kehaya:

I know.


Niraj Pant:

Text-to-video model. It's crazy right.


Alex Kehaya:

I was going to bring it up with you because I am, honestly, terrified and excited at the same time. I've actually never had this feeling looking at a technology or a demo of a technology, but seeing Sora today, I'm a little bit afraid. I'm actually afraid because it's so good. You're watching Sam tweet all the examples, because people are giving him prompts in tweets and he's just banging out examples, and you're like, wow, it must be pretty responsive to these prompts if he's able to whip them out and share them in minutes. How far away are we from the Matrix, an actual AI simulation that just constructs the world around you? It's crazy. It's unbelievable to see. Maybe I'm overreacting, but it feels that way to me.


Niraj Pant:

No, I think that's right. The things that he generated were not normal queries. It wasn't a guy drinking a bottle of water, it was a turtle in a bling zoo. It's just really weird, interesting stuff.


Alex Kehaya:

In high fidelity, though. There was one on their website of a California town during the gold rush, and it was so detailed. It takes the whole corpus of data of the entire internet and constructs a movie-quality scene. I'm not that detail-oriented a person, so I'm sure someone with a keener eye than me would notice weird things in the generated footage, but it looked pretty flawless to me. Looking at it, it just looks like a cinematic-quality trailer for a gold rush movie or something.


Niraj Pant:

Imagine once we start watching full-length movies that are generated with AI. The Brad Pitt of 2034 might not even be a human; it might just be an LLM character that we really like and start using often. Maybe many movies become interactive storybooks, like we have in games: oh, I really don't like that LLM actress, I'm going to replace her with this one and make the story a little different. It's going to be insane.


Alex Kehaya:

2034? I don't know, man. I think it's like three months from now. I think it's like 2024.5. It just seems to be accelerating. I know there's the crypto Twitter meme of accelerating versus decelerating; that one also took me forever to figure out what people were talking about, but I think I get it now. In my whole life I don't think I've seen technology move this fast. I've been in crypto since 2016, but I've also started to view the future differently. My show is about the future of the internet, and web3, to me, has become really about the future of the internet, which for me is AI, blockchain technology, AR and VR. The new internet is these things. I never imagined it would move this fast. Sooner than we think, we'll have those automated, personalized, Grammy-winning movies. We're coming up on the top of the hour, and you probably have a hard stop, but what have I not asked you that I should have? I ask every guest this at the end of the show.


Niraj Pant:

The thing I would implore people who are interested in this space to do is go check out the stuff and start playing with it. Our Ritual Infernet system is fully open source; you can go and start building applications. We released a demo of an application called FrenRug, spelled F-R-E-N-R-U-G. The basic idea of FrenRug was to show off the Ritual infrastructure and to show how you can build an AI application in crypto. And I should say it's actually live. You can go play with it.


Niraj Pant:

We trained up a popular open-source LLM on a bunch of examples, took that bot and stuck it behind a friend.tech room. You enter the room, and what you try to do is convince the bot to buy your keys. I've seen the weirdest types of responses, everything from "points gun at bot, buy my keys or else" to really long, drawn-out, intimate plays on how to get the keys bought. What this really showed on the back end was: we used a couple of different LLM instances to show the stochasticity of responses, those are fed into a classifier, and, using ZKML, we're proving that result on chain. Then you're getting the result. For the most part, especially for a couple-of-weeks hack, it's fairly responsive and easy to use.
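The pipeline Niraj describes, several stochastic LLM responses fed into a classifier whose verdict is then proven on chain with ZKML, can be sketched in miniature. This is a toy mock under assumed names (`query_llm_instances`, `classify`, and `decide` are all hypothetical, not Ritual's actual API), with the ZKML proof step left as a comment:

```python
import hashlib
import random

# Hypothetical sketch of the FrenRug flow described above: several LLM
# instances score a user's pitch, a classifier aggregates the scores,
# and in the real system a ZKML proof of the classification would be
# verified on chain. All names and signatures here are illustrative.

def query_llm_instances(pitch: str, n_instances: int = 3) -> list[float]:
    """Stand-in for n independent LLM calls. Each returns a 'convinced'
    score in [0, 1]; real instances would differ stochastically."""
    # Seed from a stable hash of the pitch so this demo is repeatable.
    seed = int.from_bytes(hashlib.sha256(pitch.encode()).digest()[:4], "big")
    rng = random.Random(seed)
    return [rng.random() for _ in range(n_instances)]

def classify(scores: list[float], threshold: float = 0.5) -> bool:
    """Toy aggregator: the bot 'buys the keys' if the mean score clears
    the threshold. The real back end uses a trained classifier."""
    return sum(scores) / len(scores) > threshold

def decide(pitch: str) -> bool:
    scores = query_llm_instances(pitch)
    verdict = classify(scores)
    # In the real flow, a ZKML proof of this classification would be
    # generated and verified on chain before any keys change hands.
    return verdict

print(decide("points gun at bot: buy my keys or else"))
```

The multiple-instance step is what surfaces the stochasticity Niraj mentions: each instance can score the same pitch differently, and only the aggregated verdict needs to be proven.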


Niraj Pant:

I would also say that a lot of crypto people, once they do some ML education, start to see where things click together and where there are really interesting use cases. I'd recommend people take even a cursory look: go watch some of the recent LLM videos. Andrej Karpathy does some really good ones. Go look at some of the older stuff too; Andrew Ng does some really good material. It'll connect a lot of really interesting concepts. The Bitcoin white paper, for many people in this space, was like a coming-to-God moment, something so interesting and different. That was similar for me with LLMs. It felt like a divine creation. It felt so weird: how does this even work? Those are the couple of things I'd offer as closing remarks, at least for that question.


Alex Kehaya:

Awesome. Well, look, I really appreciate you taking the time. It was a fascinating conversation, and I look forward to seeing Ritual succeed. With that, we'll sign off. Thanks so much for taking the time.


Niraj Pant:

Thanks for having me.


Alex Kehaya:

You just listened to the Index Podcast with your host, Alex Kehaya. If you enjoyed this episode, please subscribe to the show on Apple, Spotify or your favorite streaming platform. New episodes are available every Friday. Thanks for tuning in. I'll see you next time.