Moonbeam Is Launching DataHaven to Secure Data Privacy and Provenance for an AI-Driven Web3 Future
Head of Business Development Ryan Levy sees data as the leading commodity for blockchain tech

Every day, humans and machines produce and consume endless amounts of digital data across websites, mobile applications and beyond.
The Internet has helped proliferate that data to all corners of the globe, and with it, the need for data storage. It’s an expansion that shows no signs of slowing down: according to Global Market Insights, the global decentralized storage market was valued at $623 million in 2024 and is estimated to reach $4.5 billion by 2034.
For folks like Ryan Levy, head of business development and partnerships at Moonbeam and DataHaven—its newly launched decentralized storage solution—data is the tech world’s most valuable resource.
DataHaven joins other blockchain protocols, like Primus, in building technology that prioritizes privacy and verification, betting that as artificial intelligence (AI) agents proliferate, data validation and verifiable sources of “truth” will become even more necessary.
“We strongly believe that what we’ve built and continue to build at Moonbeam is highly valuable to the world of web3, and now we’re providing that next level product where people can dive deep into the AI world and feel secure about the decisions they’re making because the data they’re using is verifiable and provable,” Levy said to me during a recent interview. “We’re making sure we’re part of the narrative but that we’re contributing to the overall ‘good’ of the narrative.”
Moonbeam is secured by Polkadot, and initially helped bring Ethereum into the Polkadot ecosystem. With DataHaven, Levy said, the opportunity is the reverse: to bring Polkadot to the Ethereum ecosystem and expand the world’s most-used blockchain through data storage.
“We looked at the thriving Moonbeam ecosystem—which is now primarily focused on DeFi, RWA, DePIN and gaming—and we decided we wanted to expand into the Ethereum ecosystem,” he said. “We will have this native bridge between Moonbeam and DataHaven. So although it’s two different blockchains as a whole, anybody building on Moonbeam will get access to DataHaven, and anybody using DataHaven for storage can in turn have access to all of the features and functionality of Moonbeam.”
In the coming weeks, DataHaven will release its whitepaper detailing what the team is building and why, but the mission seems predicated on a desire to help shape the future of decentralized data storage. Levy, a data infrastructure veteran who spent 25 years building data storage solutions, is leading that team.
“A little later in my career, I was building data centers, and throughout my career I’d done a lot of things related to data like data analytics and cybersecurity,” he said. “But it all translated back into data. If you think of the basis of technology and the core of where all technology lives, it really does rely on data.”
Levy was first introduced to blockchain in 2013, but didn’t become crypto-curious until 2017. His first entry point into blockchain—outside of Bitcoin—was through Ethereum.
“I loved the whitepaper and concept behind Ethereum because of the modularity of the ecosystem as a whole and because it was easy to translate to my background of building out data centers and infrastructures,” Levy said. “It really resonated with me and made sense.”
Levy then went on to work with a number of blockchains, building out the business development, partnership and go-to-market teams for SKALE Network, Chainstack and Kadena.
“It’s been an incredible journey,” he said. “I was fortunate enough to come in contact with the great team at Moonbeam and absolutely jumped at the opportunity. Whenever I look at projects, teams and companies, the first thing I always look for is culture. The technology is cool, we can do a lot with it, but I’m very much about people and very much about the culture. Moonbeam checked all of those boxes, and it wasn’t long after I joined that we started building out this new expansion into Ethereum, which is the birth of DataHaven.”
An AI-first storage system
It’s an expansion predicated on the belief that data is, and will continue to be, one of the most valuable assets within decentralized systems.
“Arguably, the most valuable asset you create that you own and that actually represents you is data,” Levy said. “That’s both from a human perspective and from a digital perspective. With the emergence of AI and AI agents, data is so important. It’s the fundamental data point we’re working with and why we’ve built DataHaven.”
It’s also why DataHaven was built to be an AI-first storage platform.
“We’re the first decentralized storage platform to be secured through EigenLayer’s restaking protocol,” Levy said. “This gives us the opportunity to provide a highly secure storage platform with a mindset and focus on AI and AI agents.”
The idea is that DataHaven will be positioned for an AI-agent future where data becomes a commodity across all verticals—from DeFi to gaming and beyond—with Moonbeam playing an integral role in how the system functions as a whole.
“We are absolutely not in any way, shape or form abandoning Moonbeam,” Levy said. “These are two different entities and blockchains. This is our expansion, and the way I see it from a business development and growth perspective is that there’s a new opportunity for the projects that have built on Moonbeam—and the projects that have built on Polkadot—to leverage this decentralized, highly secure storage platform in DataHaven. In return, we have this Moonbeam offering creating a positive cycle where one feeds the other.”
Avoiding the game of ‘telephone’
As we move into a future where data authenticity is increasingly valued, being able to guarantee that platform data is free of manipulation or adulteration by nefarious influences should become a major selling point.
And while changes to data aren’t always made for nefarious reasons, Levy believes that if we can’t verify the source of the data, or confirm that the data being used hasn’t been changed, we run a high risk of negative results.
To combat data alteration, DataHaven encrypts all the data it stores, so the platform itself cannot read or interpret the data that lives on top of it. The protocol then breaks the data into fragments that, together, are represented by a digital “fingerprint.”
“When someone accesses or wants to use the data, for whatever use it may be, they access that fingerprint, and you can tell immediately if something has been modified,” Levy said. “That fingerprint can only be used to represent data on the backend if all of those pieces fit together like a puzzle. So you’ll know straight away if something has been modified.”
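To make the idea concrete, here is a minimal sketch of how a fragment-plus-fingerprint check could work, assuming a simple Merkle-style scheme over fixed-size chunks. The chunk size, hash function and function names below are illustrative assumptions for the sketch, not DataHaven’s actual implementation.

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # illustrative fragment size; not DataHaven's actual parameter


def fingerprint(data: bytes) -> str:
    """Split the data into fragments, hash each fragment, then hash the
    concatenated fragment hashes into a single Merkle-style root."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)] or [b""]
    leaf_hashes = [hashlib.sha256(chunk).digest() for chunk in chunks]
    return hashlib.sha256(b"".join(leaf_hashes)).hexdigest()


def verify(data: bytes, expected_fingerprint: str) -> bool:
    """Recompute the fingerprint and compare it to the stored value;
    a change to any fragment changes the root, so tampering is detected."""
    return fingerprint(data) == expected_fingerprint


# Usage: store the fingerprint alongside the (encrypted) data,
# then re-check it every time the data is read.
original = b"record: blood pressure 120/80"
stored_fingerprint = fingerprint(original)

tampered = b"record: blood pressure 180/80"
print(verify(original, stored_fingerprint))   # True
print(verify(tampered, stored_fingerprint))   # False
```

In a scheme like this, the reader never needs to trust the storage layer itself: as long as the fingerprint was recorded at baseline, any later modification to the underlying fragments is immediately visible.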
Levy likens the lack of accurate, verified data to a game of “telephone,” where if inaccurate data enters a system and is then disseminated, everyone who now interacts with that data is interacting with incorrect information.
“If we can’t verify the data is as it was at baseline and has not been tampered with, we are fraught with the ability to manipulate and corrupt, especially when we hand the data off to AI agents that will be communicating with one another,” Levy said. “We’ve taken a very strong AI-first approach, knowing that the data needs to be secure and verifiable.”
DataHaven is working with Trusted Execution Environments (TEEs) to help verify that source data is accurate. In the coming months and years, Levy anticipates data verification will become a way for more and more web2 use cases to be onboarded to web3.
“For one of the first times ever, we’re able to provide a platform through blockchain that will meet all of the requirements where people can use health care data, financial data and insurance data for secure, verifiable data sources,” Levy said. “With that in mind, we’ve tackled the AI component first because it’s so prevalent, focusing on making sure what we build addresses the needs of yesterday, today and tomorrow.”
Lead image: Ryan Levy