This is a chapter from the book Token Economy (Third Edition) by Shermin Voshmgir. Paper & audio formats are available on Amazon and other bookstores. Find copyright information at the end of the page.
Ocean Protocol is a peer-to-peer data exchange. It integrates Web3 and privacy-preserving technologies to enable secure, accountable data sharing for AI models and other data-driven applications. The protocol is designed as a multi-token system in which data owners can share access to their data in a privacy-preserving yet accountable manner.
Note: Ocean Protocol has undergone significant evolution since its inception in 2017. This chapter focuses on the protocol's current mechanisms, particularly the features introduced with the V4 rollout. The author gratefully acknowledges Trent McConaghy for his valuable contributions to this chapter. The analysis of protocol stakeholders and token economics was conducted before the announcement of the Superintelligence Alliance and does not document dynamics that unfolded after that announcement.
The rise of the Internet has transformed data into a highly valuable asset class, driving an increasingly dominant data economy. However, this economy faces significant challenges, including limited access to private data, potential data misuse, and biases within available datasets. High-quality, representative data is crucial for AI applications across diverse fields, from chatbots and precision medicine to traffic control, manufacturing automation, and financial services. Unfortunately, growing data privacy legislation often restricts access to this data due to legitimate concerns about security, privacy, trust, and fair monetization.
To address these issues, the founders of Ocean Protocol developed a decentralized data exchange that allows anyone to exchange data assets and their access rights in a privacy-preserving manner while providing tamper-proof audit trails. The protocol serves as public and permissionless infrastructure for creating a variety of P2P tokenized data marketplaces. Their initial application–Ocean Market–allows anyone to become a data publisher by tokenizing datasets into unique Data NFTs and monetizing them through selling access rights in the form of datatokens. Data assets and services can be exchanged in various formats—such as raw datasets, processed data, AI models, and predictions. The protocol also allows anyone to build their own specialized data market by forking Ocean Market's source code. While originally designed for data science purposes, Ocean Protocol's framework can be applied to other data forms, including digital art like photos, videos, or music.
The Ocean Protocol Foundation was established in 2017 by a team that was previously involved in other crypto ventures like “Ascribe,” “BigchainDB” or “IPDB.” The first version of Ocean Protocol was launched in 2018, progressively enhancing its features. Since the launch of V4 in 2022, the Ocean Protocol Foundation has introduced three new features to help drive traction for protocol usage: (i) the “Ocean Data Farming” program, (ii) “Predictoor,” a specialized data market application, and (iii) “Ocean Nodes,” an infrastructural P2P network designed to improve the performance of their existing marketplace applications.
In 2024, Ocean Protocol joined the Superintelligence Alliance, collaborating with other Web3 protocols engaging in similar activities, such as “Fetch.ai” and “SingularityNET”. The goal was to counter centralized AI dominance by creating open, decentralized AI infrastructure. While the foundations behind the protocols retained independence, they collaborated through a governing council to integrate decentralized infrastructure into the AI landscape. To fund this initiative, the alliance transformed the existing Fetch.ai token (FET) into ASI, expanding its supply and distribution to include stakeholders from all participating protocols. The FET token was renamed ASI and additional tokens were minted to allocate to OCEAN and AGIX token holders.
Purpose & Political Principles
Ocean Protocol was created to provide a decentralized marketplace for sharing and monetizing data in a privacy-preserving manner while ensuring data provenance, allowing users to track who published, purchased, and consumed the data.
- Higher-quality data via publicly verifiable audit trails and shared control: The founders wanted to establish a shared global registry for training data and data models with publicly verifiable audit trails. The assumption was that shared control would encourage data sharing and improve overall data quality, resulting in better AI models and data-driven applications.
- Privacy & control: The goal was to provide technical solutions so that data could be passed on in a privacy-preserving manner and with full control over one’s data.
- Data as a common good: The vision was to accommodate both profit and nonprofit datasets, as well as mixed models, hoping that their data exchange would also be used for open-source licensing schemes to create a body of "data commons."
- Decentralized & permissionless: The long-term goal was to create an open and permissionless data exchange infrastructure that no single person or institution could own or control at any level of the stack. Ocean Protocol features have decentralized progressively since the protocol's inception. The aim of the founders was to bring the protocol to a level where human governance over protocol upgrades would be reduced to a minimum.
Functional Design
In Web2-based systems, data is collected by centrally operated entities and passed between the walled gardens of their server infrastructure, leaving data owners without reliable audit trails to monitor their data's use. Data giants like Google, Amazon, and Facebook dominate AI development due to their vast data resources and in-house expertise. Newcomers like OpenAI changed the rules, leveraging often illegal internet scraping practices in addition to third-party licensed data. Meanwhile, smaller institutions lack either the relevant data or the AI expertise. Data owners, on the other hand, face transparency issues as customer data is passed to third parties without a clear audit trail. Ocean Protocol addresses these challenges with the following design choices:
- Collectively managed and publicly verifiable data markets: The main challenge was to create a collectively managed and publicly verifiable marketplace between data providers and data consumers—including all data assets, their respective licenses, and related payments. A set of smart contracts that operate on a public blockchain infrastructure was designed to manage the issuance and settlement of all data assets and their access rights, with a verifiable audit trail and full control over what happens to one's data.
- Self-custodial wallets: To guarantee full control over one’s data, the marketplace infrastructure offers self-custodial features. This way, data publishers can control the public-private key pair to their data wallets for access-rights management of their data assets. The wallet only manages access rights without necessarily revealing any raw data.
- Privacy: The protocol designers aimed to ensure that data consumers would not have access to all raw data, only to the specific data they need, and that this data would not be shared directly. They developed a system where computation occurs directly on devices or cloud storage controlled by the publisher, which they called "Compute-to-Data." Access is granted for "computer eyes" rather than for "human eyes," meaning that the buyer can only see the data output after computation but never the original data input. This mechanism ensures that sensitive data never leaves the user-controlled device where it is stored. Initially, data providers were responsible for managing and maintaining Compute-to-Data services using infrastructural components provided by Ocean Protocol. Although decentralized, this approach required significant technical expertise and resources, potentially leading to inconsistencies and operational inefficiencies. To address these issues, Ocean Nodes were introduced to outsource Compute-to-Data services to a peer-to-peer network of infrastructure nodes (a minimal sketch of the Compute-to-Data flow follows this list).
- Access-rights management & encryption: Rights management, encryption, and data streaming between the blockchain network and the marketplace smart contract were initially handled by a middleware service called "Ocean Provider." Data publishers had to operate their own instances of Ocean Provider or rely on third-party services, which could centralize control and reduce data autonomy. This function was later delegated to Ocean Nodes, enabling a broader range of participants to operate these infrastructural services in a P2P manner.
- Indexing: To ensure that additional information about a data asset (such as descriptions, access conditions, and update history) is easily searchable and accessible, this metadata needs to be indexed, as searching the blockchain network for metadata can be slow and inefficient. An indexing service was developed that continuously monitors the blockchain, collects metadata about datasets and services, and stores it in a structured, easily accessible format to improve usability and performance without requiring direct interaction with the blockchain. Initially, indexing was designed as a separate component that could be operated by marketplace providers, which required time and resources. This infrastructural function was later delegated to Ocean Node operators.
- Permissionless infrastructure: The system was designed as open-source, offering necessary developer tools so that anyone can create their own third-party marketplace according to their own market rules. The market rules were designed so that data publishers have the right to offer any type of data asset and autonomously define the types of access rights they want to grant—all within the scope of what is legally accepted in their respective jurisdiction.
- Collective content curation: The protocol designers implemented collective content curation through a data farming mechanism, in which OCEAN token holders stake tokens to influence dataset rankings and earn rewards.
- IP or privacy violations: To strike a balance between decentralization, regulatory compliance, and operational integrity, intellectual property and privacy violations were addressed with a "Purgatory Process" managed by the Ocean Protocol Foundation. This process allows claims to be filed and disputes to be resolved transparently via a public repository on GitHub.
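The Compute-to-Data idea can be illustrated with a short, self-contained sketch. Everything below (the ComputeToDataNode class, run_job, average_age) is hypothetical Python written for this book discussion, not Ocean Protocol's actual API; it only illustrates the principle that the consumer supplies an algorithm and receives an aggregated result, while the raw data never leaves the publisher-controlled environment.

```python
# Minimal, illustrative sketch of the Compute-to-Data principle.
# All names are hypothetical and do not correspond to Ocean Protocol APIs.
from typing import Callable, Dict, List


class ComputeToDataNode:
    """Runs on infrastructure controlled by the data publisher (or an Ocean Node)."""

    def __init__(self, private_dataset: List[Dict]):
        self._private_dataset = private_dataset  # the raw data never leaves this environment

    def run_job(self, algorithm: Callable[[List[Dict]], Dict], has_valid_datatoken: bool) -> Dict:
        if not has_valid_datatoken:
            raise PermissionError("a valid datatoken is required to start a compute job")
        # The algorithm sees the data ("computer eyes"); the consumer only sees the result.
        return algorithm(self._private_dataset)


def average_age(records: List[Dict]) -> Dict:
    """Example consumer-supplied algorithm: returns an aggregate, never raw rows."""
    return {"average_age": sum(r["age"] for r in records) / len(records)}


node = ComputeToDataNode([{"age": 34}, {"age": 51}, {"age": 29}])
print(node.run_job(average_age, has_valid_datatoken=True))  # {'average_age': 38.0}
```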
Ocean Market & Third-Party Marketplaces
Ocean Market is a decentralized application operated by the Ocean Protocol Foundation that facilitates transactions between data publishers and data consumers. The exact market features have evolved over time. The open-source nature of the market design and the developer tools allows anyone to use it as a blueprint to develop their own marketplace–for specialized industries or specific use cases. The marketplace features the following elements and processes:
- Publishing data assets: Data owners upload their datasets to a marketplace of their choice. In this process, an ownership token (Data NFT) is minted that represents property rights to the data asset. The seller also needs to allocate a number of access and usage rights tokens (datatokens) to the NFT. When the data owner publishes the data assets and attached rights in the marketplace application, a blockchain transaction is invoked. The corresponding tokens are minted and the related metadata is registered on-chain. Only the URL to the data is encrypted by a smart contract and registered on the blockchain network; the data is never revealed in plaintext. This way, the data itself is not stored on-chain; only the access control to the dataset is managed by the blockchain network. Control over data remains with publishers at all times, as they hold the public-private keys to their data assets. In theory, a data publisher could also sell the property right to the data asset, as long as it is legal to do so (a simplified sketch of the publish-and-consume flow follows this list).
- Purchasing access & usage rights: One can buy the access and/or usage rights to a data asset in the form of datatokens. These datatokens can grant access to a static dataset (one or several files) or a dynamic dataset that represents a stream of data services over time. To access a dataset, the consumer needs to send the datatokens they bought to the wallet of the data publisher and make a service request. Datatokens can be purchased via the Ocean Market directly, or on a secondary market. The buyer pays fees for every token transaction to the marketplace operator and third-party service providers.
- Market fees: Different fees exist such as consumption fees for data access, swap fees for exchanging tokens, provider fees for encryption and compute services, community fees for protocol maintenance and optional publisher fees. Third-party marketplaces can define their own fee structures and services, but have to contribute to the Ocean ecosystem’s sustainability by passing on a certain part of the community fees.
- Discovering data: Ocean Market provides a visual discovery platform where buyers can browse, search, and filter data assets and services offered by data sellers. Content discovery is curated by community members who engage in so-called “data farming” activities. Anyone can participate in curation activities by locking OCEAN tokens into a smart contract to vote for a data asset they expect to be of interest to many people. In return, they can earn network tokens. The amount of rewards depends on the duration they lock their tokens and the number of times that dataset is consumed. Data assets are then ranked in the marketplace according to the number of votes they receive.
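The publish-and-consume flow described above can be summarized in a simplified model. The DataNFT class below is an illustrative Python sketch, not Ocean's smart contract code: on-chain, the ownership token is an ERC-721 contract and the access rights are separate ERC-20 datatoken contracts, but the essential bookkeeping is similar, assuming a one-datatoken-per-access pricing model.

```python
# Simplified, off-chain model of the publish-and-consume flow; illustrative only.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class DataNFT:
    owner: str
    encrypted_url: str                      # only the encrypted pointer is registered on-chain
    metadata: Dict[str, str]
    datatoken_balances: Dict[str, int] = field(default_factory=dict)  # address -> access tokens

    def mint_datatokens(self, to: str, amount: int) -> None:
        """The publisher defines how many access rights (datatokens) exist."""
        self.datatoken_balances[to] = self.datatoken_balances.get(to, 0) + amount

    def transfer_datatoken(self, sender: str, receiver: str, amount: int) -> None:
        """A purchase on Ocean Market or a secondary market is, in essence, a token transfer."""
        if self.datatoken_balances.get(sender, 0) < amount:
            raise ValueError("insufficient datatoken balance")
        self.datatoken_balances[sender] -= amount
        self.datatoken_balances[receiver] = self.datatoken_balances.get(receiver, 0) + amount

    def consume(self, consumer: str) -> str:
        """Spending one datatoken grants access to the underlying data service."""
        if self.datatoken_balances.get(consumer, 0) < 1:
            raise PermissionError("consumer holds no datatoken for this asset")
        self.datatoken_balances[consumer] -= 1
        return f"access granted to the service behind {self.encrypted_url}"


asset = DataNFT(owner="0xPublisher",
                encrypted_url="enc(https://example.org/dataset.csv)",
                metadata={"name": "example dataset", "license": "one-time access"})
asset.mint_datatokens(to="0xPublisher", amount=100)        # publisher defines the supply of access rights
asset.transfer_datatoken("0xPublisher", "0xConsumer", 1)   # the purchase
print(asset.consume("0xConsumer"))                         # the service request
```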
Ocean Predictoor
“Predictoor.ai” is a data market application developed by the Ocean Protocol Foundation, specializing in prediction markets. It introduces a new datatoken template for the licensing and access rights management of price predictions and potentially other types of forecasts.
- Publishing predictions: The application allows users to publish price predictions for a range of Web3-native assets, such as ETH or BTC, over 5- and 60-minute intervals. Users can operate bots—"predictoors"—that analyze market sentiment and submit price forecasts to a smart contract. Predictoors must stake OCEAN tokens as a measure of their confidence in their predictions. The stake is lost when a prediction is incorrect; if a prediction is correct, the predictoor receives a payout. Rewards depend on sales volume, prediction accuracy, and the amount staked (a simplified payout calculation follows this list). Payments, including staking fees, are settled in OCEAN tokens, for which blockchain transaction fees must be paid.
- Buying predictions: Traders can purchase prediction feeds from predictoors they trust to inform their trading decisions. The protocol designers expected accurate predictoors to attract more buyers over time, creating a feedback loop in which accurate predictions drive both demand for predictions and protocol revenue.
- Market fees: 0.1 percent of the final price paid for a prediction is passed on as a community swap fee. Twenty percent is allocated to the Ocean Protocol Foundation for the future development of Ocean Predictoor.
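The staking-and-payout logic can be approximated in a few lines of Python. The settle_epoch function below is a simplified, hypothetical model, not the exact Predictoor formula: it assumes that incorrect stakes are forfeited and that the pot (forfeited stakes plus sales revenue) is split among correct predictoors in proportion to their stake.

```python
# Simplified model of a Predictoor epoch settlement; illustrative only.


def settle_epoch(predictions: dict, actual_direction: str, sales_revenue: float) -> dict:
    """predictions maps predictoor address -> (predicted_direction, staked_ocean)."""
    correct = {p: s for p, (d, s) in predictions.items() if d == actual_direction}
    forfeited = sum(s for p, (d, s) in predictions.items() if d != actual_direction)
    pot = forfeited + sales_revenue
    total_correct_stake = sum(correct.values()) or 1.0
    # Correct predictoors get their stake back plus a stake-weighted share of the pot.
    return {p: s + pot * (s / total_correct_stake) for p, s in correct.items()}


payouts = settle_epoch(
    predictions={"alice": ("up", 100.0), "bob": ("down", 50.0), "carol": ("up", 25.0)},
    actual_direction="up",
    sales_revenue=10.0,
)
print(payouts)  # {'alice': 148.0, 'carol': 37.0}
```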
Token Types & Token Properties
The founders of Ocean Protocol designed a multi-token system, which originally had only one token type: OCEAN. New types of tokens were eventually introduced with each protocol update to fulfill additional system functions as they were developed, resulting in the token set “OCEAN,” “H2O,” “veOCEAN,” “Data NFTs,” and “datatokens.” In 2024, when Ocean Protocol joined the Superintelligence Alliance and introduced “Ocean Nodes,” two new token types were announced: “ASI” and “SBT,” which will partially replace the existing token system described in this chapter. At the time of researching for this book, the exact token governance rules for “ASI” and “SBT” are still being developed.
- Data NFTs represent a unique digital certificate of ownership for datasets or data services. They represent the copyright of the data asset, assuming the data publisher has a valid claim on the base IP. The token is minted upon what one could describe as “Claim-of-Copyright,” the moment a data publisher uploads their dataset or data service to Ocean Market, at which point the number and type of licenses issued are also defined. Data NFTs represent the property rights, as well as all related copyrights, management rights, and revenue rights of a given dataset. As the name states, Data NFTs are non-fungible. They can be transferable, but only if IP rights can legally be transferred. Data NFTs do not expire unless the dataset is withdrawn by the publisher. Their privacy features depend on the blockchain network used. The token contract also manages metadata and access/usage rights.
- Datatokens are fungible tokens issued by Data NFT creators to represent access and usage rights of data assets. Licensing models are flexible. They can be perpetual, time-bound, or grant one-time access only. Datatokens are minted during the creation of a Data NFT, where the publisher defines the number of access rights issued as well as their price. Datatokens expire when consumed or upon reaching a predefined expiration date. Datatokens granting access to one dataset (Data NFT) differ from datatokens of another dataset. Within one dataset, they are interchangeable (fungible). Pricing and transferability depend on the publisher’s settings. Privacy settings depend on the blockchain infrastructure used.
- OCEAN was designed to serve as the primary currency within the ecosystem, facilitating payments for services, marketplace transactions, and governance. Its second purpose is to serve as a convertible governance token: users could lock OCEAN tokens temporarily in escrow to generate veOCEAN tokens, gaining voting rights and staking rewards. The token was designed to be fungible, and its transferability would only be limited when OCEAN is temporarily locked up. Although OCEAN was designed as the native currency for network payments, other Ethereum-compatible tokens like DAI or ETH were also accepted. OCEAN has no price stability mechanisms. Its exchange rate is determined by supply and demand, though the protocol provides some minimal economic policies that encourage holding OCEAN instead of selling it on the open market. All OCEAN tokens were initially minted at project genesis and were allocated to different stakeholders who have different mandates as to how and when to distribute these tokens. OCEAN tokens are burned as an economic policy tool: 5 percent of revenues generated in OCEAN tokens through Ocean Market fees are burned on a regular basis. Its privacy features depend on the blockchain network used.
- H2O is a stable asset backed by OCEAN. It was designed to stabilize pricing for datatokens and reduce volatility risks for publishers and consumers. H2O functions as a crypto-collateralized stable token without a hard peg, maintaining a relatively stable value through its free-floating exchange rate. Publishers could denominate access rights in either OCEAN or H2O, depending on their needs. H2O tokens are minted by locking OCEAN into escrow or can be purchased on exchanges. Its privacy features depend on the blockchain network used.
- veOCEAN was designed as a governance token that could be created by locking OCEAN in an escrow contract, in proportion to the amount of OCEAN tokens locked. It could be used to (i) vote on the quality of datasets or (ii) serve as a monetary policy tool. The more OCEAN one locks, the more veOCEAN, and therefore “voting rights,” one obtains for voting on the quality of datasets. By locking the tokens and pointing them to a Data NFT, one effectively votes on which data assets have promising quality. One could earn a higher yield by actively curating data assets. On a weekly basis, all data assets available on Ocean Market are measured for two parameters: (i) Data Consume Volume, which refers to the number of times a data asset was consumed on Ocean Market in that period of time, and (ii) the amount of veOCEAN allocated to the data asset. Both measures are sampled 50 times a week to obtain an average and to distribute “Active Rewards” to data farmers for this period of participation (a simplified reward calculation follows this list). As a monetary policy tool, veOCEAN incentivized OCEAN token holders to save their OCEAN by locking it up, earning interest for the duration of the lockup as an incentive for holding OCEAN instead of selling it. veOCEAN tokens are non-transferable and expire when the lockup period ends. All veOCEAN within one escrow contract are fungible in the sense that they are equal in voting/curation power, but they are tied to the identity of the escrow contract owner (the original OCEAN holder). There is no stability mechanism, as this is not relevant in the context of voting in general and curation in particular. Its privacy features depend on the blockchain network used.
- ETH & ROSE: Native blockchain tokens such as ETH (Ethereum network) and ROSE (Oasis network), as well as the native tokens of other blockchain networks on which Ocean Protocol operates, also play a crucial role in its ecosystem. These tokens are required for network fees on the blockchains hosting Ocean applications. For example, ETH is needed for transactions on Ethereum, while ROSE is used for Ocean Predictoor’s operations on the privacy-preserving Oasis Sapphire network. The economic dynamics of these blockchain networks are likely to influence the token economics of all tokens in the Ocean ecosystem.
- ASI Token: The ASI token, introduced in 2024, serves as the unified utility token for the Superintelligence Alliance. The idea is to consolidate functionalities of the network tokens previously used in Ocean Protocol, Fetch.ai, and SingularityNET to facilitate transactions, governance, and access to services across the alliance's platforms. At the time of researching for this book, the exact token governance rules are still being developed.
- SBTs (Soulbound Tokens) are a new token type announced to serve as non-transferable reputation tokens representing unique attributes or achievements of an individual or entity participating in the system. Their potential integration into the Ocean Protocol ecosystem may relate to identity verification or reputation systems. At the time of researching for this book, the exact token governance rules are still being developed.
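To make the data-farming mechanics behind veOCEAN more concrete, the sketch below (referenced in the veOCEAN item above) splits a weekly Active Rewards budget among curators according to the veOCEAN they allocated to an asset multiplied by that asset's Data Consume Volume. This stake-times-DCV weighting is an illustrative simplification, not the exact on-chain reward formula.

```python
# Back-of-the-envelope model of weekly data-farming rewards; illustrative only.


def weekly_active_rewards(allocations: dict, dcv: dict, reward_budget: float) -> dict:
    """allocations[(farmer, asset)] = veOCEAN allocated; dcv[asset] = Data Consume Volume."""
    weights = {key: stake * dcv.get(key[1], 0.0) for key, stake in allocations.items()}
    total = sum(weights.values()) or 1.0
    return {key: reward_budget * w / total for key, w in weights.items()}


rewards = weekly_active_rewards(
    allocations={("alice", "weather-data"): 1000.0, ("bob", "weather-data"): 500.0,
                 ("alice", "traffic-data"): 2000.0},
    dcv={"weather-data": 40.0, "traffic-data": 5.0},
    reward_budget=10_000.0,
)
for (farmer, asset), amount in rewards.items():
    print(f"{farmer} / {asset}: {amount:.0f} OCEAN")  # heavily consumed assets earn curators more
```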
Economic Policies
The OCEAN token was initially introduced for payments within the Ocean ecosystem. As it lacked a stability mechanism, the H2O token was introduced to offer greater economic stability, providing stakeholders with reliable financial planning. H2O is fully backed by OCEAN. Its dynamics are influenced by the monetary and fiscal policies associated with OCEAN.
- Initial issuance of OCEAN: The initial issuance of OCEAN tokens was capped at 1.41 billion, with allocations as follows: 51 percent for several incentive programs, held in a multisignature wallet managed by the Ocean Protocol Foundation; 15 percent for community grants and foundation operations; 10 percent to BigchainDB, the original founding company; and 24 percent distributed to investors through various funding rounds.
- Continuous issuance of OCEAN: The 51 percent of OCEAN tokens that have been earmarked for continuous issuance of network rewards are used as follows: 35.7 percent supports protocol and marketplace contributions, such as saving OCEAN, curating data assets, or making predictions, while 15.3 percent has been allocated to future incentive programs. Token distribution is currently manual, conducted by the Ocean Protocol Foundation, but should transition to an automated process via “vesting” smart contracts going forward.
- Deflationary monetary policy: Five percent of network revenue collected in OCEAN tokens is burned, reducing overall supply. This burn mechanism is triggered only when network income increases, striking a balance between growth and preserving token value. The idea is to counteract inflationary pressures from new OCEAN tokens entering circulation.
- Interest rate: The interest paid on locked OCEAN tokens is an incentive to encourage OCEAN token holders to convert OCEAN into veOCEAN instead of selling them.
- Data Farming yield: A higher yield is granted to OCEAN token holders who use their veOCEAN to also curate data assets, thereby creating value-added services for the protocol, for which they are rewarded with relatively more OCEAN tokens.
- OCEAN not legal tender: The protocol was intentionally designed to allow any Ethereum-compatible token to be accepted for network payments (purchasing datatokens, accessing services, or paying fees), as enforcing a single token would create lock-in effects and contradict the decentralization ethos. The protocol's FixedRateExchange smart contract facilitates atomic swaps, allowing datatokens to be exchanged for other compatible Ethereum tokens. Swap fees generate revenue for the protocol: 0.1 percent for exchanges using OCEAN and 0.2 percent for other tokens (a worked example of these fee parameters follows this list).
- Fiscal policy: The protocol builds its treasury through network taxes, also known as Ocean community fees. These fees are collected from marketplace operators, who in turn collect them from the users of their systems. Certain fees can be determined autonomously by marketplace operators. This means that, in some form or another, all users of Ocean Protocol pay transaction fees on their network activities, which are then partially allocated to the treasury smart contracts of the Ocean Protocol Foundation. This percentage can currently only be adapted by the Ocean Protocol Foundation. This structure ensures continuous protocol funding for ongoing research, development, and other necessary operations.
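A worked example of the fee and burn parameters mentioned above: the FixedRateExchange charges a 0.1 percent community swap fee when OCEAN is used (0.2 percent for other tokens), and 5 percent of network revenue collected in OCEAN is burned. The numbers below are illustrative.

```python
# Worked example of the swap-fee and burn parameters; amounts are illustrative.

SWAP_FEE_OCEAN = 0.001      # 0.1 percent when the exchange uses OCEAN
SWAP_FEE_OTHER = 0.002      # 0.2 percent for other Ethereum-compatible tokens
BURN_RATE = 0.05            # 5 percent of OCEAN-denominated network revenue is burned


def swap_fee(amount: float, paid_in_ocean: bool) -> float:
    return amount * (SWAP_FEE_OCEAN if paid_in_ocean else SWAP_FEE_OTHER)


purchase = 10_000.0                              # datatoken purchase priced in OCEAN
fee = swap_fee(purchase, paid_in_ocean=True)     # 10.0 OCEAN community swap fee
burned = fee * BURN_RATE                         # 0.5 OCEAN removed from circulating supply
print(f"swap fee: {fee} OCEAN, of which {burned} OCEAN is burned")
```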
Stakeholders
The types and roles of stakeholders within Ocean Protocol have evolved alongside iterations of the protocol. Early governance approaches, including roles like “Keepers,” “Workers,” or “Working Groups,” were replaced as the protocol matured. From November 2020 to October 2022, the Ocean core team operated OceanDAO, a grants program where OCEAN token holders voted on developer proposals. This was replaced in mid-2022 by the Data Farming program, prioritizing ecosystem incentives over community-led grant funding. At present, the Ocean Protocol Foundation oversees protocol evolution, with core team members paid by the Foundation. This will continue until the protocol is finalized and governance can, for the most part, rely on the automated market rules.
- The Ocean Protocol Foundation, a Singapore-based nonprofit established in 2017, manages treasury funds, protocol development, and community activities. Its goal is to decentralize governance and reduce reliance on human intervention. The Foundation hires subcontractors and companies to develop protocol components and adapt its rules as needed, including managing token issuance and de-listing datasets flagged for legal violations.
- BigchainDB GmbH, a German company and the original creator driving Ocean Protocol’s design, has played a diminishing role since the Foundation’s creation. While initially contracted to maintain technical, legal, and operational activities, many former BigchainDB employees now work directly for the Foundation.
- OCEAN token holders have a stake in the system. They participate in protocol governance and economic activities by locking their tokens into veOCEAN. Their participation aligns monetary rewards with network contributions.
- Data Scientists & AI developers can fulfill multiple roles across marketplaces, acting as data consumers, application developers, or prediction providers. In the Ocean Predictoor application, they submit predictions, earning rewards for accurate results. Challenges funded by the Ocean Protocol Foundation also incentivize data scientists to develop application-level innovation for the ecosystem.
- Data publishers are key stakeholders, sharing and monetizing data assets through licenses (datatokens) tied to Data NFTs. They define access permissions, pricing, and restrictions, ensuring datasets meet specific criteria.
- Data consumers drive demand for datasets, paying for licenses and contributing to marketplace revenues. Their role in creating value-added applications highlights their significance in sustaining the ecosystem.
- Ocean Market is the first decentralized application of Ocean Protocol. It is a general-purpose market maker between buyers and sellers of data services and determines the market rules, including the transaction fees market participants have to pay.
- Ocean Predictoor is a specialized data market application for prediction markets, focused on predicting prices. It is a market maker between prediction providers (predictoors, a subset of data publishers), who earn rewards based on accuracy, and prediction buyers (asset traders, a subset of data consumers). It determines the market rules, including the transaction fees market participants have to pay.
- Ocean Nodes provide infrastructural services for marketplace applications, enabling secure data exchange, privacy-preserving computation, and indexing services. Node operators can earn OCEAN tokens by providing infrastructure services, ensuring the availability and reliability of the network. They also play a role in distributing rewards for activities such as curating datasets and facilitating Compute-to-Data jobs.
- Third-party marketplaces expand Ocean’s ecosystem by specializing in specific data types or industries. Their operators can define market rules and earn a share of transaction fees, fostering innovation in the ecosystem.
- Ocean Ambassadors is a community initiative funded by the Foundation with the role of advocating for and educating about the ecosystem. Anyone can join after completing Ocean Academy’s introductory course.
- Superintelligence Alliance: In 2024, Ocean Protocol joined the Superintelligence Alliance to collaborate with Fetch.ai and SingularityNET to counter centralized AI dominance with an open, decentralized AI infrastructure. While the foundations behind all individual protocols retain independence, they collaborate through a governing council to integrate decentralized infrastructure into the AI landscape. To fund this initiative, the alliance launched a new token. It started with the Fetch.ai network token (FET) as the base token of the alliance, which was then renamed ASI. An additional 1.48 billion ASI tokens were minted, with 867 million ASI tokens allocated to AGIX token holders (SingularityNET) and 611 million ASI tokens allocated to OCEAN token holders (Ocean Protocol). The total supply of ASI tokens is 2.63 billion.
- Regulatory authorities shape the legal framework for transactions, particularly regarding copyright, privacy, and securities laws. Compliance with these regulations influences how the protocol operates across jurisdictions.
- Other Web3 protocols and applications: Economic and infrastructural aspects of the blockchain networks upon which tokens of the Ocean Protocol ecosystem are issued and managed can influence the Ocean ecosystem. Any macroeconomic changes or black swan events within the Ethereum ecosystem, or within wallet services that integrate with Ethereum, can also create positive or negative effects on the Ocean ecosystem. Positive effects could be, for example, wallet applications or market integrations with great usability that onboard many new players to the Web3 ecosystem and generate more market demand. On the other hand, wherever Ocean Protocol tokens integrate with financial products of a DeFi or CeFi ecosystem, they can also be affected by black swan events spilling over from these financial ecosystems.
Power Structures
- Policymaking power: At the time of writing, the Ocean Protocol Foundation’s core team holds most policymaking authority, though its scope is narrowing. Key functionalities like data access control and economic mechanisms are already fixed, limiting changes to transaction fees and network taxes. New features added to the protocol stack remain under the core team’s purview. The founders envision “crystallizing and automating” the protocol, reducing the need for future changes and limiting policymaking to fine-tuning economic parameters. However, it remains unclear when or if these adjustments will be decentralized or how decision-making power will be distributed in unforeseen scenarios. Since the formation of the Superintelligence Alliance, certain policymaking functions have shifted to the collaborative entity.
- Voting power: Within the Ocean ecosystem, the philosophy of what constitutes a DAO, who should be able to vote in a DAO, and what types of voting rights stakeholders should have has changed over the years. In the beginning, OCEAN token holders could vote on the allocation of funds through a grants DAO and on some protocol changes. Concerns over “organizational capture” by organized subgroups at such a vulnerable early stage of protocol development led to a re-centralization of voting power. Currently, token holders do not have direct voting rights over protocol evolution or funding allocation. Voting powers are limited to data curation via veOCEAN tokens.
- Executive power: Initially, the founders and team members of BigchainDB GmbH had executive power over operational decisions relevant to the evolution and maintenance of the protocol, community management, marketing, etc. As the protocol was deployed and market mechanisms were established, executive power became distributed among various stakeholders. Data publishers and consumers have the right to execute data provision and consumption via various markets within predefined market mechanisms. Ocean node operators have the power to execute infrastructural functions. OCEAN token holders have the right to execute macroeconomic policies by temporarily locking their tokens. The Ocean Protocol Foundation retains executive power to sanction datasets flagged for legal violations, and third-party marketplace operators hold similar authority within their marketplace applications.
- Market power: Data publishers and data consumers make or break the data market, while data market operators have market-making power, shaping the ecosystem’s viability. Over time, as more data marketplaces emerge, additional power structures are likely to form between these marketplaces and their participants. Furthermore, OCEAN token holders exercise market power by buying, holding, converting, or selling their tokens, directly influencing the broader ecosystem’s dynamics, and Ocean Nodes hold market power through the infrastructural services they provide.
Purpose & Reality
The Ocean Protocol team has demonstrated persistence in developing and adapting the protocol while remaining committed to their core roadmap. Over the years, they have addressed technical challenges such as creating privacy-preserving marketplaces and tokenizing data assets and services. At the time of writing, their focus lies on expanding specialized data markets, fine-tuning existing mechanisms, and providing a decentralized backend for AI-powered applications together with their strategic partners. Their most significant challenge remains the curation of quality data within their Data Farming Program, an issue that mirrors broader difficulties in decentralized content curation across Web3 and Web2 platforms.
- Token and market design for data assets and data services: Ocean Protocol has successfully developed a template for data-sharing marketplaces by creating a tokenized system for representing data assets and services that is publicly verifiable.
- Privacy-preserving decentralized data sharing infrastructure: The team has also implemented robust privacy-preserving data-sharing infrastructure that is collectively maintained by a peer-to-peer network of Ocean node operators who ensure that data services can be consumed securely without exposing sensitive information.
- Traction: Despite technical achievements, the protocol has faced challenges with user adoption, possibly because it was ahead of its time. Furthermore, since data is a vast and diverse asset class, the general-purpose market mechanisms provided by Ocean Market may not suffice to cater to the needs of users. The team has responded by fostering specialized data markets, such as Ocean Predictoor, and by encouraging the community, through data challenges and related programs, to build niche applications with mechanisms that cater to specific data categories.
- Curation of quality data: The greatest challenge lies in the Data Farming Program’s curation mechanisms. Initial implementations faced issues such as “wash consuming,” where data publishers manipulated data curation activities by consuming their own datatokens to inflate demand and rewards artificially. While higher consumption fees have mitigated this issue, the underlying problem remains: the incentive mechanism reduces data relevance and quality to a single metric—popularity, which is measured in terms of “Data Consume Volume.” This approach fails to account for the subjective nature of data quality and relevance, which vary depending on consumer needs. While deploying specialized data markets may help mitigate this problem by focusing on niche datasets, the broader issue of meaningful collective data curation persists. Without subjective curation mechanisms, a critical component of a decentralized data market remains underdeveloped. The protocol’s success depends on its ability to address these challenges, particularly in creating effective and fair curation systems that support diverse and meaningful data sharing.
Footnotes
[1] The Ocean Market – or any potential third-party market – typically has a centralized server running the backend to manage the metadata cache or serve webpages. From a legal perspective, this means that the law of the country where the server is running applies.