AllianceBlock Data Tunnel — Development Update and Continuation of Ocean Protocol Partnership
Following the announcement of the AllianceBlock Data Tunnel, we’d like to provide a deep dive into the Data Tunnel Roadmap and show exactly where we are headed.
The Data Tunnel represents the very first use case of our strategic partnership with Ocean Protocol.
While the synergies between the two companies go far beyond the Data Tunnel, the tunnel is an essential part of the AllianceBlock ecosystem and demonstrates the potential that the partnership between AllianceBlock and Ocean Protocol holds.
Along with Ocean, we are well on the way to fulfilling our vision to connect the traditional financial industry (TradFi) and the decentralized financial industry (DeFi). The Data Tunnel will be the first of a number of products which will simplify compliance with financial regulations.
Together, AllianceBlock and Ocean will play an important role in the emerging data economy. Through the AllianceBlock Data Tunnel, organizations will have simplified access to valuable bank data compliant with open banking regulation.
AllianceBlock will build a data pipeline on top of Ocean where user data such as KYC/AML/KYB will be stored, and users will be able to use role-based access control to grant and revoke access to their data. This product is part of AllianceBlock’s use case of bridging traditional finance and decentralized finance applications: providing short-term financing to high-net-worth individuals at private banks, secured by illiquid assets as collateral.
DeFi P2P lending applications and traditional equivalent applications will be connected to the AllianceBlock product. All lenders interested in participating in specific financing will need to do KYC/AML and become validated by the regulatory and compliance tools built by AllianceBlock and powered by the ALBT token. Storage and access control of data for lenders will use Ocean Protocol technology, with lenders being able to grant and revoke access to their data.
Through Ocean Protocol, retail banking APIs will connect to DeFi in a way that is compliant with financial regulations such as PSD2, resulting in new, exciting open banking solutions.
The AllianceBlock Data Tunnel is an important step toward the creation of a fully decentralized capital market, further bridging CeFi with DeFi. At the same time, the Data Tunnel will enable a host of possibilities for SMEs, both inside and outside of the blockchain industry, to monetize data while staying in full control of it.
The Data Tunnel leverages some of our existing partnerships including Ocean Protocol, Aikon, and API3.
AllianceBlock is aiming to become a major player in decentralized open finance and open banking. Through this partnership, financial institutions and DeFi platforms can use the Ocean Protocol Data Layer through the AllianceBlock Data Tunnel. In addition, SMEs can unlock new possibilities for publishing data that currently remains inaccessible (this can be virtually anything: city planning and utilization data, water quality measurements of a country, traffic flow on a highway, power consumption statistics, etc.). Furthermore, Core Banking System (CBS) providers will be able to leverage decentralized data layers and create new opportunities for the next generation of decentralized banks.
We developed the prototype in December 2020 and are now working on a Proof of Concept, which is scheduled to be finished at the end of January 2021. The Proof of Concept will be followed by an MVP running on the Mainnet in March. The full-featured v1.0 will be released in May 2021, featuring a developer portal to enable developers to build new services and products on top of the AllianceBlock Data Tunnel and associated infrastructure.
The Development Process
The Data Tunnel required extensive research and a range of different technologies to be fully implemented, including the technologies provided by our partner projects Ocean Protocol, Aikon, and API3.
The AllianceBlock Data Tunnel development team is located in our office in Utrecht, The Netherlands. Operating in a highly agile Scrum process, during our weekly sprints (defined in Atlassian’s Jira) we relentlessly explore, analyze, and implement user stories. Each sprint is preceded by an extensive planning session and concluded with a retrospective in order to help us continuously improve our development process and increase the quality of our work, whilst remaining agile to the ever-changing digital asset markets.
Phase I — Prototype — Finished
The goal of the prototype was to take any CSV dataset, convert it to JSON, and publish it on the Ocean Data Marketplace through an easy-to-use UI, without the hassle of dealing with all the associated MetaMask transactions.
- Publish a CSV dataset, Consume JSON data
Automatically convert CSV data to JSON and derive a JSON schema from the input, both to validate the data and to provide a human- and machine-readable format that makes data consumed from the Data Tunnel easier to interpret.
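As a rough sketch of this step (the function names and the type-inference heuristic here are illustrative, not the actual Data Tunnel implementation), CSV-to-JSON conversion with schema derivation could look like this:

```python
import csv
import io

def coerce(value: str):
    """Best-effort conversion of a CSV cell to a typed JSON value."""
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            continue
    return value

# Map Python types of coerced values to JSON Schema type names.
JSON_TYPES = {int: "integer", float: "number", str: "string"}

def csv_to_json_with_schema(csv_text: str):
    """Convert CSV text to typed JSON records and derive a simple JSON Schema
    from the first row (a real implementation would inspect all rows)."""
    records = [
        {key: coerce(val) for key, val in row.items()}
        for row in csv.DictReader(io.StringIO(csv_text))
    ]
    schema = {
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                key: {"type": JSON_TYPES[type(val)]}
                for key, val in records[0].items()
            },
        },
    }
    return records, schema
```

A consumer receiving both the records and the derived schema can then interpret each field mechanically instead of guessing types from raw strings.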
- Upload and Download through a custom UI (without MetaMask)
In order to drive adoption, we want users/organizations to be able to publish data without having to learn MetaMask when interacting with the Ocean Data Marketplace.
Phase II — Proof of Concept (January 2021)
The goal of the Proof of Concept (PoC) is to support the three major data formats: XML, CSV, and JSON. Every file (or stream of data) is processed in a serverless environment, which is cheaper and extremely scalable. The PoC aims to support both users/organizations that are well versed in decentralized technologies and those that prefer a more familiar way of working.
- Easy Dataset Creation
The publisher of the data can initially choose how to publish their data:
1) Fully decentralized, using MetaMask, no account creation required
2) Easy to use, leveraging Aikon’s ORE ID, no MetaMask required
- Subscription vs One-offs
Publishers can choose between a subscription model for their data or single-use models where the data consumer will have to pay each time they query the dataset. Both models can exist at the same time for each dataset.
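The two pricing models coexisting on one dataset could be represented along these lines (a minimal sketch with hypothetical names and prices, not the actual Data Tunnel schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PricingOptions:
    """Both models can be offered simultaneously for one dataset."""
    per_query_price: Optional[float] = None     # one-off: pay per query
    subscription_price: Optional[float] = None  # flat price per billing period
    subscription_period_days: int = 30

@dataclass
class Dataset:
    name: str
    pricing: PricingOptions

    def quote(self, expected_queries: int) -> float:
        """Return the cheaper total cost for the expected number of queries in
        one billing period, considering only the models the publisher enabled.
        Assumes at least one pricing model is set."""
        options = []
        if self.pricing.per_query_price is not None:
            options.append(expected_queries * self.pricing.per_query_price)
        if self.pricing.subscription_price is not None:
            options.append(self.pricing.subscription_price)
        return min(options)
```

With both models enabled, a light user would effectively pay per query while a heavy user would be better served by the subscription.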
- Developer Friendly
- Fully Automated Conversion, Analyses, and Validation
Each time (additional) data is uploaded, the AllianceBlock Data Tunnel analyzes it. During this process, it validates the data layout in order to guarantee predictable data for the data consumers. Ultimately, the data, no matter which format it was uploaded in, will be available in an easy-to-understand JSON format that comes with a JSON-based schema, so that developers can more easily build applications on top of this data.
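In its simplest form, the validation step could check each uploaded record against a previously derived schema. The sketch below uses plain type checks as a stand-in for a full JSON Schema validator; all names are illustrative:

```python
# Map JSON Schema type names to acceptable Python types.
PYTHON_TYPES = {"integer": int, "number": (int, float), "string": str}

def validate_records(records, schema):
    """Check each record's fields against the schema; return a list of
    human-readable error strings (empty when everything conforms)."""
    properties = schema["items"]["properties"]
    errors = []
    for i, record in enumerate(records):
        for key, spec in properties.items():
            if key not in record:
                errors.append(f"row {i}: missing field '{key}'")
            elif not isinstance(record[key], PYTHON_TYPES[spec["type"]]):
                errors.append(f"row {i}: field '{key}' should be {spec['type']}")
    return errors

# Illustrative schema and data:
schema = {"items": {"properties": {
    "city": {"type": "string"},
    "population": {"type": "integer"},
}}}
good = [{"city": "Utrecht", "population": 361699}]
bad = [{"city": "Utrecht", "population": "unknown"}]
```

Rejecting (or flagging) nonconforming uploads at this point is what lets consumers rely on a predictable shape for every dataset.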
Trustless and Reusable Identity Verifications
Many service providers require users to identify themselves at some level (minimum age, compliant jurisdiction, full identity via KYC/AML, etc.). Each time, users are forced to submit sensitive personal information, putting themselves at risk of a data breach. With the AllianceBlock Data Tunnel, the AllianceBlock Trustless and Reusable Identity Verification only needs to verify a user once. Furthermore, personal information will be owned by the user and only the user. Only their fully anonymized verification data will be available to service providers, and only after the user allows it.
- Identity linked to a wallet
Connect a wallet to a Service Provider’s web application and use the wallet as an identification method; there is no need to provide personal details, unless explicitly required, in order to use the service.
- Anonymized and Reusable Verification Information
Service Providers only need a wallet address in order to know whether somebody is verified and allowed to use their services.
- Fraud prevention
Anonymized Verification Information includes data that makes its authenticity easy to verify: information only a trusted identity verification provider could produce, delivered through signed messages.
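The idea can be sketched as follows. An HMAC over the anonymized payload stands in here for the provider’s signed message; a production system would use the provider’s public-key signature instead, and all names and fields are illustrative:

```python
import hashlib
import hmac
import json

# Stand-in for the trusted verification provider's signing key.
PROVIDER_KEY = b"trusted-verifier-secret"

def sign_verification(payload: dict) -> str:
    """Provider side: authenticate the anonymized verification payload.
    Canonical JSON serialization keeps the signature deterministic."""
    message = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(PROVIDER_KEY, message, hashlib.sha256).hexdigest()

def is_authentic(payload: dict, signature: str) -> bool:
    """Service-provider side: confirm the verification really came from the
    trusted provider and was not tampered with."""
    return hmac.compare_digest(sign_verification(payload), signature)

# Illustrative anonymized payload: flags only, no personal data.
payload = {"wallet": "0xabc", "over_18": True, "jurisdiction_ok": True}
signature = sign_verification(payload)
```

Any change to a flag invalidates the signature, so a forged or edited verification is detectable without ever exposing the underlying identity documents.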
- All sensitive data will be owned by the person the identity belongs to
Once the verification is completed, the data will be destroyed. The report with sensitive information will be fully owned and controlled by the person that provided their identity information.
- Access to information only when granted
Service Providers are able to purchase reusable (and anonymized) verification information through the AllianceBlock Data Tunnel, but only if the user has granted them access.
Phase III — MVP (March 2021)
With the MVP (Minimum Viable Product) we will strive for a version which can be used in real world scenarios. This version will be available on Mainnet and aims to be usable by our partners and clients once released. The MVP comes with a more extensive toolset for developers to build on top of the Data Tunnel and the Trustless and Reusable Identity Verification Services.
- Payment Service Provider Integration
In order to cater to the needs of traditional organizations, we will integrate a Payment Service Provider (PSP) to enable a fiat gateway. The fiat gateway will be available for both data publishers and data consumers.
- AllianceBlock Marketplace (Proof of Concept)
For data consumers (the ones who will purchase, download, and use the published datasets), we will create an easy-to-use interface to query, view, and analyze the data. This will enable data consumers to get acquainted with the data before having to develop software on top of it.
- More Data Formats Supported
Even though the major formats will already be fully supported by the PoC, there will surely be a need for other common, albeit more exotic, data formats. We will release a few extensions that publishers can use to get this data into the Data Tunnel in the right format. Furthermore, developers will be able to write their own connectors (with the help of our SDK) for proprietary data formats that are not yet supported, catering to the needs they see arising in the market.
- Subscription model with Automated Recurring (Fiat) Payments
Data publishers are able to choose whether they want their data to be accessible only per payment, for a certain period (for example a month or a year), or on a continual subscription basis. A subscription model (where a data consumer for example will need to pay monthly or annually in order to retain access to the data) will be possible in order to ensure continuous availability to the data consumer and an easy-to-manage payment gateway for the data publisher.
- Ensuring the quality of published data
In order to continuously improve data quality and maximize usability for data consumers, data publishers are able to describe their data in depth before publishing. Our analytical services will use these descriptions to determine the best way for data consumers to interact with the data. Additionally, data that is deemed low quality will impact the rating of data publishers, something we will expand on further in version 1.0 (continue reading for more information).
Trustless and Reusable Identity Verifications
- Integration of KYC providers
Widely trusted KYC/AML providers will be fully integrated, with fraud prevention methods in place to prevent forged anonymized identity verifications.
- Fiat gateway for Service Providers
Service Providers that want to use the reusable and anonymized identity verifications of their users might not always know how to interact with decentralized technologies. Through a fiat gateway, they will be able to purchase (anonymized) verification information they can easily use for their services.
- Wallet Integration for non-crypto users
Not everybody has MetaMask installed or even knows what a wallet is. They might like the idea of being verified anonymously, but would otherwise be unable to participate. To cater to these users, we will leverage Aikon’s ORE ID to give them the same benefits that more blockchain-savvy users get.
Phase IV — Product V1.0 (May 2021)
After the MVP, the next release is a feature-complete version 1.0. It will contain a suite of features that enables organizations to build new products and offer new services while cutting costs. Developers will be fully supported through a developer portal, which will launch concurrently with the AllianceBlock Fund for developers (an allocation of the AllianceBlock token, ALBT, reserved for developers who build something meaningful on top of the AllianceBlock Protocol).
- AllianceBlock Marketplace
Built on Ocean Protocol, the AllianceBlock Marketplace provides an intelligent and easy-to-use toolset and API Explorer. Automatically generated filters and query features will be available so that data can be analyzed without the need for tailor-made software.
- Support for Multiple Data Source Protocols
Our serverless Data Tunnel solution might not suit all data publishers. If they want to host the data themselves, or host it using IPFS, we will offer them the ability to do so. This means that data publishers are responsible for the quality of their data hosting service. Each data publisher will be rated on availability (uptime) and quality (speed). This rating will determine the dataset’s placement in the marketplace and the price of access for data consumers.
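A rating along these lines might combine the two measurements into a single score, for example (the weights, scale, and normalization below are purely illustrative, not the actual rating formula):

```python
def publisher_rating(uptime: float, median_response_ms: float,
                     uptime_weight: float = 0.7) -> float:
    """Combine availability (uptime as a fraction, 0.0-1.0) and speed
    (median response time in milliseconds) into a 0-100 rating.
    Speed is normalized so responses at or under 100 ms score 1.0."""
    speed_score = min(1.0, 100.0 / max(median_response_ms, 100.0))
    score = uptime_weight * uptime + (1 - uptime_weight) * speed_score
    return round(100 * score, 1)
```

Weighting uptime more heavily reflects that a fast dataset is worthless to consumers if it is frequently unreachable.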
- Developer Portal
To fully embrace developers, we will release a developer portal housing all documentation needed to interact with the API directly or the SDK. More SDKs will be made available, including a C# and PHP version.
- More Connectors
Support for other popular data formats will be made available to lower the barrier to entry for most data publishers. Regardless of the source format, all data will always be made available in JSON, ensuring complete predictability for all data consumers.
Trustless and Reusable Identity Verifications
- Oracle integration
With AllianceBlock becoming an API3 Airnode connected to the AllianceBlock Data Tunnel and the Trustless and Reusable Identity Verification Services, smart contracts will be able to automatically determine whether a transaction can be accepted, based on anonymized identity verification details obtained through our oracle.
- Auto-renewal of verifications
Identity verifications are rarely renewed, even though most service providers that require KYC/AML verification are obliged to keep them current. To ensure up-to-date user data, renewal of verification data is automatically required and enforced every six months.
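The renewal check itself is simple date arithmetic, sketched here with an illustrative 182-day interval standing in for "six months":

```python
from datetime import date, timedelta

# Roughly six months; the exact interval here is illustrative.
RENEWAL_INTERVAL = timedelta(days=182)

def renewal_due(last_verified: date, today: date) -> bool:
    """True when a verification must be renewed before it can be reused."""
    return today - last_verified >= RENEWAL_INTERVAL
```

A service consuming a verification would run this check before accepting it, forcing the user through re-verification once the interval has elapsed.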
About Ocean Protocol
Ocean Protocol builds powerful Web3 apps for the emerging data economy. Founded in 2017, Ocean Protocol connects data providers and consumers, using blockchain technology. Ocean Protocol technology allows data to be shared, without compromising control or security for the data owner, while ensuring traceability, transparency, and trust for all stakeholders involved. Ocean allows data owners to monetize data while keeping control over their data assets. Ocean Protocol Foundation is based in Singapore.
About AllianceBlock
AllianceBlock is building the first globally compliant decentralized capital market. The AllianceBlock Protocol is a decentralized, blockchain-agnostic layer 2 that automates the process of converting any digital or crypto asset into a bankable product.
Incubated by three of Europe’s most prestigious incubators (Station F, L39, and Kickstart Innovation in Zurich) and led by a highly experienced team of ex-JP Morgan, Barclays, BNP Paribas, and Goldman Sachs investment bankers and quants, AllianceBlock is on the path to disrupting the $100 trillion securities market with its state-of-the-art, globally compliant decentralized capital market.