A Close Account Of The Inevitability Of Tokenized Data - Part 2

If data really is the new oil, the reasons for its popularity are evident. Read on for the continuation of the previous post; the conclusion is still to come.

Continuing From The Earlier Post

In Dixon's estimation, decentralisation offers an alternative. In short, blockchain-based tokenisation allows all users to participate in the financial upside of the network, effectively eliminating the distinction between network owners and network users. There is no distinct ownership class that needs to extract value from everyone else.

On a fundamental level, the problem he faced was what the token actually was. In almost every case, utility tokens were simply payment tokens: an alternative money good only for that one service. Their value therefore rested mainly on the speculation that they could achieve a monetary premium, transcending mere utility within that single network, and that this value could be sustained over time as the network grew large.

It is easy to understand why things came to be designed this way. For network builders, this sort of payment token allowed a non-dilutive form of capitalisation that was global and instantaneous. For retail buyers, it offered a chance to participate in risk capital in a way they had been denied by accreditation laws.

At the end of the day, however, the simple truth was that these blockchain tokens were never backed by anything other than dreams, and when the market for these dream coins finally crashed, many decided to throw out the token baby with the ICO bathwater. The better questions to ask instead were: what if the tokens in decentralised networks were backed not by dreams but by data? What if, instead of dream coins, there were data coins?

Is Data The New Oil?

If data is indeed the oil of the new economy, then in the context of any given digital application, data is where the value resides. Companies are paid to host it, platforms sell advertising against it, and users are effectively trading their data for reduced-price services.

In other words, data is an asset that can be tokenised and decentralised onto a public blockchain. It is not hard to imagine a future in which every meaningful piece of data is represented on a blockchain and controlled through a private key. Tying tokens explicitly to data creates a whole new set of options for how apps are built.
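As a rough illustration of the idea, a piece of data can be reduced to a content hash and bound to whoever holds the matching key. The sketch below is purely hypothetical; the names and structure are illustrative and do not reflect any specific blockchain's API.

```python
# Hypothetical sketch: a data asset is identified by its content hash
# and "ownership" is tied to a keypair. Names are illustrative only.
import hashlib
import secrets
from dataclasses import dataclass

@dataclass
class DataToken:
    content_hash: str   # identifies the underlying piece of data
    owner_pubkey: str   # whoever holds the matching private key controls it

def mint_data_token(data: bytes, owner_pubkey: str) -> DataToken:
    """Create a token that stands in for a piece of data on-chain."""
    content_hash = hashlib.sha256(data).hexdigest()
    return DataToken(content_hash=content_hash, owner_pubkey=owner_pubkey)

# Example: a user "mints" a token for a record they own.
private_key = secrets.token_hex(32)   # stand-in for a real keypair
public_key = hashlib.sha256(private_key.encode()).hexdigest()
token = mint_data_token(b'{"steps": 8214, "date": "2019-05-01"}', public_key)
print(token.content_hash[:16], token.owner_pubkey[:16])
```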

First, data tokenisation creates an opportunity for nodes in a decentralised hosting network, a decentralised alternative to AWS, to effectively speculate on the future value of the data in the applications they host, creating financial incentives beyond simple service provision. Third parties that crawl, query and access the data pay in the token representing that data, and those payments flow back to the miners who secure and store it, and to the developers who acquire, structure and label data that is valuable to third parties, especially machine-learning and AI-driven organisations.

Second, app builders can not only benefit from more fluid capitalisation through tokens, but also easily experiment with new ways of arranging value flows, such as cutting users in on the value of their data and allowing them to benefit from it directly.

Third and finally, users start to have a tangible sense of the value of their data. They can exert market pressure on platforms to include them in the upside, and exert more control over where and how their data is used.
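A toy sketch of the value flow described across these three points might look like the following. The split percentages and names are purely hypothetical assumptions, not any real protocol's rules.

```python
# Toy illustration: when a third party pays tokens to query a tokenised
# dataset, the fee might be split between the nodes that store it, the
# developers who curated it, and the users who produced it.
# The split below is a made-up assumption for illustration.

ACCESS_FEE_SPLIT = {
    "storage_nodes": 0.40,   # miners securing and storing the data
    "data_curators": 0.30,   # developers who acquire, structure and label it
    "data_owners":   0.30,   # the users whose data it is
}

def settle_access_fee(fee_in_tokens: float) -> dict:
    """Divide a query fee among the parties in the hypothetical split above."""
    return {party: fee_in_tokens * share for party, share in ACCESS_FEE_SPLIT.items()}

# Example: an ML company pays 100 tokens to crawl and query the dataset.
print(settle_access_fee(100.0))
# -> {'storage_nodes': 40.0, 'data_curators': 30.0, 'data_owners': 30.0}
```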

To Close The Discussion

Tokenised data therefore creates a market mechanism for redistributing the balance of power in technology networks without resorting to ham-fisted regulation like GDPR or, worse, the sort of break-up proposed by Warren. Even after the implosion of the ICO phenomenon, many like Fred Wilson believe that a shift to user control of data, facilitated by blockchains, is not just possible but inevitable.

Technology has historically swung from closed to open and back again, and we are currently in a closed phase, with centralised apps and services owning and controlling the vast majority of access to data. Decentralised p2p databases, in the form of public blockchains, open up and tokenise data in a way that disrupts how value is captured and created on the internet. Simply put, tokenised, open data limits the control data monopolies have over future innovation while ushering in a new era of computing.
