Creating a “Data Economy” in the Property Industry

Mete Varas
7 min read · Jan 19, 2020

As we moved from the analogue to the digital world, everything we do creates data. Every second we are interacting online is being stored in multiple places at the same time. What we search, browse, click, how we move the mouse along the screen, our keystroke cadence, what we pause on and what we choose to ignore all add up to a shockingly accurate description of who we are, data-wise. We have given away our privacy online, but we did get something in return. We are able to easily browse the most complicated set of information ever assembled, the internet, in a useful, even enjoyable way. This is thanks to the data economy that the internet has created.

We produce more data now than ever before, and the volume grows exponentially. In 2010, the world produced 1 zettabyte of data. A zettabyte is 1,000,000,000,000,000,000,000 bytes (that's 21 zeroes for those counting), or one trillion gigabytes. That's enough data to fill roughly 62 billion 16-gigabyte devices. In 2016, we produced 16 ZB, and by 2025 we will produce more than 160 ZB.
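For anyone who wants to check the arithmetic, a quick back-of-the-envelope in Python (using decimal, base-10 units) shows where those figures come from:

```python
# Back-of-the-envelope check on the data-volume figures above.
ZETTABYTE = 10**21   # bytes: a 1 followed by 21 zeroes
GIGABYTE = 10**9     # bytes

gigabytes_per_zb = ZETTABYTE / GIGABYTE            # one trillion gigabytes
devices_per_zb = ZETTABYTE / (16 * GIGABYTE)       # how many 16 GB devices it fills

print(f"1 ZB = {gigabytes_per_zb:,.0f} GB")
print(f"1 ZB fills about {devices_per_zb / 1e9:.1f} billion 16 GB devices")
```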

While data was once primarily a side-effect of successful business operations, it has become one of the most important commodities in every industry. Innovation is the only way to stay ahead of the competition, at least in the long run, and data is the lifeblood of successful innovation. Real estate is not immune to this new reality. Data is the main force behind the tech (r)evolution that we see in the property industry at the moment. I think we are already seeing that tech will be one of the main differentiators between the winners and losers in pretty much every real estate vertical.

Buildings themselves will start to become massive data creators, just like your computer or router. From construction to demolition, buildings continuously create data. There are sensors that can monitor concrete (yes, concrete) strength, temperature, and relative humidity. There are sensors that can cost-effectively measure heat, air quality, traffic, occupancy, you name it. This comes at a cost, but it can answer important, nagging questions for every building, such as "Which stairs and elevators are used most?" or "Which pumps overheat during peak times?"
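To make that concrete, here is a minimal, vendor-neutral sketch of what such sensor readings and a "which asset is busiest" query might look like. The field names and helper function are illustrative assumptions, not any particular building platform's API:

```python
from collections import Counter
from dataclasses import dataclass

# A hypothetical, vendor-neutral shape for a building sensor reading.
@dataclass
class SensorReading:
    sensor_id: str   # e.g. "e-b-door"
    asset: str       # the equipment or space being measured
    metric: str      # "occupancy", "temperature", "humidity", ...
    value: float
    timestamp: int   # Unix epoch seconds

def busiest_assets(readings: list[SensorReading], metric: str) -> list[tuple[str, int]]:
    """Rank assets by how many readings of a given metric they produced,
    a crude proxy for 'which stairs and elevators are used most'."""
    counts = Counter(r.asset for r in readings if r.metric == metric)
    return counts.most_common()

# Toy usage: three door-open events on elevator B, one on elevator A.
readings = [
    SensorReading("e-b-door", "elevator-B", "occupancy", 1, 1579000000),
    SensorReading("e-b-door", "elevator-B", "occupancy", 1, 1579000060),
    SensorReading("e-b-door", "elevator-B", "occupancy", 1, 1579000120),
    SensorReading("e-a-door", "elevator-A", "occupancy", 1, 1579000180),
]
print(busiest_assets(readings, "occupancy"))  # [('elevator-B', 3), ('elevator-A', 1)]
```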

On the city level, the size of the data created collectively is almost unimaginable: mapping inbound and outbound movement within the city, day and night; analysing each neighbourhood to detect crime, track utility consumption, garbage and recycling, anything really, as long as it falls within our social norms for privacy versus safety and convenience.

As tech strengthens its roots across the property industry, the importance and value of data are now starting to be fully understood. In the early days, it started with solutions that collected data sets in and around the building and turned them into dashboards. The technology could answer what had happened; it was past tense. Then came AI tools that help companies run predictive and prescriptive analyses. These started answering what we could, or even should, do to make our buildings better.

But to keep this flywheel of innovation spinning, algorithms need snowballing data. The more data available, and the more varied and robust it is, the more accurate AI models get. Algorithms demand ever-larger data sets to get better. According to AI researcher Alexander Wissner-Gross, "Many major AI breakthroughs have been constrained by the availability of high-quality training sets and not by algorithmic advances. Data, not algorithms, might be the key limiting factor to the development of human-level artificial intelligence." He adds, "AI advances six times faster when data is available."


We have reached a critical stage, even a bottleneck, that might decelerate the pace of transformation and hold back companies, both startups and incumbents. Software needs more data, and a continuing supply of it, while data providers are rarely fully satisfied with the pricing, privacy, and trust arrangements on offer.

The problem is simple, but sometimes the simplest problems are the hardest to solve. No one wants to share their proprietary data, and they have every right to feel that way. Once data is shared, the owner loses control over it. Without measures that provide a semblance of control, an audit trail on usage, and fair compensation schemes, the data will remain locked up. We keep hearing similar messages at PropTech conferences worldwide. But we need to do more than talk (complain or defend, depending on how much valuable data you are sitting on). We desperately need a solution, a framework for data sharing.

Traditional centralized data exchanges and recent crowdsourced efforts are limited. Hosting is always an issue: data needs to be hosted at the exchange, which is unacceptable for many data providers. As an alternative, data could be hosted in situ (on-site) at the data provider, but then the options for data consumers are limited. Cost is also usually an issue, since data exchanges finance themselves via transaction fees, commissions, and services, adding friction. The most important sticking point, though, might be control. Without audit trails to confirm that their licensing terms are being adhered to, data providers have no control over how the data is used once it is handed to the exchange.

There are ways to create a market for something like data. Asset-backed securities, or ABS, are one example. They are bonds or notes backed by financial assets, typically receivables other than mortgage loans, such as credit card receivables, auto loans, manufactured-housing contracts and home-equity loans. That is one financial instrument the industry is very familiar with. (After the 2008 crisis, the whole world knows what it is, but that is another subject.) Perhaps a better example is the music licensing industry. The IP licensing industry is finding new and inventive ways to reduce friction (lawyers) and strengthen the control that creators have over their work. What is a song, a book, an image or a video if not a valuable and sought-after piece of data?

If we can create a system that allows data to be priced and traded, there is no question that it will gradually become the biggest asset class in the world. One possible solution that I am excited about, because of the way it enables a true and functioning data economy, is Ocean Protocol. Ocean Protocol is a technical and governance framework that serves the needs of all stakeholders in the data ecosystem. It is designed to scale to the size the property industry would need. It uses blockchain technology to allow data to be shared and sold in a safe, secure and transparent manner, and gives data providers full control over how they publish and share data. Marketplaces and intermediaries can provide tools for discovery and value-added services to data consumers.

The Ocean Protocol Foundation is a Singapore-based non-profit. Its mandate is to ensure open access to the protocol and platform, provide long-term governance for the community and ecosystem, and act as custodian of the funds raised. Any enterprise, government, group or data custodian that possesses valuable but underutilized data is a candidate to become a data provider. There is already a market of data consumers that need data for analysis and for training AI/ML models, one that will likely grow exponentially. There is also a built-in community of legislators, oversight agencies and internet advocates that wish to monitor and contribute their input to shape how data is used. Developers are encouraged to build value-added services or marketplaces on top of Ocean Protocol.

The network stores data on-premise, in the cloud, or on decentralized networks such as IPFS, Swarm or Storj. Once data is published, it can be sold on a marketplace under a number of different pricing models. Providers have complete control over who can buy the data and what they can use it for. Pricing can even be set at the protocol level to prevent vendor lock-in.
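As a rough illustration of what a listing with that kind of provider control might look like, here is a hypothetical, simplified data structure. The field names and the grant_access method are assumptions made for the sake of the example, not Ocean Protocol's actual SDK:

```python
from dataclasses import dataclass, field

# Illustrative only: a generic shape for a marketplace data listing.
@dataclass
class DataListing:
    dataset_id: str
    provider: str
    storage_url: str                   # on-premise, cloud, IPFS, Storj, ...
    price_per_access: float            # in whatever unit the marketplace settles in
    licence_terms: str                 # what the buyer may do with the data
    allowed_buyers: set[str] = field(default_factory=set)   # empty = open to all
    access_log: list[str] = field(default_factory=list)     # audit trail of who accessed

    def grant_access(self, buyer: str) -> bool:
        """Record an access request; refuse buyers outside the allow-list."""
        if self.allowed_buyers and buyer not in self.allowed_buyers:
            return False
        self.access_log.append(buyer)
        return True

# Toy usage: the data never has to move to the exchange, and every access is logged.
listing = DataListing(
    dataset_id="ds-001",
    provider="Example REIT",
    storage_url="ipfs://example-cid",
    price_per_access=10.0,
    licence_terms="analytics only, no resale",
    allowed_buyers={"approved-insurer"},
)
print(listing.grant_access("random-broker"))     # False: not on the allow-list
print(listing.grant_access("approved-insurer"))  # True, and it lands in the audit trail
```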

This type of ecosystem will help people discover the data they need, from anywhere on Earth. Pricing and usage guides will be transparent, and all data sets and providers can be reviewed. You could even let people train their AI models on data without giving away the data, if the right protocols were in place.
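That last idea is essentially the "bring the compute to the data" pattern: the consumer supplies code, the provider runs it next to the data, and only the result crosses the boundary. Here is a minimal sketch of the principle, with made-up data and a made-up job function:

```python
from typing import Any, Callable

def compute_to_data(private_rows: list[dict], job: Callable[[list[dict]], Any]) -> Any:
    """Run a consumer-supplied job where the data lives and return only its output.
    In a real system this would run in a sandbox, with the provider approving
    which jobs are allowed; here we just call it directly."""
    return job(private_rows)

# Consumer-side job: average rent per square metre, without ever seeing the rows.
def average_rent_per_sqm(rows: list[dict]) -> float:
    return sum(r["rent"] / r["sqm"] for r in rows) / len(rows)

# Provider-side private data (toy example).
private_listings = [
    {"rent": 1200, "sqm": 60},
    {"rent": 1500, "sqm": 75},
]
print(compute_to_data(private_listings, average_rent_per_sqm))  # 20.0
```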

In the property industry, Ocean Protocol is already being used. They are working with Emporio Analytics to build better shopper experiences: by building a data marketplace on Ocean, Emporio Analytics will enable retailers and brands to work closely together on data-driven shopper marketing. WPP, a creative transformation agency, has collaborated with Ocean Protocol to protect consumer privacy in media buying.

It’s fascinating to see such diverse and creative initiatives coming along from different parts of the world. But it does seem like, while data sharing has already started, it’s just not evenly distributed yet. We need to do a better job of explaining to all of the stakeholders in the property industry (everyone in the world basically) that it is possible to have a functioning data economy. If we use technology to incentivize the sharing of data and reward its use, we will see an even faster pace of innovation. It is wrong to assume that people with valuable data should give it away for free. It is smarter to build a system that lets the data be bought and sold like the incredibly valuable asset that it is.

— — — — — — — — — — —

Mete Varas is one of the thought leaders and influencers in PropTech globally. He has been a prominent entrepreneur in technology businesses, involved in numerous ventures as a founder, executive, investor, mentor or advisor. He is the founder of the EurAsia Proptech Initiative. He was a member of the founding teams of Zingat and REIDIN, a property marketplace and a property data analytics company, where he was responsible for innovation and European operations respectively. Mete Varas is also a co-founder of Shopi, a retail proptech company.

This article was previously published on Propmodo, January 2, 2020. https://www.propmodo.com/creating-a-data-economy-in-the-property-industry/
