© 2022 CyberNews - Latest tech news,
product reviews, and analyses.


Henri Pihkala, Streamr: “When released, digital assets are no longer yours – they are at someone else’s mercy”


These days, the need for quality data-sharing solutions is rapidly increasing. However, creating such solutions presents more challenges than building simple messaging apps.

In a publish/subscribe system, each message carries a data payload, and optional metadata can be attached. The pattern is most commonly used for service-to-service communication (two services or systems need to exchange data), asynchronous work queues (a service tracks a backlog of actions to perform), and state change notifications (a service broadcasts updates about the resources it manages).
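
The pattern above can be sketched with a minimal in-process broker. This is illustrative Python only, not Streamr's actual API; all names here (`Broker`, `Message`, the topic string) are hypothetical:

```python
from dataclasses import dataclass, field
from collections import defaultdict
from typing import Any, Callable

@dataclass
class Message:
    payload: Any                                   # the data itself
    metadata: dict = field(default_factory=dict)   # optional: timestamps, ids, units, ...

class Broker:
    """Tiny in-process pub/sub broker (sketch, not a network implementation)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Message], None]):
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, msg: Message):
        # Deliver the message to every handler subscribed to this topic.
        for handler in self._subscribers[topic]:
            handler(msg)

broker = Broker()
received = []
broker.subscribe("sensor/temperature", received.append)
broker.publish("sensor/temperature", Message(payload=21.5, metadata={"unit": "C"}))
```

In a real network the broker role is distributed across nodes, but the publish/subscribe contract seen by applications stays the same.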

For those who haven’t heard of data-sharing networks, the topic might seem a little tough and confusing. Therefore, Cybernews got in touch with Henri Pihkala, CEO of Streamr – a decentralized P2P real-time data pub/sub network.

Would you like to share what the journey has been like for Streamr?

The Streamr project is based in “Crypto Valley” aka Zug, Switzerland, and was started by me and a few other Finnish guys. Before devoting my career to decentralization and Web 3.0 in 2016, I started companies in algorithmic trading and cloud big data. Those adventures taught me about the value of real-time data and gave me insights into what kind of novel data infrastructure will be needed in the future.

In 2017, the project was crowdfunded via an ICO to deliver Streamr Network, a decentralized data network, motivated by the belief that traditional, centralized data infrastructure cannot meet the real-time data sharing needs of the future’s data economy. The core of the Streamr protocol is a peer-to-peer network, run by a community of nodes, which is scalable, robust, permissionless, and secure.

On top of the network sits the application layer, which includes a data marketplace, and a data crowdsourcing framework, along with an ecosystem of apps using the network to communicate and propagate real-time messages over the internet.

We are currently nearing our final roadmap milestone, Tatum, which will include the token incentive layer and marks version 1.0 of the Streamr Network.

Can you tell us a little bit about your data network? What are its key features?

The Streamr Network is a publish/subscribe messaging service that enables the real-time distribution of any real-time data source to any number of subscribers.

What makes the network unique is that it is decentralized and operates entirely peer-to-peer. Similar to cloud services, you get on-demand scalability, minimal up-front investment, and benefit from economies of scale. But unlike cloud services, there's no vendor lock-in, no monopolies, no proprietary code, and no need to trust a third party with your data.

The Network is run by its community of users who run nodes to provide bandwidth to the network in exchange for DATA tokens. This creates a permissionless network for real-time data. Use cases include generalized messaging between humans, apps, machines, and any IoT objects, open data sharing at scale, and data monetization without middlemen.

It is evident that open source is an important part of Streamr. Would you like to share more about your vision?

For sure! The Streamr Network operates on the principle of trustlessness, which means users don’t have to take the word of the Streamr team at face value or hope that there are no hidden surprises — the code base is open source, so you can see it for yourself. This is a key difference from centralized providers, and we believe it is a welcome feature of the data economy.

Towards the end of the Streamr roadmap, the Network will be fully decentralized with all components run by community members and users, not us. So open sourcing the project is important to allow developers to understand what they are running on their systems and how they can contribute to maintaining the code base. Leveraging this crowd wisdom of users surely comes with advantages from network effects and adaptability in identifying fixes and opportunities. In our roadmap, there’s really no competitive edge in secrecy.

Have the recent global events encouraged you to integrate any new vital features?

Not as such, as for example censorship resistance and end-to-end encryption are already key properties of the protocol. Decentralized communication protocols like Streamr are important for all future scenarios: During times of peace and prosperity, scale and the enablement of new business models become important. During times of crisis and instability, such protocols promote freedom of speech and secure communications.

What are some of the worst mistakes companies make when handling large amounts of data?

Banks keep money safe, and companies keep data safe—right? Not quite. Gathering data is a key component of many business models, but secure custody of data is exactly as difficult as secure custody of money. Gathering lots of data in a single place is the worst mistake companies make, creating a honeypot for hackers and opening the door for intentional or accidental abuse of the data. Almost all companies make this mistake, as they need data for their business and centralized technologies can’t avoid taking custody of data.

In a decentralized or Web3 stack, data is in self-custody via cryptography. Consequently, there is no honeypot, because there is no single key that unlocks all the data. Instead, every user holds the key to their own data, and controls who they share the data with.
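
To make the per-user-key idea concrete, here is a toy sketch in Python. The “cipher” below is a SHA-256-based XOR keystream used purely for illustration — it is NOT production cryptography (real systems use vetted AEAD schemes); the point is only that each user holds their own key, so no single master key unlocks everyone's data:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustrative only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, data: bytes) -> bytes:
    # XOR the data with a key-derived stream; same call decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

unseal = seal  # XOR stream cipher: sealing twice restores the plaintext

# Each user generates and holds their own key — there is no honeypot.
alice_key = secrets.token_bytes(32)
ciphertext = seal(alice_key, b"alice's private data")
```

Only the holder of `alice_key` can recover the plaintext; a different user's key yields garbage, which is the self-custody property described above.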

Obviously, it’s neither fast nor easy for companies to transform from “hoard as much data as possible” to “store as little data as possible”. Technology helps, regulation helps, and new business models help, but the change is slow and also requires people to be ready to take control of their digital assets. It took decades to educate people about how to select secure passwords. How long will it take to learn how to keep private keys safe?

What achievements and challenges do you expect to see in the Web3 landscape in the upcoming years?

It may not feel like it, but we are still early in the Web3 space. With this comes challenges of scaling and market fit, identifying the best collaborators and filtering out the noise, hiring talented people with relevant experience, and of course unpredictability due to the general uncharted territory of what we are collectively building towards.

A more specific technical challenge in the Web3 space is the real-time component. We’re so used to various real-time features in traditional applications that we almost take them for granted. Take messaging apps as an example, which allow you to communicate with thousands of people instantly, see that someone is typing before they even send the message, or see who is online right now.

These real-time features are not easy to implement at scale in the centralized world, and in the decentralized world, it’s even harder. That’s why we don’t see these kinds of features in dApps today. But Streamr enables them and much, much more, and is capable of serving as the real-time data sharing backbone of the future internet.
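
The presence feature mentioned above (“see who is online right now”) maps naturally onto pub/sub: clients publish join/leave events to a room's topic, and every subscriber folds them into a local view. A minimal sketch in Python, with all names hypothetical:

```python
from collections import defaultdict

class PresenceTracker:
    """Derives 'who is online' per room from a stream of presence events."""
    def __init__(self):
        self.online = defaultdict(set)

    def handle(self, room: str, event: dict):
        # Events would arrive as pub/sub messages on the room's topic.
        user, kind = event["user"], event["type"]
        if kind == "join":
            self.online[room].add(user)
        elif kind == "leave":
            self.online[room].discard(user)

tracker = PresenceTracker()
tracker.handle("dev-chat", {"user": "alice", "type": "join"})
tracker.handle("dev-chat", {"user": "bob", "type": "join"})
tracker.handle("dev-chat", {"user": "alice", "type": "leave"})
# tracker.online["dev-chat"] is now {"bob"}
```

The hard part in a decentralized setting is not this fold, but propagating the events quickly and reliably across a peer-to-peer network — which is the layer Streamr provides.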

What predictions do you have for the future of the data economy?

The amount of data is increasing all the time and will continue to do so. People’s lives will become more digitally native, working remotely, and exploring metaverse spaces. More devices will become connected — not only phones, but cars, elevators, fridges, and more. All of this will increase the value of data. In the current data economy model, the parties who benefit most from this value are the tech giants who gather data via their platforms.

This is not a good direction, so thankfully a number of projects are working to broaden the data economy with decentralization, blockchain, and cryptography to ensure that not all data ends up in the same place. This Web3 data infrastructure will be layered — there won’t be one protocol to rule them all. Streamr is building the real-time component, and there will be others for storing or sharing static data, as well as monetization elements. This will be a diverse ecosystem of ecosystems, enabling new business models rather than today’s dominant advertising-driven model.

In this age of ever-evolving technology, what do you think are the key security practices both businesses and individuals should adopt?

As many new people and companies curiously venture into the world of cryptocurrencies, NFTs, data, and other digital assets, it’s worth remembering the old saying: “not your keys, not your coins”. The same applies to data. When you release custody of your digital assets or data to someone, they’re not really yours anymore, they are at someone else’s mercy.

People should understand this and prefer new types of applications and systems, such as blockchains and other decentralized protocols, that allow people to retain control of their assets. Of course, with this great power comes great responsibility, as discussed above.

Would you like to share what’s next for Streamr?

While the network itself is already up and running, the tokenomics are currently in development and will be fully introduced with the Tatum milestone, expected in late 2022 to early 2023.

Currently, the payments to nodes come from a temporary stand-in process limited to predefined streams, but once Tatum is implemented, any stream can be incentivized, and therefore an open market for “bandwidth mining” will be created. There’s a lot to do for the dev teams to reach this milestone, as we’re really building something quite unique from scratch, but I’m pleased to say that we’re currently on schedule.

In the next few months, things to watch out for include the Streamr Chat app — a pilot to allow decentralized group wallet chats running on the Streamr Network — and some new developments from our ecosystem of builders. Make sure to join us on Discord and give us a follow on Twitter for the latest updates and developments!


