2021 Technology trend review, part 1: Blockchain, Cloud, Open Source

It’s that time of year when writers and audiences alike feel obliged to look back on the year that just ended and forecast the one that’s just starting. We figured the best way to chip in our fair share would be to revisit what we identified last year as the 5 key technology trends for the roaring 20s, see where things stand, and try to offer some educated guesses as to where things may be headed.

To reiterate the context from last year: the original roaring 20s were 100 years ago, but we may be about to see a new version of them. And if it’s going to be “a period of economic prosperity with a distinctive cultural edge,” then both the economic and the cultural aspects will be all about data.

Data is shaping a new culture, bringing about a new way of doing business, a new way of decision making, new applications and infrastructure, and is an enabler for the transition to AI. Data is the focal point of our coverage on Big on Data, so following up on fellow columnists Andrew Brust and Tony Baer’s predictions, here’s our own round of things to keep an eye on in the 2020s.

Last year we identified blockchain, cloud, open source, artificial intelligence and knowledge graphs as the 5 key technological drivers for the 2020s. Although we did not anticipate the kind of year 2020 would turn out to be, it looks like our predictions may not have been entirely off track. Let’s recap, starting today with blockchain, cloud, and open source, and following up with artificial intelligence and knowledge graphs, plus an honorable mention to Covid-19 related technological developments, in the coming days.

Blockchain’s DeFi-ning moment?

The key takeaway from last year’s review of blockchain technology and its ecosystem was that the potential is there, but there’s still a long way to go, both on the technical and on the organizational and operational side of things. We posit this still holds, but as always, the devil is in the details, so let’s drill down.

On the technical front, what is arguably the most significant development in 2020 materialized almost at the year’s end: the Ethereum 2.0 Beacon chain went live. Let’s take a step back, and explain what this means, and why it’s important.

Ethereum is a blockchain-based network like Bitcoin. Unlike Bitcoin, Ethereum’s goal is to go beyond being a digital currency and become a substrate for the development of all sorts of decentralized applications, or dApps. And although the value of Ether, the Ethereum network’s token, has been growing throughout 2020, the token can actually be used to run applications, as opposed to sitting idle in digital wallets.

Ethereum does, however, share Bitcoin’s decentralized architecture, which imposes the need for cryptographic guarantees and secure decentralized protocols to ensure the validity of transactions on the network. It has been a long-stated goal for Ethereum to break away from the way Bitcoin does this, based on the concept of proof-of-work, and transition to a different approach, called proof-of-stake.
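To get a rough feel for the difference, here is a toy sketch in Python. It is purely illustrative and bears no resemblance to Ethereum’s actual consensus code: proof-of-work selects block producers by burning compute on a hash puzzle, while proof-of-stake selects validators with probability proportional to the tokens they have locked up as stake.

```python
# Toy illustration only -- not how Bitcoin or Ethereum actually implement consensus.
import hashlib
import random

def proof_of_work(block_data: str, difficulty: int = 4) -> int:
    """Grind nonces until the block hash starts with `difficulty` leading zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce  # finding this nonce is the "work"
        nonce += 1

def proof_of_stake(stakes: dict) -> str:
    """Pick the next validator with probability proportional to its stake."""
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return random.choices(validators, weights=weights, k=1)[0]

print("PoW nonce:", proof_of_work("block #1"))
print("PoS validator:", proof_of_stake({"alice": 32.0, "bob": 64.0, "carol": 16.0}))
```

The practical consequence is that validating the chain no longer requires an arms race of computing power, which is the main argument for proof-of-stake’s lower energy footprint and higher throughput.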


From the Beacon Chain, onwards: Phase 0 launches the proof-of-stake network, with multiple technical additions set to follow. (Trenton Van Epps)

December 2020 was when the so-called Beacon chain was released, after years of research and development. Beacon aims to be the backbone of a new Ethereum blockchain, claiming to rival established payment networks such as PayPal and Visa in terms of processing speed, while beating them in terms of transparency and payment finality.

That’s a tall order. No less so if we take into account that there was significant investor pressure to reach that milestone, and that Ethereum needs to undergo an in-flight transition to get to the new modus operandi, which is always tricky.

That does not seem to have stopped the so-called DeFi wave, however, which is largely based on Ethereum. DeFi stands for Decentralized Finance. In short, DeFi’s promise is to be able to cut out middlemen from all kinds of transactions. Similar to how 2017 was the year of ICOs, 2020 was the year of DeFi. Lots of growth, some of it warranted, although oftentimes the “decentralized” part was more of a euphemism, and governance remains a sore spot.

Another key development just in: a U.S. regulator now allows banks to access public blockchains such as Bitcoin or Ethereum, hold coins from these rails directly or on behalf of clients, and run a node for a public blockchain. In other words, it allows them to get actively involved. We expect this, along with the ongoing development of Central Bank Digital Currencies (CBDCs), to boost interest in blockchain.

Cloud, Kubernetes, and GraphQL

In a way, there’s not much left to be said about the transition to the cloud. Yes, it is happening, and yes, the Covid crisis has, predictably, accelerated it. Yes, there are different ways to use cloud infrastructure (private, public, hybrid, and multi-cloud), each with their own strengths and weaknesses depending on where each organization stands and what its goals are. Yes, AWS is leading, Azure is growing, Google Cloud is third, and everyone else is still trailing.

We consider this common knowledge, as it has been covered extensively, both here on ZDNet and at large. Did 2020 bring anything new, or did it make us wiser in some way? Well, perhaps. One of the talking points in the 2020 discussion about cloud was data gravity, and the viability and consequences of having databases and data management platforms run in multi-cloud environments.

At the same time, the ongoing trend of database as a service — fully hosted and managed databases running in the cloud, offered typically but not exclusively by database vendors themselves — showed no signs of slowing down. Quite the opposite. An interesting fact is that the majority of database vendors making the transition to the cloud do this using Kubernetes.

The reason is obvious: portability. In reality, that’s another euphemism, as using Kubernetes for data and related workloads in the cloud is hard, and only brings a bare minimum of portability. On the bright side for users, it’s the vendors who do the heavy lifting. According to Percona CEO Peter Zaitsev, this is also what is driving the adoption of Kubernetes, in an indirect way. Percona’s 2020 survey on database adoption confirms both trends.
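To give a sense of what that heavy lifting starts from, here is a minimal, hypothetical sketch using the official Kubernetes Python client: a three-replica StatefulSet running a containerized database. All names and the container image are placeholders; a production database-as-a-service layer adds operators, persistent storage, backups, upgrades and failover on top of this.

```python
# Minimal sketch of deploying a database on Kubernetes via the Python client.
# Names and image are placeholders; real DBaaS platforms layer storage classes,
# operators, backups and failover on top of a skeleton like this.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig, e.g. for a managed cluster

statefulset = client.V1StatefulSet(
    metadata=client.V1ObjectMeta(name="demo-db"),
    spec=client.V1StatefulSetSpec(
        service_name="demo-db",
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "demo-db"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo-db"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="db",
                        image="postgres:13",  # stand-in image; any containerized database
                        ports=[client.V1ContainerPort(container_port=5432)],
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_stateful_set(namespace="default", body=statefulset)
```

The same skeleton runs on any conformant Kubernetes cluster, which is where the portability claim comes from; everything stateful around it, from storage to backups, is where that claim starts to break down.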


Navigating the cloud is well understood by now. The cloud’s side effects on application and data architecture will be long-lasting. Image: DataStax

One more thing that can be considered a side effect of cloud adoption, and the architectural changes it entails, is the growing adoption of GraphQL as an API to access databases and data management platforms. In addition to Dgraph, a database built around a GraphQL variant, an increasing number of databases are adopting GraphQL as a first-class citizen when it comes to data access.

Solutions like FaunaDB, MongoDB, Ontotext, Stardog and Yugabyte are among them, with varying levels of support and maturity. Developer-friendly as GraphQL may be, however, it suffers from some drawbacks when used as a database access layer, as GraphQL is not SQL.
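As a hedged sketch of what this looks like in practice, and not tied to any particular vendor, here is a GraphQL query sent over HTTP from Python. The endpoint URL and the books schema are made up for illustration; the equivalent SQL would be a one-line SELECT with a WHERE clause.

```python
# Hypothetical example: querying a database through a GraphQL endpoint.
# The URL and the "books" schema are illustrative, not from any specific product.
import requests

query = """
query BooksByAuthor($author: String!) {
  books(filter: { author: { eq: $author } }) {
    title
    year
  }
}
"""

response = requests.post(
    "https://example.com/graphql",  # placeholder endpoint
    json={"query": query, "variables": {"author": "Ursula K. Le Guin"}},
)
print(response.json())
```

Simple lookups like this map cleanly; joins, aggregations and ad hoc analytics are where the gap with SQL shows, which brings us to the specification itself.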

The GraphQL specification is rather thin and vague when it comes to things such as queries, which means users can’t easily express complex queries, and implementations may vary across vendors. Cassandra, which has recently joined the ranks of databases adding GraphQL support, has done so via a new API layer called Stargate, which wraps and extends GraphQL for database operations. Could we see more of this in the future? Will there be a shift in the direction of the GraphQL specification?

Open source is winning, open source creators are losing

Open source is winning, in databases and beyond. Gartner predicts that by 2022, more than 70% of new in-house applications will be developed on an open source database, and 50% of existing proprietary relational database instances will have been converted or be in the process of converting.

That was our opener for 2020, and if anything, it looks like the trend has accelerated. Open source use went up while the economy went down, and open source jobs are hotter than ever. Open source software is a boon for developers who use it, as it lowers the barrier to entry, and makes their skills transferable. But what about developers who create the software?

They get the raw end of the deal, it would seem. The reality is that in the majority of open source software above a certain threshold of complexity, a core team of a few people does most of the work. This empirical fact is backed up by analysis of GitHub data.

We highlighted this theme in early 2020, following up on the New York Times article on the relationship between AWS and commercial open source vendors. Wired followed up with another article highlighting the ordeal of open source creators. Salvatore Sanfilippo, Redis’ “benevolent dictator”, stepping down from his role is another incident in a long chain of open source creator burnout.


The digital information highways on which 21st century business relies are built on open source. Yet open source creators get the raw end of the deal.

metamorworks, Getty Images/iStockphoto

Prominent open source software creators like Andre Staltz have shown how little of the generated value creators get. The cynical answer to that would be that open source is not a business model. But the repercussions of not having open source would be hard to imagine. Beyond fairness, open source users themselves would suffer from a collapse of the ecosystem. AWS, and the cloud at large, is built on open source too. So what are the alternatives?

Careful use of open source licenses to avoid exploitation by vendors who do not give back. Data-driven business models that balance makers and takers. Ethical software and Fair software, i.e. rethinking open source licenses. These are some of the proposals people have come up with. It does not look like 2020 was exactly a breakout year for any of them, however.

On the other hand, we have seen some renewed pragmatism in open source. On the commercial open source vendor side of things, it has been suggested that what developers really care about is availability, not terms of use: a freely accessible API, not an open source product.

DataStax is a vendor exemplifying this change of course, trying to strike a balance between making amends with AWS and reconnecting with the community. AWS, for its part, broke new ground by enacting a revenue-share deal with an open source vendor, Grafana. We don’t really know how much the public calling out influenced this decision, but we see it as a first step in the right direction. More need to follow.

Check back in a couple of days for our 2021 Technology trend review, part 2. 
