Beware the Dangers of the Decentralized Web

The Decentralized Web movement is either woefully naïve or a front for further organized crime activity

Tim Berners-Lee, the famed inventor of the World Wide Web, has a new project: the Decentralized Web.

His thinking is that great Internet powers like Facebook and Google have largely taken over the egalitarian, decentralized Web he invented. To fix this problem, we need a new set of protocols that will disintermediate such centralized control, returning it to the people – ‘re-decentralizing’ the Web, as it were.

Berners-Lee’s motives are commendable, to be sure – but there’s just one problem: it will never work. Furthermore, as a figure from the Internet’s formative years, he should realize that.

The sad fact is that there have been decades of attempts at decentralized models of global network communications – and every single one of them has ended up as a medium for criminal activity that dominates any altruistic uses of the technology.

If it follows the course it’s currently taking, the Decentralized Web will fare no better. And yet, the current state of the Internet is unquestionably fraught with problems. There’s got to be a better answer.

Understanding Decentralization

Depending on the context, the notion of ‘decentralized’ is often conflated with ‘distributed.’ Today, decentralization is an important characteristic of permissionless blockchain platforms like Bitcoin – and in this sense, decentralization refers to the lack of a single point of control. In other words, no one is in charge of a decentralized network.

Decentralized architectures are inherently distributed, but the converse is not true. The open source Cassandra database, for example, is inherently distributed, but it only works because each cluster is under centralized control – someone is in charge of it.

The Internet, and by extension the World Wide Web, were originally both decentralized and distributed, as anybody could stand up a web server anywhere they liked. Then Google became the predominant search engine. It didn’t actually control the Web, but it increasingly controlled who would access which pages – which amounted to much the same thing.

Today, Google and Facebook control the majority of the online ad market, while Amazon dominates ecommerce. Which ads you see and which products you buy – and to an appalling extent, what opinions you hold – depend upon these three goliaths.

The Decentralized Web seeks to counteract such centralized power by returning control over web-based content to individuals, via protocols that identify content by the content itself rather than by the URL where it happens to live.
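To make ‘identify the content itself’ concrete, here is a minimal sketch of content addressing in Python. The SHA-256 digest and the content_id helper are illustrative stand-ins, not the identifier scheme of any particular Decentralized Web protocol; the point is simply that the identifier is derived from the bytes rather than from a server’s address.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Derive an identifier from the content itself (an illustrative
    stand-in for the content-addressing schemes real protocols use)."""
    return hashlib.sha256(data).hexdigest()

page = b"<html><body>Hello, decentralized world</body></html>"
print(content_id(page))
# Any node holding bytes that hash to this identifier can serve the page;
# there is no single authoritative URL, and hence no single point of control.
```

Because the identifier travels with the content rather than pointing at a particular server, no single company need sit between publishers and readers.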

The approach makes sense on its face, but it has at least one fundamental flaw. If we look at the history of decentralized content, that flaw will become apparent.

Decentralization Before the Web

Even before the web was a twinkle in Tim Berners-Lee’s eye, we faced a battle between centralized and decentralized content distribution – not over the Internet, but over dialup modem links.

Dialup services like CompuServe, Prodigy, and AOL offered the benefits of centralized services, including curated, legal content and advertising.

In contrast, bulletin board systems (BBSs) provided the decentralized counterpoint. Anybody with a modem could stand up a BBS, simply by setting up a computer to answer the phone and hand incoming calls to BBS software running on that machine.

The original attraction of BBSs was their eponymous bulletin boards – simple, text-based shared notices that people could read and update. However, as modem speeds increased, people increasingly used BBSs to host binary files like images and software.

Since such binaries took far longer to download than the text-based bulletin board content, BBS phone lines soon became overwhelmed, as a single downloader might monopolize a line for hours at a time. To survive, BBSs had to scale up, adding numerous phone lines, modems, and computers, and coming up with ways to charge by quantity of content instead of a simple monthly subscription fee.

In other words, BBS economics had shifted. Providers had to charge more to stay in business, which meant they had to provide premium content – in the form of pornographic images and pirated software.

Before the Clinton administration relaxed the enforcement of obscenity laws in the 1990s, thus shifting the now-legal pornography business to the new World Wide Web, BBSs were one of the two most convenient places to obtain illicit hardcore porn – although the download speeds of the day limited it to poor-quality images.

The Rise of Usenet

The other place to get your porn, of course, was Usenet. Usenet provided a large number of hierarchically organized newsgroups – what we’re more likely to call forums today. Every hobby had a newsgroup, from roller coaster enthusiasts to model train aficionados.

Dating from the early 1980s, Usenet originally allowed Unix systems to exchange messages with one another over dialup modem links, using the UUCP protocol.

As modem speeds improved and Internet service providers (ISPs) began to offer consumer Internet access, ISPs also took to hosting Usenet servers, providing access either as part of their regular fee or, sometimes, at a premium.

In particular, ISPs would often charge a premium for groups dedicated to hosting binaries. Slowing adoption was the fact that uploading binaries to a Usenet newsgroup was a laborious process involving encoding and segmenting the files – but regardless, Usenet took over the porn distribution business from BBSs.
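For context on why posting binaries was so laborious: newsgroups carried only text, so a binary had to be encoded into printable characters (uuencode, and later yEnc, were the usual schemes) and split across multiple numbered messages. The sketch below illustrates that encode-and-split step; it uses base64 and a made-up size limit purely for illustration, not the encodings or limits actual newsreaders used.

```python
import base64

MAX_LINES_PER_POST = 5000  # illustrative limit, not a Usenet standard

def encode_and_segment(data: bytes, filename: str) -> list[str]:
    """Encode a binary as printable text and split it into numbered
    message bodies suitable for posting to a text-only newsgroup."""
    lines = base64.encodebytes(data).decode("ascii").splitlines()
    chunks = [lines[i:i + MAX_LINES_PER_POST]
              for i in range(0, len(lines), MAX_LINES_PER_POST)]
    total = len(chunks)
    return [f"{filename} [{part}/{total}]\n" + "\n".join(chunk)
            for part, chunk in enumerate(chunks, start=1)]
```

Readers then had to collect every part, reassemble them in order, and decode the result, which is exactly the friction described above.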

As the Web took off in the mid-1990s, the binaries remaining on Usenet took on a darker character, centering on illegal content such as pirated software and child pornography, as the other newsgroups filled with spam.

Eventually law enforcement took notice, and a cat-and-mouse game ensued. Criminals would move their wares from one unsuspecting newsgroup to another – all the while taking advantage of the inherently decentralized nature of Usenet for cover.

Peer-to-Peer Brings a Torrent

Rapidly increasing Internet speeds soon changed the game toward the end of the century, as it finally became practical to download video content and other large files. The predominant decentralized offering during this era was BitTorrent, a peer-to-peer (P2P) file sharing protocol that enabled anyone to share files with anyone else.
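To give a flavor of how such peer-to-peer sharing works, here is a minimal sketch of the core idea in Python. The piece size and helper names are illustrative, not BitTorrent’s actual wire format: a file is split into fixed-size pieces, each piece is hashed, and a downloader that knows the expected hashes can fetch pieces from many untrusted peers at once and verify each one independently, with no central server required.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # illustrative; real torrents choose their own piece size

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[str]:
    """Split content into fixed-size pieces and hash each one, so pieces
    fetched from untrusted peers can be verified before being accepted."""
    return [hashlib.sha1(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def verify_piece(piece: bytes, expected_hash: str) -> bool:
    """Check one downloaded piece against the hash published in advance."""
    return hashlib.sha1(piece).hexdigest() == expected_hash
```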

By 2004, BitTorrent was responsible for one quarter of all Internet traffic, according to Wikipedia – mostly pirated videos, pirated software (aka ‘warez’), child pornography, and other illegal content catering to a wide range of tastes.

BitTorrent, however, provided its users with neither security nor anonymity, nor did it offer a payment infrastructure. As such, it was better suited to people sharing illicit content with one another than to setting up businesses for that purpose.

In other words, BitTorrent was better suited for disorganized rather than organized crime.

The Dark Web is Born

Where there is a gap in the marketplace, somebody is bound to fill it – and lo, Bitcoin was born. Bitcoin gave the criminals struggling to build illicit businesses on technologies like BitTorrent the economic infrastructure they needed to build bona fide crime syndicates.

Even Bitcoin is not truly anonymous, however – and thus Monero and other, even more crime-friendly cryptocurrencies came along, turning what had been a scattered bunch of lowlifes sharing warez with their friends into the full-fledged global black market we now know as the Dark Web.

Adding commerce to a decentralized, P2P web like BitTorrent opened up new contraband opportunities, and today illegal drugs are the most popular goods on the Dark Web.

What hath the Decentralized Web wrought?

The Intellyx Take

Placed into its historic context, today’s Decentralized Web movement is either woefully naïve or simply a front for further organized crime activity. The world doesn’t need yet another decentralized way to share content – unless you count criminals, who continue to seek new ways to avoid getting caught.

The moral of this story is clear. Illegal content is the original and most nefarious raison d’être of the Decentralized Web. In spite of whatever altruistic motivations you might have, any effort to create a Decentralized Web will call upon our basest nature and thus play into the hands of organized crime.

We are thus sandwiched between two evils: a Web dominated by a few major Internet players, and one where criminals run rampant.

We need a better answer. We desire an Internet where any two people can converse and conduct commerce with each other with no restrictions on free speech or freedom of action, and yet we also want to live in a society where we bring criminals to justice, while deterring others from crossing the line.

This conundrum may very well be the primary challenge of our age, as the Internet is the most important enabler of the Digital Era. I don’t have the answer. Neither does Tim Berners-Lee. Do you?

Copyright © Intellyx LLC. Intellyx publishes the Agile Digital Transformation Roadmap poster, advises companies on their digital transformation initiatives, and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. Image credit: Massacre.

More Stories By Jason Bloomberg

Jason Bloomberg is a leading IT industry analyst, Forbes contributor, keynote speaker, and globally recognized expert on multiple disruptive trends in enterprise technology and digital transformation. He is ranked #5 on Onalytica’s list of top Digital Transformation influencers for 2018 and #15 on Jax’s list of top DevOps influencers for 2017, the only person to appear on both lists.

As founder and president of Agile Digital Transformation analyst firm Intellyx, he advises, writes, and speaks on a diverse set of topics, including digital transformation, artificial intelligence, cloud computing, devops, big data/analytics, cybersecurity, blockchain/bitcoin/cryptocurrency, no-code/low-code platforms and tools, organizational transformation, internet of things, enterprise architecture, SD-WAN/SDX, mainframes, hybrid IT, and legacy transformation, among other topics.

Mr. Bloomberg’s articles in Forbes are often viewed by more than 100,000 readers. During his career, he has published over 1,200 articles (over 200 for Forbes alone), spoken at over 400 conferences and webinars, and been quoted in the press and blogosphere over 2,000 times.

Mr. Bloomberg is the author or coauthor of four books: The Agile Architecture Revolution (Wiley, 2013), Service Orient or Be Doomed! How Service Orientation Will Change Your Business (Wiley, 2006), XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996). His next book, Agile Digital Transformation, is due within the next year.

At SOA-focused industry analyst firm ZapThink from 2001 to 2013, Mr. Bloomberg created and delivered the Licensed ZapThink Architect (LZA) Service-Oriented Architecture (SOA) course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, which was acquired by Dovel Technologies in 2011.

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting), and several software and web development positions.
