

The collaboration behind Colossus

When I first heard about the heroic efforts during WWII to break the Nazi communications codes such as Enigma, I had in my mind the image of a lone cryptanalyst with pencil and paper trying to figure out solutions, or using a series of mechanical devices such as the Bombe to run through the various combinations.

But it turns out I couldn’t have been more wrong. The efforts of the thousands of men and women stationed at Bletchley Park in England were intensely collaborative, and involved flawlessly executing a complex and very precise series of steps. And while the Enigma machines get a lot of the publicity, the real challenge was a far more complex German High Command code called Lorenz, after the manufacturer of the cipher machines that were used.

The wartime period has gotten a lot of recent attention, what with a new movie about Alan Turing just playing in theaters. This got me looking around the Web for other materials, and my weekend was lost in watching a series of videos filmed at the National Museum of Computing at Bletchley Park. The videos show how the decoding process worked using Colossus, the first programmable electronic digital computer. Through the efforts of several folks who maintained the equipment during wartime, the museum was able to reconstruct the device and get it into working order. This is no small feat when you realize that most of the wiring diagrams were destroyed immediately after the war ended, for fear that they would fall into the wrong hands, and that many of the people who attended to Colossus’ operations are no longer alive.

The name was apt in several ways: first, the equipment easily filled a couple of rooms and used miles of wire and thousands of vacuum tubes. At the time, that was all they had, since the transistor wouldn’t be invented for several more years. Tube technology was touchy and subject to failure, and the Brits figured out that if they kept Colossus running continuously, the tubes would last longer. It also wielded enormous processing power for its day, with an effective speed that has been estimated at roughly 5 MHz. That surpasses the original IBM PC, which is pretty astounding given the nearly four decades between the two.

But the real story about Colossus isn’t the hardware, but the many people who worked around it in a complex dance to input data and transfer it from one part of the process to another. Back in the 1940s that meant punched paper tape. My first computer in high school used it too, and let me tell you, working with paper tape was painful. Other data transfers were done by hand, copying information from printed teletype output into a series of plugboard switches, similar to the telephone operator consoles you might recall from a Lily Tomlin routine. And since any of these transfers could introduce an error, the settings had to be rechecked carefully, adding more time to the decoding process.

Speaking of mistakes, there is an interesting side note. The amount of sheer focus the Bletchley teams brought to cracking the German codes was enormous. Remember, this traffic was intercepted over the air: Enigma messages in Morse, and the Lorenz teleprinter traffic as radio signals. It turns out the Germans made a few critical mistakes in sending their transmissions, and those mistakes were what enabled the codebreakers to deduce how the Lorenz machine worked and build their own functional equivalent without ever having seen one. When you think about the millions of characters transmitted, finding and exploiting those slips was pretty amazing.
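To make the point about such slips concrete, here is a minimal sketch in Python of the general principle, not Bletchley’s actual procedure. The Lorenz machine added a pseudo-random key stream to each message character using XOR on 5-bit teleprinter code; the sketch below uses ASCII text and a random byte string as stand-ins, and the message texts and function names are made up for illustration. What it demonstrates: if two different messages go out on the same key stream, combining the two intercepted ciphertexts cancels the key completely.

# Minimal sketch (illustration only, not the historical method): an additive
# stream cipher like Lorenz combines plaintext and key stream with XOR.
# Sending two messages on the SAME key stream lets an interceptor cancel
# the key by XORing the two ciphertexts together.
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Combine two equal-length byte strings with XOR."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two hypothetical plaintexts (real traffic used 5-bit Baudot code, not ASCII).
p1 = b"ATTACK AT DAWN ON THE LEFT FLANK"
p2 = b"ATTACK AT DUSK ON THE LEFT FLANK"

key = os.urandom(len(p1))    # stand-in for the Lorenz key stream

c1 = xor_bytes(p1, key)      # first transmission
c2 = xor_bytes(p2, key)      # second transmission, same key: the blunder

# The interceptor never sees the key, yet XORing the ciphertexts removes it:
# (p1 ^ key) ^ (p2 ^ key) == p1 ^ p2.
assert xor_bytes(c1, c2) == xor_bytes(p1, p2)

# Wherever the two messages agree the combined stream is zero, and with a
# guessed crib the individual plaintexts (and then the key) start to fall out.
print(xor_bytes(c1, c2))

It was exactly this kind of slip, exploited on a much larger scale, that let the Bletchley analysts work backwards to the structure of a machine they had never seen.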

What is even more remarkable about Colossus is that people worked together without most of them actually knowing what the overall effort was accomplishing. There was an amazing amount of wartime secrecy, and indeed the existence of Colossus itself wasn’t widely known until about 15 or 20 years ago, when the Brits finally lifted the bans on talking about the machine. Only then did it become generally known, for example, that several of the Colossus decrypts played critical roles in the success of the D-Day Normandy invasion.

At its peak, Bletchley employed some 9,000 people from all walks of life, and the genius was in organizing all these folks so that the ultimate objective, breaking codes, actually happened. Tommy Flowers, the engineer who led the design and construction of Colossus, is noteworthy here: he paid for part of the early development out of his own pocket. Another interesting historical side note is the contribution of several Polish mathematicians, whose prewar work on Enigma gave the British codebreakers a head start.

As you can see, this is a story about human/machine collaboration that I think hasn’t been equaled since. If you are looking for an inspirational story, take a closer look at what happened here.



More Stories By David Strom

David Strom is an international authority on network and Internet technologies. He has written extensively on the topic for 20 years for a wide variety of print publications and websites, such as The New York Times, TechTarget.com, PC Week/eWeek, Internet.com, Network World, Infoworld, Computerworld, Small Business Computing, Communications Week, Windows Sources, c|net and news.com, Web Review, Tom's Hardware, EETimes, and many others.
