By Abe Kleinfeld and Nikita Ivanov
March 8, 2014 03:00 PM EST
Gordon E. Moore's famous prediction of a tech explosion has proven prophetic, but it may have hit a snag. While the number of transistors on integrated circuits has doubled approximately every two years since his 1965 paper, our ability to process and transact on data hasn't kept pace. We're now ingesting data faster than we can make sense of it, leaving computing at an impasse. Without a new approach, the innovation promised by the combination of Big Data and internet scale may remain as elusive as the flying cars we thought we'd see by 2014. Fortunately, that need not be the case: in-memory computing offers a way to bridge this impasse.
Keeping up with Moore's law requires computing orders of magnitude faster than traditional methods allow, and at a reasonable cost. In-memory computing achieves just this. It is well established that in-memory computing is far faster and more scalable than traditional disk-based methods, and the falling cost of memory has made it economical.
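As a rough illustration of the speed gap, the sketch below is a toy comparison (our own example, not GridGain code; the class and file names are invented) that sums the same set of integers twice, once streaming them from a file on disk and once from an array already resident in RAM.

```java
import java.io.*;
import java.nio.file.*;

// Toy comparison: processing values streamed from disk vs. held in memory.
public class MemoryVsDisk {
    public static void main(String[] args) throws IOException {
        int n = 1_000_000;
        Path file = Files.createTempFile("values", ".bin");

        // Write n integers to a temporary file.
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(Files.newOutputStream(file)))) {
            for (int i = 0; i < n; i++) out.writeInt(i);
        }

        // Disk-based pass: read each value from the file and sum it.
        long t0 = System.nanoTime();
        long diskSum = 0;
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(Files.newInputStream(file)))) {
            for (int i = 0; i < n; i++) diskSum += in.readInt();
        }
        long diskNanos = System.nanoTime() - t0;

        // In-memory pass: data already loaded, sum directly from RAM.
        int[] values = new int[n];
        for (int i = 0; i < n; i++) values[i] = i;
        long t1 = System.nanoTime();
        long memSum = 0;
        for (int v : values) memSum += v;
        long memNanos = System.nanoTime() - t1;

        System.out.println("sums equal: " + (diskSum == memSum));
        System.out.println("disk ns: " + diskNanos + ", memory ns: " + memNanos);
        Files.delete(file);
    }
}
```

Absolute timings will vary by machine and caching, but the in-memory pass avoids the I/O path entirely, which is the core of the argument.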
Despite this, a misconception lingers that in-memory computing belongs in the realm of supercomputers. Most people don't realize just how fast and affordable it has become. To offer some perspective, GridGain recently demonstrated one billion transactions per second using our In-Memory Data Grid on just $25K worth of commodity hardware. In short, it's now economical for organizations of all sizes.
Opening the doors to mass adoption through open source in-memory technology
In-memory computing is definitely entering the mainstream. However, achieving mass innovation with any technology requires mass adoption. One of the best ways to accomplish this is to offer the technology under an open source license, enabling users to start working with it without necessarily committing to it. This gives developers the flexibility to use the technology in new and interesting ways, and to address very specific challenges.
With GridGain offering a complete In-Memory Computing Platform under the Apache 2.0 license, the barriers to adoption are removed. The high-performance computing capabilities of in-memory technology are now freely and openly available, meaning that developers have full freedom to experiment with it, test its capabilities and try out new ideas.
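To make the idea concrete, here is a minimal conceptual sketch of what an in-memory data grid node does; this is our own illustration built on Java's standard ConcurrentHashMap, not the GridGain API, and the class and key names are invented. Data lives entirely in RAM and updates are applied atomically, the same properties a real data grid provides across a cluster of machines.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Conceptual single-node stand-in for an in-memory data grid (not GridGain's
// API): all state is RAM-resident and mutations are atomic.
public class InMemoryCacheSketch {
    private final ConcurrentMap<String, Long> cache = new ConcurrentHashMap<>();

    // Atomically increment a counter, e.g. a per-account transaction tally.
    public long increment(String key) {
        return cache.merge(key, 1L, Long::sum);
    }

    public Long get(String key) {
        return cache.get(key);
    }

    public static void main(String[] args) {
        InMemoryCacheSketch grid = new InMemoryCacheSketch();
        for (int i = 0; i < 1000; i++) grid.increment("account-42");
        System.out.println(grid.get("account-42")); // prints 1000
    }
}
```

A production data grid layers partitioning, replication and distributed transactions on top of this basic idea, but the core performance win is the same: every read and write is a memory operation, not a disk seek.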
Unifying the cloud, Big Data and real-time analytics to accelerate innovation
Now that developers have access to computing power commensurate with their creativity, it will be exciting to see what they come up with. While we can't predict the future, one thing is certain: the new level of computing power afforded by in-memory technology will enable developers to create a new class of applications that combine the cloud, Big Data and real-time analytics. Once that happens, the genie is out of the bottle.