New Convey GraphConstructor Leverages Hybrid-Core Architecture to Speed De Novo Genome Assembly

From Cows to Grapes, Efficient Use of Memory Fuels More Science

RICHARDSON, TX -- (Marketwire) -- 05/17/11 -- Convey Computer today introduced the Convey GraphConstructor™ (CGC), a new software and hardware solution that speeds up some of the world's most popular bioinformatics algorithms and helps scientists better manage and analyze escalating amounts of research data.

The Convey GraphConstructor accelerates construction and manipulation of the de Bruijn graphs commonly used in short-read genome assembly applications such as Velvet (1) and ABySS (2). It is the newest component in the company's bioinformatics suite.
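
To make the data structure concrete, the short Python sketch below builds a de Bruijn graph from a handful of toy reads: every k-mer becomes a directed edge from its (k-1)-mer prefix to its (k-1)-mer suffix. This is a minimal illustration of the concept only, not Velvet's, ABySS's, or Convey's implementation.

    from collections import defaultdict

    def de_bruijn_graph(reads, k):
        """Build a de Bruijn graph: nodes are (k-1)-mers, edges are k-mers."""
        edges = defaultdict(int)  # (prefix, suffix) -> edge multiplicity
        for read in reads:
            # each read of length L contributes L - k + 1 k-mers
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                edges[(kmer[:-1], kmer[1:])] += 1
        return edges

    # Toy usage: three overlapping 10 bp reads, k = 5
    reads = ["ACGTACGTGA", "CGTACGTGAC", "GTACGTGACT"]
    for (prefix, suffix), count in sorted(de_bruijn_graph(reads, 5).items()):
        print(prefix, "->", suffix, "x", count)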

"Convey's hybrid-core architecture provides powerful advantages to scientists dealing with large datasets across many disciplines. Nowhere is this more important than in bioinformatics -- where customers are achieving performance speed-ups from 2.2 to 8.4 times," (3) says Convey CEO and co-founder, Bruce Toal. "The Convey GraphConstructor helps researchers explore and manage the data deluge spilling from next-generation sequencing technologies faster and with significantly lower computing costs than in the past."

Key to improving performance and capability is Convey's novel hybrid-core computing architecture. Software-only applications are limited by the performance of commodity servers executing a stream of general-purpose instructions. Convey's architecture pairs classic Intel® x86 microprocessors with an FPGA-based coprocessor. This architecture allows key segments of an application -- DNA sequence alignment, for instance -- to run directly in hardware.
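
As a simplified example of the kind of kernel that benefits, the toy local-alignment scorer below (ordinary Python, not Convey's firmware or programming interface) spends nearly all of its time in a regular, doubly nested dynamic-programming loop; that inner loop is the sort of compute-bound segment a hybrid-core system can move into FPGA logic while the rest of the application stays on the x86 host.

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        """Local alignment score by dynamic programming (linear gap penalty)."""
        prev = [0] * (len(b) + 1)
        best = 0
        for i in range(1, len(a) + 1):
            curr = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                # local alignment: scores never drop below zero
                curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
                best = max(best, curr[j])
            prev = curr
        return best

    print(smith_waterman_score("ACACACTA", "AGCACACA"))  # small demo inputs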

While raw processor performance increases are important, improved memory management is often just as important to increasing research throughput. Bioinformatics applications that depend upon random access patterns to large memory spaces, such as graph-based algorithms, experience severe memory performance limitations on cache-based x86 servers. Convey's highly parallel memory subsystem allows application-specific logic to concurrently access 8,192 individual words in memory, significantly increasing effective memory bandwidth over cache-based memory systems.
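
A small sketch of why this matters (plain Python, purely illustrative): extending a contig through a de Bruijn graph means one table lookup per step, and consecutive lookups land at effectively random addresses in a structure that, for a large genome, holds billions of entries and fits in no processor cache.

    def walk_unique_path(adjacency, start, max_steps=10):
        """Extend a contig by following unique successors; each step is one
        effectively random probe into a very large table."""
        path = [start]
        node = start
        for _ in range(max_steps):
            successors = adjacency.get(node, [])
            if len(successors) != 1:  # stop at a branch point or dead end
                break
            node = successors[0]
            path.append(node)
        return path

    # Tiny adjacency list over 4-mers; a real table holds billions of entries
    adjacency = {"ACGT": ["CGTA"], "CGTA": ["GTAC"], "GTAC": ["TACG"]}
    print(walk_unique_path(adjacency, "ACGT"))  # ['ACGT', 'CGTA', 'GTAC', 'TACG']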

Many algorithms, such as Velvet and other de Bruijn graph-based, short-read de novo assemblers, greatly benefit from this type of memory architecture. Velvet author Dr. Daniel Zerbino says, "There are a number of engineering issues we didn't fully address in 2006 when we were developing Velvet, and one of those issues was the hardware footprint. Memory size is the biggest difficulty. If your machine doesn't have enough memory, you must break down the problem, and that can be quite a constraint for users. Convey's GraphConstructor offers a new approach to help researchers who want to test more parameters to achieve better assemblies or look at bigger jobs such as metagenomic or mammalian genome samples."

In fact, researchers at the U.S. Department of Energy/Joint Genome Institute (JGI) and the University of Mainz are currently using advanced computer architectures, such as hybrid-core computing and Convey's GraphConstructor, to tackle problems previously deemed impractical:

  • Cow Rumen Metagenome: As part of their work researching biofuels, JGI researchers want to discover how cows convert grass to gas so effectively. In January, JGI reported (Science, January 28, 2011) it had "sequenced and analyzed 268 gigabases of metagenomic DNA from microbes adherent to plant fiber incubated in cow rumen." The work has so far uncovered nearly 30,000 new candidate enzymes for improving biofuel production. Using a Convey hybrid-core computer and Convey's GraphConstructor helped JGI researchers speed up the discovery process by as much as 2.8 times and reduce the required memory footprint by up to 82 percent. (4)

  • Riesling Genome Assembly: In performing the first sequencing of the Riesling grape, University of Mainz researchers produced a dataset of 300 million reads, roughly 30 billion nucleotides. Achieving an accurate assembly required a fairly short k-mer length, which drives up required memory and runtime (see the rough sizing sketch below). The existing computer system didn't have enough memory to complete the assembly with Velvet, but a Convey system running the Convey GraphConstructor was able to do so efficiently and without difficulty.
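
The rough sizing sketch below puts numbers on that constraint. The 100 bp read length follows from the figures above (30 billion nucleotides across 300 million reads); the bytes stored per k-mer and the fraction of k-mers surviving deduplication are illustrative assumptions, not measured Velvet or Convey figures. Even with generous assumptions the table runs to hundreds of gigabytes, and shortening the k-mer length adds k-mers per read and pushes it further.

    def kmer_table_estimate(n_reads, read_len, k, bytes_per_kmer=32,
                            distinct_fraction=0.25):
        """Return (k-mer occurrences streamed in, rough table size in GB).
        bytes_per_kmer and distinct_fraction are assumed, illustrative values."""
        occurrences = n_reads * (read_len - k + 1)
        table_gb = occurrences * distinct_fraction * bytes_per_kmer / 1e9
        return occurrences, table_gb

    for k in (31, 25, 21):
        occ, gb = kmer_table_estimate(300_000_000, 100, k)
        print("k=%d: ~%.0f billion k-mer occurrences, ~%.0f GB table"
              % (k, occ / 1e9, gb))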

"Convey is solving a big problem here -- de novo assembly has been very difficult," says Dr. John Castle, head of Bioinformatics/Genomics at the Institute for Translational Oncology and Immunology (TrOn), University of Mainz. "At TrOn, we tried to assemble the Riesling genome with SOAPdenovo and with Velvet. Both failed because the computer ran out of memory. Other groups are trying to assemble grape genome as well but with mixed results, frequently due to hardware limitations. Convey, just by increasing the efficiency of Velvet, has made a significant accomplishment!"

Convey's use of reconfigurable technology and supercomputer-inspired memory management permits Convey hybrid-core systems to accelerate applications, drive next-generation solutions, and lower ownership costs. Convey's hybrid-core platforms include the HC-1 and the HC-1ex. For more information about Convey or the Convey GraphConstructor, visit http://www.conveycomputer.com/.

About Convey Computer Corporation

Based in Richardson, Texas, Convey Computer breaks power, performance and programmability barriers with the world's first hybrid-core computer -- a system that marries the low cost and simple programming model of a commodity system with the performance of a customized hardware architecture. Convey brings decades of experience and intellectual assets to performance problem solving. Its executive and design teams come from successful computer companies, most notably Convex Computer Corporation and Hewlett-Packard. Convey Computer investors include Braemar Energy Ventures, CenterPoint Ventures, Intel Capital, InterWest Partners, Rho Ventures, and Xilinx. More information can be found at: www.conveycomputer.com.

(1) http://www.ebi.ac.uk/~zerbino/velvet/; Velvet is the most widely used program for de novo assembly of short-read sequences.
(2) http://www.ncbi.nlm.nih.gov/pubmed/19251739
(3) Performance varies considerably depending on problem size and specified k-mer lengths.
(4) Poster, "Efficient Graph Based Assembly of Short-Read Sequences on a Hybrid-Core Architecture," DOE JGI User Meeting, Genomics of Energy and Environment, March 22-24, 2011, Walnut Creek, California.

Convey Computer, the Convey logo, Convey HC-1 and HC-1ex, and the Convey GraphConstructor are trademarks of Convey Computer Corporation in the U.S. and other countries. Intel® and Intel® Xeon® are registered trademarks of Intel Corporation in the U.S. and other countries. Xilinx is a registered trademark of Xilinx, Inc.


