
IDC Announces New Winners of HPC Innovation Excellence Awards

International Data Corporation (IDC) today announced the seventh round of recipients of the HPC Innovation Excellence Award at the ISC'14 supercomputer industry conference in Leipzig, Germany. Prior winners were announced at the ISC'11, SC11, ISC'12, SC12, ISC'13, and SC13 supercomputing conferences.

The HPC Innovation Excellence Award recognizes noteworthy achievements by users of high performance computing (HPC) technologies. The program's main goals are to showcase return on investment (ROI) and scientific success stories involving HPC; to help other users better understand the benefits of adopting HPC and justify HPC investments, especially for small and medium-size businesses (SMBs); to demonstrate the value of HPC to funding bodies and politicians; and to expand public support for increased HPC investments.

"IDC research has shown that HPC can accelerate innovation cycles greatly and in many cases can generate ROI. The award program aims to collect a large set of success stories across many research disciplines, industries, and application areas," said Chirag Dekate, Research Manager, High Performance Computing at IDC. "The winners achieved clear success in applying HPC to greatly improve business ROI, scientific advancement, and/or engineering successes. Many of the achievements also directly benefit society."

Winners of the first six rounds of awards, announced over the last three years, included 34 organizations from the U.S., three each from the People's Republic of China and Italy, four from the UK, two from India, and one each from Australia, Canada, Sweden, South Korea, Switzerland, Germany, France, and Spain.

The new award winners and project leaders announced at ISC'14 are as follows (contact IDC for additional details about the projects):

  • University of Wisconsin-Madison (U.S.). University of Wisconsin researchers used HPC resources, combined with multiple advanced protein structure prediction algorithms and deep sequence data mining, to construct a highly plausible capsid model for Rhinovirus-C (~600,000 atoms). The simulation model helps explain why existing pharmaceuticals do not work on this virus. The modeling frameworks developed by the researchers provide angstrom-level predictions for new antivirals and a platform for vaccine development. Lead: Ann C. Palmenberg
  • Argonne National Laboratory, Caterpillar, Convergent Science (U.S.). Researchers from Argonne National Laboratory conducted one of the largest internal combustion engine simulations to date. Predictive internal combustion engine simulations require very fine spatial and temporal resolution together with high-fidelity, robust models for two-phase flow, spray, turbulence, combustion, and emissions. The research has allowed Caterpillar Inc. to shrink its development timescales, resulting in significant cost savings. Caterpillar engineers predict that these HPC developments will reduce the number of multi-cylinder test programs by at least a factor of two, yielding savings of $500,000-$750,000 per year. Lead: Sibendu Som
  • CINECA (Italy). Engineers from THESAN srl, an Italian SME active in the renewable energy sector, teamed up with the Italian supercomputing center CINECA to pursue simulation-driven engineering of hydroelectric turbines. The research was conducted within the PRACE SHAPE (SME HPC Adoption Programme in Europe) Initiative. The engineers and researchers built an HPC-based workflow to optimize the design of a new class of hydroelectric turbines. Using CFD, THESAN could generate cost savings by reducing or eliminating the production of physical prototypes, better understanding the flaws of earlier designs, and, critically, shortening the time to market. Lead: Raffaele Ponzini, Roberto Vadori, Giovanni Erbacci, Claudio Arlandini
  • Pipistrel d.o.o. (Slovenia). Engineers and scientists from Pipistrel used HPC and technical computing resources to design and develop the Taurus G4 aircraft. The aircraft was conceived, designed, and built in a mere five months, relying heavily on CAD and rapid prototyping techniques, and especially on CFD and other computational aerodynamics tools to evaluate flight performance and handling before committing to building the prototype. The aircraft introduced a unique twin-fuselage configuration, which presented significant challenges in designing the wings, high-lift systems, and the overall configuration. HPC-based CFD was used as early as the conceptual design stage to optimize the shape of the engine nacelle and avoid premature flow separation. CFD was used in later design stages to optimize the high-lift slotted flap geometry and, especially, to determine the lift and stability behavior of the complete aircraft configuration in ground effect. Lead: Prof. Dr. Gregor Veble
  • Culham Centre for Fusion Energy, EPCC at the University of Edinburgh, York Plasma Institute at the University of York, and Lund University. Researchers from CCFE, EPCC, and the Universities of York and Lund have recently made substantial optimizations to the well-known plasma turbulence code GS2. These included a complete rewrite of the routines that calculate the response matrices required by the code's implicit algorithm, which has significantly accelerated GS2's initialization, typically by a factor of more than 10. Taken together, the optimizations have vastly reduced wall time, as illustrated by a factor-of-20 speedup achieved for a benchmark calculation running on 8,192 cores. The optimized code achieves scaling efficiencies close to 50% at 4,096 cores and 30% at 8,192 cores for a typical calculation, compared with efficiencies of 4% and 2%, respectively, before these optimizations (see the sketch after this list for how such scaling figures are typically computed). Leads: David Dickinson, Adrian Jackson, Colin M Roach, and Joachim Hein
  • Westinghouse Electric Company LLC, ORNL (U.S.). Researchers from Westinghouse Electric Company and the Consortium for Advanced Simulation of LWRs (CASL), a U.S. Department of Energy (DOE) Innovation Hub, performed core physics simulations of the AP1000® PWR startup core using CASL's Virtual Environment for Reactor Applications (VERA). These calculations, performed on the Oak Ridge Leadership Computing Facility (OLCF) "Titan" Cray XK7 system, produced high-fidelity 3D power distributions representing the conditions expected during AP1000 startup. The results provide insights that improve understanding of core conditions, helping to ensure safe startup of the AP1000 PWR first core. Lead: Fausto Franceschini (Westinghouse)
  • Rolls-Royce, Procter and Gamble, National Center for Supercomputing Applications, Cray Inc., Livermore Software Technology Corporation (U.S.). Researchers from NCSA, Rolls-Royce, Procter and Gamble, Cray Inc., and Livermore Software Technology Corporation scaled the commercial explicit finite element code LS-DYNA to 15,000 cores on Blue Waters. The research has the potential to transform several industries, including aerospace and automotive engine design as well as consumer product development and design. The researchers noted that the increased scalability can yield significant cost savings. Leads: Todd Simons, Seid Koric
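
For context on the GS2 scaling figures cited above, the sketch below shows how strong-scaling speedup and parallel efficiency are conventionally computed from measured wall times relative to a baseline run. The core counts and wall times are illustrative placeholders chosen only to roughly reproduce the quoted efficiencies; they are not actual GS2 benchmark data, and the helper function is hypothetical.

```python
# Strong-scaling metrics relative to a baseline run:
#   speedup    = T(baseline) / T(n cores)
#   efficiency = speedup / (n / baseline_cores)
def scaling_metrics(baseline_cores, baseline_time, cores, wall_time):
    """Return (speedup, parallel efficiency) relative to the baseline run."""
    speedup = baseline_time / wall_time
    efficiency = speedup / (cores / baseline_cores)
    return speedup, efficiency

# Illustrative wall times in seconds (placeholders, not GS2 measurements).
runs = {512: 1000.0, 4096: 250.0, 8192: 210.0}
base_cores = 512

for cores, t in sorted(runs.items()):
    s, e = scaling_metrics(base_cores, runs[base_cores], cores, t)
    print(f"{cores:>5} cores: speedup {s:4.1f}x, efficiency {e:5.1%}")
```

On these placeholder numbers the efficiencies work out to roughly 50% at 4,096 cores and 30% at 8,192 cores, mirroring the figures quoted for the optimized GS2.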

IDC welcomes award entries from anywhere in the world. Entries may be submitted at any time by completing the brief form available at https://www.hpcuserforum.com/innovationaward/. New winners will be announced multiple times each year. Submissions must contain a clear description of the dollar value or scientific value received in order to qualify. The HPC User Forum Steering Committee performs an initial ranking of the submissions, after which domain and vertical experts are called on, as needed, to evaluate the submissions.

HPC Innovation Excellence Award sponsors include Adaptive Computing, Altair, AMD, Ansys, Cray, Avetec/DICE, the Boeing Company, the Council on Competitiveness, Department of Defense, Department of Energy, Ford Motor Company, Hewlett Packard, HPCwire, insideHPC, Intel, Microsoft, National Science Foundation, NCSA, Platform Computing, Scientific Computing, and SGI.

The next round of HPC Innovation Excellence Award winners will be announced at SC’14 in November 2014.

About IDC

International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. IDC helps IT professionals, business executives, and the investment community to make fact-based decisions on technology purchases and business strategy. More than 1,000 IDC analysts provide global, regional, and local expertise on technology and industry opportunities and trends in over 110 countries. In 2014, IDC celebrates its 50th anniversary of providing strategic insights to help clients achieve their key business objectives. IDC is a subsidiary of IDG, the world's leading technology media, research, and events company. You can learn more about IDC by visiting www.idc.com.

All product and company names may be trademarks or registered trademarks of their respective holders.

