By Business Wire
June 25, 2014 02:00 AM EDT
Today at the International Supercomputing Conference (ISC) 2014 in Leipzig, Germany, Adaptive Computing, the company that powers many of the world’s largest private/hybrid cloud and technical computing environments with its Moab optimization and scheduling software, and Bright Computing, a leading provider of management solutions for clusters and clouds, announced their reseller agreement and a deeper integration of their product sets to enhance provisioning and workflow optimization in technical computing environments.
Building on the existing integration between Moab HPC Suite and Bright Cluster Manager®, Adaptive Computing and Bright Computing together provide enhanced functionality that enables users to dynamically provision HPC clusters based on both resource and workload monitoring. The combined capabilities of Moab and Bright Cluster Manager also create a more optimal solution to managing technical computing and Big Workflow requirements — a solution that accelerates insights by more efficiently processing intense simulations and big data analysis.
“As a company that shares our expertise in HPC, cloud and big data, Bright Computing represents an ideal collaboration partner in furthering our Big Workflow vision,” said Rob Clyde, CEO of Adaptive Computing. “We will continue to innovate with Bright Computing to allow organizations to accelerate insights with greater flexibility while controlling their costs to a higher degree than ever before.”
“We recognized that our customers would greatly benefit from a deeper integration of Adaptive’s and Bright’s systems,” said Dr. Matthijs van Leeuwen, CEO and Founder of Bright Computing. “We look forward to future collaborations that provide greater out-of-the-box integration and automation to streamline the installation process and further optimize technical computing environments and their workflows.”
Key benefits of the integration include the following:
Intelligent Workload Monitoring. The integration of Bright Computing’s resource monitoring and Adaptive Computing’s workload monitoring capabilities allows users to optimize HPC clusters and improve workload scheduling. For example, in the case of a hardware disruption or node failure, Bright Cluster Manager is alerted and reacts by communicating with Moab to identify availability within the cluster and reroute jobs to other available nodes. This provisioning helps unify cluster resources, ensuring optimal usage and guaranteeing users can efficiently run workloads.
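The alert-and-reroute sequence described above can be illustrated with a small sketch. To be clear, none of the names below are real Moab or Bright Cluster Manager APIs; this is a hypothetical model of the monitor-alert-reroute flow only.

```python
from dataclasses import dataclass

# Hypothetical sketch of the failure-rerouting flow: a resource monitor
# flags a failed node, and the scheduler moves its jobs to healthy nodes.
# These classes and names are illustrative, not product APIs.

@dataclass
class Node:
    name: str
    healthy: bool = True
    free_slots: int = 4

@dataclass
class Job:
    name: str
    slots: int
    node: str

def reroute_on_failure(failed, nodes, jobs):
    """Mark the failed node offline, then move its jobs to healthy nodes
    with enough free slots; jobs that cannot be placed are held."""
    moved, held = [], []
    for node in nodes:
        if node.name == failed:
            node.healthy = False            # resource monitor flags the node
    for job in [j for j in jobs if j.node == failed]:
        target = next((n for n in nodes
                       if n.healthy and n.free_slots >= job.slots), None)
        if target is not None:
            target.free_slots -= job.slots  # reserve capacity on the new node
            job.node = target.name          # reroute the job
            moved.append(job.name)
        else:
            held.append(job.name)           # no capacity: hold until nodes free up
    return {"moved": moved, "held": held}
```

In this toy model, a job that fits on a surviving node is requeued there immediately, while oversized jobs are held rather than lost, mirroring the "reroute jobs to other available nodes" behavior the integration describes.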
Green and Power Management. Moab leverages Bright Cluster Manager’s green and power management features to enforce power management policies, allowing users to limit power consumption automatically. Through this integration:
- Idle nodes can be identified
- Nodes can be powered on or off as needed
- Based on these policies, idle nodes can be placed in a low-power suspend or sleep state, which consumes only 10–50 percent of the power drawn in the active running state
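The idle-node policy above can be sketched as a simple rule: suspend any node that has been idle past a threshold and tally the estimated savings. The threshold and power figures below are illustrative assumptions, not Moab or Bright Cluster Manager defaults.

```python
# Hypothetical sketch of an idle-node power policy. The 10% figure is the
# low end of the 10-50% suspend-state draw mentioned above; the 600-second
# idle threshold is an assumed policy value, not a product default.

SUSPEND_POWER_FRACTION = 0.10   # suspended node draws 10% of active power

def apply_power_policy(nodes, idle_threshold_s=600):
    """Suspend nodes idle longer than the threshold; return watts saved."""
    saved = 0.0
    for node in nodes:
        if node["state"] == "active" and node["idle_s"] >= idle_threshold_s:
            node["state"] = "suspended"          # place node in low-power state
            saved += node["active_watts"] * (1 - SUSPEND_POWER_FRACTION)
    return saved
```

A policy engine running this rule periodically would identify idle nodes, suspend them, and leave busy nodes untouched, which is the automatic power limiting the integration describes.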
Automated Health Checks. By combining Moab’s intelligent capabilities with Bright Computing’s Cluster Health Management in Bright Cluster Manager, users can now conduct health checks based on resource monitoring and scheduling. With the help of Cluster Health Management, Moab can:
- Identify node problems and take proactive measures
- Leverage a high-availability scenario that ensures a consistent workflow, even with node failures
- Provide the administrator with a report on the problem encountered and any actions taken by Cluster Health Management, reducing or eliminating the time administrators spend investigating nodes
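The check-act-report loop in the list above can be sketched as follows. The check names, node fields, and "drain" action are hypothetical stand-ins for whatever health checks and remediations an administrator configures; they are not Cluster Health Management APIs.

```python
# Hypothetical sketch of a health-check loop: run each configured check on
# each node, take a proactive action on failures (here, draining the node so
# no new jobs land on it), and return one summary report for the admin.

def run_health_checks(nodes, checks):
    """Run each check on each node; drain failing nodes and collect a report."""
    report = []
    for node in nodes:
        failed = [name for name, check in checks.items() if not check(node)]
        if failed:
            node["state"] = "drained"   # proactive measure: stop scheduling here
            report.append({"node": node["name"],
                           "failed": failed,
                           "action": "drained"})
    return report                        # single report, in place of manual triage
```

The returned report plays the role of the administrator summary described above: instead of investigating each node by hand, the administrator reads which checks failed and what action was already taken.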
About Adaptive Computing
Adaptive Computing powers many of the world’s largest private/hybrid cloud and technical computing environments with its award-winning Moab optimization and scheduling software. Moab enables large enterprises in oil and gas, finance, manufacturing and research, as well as academic and government institutions, to perform simulations and analyze Big Data faster, more accurately and more cost-effectively with its Technical Computing, Cloud and Big Data solutions for Big Workflow applications. Moab gives users a competitive advantage, inspiring them to develop cancer-curing treatments, discover the origins of the universe, lower energy prices, manufacture better products, improve the economic landscape and pursue game-changing endeavors. Adaptive is a pioneer in private/hybrid cloud, technical computing and big data, holding 50+ issued or pending patents.
For more information, call (801) 717-3700 or visit www.adaptivecomputing.com.
NOTE TO EDITORS: If you would like additional information on Adaptive Computing and its products, please visit the Adaptive Computing Newsroom at http://www.adaptivecomputing.com/category/news/. All prices noted are in U.S. dollars and are valid only in the United States.
About Bright Computing
Bright Computing specializes in management software for on-premise HPC, Hadoop, storage, database and workstation clusters, as well as the seamless extension of these clusters into the cloud. Its flagship product — Bright Cluster Manager — with its intuitive graphical user interface and powerful cluster management shell, makes clusters of any size easy to install, use and manage, including systems combining processors with accelerators (e.g., NVIDIA GPUs) or coprocessors (e.g., Intel Xeon Phi). Bright’s minimal footprint enables systems to be utilized to their maximum potential, from departmental Hadoop clusters to large-scale supercomputers. Bright Computing partners include Amazon, Cisco, Cray and Dell, while Boeing, ING Bank, NASA, Roche, Saudi Aramco, Stanford University and Tokyo Institute of Technology are examples of Bright customers. Bright Computing is a Red Herring 2013 Top 100 North America Award winner, the Deloitte Technology Fast50 “Rising Star 2013” Award winner, and the Main Software 50 “Highest Growth” Award winner, and Bright Cluster Manager was a “Best of Show Award” winner at Bio-IT World 2013.