Yahoo! Japan Qualifies HGST FlashMAX II PCIe SSDs

Enterprise storage leader HGST, a Western Digital company (NASDAQ: WDC), today announced that its flash-based Storage Class Memory (SCM) solution, the HGST FlashMAX II, has been qualified by Yahoo! Japan for use in its relational database management system (RDBMS) server storage systems. With high capacity in a low-profile form factor, the HGST FlashMAX II increases Yahoo! Japan's RDBMS processing speeds by 10x over its conventional configuration.

Due to the rapid growth of Internet data and the increasing demand for rich content, the RDBMS servers at Yahoo! Japan needed more speed and lower latency to handle growth and serve up more targeted content. Given this volume and the mission-critical nature of its business, Yahoo! Japan paired the HGST FlashMAX II with an x86 server and InfiniBand in its conventional RDBMS environment to increase processing speeds and application performance while reducing power consumption.
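A minimal sketch, assuming a test file on each storage tier (the paths, block size and sample count below are placeholders, not details from the announcement), of how such a random-read latency comparison might look in Python:

import os
import random
import time

def random_read_latency(path, block_size=4096, samples=1000):
    """Time random block-sized reads from a file and return latency percentiles."""
    size = os.path.getsize(path)
    latencies = []
    # buffering=0 gives unbuffered file reads; results are still approximate
    # because the OS page cache and the storage stack sit underneath this test.
    with open(path, "rb", buffering=0) as f:
        for _ in range(samples):
            offset = random.randrange(0, max(1, size - block_size))
            start = time.perf_counter()
            f.seek(offset)
            f.read(block_size)
            latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "p50_us": latencies[len(latencies) // 2] * 1e6,
        "p99_us": latencies[int(samples * 0.99)] * 1e6,
    }

if __name__ == "__main__":
    # Placeholder paths: point one at a file on the iSCSI-backed volume and
    # the other at a file on the PCIe flash device, then compare percentiles.
    print("iscsi:", random_read_latency("/mnt/iscsi/testfile"))
    print("pcie :", random_read_latency("/mnt/flashmax/testfile"))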

“As workloads increased, our conventional iSCSI-based storage solution could not keep pace with the rate of our business,” said Hiromune Ozaki, manager of Data Solution Division, System Management Group, Yahoo! Japan. “The HGST FlashMAX II integration delivered more predictable and sustained performance, increased the reliability of our RDBMS database clusters and allowed our team to reduce our datacenter footprint.”

The HGST FlashMAX II was designed from the ground up to fully exploit today's enterprise datacenter architectures and solve application performance problems. Its SCM architecture tightly integrates flash media, hardware and software to deliver memory-class performance with storage-class capacity, reducing power consumption and server sprawl while enabling cost-effective performance upgrades to existing infrastructure.

“Yahoo! Japan has a rigorous selection process, and the qualification of our FlashMAX II is continued validation that our solution has been designed from the ground up to fully exploit and improve application performance in today's enterprise datacenters,” said Mike Gustafson, senior vice president and general manager, HGST’s SSD, Software and Solutions Group. “The HGST FlashMAX II allows the Yahoo! Japan team to improve their existing infrastructure and take full advantage of CPU performance.”

About HGST

HGST, a Western Digital company (NASDAQ: WDC), develops innovative, advanced hard disk drives, enterprise-class solid state drives, external storage solutions and services used to store, preserve and manage the world’s most valued data. HGST addresses customers’ rapidly changing storage needs by delivering intelligent storage devices that tightly integrate hardware and software to maximize solution performance. Founded by the pioneers of hard drives, HGST provides high-value storage for a broad range of market segments, including Enterprise, Cloud, Datacenter, Mobile Computing, Consumer Electronics and Personal Storage. HGST was established in 2003 and maintains its U.S. headquarters in San Jose, California. For more information, please visit the company’s website at http://www.hgst.com.

One GB is equal to one billion bytes, and one TB equals 1,000 GB (one trillion bytes). Actual capacity will vary depending on operating environment and formatting.
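One common source of the difference is decimal versus binary units: a drive sold as 1 TB (10^12 bytes) appears as roughly 0.909 TiB when an operating system reports capacity in binary units. A quick illustrative calculation in Python:

DECIMAL_TB = 1_000_000_000_000   # 1 TB = 10**12 bytes, per the definition above
BINARY_TIB = 2 ** 40             # 1 TiB = 1,099,511,627,776 bytes

print(f"A nominal 1 TB drive is about {DECIMAL_TB / BINARY_TIB:.3f} TiB")  # ~0.909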

Virident is a registered trademark and FlashMAX is a trademark of HGST, Inc. and its affiliates in the United States and/or other countries.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
