
Washington to Put $200 Million into Big Data R&D

To improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

The Obama Administration Thursday unveiled a Big Data Research and Development Initiative that will see six federal agencies and departments put $200 million or more into Big Data R&D.

These new commitments are supposed to improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

Dr. John Holdren, director of the White House Office of Science and Technology Policy, said, "In the same way that past federal investments in information technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education and national security."

It's seen as being that important.

The major initiative is supposed to advance state-of-the-art core technologies, apply them to accelerate the pace of discovery in science and engineering as well as transform teaching and learning, and expand the workforce needed to develop and use Big Data technologies.

It's a response to recommendations by the President's Council of Advisors on Science and Technology, which last year concluded that the federal government was under-investing in Big Data technologies.

As a result, the National Science Foundation (NSF) and the National Institutes of Health (NIH) will be implementing a long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate and serve data to communities; and new approaches to education and workforce development.

As a start, NSF will be funding a $10 million Expeditions in Computing project based at Berkeley to integrate machine learning, cloud computing and crowd sourcing.

It will also provide the first round of grants to support EarthCube, a system that lets geoscientists access, analyze and share information about the planet; issue a $2 million award for a research training group for undergraduates to use graphical and visualization techniques for complex data; and provide $1.4 million to support a focused research group of statisticians and biologists to determine protein structures and biological pathways.

NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical and other data sets related to health and disease.

It said the world's largest set of data on human genetic variation - produced by the international 1000 Genomes Project - is now available on Amazon's cloud. At 200TB - the equivalent of 16 million file cabinets filled with text, or more than 30,000 standard DVDs - the current 1000 Genomes Project data set, derived from 1,700 people, is a prime example of Big Data.
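The DVD comparison survives a quick back-of-the-envelope check, assuming a standard single-layer DVD of about 4.7GB:

# Back-of-the-envelope check of the "more than 30,000 standard DVDs" figure,
# assuming a standard single-layer DVD holds about 4.7GB.
total_bytes = 200e12   # 200TB data set
dvd_bytes = 4.7e9      # single-layer DVD capacity
print(f"{total_bytes / dvd_bytes:,.0f} DVDs")  # about 42,500 -- comfortably "more than 30,000"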

AWS is storing the 1000 Genomes Project data on S3 and in Amazon Elastic Block Store (EBS) as a publicly available data set, free of charge; researchers pay only for the EC2 and Elastic MapReduce (EMR) services they use for disease research. Until now, they had to download publicly available datasets from government data centers to their own systems, or have the data physically shipped to them on disks. The current aim of the project is to sequence 2,600 individuals from 26 populations around the world. (See http://aws.amazon.com/1000genomes.)
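For illustration, here is a minimal sketch of browsing that data set in place rather than downloading it, using Python's boto3 library; the bucket name "1000genomes" is taken from the AWS page above, and the unsigned-request configuration reflects the fact that the data is public:

# A minimal sketch of browsing the 1000 Genomes data set in place on S3,
# using Python's boto3 library. The bucket name "1000genomes" is taken from
# the AWS page above; requests are left unsigned because the data set is
# public, so no AWS credentials are needed just to look around.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# An anonymous S3 client for public data sets.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List a handful of objects rather than pulling terabytes to local disk;
# heavy analysis would run on EC2 or EMR next to the data instead.
response = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
for obj in response.get("Contents", []):
    print(f"{obj['Key']} ({obj['Size']:,} bytes)")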

The Defense Department will be investing around $250 million a year (with $60 million available for new research projects) in a series of programs that use Big Data in new ways, bringing together sensing, perception and decision support to build autonomous systems that can maneuver and make decisions on their own.

The department also wants a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities and events an analyst can observe.

DARPA, the Defense Advanced Research Projects Agency, is beginning an XDATA program that will invest about $25 million a year for four years to develop computational techniques and software tools for analyzing large volumes of data, both semi-structured (tabular, relational, categorical, metadata) and unstructured (text documents, message traffic).

That means developing scalable algorithms for processing imperfect data in distributed data stores and creating effective human-computer interaction tools to facilitate rapidly customizable visual reasoning for diverse missions.

The XDATA program will employ open source toolkits for software development so users can process large volumes of data in timelines "commensurate with mission workflows of targeted defense applications."
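The emphasis on "imperfect data" matters at this scale: a distributed job can't afford to crash on every malformed record. As a hypothetical illustration (not anything from XDATA itself, and with an invented "id,category,value" record format), a fault-tolerant parse step for semi-structured tabular records might look like this in Python:

# An illustrative sketch (hypothetical, not from XDATA itself) of one piece
# such tools need: a parse step for semi-structured tabular records that
# tolerates imperfect data instead of crashing, so it can run safely inside
# a distributed map phase. The "id,category,value" record format is invented
# for the example.
from typing import Iterable, Iterator, Optional

def parse_record(line: str) -> Optional[dict]:
    """Parse one 'id,category,value' record; return None if it's malformed."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None  # wrong field count: skip rather than abort the whole job
    rec_id, category, raw_value = parts
    try:
        value = float(raw_value)
    except ValueError:
        return None  # non-numeric value field
    return {"id": rec_id, "category": category, "value": value}

def clean_stream(lines: Iterable[str]) -> Iterator[dict]:
    """Yield only well-formed records; imperfect lines are dropped, not fatal."""
    for line in lines:
        record = parse_record(line)
        if record is not None:
            yield record

# Example: two good records survive, two malformed ones are skipped.
sample = ["a1,sensor,3.5", "garbled line", "a2,sensor,n/a", "a3,log,7.0"]
for record in clean_stream(sample):
    print(record)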

The Energy Department will kick in $25 million in funding to establish a Scalable Data Management, Analysis and Visualization (SDAV) Institute under Lawrence Berkeley National Laboratory.

It's supposed to bring together the expertise of six national laboratories and seven universities to develop new tools that help scientists manage and visualize data on the agency's supercomputers, streamlining the processes that lead to discoveries at the agency's research facilities. The department said new tools are needed because the simulations running on its supercomputers have grown in size and complexity.

Lastly, the US Geological Survey will incubate Big Data projects that address issues such as species response to climate change, earthquake recurrence rates and the next generation of ecological indicators.

More Stories By Maureen O'Gara

Maureen O'Gara, the most-read technology reporter of the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara


