
Washington to Put $200 Million into Big Data R&D

To improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

The Obama Administration Thursday unveiled a Big Data Research and Development Initiative that will see six federal agencies and departments put $200 million or more into Big Data R&D.

These new commitments are supposed to improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

Dr. John Holdren, director of the White House Office of Science and Technology Policy, said, "In the same way that past federal investments in information technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education and national security."

It's seen as being that important.

The major initiative is supposed to advance state-of-the-art core technologies, apply them to accelerate the pace of discovery in science and engineering as well as transform teaching and learning, and expand the workforce needed to develop and use Big Data technologies.

It's a response to recommendations by the President's Council of Advisors on Science and Technology, which last year concluded that the federal government was under-investing in Big Data technologies.

As a result, the National Science Foundation (NSF) and the National Institutes of Health (NIH) will be implementing a long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate and serve data to communities; and new approaches to education and workforce development.

As a start, NSF will be funding a $10 million Expeditions in Computing project based at Berkeley to integrate machine learning, cloud computing and crowdsourcing.

It will also provide the first round of grants to support EarthCube, a system that lets geoscientists access, analyze and share information about the planet; issue a $2 million award for a research training group in which undergraduates will apply graphical and visualization techniques to complex data; and provide $1.4 million to support a focused research group of statisticians and biologists working to determine protein structures and biological pathways.

NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical and other data sets related to health and disease.

It said the world's largest set of data on human genetic variation - produced by the international 1000 Genomes Project - is now available on Amazon's cloud. At 200TB - the equivalent of 16 million file cabinets filled with text, or more than 30,000 standard DVDs - the current 1000 Genomes Project data set, derived from 1,700 people, is a prime example of Big Data.

AWS is storing the 1000 Genomes Project data on S3 and in Amazon Elastic Block Store (EBS) as a publicly available data set for free; researchers will pay only for the EC2 and Elastic MapReduce (EMR) services they use for disease research. Previously, they had to download publicly available data sets from government data centers to their own systems, or have the data physically shipped to them on disks. The current aim of the project is to sequence 2,600 individuals from 26 populations around the world. (See http://aws.amazon.com/1000genomes.)
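For researchers who want to try the hosted data set, a minimal sketch along these lines shows how analysis can start without downloading 200TB first. It assumes the boto3 Python library and the publicly readable "1000genomes" S3 bucket in the us-east-1 region that the project page above describes; the bucket name and region here are assumptions, not details drawn from this article.

```python
# Minimal sketch: browsing the 1000 Genomes public data set on S3.
# Assumes boto3 is installed and the public bucket is named "1000genomes"
# in us-east-1, per the AWS public data set page; adjust if that differs.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous (unsigned) access is enough for a publicly readable data set.
s3 = boto3.client("s3", region_name="us-east-1",
                  config=Config(signature_version=UNSIGNED))

# List a handful of objects to see what the bucket contains.
response = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

From there, the point of the AWS arrangement is that the heavy computation runs next to the data on EC2 or EMR rather than on a researcher's own hardware.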

The Defense Department will be investing around $250 million a year (with $60 million available for new research projects) in a series of programs that use Big Data in new ways to bring together sensing, perception and decision support to make autonomous systems that can maneuver and make decisions on their own.

The department also wants a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities and events an analyst can observe.

DARPA, the Defense Advanced Research Projects Agency, is beginning an XDATA program that will invest about $25 million a year for four years to develop computational techniques and software tools for analyzing large volumes of data, both semi-structured (tabular, relational, categorical, metadata) and unstructured (text documents, message traffic).

That means developing scalable algorithms for processing imperfect data in distributed data stores and creating effective human-computer interaction tools to facilitate rapidly customizable visual reasoning for diverse missions.

The XDATA program will employ open source toolkits for software development so users can process large volumes of data in timelines "commensurate with mission workflows of targeted defense applications."
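To make the data-quality point concrete, here is a purely illustrative Python sketch - not XDATA code and not any particular DARPA toolkit - of a map/reduce-style pass over semi-structured records that keeps producing results even when individual records are malformed or missing fields. All names and the sample records are hypothetical.

```python
# Illustrative sketch only: tolerant, parallel counting over semi-structured
# records (JSON lines), the general kind of task the XDATA description names.
import json
from collections import Counter
from multiprocessing import Pool

def count_categories(lines):
    """Map step: tally the 'category' field, tolerating malformed JSON
    and records that lack the field entirely."""
    counts = Counter()
    for line in lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            counts["<malformed>"] += 1
            continue
        counts[record.get("category", "<missing>")] += 1
    return counts

def parallel_count(all_lines, workers=4):
    """Split the input into chunks, count each chunk in parallel,
    then reduce the partial tallies into one."""
    chunk = max(1, len(all_lines) // workers)
    chunks = [all_lines[i:i + chunk] for i in range(0, len(all_lines), chunk)]
    with Pool(workers) as pool:
        partials = pool.map(count_categories, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    sample = ['{"category": "message"}', '{"category": "report"}',
              '{"no_category": true}', "not json at all"]
    print(parallel_count(sample, workers=2))
```

A production system would run this kind of logic across a distributed data store rather than a single process pool, but the principle - degrade gracefully on imperfect records instead of failing - is the same.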

The Energy Department will kick in $25 million in funding to establish a Scalable Data Management, Analysis and Visualization (SDAV) Institute led by Lawrence Berkeley National Laboratory.

The institute is supposed to bring together the expertise of six national laboratories and seven universities to develop new tools that help scientists manage and visualize data on the department's supercomputers, streamlining the processes that lead to discoveries at its research facilities. The department said new tools are needed because the simulations running on its supercomputers have grown in size and complexity.

Lastly, the US Geological Survey will incubate Big Data projects that address issues such as species response to climate change, earthquake recurrence rates and the next generation of ecological indicators.

More Stories By Maureen O'Gara

Maureen O'Gara, the most read technology reporter for the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara
