How well is the Federal Government leveraging Big Data? In this podcast, Michael Nelson, a technology policy analyst at Bloomberg Government, describes recent progress and the obstacles the Feds are running into along the way.
Big Data is the next big thing, expected to trigger $34 billion in worldwide IT spending this year. But harvesting the fruit is tedious work with many steps in the process. Are the big data dollars at your agency being spent in a way that will maximize success? A new Bloomberg study broke down the big data cycle to look at the privacy, security and transparency issues that could be holding you back.
This week PNNL announced that the lab is launching the new Northwest Institute for Advanced Computing in cooperation with the University of Washington. Researchers associated with the institute will work to ensure the next generation of computers and the methods used to run them can address challenges ranging from climate change to energy management.
“Computing has transformed science, engineering and society in remarkable ways,” said Doug Ray, associate director of PNNL’s Fundamental & Computational Sciences Directorate. “But as huge amounts of new data are generated daily by scientific instruments and household electronics, new technologies and approaches are needed to give that information more meaning. Researchers at the Northwest Institute for Advanced Computing will tackle ‘big data’ and help improve the quality of life for many U.S. citizens.”
Located on UW’s campus, the institute will be a center of collaboration where UW and PNNL researchers jointly explore advanced computer system designs, accelerate data-driven scientific discovery and improve computational modeling and simulation. Scientists and engineers at the institute will also train future researchers in modern computational approaches. Read the Full Story.
A team at the Georgia Institute of Technology has received a $2.7 million award from the Defense Advanced Research Projects Agency (DARPA) to develop technology to help address the challenges of Big Data – data sets that are both massive and complex.
The contract is part of DARPA’s XDATA program, a four-year research effort to develop computational techniques and open-source software tools for processing and analyzing data, motivated by defense needs. Georgia Tech was selected to perform research in the area of scalable analytics and data-processing technology.
The team will focus on producing new machine-learning approaches capable of analyzing very large-scale data. Team members will also pursue development of distributed computing methods that can run data-analytics algorithms very rapidly on a variety of systems, including supercomputers, parallel-processing environments and networked, distributed computing systems.
“This award allows us to build on the foundations we’ve already established in large-scale data analytics and visualization,” said Richard Fujimoto, leader of the Georgia Tech team. “The algorithms, tools and other technologies that we develop will all be open source, to allow them to be customized to address new problems arising in defense and other applications.”
The award is part of a $200 million multi-agency federal initiative for big-data research and development. It aims to improve the ability to extract knowledge and insights from the nation’s fast-growing volumes of digital data.
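Scalable data analytics of the kind described above commonly follows a split-apply-combine pattern: partition the data into chunks, compute partial results in parallel, then merge them. The sketch below is a minimal, hypothetical illustration of that pattern using Python’s multiprocessing module; it is not the Georgia Tech team’s or XDATA’s actual software, and the function names are invented for the example.

```python
# A minimal, hypothetical sketch of the split-apply-combine pattern behind many
# distributed data-analytics methods: partition a large dataset into chunks,
# compute partial statistics in parallel worker processes, then merge the
# partial results. Illustrative only, not the Georgia Tech/XDATA code.
from multiprocessing import Pool


def partial_stats(chunk):
    """Return the partial sum and count for one chunk of the data."""
    return sum(chunk), len(chunk)


def parallel_mean(data, workers=4, chunk_size=1_000_000):
    """Compute the mean of `data` by mapping chunks over a pool of workers."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_stats, chunks)  # parallel "apply" step
    total = sum(s for s, _ in partials)             # "combine" step
    count = sum(n for _, n in partials)
    return total / count


if __name__ == "__main__":
    data = list(range(10_000_000))
    print(parallel_mean(data))  # 4999999.5
```

The same partition-and-merge idea scales from a single multicore machine to supercomputers and networked clusters; distributed frameworks simply spread the chunks across many nodes instead of local processes.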
In this video, Shirley Ann Jackson of Rensselaer Polytechnic Institute and Rachel Sterne Haot of New York City Hall look at how government, companies, educational institutions, and the health care sector can harness and manage Big Data. The video was recorded at The Economist’s World in 2013 Festival on December 8, 2012.
In this video from ZDNet, NASA’s Nicholas Skytland discusses the agency’s big data challenges, with more than 10 terabytes of data arriving every day from satellites. With planned missions expected to transmit 24 terabytes per day, NASA is leveraging cloud computing to handle much larger data sets in the future.
Over at GigaOM, Derrick Harris writes that NASA has launched a series of TopCoder challenges designed to find innovative solutions to the government’s big data problems. In the Big Data Challenge, contestants will try to derive value from disparate, incompatible cross-agency data sets.
Although the possibility of influencing big data strategies within some of the country’s most advanced agencies might be novel, crowdsourcing solutions to these types of difficult problems is becoming rather common. TopCoder is a competitive platform for solving application development and design problems, and the Big Data Challenge is just the latest challenge it’s hosting for NASA via the agency’s Center of Excellence for Collaborative Innovation and NASA Tournament Lab. There are also the general-purpose platform InnoCentive and the wildly popular data-science platform Kaggle.
The report sketches out the opportunity — the power of knowing more — to improve health care and reduce costs, put more cars on the road with less congestion, and produce more accurate weather predictions to lessen storm-related property damage and save lives, among other benefits.
In this video from LawTechCamp London 2012, Dr. Jack Conrad and Dr. Daniel Katz discuss areas where the term ‘big data’ will become a key consideration for practitioners, managers, software engineers and start-up businesses.