Drawing on its extensive expertise in HPC and high-volume storage management, SGI is introducing new solutions to perform Big Data analytics faster. Enterprises can now achieve the extreme capacity and scale needed for Big Data storage while managing storage investments more cost-effectively.
A group of researchers from three U.S. universities is leveraging high performance computing (HPC) systems in conjunction with Big Data systems to make a significant step forward in the rapid analysis of financial markets, in effect reducing processing time from hours or days down to minutes. The researchers from the University of Illinois, the Pittsburgh Supercomputing Center at […]
Big Data science has long relied on supercomputers to do the immense amounts of number-crunching and data-sifting that would otherwise be impossible or take a very, very long time. At Brookhaven Labs, 23 racks containing thousands of processors give scientists the computing power they need. Our supercomputers also help us dig through the massive amounts of […]
Flavio Villanustre writes that the convergence of HPC and Big Data applications is being driven by machine learning.
In this podcast, Deepak Jeevan Kumar from VC firm General Catalyst Partners describes his efforts to help entrepreneurs with disruptive ideas in big data, cloud computing, data center infrastructure, cyber-security, and clean energy. Deepak Jeevan Kumar has been with General Catalyst since 2010, first in Boston and later in the firm’s Palo Alto office. […]
In this video from the HPC Advisory Council Europe Conference, Roland Fehrenbacher from Q-Leap Networks presents: Qlustar – A full-fledged HPC/Storage Cluster OS.
In this video from the DDN User Group at ISC’13, James Coomer from DataDirect Networks presents: Massively-Scalable Platforms and Solutions Engineered for the Big Data and Cloud Era. Check out more from the International Supercomputing Conference at our ISC’13 Video Gallery.