Sensors, agents and an Internet of Things are all producing data, all of the time. It would be a vast understatement to say that the CIA has experience in acquiring, handling and analyzing big quantities of data. In this talk, the CTO of the CIA will talk about the scale of the problems his team deals with now, the coming inflection point in the increase in data, the grand challenges we face and why an emphasis on analytics is critical for the future. This is a talk not to be missed.
In this video, author Alistair Croll explains the concepts of his book, Lean Analytics. Croll strongly advises startups to pick the one metric that matters the most and to focus on it.
This talk was hosted by MaRS, a Canadian organization that provides resources — people, programs, physical facilities, funding and networks — to ensure that critical innovation happens.
Addison Snell will present some of the top insights from recent market intelligence studies from Intersect360 Research, including forward-looking views of the vertical markets, new applications, and technologies with the best prospects for growth in 2012 and beyond. The view from Intersect360 Research will include applications in both High Performance Technical Computing (HPTC) and High Performance Business Computing (HPBC), with an emphasis on the opportunities for HPC technologies in emerging Big Data applications. The evolving industry dynamics around accelerators, file systems, and InfiniBand will also be discussed.
In this video from the 2013 National HPCC Conference, Rich Brueckner from inside-BigData moderates a panel discussion on How to Talk to Your CFO about HPC and Big Data.
- John C. Morris – Pfizer
- Dr. George Ball – Raytheon
- Henry Tufo – University of Colorado, Boulder
- Dr. Flavio Villanustre – LexisNexis
As members of the HPC community, we spend a good share of our time sharing our work and best practices with our colleagues. But how do we communicate the business value of high performance computing and Big Data analytics to CFOs who have little affinity for discussions of things like cores, Hadoop, and MPI? In this panel discussion, experts in Big Data and HPC will come together to share best practices and communication strategies that have proven effective when talking to CFOs and other C-level executives.
In this video from the 2013 National HPCC Conference, Bob Feldman moderates a panel discussion entitled: Big Systems, Big Data, Better Products.
- Devin Jensen – Altair
- Rene Copeland – SGI
- Dr. Stephen Wheat – Intel
- Sanjay Umarji – HP
How will enormous data sets and an endless stream of ever-more granular variables drive supercomputing in the coming years? Will it be like a dust storm that buries us, or flood waters we can redirect and manage? How will it alter the evolution of architecture and subsystems? How will it change computer science education, development tools and job descriptions? And will gargantuan data form a barrier to our evolution to Exascale and beyond by sapping the shrinking resources for funding and creativity?
In this video from the 2013 National HPCC Conference, Bradford Spiers from Bank of America presents: Big Data in Banking.
To some people, Big Data in Banking might bring to mind calls from their credit card company when a charge seems unusual. To others, it might mean the calculations behind low-latency trading. Initially, it seemed to mean just simple Hadoop. Now we see specialization according to the problem we are solving. This talk will discuss different types of Big Data seen in Banking and how one might tie them together to form viable workflows that solve our business and infrastructure challenges.
HPCC Systems from LexisNexis Risk Solutions works with clients in various industries to manage different types of risk by helping them derive insight from massive data sets. To do this, we have developed our High Performance Computing Cluster (HPCC) technology, making it possible to process and analyze complex, massive data sets in a matter of seconds.
Despite the much-discussed power of data, there are roles for people to play in big data projects. Data increasingly influences companies’ decision making processes, but several speakers hit on the notion that people should be involved in big data storage and analysis. It all starts with a human question. Before machines generate answers, employees from many departments should feel empowered to ask good questions of data, said John Sotham, vice president of finance at BuildDirect.
The internet, sensors and high performance computing are some of the top Big Data producers. Recently, there has been increased focus on extracting more value from this generated data. Analysis of Big Data sets spans a spectrum, from "looking for a needle in a haystack" on one end to "looking for relationships between hay in a stack" on the other. We will discuss the architectural platforms and tools suitable for different parts of this spectrum.
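The two ends of that spectrum can be illustrated with a toy sketch: a rare-event filter (the "needle") versus a grouping that exposes relationships among ordinary records (the "hay"). The record layout and field names below are invented for illustration, not from the talk.

```python
from collections import defaultdict

# A toy event log standing in for a massive data set.
records = [
    {"id": 1, "user": "alice", "event": "login"},
    {"id": 2, "user": "bob", "event": "error"},
    {"id": 3, "user": "alice", "event": "purchase"},
]

# "Needle in a haystack": scan for a rare event of interest.
needles = [r for r in records if r["event"] == "error"]

# "Relationships between hay": group ordinary records to expose
# structure, here the sequence of events per user.
by_user = defaultdict(list)
for r in records:
    by_user[r["user"]].append(r["event"])
```

At scale, the first pattern maps naturally onto streaming filters, while the second resembles the shuffle/join stage of platforms like Hadoop, which is why the two ends of the spectrum tend to favor different architectures.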
In this video from the 2013 National HPCC Conference, Wolfgang Gentzsch presents: EUDAT and Big Data in Science.
Big data science emerges as a new paradigm for scientific discovery that reflects the increasing value of observational, experimental and computer-generated data in virtually all domains, from physics to the humanities and social sciences. Addressing this new paradigm, the EUDAT project is a European data initiative that brings together a unique consortium of 25 partners — including research communities, national data and high performance computing (HPC) centers, technology providers, and funding agencies — from 13 countries. EUDAT aims to build a sustainable cross-disciplinary and cross-national data infrastructure that provides a set of shared services for accessing and preserving research data. The design and deployment of these services is being coordinated by multi-disciplinary task forces comprising representatives from research communities and data centers.