In this slidecast, Jeff Denworth from DDN describes the company’s new hScaler storage system — the World’s First Enterprise Apache Hadoop Appliance.
“DDN has developed a Hadoop solution that is all about time to value: It simplifies rollout so that enterprises can get up and running more quickly, provides typical DDN performance to accelerate data processing, and reduces the amount of time needed to maintain a Hadoop solution,” said Dave Vellante, Chief Research Officer, Wikibon.org. “For enterprises with a deluge of data but a limited IT budget, the DDN hScaler appliance should be on the short list of potential solutions.”
In this slidecast, Brian Christian from Zettaset presents: Examining Hadoop as a Big Data Risk in the Enterprise.
“While the open source framework has enabled Hadoop to grow and expand organically, business and government enterprise organizations face deployment and management challenges with Hadoop. Hadoop’s core specifications are still being developed by the Apache community and, thus far, do not adequately address enterprise requirements such as robust security and support for regulatory compliance mandates like HIPAA and SOX.”
In this slidecast, Vishnu Bhat from Infosys presents an overview of the company’s all-new BigDataEdge platform for insight management.
“Enterprises today cannot afford to spend an inordinate amount of time making sense of the data deluge that surrounds them,” said Vishnu Bhat, VP of Cloud at Infosys. “Infosys BigDataEdge draws upon our deep research & development capabilities and proven expertise in Big Data and analytics to help clients turn data into revenues faster. This unique platform is already enabling ten global organizations to develop actionable insights in a matter of days and act on them from day one.”
In this slidecast, Gary Tyreman from Univa discusses the new Univa Grid Engine for ARMv7 Release. As HPC and Big Data infrastructure design continues to converge, this platform could be a stepping stone to the future.
“Driven by the demand for new datacenter services to support mobile and cloud computing, ARM will continue to gain inroads into the datacenter server market because of the low-power, energy-efficient design of SoCs based on ARM’s technology,” said Karl Freund, VP of Marketing at Calxeda. “As enterprises shift toward highly scalable solutions such as Calxeda’s, a key enabling technology is intelligent workload management, and we have partnered with Univa to provide our customers with a great solution.”
In this slidecast, Floyd Christofferson from SGI describes how the combination of the company’s Infinite Storage platform and Scality Ring technology provides a new, unified scale-out storage system. The solution is designed to deliver both extreme scale and high performance, allowing customers to manage massive stores of unstructured data.
“Scale-out object-based solutions are designed to address this particular set of problems by minimizing manual intervention for storage expansions, migrations, and recoveries from storage system failure,” said Ashish Nadkarni, research director, Storage Systems at IDC. “Such a dispersed, fault-tolerant architecture enables IT organizations to more efficiently absorb data growth in a manner that is predictable for the long term.”
In this podcast, the Radio Free HPC team looks at a new book called The Human Face of Big Data by Rick Smolan. The book details the ways in which Big Data affects our daily lives and predicts the ways it will transform our future.
The big-picture overview the author presents is that of our planet becoming a giant nervous system in which we, its inhabitants, are the nerve endings. Or as his ten-year-old son put it, “Isn’t this like growing another eye?” Indeed it is: with all of this sensory input, we are beginning to see a third dimension.
It’s also one of those we-don’t-know-what-we-don’t-know propositions. Just as we could never have imagined how transformative the Internet would be, we can’t predict where Big Data will take us. We don’t even know, really, how to maximize the data we’re collecting right now.
The Human Face of Big Data is available — well, probably lots of places — but here’s the Amazon link, because they have so much data on us already.
In this slidecast, Eric Barton, Lead Architect for Intel’s High Performance Data Division, presents a progress update on the Fast Forward I/O & Storage program.
Back in July 2012, Whamcloud was awarded the Storage and I/O Research & Development subcontract for the Department of Energy’s FastForward program. Shortly afterward, the company was acquired by Intel. The two-year contract scope includes key R&D necessary for a new object storage paradigm for HPC exascale computing, and the developed technology will also address next-generation storage mechanisms required by the Big Data market.
The subcontract incorporates application I/O expertise from the HDF Group, system I/O and I/O aggregation expertise from EMC Corporation, object storage expertise from DDN, and scale testing facilities from Cray, teamed with file system, architecture, and project management skills from Whamcloud. All components developed in the project will be open sourced and benefit the entire Lustre community.
This is a fascinating presentation for those interested in how an Exascale system might handle data, and the prototype that comes out of it may well represent the roadmap to the future of supercomputing.
How well is the Federal Government leveraging Big Data? In this podcast, Michael Nelson, a technology policy analyst from Bloomberg Government describes recent progress and obstacles the Feds are running into along the way.
Big Data is the next big thing, expected to trigger $34 billion in worldwide IT spending this year. But harvesting the fruit is tedious work with many steps in the process. Are the big data dollars at your agency being spent in a way that will maximize success? A new Bloomberg study breaks down the big data cycle to examine the privacy, security, and transparency issues that could be holding you back.