In this video from the HPC Advisory Council Switzerland Conference, D.K. Panda from Ohio State University presents: Accelerating Big Data with Hadoop (HDFS, MapReduce and HBase) and Memcached. Download the slides (PDF).
In this video, Ph.D. candidate Jerome Mitchell from Indiana University details the benefits of Hadoop and offers a hands-on session illustrating its uses. You can also see Part 2, Part 3, Part 4, and Part 5 of this lecture.
When Intel announced its Intel Distribution for Apache Hadoop Software earlier this week, the company bolstered its offering by including the full range of enterprise data integration and analytics software from the Santa Clara-based Pentaho Corporation.
Under the terms of an OEM licensing agreement, Intel will integrate all the software applications in the Pentaho Business Analytics platform, including data mining, interactive reporting, analysis, data discovery/visualizations, dashboards, predictive analytics, and data integration, with full support for big data.
“Intel embedding Pentaho into its big data analytics solution means that customers can more easily integrate Hadoop within their enterprise data environments while also providing exceptional analytics capabilities to a wider set of business users,” said Pentaho Chief Strategy Officer Richard Daley. “We are very proud to be Intel’s technology partner and expect it will bring valuable new opportunities for customers looking to leverage the disruptive power of big data.”
Intel is billing the Intel Distribution for Apache Hadoop software as an important open platform for next-gen analytics. The solution features up to a 30x boost in Hadoop performance and is optimized (no surprises here) for Intel Xeon processors, Intel SSD storage, and Intel 10GbE networking. The integration of the Pentaho software adds a nice set of additional capabilities to the Intel offering.
Read the Full Story
In this video, Intel’s Boyd Davis launches the new Intel Distribution for Apache Hadoop.
“People and machines are producing valuable information that could enrich our lives in so many ways, from pinpoint accuracy in predicting severe weather to developing customized treatments for terminal diseases,” said Boyd Davis, vice president and general manager of Intel’s Datacenter Software Division. “Intel is committed to contributing its enhancements, made to use all of the available computing horsepower, to the open source community, providing the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data.”
“DDN has developed a Hadoop solution that is all about time to value: It simplifies rollout so that enterprises can get up and running more quickly, provides typical DDN performance to accelerate data processing, and reduces the amount of time needed to maintain a Hadoop solution,” said Dave Vellante, Chief Research Officer, Wikibon.org. “For enterprises with a deluge of data but a limited IT budget, the DDN hScaler appliance should be on the short list of potential solutions.”
In this webcast, Cloudera founder Christophe Bisciglia and O’Reilly author Tom White will provide an introduction to Hadoop/MapReduce, the open source project that allows organizations to process, store and analyze massive application datasets.
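The MapReduce model the webcast introduces can be sketched in plain Python. This is a hypothetical word-count illustration of the map and reduce phases, not Cloudera's or Hadoop's actual code; the function names and sample documents are invented for the example:

```python
from collections import defaultdict

def map_phase(document):
    """Map step: emit a (word, 1) pair for every word in one input split."""
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce step: sum the counts grouped by key (the word)."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# Two tiny "documents" stand in for massive datasets spread across a cluster.
docs = ["Hadoop stores data", "Hadoop processes data in parallel"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
print(reduce_phase(pairs))
```

In a real Hadoop job the map tasks run in parallel across the cluster, a shuffle phase groups pairs by key, and reduce tasks aggregate each group; the sketch collapses those stages into two in-process functions.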
In this slidecast, Brian Christian from Zettaset presents: Examining Hadoop as a Big Data Risk in the Enterprise.
“While the open source framework has enabled Hadoop to logically grow and expand, business and government enterprise organizations face deployment and management challenges with Hadoop. Hadoop’s core specifications are still being developed by the Apache community and, thus far, do not adequately address enterprise requirements, such as support for robust security and regulatory compliance mandates like HIPAA and SOX.”
In this video, Dr. Brent Welch from Panasas presents: High Performance NAS for Hadoop.
In this video, D.K. Panda from Ohio State University presents: Accelerating Big Data with Hadoop and Memcached. The presentation was recorded at the HPC Advisory Council Stanford Conference 2013. Download the slides (PDF).
This week Xyratex announced a partnership with analytics leader Pentaho Corporation to develop the industry’s first fully integrated Big Data analytics and scalable storage solution. The combined offerings, which will be released later this year, will help organizations decrease the amount of hardware and software required to complete major data analysis projects and unlock the potential of Big Data while delivering lower total cost of ownership (TCO) to end users.
“This tight integration of the analytics engine and the data storage into the same solution will remove performance bottlenecks, reduce deployment complexity, simplify management and ease the scaling of an organization’s big data infrastructure, enabling our customers to garner valuable insights into their business sooner,” said Ken Claffey, senior vice president of the ClusterStor business at Xyratex. “Today, in collaboration with our partners, we’re helping end users achieve best-in-class performance, reliability and scalability – including implementing the fastest data storage system in the world. We’re confident that the combined power of our ClusterStor data storage with Pentaho’s leading analytics will re-define what’s possible with Big Data.”
Read the Full Story.