How well is the Federal Government leveraging Big Data? In this podcast, Michael Nelson, a technology policy analyst at Bloomberg Government, describes recent progress and the obstacles the Feds are running into along the way.
Big Data is the next big thing, expected to trigger $34 billion in worldwide IT spending this year. But harvesting the fruit is tedious work with many steps in the process. Are the big data dollars at your agency being spent in a way that will maximize success? A new Bloomberg study broke down the big data cycle to look at the privacy, security, and transparency issues that could be holding you back.
In this slidecast, CEO Bill Bain from ScaleOut Software presents: In-Memory Data Grids Enable Real-Time Analysis.
“ScaleOut Software is a pioneer and leader in data grid software. Since our first products shipped in January 2005, we have consistently developed leading-edge technologies that help our customers solve scalability and performance challenges and gain competitive advantages for their businesses.”
“From preparing and modeling big data stores for analysis, to data visualization, exploration and predictive analysis, Pentaho Business Analytics allows you to harvest the meaningful patterns buried in large volumes of structured and unstructured data. Analyzing big data sets gives you the power to identify new revenue sources, develop loyal and profitable customer relationships, and run your overall organization more efficiently and cost-effectively.”
In this slidecast, Richard Treadway and Rich Seger from NetApp discuss the company’s storage solutions for Big Data and HPC. The company’s HPC solutions for Lustre support massive performance and storage density without sacrificing efficiency.
In this slidecast, Loggly CEO Charlie Oppenheimer describes why the company offers the world’s most popular cloud-based log management service.
Loggly has a rich set of features that makes log management fun and easy, and being 100% cloud-based means our focus is on scale and speed so you can focus on your application, not hosting and hardware. While demand for storing all those logs is accelerating along with all the data being generated, the technology behind the storage and processing of data also continues to accelerate. Within a few months’ time, the technology we are developing at Loggly will give companies a way to peek into these large volumes of log data – where they couldn’t before – and allow them to see exactly what their users are doing with all that big data.
In this podcast (audio only with still image) Intel CIO Kim Stevenson discusses how Intel IT is looking to leverage predictive analytics to deal with the sea of data out there, and how this is already creating new opportunities for the organization. Download the MP3.
In this podcast, the Radio Free HPC team looks at Big Data and analytics. More specifically, the guys talk about a couple of examples of when over-reliance on analytics leads to bad outcomes. The first deals with some high school kids who were allegedly found to be cheating by a plagiarism software program and the second looks at how a major bank may have lost up to $9 billion due to lack of proper controls and blind faith in existing analytic systems.
In this slidecast, Edouard Servan-Schreiber from 10gen presents an overview of MongoDB 2.2.
“MongoDB 2.2 has been a huge effort to make the database even easier to use and operate,” says Eliot Horowitz, 10gen co-founder and CTO. “We think that moving to NoSQL should make you a more productive software engineer, and features like the aggregation framework deliver on that promise.”
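To make the aggregation framework concrete, here is an illustrative sketch (not from the slidecast): a pipeline of the kind MongoDB 2.2 accepts, which filters documents with `$match` and totals values per group with `$group`/`$sum`. The collection name, fields, and sample data are hypothetical; with a live server you would pass `pipeline` to PyMongo’s `collection.aggregate()`. A tiny in-memory stand-in shows what those two stages compute.

```python
# Hypothetical pipeline: total order amounts per customer for shipped orders.
# On a real deployment you would run: db.orders.aggregate(pipeline)
pipeline = [
    {"$match": {"status": "shipped"}},            # keep only shipped orders
    {"$group": {"_id": "$customer",               # group by customer field
                "total": {"$sum": "$amount"}}},   # sum amounts in each group
]

def run_pipeline(docs):
    """In-memory stand-in for the $match + $group stages above."""
    totals = {}
    for d in docs:
        if d["status"] == "shipped":              # $match stage
            totals[d["customer"]] = totals.get(d["customer"], 0) + d["amount"]  # $group/$sum
    return totals

orders = [
    {"customer": "a", "status": "shipped", "amount": 10},
    {"customer": "a", "status": "shipped", "amount": 5},
    {"customer": "b", "status": "pending", "amount": 7},
]
print(run_pipeline(orders))  # → {'a': 15}
```

The point of the framework is exactly this kind of server-side filtering and grouping, which previously required map-reduce jobs or client-side code.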
In this slidecast, 1010data CEO Sandy Steier presents: Making the Biggest Big Data Easy. The company has just relaunched its website with a new look and a live demo.
“For well over a decade, 1010data has pushed the limits of analytics on large amounts of data, including “Big Data”. From routine reporting to advanced analytics, the 1010data system allows businesses like yours to hone their tactics and strategy, while reducing technology overhead, costs and risk.”
In this slidecast, Ofer Bengal from Garantia Data presents an overview of the company’s fully-automated cloud service for hosting Memcached and Redis.
The Garantia Data Memcached Cloud was built from the ground up on a true Memcached cluster architecture with a highly reliable infrastructure. We distribute your dataset across multiple shards in multiple nodes of our Memcached cluster and constantly monitor your shards to ensure optimal performance. When needed, we add more shards and nodes to your dataset so it can scale continuously and without limit. We have also added replication, data persistence, backup, and auto-failover capabilities to our Memcached cluster to guarantee your dataset is always up and running.
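The sharding scheme described above can be sketched in a few lines. This is a minimal illustration under assumed details, not Garantia Data’s actual implementation: each key is hashed and mapped to one of N shard nodes, so the dataset spreads across the cluster and more shards can be added as it grows (the shard names here are hypothetical).

```python
import hashlib

def shard_for(key, shards):
    """Map a cache key to one shard by hashing it (simple modulo placement)."""
    h = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return shards[h % len(shards)]

# Hypothetical shard nodes in a Memcached-style cluster.
shards = ["shard-0", "shard-1", "shard-2"]

# Every key deterministically lands on exactly one shard, so clients
# always know where to read and write it.
placement = {k: shard_for(k, shards) for k in ["user:1", "user:2", "session:9"]}
```

Production systems typically use consistent hashing rather than plain modulo placement, so that adding a shard remaps only a small fraction of keys instead of nearly all of them.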