Intel’s Boyd Davis Talks Predictive Analytics and March Madness

“Intel’s goal is to encourage more innovative and creative uses for data as well as to demonstrate how big data and analytics technologies are impacting many facets of our daily lives, including sports. For example, coaches and their staffs are using real-time statistics to adjust games on the fly and throughout the season. From intelligent cameras to wearable sensors, a massive amount of data is being produced that, if analyzed in real time, can provide a significant competitive advantage. Intel is among those making big data technologies more affordable, available, and easier to use for everything from helping develop new scientific discoveries and business models to even gaining the upper hand on good-natured predictions of sporting events.”

Interview: Active Archives for Managing and Storing Big Data

Floyd Christofferson

“Active archives are ideal for organizations that face exponential data growth or regularly manage high-volume unstructured data or digital assets. Target markets include life sciences, media and entertainment, education, research, government, financial services, oil and gas, and telecommunications, as well as general IT organizations requiring online data archive options.”

How MPSTOR Delivers Software Defined Storage Across Multiple Services

MPSTOR integrates virtualization into the software stack to provide better, more robust infrastructure management. “Orkestra enables automated delivery of ‘Anything as a Service,’ allowing cloud operators to create and deliver cost-effective, differentiated services.”

Interview: Caserta Brings New Business Insights via Data Intelligence

“Caserta Concepts is a leading big data innovation and implementation services organization. Among our areas of specialization is the development of solutions to meet the volume and variety of data and overall speed demands of the financial services sector.”

Interview: How Anaplan Delivers Innovation in Real-Time Data Modeling

Anaplan CEO Fred Laluyaux

“The Anaplan revolution is to provide a big-data engine for business users, removing the need to work with data scientists. The ability to scale your data – 100 billion cells in one model, with 1 billion items in a list – will prove to be the key to proliferation, so long as the data is immediate, usable, consumable via apps, and easy to modify. With Anaplan, business users can build a model with 500 million cells, use it for one hour for a specific purpose, and then throw it away and start on a new one if they want! Ease of use is key. This is the future of enterprise big data.”

Interview: Carpathia Leverages Hadoop and the Cloud to Host Big Data Solutions

“As a cloud operator and managed hosting provider, we’re seeing demand from a wide range of companies that are interested in leveraging the scalability and efficiency benefits of the cloud to better manage, analyze, and extract value from Big Data.”

Interview: Spectra Logic and Deep Storage Solutions for Massive Data

Kevin Dudak

“Deep storage, and tape library-based storage in general, benefit organizations that are looking to incorporate low-cost, high-density, scalable storage into their fast-growth data environments. Industries that recognize the value and regularly rely on tape storage include education, federal and state government, finance, life sciences, media and entertainment, oil and gas exploration, and Web 2.0, among others.”

How Kiuwan Measures and Analyzes the Quality of Your Code

“Kiuwan does automatic code review based on static analysis in the cloud. It is a SaaS model: as opposed to other software quality solutions based on code analysis, which are on-premise and very expensive to implement, Kiuwan is an affordable solution in the cloud. We are the Salesforce.com of software quality.”

Tarmin Defines and Refines Data Defined Storage

“We are continuing to align GridBank with the growing demands of the information economy to deliver a future-proof, data-centric solution for organizations.”

Adaptive Computing Introduces Big Workflow to Accelerate Insights

Jill King

“A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage. We are confident that Big Workflow will enable enterprises across all industries to leverage big data that inspires game-changing, data-driven decisions.”