Interview: NetApp and Policy-Based Data Management for the Enterprise

“Key industries including healthcare, retail, telecommunication, media and entertainment, financial services and the government leverage NetApp solutions to manage large amounts of content, expand technology infrastructures without disrupting operations, and improve data-intensive workflows.”

Interview: A3CUBE Sets Sights on the Emerging Arena of High Performance Data

“Our architecture permits tens of thousands of SSDs to be connected together and accessed in a parallel and concurrent way using direct mapping of memory accesses from a local machine to the I/O bus and memory of a remote machine. This feature allows for data transmission between local and remote system memories without the use of operating system services. It also enables a unique linear scalability of SSDs bandwidth and IOPS and consequently allows computation and data access to scale together linearly. This totally eliminates the bottleneck in bandwidth or IOPS and provides optimal dimensions of performance, capacity, and computation with an unmatched flexibility at a fraction of the costs.”
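
The quote describes OS-bypass access to remote memory: once a remote node's memory is mapped into the local address space, ordinary loads and stores reach it without a system call per access. Below is a minimal C sketch of that general idea, not A3CUBE's actual interface (which is proprietary); the device path /dev/remote_mem0 and the zero offset are hypothetical stand-ins for a driver that exposes a remote memory window.

```c
/* Minimal sketch of OS-bypass remote memory access via mmap().
 * Assumptions: a hypothetical character device /dev/remote_mem0
 * exposes a window of a remote node's memory; offsets are illustrative. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
    /* Open the (hypothetical) device backing the remote memory window. */
    int fd = open("/dev/remote_mem0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    size_t len = 4096;
    /* Map the remote window into our address space. After this call,
     * plain loads and stores hit remote memory with no syscall per access. */
    volatile uint64_t *remote = mmap(NULL, len, PROT_READ | PROT_WRITE,
                                     MAP_SHARED, fd, 0);
    if (remote == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    remote[0] = 0xdeadbeefULL;        /* store lands in remote memory  */
    uint64_t v = remote[1];           /* load comes from remote memory */
    printf("read back: %llx\n", (unsigned long long)v);

    munmap((void *)remote, len);
    close(fd);
    return 0;
}
```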

What Can Hadoop Do for Your Big Data Strategy?

For all its agility in handling big data, Hadoop by itself is not a big data strategy.

Big Data Could Add $325 Billion to Economy by 2020

McKinsey has issued a 146-page report this month, "Game changers: Five opportunities for US growth and renewal." According to the report, big data will add $155 billion to $325 billion to the US economy by 2020, representing 0.8 to 1.7 percent of GDP. Big-data analytics is a productivity tool. Data is captured everywhere from data […]

Big Data for Big Careers – P&G Quads Down on Analytics Expertise

By Dan Olds of Gabriel Consulting. Procter & Gamble is going to quadruple its staff of business analytics experts in the near future, despite the fact that the company is reducing spending in other categories, including significant non-manufacturing layoffs and a hefty 30% cut ($1 billion) in annual IT spending. So […]

Labor: The Big Data Bottleneck

Paul Sonderegger of Oracle Endeca writes that the biggest bottleneck in making big data productive is labor. In a big data world, data modeling, integration, and performance tuning are governors of data use because they rely on relatively slow manual processes performed by relatively expensive specialists. In an ironic twist, the substitution of computing capital […]