
Interview: ParStream Analyzes Billions of Records in Less than a Second


“ParStream is a columnar database with hybrid in-memory storage and a shared-nothing architecture. Based on patented algorithms for indexing and compressing data, ParStream uniquely combines three core features: analyzing billions of records in sub-second time, continuous fast import at up to 1 million records per second, and a flexible, interactive analytics engine with a SQL interface.”
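To make the columnar idea in that description concrete, here is a minimal sketch of why storing values by column lets a filter scan only the data it needs. The table and field names are invented for illustration; this is not ParStream's engine, which layers compression and indexing on top of the same basic layout.

```python
# Minimal sketch of the columnar idea: values are stored per column,
# so a filter scans one contiguous array instead of whole rows.
# Illustrative only; not ParStream's implementation.

# Row-oriented layout: every filter touches every field of every row.
rows = [
    {"sensor_id": 1, "temp": 21.5, "status": "ok"},
    {"sensor_id": 2, "temp": 38.2, "status": "warn"},
    {"sensor_id": 3, "temp": 19.9, "status": "ok"},
]

# Column-oriented layout of the same data.
columns = {
    "sensor_id": [1, 2, 3],
    "temp": [21.5, 38.2, 19.9],
    "status": ["ok", "warn", "ok"],
}

# SELECT sensor_id FROM readings WHERE temp > 30
# touches only the "temp" and "sensor_id" columns.
matches = [i for i, t in enumerate(columns["temp"]) if t > 30]
print([columns["sensor_id"][i] for i in matches])  # -> [2]
```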

Interview: NetApp and Policy-Based Data Management for the Enterprise


“Key industries including healthcare, retail, telecommunications, media and entertainment, financial services, and government leverage NetApp solutions to manage large amounts of content, expand technology infrastructures without disrupting operations, and improve data-intensive workflows.”

Interview: A3CUBE Sets Sights on the Emerging Arena of High Performance Data


“Our architecture permits tens of thousands of SSDs to be connected together and accessed in a parallel and concurrent way using direct mapping of memory accesses from a local machine to the I/O bus and memory of a remote machine. This feature allows data to be transmitted between local and remote system memories without the use of operating system services. It also enables uniquely linear scalability of SSD bandwidth and IOPS, which in turn allows computation and data access to scale together linearly. This eliminates bandwidth and IOPS bottlenecks and provides optimal dimensions of performance, capacity, and computation with unmatched flexibility at a fraction of the cost.”
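The key mechanism in that description is mapping remote memory into a local address space, so ordinary loads and stores replace per-access operating system calls. Here is a rough Python sketch of that general idea; the device path is hypothetical, and a real system like A3CUBE's relies on vendor drivers or RDMA-capable interconnects rather than a plain file mapping.

```python
# Minimal sketch of the memory-mapping idea the quote describes:
# once a region is mapped into the process address space, reads and
# writes go through it without a system call per access.
# The device node below is hypothetical, for illustration only.
import mmap
import os

fd = os.open("/dev/remote_mem0", os.O_RDWR)  # hypothetical device node
region = mmap.mmap(fd, 4096)                 # map 4 KiB of remote memory

region[0:4] = b"ping"        # plain memory write, no write() call
reply = bytes(region[0:4])   # plain memory read, no read() call

region.close()
os.close(fd)
```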

Interview: Nexenta Seeks to Do Away with MESS


“Software-defined storage is a fundamental component of the software-defined data center – the next step in the evolution of virtualization and cloud computing. In its simplest form, software-defined storage is about leveraging software-only solutions to address storage challenges ranging from vendor lock-in and cost to performance, security, scale, and manageability. A complete SDS portfolio enables customers both to optimize existing infrastructure and to fully replace legacy configurations with industry-standard hardware powered by software.”

Interview: Adaptive Computing Brings Big Workflow to the Data Center


“Our thought process was that Big Data + a better Workflow = Big Workflow. We coined it as an industry term to denote a faster, more accurate and more cost-effective big data analysis process. It is not a product or trademarked name of Adaptive’s, and we hope it becomes a common term in the industry that is synonymous with a more efficient big data analysis process.”

Interview: Wise.io Sees Machine Learning Throughout the Entire Customer Lifecycle


“Our main differentiator from other machine-learning companies is that we’re focused not just on high-performance algorithms, but on delivering an end-to-end application for business users. While we continue to push the boundaries of cutting-edge machine learning technology, we made an early decision not to get sucked into the ‘algorithms arms race.’ We hold a fundamental belief that the best analytics technologies will fail unless they can be implemented in a timeframe relevant to the business and interpreted by the ultimate decision makers.”

Interview: Datameer Brings End-to-End Data Analytic Solutions Built on Hadoop


“Datameer is all about providing a self-service, end-to-end experience for big data analytics on Hadoop. From data integration to analytics to visualization, we are wizard-led and point-and-click. Most recently we announced our Smart Analytics module, which allows business users to apply data mining algorithms through a drag-and-drop UI. These new capabilities complement what data scientists are doing and enable business analysts to take advantage of advanced algorithms without involving IT.”

Interview: Glassbeam Joins Forces with HDS for Complex Infrastructure Management


“Machine logs contain both simple and complex data – some logs contain time-stamped data (e.g., syslogs) that are tactical events or errors used by sysadmins to troubleshoot IT infrastructure. But other logs have more complex, unstructured or multi-structured text with sections on configuration info, statistics, and other non-time-stamped data. To make sense of the data in these logs, one needs a powerful language and processing engine to give the information meaning and structure. Once structure is defined, complex analytics and trend reporting can be performed.”
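As a toy illustration of what giving a log line “meaning and structure” involves, here is a minimal Python sketch. The log format and field names are invented for the example; this has nothing to do with Glassbeam's own parsing language.

```python
# Minimal sketch of imposing structure on a time-stamped log line.
# The line format and field names are invented for illustration.
import re

LINE = "Mar 12 08:45:01 storage-node3 raid[214]: ERROR disk 7 read latency 412ms"

PATTERN = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+ \d{2}:\d{2}:\d{2}) "
    r"(?P<host>\S+) "
    r"(?P<process>\w+)\[(?P<pid>\d+)\]: "
    r"(?P<level>\w+) (?P<message>.*)"
)

match = PATTERN.match(LINE)
if match:
    record = match.groupdict()
    print(record["host"], record["level"], record["message"])
    # Once every line is a record like this, trend reporting becomes
    # a query over fields instead of a grep over raw text.
```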

Interview: Splunk Brings Machine Data to Higher Education


“Splunk Enterprise is a platform for machine data. The technology delivers powerful, fast analytics that quickly unlock the value of machine data for IT and other users throughout an organization. In short, it’s a simple, effective way to collect, analyze, and secure the massive streams of machine data generated by all IT systems and technology infrastructure.”

Intel’s Boyd Davis Talks Predictive Analytics and March Madness


“Intel’s goal is to encourage more innovative and creative uses for data, as well as to demonstrate how big data and analytics technologies are affecting many facets of our daily lives, including sports. For example, coaches and their staffs are using real-time statistics to make adjustments on the fly during games and throughout the season. From intelligent cameras to wearable sensors, a massive amount of data is being produced that, if analyzed in real time, can provide a significant competitive advantage. Intel is among those making big data technologies more affordable, more available, and easier to use for everything from developing new scientific discoveries and business models to gaining the upper hand in good-natured predictions of sporting events.”