Interview: Why Denodo Believes Everyone Needs Data Virtualization

Suresh Chandrasekaran

“The Denodo Platform delivers the capability to access any kind of data from anywhere it lives without necessarily moving it to a central location like a data warehouse. It then exposes that data to various users and analytical/business applications as virtual data services in a way that is meaningful to the users, in real time, with high performance, using caching and minimal data movement only as needed. That is data virtualization in a nutshell.”
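As a loose illustration of that idea (this is not Denodo's API; every name below is hypothetical), the sketch defines a virtual view that fetches from its underlying sources only when queried, caching results on request rather than copying the data into a central store.

```python
# Minimal sketch of the data-virtualization idea: a "virtual view" that pulls
# rows from underlying sources on demand instead of copying them into a
# warehouse. All names here are hypothetical, not Denodo APIs.
from typing import Callable, Dict, List


class VirtualView:
    def __init__(self, sources: Dict[str, Callable[[], List[dict]]]):
        self.sources = sources          # source name -> function returning live rows
        self._cache: Dict[str, List[dict]] = {}

    def query(self, source: str, use_cache: bool = False) -> List[dict]:
        """Fetch rows from one source, optionally caching them."""
        if use_cache and source in self._cache:
            return self._cache[source]
        rows = self.sources[source]()   # data stays at the source until queried
        if use_cache:
            self._cache[source] = rows
        return rows


# Two stand-in "sources"; in practice these might be a warehouse, a REST API, etc.
view = VirtualView({
    "crm":    lambda: [{"customer": "acme", "region": "emea"}],
    "orders": lambda: [{"customer": "acme", "total": 1200}],
})

# Combine the two sources at request time, without a central copy.
crm = {r["customer"]: r for r in view.query("crm")}
combined = [{**crm[o["customer"]], **o} for o in view.query("orders")]
print(combined)
```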

Interview: Accenture Looks to Gather and Analyze Data at the ‘Speed of Now’

“The Democratization of Analytics doesn’t mean that every individual is or should be a data scientist, mathematician or statistician. Our solutions, products and services aim to make analytics and analytic capabilities more understandable and more available to individuals across the enterprise so they can operate more effectively and execute accurately. The Democratization of Analytics means that the enterprise can deliver more impactful outcomes, faster.”

Interview: VoltDB Powers Fast and Smart Data in Gaming World and Beyond

VoltDB is an in-memory, distributed, relational database that exceeds the performance needs of modern data-intensive applications in industries including mobile, gaming, advertising technology, financial services and energy.

Interview: Guavus Tackles Real-Time Data Analytics Across the Entire Enterprise

Guavus uses live analytics and responsive queries to deliver business metrics that serve up competitive advantage. “Guavus is unique in its ability to provide an end-to-end view across your business and operations in real time. Our operational intelligence platform processes over 2.5 petabytes of data per day, which equates to 250 billion records per day and 2.5 million transactions per second.”
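Those volumes can be sanity-checked with back-of-the-envelope arithmetic; the snippet below simply converts the stated daily figures into per-second and per-record averages, which land in the same range as the quoted transaction rate.

```python
# Back-of-the-envelope check of the stated Guavus volumes.
records_per_day = 250e9            # 250 billion records per day (stated)
bytes_per_day = 2.5e15             # 2.5 petabytes per day (stated)
seconds_per_day = 86_400

records_per_second = records_per_day / seconds_per_day
avg_record_size = bytes_per_day / records_per_day

print(f"{records_per_second / 1e6:.1f} million records/s on average")  # ~2.9 million/s
print(f"{avg_record_size / 1e3:.0f} KB per record on average")         # ~10 KB
```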

Interview: ParStream Analyzes Billions of Records in Less than a Second

“ParStream is a columnar database with hybrid in-memory storage and a shared-nothing architecture. Based on patented algorithms for indexing and compressing data, ParStream uniquely combines three core features: analyzing billions of records in sub-second time, continuous fast import at up to 1 million records per second, and a flexible, interactive analytics engine with a SQL interface.”
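The sub-second claim rests on familiar column-store mechanics: a query reads only the columns it touches and uses indexes to skip most of the data. The toy sketch below is plain Python with made-up data, not ParStream code; it just illustrates the kind of column-wise filter-and-aggregate such an engine runs in parallel across nodes.

```python
# Toy column-store scan on hypothetical data: only the touched columns are read,
# and a tiny index avoids scanning rows that cannot match.
country = ["de", "us", "de", "fr", "de"]       # dimension column
revenue = [120, 300, 80, 150, 60]              # measure column

# A minimal "index": positions of each country value, built once at load time.
index = {}
for pos, value in enumerate(country):
    index.setdefault(value, []).append(pos)

# Roughly the equivalent of: SELECT SUM(revenue) FROM sales WHERE country = 'de'
total_de = sum(revenue[pos] for pos in index["de"])
print(total_de)   # 260
```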

Interview: NetApp and Policy-Based Data Management for the Enterprise

“Key industries including healthcare, retail, telecommunications, media and entertainment, financial services and government leverage NetApp solutions to manage large amounts of content, expand technology infrastructures without disrupting operations, and improve data-intensive workflows.”

Interview: A3CUBE Sets Sights on the Emerging Arena of High Performance Data

“Our architecture permits tens of thousands of SSDs to be connected together and accessed in a parallel and concurrent way using direct mapping of memory accesses from a local machine to the I/O bus and memory of a remote machine. This feature allows for data transmission between local and remote system memories without the use of operating system services. It also enables uniquely linear scalability of SSD bandwidth and IOPS, and consequently allows computation and data access to scale together linearly. This eliminates bandwidth and IOPS bottlenecks and provides optimal dimensions of performance, capacity, and computation with unmatched flexibility at a fraction of the cost.”
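As a very loose, purely local analogy for treating data access as memory access rather than I/O calls, the sketch below memory-maps an ordinary file so that subsequent reads and writes are plain loads and stores. The file path is a hypothetical stand-in; A3CUBE's fabric maps remote memory over its interconnect, which this example does not attempt to reproduce.

```python
# Loose local analogy only: memory-mapped I/O turns data access into plain
# memory loads and stores. The path is a hypothetical stand-in for a device
# window; ordinary mmap does not reach across machines the way A3CUBE describes.
import mmap

path = "/tmp/mapped_buffer.bin"
size = 4096

with open(path, "wb") as f:
    f.truncate(size)                      # reserve a 4 KB region

with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), size)     # map it into this process's address space
    buf[0:5] = b"hello"                   # a plain memory store, no explicit write() per access
    print(bytes(buf[0:5]))                # a plain memory load, no explicit read() per access
    buf.close()
```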

Interview: Nexenta Seeks to Do Away with MESS

“Software-defined storage is a fundamental component of software-defined data centers – the next step in the evolution of virtualization and cloud computing. In its simplest form, software-defined storage is about leveraging software-only solutions to address storage challenges, from vendor lock-in and cost to performance, security, scale and manageability. A complete SDS portfolio enables customers both to optimize existing infrastructure and to fully replace legacy configurations with industry-standard hardware powered by software.”

Interview: Adaptive Computing Brings Big Workflow to the Data Center

Jill King

“Our thought process was that Big Data + a better Workflow = Big Workflow. We coined it as an industry term to denote a faster, more accurate and more cost-effective big data analysis process. It is not a product or trademarked name of Adaptive’s, and we hope it becomes a common term in the industry that is synonymous with a more efficient big data analysis process.”

Interview: Wise.io Sees Machine Learning Throughout the Entire Customer Lifecycle

“Our main differentiator from other machine-learning companies is that we’re focused not just on high-performance algorithms, but on delivering an end-to-end application for business users. While we continue to push the boundaries of cutting-edge machine learning technology, we made an early decision not to get sucked into the ‘algorithms arms race.’ We hold a fundamental belief that the best analytics technologies will fail unless they can be implemented in a timeframe relevant to the business and interpreted by the ultimate decision makers.”