Interview: Spectra Logic on Big Data Storage in the Cloud

Kevin Dudak

“Big Data puts new requirements on storage with respect to scalability, data integrity and cost efficiency – and Spectra is well-positioned to serve this market. Our archive and backup tape storage products support all aspects of secondary storage, are compatible with every major tape and disk format, enable massive scalability and provide a plethora of advanced features that ensure the data is protected, its integrity is maintained and that it will be available virtually forever. Our suite of T-Series tape libraries offers high-capacity TS1140 and open-standard LTO media options, provides block, file and object storage on our tape systems, and can deliver long-term storage for under $0.10/GB list pricing.”

Interview: Globys Answers the Question–Can Big Data Go Too Far?

Olly Downs

“Our contextual marketing platform, Mobile Occasions, leverages our customers’ high-volume behavioral transaction data to gain deep insight into subscriber behavior. Leveraging that insight, the platform then manages, executes and dynamically optimizes marketing campaigns that help maximize long-term revenue and retention of those subscribers.”

Interview: Survey from Dell Discovers Need for Big Data in Midmarket Companies

Darin Bartik

Big Data has mostly been considered the realm of big enterprise and not the midmarket segment. Dell launched a survey to study this notion and discovered that midmarket companies not only need Big Data to engender better, more competitive business practices, but many are already using data analysis. We caught up with Darin Bartik, Executive Director and GM of Database Management at Dell, to learn more about the survey and its findings.

Interview: Concurrent Leads the Way in Application Building on Hadoop

Gary Nakamura

“Concurrent is the team behind Cascading, the proven application development framework that makes it possible for enterprises to leverage their existing skill sets for building data-oriented applications on Hadoop. Cascading has built-in attributes that make data application development a reliable and repeatable process. Companies that standardize on Cascading can build data applications at any scale, integrate them with existing systems, employ test-driven development practices and simplify their applications’ operational complexity.”

Interview: Why Denodo Believes Everyone Needs Data Virtualization

Suresh Chandrasekaran

“The Denodo Platform delivers the capability to access any kind of data from anywhere it lives without necessarily moving it to a central location like a data warehouse. It then exposes that data to various users and analytical/business applications as virtual data services in a way that is meaningful to the users, in real time, with high performance, using caching and minimal data movement only as needed. That is data virtualization in a nutshell.”
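
To make the data virtualization idea concrete, here is a minimal, hypothetical Python sketch (not Denodo's API) of a virtual layer that answers one query by federating a CSV file and a SQLite table on demand, caching only what was requested rather than copying the data into a central store.

```python
# Hypothetical sketch of the data-virtualization pattern described above:
# one query interface over several sources, with optional caching and no
# bulk copy into a central warehouse. Illustrative only, not Denodo's API.
import csv
import sqlite3


class VirtualDataService:
    def __init__(self, csv_path, sqlite_path):
        self.csv_path = csv_path          # e.g. a departmental extract
        self.sqlite_path = sqlite_path    # e.g. an operational database
        self._cache = {}                  # cache only what was asked for

    def customers(self, region):
        """Return customer rows for a region, federated from both sources."""
        if region in self._cache:
            return self._cache[region]

        rows = []
        # Source 1: read the CSV in place, filtering as we go.
        with open(self.csv_path, newline="") as f:
            rows += [r for r in csv.DictReader(f) if r["region"] == region]

        # Source 2: push the filter down to the database instead of copying it.
        with sqlite3.connect(self.sqlite_path) as conn:
            conn.row_factory = sqlite3.Row
            rows += [dict(r) for r in conn.execute(
                "SELECT name, region FROM customers WHERE region = ?", (region,))]

        self._cache[region] = rows
        return rows


# Consumers see one logical "customers" service, unaware of where the data lives:
# service = VirtualDataService("crm_export.csv", "orders.db")
# print(service.customers("EMEA"))
```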

Interview: Accenture Looks to Gather and Analyze Data at the ‘Speed of Now’

“The Democratization of Analytics doesn’t mean that every individual is or should be a data scientist, mathematician or statistician. Our solutions, products and services intend to make analytics and analytic capabilities more understandable and more available to individuals across the enterprise so they can operate more effectively and execute accurately. The Democratization of Analytics means that the enterprise can deliver more impactful outcomes, faster.”

Interview: VoltDB Powers Fast and Smart Data in Gaming World and Beyond

VoltDB is an in-memory, distributed, relational database that exceeds the performance needs of modern data-intensive applications in industries including mobile, gaming, advertising technology, financial services and energy.

Interview: Guavus Tackles Real-Time Data Analytics Across the Entire Enterprise

Guavus uses live analytics with responsive queries to surface business metrics that deliver competitive advantage. “Guavus is unique in its ability to provide an end-to-end view across your business and operations in real time. Our operational intelligence platform processes over 2.5 petabytes of data per day, which equates to 250 billion records per day and 2.5 million transactions per second.”
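
As a quick back-of-the-envelope check of those figures (a sketch, not Guavus code), 250 billion records per day works out to roughly 2.9 million records per second, and 2.5 petabytes spread over 250 billion records implies an average record size of about 10 KB:

```python
# Back-of-the-envelope check of the quoted throughput figures (illustrative only).
PETABYTE = 10**15          # using decimal petabytes
records_per_day = 250e9
bytes_per_day = 2.5 * PETABYTE
seconds_per_day = 86_400

records_per_second = records_per_day / seconds_per_day
avg_record_size = bytes_per_day / records_per_day

print(f"{records_per_second / 1e6:.1f} million records/s")      # ~2.9 million/s
print(f"{avg_record_size / 1e3:.0f} KB per record on average")  # ~10 KB
```

That per-second rate is in the same ballpark as the quoted 2.5 million transactions per second, assuming a transaction maps to roughly one record.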

Interview: ParStream Analyzes Billions of Records in Less than a Second

“ParStream is a columnar database with hybrid in-memory storage and a shared-nothing architecture. Based on patented algorithms for indexing and compressing data, ParStream uniquely combines three core features: analyzing billions of records in sub-second time, continuous fast import of up to 1 million records per second, and a flexible, interactive analytics engine with a SQL interface.”
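
To illustrate the columnar idea in the abstract (a toy sketch, unrelated to ParStream's patented indexing and compression), storing each field as its own array lets a filter touch only the columns it needs and lends itself to simple compression such as dictionary encoding:

```python
# Toy columnar layout: each field is stored as its own array, so a filter scans
# only the columns it needs. Dictionary encoding stands in for real compression.
from array import array

# Column store for events(country, latency_ms)
country_dict = []                     # dictionary for the low-cardinality column
country_codes = array("H")            # encoded country per row
latency_ms = array("I")               # latency per row


def insert(country, latency):
    if country not in country_dict:
        country_dict.append(country)
    country_codes.append(country_dict.index(country))
    latency_ms.append(latency)


def avg_latency(country):
    """Average latency for one country, scanning only two columns."""
    code = country_dict.index(country)
    hits = [latency_ms[i] for i, c in enumerate(country_codes) if c == code]
    return sum(hits) / len(hits) if hits else None


for row in [("DE", 120), ("US", 80), ("DE", 100), ("FR", 95)]:
    insert(*row)
print(avg_latency("DE"))  # 110.0
```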

Interview: NetApp and Policy-Based Data Management for the Enterprise

“Key industries including healthcare, retail, telecommunications, media and entertainment, financial services and government leverage NetApp solutions to manage large amounts of content, expand technology infrastructures without disrupting operations, and improve data-intensive workflows.”