With a hybrid approach to big data storage, companies can combine the high performance and speed of in-memory computing while solving storage issues by keeping vast historical data sets on disk. By bridging available technologies, companies can deliver on all counts – including cost.
In this special guest feature, Dr. Michael Blaha from Modelsoft Consulting Group provides a series of 10 useful guidelines for obtaining good performance with traditional relational databases. Michael Blaha is a consultant and trainer who specializes in conceiving, architecting, modeling, designing and tuning databases. He has worked with dozens of organizations around the world. Blaha […]
In this special guest feature, Andrew Herman, President of CorSource, addresses data quality, a challenge facing all companies in the age of mass data collection. “Successfully tackling data quality is imperative, and achievable with a progressive, methodical approach. Your competitors are struggling with this very issue, and the question is whether this is going to remain your problem, or just theirs.”
In this special guest feature, Jesse Anderson from Cloudera shares his perspective on becoming a computer programmer, including education, aptitude, and other musings. As an extra bonus, check out the tutorial video at the end of the article.
In this special guest feature, ISC Big Data conference chair Sverre Jarp discusses the Internet of Things with Dirk Slama, Director of Business Development at Bosch Software. In his keynote presentation on October 1, Slama will focus on how the IoT is enabling new business models and services, stressing the key success factors and presenting a framework that he believes will help enable that success.
In today’s big data industry, it’s all about consumers doling out their personal data and companies gladly scooping it up to push through advanced machine learning algorithms in an attempt to understand every last nuance about them – whether they like it or not. With today’s beta launch of the new Private.me search engine, the playing field has changed.
In this podcast, the Radio Free HPC team discusses the new TPCx-HS benchmark for Big Data. Designed to assess a broad range of system topologies and implementation methodologies, TPCx-HS is the industry’s first objective specification enabling measurement of both hardware and software, including the Hadoop runtime, Hadoop Filesystem API-compatible systems, and MapReduce layers.