“A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage. We are confident that Big Workflow will enable enterprises across all industries to leverage big data that inspires game-changing, data-driven decisions.”
As a practicing data scientist and big data journalist, I often find myself down in the trenches in pursuit of new trends, products, and services. Earlier this week I attended a local machine learning meetup event, and I came away with a real gem. The presenter mentioned in passing a new cloud service called “Domino,” and I rushed back to my office to learn more. I wasn’t disappointed.
Leading researchers in data science and genomics are recommending strategies to help genomic scientists better manage, share, analyze and archive massive research and clinical data sets in an effort to ensure that the big data explosion results in better health outcomes and faster research discoveries.
Paradigm4 is the company behind SciDB, a scalable array database with native complex analytics. CEO Marilyn Matz is an expert in the field of big data, having co-founded Cognex Corp. in 1981. Marilyn has some interesting perspectives on why Hadoop might not always be the correct choice for big data deployments. I recently caught up with her to discuss these views.
H2O, the open source in-memory machine learning and predictive analytics company for big data, announced a partnership with Cloudera, a leader in enterprise data management powered by Apache™ Hadoop.