Maptimize, the company behind the ‘One Million Tweet Map’, provides a clustering application to display massive amounts of information – such as Tweets, users or points of interest – on a map.
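Maptimize's actual clustering algorithm is not described here, but the core idea of collapsing millions of nearby map points into cluster markers can be sketched with a simple grid-based approach, assuming latitude/longitude input (the function and cell size below are illustrative, not Maptimize's API):

```python
from collections import defaultdict

def cluster_points(points, cell_deg=1.0):
    """Group (lat, lon) points into grid cells and return one
    cluster per non-empty cell: (centroid_lat, centroid_lon, count)."""
    cells = defaultdict(list)
    for lat, lon in points:
        # Bucket each point by which grid cell it falls in
        key = (int(lat // cell_deg), int(lon // cell_deg))
        cells[key].append((lat, lon))
    clusters = []
    for pts in cells.values():
        n = len(pts)
        # One marker per cell, placed at the centroid, labeled with the count
        clusters.append((sum(p[0] for p in pts) / n,
                         sum(p[1] for p in pts) / n, n))
    return clusters

# Four tweets: three near Paris, one near New York
tweets = [(48.85, 2.35), (48.86, 2.34), (48.87, 2.36), (40.71, -74.0)]
print(sorted(c[2] for c in cluster_points(tweets)))  # cluster sizes
```

In practice the cell size would shrink as the user zooms in, so clusters split apart into individual markers.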
“Dolphin helps companies manage data volume and optimize processes so they can balance the performance and processing capabilities of SAP systems against the cost of running those systems. We develop a data volume management strategy so our customers can keep business-critical data in SAP HANA, to get the fast, efficient processing they need, and move static or business-complete data onto other storage where it is still accessible. With a data volume management strategy in place, our customers are better prepared to go live on HANA and improve their return on investment.”
MongoDB, the database for modern applications, today announced that the City of Chicago is using MongoDB as the data store for WindyGrid, a real-time geospatial platform that delivers a unified view of city operations. WindyGrid enables City personnel to analyze spatial data historically and in real-time, with the database size growing by millions of records each day.
BIG DATA USE CASE Signal, a leader in real-time, cross-channel technology, recently announced it has enabled leading online survey platform SurveyMonkey to test and compare the performance of two retargeting vendors, resulting in a 33 percent reduction in customer acquisition cost. A/B testing allows marketers to make data-driven decisions that take the guesswork out of […]
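The vendor comparison described above is a standard two-variant test. As an illustration of how such a comparison can be made data-driven rather than guesswork, here is a minimal two-proportion z-test in plain Python; the conversion counts are hypothetical, not SurveyMonkey's actual figures:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is vendor B's conversion rate
    significantly different from vendor A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of equal performance
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: vendor A converts 300/10,000 visitors, vendor B 400/10,000
z, p = two_proportion_z(300, 10_000, 400, 10_000)
print(round(z, 2), p < 0.05)
```

A significant result at a chosen threshold (commonly p < 0.05) supports routing spend to the better-performing vendor.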
From premature babies to traumatic brain injuries in adults, critically ill patients often hang in a precarious balance between stabilizing and taking a turn for the worse. Doctors and nurses work tirelessly to keep track of data from heart monitors, respirators, and other machines to help restore health. But it’s not just the amount of healthcare-related data that seems to be growing exponentially – so is the pace needed to analyze it.
Versium, a data technology company that operates a LifeData® predictive analytics scoring service, today announced the launch of its Predictive GivingScore solution.
FIELD REPORT Last week I attended the long-anticipated useR!2014 international conference at the UCLA campus, my alma mater. The four-day event had something for everyone in attendance – all the brain cycles centered on the use of the R statistical environment. Since R is a primary tool for my work in data science and […]
Big Data security is all too often an afterthought when deploying solutions like Hadoop, and companies are slowly discovering that security is just as important as any other aspect of the project. In the interview below, I was able to catch up with officials at Dataguise, a leading big data security vendor, to talk about […]
Big data often leads to overusing data! When a large volume of data is available, it is all too easy to find correlations between unrelated factors. So it is important to start from events with known dependencies. Beware of the regression line!
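The danger described above is the multiple-comparisons problem: screen enough unrelated variables and some will correlate with your target by chance alone. A small simulation makes this concrete (all series here are pure random noise; the function name and parameters are illustrative):

```python
import random

def max_spurious_corr(n_series=200, n_obs=30, seed=42):
    """Correlate one random 'target' series against many independent
    random candidate series; report the strongest correlation found.
    Every series is pure noise, so any correlation is spurious."""
    rng = random.Random(seed)
    target = [rng.gauss(0, 1) for _ in range(n_obs)]

    def corr(x, y):
        # Pearson correlation coefficient, computed from scratch
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / (vx * vy) ** 0.5

    return max(abs(corr(target, [rng.gauss(0, 1) for _ in range(n_obs)]))
               for _ in range(n_series))

print(max_spurious_corr())  # typically well above 0.3 despite pure noise
```

With 200 candidate series and only 30 observations each, the best-looking "relationship" is usually strong enough to fit a convincing regression line, which is exactly why correlations mined from large data should be checked against known dependencies before being trusted.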