Our focus is on the design and deployment of highly-integrated HPC, clustered computing, and storage solutions for all areas of computer-aided research and production. With over 55 years of combined staff experience in high-performance computing, enterprise computing architectures, and data storage, we architect solutions that are perfectly suited to your needs. We do not adhere to the large-manufacturer approach of “one size fits most”: every application and research methodology is different, so we prefer to learn about our customers’ research, needs, and challenges. Our strength comes from being able to draw on many different technologies and configuration approaches. Our goal is to design and deploy a cluster or system that avoids the needless bottlenecks, limitations, and design or configuration flaws commonly found in solutions from companies more focused on profits or sales incentives from manufacturers.
When it comes to security, Big Data is a two-edged sword.
On one hand, it can be used to analyze mountains of data to foil intruders, head off attacks, and neutralize a wide variety of other threats. On the other, the network architecture required to support Big Data analytics is itself vulnerable to attack.
Writing in CSO Magazine, John P. Mello, Jr. notes that Hadoop is frequently used to manage the computer clusters at the heart of Big Data deployments. This, he says, can create problems for security teams, especially if they are relying on traditional security tools.
He quotes a white paper from Zettaset, a Big Data security company, which asserts, “Incumbent data security vendors believe that Hadoop and distributed cluster security can be addressed with traditional perimeter security solutions such as firewalls and intrusion detection/prevention technologies. But no matter how advanced, traditional approaches that rely on perimeter security are unable to adequately secure Hadoop clusters and distributed file systems.”
Traditional security products are designed to protect a single database. But when these products are called upon to protect a distributed cluster of computers that may number in the thousands, they fall short.
Mello interviewed Zettaset CTO Brian Christian.
“When you put them (traditional security products) on a large scale distributed computing environment, they become either a choke point or a single point of failure for the entire cluster,” Christian said. “They could potentially be extremely dangerous running them on a cluster, because if they do fail, there is the potential to deny everybody on the cluster access to petabytes of data or a corruption of data in some of the encryption security technologies.”
Other problems arise when security is “bolted on” to an existing Big Data infrastructure, a costly and often ineffective procedure.
And, the story notes, when it comes to business versus security, business requirements take precedence over implementing an ideal security solution. Says Chris Petersen, CTO of LogRhythm, “While security catches up, there is going to be vulnerability. My guess is that there is a lot of vulnerability right now in organizations adopting Hadoop.”
Read the Full Story.
In this video from the Strata 2013 Conference, David Smith from Revolution Analytics describes the five stages of real-time analytics deployment, and the technologies supporting each stage, including Hadoop, R, and database warehousing systems. He also shares some best practices for setting up the technology stack and processes for model deployment, based on real-life case studies.
WARP Mechanics Ltd. is a leading provider of high performance computing (HPC) solutions. The company mission is to bring these super computing technologies into broader IT markets. Each WARP product is factory-optimized for vertical markets such as public-sector “Big Science”, commercial Bio/Life, Cloud, or Media/Entertainment, and can be rolled out in a turn-key fashion.
In this video from the 2013 Open Fabrics Developer Workshop, D.K. Panda from Ohio State University presents: High Performance RDMA-based Design for Big Data and Web 2.0 memcached.
You can check out more OFA videos at our Open Fabrics Workshop Video Gallery.
Ceph is a free software unified storage platform designed to present object, block, and file storage from a single distributed cluster. Ceph’s main goals are to be completely distributed without a single point of failure, scalable to the exabyte level, and freely-available. The data is seamlessly replicated, making it fault tolerant. Ceph is a software-based solution and runs on commodity hardware. The system is designed to be both self-healing and self-managing and strives to reduce both administrator and budget overhead.
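Ceph achieves its no-single-point-of-failure design by computing object placement algorithmically (via its CRUSH algorithm) rather than consulting a central lookup table. The sketch below is a loose simplification — real CRUSH uses weighted, hierarchical failure-domain buckets — but it illustrates the core idea with rendezvous hashing: any client can independently compute which storage daemons (OSDs) hold an object, and removing a failed OSD shifts only the objects that lived on it.

```python
import hashlib

def _score(obj_name: str, osd: str) -> int:
    # Deterministic pseudo-random score for an (object, OSD) pair.
    return int(hashlib.sha256(f"{obj_name}:{osd}".encode()).hexdigest(), 16)

def place(obj_name: str, osds: list[str], replicas: int = 3) -> list[str]:
    """Pick `replicas` OSDs for an object via rendezvous (HRW) hashing.

    Every client computes the same answer with no central metadata
    service -- loosely analogous to how Ceph clients use CRUSH.
    """
    ranked = sorted(osds, key=lambda osd: _score(obj_name, osd), reverse=True)
    return ranked[:replicas]

osds = [f"osd.{i}" for i in range(8)]
primary, *mirrors = place("my-object", osds)
# If the primary OSD fails, recomputing placement over the survivors
# re-homes only its objects -- the basis of self-healing re-replication.
survivors = [o for o in osds if o != primary]
print(place("my-object", survivors))
```

Note how the placement is a pure function of the object name and the live OSD set: this is what lets the cluster self-heal without an administrator redistributing data by hand.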
Check out more presentations at our LUG 2013 Video Gallery.
The WARP Mechanics 39830 is a turnkey network-attached NV-RAM + SSD system with industry-leading price, performance, and scalability. It maximizes IOPS for the most demanding application profiles in an ultra-dense, space- and power-saving footprint, making it optimal for large-scale, IO-intensive workloads with large live data sets. The 50 high-capacity 2TB SSD modules per 4U enclosure are configured into five 10-disk RAID 6 sets to maximize protection and performance, and each RAID set has two NV-RAM modules serving as write cache. These RAID sets are added to the overall ZFS storage pool and can be allocated to a nearly limitless number of volumes of any size presented to hosts. This yields 100TB of raw, RAID-protected SSD storage (roughly 80TB usable after RAID 6 parity) in a single flexible pool.
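As a sanity check on the configuration above, the capacity works out as follows. This is back-of-the-envelope arithmetic only — it uses decimal terabytes and ignores ZFS metadata and spare overhead:

```python
# Capacity arithmetic for the 4U enclosure described above.
DISK_TB = 2          # per-SSD capacity
DISKS_PER_SET = 10   # RAID 6 set width
PARITY = 2           # RAID 6 dedicates two disks' worth of parity per set
NUM_SETS = 5         # five sets fill the 50-bay enclosure

raw_tb = NUM_SETS * DISKS_PER_SET * DISK_TB
usable_tb = NUM_SETS * (DISKS_PER_SET - PARITY) * DISK_TB

print(raw_tb, usable_tb)  # 100 80
```

Each 10-disk RAID 6 set can survive two simultaneous drive failures at the cost of two disks' worth of capacity, which is why usable space is 8/10 of raw.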
Check out more Lustre presentations at our LUG 2013 Video Gallery.
This week IDC released the latest results from its Worldwide Semiannual Software Tracker. Despite only modest gains last year in the worldwide software market, certain specific areas showed strong growth. According to IDC, the management and leveraging of information for competitive advantage is driving gains in markets associated with Big Data and analytics.
For 2012, the worldwide software market grew 3.6% year over year, reaching a total market size of $342 billion. This was less than half the growth rate experienced in 2010 and 2011 and is indicative of a more conservative growth period.
But, says IDC, despite the slowdown, there are faster growing market segments such as Data Access, Analysis and Delivery, Collaborative Applications, CRM Applications, Security Software, and System and Network Management Software. Every one of these markets grew in the 6-7% range, about double the rate for enterprise software as a whole.
“The global software market, comprised of a multi-layered collection of technologies and solutions, is growing more slowly in this period of economic uncertainty,” said Henry D. Morris, Senior Vice President for Worldwide Software, Services and Executive Advisory Research. “Yet there is strong growth in selective areas. The management and leveraging of information for competitive advantage is driving growth in markets associated with Big Data and analytics. Similarly, rapid growth in cloud deployments is fueling growth in application areas associated with social business and customer experience. Both these initiatives require a reliable and secure infrastructure, driving investments in security and system/network management. The combination of these forces is advancing the growth to what IDC has termed the third platform.”
IDC identifies Application Development & Deployment (AD&D) as one of the three primary segments making up the total software market. AD&D was the fastest growing segment, comprising nearly 24% of software revenues in 2012 and growing at a rate of 4.6%.
Business Intelligence and relational database management systems are fueling this growth because of the growing adoption of Big Data and analytics. IDC goes on to say that “Big data and analytics are also closely tied to the fast-growing social business software markets, where the combination of contextual data and the ‘right’ expertise is becoming critical for supporting enterprise decision making and data-driven customer experience solutions. Oracle continued to lead the AD&D segment with steady market share of 21.6%, followed by IBM, Microsoft, SAP, and SAS. Among these vendors, Microsoft and SAP stood out by each gaining almost a half point of market share year over year.”
Read the Full Story.
In this video from the Lustre User Group 2013 conference, Robert Read from Intel presents: Lustre on Amazon Web Services.
You can check out more Lustre presentations at our LUG 2013 Video Gallery.