Symantec Big Data Solution Makes Apache Hadoop Enterprise Ready
Cluster File System technology gives customers timely business insights using their existing infrastructure.
Note: ESJ’s editors carefully choose vendor-issued press releases about new or upgraded products and services. We have edited and/or condensed this release to highlight key features but make no claims as to the accuracy of the vendor's statements.
Symantec Corp. today announced an add-on solution for Symantec’s Cluster File System that enables customers to run big data analytics on their existing infrastructure by making it highly available and manageable. Apache Hadoop helps users analyze massive data volumes, but many existing data solutions lack the data management capabilities and built-in resilience needed to contain the cost and complexity of growing storage and server sprawl.
The new Symantec Enterprise Solution for Hadoop provides scalable, resilient data management for big data workloads, helping make Apache Hadoop ready for enterprise deployment.
Symantec’s Cluster File System addresses big data workloads. With Symantec Enterprise Solution for Hadoop, organizations can:
Leverage existing infrastructure and avoid over-provisioning: IT administrators have spent considerable time and resources consolidating their data centers and reducing their footprint through virtualization and cloud computing. Big data analytics should build on this consolidation of storage and compute resources, not undo it. Symantec Enterprise Solution for Hadoop enables customers to run Hadoop without investing in a parallel infrastructure -- greatly shrinking the storage footprint and, with it, cost and complexity.
Analyze data where it resides, eliminate expensive data moves: The first step in making a Hadoop infrastructure work is funneling data into it for analysis. By integrating existing storage assets into the Hadoop processing framework, organizations can avoid time-consuming and costly data movement. Symantec Enterprise Solution for Hadoop allows administrators to leave data where it resides and run analytics on it in place, without extracting, transforming, and loading it into a separate cluster -- avoiding expensive and painful data migrations (a sketch of reading data in place follows this list).
Ensure Hadoop is highly available: In an Apache Hadoop environment, data is distributed across nodes, but a single metadata server tracks where the data lives -- a potential performance bottleneck and a single point of failure that can lead to application downtime. Symantec Enterprise Solution for Hadoop provides high availability for the metadata server's file system and keeps analytics applications running as long as at least one node in the cluster is working. Because the Hadoop file system is replaced with Symantec’s Cluster File System, every node in the cluster can access data simultaneously, eliminating both the performance bottleneck and the single point of failure (the second sketch below illustrates the pluggable file-system hook such a replacement would rely on).
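To make the "analyze data where it resides" idea concrete, here is a minimal sketch -- not anything documented in this release -- using Apache Hadoop's standard FileSystem API (present in Hadoop 1.0.2) to read a file directly from a shared mount instead of first loading it into a separate HDFS cluster. The /mnt/cfs path is a hypothetical example of where a cluster file system might be mounted.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadInPlace {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use the node-local view of the shared mount as the default
        // file system (fs.default.name is the Hadoop 1.x property).
        conf.set("fs.default.name", "file:///");

        FileSystem fs = FileSystem.get(conf);
        // Hypothetical mount point for a shared cluster file system.
        Path input = new Path("/mnt/cfs/logs/access.log");

        BufferedReader reader =
            new BufferedReader(new InputStreamReader(fs.open(input)));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                // An analytics job would consume each record here;
                // printing is a stand-in.
                System.out.println(line);
            }
        } finally {
            reader.close();
        }
    }
}

Because every node mounts the same namespace, a MapReduce job can take such paths as input directly, with no extract, transform, and load step beforehand.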
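The release does not describe how Cluster File System plugs in underneath Hadoop, but Hadoop itself exposes a pluggable file-system mechanism: a URI scheme can be mapped to any FileSystem implementation through an fs.<scheme>.impl property. The sketch below illustrates only that mechanism -- the cfs scheme is hypothetical, and Hadoop's built-in RawLocalFileSystem stands in for a vendor implementation so the snippet actually runs.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class PluggableFs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Bind a hypothetical "cfs" scheme to a FileSystem class.
        // A real integration would name the vendor's implementation;
        // RawLocalFileSystem is used only so this sketch is runnable.
        conf.set("fs.cfs.impl", "org.apache.hadoop.fs.RawLocalFileSystem");
        // Make that scheme the cluster-wide default in place of hdfs://.
        conf.set("fs.default.name", "cfs:///");

        FileSystem fs = FileSystem.get(conf);
        // Prints org.apache.hadoop.fs.RawLocalFileSystem: Hadoop resolved
        // the default file system without contacting an HDFS NameNode.
        System.out.println(fs.getClass().getName());
    }
}

With all nodes resolving paths through a shared file system of this kind, there is no single metadata server whose failure can stall the cluster.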
Pricing, Availability, and Support
The Symantec Enterprise Solution for Hadoop is available now to existing Cluster File System customers at no additional charge. Symantec Enterprise Solution for Hadoop supports Hortonworks Data Platform (HDP) 1.0 and Apache Hadoop 1.0.2. Customers running HDP 1.0 will be able to get Hadoop support and training from Symantec’s Hadoop partner Hortonworks, a company that promotes the innovation, development and support of Apache Hadoop.
More information is available at www.symantec.com.