Symantec has announced an add-on solution for its Cluster File System that enables customers to run Big Data analytics on their existing infrastructure. Apache Hadoop offers significant value in driving revenue by helping analyze data for business insights; however, many existing data solutions lack the data management capabilities and built-in resilience needed to overcome the cost and complexity of growing storage and server sprawl. Developed in close collaboration with Hortonworks, the new Symantec Enterprise Solution for Hadoop provides a scalable, resilient data management solution for Big Data workloads, helping make Apache Hadoop ready for enterprise deployment.
With Symantec Enterprise Solution for Hadoop, organizations can:
- Leverage their existing infrastructure by scaling up to 16 PB of structured and unstructured data
- Avoid over-provisioning of both storage and compute capacity
- Run analytics wherever the data sits, eliminating expensive data moves
- Make Hadoop highly available, with no single point of failure or performance bottleneck (illustrated in the sketch after this list)
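To make the high-availability and data-locality points concrete: in Hadoop 1.x, all HDFS metadata flows through a single NameNode, which is both the single point of failure and a potential bottleneck. A shared cluster file system gives every node the same POSIX namespace, so Hadoop can read data in place with no NameNode in the path. The sketch below is a minimal illustration of that idea only; the `/mnt/cfs` mount point and the plain `file://` scheme are assumptions for the example, not Symantec's documented connector.

```java
// Minimal sketch, assuming a cluster file system mounted at /mnt/cfs on
// every node. Pointing Hadoop 1.x at a shared POSIX mount via a file://
// URI keeps the HDFS NameNode out of the data path entirely, so that
// single point of failure disappears. Mount point and scheme are
// illustrative assumptions, not Symantec's documented configuration.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SharedMountCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hadoop 1.0.x property name (renamed to fs.defaultFS in 2.x).
        conf.set("fs.default.name", "file:///");

        FileSystem fs = FileSystem.get(conf);
        // Data stays where it sits on the shared mount; every node sees
        // the same namespace, so any node can run the analytics.
        for (FileStatus status : fs.listStatus(new Path("/mnt/cfs/data"))) {
            System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
        }
    }
}
```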
“Indian organizations looking to leverage Big Data need to first overcome the complexities associated with managing, archiving and accessing unstructured data, be it on-premises or in the cloud. We are in the midst of a massive information explosion, and Symantec is excited to be at the forefront of this opportunity to leverage existing infrastructure for enterprise-ready Hadoop,” said Anand Naik, Managing Director – Sales, India and SAARC, Symantec.
The Symantec Enterprise Solution for Hadoop is available now to existing Cluster File System customers at no additional charge. It supports Hortonworks Data Platform (HDP) 1.0 and Apache Hadoop 1.0.2. Customers running HDP 1.0 can get Hadoop support and training from Symantec’s partner Hortonworks, a leading commercial vendor promoting the innovation, development and support of Apache Hadoop.