High Performance Computing

Overview

HPC environments are increasingly seeking new solutions to maintain speed of workflows and collaboration in a decentralized environment. Traditional solutions for single datacenter environments are being overwhelmed by added requirements to run at the edge, capture distributed data, and enable collaboration with a remote team of data users.

Data only achieves its full value when it can be used by the most advanced applications and by the users who derive insights from it. The cloud brings many advancements to help process that data, yet tools to make data efficiently accessible across multiple data center technologies and cloud services have lagged behind.


Increasingly complex workflows stretch both resources and budget in multiple areas:

01. Increased Cost

Expanding datasets add significant cost to accommodate higher data volumes and performance requirements. This impacts both CAPEX and OPEX.

02. Need for Collaboration

Data must be accessible to scientists and applications that are not local to the original data source. Increasingly, scientific data must be shared and collaborated on across multiple locations so teams can use the tools best fit for the job, whether those run on workstations, in datacenters, or in the cloud.

03. Copy Proliferation

Maintaining multiple copies of the same data in different locations and different storage silos is too expensive: the infrastructure cost is high, and tracking the copies places a heavy burden on IT.

Hammerspace Solution

Hammerspace is a software solution that provides a global data environment to give users, applications, and cloud services local access to data, no matter where it is stored. It is designed to help research organizations create, analyze, and manage multiple petabytes of HPC data, and to enable collaboration across a decentralized environment.

The Hammerspace Difference

High-Performance Data Creation

High-performance streaming as a parallel file system or NAS.

Efficient Data Collaboration

Local access to remote data, with orchestration of a single shared copy of data for any application, any user, any cloud service, anywhere.

End-to-End Data Management

Store data in the appropriate cost tier, with policy-driven archiving to the preferred datacenter or cloud storage.

Hammerspace Solutions For

Computing
Processing
Archiving

Computing

Hammerspace is built on our high-performance, parallel, global file system. This ensures high-performance data capture from large compute clusters and parallel access for tens, hundreds, or thousands of users.

Processing

Hammerspace makes it fast and efficient to capture, process, analyze, visualize, and share data in a single environment. It can bridge multiple silos, multiple data centers, and even multiple cloud regions and vendors, simplifying work with decentralized resources: applications, users, and data services at any location work directly with all resources as though they were local.

Archiving

An NFS-mounted archive file system provides perhaps the most familiar environment for interacting with archived data in an HPC workflow. Mounted file systems appear as local directories to users and applications and are accessible via standard Linux commands such as cd, mkdir, and chmod. This approach is extremely convenient and requires very limited workflow modification to integrate Hammerspace as the archive file system.
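The workflow above can be sketched with standard Linux commands. This is a minimal illustration, not a Hammerspace-specific procedure: the server name, export path, and directory names are hypothetical, and a local directory stands in for the mount point so the commands below it can be tried anywhere.

```shell
# In production you would mount the archive export over NFS, e.g.:
#   sudo mount -t nfs hammerspace.example.com:/archive /mnt/archive
# (hostname and export path are illustrative, not real endpoints)
# Here a local temp directory stands in for the mounted archive.
ARCHIVE=$(mktemp -d)

cd "$ARCHIVE"
mkdir -p projects/run-042             # organize archived data with mkdir
chmod 750 projects/run-042            # manage access with standard permissions
echo "simulation results" > projects/run-042/output.dat   # archive a file
ls -l projects/run-042                # browse the archive like any directory
```

Because the archive behaves as an ordinary directory tree, existing job scripts can write results into it with no changes beyond the destination path.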
