
Hammerspace Newsletter October 2022

No tricks here, it’s the October edition of the Hammerspace newsletter! 

This month our focus is on High Performance Computing. HPC environments increasingly seek new solutions to maintain high-performance workflows as decentralized compute, storage, and collaboration requirements become the norm. Traditional solutions designed for single data center environments are being overwhelmed by added requirements to run at the edge, capture distributed data, process in the cloud, and enable collaboration among teams of users who may be anywhere.

Hammerspace provides a single global data environment that enables fast data ingest, data placement on any storage in any location, and distributed collaboration. Hammerspace brings greater agility to high-performance computing, allowing organizations to use every global resource available to speed scientific discovery and innovation.

Why is this important? Increasingly, there is demand for organizations to access distributed applications, brain power, and collaborators from other organizations. There is also a growing need for the flexibility to rapidly extend data center compute resources with burst-to-cloud and elastic cloud storage, both to mitigate the impact of snarled supply chains and skyrocketing energy costs and to leverage ever-more specialized hardware (GPUs, AI chips, etc.).

Hammerspace delivers a Global Data Environment with a high-performance Parallel Global File System that extends across siloed edge, on-premises, and cloud-based storage and compute resources. In this way, Hammerspace enables a data-centric model for HPC, with immediate global online access to data that may be anywhere:

  • Collaborate on even high-performance workloads from any remote location, securely reducing time to results.
  • Efficiently and transparently route decentralized data to any available (centralized or burst) compute resource anywhere.
  • Enable seamless access by AI engines and machine learning applications across otherwise incompatible data silos, without the need to consolidate files into a central repository.
  • Converge datasets to execute High-Performance Data Analytics (HPDA) with data in place on existing storage.
  • Orchestrate data automatically, at file granularity, across decentralized multi-vendor infrastructure.
  • Gain a single, consistent global view and control of data, wherever it resides.

This allows organizations to make all data on any storage platform a global asset, and lets users superpower their data insights, while reducing time to results and ultimately time to value.

In this Issue:

  • Vertical View: High Performance Computing 
  • Data Unchained Podcast: “Companies to Watch for Data Innovation” with Eyal Waldman
  • On-Demand Webinar: Beyond Cache with a Global Data Environment
  • Hammerspace Announcements
  • In the News
  • New Resources
  • Upcoming Events

Data only achieves its full value when it can be used by the most advanced applications and the users who derive insights from it. The cloud brings many advancements to help process that data. Yet tools to efficiently make data accessible across multiple data center technologies, locations, and cloud services have lagged behind.

Increasingly complex workflows stretch both resources and budgets in multiple areas:

  • Increased Cost – Expanding datasets add significant cost to accommodate higher data volumes and performance requirements. This impacts both CAPEX and OPEX.
  • Need for Collaboration – Data must be accessible to scientists and applications that are not local to the original data source. Increasingly, scientific data must be shared across multiple locations so teams can use the tools best suited for the job, spread across workstations, data centers, and the cloud.
  • Copy Proliferation – Maintaining multiple copies of the same data in different locations and different storage silos is too expensive: the infrastructure cost is too high, and the burden on IT of tracking the different copies is too great.

Hammerspace software gives users, applications, and cloud services the experience of local access to data, no matter where it is stored. It is designed to help research organizations create, analyze, and manage multiple petabytes of HPC data, and to enable collaboration across a decentralized environment.

Learn more about our solutions for high performance computing here.

Eyal Waldman, Co-Founder and former President, CEO, and Member of the Board of Mellanox Technologies, joins host Molly Presley to highlight “Companies to Watch for Data Innovation,” including how data has transformed the world and the importance of security for global data communication.

Subscribe to the Data Unchained podcast here, and listen on your favorite podcast platform, including: 

  • Apple Podcasts
  • Google Podcasts

As organizations leverage the cloud and become increasingly decentralized, they need multiple clouds and a variety of cloud services for both compute and storage. Unfortunately, these clouds often operate as disconnected data silos and cannot yet achieve the true “utility computing and storage” model, where anything can run anywhere seamlessly, without concern for infrastructure or data locality.

Join Floyd Christofferson and Brian Bashaw in this one-hour educational webinar as they discuss how a global data environment can help unify data, provide low-latency access to remote workers, and leverage the optimal compute environment – regardless of location.

October has been a big news month here at Hammerspace! We’ve had two major announcements in recent weeks:

These capabilities enable users to better access, collaborate on, and get more value from their files globally from their desktop, regardless of which vendor’s system stores the data.

The announcement underscores the expanding ways Hammerspace enables customers to seamlessly integrate their existing workflows and applications into a Global Data Environment. It also includes native integration with workflow management applications such as Autodesk ShotGrid, Projective Strawberry, Slurm, fTrack, and others.

“We are thrilled to have Hammerspace available to provide the data access solution decentralized organizations need when working with data stored at the edge, in data centers, or across different clouds,” said Steve Low, Titan Co-Founder and Sales Director. “It certainly meets the requirements of companies that we work with in life sciences, in finance, legal, and media and entertainment, that have the same issues when working across multiple sites.”

Hear more from Titan’s Steve Low in this episode of the Data Unchained podcast, where he and host Molly Presley discuss the ins and outs of Titan Solutions, how they help resellers assess new technologies as they come out, how Titan shifted from a data storage company to a data management company, and how the new partnership with Hammerspace will better serve their customers and company as a whole.

Are you feeling the buzz and momentum around Hammerspace? Check out these great articles from many of the industry’s top publications.

With the new product enhancements announced this month, we’ve updated our Global Data Environment White Paper, which explains the problems Hammerspace solves, why they matter, and the key capabilities that address them.

We have also updated the Hammerspace Technology White Paper, which provides a deeper dive into our technology, and the Hammerspace Data Sheet, which provides a shorter overview.

Plus! There are several new videos available, including:

On the industry analyst front, Hammerspace was recognized as a Sample Vendor in the 2022 Gartner® Hype Cycle™ for Storage and Data Protection Technologies Report. According to Gartner, “This Hype Cycle focuses on emerging innovative storage and data protection technologies and evaluates business impact, adoption rate and maturity level to help infrastructure and operations leaders build adaptable and future-ready storage, and data protection platforms for changing business needs.”