Welcome to the Gypsum Cluster Documentation

Gypsum is a large cluster of computers (also referred to as "nodes"), each of which contains either four or eight NVIDIA GPU cards. There are 75 nodes with four NVIDIA TITAN X GPUs, 25 nodes with four NVIDIA Tesla M40 GPUs, and 53 nodes with eight NVIDIA GeForce GTX 1080 Ti GPUs. Gypsum was purchased with funds generously provided by the Massachusetts Technology Collaborative.

Purpose of the Cluster

The purpose of Gypsum is to support GPU computing. If your application does not use GPUs, or if you are not sure whether it does, you should not use the Gypsum cluster. The College of Information and Computer Sciences (CICS) has another cluster, Swarm2, which can be used for standard non-GPU computing.

In particular, it is important not to run large CPU-only jobs on the Gypsum cluster: such a job ties up a machine while the GPUs on that machine sit idle.
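
If you are not sure whether a running job actually exercises the GPU, one way to check (a sketch; it assumes the job is already running under Slurm, and the job ID below is a placeholder) is to attach to the job's allocation and inspect GPU utilization with nvidia-smi:

    # Attach a shell to the node where job 12345 is running
    # (replace 12345 with your own job ID):
    srun --jobid=12345 --pty bash

    # On the node, list GPU utilization and the processes using each GPU;
    # 0% utilization and an empty process list suggest a CPU-only job:
    nvidia-smi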

The Gypsum hardware cluster consists of:

  • 25 compute nodes with 4 NVIDIA Tesla M40 GPUs, 12 cores (2 x Xeon E5-2620 v3 2.40 GHz), 256 GB of RAM, and a 256 GB SSD for local disk.
  • 75 compute nodes with 4 NVIDIA TITAN X GPUs, 12 cores (2 x Xeon E5-2620 v3 2.40 GHz), 256 GB of RAM, and a 256 GB SSD for local disk.
  • 53 compute nodes with 8 NVIDIA GeForce GTX 1080 Ti GPUs, 24 cores (2 x Xeon Silver 4116 2.10 GHz), 384 GB of RAM, and a 256 GB SSD for local disk.
  • A 325 TB shared file system accessible by all of the compute nodes.
  • A 325 TB backup system.

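To see how these resources are advertised to the scheduler, Slurm's own query tools can help. The following is a sketch; the format strings are standard Slurm, but "node001" is a hypothetical node name:

    # List each node with its generic resources (GPUs), memory (MB), and CPU count:
    sinfo -N -o "%N %G %m %c"

    # Show the full hardware details that Slurm records for a single node:
    scontrol show node node001
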
The software consists of:

  • Bright Cluster Manager
  • CentOS Linux 7 (OS)
  • ZFS (Shared file system)
  • Slurm (job scheduling and resource management; a sample batch script follows this list)

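Jobs on Gypsum are submitted through Slurm. Below is a minimal sketch of a batch script for a single-GPU job; the resource values and the program name are placeholders, and Gypsum's actual partitions and limits are covered in the policy documentation:

    #!/bin/bash
    #SBATCH --job-name=gpu-example
    #SBATCH --gres=gpu:1           # request one GPU
    #SBATCH --cpus-per-task=2      # a modest CPU request, leaving cores free
    #SBATCH --mem=16G              # memory for the job
    #SBATCH --time=01:00:00        # wall-clock limit
    #SBATCH --output=%j.out        # write output to <jobid>.out

    # Confirm which GPU was assigned, then run the (hypothetical) program:
    nvidia-smi
    ./my_gpu_program

Submit the script with "sbatch gpu-example.sh" and check its status with "squeue -u $USER".
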
Before using the cluster, please read the following documentation; it may save you time.

  • Policy Documentation: explains how the cluster's resources are shared. Some policies are enforced in software, so read this document to make sure your jobs are not killed.
  • User Documentation: a quick reference for getting started on Gypsum.
  • Library: links to useful software, plus ideas on using clusters that may make your work easier.

Creating an Account

To start using the cluster, you first need to set up an account. The request should come from, or be approved by, a faculty member. To create an account, please send an email to the Gypsum administrators.

For general discussions regarding the use of the cluster, you may email gypsum-disc@cs.umass.edu. Note that this mailing list reaches all users of the cluster.