The Deepthought2 cluster

Welcome

The University of Maryland offers several high performance computing resources for campus researchers who need compute cycles for parallel codes and applications. These are:

  • Deepthought2: Our flagship cluster, intended for large parallel jobs, housed just off campus and maintained by the Division of Information Technology. It consists of over 480 nodes with dual-socket 2.8 GHz Ivy Bridge processors (20 cores per node); forty of the nodes have dual Nvidia K20m GPUs. Most nodes have 128 GB of RAM, and a few have 1 TB. All nodes have FDR InfiniBand (56 Gb/s) interconnects, and there is 1 PB of fast Lustre storage. (A minimal parallel test job is sketched after this list.)
  • Deepthought: The original Deepthought cluster, now intended for smaller, less-parallel or serial jobs and for parallel code development. This resource is housed on campus and maintained by the Division of Information Technology. Because new hardware has been added over the years, it is very heterogeneous, which makes it less suitable for large parallel jobs. CPUs range from 2.0 to 2.9 GHz across Woodcrest through Sandy Bridge architectures, with 4-16 cores per node and 1-4 GB of RAM per core. All nodes have Gigabit Ethernet interconnects, and some also have DDR, QDR, or FDR InfiniBand. About 100 TB of Lustre storage is available.
  • MARCC/Bluecrab: The University of Maryland is allocated 15% of the MARCC/Bluecrab cluster, housed at the Maryland Advanced Research Computing Center and jointly managed by Johns Hopkins University and the University of Maryland. This cluster is suitable for large parallel jobs. There are about 650 compute nodes with dual Haswell CPUs and 128 GB of RAM. In addition, there are almost 50 nodes with dual Nvidia Tesla K80 GPUs, and 50 nodes with 1 TB of RAM. Although the cluster is housed off campus, the Division of IT is working on establishing a high-bandwidth network connection to it.
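
A quick way to confirm that a parallel job is actually spanning multiple cores or nodes on any of these clusters is a minimal MPI "hello world". The sketch below uses Python with mpi4py; treating the availability of an MPI implementation and the mpi4py package as assumptions, since the exact software stack and module names differ per cluster and should be checked in that cluster's documentation.

    # Minimal MPI test sketch (assumes mpi4py and an MPI library are installed;
    # load the appropriate modules for the cluster first).
    from mpi4py import MPI

    comm = MPI.COMM_WORLD               # communicator spanning all ranks in the job
    rank = comm.Get_rank()              # this process's rank, 0 .. size-1
    size = comm.Get_size()              # total number of ranks across all nodes
    host = MPI.Get_processor_name()     # hostname of the node running this rank

    print(f"Rank {rank} of {size} on node {host}")

    comm.Barrier()                      # wait until every rank has reported in
    if rank == 0:
        print("All ranks reached the barrier")

Such a program would normally be launched with mpirun or srun from a batch script submitted to the cluster's scheduler; the exact launcher, module names, and queue settings differ between the clusters, so see the cluster-specific pages for details.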

Comparison of Clusters

The following table compares the HPC resources:

Cluster        | Compute nodes | Cores/node                             | Processor speeds              | Memory/node                 | Nodes with GPUs | Interconnect                                   | Disk space             | Licensed software
-------------- | ------------- | -------------------------------------- | ----------------------------- | --------------------------- | --------------- | ---------------------------------------------- | ---------------------- | -------------------------------------------------------
Deepthought2   | 488           | 20 (1 TB nodes have 40)                | 2.8 GHz (1 TB nodes: 2.2 GHz) | 128 GB (4 nodes have 1 TB)  | 40 (dual K20m)  | FDR InfiniBand                                 | 1 PB Lustre            | Intel compiler suite, Matlab Distributed Compute Server
Deepthought    | 376           | Varies (4-16)                          | Varies (2.0-2.9 GHz)          | Varies (4-64 GB)            | None            | GigE on all nodes; some DDR/QDR/FDR InfiniBand | 100 TB Lustre          | Intel compiler suite
MARCC/Bluecrab | 846           | Most 24, some 28 (1 TB nodes have 48)  | 2.5-3.0 GHz                   | 128 GB (50 nodes have 1 TB) | 72 (dual K80)   | FDR InfiniBand                                 | 2 PB Lustre, 18 PB ZFS | Intel compiler suite
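
The core and memory figures above are per physical node; what a particular job actually sees depends on what the scheduler has allocated to it. The short sketch below is an illustration of how to check the node-level numbers from inside a job using standard Linux interfaces; the SLURM_CPUS_ON_NODE variable is an assumption that applies only where Slurm is the scheduler.

    import os

    # CPUs visible on this node (not necessarily what the scheduler allocated).
    print("CPUs visible:", os.cpu_count())

    # Total RAM on the node, read from the standard Linux /proc interface.
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                kb = int(line.split()[1])
                print(f"Node memory: {kb / 1024**2:.1f} GiB")
                break

    # If running under Slurm (an assumption; check the cluster's docs), the
    # per-node allocation for this job is exposed via environment variables.
    alloc = os.environ.get("SLURM_CPUS_ON_NODE")
    if alloc:
        print("CPUs allocated to this job on this node:", alloc)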

Click on the cluster name above for more detailed information about a particular cluster.