Facilities

The VCU High Performance Research Computing (HPRC) Core Facility occupies approximately 2,000 sq ft of total space, predominantly on the third floor of Harris Hall on the Monroe Park Campus. The mission of the HPRC is to provide high performance computing services for the VCU research community. To accomplish this goal, the HPRC maintains four major supercomputing clusters, each specialized for a different computing environment. They may be summarized as follows (descriptions current as of September 2019):

    • teal.vcu.edu is the primary cluster intended for large-scale parallel computing (a brief illustrative example follows this list), and is especially well suited for applications such as molecular dynamics simulations, quantum chemistry, and other physical sciences jobs. Teal consists of ~4500 64-bit Intel and AMD compute cores, each with 2-4 GB RAM/core, 10.2 TB of total RAM, 180 TB of /home space, and /tmp space of between 360 and 787 GB per node. High speed network infrastructure is provided by a 20 Gb/second InfiniBand architecture.


    • bach.vcu.edu is the cluster designated for serial and small parallel applications. Bach consists of a total of 944 AMD 64-bit cores, each with a minimum of 2 GB RAM/core, 2 TB of total RAM, 12 TB of /home space, and /tmp space of 360 GB per node. Networking infrastructure is Gigabit Ethernet.


    • godel.vcu.edu is a cluster optimized for bioinformatics applications, with 1768 Intel and AMD 64-bit cores, each with at least 3 GB RAM/core, 4.8 TB of total RAM, 17 TB of /home space, /tmp space of at least 180 GB per node, 1.2 TB of GPFS high performance parallel file system storage, and 40 Gb/second InfiniBand networking.


    • fenn.vcu.edu is a cluster designed to support research using data that must comply with federal security and privacy requirements, with 1016 Intel 64-bit cores, 2 GB of RAM/core, 840 TB of GPFS high performance parallel file system storage (expandable to 2.2 PB), and 54 Gb/second InfiniBand networking.

These clusters are collectively served by over 1.9 PB of networked NFS and GPFS high-speed storage.

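To give a concrete sense of the distributed-memory, large-scale parallel workloads that a cluster such as Teal is built for, the sketch below shows a minimal MPI program in C. It is a generic illustration only, not drawn from HPRC documentation; it assumes a standard MPI library and compiler wrapper are available and says nothing about how jobs are configured or scheduled on the VCU clusters.

    /* Minimal MPI "hello" sketch. Each process reports its rank; in a
     * large parallel job, many such processes cooperate across nodes
     * over the cluster's high-speed interconnect. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);               /* start the MPI runtime */

        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's ID */
        MPI_Comm_size(MPI_COMM_WORLD, &size); /* total processes in the job */

        printf("Hello from rank %d of %d\n", rank, size);

        MPI_Finalize();                       /* shut the runtime down */
        return 0;
    }

Compiled with an MPI wrapper (for example, mpicc) and launched with mpirun, a program of this kind runs one process per core across many nodes, which is the usage pattern the parallel hardware described above is sized for.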

[Image: GPU systems]

To support this infrastructure, the HPRC has 4 FTE staff positions (J. Mike Davis, Technical Director; Carlisle Childress & Brad Freeman, Systems Analysts; and John Layne, Applications Analyst). In addition to maintaining the hardware, the HPRC works collaboratively with the user base to maintain and optimize a large number of applications and development tools (BLAST, R, MATLAB, NAMD, Gaussian, Gromacs, Charm, and C/C++ and Fortran compilers), as well as other scientific, statistical, and development software.
