Infrastructure
HPC & Cloud Infrastructure
The ACG offers a diverse set of systems to meet HPC, Cloud, Storage, and Visualization needs.
HPC Cluster
110 nodes totaling 4016 cores:
14 nodes: 96 cores each with two AMD EPYC 7643 48-core CPUs running at 2.3 GHz with 512 GB of RAM
4 nodes: 32 cores each with two AMD EPYC 7313 16-core CPUs running at 3.0 GHz with 1 TB of RAM
8 nodes: 36 cores each with two Intel Skylake 18-core CPUs running at 2.3 GHz with 256 GB of RAM
60 nodes: 28 cores each with two Intel Broadwell 14-core CPUs running at 2.4 GHz with 128 GB of RAM
24 nodes: 24 cores each with two Intel Haswell 12-core CPUs running at 2.5 GHz with 64 GB of RAM
5 GPU nodes with 25 GPUs, 384 cores and 3328 GB of RAM:
NVIDIA DGX A100 system with eight 40 GB A100 GPUs, 256 CPU cores and 1 TB of RAM
Gigabyte system with eight NVIDIA RTX 2080 Ti GPUs, 32 CPU cores and 768 GB of RAM
Dell C6525 system with two 80 GB A100 GPUs, 32 CPU cores and 512 GB of RAM
Dell C6525 system with three 48 GB L40 GPUs, 32 CPU cores and 512 GB of RAM
Dell C6525 system with four 24 GB A30 GPUs, 32 CPU cores and 512 GB of RAM
Low-latency InfiniBand interconnect throughout
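As a quick illustration of what a job actually receives on the cluster described above, the short Python sketch below reports each MPI rank's host and the cores visible to it. It assumes the cluster provides an MPI installation with the mpi4py package, which this page does not state, so treat it as a sketch rather than a supported recipe.

    # hello_cluster.py - report where each MPI rank landed.
    # Assumes mpi4py is available; launch with e.g. "mpirun -n 4 python hello_cluster.py".
    import os
    import socket
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # sched_getaffinity reports the cores this process may actually use,
    # which on a managed cluster can be fewer than the node's physical cores.
    cores = len(os.sched_getaffinity(0))
    print(f"rank {rank}/{size} on {socket.gethostname()}: {cores} usable cores")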
Cloud Computing System
The cloud computing system provides virtual machine services.
Running OpenStack Rocky
2048 virtual cores
3 TB of RAM, enabling high-memory virtual machines
A prototype secure cloud for hosting VMs that work with sensitive or compliance-regulated data
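OpenStack clouds such as this one can typically be queried programmatically with the openstacksdk Python library. The sketch below lists the VM flavors a project can use; the cloud name "acg" is a hypothetical entry in the user's clouds.yaml, and availability of the SDK here is an assumption.

    # list_flavors.py - enumerate VM flavors on an OpenStack cloud.
    # Assumes openstacksdk is installed and a clouds.yaml defines a cloud
    # named "acg" (hypothetical name; adjust to your own configuration).
    import openstack

    conn = openstack.connect(cloud="acg")

    # Each flavor describes a VM size: virtual CPUs and RAM (in MiB).
    for flavor in conn.compute.flavors():
        print(f"{flavor.name}: {flavor.vcpus} vCPUs, {flavor.ram} MiB RAM")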
HPC & Cloud Storage
Our primary storage for HPC and Cloud is a Ceph system composed of:
Six management, MDS, and monitor nodes
Ten OSD nodes with over 2.5 PB of storage space
Our secondary and backup storage systems are based on ZFS and total 1.6 PB of storage.
Storage capacity will be expanded as usage of the system grows.
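A simple way to see how much of a shared filesystem is free from a login node is Python's standard library, as sketched below. The mount point /mnt/ceph is hypothetical; the real mount path on these systems will differ.

    # storage_check.py - report usage of a mounted filesystem.
    # The path below is a hypothetical Ceph mount point; substitute the
    # real mount path used on these systems.
    import shutil

    total, used, free = shutil.disk_usage("/mnt/ceph")
    tib = 1024 ** 4
    print(f"total: {total / tib:.1f} TiB, used: {used / tib:.1f} TiB, "
          f"free: {free / tib:.1f} TiB")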
Visualization Systems
ACG operates two visualization systems, each built on a dual-socket motherboard with Intel Haswell processors and 256 GB of RAM.
Each visualization system is equipped with an NVIDIA Quadro K4200 video card and is connected to HPC storage over InfiniBand.
The systems run VirtualGL and VNC, so that VNC sessions render graphics on the server's GPU rather than sending OpenGL calls across the network to be rendered on the VNC client machine. This provides extremely fast 3D visualization with only minimal hardware requirements for the VNC client.
Additional packages will be installed soon. The system's address is viz.acg.maine.edu, and current Marconi users can access it with their existing credentials.
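Inside a VNC session on the server, VirtualGL's standard launcher, vglrun, wraps an OpenGL application so its rendering happens on the server's GPU. The Python sketch below simply launches the glxgears demo that way; it assumes vglrun and glxgears are installed and on the PATH, which this page does not confirm.

    # run_viz.py - launch an OpenGL program under VirtualGL from a VNC session.
    # Assumes the "vglrun" launcher and the "glxgears" demo are installed.
    import shutil
    import subprocess

    if shutil.which("vglrun") is None:
        raise SystemExit("vglrun not found; is VirtualGL installed?")

    # vglrun intercepts GLX calls so rendering uses the server's GPU,
    # and only the finished frames are delivered to the VNC display.
    subprocess.run(["vglrun", "glxgears"], check=True)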
Networking
All systems are connected with FDR (56 Gbps) and HDR (100/200 Gbps) InfiniBand.
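On a Linux node with InfiniBand hardware, the kernel exposes link information under /sys/class/infiniband; the sketch below prints each port's negotiated rate. It relies on the standard sysfs layout and assumes that layout is present on these systems.

    # ib_rates.py - print the negotiated rate of each InfiniBand port.
    # Relies on the standard Linux sysfs layout under /sys/class/infiniband.
    from pathlib import Path

    for dev in sorted(Path("/sys/class/infiniband").glob("*")):
        for port in sorted((dev / "ports").glob("*")):
            rate = (port / "rate").read_text().strip()
            print(f"{dev.name} port {port.name}: {rate}")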