GenomeDK is maintained by Aarhus University.
More information can be found on the GenomeDK website.
Summary
- Compute nodes: 213
- CPU cores: 6776
- Memory: 61 TB
- Storage: 11.5 PB BeeGFS distributed file system (fast storage)
- Interconnect: InfiniBand (56 Gbit/s)
Compute Nodes
50 Nodes:
- 2x AMD "EPYC Rome" 7452 CPUs @ 2.35 GHz, 32 cores/CPU
- 512 GB memory
- InfiniBand EDR
40 Nodes:
- 2x Intel "Sandy Bridge" E5-2670 CPUs @ 2.67 GHz, 8 cores/CPU
- 128 GB memory
- 10 GigE
56 Nodes:
- 2x Intel "Sandy Bridge" E5-2670 CPUs @ 2.67 GHz, 8 cores/CPU
- 128 GB memory
- InfiniBand 4X QDR
32 Nodes:
- 2x Intel "Haswell" E5-2680 v3 CPUs @ 2.5 GHz, 12 cores/CPU
- 256 GB memory
- InfiniBand FDR
30 Nodes:
- 2x Intel "Skylake" Gold 6140 CPUs @ 2.30 GHz, 18 cores/CPU
- 384 GB memory
- InfiniBand FDR
2 Nodes:
- 2x Intel "Skylake" Gold 6140 CPUs @ 2.30 GHz, 18 cores/CPU
- 384 GB memory
- InfiniBand FDR
- 2x NVIDIA Tesla V100 GPUs, 16 GB memory each
3 Nodes:
- 4x Intel "Westmere" E7-4807 CPUs @ 1.87 GHz, 6 cores/CPU
- 1024 GB memory
- InfiniBand 4X QDR
Scheduler
The queuing system used at GenomeDK is SLURM. Jobs are submitted as batch scripts that declare the resources they need (cores, memory, wall-clock time), and SLURM places them on suitable compute nodes.
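The sketch below shows what a minimal SLURM batch script could look like. It uses only standard SLURM directives; the account name `myproject` and the commented GPU request are placeholders, not actual GenomeDK settings, so check the cluster documentation for the correct values on this system.

```bash
#!/bin/bash
#SBATCH --job-name=example        # name shown in the queue
#SBATCH --cpus-per-task=8         # number of CPU cores for the job
#SBATCH --mem=16g                 # memory for the job
#SBATCH --time=04:00:00           # wall-clock time limit (hh:mm:ss)
#SBATCH --account=myproject       # placeholder project/account name
# To run on one of the V100 GPU nodes, a GPU request such as the
# following is typically needed (the exact gres string is site-specific):
##SBATCH --gres=gpu:1

# Replace with your own commands
echo "Running on $(hostname) with $SLURM_CPUS_PER_TASK cores"
```

Such a script would be submitted with `sbatch script.sh` and monitored with `squeue`, both standard SLURM commands. Partition names, accounting requirements, and GPU resource strings vary between clusters, so consult the GenomeDK documentation before relying on the placeholder values above.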