GPU Instance Restrictions
Learn more about the details of what you can and cannot provision. Most of these are due to physical hardware constraints. As always, feel free to contact us if you have any questions!
General Restrictions
These limits apply to GPU instances with the corresponding GPU model. Note that the
minimum vCPU count is always 1 vCPU per GPU: if you provision 8 GPUs, at least
8 vCPUs must be allocated for the virtual machine provisioning request to succeed.
These restrictions keep our host nodes balanced, but feel free to contact us
if you have different needs, and we'd be happy to consider them.
GPU | Max GPUs | Network (Gbps) | Min vCPU/GPU | Max vCPU/GPU | Max Instance vCPUs |
---|---|---|---|---|---|
A100 80GB PCIE | 8 | 10 | 1 | 30 | 94 |
A100 40GB SXM | 8 | 10 | 1 | 30 | 124 |
A100 40GB PCIE | 8 | 10 | 1 | 18 | 94 |
V100 16GB SXM | 8 | 10 | 1 | 8 | 60 |
V100 16GB PCIE | 7 | 10 | 1 | 8 | 44 |
A40 | 8 | 10 | 1 | 40 | 94 |
A6000 | 4 | 10 | 1 | 15 | 60 |
A5000 | 4 | 10 | 1 | 22 | 44 |
A4000 | 7 | 10 | 1 | 32 | 46 |
Quadro RTX 5000 | 4 | 10 | 1 | 24 | 44 |
Quadro RTX 4000 | 7 | 10 | 1 | 40 | 40 |
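The vCPU rules above can be sketched as a small validation check. This is an illustrative sketch only, not part of any real API; the function and dictionary names are hypothetical, and the limits are copied from a few rows of the table.

```python
# Hypothetical vCPU limit check (illustrative names, not a real API).
# Per-model limits taken from a few rows of the table above.
GPU_VCPU_LIMITS = {
    # model: (max GPUs, min vCPU/GPU, max vCPU/GPU, max instance vCPUs)
    "A100 80GB PCIE": (8, 1, 30, 94),
    "V100 16GB SXM": (8, 1, 8, 60),
    "A4000": (7, 1, 32, 46),
}

def vcpus_valid(model: str, gpus: int, vcpus: int) -> bool:
    """Check a provisioning request against the per-model vCPU limits."""
    max_gpus, min_per, max_per, max_total = GPU_VCPU_LIMITS[model]
    if not 1 <= gpus <= max_gpus:
        return False
    # At least 1 vCPU per GPU, at most the per-GPU cap,
    # and never more than the per-instance cap.
    return min_per * gpus <= vcpus <= min(max_per * gpus, max_total)

# An 8-GPU V100 SXM request needs at least 8 vCPUs:
assert not vcpus_valid("V100 16GB SXM", 8, 7)
assert vcpus_valid("V100 16GB SXM", 8, 8)
```

Note how the per-instance cap can bind before the per-GPU cap does: 8 × V100 SXM allows up to 8 vCPUs per GPU (64 total), but the instance is still limited to 60 vCPUs.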
RAM Restrictions on GPU Instances
These limits apply to GPU instances with the corresponding GPU model. Note that the minimum RAM is always 1 GB per vCPU: if you have 32 vCPUs, you need at least 32 GB of RAM to be allocated.
GPU | Min RAM/vCPU (GB) | Max RAM/vCPU (GB) | Max Instance RAM (GB) |
---|---|---|---|
A100 80GB PCIE | 1 | 48 | 731 |
A100 40GB SXM | 1 | 240 | 960 |
A100 40GB PCIE | 1 | 24 | 492 |
V100 16GB SXM | 1 | 120 | 960 |
V100 16GB PCIE | 1 | 60 | 240 |
A40 | 1 | 48 | 741 |
A6000 | 1 | 124 | 496 |
A5000 | 1 | 30 | 366 |
A4000 | 1 | 8 | 366 |
Quadro RTX 5000 | 1 | 120 | 240 |
Quadro RTX 4000 | 1 | 120 | 240 |
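The RAM limits work the same way and can be sketched with a similar check. Again, this is only an illustrative sketch with hypothetical names; the limits come from a few rows of the table above.

```python
# Hypothetical RAM limit check (illustrative names, not a real API).
# Per-model limits taken from a few rows of the table above.
GPU_RAM_LIMITS = {
    # model: (min GB RAM/vCPU, max GB RAM/vCPU, max instance RAM in GB)
    "A100 40GB SXM": (1, 240, 960),
    "A5000": (1, 30, 366),
    "Quadro RTX 4000": (1, 120, 240),
}

def ram_valid(model: str, vcpus: int, ram_gb: int) -> bool:
    """Check requested RAM against per-vCPU and per-instance limits."""
    min_per, max_per, max_total = GPU_RAM_LIMITS[model]
    # At least 1 GB per vCPU, at most the per-vCPU cap,
    # and never more than the per-instance cap.
    return min_per * vcpus <= ram_gb <= min(max_per * vcpus, max_total)

# An instance with 32 vCPUs needs at least 32 GB of RAM:
assert not ram_valid("A5000", 32, 31)
assert ram_valid("A5000", 32, 32)
```

As with vCPUs, the per-instance cap can bind first: a Quadro RTX 4000 instance allows up to 120 GB per vCPU, but total RAM is still limited to 240 GB regardless of vCPU count.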