GPU Instance Pricing

5x cheaper than other clouds.
Autoscale with Serverless, with cold starts in milliseconds.
Serverless Pricing
Secure Cloud
All prices in $/hr. The 1x columns are per single GPU; the 8x columns are for 8x GPU configurations.

| GPU | VRAM | RAM / vCPU | 1x Spot | 1x On-Demand | 1x 1-Month | 1x 3-Month | 1x 6-Month | 8x Spot | 8x On-Demand | 8x 1-Month | 8x 3-Month | 8x 6-Month |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A100 80GB | 80 GB | 117 GB / 12 vCPU | - | 1.89 | - | - | - | - | 15.12 | - | - | - |
| A100 SXM 80GB | 80 GB | 125 GB / 16 vCPU | - | 2.29 | - | - | - | - | 18.32 | - | - | - |
| H100 80GB SXM5 | 80 GB | 125 GB / 16 vCPU | - | 4.69 | - | - | - | - | 37.52 | - | - | - |
| H100 80GB PCIe | 80 GB | 188 GB / 16 vCPU | - | 3.89 | - | - | - | - | 31.12 | - | - | - |
| A40 | 48 GB | 48 GB / 9 vCPU | 0.49 | 0.69 | 0.66 | 0.62 | 0.59 | 3.92 | 5.52 | 5.28 | 4.96 | 4.72 |
| L40 | 48 GB | 58 GB / 16 vCPU | 0.69 | 1.14 | 1.08 | 1.02 | 0.97 | 5.52 | 9.12 | 8.64 | 8.16 | 7.76 |
| L40S | 48 GB | 125 GB / 12 vCPU | - | 1.49 | - | - | - | - | 11.92 | - | - | - |
| RTX 6000 Ada | 48 GB | 61 GB / 8 vCPU | - | 1.14 | 1.08 | 1.02 | 0.97 | - | 9.12 | 8.64 | 8.16 | 7.76 |
| RTX A6000 | 48 GB | 50 GB / 8 vCPU | 0.49 | 0.79 | 0.75 | 0.71 | 0.67 | 3.92 | 6.32 | 6.00 | 5.68 | 5.36 |
| RTX 3090 | 24 GB | 24 GB / 2 vCPU | - | 0.44 | - | - | - | - | 3.52 | - | - | - |
| RTX 4090 | 24 GB | 24 GB / 4 vCPU | 0.49 | 0.74 | - | - | - | 3.92 | 5.92 | - | - | - |
| L4 | 24 GB | - | - | 0.44 | 0.42 | 0.39 | 0.37 | - | 3.52 | 3.36 | 3.12 | 2.96 |
| RTX A5000 | 24 GB | 24 GB / 6 vCPU | - | 0.44 | 0.42 | 0.39 | 0.37 | - | 3.52 | 3.36 | 3.12 | 2.96 |
| RTX 4000 Ada | 20 GB | 31 GB / 6 vCPU | - | 0.39 | 0.34 | 0.29 | - | - | 3.12 | 2.72 | 2.32 | - |
| RTX A4500 | 20 GB | 29 GB / 4 vCPU | 0.22 | 0.36 | 0.34 | 0.32 | 0.30 | 1.76 | 2.88 | 2.72 | 2.56 | 2.40 |
| RTX A4000 | 16 GB | 23 GB / 5 vCPU | - | 0.34 | 0.32 | 0.30 | 0.28 | - | 2.72 | 2.56 | 2.40 | 2.24 |
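Since every rate above is quoted per GPU-hour, estimating a bill is simple multiplication. The snippet below is a minimal sketch in plain Python (not a RunPod SDK call); the helper name and the 730-hour month are assumptions for illustration, and the A40 rates are taken from the Secure Cloud table.

```python
# Rough cost estimator for the per-GPU-hour rates above.
# `estimate_monthly_cost` and HOURS_PER_MONTH are illustrative assumptions,
# not part of any RunPod API.

HOURS_PER_MONTH = 730  # average hours in a calendar month

def estimate_monthly_cost(hourly_rate: float, gpus: int = 1,
                          hours: float = HOURS_PER_MONTH) -> float:
    """Cost of running `gpus` GPUs non-stop for `hours` at a per-GPU $/hr rate."""
    return hourly_rate * gpus * hours

# A40 on Secure Cloud (from the table): On-Demand $0.69/hr, 6-month commitment $0.59/hr.
on_demand = estimate_monthly_cost(0.69)   # roughly $503.70 for one GPU
six_month = estimate_monthly_cost(0.59)   # roughly $430.70 for one GPU
print(f"1x A40 on-demand: ${on_demand:,.2f}/month")
print(f"1x A40 6-month:   ${six_month:,.2f}/month "
      f"({1 - six_month / on_demand:.0%} lower)")

# The 8x columns are simply eight times the per-GPU rate (8 x 0.69 = 5.52):
print(f"8x A40 on-demand: ${estimate_monthly_cost(0.69, gpus=8):,.2f}/month")
```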
Deploy Now
Community Cloud
All prices in $/hr. The 1x columns are per single GPU; the 8x columns are for 8x GPU configurations.

| GPU | VRAM | RAM / vCPU | 1x Spot | 1x On-Demand | 8x Spot | 8x On-Demand |
|---|---|---|---|---|---|---|
| A100 80GB | 80 GB | 117 GB / 12 vCPU | 0.89 | 1.59 | 7.12 | 12.72 |
| H100 80GB SXM5 | 80 GB | 125 GB / 16 vCPU | 2.49 | 3.89 | 19.92 | 31.12 |
| H100 80GB PCIe | 80 GB | 188 GB / 16 vCPU | 2.49 | 3.39 | 19.92 | 27.12 |
| A40 | 48 GB | 48 GB / 9 vCPU | 0.34 | 0.67 | 2.72 | 5.36 |
| L40S | 48 GB | 125 GB / 12 vCPU | - | 0.50 | - | 4.00 |
| RTX 6000 Ada | 48 GB | 61 GB / 8 vCPU | 0.69 | 0.99 | 5.52 | 7.92 |
| RTX A6000 | 48 GB | 50 GB / 8 vCPU | 0.34 | 0.69 | 2.72 | 5.52 |
| A100 SXM 40GB | 40 GB | - | - | 1.00 | - | 8.00 |
| RTX 5000 Ada | 32 GB | - | - | 0.69 | - | 5.52 |
| V100 SXM2 32GB | 32 GB | 46 GB / 20 vCPU | - | 0.39 | - | 3.12 |
| A30 | 24 GB | 31 GB / 8 vCPU | 0.19 | 0.26 | 1.52 | 2.08 |
| RTX 3090 | 24 GB | 24 GB / 2 vCPU | 0.19 | 0.26 | 1.52 | 2.08 |
| RTX 3090 Ti | 24 GB | 30 GB / 7 vCPU | 0.20 | 0.29 | 1.60 | 2.32 |
| RTX 4090 | 24 GB | 24 GB / 4 vCPU | 0.39 | 0.54 | 3.12 | 4.32 |
| RTX A5000 | 24 GB | 24 GB / 6 vCPU | 0.19 | 0.26 | 1.52 | 2.08 |
| RTX 4000 Ada | 20 GB | 31 GB / 6 vCPU | 0.15 | 0.21 | 1.20 | 1.68 |
| RTX 4000 Ada SFF | 20 GB | 23 GB / 7 vCPU | 0.15 | 0.21 | 1.20 | 1.68 |
| RTX A4500 | 20 GB | 29 GB / 4 vCPU | 0.15 | 0.21 | 1.20 | 1.68 |
| RTX 4080 | 16 GB | 62 GB / 16 vCPU | - | 0.35 | - | 2.80 |
| RTX A4000 | 16 GB | 23 GB / 5 vCPU | 0.15 | 0.19 | 1.20 | 1.52 |
| V100 FHHL | 16 GB | - | 0.19 | 0.24 | 1.52 | 1.92 |
| Tesla V100 | 16 GB | - | 0.19 | 0.24 | 1.52 | 1.92 |
| V100 SXM2 | 16 GB | 31 GB / 10 vCPU | 0.15 | 0.29 | 1.20 | 2.32 |
| RTX 3080 Ti | 12 GB | 15 GB / 6 vCPU | 0.15 | 0.19 | 1.20 | 1.52 |
| RTX 4070 Ti | 12 GB | 46 GB / 12 vCPU | 0.15 | 0.25 | 1.20 | 2.00 |
| RTX 3080 | 10 GB | 14 GB / 6 vCPU | 0.15 | 0.18 | 1.20 | 1.44 |
| RTX 3070 | 8 GB | 26 GB / 4 vCPU | - | 0.14 | - | 1.12 |
| RTX A2000 | 6 GB | 10 GB / 9 vCPU | - | 0.14 | - | 1.12 |
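For the GPUs that appear in both tables, Community Cloud on-demand and spot rates are generally lower than the Secure Cloud rates. The sketch below compares the A100 80GB rates copied from the two tables, under the same assumptions as the earlier example (plain Python, 730-hour month, illustrative names only).

```python
# Compare Secure vs Community Cloud for an A100 80GB, using the table rates.
# The dictionary and helper names here are illustrative assumptions.

HOURS_PER_MONTH = 730

a100_80gb_rates = {
    "Secure on-demand":    1.89,  # $/hr, Secure Cloud table
    "Community on-demand": 1.59,  # $/hr, Community Cloud table
    "Community spot":      0.89,  # $/hr, Community Cloud table
}

for tier, rate in a100_80gb_rates.items():
    monthly = rate * HOURS_PER_MONTH
    print(f"{tier:<20} ${rate:.2f}/hr  ~ ${monthly:,.2f}/month")
```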
Deploy Now

Storage Pricing

| Storage type | Price |
|---|---|
| Pod Volume / Container Disk (running pod) | $0.10/GB/month |
| Pod Volume / Container Disk (idle volume) | $0.20/GB/month |
| Network Storage | $0.07/GB/month |
| Network Storage (1 TB or more) | $0.05/GB/month |
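A worked example of the storage arithmetic, as a minimal Python sketch. It assumes 1 TB is read as 1,000 GB and that the $0.05 rate applies to the whole network volume once it reaches that size; the function names are illustrative, not a RunPod API.

```python
# Monthly storage cost, per the prices listed above.
# Assumptions (not stated on this page): 1 TB = 1000 GB, and the $0.05/GB
# network-storage rate applies to the entire volume once it is >= 1 TB.

def pod_volume_cost(gb: float, running: bool) -> float:
    """Pod volume / container disk: $0.10/GB/month on a running pod,
    $0.20/GB/month for an idle volume."""
    return gb * (0.10 if running else 0.20)

def network_storage_cost(gb: float) -> float:
    """Network storage: $0.07/GB/month, or $0.05/GB/month for 1 TB or more."""
    return gb * (0.05 if gb >= 1000 else 0.07)

print(f"100 GB pod volume, running pod: ${pod_volume_cost(100, running=True):.2f}/month")
print(f"100 GB pod volume, idle:        ${pod_volume_cost(100, running=False):.2f}/month")
print(f"500 GB network storage:         ${network_storage_cost(500):.2f}/month")
print(f"2 TB network storage:           ${network_storage_cost(2000):.2f}/month")
```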

Bandwidth Pricing

It's FREE!
No hidden internet bandwidth costs.
Don't see a machine configuration that you like?
Don't want to use containers?
We provide custom solutions too!
Contact Us