Emmett Fear

Emmett runs Growth at Runpod. He lives in Utah with his wife and dog and loves to spend time hiking and paddleboarding. He has worked in many facets of tech, from marketing and operations to product and, most recently, growth.

Serverless GPUs for API Hosting: How They Power AI APIs: A Runpod Guide

Explores how serverless GPUs power AI-driven APIs on platforms like Runpod. Demonstrates how on-demand GPU instances efficiently handle inference requests and auto-scale, making them ideal for serving AI models as APIs.
Guides
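The on-demand model described above boils down to sending an inference request to a managed endpoint and letting the platform spin up a GPU worker behind it. The sketch below assembles such a request; the endpoint ID, API key, and payload schema are hypothetical placeholders, and the URL pattern assumes Runpod's serverless v2 REST API.

```python
# Minimal sketch of calling a serverless GPU endpoint as an API.
# ENDPOINT_ID, the key, and the input schema are illustrative placeholders;
# the URL pattern assumes Runpod's serverless v2 REST API.

def build_runsync_request(endpoint_id: str, api_key: str, prompt: str) -> dict:
    """Assemble the pieces of a synchronous inference request."""
    return {
        "url": f"https://api.runpod.ai/v2/{endpoint_id}/runsync",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {"input": {"prompt": prompt}},
    }

req = build_runsync_request("my-endpoint", "rp_example_key", "Hello!")
```

In practice you would pass these fields to something like `requests.post(...)`; the serverless platform cold-starts a GPU worker only when a request arrives and scales workers down when traffic stops.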

Unpacking Serverless GPU Pricing for AI Deployments

Breaks down how serverless GPU pricing works for AI deployments. Understand the pay-as-you-go cost model and learn tips to optimize usage to minimize expenses for cloud-based ML tasks.
Guides
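As a rough illustration of the pay-as-you-go model that article covers, you can compare per-second serverless billing against an always-on instance. The rates below are made-up assumptions for the sake of the arithmetic, not Runpod's actual prices.

```python
# Toy comparison: pay-per-second serverless billing vs. an always-on GPU.
# All rates here are illustrative assumptions, not real Runpod pricing.

def serverless_cost(requests_per_day: int, seconds_per_request: float,
                    usd_per_second: float) -> float:
    """Daily cost when you pay only for active inference time."""
    return requests_per_day * seconds_per_request * usd_per_second

def always_on_cost(usd_per_hour: float) -> float:
    """Daily cost of keeping one instance running 24/7."""
    return usd_per_hour * 24

daily_serverless = serverless_cost(1000, 2.0, 0.0005)  # 1,000 requests x 2 s
daily_dedicated = always_on_cost(0.50)                 # hypothetical $0.50/hr
# -> $1.00/day serverless vs. $12.00/day always-on at these assumed rates
```

The gap narrows as utilization rises, which is the core trade-off the pricing guide walks through: bursty, low-traffic workloads favor per-second billing, while sustained traffic can favor a dedicated instance.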

Unlock Efficient Model Fine-Tuning With Pod GPUs Built for AI Workloads

Shows how Runpod’s specialized Pod GPUs enable efficient model fine-tuning for AI workloads. Explains how these GPUs accelerate training while reducing resource costs for intensive machine learning tasks.
Guides

How to Deploy LLaMA.cpp on a Cloud GPU Without Hosting Headaches

Shows how to deploy LLaMA.cpp on a cloud GPU without the usual hosting headaches. Covers setting up the model in a Docker container and running it for efficient inference, all while avoiding complex server management.
Guides
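The container-based deployment that article describes usually comes down to a single `docker run` invocation against llama.cpp's built-in server. The sketch below assembles that command as an argument list; the image tag, mount path, and flags are assumptions for illustration, so check the llama.cpp documentation for current container names and options.

```python
# Sketch of assembling a `docker run` command for llama.cpp's server mode.
# The image tag, mount path, and flags are assumptions for illustration.

def llamacpp_docker_cmd(model_dir: str, port: int = 8080) -> list:
    """Build an argument list suitable for subprocess.run()."""
    return [
        "docker", "run", "--gpus", "all",
        "-p", f"{port}:{port}",
        "-v", f"{model_dir}:/models",              # mount the model directory
        "ghcr.io/ggerganov/llama.cpp:server-cuda", # assumed image tag
        "-m", "/models/model.gguf",                # GGUF model file
        "--host", "0.0.0.0",
        "--port", str(port),
        "-ngl", "99",                              # offload layers to the GPU
    ]

cmd = llamacpp_docker_cmd("/data/llama")
```

Passing the list to `subprocess.run(cmd)` avoids shell-quoting pitfalls, and once the container is up the server exposes an HTTP inference endpoint on the chosen port.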

Nvidia B200 GPU: Specs, VRAM, Price, and AI Performance

The complete guide to the Nvidia B200 GPU: full specs, 180 GB HBM3e VRAM, pricing, AI benchmark performance, and how it compares to the H100 and H200 for cloud GPU workloads on Runpod.
Guides

How to Run Automatic1111 (Stable Diffusion Web UI) on Runpod

Step-by-step guide to running Automatic1111 (Stable Diffusion Web UI) on Runpod cloud GPUs. Covers setup, model loading, SDXL, ControlNet, Forge, and GPU recommendations.
Guides

Cloud Tools with Easy Integration for AI Development Workflows

Introduces cloud-based tools that integrate seamlessly into AI development workflows. Highlights how these tools simplify model training and deployment by minimizing setup and accelerating development cycles.
Guides

Running Whisper with a UI in Docker: A Beginner’s Guide

Provides a beginner-friendly tutorial for running OpenAI’s Whisper speech recognition with a GUI in Docker, covering container setup and using a web UI for transcription without coding.
Guides

Accelerate Your AI Research with Jupyter Notebooks on Runpod

Describes how using Jupyter Notebooks on Runpod accelerates AI research by providing interactive development on powerful GPUs. Enables faster experimentation and prototyping in the cloud.
Guides

Build what’s next.

The most cost-effective platform for building, training, and scaling machine learning models—ready when you are.
