
Runpod Blog

Our team’s insights on building better and scaling smarter.
Open Source Video & LLM Roundup: The Best of What’s New

Open-source AI is booming—and 2024 delivered an incredible wave of new LLMs and generative video models. Here’s a quick roundup of the most exciting releases you can run today.
Read article
Hardware & Trends
What Even Is AI? A Writer & Marketer’s Perspective

Part 1 of the “Learn AI With Me” no-code series. If you’re not a dev, this post breaks down AI in human terms—from chatbots to image generation—and why it’s worth learning.
Read article
Learn AI
Training Flux.1 Dev on MI300X with Massive Batch Sizes

Explore what’s possible when training Flux.1 Dev on AMD’s 192GB MI300X GPU. This post dives into fine-tuning at scale with huge batch sizes and real-world performance.
Read article
AI Workloads
Streamline GPU Cloud Management with RunPod’s New REST API

RunPod’s new REST API lets you manage GPU workloads programmatically—launch, scale, and monitor pods without ever touching the dashboard.
Read article
AI Infrastructure
AI, Content, and Courage Over Comfort: Why I Joined RunPod

Alyssa Mazzina shares her personal journey to joining RunPod and why betting on bold, creator-first infrastructure felt like the right kind of risk.
Read article
Learn AI
Enhanced CPU Pods Now Support Docker and Network Volumes

We’ve upgraded Runpod CPU pods with Docker runtime and network volume support—giving you more flexibility, better storage options, and smoother dev workflows.
Read article
Product Updates
Run DeepSeek R1 on Just 480GB of VRAM

DeepSeek R1 remains one of the top open-source models. This post shows how you can run it efficiently on just 480GB of VRAM without sacrificing performance.
Read article
AI Workloads