
Chen Wong
Want to run open-source AI models from your phone? This guide shows how to launch a pod on Runpod and connect to it from your iPhone—no laptop required.
AI Workloads

Brendan McKeag
Use Cursor as your AI-native IDE? Here’s how to connect it directly to LLM pods on Runpod, enabling real-time GPU-powered development with minimal setup.
AI Workloads

Alyssa Mazzina
Not sure why AI needs a GPU? This post breaks it down in plain English—from matrix math to model training—and shows how GPUs power modern AI workloads.
Learn AI

Brendan McKeag
Learn how to deploy a lightweight Gemma 3 model to generate image captions using Runpod Serverless. This walkthrough includes setup, deployment, and sample outputs.
AI Workloads

Alyssa Mazzina
Tired of usage limits or API costs? This guide walks you through switching from OpenAI’s API to your own self-hosted LLM using open-source models on Runpod.
AI Infrastructure

Alyssa Mazzina
Finished training your model in a Pod? This guide helps you decide when to switch to Serverless, what trade-offs to expect, and how to optimize for fast, cost-efficient inference.
AI Infrastructure

Alyssa Mazzina
No GPU. No team. Just $5. This is how one solo developer used Runpod Serverless to build and deploy a working AI product—"AI for Dads"—without writing any custom training code.
AI Workloads