Runpod Blog.

Our team’s insights on building better and scaling smarter.
RunPod Partners With OpenCV to Empower the Next Gen of AI Builders
RunPod has teamed up with OpenCV to provide free GPU access for students building the future of computer vision. Learn how the partnership works and who it supports.
Read article
AI Workloads
Meta and Microsoft Release Llama 2 as Open Source
Llama 2 is now open source, offering a native 4k context window and strong performance. This post walks through how to download it from Meta or use TheBloke’s quantized versions.
Read article
Hardware & Trends
Runpod Roundup: High-Context LLMs, SDXL, and Llama 2
This Runpod Roundup covers the arrival of 8k–16k token context models, the release of Stable Diffusion XL, and the launch of Llama 2 by Meta and Microsoft. All are now available to run on Runpod.
Read article
Hardware & Trends
How to Install SillyTavern in a Runpod Instance
This guide walks through setting up SillyTavern—a powerful, customizable roleplay frontend—on a Runpod instance. It covers port exposure, GitHub installation, whitelist config, and connecting to backends like Oobabooga or KoboldAI.
Read article
Learn AI
How to Install SillyTavern in a RunPod Instance
Want to upgrade from basic chat UIs? SillyTavern offers a more interactive interface for AI conversations. Here’s how to install it on your own RunPod instance.
Read article
Learn AI
16k Context LLM Models Now Available On Runpod
Runpod now supports Panchovix’s 16k-token context models, allowing for much deeper context retention in long-form generation. These models require higher VRAM and may trade off some performance, but are ideal for extended sessions like roleplay or complex Q&A.
Read article
Product Updates
Runpod Partners With Defined.ai To Democratize and Accelerate AI Development
Runpod announces a partnership with Defined.ai to offer ethically sourced speech and text datasets to AI developers, starting with a pilot program to fine-tune LLMs and accelerate NLP research.
Read article
Product Updates