
Brendan McKeag
This week’s RunPod Roundup covers major releases including Llama 2 with 32k context support, SDXL 1.0’s public release, and Stability AI’s new Stable Beluga LLMs, all now available to run on RunPod.
AI Workloads

Brendan McKeag
Stable Diffusion XL 1.0 is now live on RunPod with full support in the Fast Stable Diffusion template. Users can generate higher-resolution, more anatomically accurate, and text-capable images with simplified prompts using AUTOMATIC1111 via a streamlined Jupyter setup.
AI Workloads

Brendan McKeag
RunPod has teamed up with OpenCV to provide free GPU access for students building the future of computer vision. Learn how the partnership works and who it supports.
AI Workloads

Brendan McKeag
Llama 2 is now open source, offering a native 4k context window and strong performance. This post walks through how to download it from Meta or use TheBloke’s quantized versions.
Hardware & Trends

Brendan McKeag
This RunPod Roundup covers the arrival of 8k–16k token context models, the release of Stable Diffusion XL, and the launch of Llama 2 by Meta and Microsoft. All are now available to run on RunPod.
Hardware & Trends

Brendan McKeag
This guide walks through setting up SillyTavern—a powerful, customizable roleplay frontend—on a RunPod instance. It covers port exposure, GitHub installation, whitelist configuration, and connecting to backends like Oobabooga or KoboldAI.
Learn AI

Brendan McKeag
Want to upgrade from basic chat UIs? SillyTavern offers a more interactive interface for AI conversations. Here’s how to install it on your own RunPod instance.
Learn AI