Alyssa Mazzina

Alyssa is Runpod's Content Marketing Manager. She lives in California with her kids and dogs.

Mixture of Experts (MoE): A Scalable AI Training Architecture

MoE models scale efficiently by activating only a subset of parameters. Learn how this architecture works, why it’s gaining traction, and how Runpod supports MoE training and inference.
Read article
AI Workloads

AI, Content, and Courage Over Comfort: Why I Joined RunPod

Alyssa Mazzina shares her personal journey to joining RunPod, and why betting on bold, creator-first infrastructure felt like the right kind of risk.
Read article
Learn AI

Built on RunPod: How Cogito Trained Models Toward ASI

San Francisco-based Deep Cogito used RunPod infrastructure to train Cogito v1, a high-performance open model family aimed at artificial superintelligence. Here's how they did it.
Read article
AI Workloads

The Future of AI Training: Are GPUs Enough?

GPUs still dominate AI training in 2025, but emerging hardware and hybrid infrastructure are reshaping what's possible. Here’s what GTC revealed—and what it means for you.
Read article
AI Workloads

The RTX 5090 Is Here: Serve 65,000+ Tokens Per Second on RunPod

The new NVIDIA RTX 5090 is now live on RunPod. With blazing-fast inference speeds and large memory capacity, it’s ideal for real-time LLM workloads and AI scaling.
Read article
AI Workloads

How to Choose a Cloud GPU for Deep Learning (Ultimate Guide)

Choosing a cloud GPU isn’t just about power—it’s about efficiency, memory, compatibility, and budget. This guide helps you select the right GPU for your deep learning projects.
Read article
Hardware & Trends

Bare Metal vs. Instant Clusters: What’s Best for Your AI Workload?

Runpod now offers Instant Clusters alongside Bare Metal. This post compares the two deployment options and explains when to choose one over the other for your compute needs.
Read article
AI Infrastructure

Build what’s next.

The most cost-effective platform for building, training, and scaling machine learning models—ready when you are.
