Gophercamp2026

Sessions

Discover the talks, workshops, and lightning talks at Gophercamp 2026

13 sessions confirmed

This list is not final and is subject to change.

Observability-Driven Development: Why 99.9% uptime doesn't mean your product works

Abstract

Your users are leaving before you know they had problems. A slow signup flow, a failing payment endpoint, or a broken onboarding step. By the time you hear about it from support tickets, you've already lost trust and revenue.

Most Go applications start with great intentions: fast iteration, clean code, and rapid shipping. But without the right observability foundations from day one, teams end up flying blind. Metrics live in one place, logs in another, and there's no way to connect a spike in error rates to actual user impact.

In this talk, I'll share hard-won lessons from building production systems at scale and show you how to instrument Go applications with user journeys at the center. You'll learn how to build a minimal, effective observability stack using OpenTelemetry, connect technical signals to business outcomes, and establish SLOs that Product and Engineering can co-own. This is not a talk about adding more dashboards. This is about shipping fast with confidence.

What You'll Learn

1. Why observability is a Day 1 decision
   - The cost of flying blind: churn, firefighting, and lost roadmap time
   - How to measure user outcomes, not just server health
   - The difference between good and great early-stage observability
2. Building the minimal observability stack in Go
   - Instrumenting with OpenTelemetry: metrics, traces, and structured logs
   - Choosing the right backends: Prometheus, Tempo, Loki (or managed alternatives)
   - Connecting technical signals: from metric spike → trace → log → user impact
   - Practical Go patterns: middleware, context propagation, and sampling strategies
3. Making SLOs about user journeys
   - Defining SLIs/SLOs for core flows (signup, checkout, onboarding)
   - Shared ownership between Product & Engineering
   - Using error budgets to balance speed and reliability
   - Release guardrails: detecting regressions in minutes, not hours

Target Audience

- Go developers at startups or scale-ups who want to build observability from the ground up
- Engineering leads balancing velocity with reliability
- Product Engineers who need to understand user impact, not just server metrics
- Anyone who has debugged production issues by guessing

Prerequisites: Basic Go experience. No prior observability knowledge required.

Ultimate Private AI

This is a hands-on, full-day workshop where you'll go from zero to running open-source models directly inside your Go applications — no cloud APIs, no external servers, no data leaving your machine. You'll start by loading a model and running your first inference with the Kronk SDK.

Then you'll learn how to configure models for your hardware — GPU layers, KV cache placement, batch sizes, and context windows — so you get the best performance out of whatever machine you're running on. With the model tuned, you'll take control of its output through sampling parameters: temperature, top-k, top-p, repetition penalties, and grammar constraints that guarantee structured JSON responses.

Next you'll see how Kronk's caching systems — System Prompt Cache (SPC) and Incremental Message Cache (IMC) — eliminate redundant computation and make multi-turn conversations fast. You'll watch a conversation go from full prefill on every request to only processing the newest message.

With the foundation solid, you'll build real applications: a Retrieval-Augmented Generation (RAG) pipeline that grounds model responses in your own documents using embeddings and vector search, and a natural-language-to-SQL system where the model generates database queries from plain English — with grammar constraints ensuring the output is always valid, executable SQL.

Each part builds on the last. By the end of the day, you won't just understand how private AI works — you'll have built applications that load models, cache intelligently, retrieve context, and generate code, all running locally on your own hardware.

Understanding Escape Analysis in Go - How Variables Move Between Stack and Heap

As a seasoned Go developer responsible for building and maintaining a registrar backend that handles connections to about 40 registries at roughly 1k req/min, I have to make sure the system stays reliable and performant. Escape analysis is one of the key things I had to understand to get there.

Why Escape Analysis Matters

The Go compiler automatically decides whether each variable lives on the stack (fast, automatically freed) or the heap (managed by the GC, slower). While many developers never think about memory allocation, understanding escape analysis can be crucial for performance-sensitive code paths. Excess heap allocations increase garbage collection pressure and can slow down applications.

What Escape Analysis Is

Escape analysis is a static compiler optimization that determines whether a variable can safely be kept on the stack. If a variable's address escapes the function scope (for example, because it is returned or stored for later use), the compiler must allocate it on the heap.

Code Walkthrough with Examples

We'll explore key patterns that force heap escapes or keep data on the stack:

1. Simple value return vs pointer return
2. Passing pointers and how this affects escape decisions
3. Why local pointers sometimes don't escape

Each example will include the `-gcflags="-m"` output to show the compiler's reasoning.

How to Inspect Escape Behavior in Your Code

Attendees will learn how to use `go build -gcflags="-m"` to see escape analysis annotations. We'll interpret compiler messages and explain how they map to code behavior.

Practical Tips to Reduce Unnecessary Heap Allocations

Beyond theory, the talk will cover actionable advice:

1. Prefer returning values instead of pointers when possible
2. Be mindful of interfaces and closures that may cause escapes
3. Understand allocations in hot paths and optimize where it matters

Want to Present at Gophercamp?

We're still accepting proposals! Share your Go knowledge and experiences with the community.

Submit Your Proposal

Submission deadline: March 31, 2026