Go got generics in 1.18, in March 2022. It’s been a year and a half. I’ve had time to form some opinions about when I use them and when I don’t. Spoiler: they’re good, but I use them much less than I expected.

The ideal use case, the one everyone points to, is generic container types:

type Set[T comparable] struct {
    m map[T]struct{}
}

func NewSet[T comparable]() *Set[T] {
    return &Set[T]{m: make(map[T]struct{})}
}

func (s *Set[T]) Add(v T) { s.m[v] = struct{}{} }
func (s *Set[T]) Has(v T) bool { _, ok := s.m[v]; return ok }
func (s *Set[T]) Len() int { return len(s.m) }

Before generics, you’d either write this for a specific type, or use map[any]struct{} and type-assert everywhere. Now you write it once and get a typed API. This is straightforwardly a win. I now have a personal genericslib with typed set, deque, priority queue, and LRU cache.

The second good use case is generic utility functions:

func Map[T, U any](s []T, f func(T) U) []U {
    r := make([]U, len(s))
    for i, v := range s {
        r[i] = f(v)
    }
    return r
}

func Filter[T any](s []T, pred func(T) bool) []T {
    // Allocate a fresh slice; the in-place trick (r := s[:0]) would
    // overwrite the caller's input as it appends.
    r := make([]T, 0, len(s))
    for _, v := range s {
        if pred(v) {
            r = append(r, v)
        }
    }
    return r
}

Most of these are now in golang.org/x/exp/slices and golang.org/x/exp/maps (and a slices package landed in the standard library proper in Go 1.21), so I don’t actually write them myself anymore. slices.SortFunc, slices.Index, slices.Contains, maps.Keys, maps.Values — all typed, no assertions, no boxing. Nice.

Where I thought I’d use generics and I don’t:

Writing generic interfaces. You can define type Reader[T any] interface { Read() (T, error) }, but you can’t put it in a container, because Reader[int] and Reader[string] are different types. Go doesn’t have type erasure. If you want a heterogeneous list of “things that can read,” you still need interface{ Read() (any, error) }.
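
A small sketch of the problem (ints and words are made-up types, not from any library):

```go
package main

import "fmt"

type Reader[T any] interface {
	Read() (T, error)
}

type ints struct{ n int }

func (r *ints) Read() (int, error) { r.n++; return r.n, nil }

type words struct{}

func (words) Read() (string, error) { return "hi", nil }

func main() {
	var ri Reader[int] = &ints{}
	var rs Reader[string] = words{}
	// readers := []Reader{ri, rs} // won't compile: Reader isn't a type
	// without a type argument, and Reader[int] != Reader[string].
	v, _ := ri.Read()
	w, _ := rs.Read()
	fmt.Println(v, w) // 1 hi
}
```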

Numeric generics. Go’s constraints.Ordered and constraints.Integer let you write generic numeric functions, but they’re not as powerful as you’d think. For example, you can’t write a generic Mean that returns the right type for integers vs floats without jumping through hoops. Integer division is truncating; float division isn’t. Generic code has to pick one behavior. Mostly I end up writing MeanInt and MeanFloat separately.
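
To make the hoops concrete, here’s the escape hatch I sometimes settle for: a generic Mean that gives up on returning the element type and always returns float64. (Number is a constraint I’m defining here, not one from the constraints package.)

```go
package main

import "fmt"

// Number is a hypothetical constraint covering the types I actually use.
type Number interface {
	~int | ~int64 | ~float32 | ~float64
}

// Mean sidesteps the int-vs-float division question by converting to
// float64, at the cost of no longer returning the element type.
func Mean[T Number](s []T) float64 {
	var sum T
	for _, v := range s {
		sum += v
	}
	return float64(sum) / float64(len(s))
}

func main() {
	fmt.Println(Mean([]int{1, 2, 4}))      // 2.3333333333333335
	fmt.Println(Mean([]float64{1.5, 2.5})) // 2
}
```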

Reducing interface satisfaction boilerplate. I thought generics would let me avoid writing type IntSet struct { ... } and type StringSet struct { ... } with identical methods. They do! But… I already had IntSet and StringSet. And refactoring to a generic version is a non-trivial change. So I use generics for new code, but I haven’t gone back and genericized older code much.

Now, the subtle stuff.

Methods can’t take their own type parameters. A method on a generic type can use the type’s parameters freely, so this compiles fine:

// compiles: Union only uses T, which belongs to Set
func (s *Set[T]) Union(other *Set[T]) *Set[T]

What doesn’t compile is a method that introduces a new type parameter:

// DOES NOT COMPILE
func (s *Set[T]) Reduce[U any](f func(U, T) U, init U) U { ... }

You can’t add type parameters to methods, only to the type itself. This feels arbitrary but there’s a reason — method sets would get unboundedly large at compile time. Workaround: make it a function, not a method.

func Reduce[T comparable, U any](s *Set[T], f func(U, T) U, init U) U { ... }
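
Filled in, it looks like this (repeating the Set definition so the sketch is self-contained; note that T is constrained to comparable to match Set, and that map iteration order is random, so f should be order-independent):

```go
package main

import "fmt"

type Set[T comparable] struct{ m map[T]struct{} }

func NewSet[T comparable]() *Set[T] { return &Set[T]{m: make(map[T]struct{})} }
func (s *Set[T]) Add(v T)          { s.m[v] = struct{}{} }

// Reduce folds f over the set's elements. Iteration order over the
// internal map is randomized, so f should not depend on element order
// (sums, unions, max: fine; string concatenation: not so much).
func Reduce[T comparable, U any](s *Set[T], f func(U, T) U, init U) U {
	acc := init
	for v := range s.m {
		acc = f(acc, v)
	}
	return acc
}

func main() {
	s := NewSet[int]()
	s.Add(1)
	s.Add(2)
	s.Add(3)
	fmt.Println(Reduce(s, func(acc, v int) int { return acc + v }, 0)) // 6
}
```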

Zero values are tricky. Inside a generic function, var x T gives you the zero value of T, but you don’t know what that is. If T is a pointer, it’s nil. If T is a struct, it’s the zero struct. If T is time.Time, it’s the time that nobody ever wants. Code like if x == zero { ... } often fails to typecheck, because == isn’t available unless T is constrained to comparable. I find myself writing generic code that never needs zero values, or taking a “zero value” as an argument.
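
Both tricks in one small sketch (FirstNonZero is a made-up helper): constraining T to comparable makes the == legal, and var zero T names the zero value without knowing what it is:

```go
package main

import "fmt"

// FirstNonZero returns the first element of s that isn't T's zero value.
// The comparable constraint is what makes `v != zero` typecheck at all.
func FirstNonZero[T comparable](s []T) (T, bool) {
	var zero T // nil, 0, "", the zero struct... whatever T's zero is
	for _, v := range s {
		if v != zero {
			return v, true
		}
	}
	return zero, false
}

func main() {
	v, ok := FirstNonZero([]string{"", "", "go"})
	fmt.Println(v, ok) // go true
}
```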

Type inference is limited. Go’s inference can figure out types from function arguments but NOT from return types. So:

result := Reduce(items, func(acc int, x Thing) int { return acc + x.Count }, 0)

works fine. But:

result := Parse[MyType](data)  // can't infer MyType

has to be specified explicitly. I’ve been caught by this when refactoring — adding a generic parameter that has to be explicit breaks a lot of call sites.

Compile times can suffer. Each generic type instantiation is compiled separately (via “GCshape” stenciling — not quite monomorphization, not quite erasure). Heavy generic use can slow builds. I haven’t personally felt this, but I’ve heard reports from larger codebases.

Runtime performance is surprising. Go’s generics implementation is neither monomorphized (like Rust) nor erased (like Java). It uses “GCshape stenciling plus dictionaries,” which means compatible types share a single compiled body with a dictionary parameter. This means generic code is often slightly slower than hand-specialized code, because of the dictionary lookup. For hot loops, this can matter. If you benchmark your generic function against a specialized copy, you may find the specialized version wins by 10-20%. Usually not enough to care about, but worth knowing.
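
Before committing generic code to a hot path, measure. Here’s a minimal harness (all names are mine) using testing.Benchmark, which works from an ordinary main without go test. One caveat: for plain value types like int, the compiler stencils what amounts to a specialized body, so the gap tends to show up mainly with pointer- or interface-heavy type arguments.

```go
package main

import (
	"fmt"
	"testing"
)

// SumGeneric is the generic version under test.
func SumGeneric[T int | int64 | float64](s []T) T {
	var total T
	for _, v := range s {
		total += v
	}
	return total
}

// SumInt is the hand-specialized copy to compare against.
func SumInt(s []int) int {
	total := 0
	for _, v := range s {
		total += v
	}
	return total
}

func main() {
	data := make([]int, 1<<16)
	for i := range data {
		data[i] = i
	}
	g := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = SumGeneric(data)
		}
	})
	s := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = SumInt(data)
		}
	})
	fmt.Println("generic:    ", g)
	fmt.Println("specialized:", s)
}
```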

My current rule of thumb:

  • Use generics for containers and utility functions.
  • Don’t write a generic thing unless you have at least 2 concrete call sites.
  • For hot-path code, benchmark against a specialized version before committing to the generic approach.
  • Don’t try to use generics to model OOP-style hierarchies. Go wasn’t designed for that.

A year and a half in, generics feel like an orthogonal feature: they exist, they help in specific cases, they don’t change how I approach most problems. The vast majority of my Go code still doesn’t use them. The library authors benefit disproportionately — slices, maps, sync/atomic.Pointer[T] — and the rest of us get nice typed APIs. That’s enough.

See also my post on sync.Map — a typed generic replacement for sync.Map has been proposed but isn’t in the standard library yet.