Go Generics: Practical Patterns After a Year of Use
Go 1.18 shipped with generics a little over a year ago. I've been using them since the beta, first in side projects and now in production services, and I want to share what's actually useful versus what's hype.
The short version: generics solve a real, narrow problem in Go. They're not a general-purpose abstraction tool. If you go in with the right expectations, they're great.
The Syntax
Type parameters go in square brackets before the function signature:
func Map[T, U any](s []T, f func(T) U) []U {
	result := make([]U, len(s))
	for i, v := range s {
		result[i] = f(v)
	}
	return result
}
T and U are type parameters. any is the constraint: the new predeclared alias for interface{}. Use any everywhere going forward; interface{} still works, but any is cleaner.
Constraints define what operations you can perform on a type parameter. The comparable constraint lets you use == and !=, which means you can use the type as a map key:
func Contains[T comparable](s []T, v T) bool {
	for _, item := range s {
		if item == v {
			return true
		}
	}
	return false
}
For ordered comparisons (<, >, <=, >=), you need constraints.Ordered from golang.org/x/exp/constraints. This covers all integer and floating-point types plus strings. Note: in Go 1.21, the equivalent cmp.Ordered moves into the standard library, and the new slices and maps packages replace most of what you'd reach for golang.org/x/exp to get today.
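To make the constraint concrete, here's a sketch that writes out the same type set by hand instead of importing x/exp. The Ordered interface below mirrors what constraints.Ordered (and later cmp.Ordered) covers; Max is a toy helper for the example, not a stdlib function:

```go
package main

import "fmt"

// Ordered mirrors constraints.Ordered: every type that supports
// <, >, <=, >=. The ~ means "any type whose underlying type is this".
type Ordered interface {
	~int | ~int8 | ~int16 | ~int32 | ~int64 |
		~uint | ~uint8 | ~uint16 | ~uint32 | ~uint64 | ~uintptr |
		~float32 | ~float64 | ~string
}

// Max returns the larger of two ordered values.
func Max[T Ordered](a, b T) T {
	if a > b {
		return a
	}
	return b
}

func main() {
	fmt.Println(Max(3, 7))            // 7
	fmt.Println(Max("apple", "pear")) // pear
	fmt.Println(Max(2.5, 1.5))        // 2.5
}
```

One function now handles ints, floats, and strings, with the comparison checked at compile time.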
The Slice Utilities You'll Actually Use
Before generics, every Go codebase either duplicated these functions for each type or wrote them with interface{} and type assertions. Now:
func Filter[T any](s []T, f func(T) bool) []T {
	var result []T
	for _, v := range s {
		if f(v) {
			result = append(result, v)
		}
	}
	return result
}
func Reduce[T, U any](s []T, init U, f func(U, T) U) U {
	acc := init
	for _, v := range s {
		acc = f(acc, v)
	}
	return acc
}
These are genuinely useful in production code. Filtering a slice of structs, mapping over a list of IDs to fetch records, reducing a list of errors into a single error: all cleaner with generics than with the alternatives.
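As a rough illustration of that style, here's a runnable sketch chaining the three helpers over a made-up User type (the type and data are invented for the example):

```go
package main

import "fmt"

func Map[T, U any](s []T, f func(T) U) []U {
	result := make([]U, len(s))
	for i, v := range s {
		result[i] = f(v)
	}
	return result
}

func Filter[T any](s []T, f func(T) bool) []T {
	var result []T
	for _, v := range s {
		if f(v) {
			result = append(result, v)
		}
	}
	return result
}

func Reduce[T, U any](s []T, init U, f func(U, T) U) U {
	acc := init
	for _, v := range s {
		acc = f(acc, v)
	}
	return acc
}

type User struct {
	Name   string
	Active bool
}

// ActiveNames filters to active users, maps to their names,
// and reduces to a comma-separated string.
func ActiveNames(users []User) string {
	active := Filter(users, func(u User) bool { return u.Active })
	names := Map(active, func(u User) string { return u.Name })
	return Reduce(names, "", func(acc, n string) string {
		if acc == "" {
			return n
		}
		return acc + ", " + n
	})
}

func main() {
	users := []User{{"alice", true}, {"bob", false}, {"carol", true}}
	fmt.Println(ActiveNames(users)) // alice, carol
}
```

Every step is fully typed; there isn't a single assertion or interface{} in the pipeline.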
Type Inference
One thing that surprised me: you almost never need to write the type parameters explicitly. The compiler infers them from the arguments:
names := []string{"alice", "bob", "carol"}
upper := Map(names, strings.ToUpper) // type params inferred
found := Contains(names, "bob") // inferred
You only need explicit type parameters in edge cases where inference can't figure it out, which is rare for the utility functions above.
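The two edge cases I've actually hit are sketched below: using a generic function as a plain function value, and a type parameter that appears only in the result (Zero here is a hypothetical helper, not from any library):

```go
package main

import "fmt"

func Map[T, U any](s []T, f func(T) U) []U {
	result := make([]U, len(s))
	for i, v := range s {
		result[i] = f(v)
	}
	return result
}

// Zero returns the zero value of T. The parameter appears only in
// the result, so there's never an argument to infer it from.
func Zero[T any]() T {
	var z T
	return z
}

func main() {
	// Case 1: instantiating a generic function as a value. With no
	// call arguments in sight, the type parameters must be spelled out.
	lengths := Map[string, int]
	fmt.Println(lengths([]string{"go", "rust"}, func(s string) int { return len(s) })) // [2 4]

	// Case 2: result-only type parameter.
	fmt.Println(Zero[int]()) // 0
}
```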
The Typed Cache: A Real Before/After
Here's the pattern I've replaced most often in production services. The old way, using sync.Map:
// Before: untyped, interface{} everywhere
type Cache struct {
	m sync.Map
}

func (c *Cache) Set(key string, value interface{}) {
	c.m.Store(key, value)
}

func (c *Cache) Get(key string) (interface{}, bool) {
	return c.m.Load(key)
}

// Caller has to type-assert, and can get it wrong silently
val, ok := cache.Get("session:123")
if ok {
	session := val.(*Session) // panics if wrong type stored
}
The new way with generics:
// After: typed, no assertions needed
type Cache[K comparable, V any] struct {
	mu sync.RWMutex
	m  map[K]V
}

func NewCache[K comparable, V any]() *Cache[K, V] {
	return &Cache[K, V]{m: make(map[K]V)}
}

func (c *Cache[K, V]) Set(key K, value V) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.m[key] = value
}

func (c *Cache[K, V]) Get(key K) (V, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.m[key]
	return v, ok
}
Usage:
sessionCache := NewCache[string, *Session]()
sessionCache.Set("session:123", &Session{UserID: 42})
session, ok := sessionCache.Get("session:123")
// session is *Session, no assertion, no panic risk
The type-safety benefit is real. In a service that handles multiple entity types, this eliminates an entire class of runtime panics that were previously caught only in testing (if you were lucky).
A Generic Result Type
Another pattern that works well: a typed result/error type for operations that return either a value or an error, useful in channel-based pipelines:
type Result[T any] struct {
	Value T
	Err   error
}

func (r Result[T]) Unwrap() (T, error) {
	return r.Value, r.Err
}
func OK[T any](v T) Result[T] { return Result[T]{Value: v} }
func Err[T any](e error) Result[T] { return Result[T]{Err: e} }
This is cleaner than chan interface{} with type assertions when you're fanning out work across goroutines.
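Here's a minimal fan-out sketch built on that Result type; strconv.Atoi stands in for whatever fallible work you'd actually be distributing, and parseAll is a made-up name for the example:

```go
package main

import (
	"fmt"
	"strconv"
	"sync"
)

type Result[T any] struct {
	Value T
	Err   error
}

func OK[T any](v T) Result[T]      { return Result[T]{Value: v} }
func Err[T any](e error) Result[T] { return Result[T]{Err: e} }

// parseAll fans strconv.Atoi out across goroutines and collects typed
// results: no chan interface{}, no assertions on the receive side.
func parseAll(inputs []string) (sum, failures int) {
	results := make(chan Result[int], len(inputs))

	var wg sync.WaitGroup
	for _, in := range inputs {
		wg.Add(1)
		go func(s string) {
			defer wg.Done()
			n, err := strconv.Atoi(s)
			if err != nil {
				// Err needs explicit instantiation: T appears only
				// in the result, so there's nothing to infer from.
				results <- Err[int](err)
				return
			}
			results <- OK(n)
		}(in)
	}
	wg.Wait()
	close(results)

	for r := range results {
		if r.Err != nil {
			failures++
			continue
		}
		sum += r.Value
	}
	return sum, failures
}

func main() {
	sum, failures := parseAll([]string{"1", "2", "oops", "4"})
	fmt.Println(sum, failures) // 7 1
}
```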
Where Generics Add Confusion
The failure mode I've seen: people reach for generics when they should use interfaces. If you're writing code that depends on behavior (methods), use an interface. If you're writing code that depends on the shape of data (the same algorithm over different concrete types), use generics.
The io.Reader interface doesn't need generics. A Map function does. If you're not sure, the stdlib guidance is: "if in doubt, leave it out." A function that takes any and type-switches internally is often more readable than a constrained generic that does the same thing.
Also: don't use generics to avoid writing code twice when the two versions are actually different. Generics are for when the code is structurally identical but the types differ.
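A side-by-side sketch of that rule of thumb (the Labeler interface, Book type, and Reverse function are all invented for illustration):

```go
package main

import "fmt"

// Behavior: the code cares about what the value can do,
// so an interface is the right tool. No generics needed.
type Labeler interface {
	Label() string
}

func Describe(l Labeler) string {
	return "item: " + l.Label()
}

type Book struct{ Title string }

func (b Book) Label() string { return b.Title }

// Shape: the same algorithm over different element types,
// so a type parameter is the right tool. No methods needed.
func Reverse[T any](s []T) []T {
	out := make([]T, len(s))
	for i, v := range s {
		out[len(s)-1-i] = v
	}
	return out
}

func main() {
	fmt.Println(Describe(Book{Title: "SICP"}))    // item: SICP
	fmt.Println(Reverse([]int{1, 2, 3}))          // [3 2 1]
	fmt.Println(Reverse([]string{"a", "b", "c"})) // [c b a]
}
```

Describe would gain nothing from a type parameter; Reverse would gain nothing from an interface.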
Performance
Go's generics use GC shape stenciling, not full monomorphization like Rust or C++ templates. Types that share the same GC shape (e.g., all pointer types share one shape) share a single compiled implementation, with a dictionary passed at runtime to handle type-specific operations. For most use cases this is fine: performance is comparable to interface-based approaches, and for pointer-heavy code (which describes most Go) there's often no overhead at all. For tight loops over scalar types, you might see some overhead compared to hand-written type-specific code, but I haven't hit that in practice on real services.
The Stdlib Catch-Up (Go 1.21 Preview)
The golang.org/x/exp/slices and golang.org/x/exp/maps packages I've been using are slated to move into the standard library as slices and maps in Go 1.21. That means slices.Contains, slices.SortFunc, and most of the other helpers you're writing by hand today will be in the stdlib, though some pieces (maps.Keys and maps.Values among them) are still under discussion and may land later. Start with x/exp now, and plan to migrate.
Generics in Go are conservative and practical, which is very on-brand. They don't try to be Haskell. They solve the specific problem of duplicated algorithms over typed collections, and they solve it well. Use them for that, leave the rest alone.