Why pattern matching can matter in hot code

Rust’s pattern matching is usually compiled very efficiently. In many cases, the compiler turns a match into a jump table, a decision tree, or a sequence of comparisons. However, performance can still suffer when:

  • patterns force repeated destructuring of large values,
  • matches are nested in ways that create extra branching,
  • ownership moves trigger copies or allocations,
  • guards prevent the compiler from simplifying control flow,
  • or the code repeatedly matches on the same value instead of reusing a simpler representation.

The key idea is to make the compiler’s job easy: match on small, cheap-to-inspect values, keep branches predictable, and avoid unnecessary work inside the match arms.

Prefer matching on references when ownership is not needed

A common performance mistake is matching by value when the code only needs to inspect data. If the matched value is large, moving it can be expensive or may force cloning elsewhere.

Example: avoid moving large structs

struct Packet {
    header: [u8; 64],
    payload: Vec<u8>,
}

fn handle(packet: Packet) {
    match packet {
        Packet { header, payload } => {
            // `payload` takes ownership of the Vec's heap buffer here;
            // `header`, a Copy array, is copied.
            println!("header[0] = {}, payload size = {}", header[0], payload.len());
        }
    }
}

If you only need to read fields, match by reference instead:

fn handle(packet: &Packet) {
    match packet {
        Packet { header: _, payload } => {
            println!("payload size = {}", payload.len());
        }
    }
}

This avoids moving the Vec<u8> and keeps the function flexible. In hot code, matching on &T or &mut T often enables better reuse and fewer ownership-related workarounds.

Rule of thumb

Situation                          Prefer
Only inspecting fields             match &value
Mutating fields in place           match &mut value
Consuming the value is required    match value
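The mutating row deserves a quick illustration, since the examples above only cover the read-only case. A minimal sketch (the Counter type is invented for this example): matching on &mut value binds the fields as mutable references, so an arm can update them in place while the caller keeps ownership.

```rust
struct Counter {
    hits: u64,
    misses: u64,
}

// The caller keeps ownership of `counter`; we only borrow it mutably.
fn record(counter: &mut Counter, hit: bool) {
    match counter {
        // Match ergonomics bind `hits` and `misses` as `&mut u64`.
        Counter { hits, misses } => {
            if hit {
                *hits += 1;
            } else {
                *misses += 1;
            }
        }
    }
}
```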

Flatten nested matches into simpler control flow

Nested matches can be elegant, but they often create deeper branch trees than necessary. In performance-critical code, a flatter structure can improve readability and sometimes help the compiler generate tighter code.

Before: nested matching

enum Message {
    Request { kind: Kind, id: u32 },
    Response { status: u16 },
}

enum Kind {
    Read,
    Write,
}

fn process(msg: Message) -> u32 {
    match msg {
        Message::Request { kind, id } => {
            match kind {
                Kind::Read => id,
                Kind::Write => id + 1,
            }
        }
        Message::Response { status } => status as u32,
    }
}

After: combine patterns when possible

fn process(msg: Message) -> u32 {
    match msg {
        Message::Request { kind: Kind::Read, id } => id,
        Message::Request { kind: Kind::Write, id } => id + 1,
        Message::Response { status } => status as u32,
    }
}

This version is easier for both humans and the compiler to reason about. It also avoids introducing an extra branch level for kind.

When flattening helps most

  • enum-heavy state machines,
  • protocol parsers,
  • request routers,
  • and event loops with many small variants.

If a nested match is only used to inspect one field, consider matching that field directly in the outer pattern.

Use exhaustive enums instead of stringly branching

Rust enums are not just safer than strings; they are often faster. Matching on an enum allows the compiler to know the full set of cases at compile time. That can produce compact branch code and avoid repeated comparisons.

Better than matching on strings

enum Command {
    Start,
    Stop,
    Status,
}

fn run(cmd: Command) -> u8 {
    match cmd {
        Command::Start => 1,
        Command::Stop => 2,
        Command::Status => 3,
    }
}

Compare that to matching on &str, which requires string comparisons and length checks. If a value is used repeatedly in control flow, parse it once into an enum and match on the enum afterward.
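Parse-once might look like the following sketch, repeating the Command enum from above for completeness (the lowercase command spellings and the parse_command helper are assumptions for illustration):

```rust
enum Command {
    Start,
    Stop,
    Status,
}

// Pay the string comparison cost once, at the input boundary.
fn parse_command(s: &str) -> Option<Command> {
    match s {
        "start" => Some(Command::Start),
        "stop" => Some(Command::Stop),
        "status" => Some(Command::Status),
        _ => None,
    }
}

// All later dispatch is a cheap enum match, never a string comparison.
fn run(cmd: Command) -> u8 {
    match cmd {
        Command::Start => 1,
        Command::Stop => 2,
        Command::Status => 3,
    }
}
```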

This pattern is especially useful in:

  • CLI argument handling,
  • network protocol decoding,
  • configuration parsing,
  • and message dispatch.

Be careful with match guards

Match guards are convenient, but they can reduce optimization opportunities because they add extra conditional logic after a pattern has already matched. In hot code, a guard may be fine, but if the condition is simple and structural, it is often better to encode it directly in the pattern or precompute a classification value.

Example: guard vs direct pattern

fn classify(x: Option<u32>) -> u8 {
    match x {
        Some(n) if n < 10 => 1,
        Some(_) => 2,
        None => 0,
    }
}

This is readable, but if the same threshold logic appears in many places, consider normalizing the input first or using a smaller enum:

enum Bucket {
    Empty,
    Small,
    Large,
}

fn bucket(x: Option<u32>) -> Bucket {
    match x {
        None => Bucket::Empty,
        Some(n) if n < 10 => Bucket::Small,
        Some(_) => Bucket::Large,
    }
}

The important point is not that guards are slow by default. Rather, they can make branch structure harder to simplify. Use them when they improve clarity, but avoid piling on multiple guards in a tight loop.
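When the guard is a simple numeric threshold, a range pattern can encode it structurally, with no guard at all. A sketch of classify rewritten this way, with the same behavior as the guarded version:

```rust
// The threshold is expressed as a range pattern instead of a guard,
// so the whole decision lives in the pattern structure.
fn classify(x: Option<u32>) -> u8 {
    match x {
        Some(0..=9) => 1,
        Some(_) => 2,
        None => 0,
    }
}
```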

Match on discriminants only when the payload is expensive

Sometimes you only need to know which variant you have, not its contents. In that case, avoid destructuring large payloads.

Example: inspect the variant cheaply

enum Event {
    Connected(String),
    Disconnected,
    Data(Vec<u8>),
}

fn is_data(event: &Event) -> bool {
    matches!(event, Event::Data(_))
}

The matches! macro is ideal for fast checks when you do not need the payload. It keeps the code concise and avoids unnecessary binding.

If you need to branch on the variant and later access the payload, consider a single match rather than checking with matches! and then matching again. Repeating the same classification work can be wasteful.
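For instance, instead of calling is_data and then matching a second time to reach the bytes, a single match can classify the variant and bind the payload in one pass. A sketch (the data_len helper is illustrative):

```rust
enum Event {
    Connected(String),
    Disconnected,
    Data(Vec<u8>),
}

// One pass: the variant check and the payload access happen together.
fn data_len(event: &Event) -> Option<usize> {
    match event {
        Event::Data(bytes) => Some(bytes.len()),
        _ => None,
    }
}
```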

Prefer matches! and if let for simple cases

A full match is not always the best tool. For single-pattern checks, if let and matches! can produce clearer and sometimes more direct code.

Use if let for one successful path

fn handle(opt: Option<u32>) -> u32 {
    if let Some(n) = opt {
        n + 1
    } else {
        0
    }
}

Use matches! for boolean tests

enum State {
    Running,
    Done,
    Failed,
}

fn is_terminal(state: &State) -> bool {
    matches!(state, State::Done | State::Failed)
}

These forms reduce visual noise and make the intent obvious. They also help avoid accidental overengineering, such as introducing a large match for a simple predicate.

Reorder patterns to favor common cases

Branch prediction matters in hot loops. If one variant or case is much more common than others, place it where the compiler can generate efficient code for the likely path. Rust does not guarantee branch order will always map directly to machine code order, but the structure of your match still matters.

Example: common case first

enum Token {
    Ident,
    Number,
    Symbol,
    Eof,
}

fn consume(token: Token) -> bool {
    match token {
        Token::Ident => true,
        Token::Number => true,
        Token::Symbol => true,
        Token::Eof => false,
    }
}

If Eof is rare, this layout makes the success path straightforward. In more complex cases, consider grouping common variants together or using an early return for the rare path.
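The early-return variant might look like this sketch: handle the rare Eof case up front, then treat everything else as the common path with no further branching on the variant.

```rust
enum Token {
    Ident,
    Number,
    Symbol,
    Eof,
}

fn consume(token: Token) -> bool {
    // Rare case exits early; all other variants share one fall-through path.
    if matches!(token, Token::Eof) {
        return false;
    }
    true
}
```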

Avoid repeated matching inside loops

A frequent performance issue is matching the same value multiple times in a loop body. If the result of the match is stable, compute it once and reuse it.

Before: repeated classification

fn score(tokens: &[Token]) -> usize {
    let mut total = 0;

    for token in tokens {
        if matches!(token, Token::Ident) {
            total += 1;
        }
        if matches!(token, Token::Ident | Token::Number) {
            total += 2;
        }
    }

    total
}

After: classify once

fn score(tokens: &[Token]) -> usize {
    let mut total = 0;

    for token in tokens {
        // One match classifies each token exactly once.
        match token {
            Token::Ident => total += 3,  // 1 for the ident check + 2 for the value check
            Token::Number => total += 2, // value check only
            _ => {}
        }
    }

    total
}

This is a small example, but in real code the repeated work may involve more complex destructuring or guard logic. Cache the classification result when it is reused.

Use #[repr] carefully for low-level interop

For internal Rust code, the compiler usually chooses an efficient layout for enums. If you are interfacing with C or using a packed protocol representation, you may need a specific layout. However, #[repr(C)] and #[repr(u8)] should be used for correctness first, not as a blanket performance trick.

When layout matters

  • FFI boundaries,
  • serialized wire formats,
  • bit-packed state machines,
  • and memory-mapped data.

For ordinary application code, prefer idiomatic enums and let the compiler optimize. Forcing a representation can sometimes reduce optimization freedom.
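Where a wire format does fix the layout, the representation can be pinned explicitly. A sketch of a one-byte protocol opcode (the opcode values and the decode helper are invented for illustration):

```rust
// Discriminants pinned to one byte to match a wire format; the repr is
// chosen for protocol correctness, not as a speed trick.
#[repr(u8)]
#[derive(Clone, Copy)]
enum Opcode {
    Connect = 0x01,
    Data = 0x02,
    Close = 0x03,
}

// Decoding still validates: unknown bytes never become an Opcode.
fn decode(byte: u8) -> Option<Opcode> {
    match byte {
        0x01 => Some(Opcode::Connect),
        0x02 => Some(Opcode::Data),
        0x03 => Some(Opcode::Close),
        _ => None,
    }
}
```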

Practical checklist for faster pattern matching

Use this checklist when reviewing performance-sensitive Rust code:

  • Match on references unless ownership is required.
  • Flatten nested matches where possible.
  • Prefer enums over strings or ad hoc integer codes.
  • Use matches! for boolean checks.
  • Avoid guards in the hottest paths unless they are necessary.
  • Do not destructure large payloads if you only need the variant.
  • Compute classifications once and reuse them.
  • Keep the common path simple and direct.

A realistic example: event dispatch

Suppose you are building a server that processes events from a queue. A naive implementation might repeatedly inspect the same event in several places.

enum Event {
    Connect { client_id: u64 },
    Disconnect { client_id: u64 },
    Data { client_id: u64, bytes: Vec<u8> },
}

fn handle(event: &Event) -> usize {
    match event {
        Event::Connect { .. } => 1,
        Event::Disconnect { .. } => 2,
        Event::Data { bytes, .. } if bytes.len() > 1024 => 3,
        Event::Data { .. } => 4,
    }
}

This is already fairly good: it matches by reference, avoids moving the payload, and uses a guard only for a meaningful size threshold. If Data is overwhelmingly common, you might even split the logic to make that path more direct:

fn handle(event: &Event) -> usize {
    match event {
        Event::Data { bytes, .. } => {
            if bytes.len() > 1024 { 3 } else { 4 }
        }
        Event::Connect { .. } => 1,
        Event::Disconnect { .. } => 2,
    }
}

The best version depends on your workload. The point is to model the dominant case clearly and minimize unnecessary branching or movement.

Measure before and after

Pattern matching performance is highly context-dependent. The compiler is often excellent at optimizing matches, and many changes that look faster in source code make no measurable difference. Always validate changes with benchmarks and inspect generated code when the hot path matters.

Focus your effort on places where pattern matching is:

  • inside tight loops,
  • on critical request paths,
  • part of parsing or dispatch,
  • or repeated millions of times per second.

If a match is not on a hot path, prefer the clearest version.
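As a first rough check before reaching for a benchmarking crate (criterion is the usual choice for serious measurements), a standard-library timing harness can compare two variants on the same input. A sketch; std::hint::black_box keeps the compiler from folding the measured work away:

```rust
use std::hint::black_box;
use std::time::{Duration, Instant};

fn classify(x: Option<u32>) -> u8 {
    match x {
        Some(n) if n < 10 => 1,
        Some(_) => 2,
        None => 0,
    }
}

// Runs `classify` over every input; returns the sum of results and the
// wall-clock time taken, so two variants can be compared on equal terms.
fn time_classify(inputs: &[Option<u32>]) -> (u64, Duration) {
    let start = Instant::now();
    let mut total = 0u64;
    for x in inputs {
        // black_box prevents the loop body from being optimized out.
        total += black_box(classify(*x)) as u64;
    }
    (total, start.elapsed())
}
```

Wall-clock timing like this is noisy; treat it as a smoke test and confirm real differences with a proper benchmark harness and by inspecting the generated assembly.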

Further reading