Crate pw_tokenizer

pw_tokenizer - Efficient string handling and printf-style encoding.

Logging is critical, but developers are often forced to choose between more logging and conserving scarce flash space. The pw_tokenizer crate helps address this by replacing printf-style strings with binary tokens during compilation, enabling extensive logging with substantially less memory usage.

For a more in-depth explanation of the system’s design and motivation, see Pigweed’s pw_tokenizer module documentation.

Examples

Pigweed’s tokenization database uses printf-style strings internally, so those are supported directly.

use pw_tokenizer::tokenize_printf_to_buffer;

let mut buffer = [0u8; 1024];
let len = tokenize_printf_to_buffer!(&mut buffer, "The answer is %d", 42)?;

// 4 bytes used to encode the token and one to encode the value 42.  This
// is a **3.5x** reduction in size compared to the raw string!
assert_eq!(len, 5);
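
The same macro accepts multiple arguments, each appended to the 4-byte token in a compact variable-length encoding. The format string and values below are illustrative, and the asserted length assumes each small integer encodes to a single byte, as the value 42 does above.

use pw_tokenizer::tokenize_printf_to_buffer;

let mut buffer = [0u8; 1024];
let len = tokenize_printf_to_buffer!(&mut buffer, "%d + %d = %d", 1, 2, 3)?;

// 4 bytes for the token plus one byte per integer argument, assuming each
// small value fits in a single encoded byte like the 42 above.
assert_eq!(len, 7);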

We also support Rust’s core::fmt-style syntax. These format strings are converted to printf style at compile time to maintain compatibility with the rest of the Pigweed tokenizer ecosystem. The example below produces the same token and output as the one above.

use pw_tokenizer::tokenize_core_fmt_to_buffer;

let mut buffer = [0u8; 1024];
let len = tokenize_core_fmt_to_buffer!(&mut buffer, "The answer is {}", 42 as i32)?;
assert_eq!(len, 5);
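
To make this equivalence concrete, the two macros shown above can be run side by side. Because the core::fmt string is rewritten to the printf form at compile time, both calls should produce byte-for-byte identical output for this pair of format strings; the sketch below only combines the examples already given.

use pw_tokenizer::{tokenize_core_fmt_to_buffer, tokenize_printf_to_buffer};

let mut printf_buffer = [0u8; 1024];
let printf_len = tokenize_printf_to_buffer!(&mut printf_buffer, "The answer is %d", 42)?;

let mut core_fmt_buffer = [0u8; 1024];
let core_fmt_len =
    tokenize_core_fmt_to_buffer!(&mut core_fmt_buffer, "The answer is {}", 42 as i32)?;

// Both encodings carry the same token and the same argument bytes.
assert_eq!(printf_buffer[..printf_len], core_fmt_buffer[..core_fmt_len]);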

Macros

Traits