The Protobuf compiler module provides build system integration and wrapper scripts for generating source code from Protobuf definitions.

Protobuf compilation#

Generator support#

Protobuf code generation is currently supported for the following generators:

  • pw_protobuf - Compiles using pw_protobuf.

  • pw_protobuf RPC - Compiles pw_rpc service and client code for pw_protobuf.

  • Nanopb - Compiles using Nanopb. The build argument dir_pw_third_party_nanopb must be set to point to a local Nanopb installation.

  • Nanopb RPC - Compiles pw_rpc service and client code for Nanopb. Requires a Nanopb installation.

  • Raw RPC - Compiles raw binary pw_rpc service code.

  • Go - Compiles using the standard Go protobuf plugin with gRPC service support.

  • Python - Compiles using the standard Python protobuf plugin, creating a pw_python_package.

Compilation is supported in Bazel via @rules_proto_grpc. ProtoCollection provides convenience methods for proto descriptors.

GN template#

This module provides a pw_proto_library GN template that defines a collection of protobuf files that should be compiled together. The template creates a sub-target for each supported generator, named <target_name>.<generator>. These sub-targets generate their respective protobuf code, and expose it to the build system appropriately (e.g. a pw_source_set for C/C++).

For example, given the following target:

pw_proto_library("test_protos") {
  sources = [ "my_test_protos/test.proto" ]
}

test_protos.pwpb compiles code for pw_protobuf, and test_protos.nanopb compiles using Nanopb (if it’s installed).

Protobuf code is only generated when a generator sub-target is listed as a dependency of another GN target.

GN permits using abbreviated labels when the target name matches the directory name (e.g. //foo for //foo:foo). For consistency with this, the sub-targets for each generator are aliased to the directory when the target name is the same. For example, these two labels are equivalent:

//path/containing/my_protos:my_protos.pwpb
//path/containing/my_protos:pwpb

pw_python_package sub-targets are also available on the python sub-target (e.g. <target_name>.python.install and <target_name>.python.wheel).


Supported Codegen

GN supports the following compiled proto libraries via the specified sub-targets generated by a pw_proto_library.

  • ${target_name}.pwpb - Generated C++ pw_protobuf code

  • ${target_name}.pwpb_rpc - Generated C++ pw_protobuf pw_rpc code

  • ${target_name}.nanopb - Generated C++ nanopb code (requires Nanopb)

  • ${target_name}.nanopb_rpc - Generated C++ Nanopb pw_rpc code (requires Nanopb)

  • ${target_name}.raw_rpc - Generated C++ raw pw_rpc code (no protobuf library)

  • ${target_name}.go - Generated Go protobuf libraries

  • ${target_name}.python - Generated Python protobuf libraries


The pw_proto_library template accepts the following arguments:

  • sources: List of input .proto files.

  • deps: List of other pw_proto_library dependencies.

  • other_deps: List of other non-proto dependencies.

  • inputs: Other files on which the protos depend (e.g. nanopb .options files).

  • prefix: A prefix to add to the source protos prior to compilation. For example, a source called "foo.proto" with prefix = "nested" will be compiled with protoc as "nested/foo.proto".

  • strip_prefix: Remove this prefix from the source protos. All source and input files must be nested under this path.

  • python_package: Label of Python package to which to add the proto modules. The .python subtarget will redirect to this package.

  • enabled_targets: List of sub-targets to enable (see Supported Codegen), e.g. ["pwpb", "raw_rpc"]. By default, all sub-targets are enabled. The enabled sub-targets are built only as requested by the build system, but it may be necessary to explicitly disable an unused sub-target if it conflicts with another target in the same package. (For example, nanopb codegen can conflict with the default C++ codegen provided by protoc.) TODO(b/235132083): Remove this argument once we’ve removed the file-name conflict between nanopb and protoc code generators.



pw_proto_library("my_protos") {
  sources = [ "my_protos/foo.proto" ]
}

pw_proto_library("my_other_protos") {
  sources = [ "some/other/path/baz.proto" ]  # imports foo.proto

  # This removes the "some/other/path" prefix from the proto files.
  strip_prefix = "some/other/path"

  # This adds the "my_other_protos/" prefix to the proto files.
  prefix = "my_other_protos"

  # Proto libraries depend on other proto libraries directly.
  deps = [ ":my_protos" ]
}

source_set("my_cc_code") {
  sources = [
    "main.cc",
  ]

  # When depending on protos in a source_set, specify the generator suffix.
  deps = [ ":my_other_protos.pwpb" ]
}

From C++, baz.proto is included as follows:

#include "my_other_protos/baz.pwpb.h"

From Python, baz.proto is imported as follows:

from my_other_protos import baz_pb2

Proto file structure#

Protobuf source files must be nested under another directory when they are compiled. This ensures that they can be packaged properly in Python.

Using prefix and strip_prefix together allows remapping proto files to a completely different path. This can be useful when working with protos defined in external libraries. For example, consider this proto library:

pw_proto_library("external_protos") {
  sources = [
    "//other/external/some_library/src/protos/alpha.proto",
    "//other/external/some_library/src/protos/beta.proto",
    "//other/external/some_library/src/protos/internal/gamma.proto",
  ]
  strip_prefix = "//other/external/some_library/src/protos"
  prefix = "some_library"
}

These protos will be compiled by protoc as if they were in this file structure:

some_library
├── alpha.proto
├── beta.proto
└── internal
    └── gamma.proto
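The remapping that strip_prefix and prefix perform can be sketched in Python. This is an illustrative helper, not part of Pigweed; the leading // of GN source-absolute paths is omitted for simplicity.

```python
from pathlib import PurePosixPath


def remap_proto_path(source: str, strip_prefix: str, prefix: str) -> str:
    """Illustrates how strip_prefix and prefix remap a proto path."""
    path = PurePosixPath(source)
    # Remove strip_prefix; every source must be nested under it.
    relative = path.relative_to(strip_prefix)
    # Prepend the new prefix before handing the file to protoc.
    return str(PurePosixPath(prefix) / relative)


print(remap_proto_path(
    "other/external/some_library/src/protos/internal/gamma.proto",
    "other/external/some_library/src/protos",
    "some_library",
))  # some_library/internal/gamma.proto
```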

Adding Python proto modules to an existing package#

By default, generated Python proto modules are organized into their own Python package. These proto modules can instead be added to an existing Python package declared with pw_python_package. This is done by setting the python_package argument on the pw_proto_library and the proto_library argument on the pw_python_package.

For example, the protos declared in my_protos will be nested in the Python package declared by my_package.

pw_proto_library("my_protos") {
  sources = [ "hello.proto" ]
  prefix = "foo"
  python_package = ":my_package"
}

pw_python_package("my_package") {
  generate_setup = {
    metadata = {
      name = "foo"
      version = "1.0"
    }
  }

  sources = [ "foo/cool_module.py" ]
  proto_library = ":my_protos"
}

The proto module can be used alongside other files in the foo package.

from foo import cool_module, hello_pb2

Working with externally defined protos#

pw_proto_library targets may be used to build .proto sources from existing projects. In these cases, it may be necessary to supply the strip_prefix argument, which specifies the protobuf include path to use for protoc. If only a single external protobuf is being compiled, the python_module_as_package option can be used to override the requirement that the protobuf be nested under a directory. This option generates a Python package with the same name as the proto file, so that the generated proto can be imported as if it were a standalone Python module.

For example, the pw_proto_library target for Nanopb sets python_module_as_package to nanopb_pb2.

pw_proto_library("proto") {
  strip_prefix = "$dir_pw_third_party_nanopb/generator/proto"
  sources = [ "$dir_pw_third_party_nanopb/generator/proto/nanopb.proto" ]
  python_module_as_package = "nanopb_pb2"
}

In Python, this makes nanopb.proto available as import nanopb_pb2 via the nanopb_pb2 Python package. In C++, nanopb.proto is accessed as #include "nanopb.pwpb.h".

The python_module_as_package feature should only be used when absolutely necessary — for example, to support proto files that include import "nanopb.proto".


CMake#

CMake provides a pw_proto_library function with features similar to those of the GN template. The CMake build only supports building firmware code, so pw_proto_library does not generate a Python package.


The pw_proto_library function accepts the following arguments:

  • NAME: the base name of the libraries to create

  • SOURCES: .proto source files

  • DEPS: dependencies on other pw_proto_library targets

  • PREFIX: prefix to add to the proto files

  • STRIP_PREFIX: prefix to remove from the proto files

  • INPUTS: files to include along with the .proto files (such as Nanopb .options files)





pw_proto_library(my_module.my_other_protos
  SOURCES
    some/other/path/baz.proto  # imports foo.proto

  # This removes the "some/other/path" prefix from the proto files.
  STRIP_PREFIX
    some/other/path

  # This adds the "my_other_protos/" prefix to the proto files.
  PREFIX
    my_other_protos

  # Proto libraries depend on other proto libraries directly.
  DEPS
    my_module.my_protos
)

add_library(my_module.my_cc_code main.cc)

# When depending on protos in a source_set, specify the generator suffix.
target_link_libraries(my_module.my_cc_code PUBLIC
  my_module.my_other_protos.pwpb
)

These proto files are accessed in C++ the same as in the GN build:

#include "my_other_protos/baz.pwpb.h"

Supported Codegen

CMake supports the following compiled proto libraries via the specified sub-targets generated by a pw_proto_library.

  • ${NAME}.pwpb - Generated C++ pw_protobuf code

  • ${NAME}.pwpb_rpc - Generated C++ pw_protobuf pw_rpc code

  • ${NAME}.nanopb - Generated C++ nanopb code (requires Nanopb)

  • ${NAME}.nanopb_rpc - Generated C++ Nanopb pw_rpc code (requires Nanopb)

  • ${NAME}.raw_rpc - Generated C++ raw pw_rpc code (no protobuf library)


Bazel#

In Bazel, we provide a set of rules with features similar to the GN template:

  • pwpb_proto_library - Generated C++ pw_protobuf code

  • pwpb_rpc_proto_library - Generated C++ pw_protobuf pw_rpc code

  • raw_rpc_proto_library - Generated C++ raw pw_rpc code (no protobuf library)

  • nanopb_proto_library - Generated C++ nanopb code

  • nanopb_rpc_proto_library - Generated C++ Nanopb pw_rpc code

These rules build the corresponding firmware code; there are no rules for generating Python libraries. The Bazel rules differ slightly compared to the GN build to be more in line with what would be considered idiomatic in Bazel.

To use Pigweed's protobuf rules, you must first pull the required dependencies into your Bazel WORKSPACE file, e.g.:

# WORKSPACE ...
load("@pigweed//pw_protobuf_compiler:deps.bzl", "pw_protobuf_dependencies")
pw_protobuf_dependencies()

Bazel uses a different set of rules to manage proto files than it does to compile them, e.g.:

# BUILD ...
load("@rules_proto//proto:defs.bzl", "proto_library")
load(
    "@pigweed//pw_protobuf_compiler:pw_proto_library.bzl",
    "nanopb_proto_library",
    "nanopb_rpc_proto_library",
    "pwpb_proto_library",
    "raw_rpc_proto_library",
)

# Manages proto sources and dependencies.
proto_library(
    name = "my_proto",
    srcs = ["my_protos/bar.proto"],
)

# Compiles dependent protos to C++.
pwpb_proto_library(
    name = "my_proto_pwpb",
    deps = [":my_proto"],
)

nanopb_proto_library(
    name = "my_proto_nanopb",
    deps = [":my_proto"],
)

raw_rpc_proto_library(
    name = "my_proto_raw_rpc",
    deps = [":my_proto"],
)

nanopb_rpc_proto_library(
    name = "my_proto_nanopb_rpc",
    nanopb_proto_library_deps = [":my_proto_nanopb"],
    deps = [":my_proto"],
)

# Library that depends on only pw_protobuf generated proto targets.
cc_library(
    name = "my_proto_only_lib",
    srcs = ["my/pwpb_user.cc"],
    deps = [":my_proto_pwpb"],
)

# Library that depends on only Nanopb generated proto targets.
cc_library(
    name = "my_nanopb_only_lib",
    srcs = ["my/nanopb_user.cc"],
    deps = [":my_proto_nanopb"],
)

# Library that depends on pw_protobuf and pw_rpc/raw.
cc_library(
    name = "my_raw_rpc_lib",
    srcs = ["my/raw_rpc_user.cc"],
    deps = [
        ":my_proto_pwpb",
        ":my_proto_raw_rpc",
    ],
)

# Library that depends on Nanopb and pw_rpc/nanopb.
cc_library(
    name = "my_nanopb_rpc_lib",
    srcs = ["my/nanopb_rpc_user.cc"],
    deps = [
        ":my_proto_nanopb",
        ":my_proto_nanopb_rpc",
    ],
)

From my/, you can now include the generated headers, e.g.:

#include "my_protos/bar.pwpb.h"
// and/or RPC headers
#include "my_protos/bar.raw_rpc.pb.h"
// or
#include "my_protos/bar.nanopb_rpc.pb.h"

Why isn’t there one rule to generate all the code?#

There is! Like in GN, it’s called pw_proto_library, and has subtargets corresponding to the different codegen flavors. However, we recommend against using this target. It is deprecated, and will be removed in the future.

The pw_proto_library target has a number of disadvantages:

  1. As a general Bazel style rule, macros should produce exactly one target for external use, named according to the invocation's name argument. BUILD files are easier to follow when the name specified in the macro call actually matches the name of the generated target. This is not possible if a single macro generates multiple targets, as pw_proto_library does.

  2. If you depend directly on the pw_proto_library, rather than the appropriate subtargets, you will build code you don’t actually use. You may even fetch dependencies you don’t need, like nanopb.

  3. The subtargets you don’t depend on are still added to your BUILD files by the pw_proto_library macro, and Bazel will attempt to build them when you run bazel build //.... This may cause build breakages, and has forced us to implement awkward workarounds.

Python proto libraries#

pw_protobuf_compiler includes utilities for working with protocol buffers in Python. These tools facilitate using protos via their package names (my.pkg.Message()) rather than their generated module names (proto_source_file_pb2.Message()).

python_protos module#

Tools for compiling and importing Python protos on the fly.

class pw_protobuf_compiler.python_protos.Library(modules: Iterable[module])#

A collection of protocol buffer modules sorted by package.

In Python, each .proto file is compiled into a Python module. The Library class makes it simple to navigate a collection of Python modules corresponding to .proto files, without relying on the location of these compiled modules.

Proto messages and other types can be directly accessed by their protocol buffer package name. For example, a message my.pkg.Message in a Library called protos can be accessed as protos.packages.my.pkg.Message.

A Library also provides the modules_by_package dictionary, for looking up the list of modules in a particular package, and the modules() generator for iterating over all modules.
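The grouping behind modules_by_package can be sketched as follows. This is a simplified stand-in for illustration, not Pigweed's implementation, and the _pb2 module objects below are hypothetical stand-ins for compiled proto modules.

```python
from collections import defaultdict
from types import SimpleNamespace


def modules_by_package(modules):
    """Groups compiled proto modules by their protobuf package name."""
    by_package = defaultdict(list)
    for module in modules:
        # Generated _pb2 modules expose their package via DESCRIPTOR.package.
        by_package[module.DESCRIPTOR.package].append(module)
    return dict(by_package)


# Hypothetical stand-ins for two compiled _pb2 modules.
foo_pb2 = SimpleNamespace(DESCRIPTOR=SimpleNamespace(package='my.pkg'))
bar_pb2 = SimpleNamespace(DESCRIPTOR=SimpleNamespace(package='my.other'))

grouped = modules_by_package([foo_pb2, bar_pb2])
print(sorted(grouped))  # ['my.other', 'my.pkg']
```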

__init__(modules: Iterable[module])#

Constructs a Library from an iterable of modules.

A Library can be constructed with modules dynamically compiled by compile_and_import. For example:

protos = Library(compile_and_import(list_of_proto_files))

classmethod from_paths(protos: Iterable[Union[str, Path, module]]) → Library#

Creates a Library from paths to proto files or proto modules.

classmethod from_strings(contents: Iterable[str], includes: Iterable[Union[Path, str]] = (), output_dir: Optional[Union[Path, str]] = None) → Library#

Creates a proto library from protos in the provided strings.

messages() → Iterable#

Iterates over all protobuf messages in this library.

modules() → Iterable#

Iterates over all protobuf modules in this library.

pw_protobuf_compiler.python_protos.proto_repr(message, *, wrap: bool = True) → str#

Creates a repr-like string for a protobuf.

In an interactive console that imports proto objects into the namespace, the output of proto_repr() can be used as Python source to create a proto object.

Parameters:

  • message – The protobuf message to format.

  • wrap – If true and black is available, the output is wrapped according to PEP 8 using black.
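The constructor-style output that proto_repr produces can be illustrated with a simplified stand-in that formats plain name/value pairs. This is not the real implementation, which walks protobuf message fields; the function and message names here are hypothetical.

```python
def proto_repr_sketch(message_name, fields):
    """Formats name/value pairs as a constructor-style repr string."""
    # Render each field as a keyword argument, quoting values via repr().
    args = ', '.join(f'{name}={value!r}' for name, value in fields)
    return f'{message_name}({args})'


print(proto_repr_sketch('my.pkg.Message', [('id', 7), ('name', 'hi')]))
# my.pkg.Message(id=7, name='hi')
```

As with proto_repr, the resulting string could be pasted back into a console where the proto types are imported to reconstruct an equivalent object.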