pw_protobuf_compiler#
The Protobuf compiler module provides build system integration and wrapper scripts for generating source code for Protobuf definitions.
Protobuf compilation#
Generator support#
Protobuf code generation is currently supported for the following generators:
Generator       | Code         | Notes
pw_protobuf     | pwpb         | Compiles using pw_protobuf.
pw_protobuf RPC | pwpb_rpc     | Compiles pw_rpc service and client code for pw_protobuf.
Nanopb          | nanopb       | Compiles using Nanopb. The build argument dir_pw_third_party_nanopb must point to a local Nanopb installation.
Nanopb RPC      | nanopb_rpc   | Compiles pw_rpc service and client code for Nanopb. Requires a Nanopb installation.
Raw RPC         | raw_rpc      | Compiles raw binary pw_rpc service code.
Go              | go           | Compiles using the standard Go protobuf plugin with gRPC service support.
Python          | python       | Compiles using the standard Python protobuf plugin, creating a pw_python_package containing the generated modules.
TypeScript      | (Bazel only) | Compilation is supported in Bazel via @rules_proto_grpc. ProtoCollection provides convenience methods for proto descriptors.
GN template#
This module provides a pw_proto_library GN template that defines a collection of protobuf files that should be compiled together. The template creates a sub-target for each supported generator, named <target_name>.<generator>. These sub-targets generate their respective protobuf code and expose it to the build system appropriately (e.g. as a pw_source_set for C/C++).
For example, given the following target:
pw_proto_library("test_protos") {
sources = [ "my_test_protos/test.proto" ]
}
test_protos.pwpb compiles code for pw_protobuf, and test_protos.nanopb compiles using Nanopb (if it's installed).
Protobuf code is only generated when a generator sub-target is listed as a dependency of another GN target.
GN permits using abbreviated labels when the target name matches the directory name (e.g. //foo for //foo:foo). For consistency with this, the sub-targets for each generator are aliased to the directory when the target name is the same. For example, these two labels are equivalent:
//path/to/my_protos:my_protos.pwpb
//path/to/my_protos:pwpb
pw_python_package subtargets are also available on the python subtarget:
//path/to/my_protos:my_protos.python.lint
//path/to/my_protos:python.lint
Supported Codegen
GN supports the following compiled proto libraries via the specified sub-targets generated by a pw_proto_library:
${target_name}.pwpb - Generated C++ pw_protobuf code
${target_name}.pwpb_rpc - Generated C++ pw_protobuf pw_rpc code
${target_name}.nanopb - Generated C++ Nanopb code (requires Nanopb)
${target_name}.nanopb_rpc - Generated C++ Nanopb pw_rpc code (requires Nanopb)
${target_name}.raw_rpc - Generated C++ raw pw_rpc code (no protobuf library)
${target_name}.go - Generated Go protobuf libraries
${target_name}.python - Generated Python protobuf libraries
Arguments
sources: List of input .proto files.
deps: List of other pw_proto_library dependencies.
other_deps: List of other non-proto dependencies.
inputs: Other files on which the protos depend (e.g. nanopb .options files).
prefix: A prefix to add to the source protos prior to compilation. For example, a source called "foo.proto" with prefix = "nested" will be compiled with protoc as "nested/foo.proto".
strip_prefix: Remove this prefix from the source protos. All source and input files must be nested under this path.
python_package: Label of the Python package to which to add the proto modules. The .python subtarget will redirect to this package.
enabled_targets: List of sub-targets to enable (see Supported Codegen), e.g. ["pwpb", "raw_rpc"]. By default, all sub-targets are enabled. The enabled sub-targets are built only as requested by the build system, but it may be necessary to explicitly disable an unused sub-target if it conflicts with another target in the same package. (For example, nanopb codegen can conflict with the default C++ codegen provided by protoc.) A sketch using inputs and enabled_targets follows the example below. TODO: b/235132083 - Remove this argument once we've removed the file-name conflict between nanopb and protoc code generators.
Example
import("$dir_pw_protobuf_compiler/proto.gni")
pw_proto_library("my_protos") {
sources = [
"my_protos/foo.proto",
"my_protos/bar.proto",
]
}
pw_proto_library("my_other_protos") {
sources = [ "some/other/path/baz.proto" ] # imports foo.proto
# This removes the "some/other/path" prefix from the proto files.
strip_prefix = "some/other/path"
# This adds the "my_other_protos/" prefix to the proto files.
prefix = "my_other_protos"
# Proto libraries depend on other proto libraries directly.
deps = [ ":my_protos" ]
}
source_set("my_cc_code") {
sources = [
"foo.cc",
"bar.cc",
"baz.cc",
]
# When depending on protos in a source_set, specify the generator suffix.
deps = [ ":my_other_protos.pwpb" ]
}
From C++, baz.proto is included as follows:
#include "my_other_protos/baz.pwpb.h"
From Python, baz.proto is imported as follows:
from my_other_protos import baz_pb2
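The inputs and enabled_targets arguments are not shown above. A target using them might look like the following sketch; the file and target names are illustrative:

pw_proto_library("device_protos") {
  sources = [ "my_protos/device.proto" ]

  # Nanopb reads field options from the listed .options file.
  inputs = [ "my_protos/device.options" ]

  # Only generate the pw_protobuf and Nanopb sub-targets.
  enabled_targets = [
    "pwpb",
    "nanopb",
  ]
}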
Proto file structure#
Protobuf source files must be nested under another directory when they are compiled. This ensures that they can be packaged properly in Python.
Using prefix and strip_prefix together allows remapping proto files to a completely different path. This can be useful when working with protos defined in external libraries. For example, consider this proto library:
pw_proto_library("external_protos") {
sources = [
"//other/external/some_library/src/protos/alpha.proto",
"//other/external/some_library/src/protos/beta.proto,
"//other/external/some_library/src/protos/internal/gamma.proto",
]
strip_prefix = "//other/external/some_library/src/protos"
prefix = "some_library"
}
These protos will be compiled by protoc as if they were in this file structure:
some_library/
├── alpha.proto
├── beta.proto
└── internal
└── gamma.proto
Adding Python proto modules to an existing package#
By default, generated Python proto modules are organized into their own Python package. These proto modules can instead be added to an existing Python package declared with pw_python_package. This is done by setting the python_package argument on the pw_proto_library and the proto_library argument on the pw_python_package.
For example, the protos declared in my_protos will be nested in the Python package declared by my_package.
pw_proto_library("my_protos") {
sources = [ "hello.proto ]
prefix = "foo"
python_package = ":my_package"
}
pw_python_pacakge("my_package") {
generate_setup = {
metadata = {
name = "foo"
version = "1.0"
}
}
sources = [ "foo/cool_module.py" ]
proto_library = ":my_protos"
}
The hello_pb2.py proto module can be used alongside other files in the foo package.
from foo import cool_module, hello_pb2
Working with externally defined protos#
pw_proto_library targets may be used to build .proto sources from existing projects. In these cases, it may be necessary to supply the strip_prefix argument, which specifies the protobuf include path to use for protoc. If only a single external protobuf is being compiled, the python_module_as_package option can be used to override the requirement that the protobuf be nested under a directory. This option generates a Python package with the same name as the proto file, so that the generated proto can be imported as if it were a standalone Python module.
For example, the pw_proto_library target for Nanopb sets python_module_as_package to nanopb_pb2.
pw_proto_library("proto") {
strip_prefix = "$dir_pw_third_party_nanopb/generator/proto"
sources = [ "$dir_pw_third_party_nanopb/generator/proto/nanopb.proto" ]
python_module_as_package = "nanopb_pb2"
}
In Python, this makes nanopb.proto available as import nanopb_pb2 via the nanopb_pb2 Python package. In C++, nanopb.proto is accessed as #include "nanopb.pwpb.h".
The python_module_as_package feature should only be used when absolutely necessary, for example to support proto files that include import "nanopb.proto".
Specifying a custom protoc#
If your build needs to use a custom build of protoc rather than the one supplied by Pigweed, it can be specified by setting pw_protobuf_compiler_PROTOC_TARGET to a GN target that produces a protoc executable, and pw_protobuf_compiler_PROTOC_BINARY to the path, relative to root_build_dir, of the protoc executable.
For all protoc invocations, the build will add a dependency on that target and will invoke that executable.
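For example, a build might set these arguments in its args.gn as follows; the target label and output path below are illustrative, so substitute the ones your own build actually produces:

# args.gn (illustrative values)
# GN target that builds the custom protoc executable.
pw_protobuf_compiler_PROTOC_TARGET = "//third_party/custom_protobuf:protoc"
# Path to the resulting protoc binary, relative to root_build_dir.
pw_protobuf_compiler_PROTOC_BINARY = "custom_protobuf/protoc"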
CMake#
CMake provides a pw_proto_library function with similar features to the GN template. The CMake build only supports building firmware code, so pw_proto_library does not generate a Python package.
Arguments
NAME: the base name of the libraries to create
SOURCES: .proto source files
DEPS: dependencies on other pw_proto_library targets
PREFIX: prefix to add to the proto files
STRIP_PREFIX: prefix to remove from the proto files
INPUTS: files to include along with the .proto files (such as Nanopb .options files)
Example
include($ENV{PW_ROOT}/pw_build/pigweed.cmake)
include($ENV{PW_ROOT}/pw_protobuf_compiler/proto.cmake)
pw_proto_library(my_module.my_protos
  SOURCES
    my_protos/foo.proto
    my_protos/bar.proto
)
pw_proto_library(my_module.my_other_protos
  SOURCES
    some/other/path/baz.proto  # imports foo.proto

  # This removes the "some/other/path" prefix from the proto files.
  STRIP_PREFIX
    some/other/path

  # This adds the "my_other_protos/" prefix to the proto files.
  PREFIX
    my_other_protos

  # Proto libraries depend on other proto libraries directly.
  DEPS
    my_module.my_protos
)
add_library(my_module.my_cc_code
  foo.cc
  bar.cc
  baz.cc
)

# When depending on protos in a CMake library, specify the generator suffix.
target_link_libraries(my_module.my_cc_code PUBLIC
  my_module.my_other_protos.pwpb
)
These proto files are accessed in C++ the same as in the GN build:
#include "my_other_protos/baz.pwpb.h"
Supported Codegen
CMake supports the following compiled proto libraries via the specified sub-targets generated by a pw_proto_library:
${NAME}.pwpb - Generated C++ pw_protobuf code
${NAME}.pwpb_rpc - Generated C++ pw_protobuf pw_rpc code
${NAME}.nanopb - Generated C++ Nanopb code (requires Nanopb)
${NAME}.nanopb_rpc - Generated C++ Nanopb pw_rpc code (requires Nanopb)
${NAME}.raw_rpc - Generated C++ raw pw_rpc code (no protobuf library)
Bazel#
In Bazel, we provide a set of rules with similar features to the GN templates:
pwpb_proto_library - Generated C++ pw_protobuf code
pwpb_rpc_proto_library - Generated C++ pw_protobuf pw_rpc code
raw_rpc_proto_library - Generated C++ raw pw_rpc code (no protobuf library)
nanopb_proto_library - Generated C++ Nanopb code
nanopb_rpc_proto_library - Generated C++ Nanopb pw_rpc code
These rules build the corresponding firmware code; there are no rules for generating Python libraries. The Bazel rules differ slightly compared to the GN build to be more in line with what would be considered idiomatic in Bazel.
To use Pigweed's protobuf rules, you must first pull the required dependencies into your Bazel WORKSPACE file, e.g.:
# WORKSPACE ...
load("@pigweed//pw_protobuf_compiler:deps.bzl", "pw_protobuf_dependencies")
pw_protobuf_dependencies()
Bazel uses a different set of rules to manage proto files than it does to compile them, e.g.:
# BUILD ...
load("@rules_proto//proto:defs.bzl", "proto_library")
load(
    "@pigweed//pw_protobuf_compiler:pw_proto_library.bzl",
    "nanopb_proto_library",
    "nanopb_rpc_proto_library",
    "pwpb_proto_library",
    "raw_rpc_proto_library",
)

# Manages proto sources and dependencies.
proto_library(
    name = "my_proto",
    srcs = [
        "my_protos/foo.proto",
        "my_protos/bar.proto",
    ],
)

# Compiles dependent protos to C++.
pwpb_proto_library(
    name = "my_proto_pwpb",
    deps = [":my_proto"],
)

nanopb_proto_library(
    name = "my_proto_nanopb",
    deps = [":my_proto"],
)

raw_rpc_proto_library(
    name = "my_proto_raw_rpc",
    deps = [":my_proto"],
)

nanopb_rpc_proto_library(
    name = "my_proto_nanopb_rpc",
    nanopb_proto_library_deps = [":my_proto_nanopb"],
    deps = [":my_proto"],
)
# Library that depends on only pw_protobuf generated proto targets.
cc_library(
    name = "my_proto_only_lib",
    srcs = ["my/proto_only.cc"],
    deps = [":my_proto_pwpb"],
)

# Library that depends on only Nanopb generated proto targets.
cc_library(
    name = "my_nanopb_only_lib",
    srcs = ["my/nanopb_only.cc"],
    deps = [":my_proto_nanopb"],
)

# Library that depends on pw_protobuf and pw_rpc/raw.
cc_library(
    name = "my_raw_rpc_lib",
    srcs = ["my/raw_rpc.cc"],
    deps = [
        ":my_proto_pwpb",
        ":my_proto_raw_rpc",
    ],
)

# Library that depends on Nanopb and pw_rpc/nanopb.
cc_library(
    name = "my_nanopb_rpc_lib",
    srcs = ["my/nanopb_rpc.cc"],
    deps = [
        ":my_proto_nanopb_rpc",
    ],
)
From my/lib.cc you can now include the generated headers, e.g.:
#include "my_protos/bar.pwpb.h"
// and/or RPC headers
#include "my_protos/bar.raw_rpc.pb.h
// or
#include "my_protos/bar.nanopb_rpc.pb.h"
Why isn’t there one rule to generate all the code?#
There is! Like in GN, it's called pw_proto_library, and has subtargets corresponding to the different codegen flavors. However, new code should not use this. It is deprecated, and will be removed in the future.
The pw_proto_library target has a number of disadvantages:
- As a general Bazel style rule, macros should produce exactly one target for external use, named according to the invocation's name argument. BUILD files are easier to follow when the name specified in the macro call actually matches the name of the generated target. This is not possible if a single macro generates multiple targets, as pw_proto_library does.
- If you depend directly on the pw_proto_library, rather than the appropriate subtargets, you will build code you don't actually use. You may even fetch dependencies you don't need, like Nanopb.
- The subtargets you don't depend on are still added to your BUILD files by the pw_proto_library macro, and Bazel will attempt to build them when you run bazel build //... . This may cause build breakages, and has forced us to implement awkward workarounds.
Python proto libraries#
pw_protobuf_compiler includes utilities for working with protocol buffers in Python. The tools facilitate using protos from their package names (my.pkg.Message()) rather than their generated module names (proto_source_file_pb2.Message()).
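As a minimal sketch of the difference, assuming protoc is available to compile on the fly and using a hypothetical my/pkg/messages.proto that declares package my.pkg with a Message type:

from pw_protobuf_compiler import python_protos

# Compile the proto on the fly and access it by its protobuf package name.
protos = python_protos.Library.from_paths(['my/pkg/messages.proto'])
msg = protos.packages.my.pkg.Message()

# Equivalent to importing the generated module directly:
#   from my.pkg import messages_pb2
#   msg = messages_pb2.Message()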
python_protos module#
Tools for compiling and importing Python protos on the fly.
- class pw_protobuf_compiler.python_protos.Library(modules: Iterable[ModuleType])#
A collection of protocol buffer modules sorted by package.
In Python, each .proto file is compiled into a Python module. The Library class makes it simple to navigate a collection of Python modules corresponding to .proto files, without relying on the location of these compiled modules.
Proto messages and other types can be directly accessed by their protocol buffer package name. For example, the foo.bar.Baz message can be accessed in a Library called protos as:
protos.packages.foo.bar.Baz
A Library also provides the modules_by_package dictionary, for looking up the list of modules in a particular package, and the modules() generator for iterating over all modules.
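For example, a sketch of looking up modules by package; this assumes modules_by_package is keyed by package-name strings such as 'foo.bar':

# `protos` is a Library; 'foo.bar' is whatever package was loaded into it.
for module in protos.modules_by_package['foo.bar']:
    print(module.__name__)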
- __init__(modules: Iterable[ModuleType])#
Constructs a Library from an iterable of modules.
A Library can be constructed with modules dynamically compiled by compile_and_import. For example:
protos = Library(compile_and_import(list_of_proto_files))
- classmethod from_paths(protos: Iterable[str | Path | ModuleType])#
Creates a Library from paths to proto files or proto modules.
- classmethod from_strings(contents: Iterable[str], includes: Iterable[str | Path] = (), output_dir: Path | str | None = None)#
Creates a proto library from protos in the provided strings.
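A minimal usage sketch, assuming protoc is available to compile the inline definition; the example.store package and Item message are hypothetical:

from pw_protobuf_compiler import python_protos

ITEM_PROTO = """\
syntax = "proto3";
package example.store;

message Item {
  string name = 1;
  uint32 quantity = 2;
}
"""

# Compile the proto from the string and look up the message by package name.
protos = python_protos.Library.from_strings([ITEM_PROTO])
item = protos.packages.example.store.Item(name='bolt', quantity=50)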
- messages() → Iterable#
Iterates over all protobuf messages in this library.
- modules() → Iterable#
Iterates over all protobuf modules in this library.
- pw_protobuf_compiler.python_protos.proto_repr(message, *, wrap: bool = True) → str#
Creates a repr-like string for a protobuf.
In an interactive console that imports proto objects into the namespace, the output of proto_repr() can be used as Python source to create a proto object.
- Parameters:
message – The protobuf message to format
wrap – If true and black is available, the output is wrapped according to PEP8 using black.
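A small usage sketch, reusing the hypothetical Item message from the from_strings example above; the exact output text is illustrative:

from pw_protobuf_compiler.python_protos import proto_repr

# `item` is a protobuf message instance, e.g. the hypothetical
# example.store.Item(name='bolt', quantity=50) constructed earlier.
print(proto_repr(item))
# Expected to print Python-source-like text along the lines of:
#   example.store.Item(name='bolt', quantity=50)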