Below you will find pages that utilize the taxonomy term “Protobufs”
Google Cloud Events protobufs and SDKs
I’ve written before about Ackal’s use of Firestore and subscribing to Firestore document CRUD events:
- Routing Firestore events to GKE with Eventarc
- Cloud Firestore Triggers in Golang using Firestore triggers
I find Google’s Eventarc documentation confusing and, in typical Google fashion, even though it’s open-sourced, you often need to do some legwork to find the relevant sources, viz:
- Google’s Protobufs for Eventarc¹ (using CloudEvents): google-cloudevents
- Convenience (since you can generate these yourself using `protoc`) language-specific types generated from the above, e.g. google-cloudevents-go, google-cloudevents-python etc.

¹ IIUC, Eventarc is the Google service. It carries Google Events that are CloudEvents; these are defined by protocol buffer schemas.
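For example, a minimal sketch of generating the Go types yourself (the repo is real; the paths within it and the output layout are assumptions):

```bash
# Sketch only: proto paths within googleapis/google-cloudevents are assumptions;
# requires protoc and protoc-gen-go on PATH.
git clone https://github.com/googleapis/google-cloudevents
mkdir -p gen
protoc \
  --proto_path=google-cloudevents/proto \
  --go_out=gen \
  --go_opt=paths=source_relative \
  google-cloudevents/proto/google/events/cloud/firestore/v1/*.proto
```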
Prost! Tonic w/ a dash of JSON
I naively (!) began exploring JSON marshaling of Protobufs in rust. Other protobuf language SDKs include JSON marshaling, making the process straightforward. I was to learn that, in rust, it’s not so simple. Unfortunately, for me, this continues to discourage my further use of rust (rust is just hard).
My goal was to marshal an arbitrary protocol buffer message that included a `oneof` field. I was unable to JSON marshal the rust code generated by `tonic` for such a message.
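For contrast, in Go (APIv2) this is a one-liner via `protojson`; a minimal sketch using `structpb.Value`, whose `kind` field is itself a `oneof`:

```go
package main

import (
	"fmt"

	"google.golang.org/protobuf/encoding/protojson"
	"google.golang.org/protobuf/types/known/structpb"
)

func main() {
	// structpb.Value's `kind` field is a oneof, so this exercises the
	// same protobuf feature that proved awkward in rust.
	v := structpb.NewStringValue("hello")
	b, err := protojson.Marshal(v)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // "hello"
}
```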
Prometheus Protobufs and Native Histograms
I responded to the question “Prometheus metric protocol buffer in gRPC” on Stack Overflow; it piqued my curiosity and got me yak shaving.
Prometheus used to support two exposition formats including Protocol Buffers, then dropped Protocol Buffers and has since re-added it (see Protobuf format). The Protobuf format has returned to support the experimental Native Histograms feature.
I’m interested in adding Native Histogram support to Ackal so I thought I’d learn more about this metric.
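As a sketch of what that looks like (assuming client_golang v1.14+, where a non-zero NativeHistogramBucketFactor opts a histogram into native buckets; the metric name here is hypothetical):

```go
package main

import (
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

func main() {
	// NativeHistogramBucketFactor > 1 enables the experimental native
	// (sparse) buckets; these are only exposed via the Protobuf format.
	h := prometheus.NewHistogram(prometheus.HistogramOpts{
		Name:                        "ackal_probe_duration_seconds", // hypothetical name
		Help:                        "Probe latency.",
		NativeHistogramBucketFactor: 1.1,
	})
	prometheus.MustRegister(h)
	h.Observe(0.42)

	http.Handle("/metrics", promhttp.Handler())
	http.ListenAndServe(":8080", nil)
}
```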
Gnarly Protocol Buffers compilation
This Stack Overflow question piqued my interest:
retry policy configuration for grpc not working
Service Config in gRPC is new to me but my initial suspicion (albeit incorrect) was that the JSON types were incorrect.
I decided to try using the Protocol Buffer source `service_config.proto` to verify the JSON.
To do so I needed to compile the source… it was gnarly.
There are 2 repos used. The `service_config.proto` includes `options` for `java_package` but no `go_package`.
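A sketch of the kind of invocation this ends up requiring (the repo names and proto path here are assumptions; the `M` mapping injects the missing `go_package`):

```bash
# Sketch: assumes grpc-proto and googleapis are cloned side by side, and
# that protoc-gen-go is on PATH. The M flag supplies a Go import path for
# a proto that declares no go_package option.
protoc \
  --proto_path=grpc-proto \
  --proto_path=googleapis \
  --go_out=gen \
  --go_opt=Mgrpc/service_config/service_config.proto=example.com/serviceconfig \
  grpc-proto/grpc/service_config/service_config.proto
```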
Python Protobuf changes
Python’s Protocol Buffers code-generation using `protoc` has had significant changes that can cause developers… “challenges”. This post summarizes my experience of these, mostly to save me from repeatedly recreating this history for myself when I forget it.
- Version change
- Generated code change
- Implementation Backends (see the sketch after this list)
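On that last point, a quick way to check which backend is active (the env var and module below are real; the set of accepted values varies across protobuf versions):

```bash
# Force the pure-Python backend (other values, depending on version: cpp, upb).
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
python3 -c "from google.protobuf.internal import api_implementation; print(api_implementation.Type())"
```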
I’ll use this summarized table of `protoc` and the PyPI library’s history in this post. `protoc` refers to the compiler that supports code-generation in multiple languages. `protobuf` refers to the corresponding Python (runtime) library on PyPI:
Access Google Services using gRPC
Google publishes interface definitions of Google APIs (services) that support REST and gRPC in a repo called Google APIs. Google’s SDKs use gRPC to access these services but how can you do this using e.g. gRPCurl?
I wanted to debug Cloud Profiler; its agent makes `UpdateProfile` RPCs to `cloudprofiler.googleapis.com`. Cloud Profiler is a more challenging service to debug because (a) it’s publicly “write-only”; and (b) it has complex messages. `UpdateProfile` sends `UpdateProfileRequest` messages that include `Profile` messages that include `profile_bytes`, which are gzip-compressed serialized protos of pprof’s `Profile`.
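As a sketch of the approach (the proto path within a googleapis clone is what I’d expect; treat it and the exact flags as assumptions):

```bash
# List the Cloud Profiler service's methods with gRPCurl, supplying the
# googleapis protos for descriptors and an OAuth access token.
TOKEN=$(gcloud auth print-access-token)
grpcurl \
  -import-path googleapis \
  -proto google/devtools/cloudprofiler/v2/profiler.proto \
  -H "Authorization: Bearer ${TOKEN}" \
  cloudprofiler.googleapis.com:443 \
  list google.devtools.cloudprofiler.v2.ProfilerService
```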
WASM Transparency
I’ve been playing around with a proof-of-concept combining WASM and Trillian. The hypothesis was to explore using WASM as a form of chaincode with Trillian. The project works but it’s far from being a chaincode-like solution.
Let’s start with a couple of (trivial) examples and then I’ll explain what’s going on and how it’s implemented.
2020/08/14 18:42:17 [main:loop:dynamic-invoke] Method: mul
2020/08/14 18:42:17 [random:New] Message
2020/08/14 18:42:17 [random:New] Float32
2020/08/14 18:42:17 [random:New] Float32
2020/08/14 18:42:17 [random:New] Message
2020/08/14 18:42:17 [random:New] Float32
2020/08/14 18:42:17 [random:New] Float32
2020/08/14 18:42:17 [Client:Invoke] Metadata: complex.wasm
2020/08/14 18:42:17 [main:loop:dynamic-invoke] Success: result:{real:0.036980484 imag:0.3898267}
After shipping a Rust-sourced WASM solution (`complex.wasm`) to the WASM transparency server, the client invokes a method `mul` that’s exposed by it using a dynamically generated request message and outputs the response. Woo hoo! Yes, an expensive way to multiply complex numbers.
Remotely invoking WASM functions using gRPC and waPC
Following on from waPC & Protobufs, I can now remotely invoke (arbitrary) WASM functions:
Client:
The logging isn’t perfectly clear but the client gets (a previously added) WASM binary from the server (using the SHA-256 of the WASM binary as a unique identifier). The result includes metadata containing a protobuf descriptor of the WASM binary’s functions. The descriptor defines gRPC services (that represent the WASM functions) with input (parameter) and output (result) messages.
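Hypothetically, the descriptor for the `mul` example above might describe something of this shape (this .proto is mine, inferred from the logged params and result, not the project’s actual schema):

```proto
syntax = "proto3";

// Inferred shape only: field names follow the JSON params
// ({"a":{"real":...},"b":{...}}) and the logged result
// (result:{real:... imag:...}); the real descriptor is generated.
message Complex {
  float real = 1;
  float imag = 2;
}

message MulRequest {
  Complex a = 1;
  Complex b = 2;
}

message MulResponse {
  Complex result = 1;
}

service ComplexService {
  rpc Mul(MulRequest) returns (MulResponse);
}
```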
Golang Protobuf APIv2
Google has a new Golang Protobuf API, APIv2 (google.golang.org/protobuf), superseding APIv1 (github.com/golang/protobuf). If your code imports `google.golang.org/protobuf`, you’re using APIv2; if it imports `github.com/golang/protobuf`, you’re using APIv1, though you should consult the docs because Google reimplemented APIv1 atop APIv2. One challenge this caused me, as someone who does not use protobufs and gRPC on a daily basis, is that gRPC code-generation is being removed from the (Golang) `protoc-gen-go` plugin; gRPC service bindings are now generated by the separate `protoc-gen-go-grpc` plugin.
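A minimal sketch of the resulting two-plugin workflow (the proto path is a placeholder):

```bash
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest

# Messages come from protoc-gen-go; gRPC bindings from protoc-gen-go-grpc.
protoc \
  --go_out=. \
  --go-grpc_out=. \
  path/to/service.proto
```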
WASM Cloud Functions
Following on from waPC & Protobufs and a question on Stack Overflow about Cloud Functions, I was compelled to try running WASM on Cloud Functions, sans JavaScript.
I wanted to reuse the WASM waPC functions that I’d written in Rust as described in the other post. Cloud Functions does not (yet!?) provide a Rust runtime and so I’m using the waPC Host for Go in this example.
It works!
PARAMS=$(printf '{"a":{"real":39,"imag":3},"b":{"real":39,"imag":3}}' | base64)
TOKEN=$(gcloud auth print-identity-token)
echo "{
\"filename\":\"complex.wasm\",
\"function\":\"c:mul\",
\"params\":\"${PARAMS}\"
}" |\
curl \
--silent \
--request POST \
--header "Content-Type: application/json" \
--header "Authorization: Bearer ${TOKEN}" \
--data @- \
https://${REGION}-${PROJECT}.cloudfunctions.net/invoker
yields (correctly):
waPC & Protobufs
I’m hacking around with a solution that combines WASM and Google Trillian.
Ultimately, I’d like to be able to ship WASM (binaries) to a Trillian personality and then invoke (exported) functions on them. Some of this was born from the interesting exploration of Krustlet and its application of wascc.
I’m still getting up to speed with WASM but it’s a very interesting technology whose potential is perhaps most interesting outside the browser. Some folks have been trailblazing the technology and I have been reading Kevin Hoffman’s Medium posts and his wascc (née waxosuit) work. From this, I stumbled upon Kevin’s waPC and I’m using waPC in this prototyping as a way to exchange data between clients and servers running WASM binaries.
Rust implementation of Crate Transparency using Google Trillian
I’ve been hacking on a Rust-based transparent application for Google Trillian. As appears to be my fixation, this personality is for another package manager. This time, it’s Rust’s crates, as found in crates.io, Rust’s package registry. I discussed this project earlier this month in Rust Crate Transparency && Rust SDK for Google Trillian and an earlier approach for Python’s packages with pypi-transparency.
This time, of course, I’m using Rust and, by way of a first for me, using it for the gRPC server implementation (aka “personality”) too. I’ve been lazy, thanks to the excellent gRPCurl, and have been using it by way of a client. Because I’m more familiar with Golang and because I’ve written (most) other Trillian personalities in Golang, I resorted to quickly implementing Crate Transparency in Golang too in order to uncover bugs in the Rust implementation. I’ll write a follow-up post on the complexity I seem to struggle with when using protobufs and gRPC [in Golang].
gRPC, Cloud Run & Endpoints
<3 Google but there’s quite often an assumption that we’re all sitting around the engineering table and, of course, we’re not.
Cloud Endpoints is a powerful offering but – IMO – it’s super confusing to understand and complex to deploy.
If you’re familiar with the motivations behind service meshes (e.g. Istio), Cloud Endpoints fits a similar niche (“neesh” or “nitch”?). The underlying ambition is that developers can take existing code and, by adding a proxy (or sidecar), gain general-purpose abstractions: security, logging etc.
Golang gRPC Cloud Run
Update: 2020-03-24: Since writing this post, I’ve contributed Golang and Rust samples to Google’s project. I recommend you start there.
Google explained how to run gRPC servers with Cloud Run. The examples are good but cover only Python and Node.js:
Missing Golang… until now ;-)
I had problems with Go 1.14 and so I’m using Go 1.13.
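The gist, as a minimal sketch (service registration elided; Cloud Run injects PORT):

```go
package main

import (
	"log"
	"net"
	"os"

	"google.golang.org/grpc"
)

func main() {
	// Cloud Run tells the container which port to listen on via PORT.
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}
	listener, err := net.Listen("tcp", ":"+port)
	if err != nil {
		log.Fatal(err)
	}
	server := grpc.NewServer()
	// Register your service implementation(s) here.
	log.Fatal(server.Serve(listener))
}
```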
Project structure
I’ll tidy up my repo but the code may be found:
Google's New Golang SDK for Protobufs
Google has released a new Golang SDK for protobuf. In the announcement, a useful technique to redact properties is described. If, like me, this is somewhat novel to you, here’s a mashup using Google’s Protocol Buffer Basics w/ redaction.
To be very clear, as it’s an important distinction:
| Version | Repo | Docs |
|---|---|---|
| v2 | google.golang.org/protobuf | Docs |
| v1 | github.com/golang/protobuf | Docs |
Project
Here’s my project structure:
.
├── protoc-3.11.4-linux-x86_64
│ ├── bin
│ │ └── protoc
│ ├── include
│ │ └── google
│ └── readme.txt
└── src
├── go.mod
├── go.sum
├── main.go
├── protos
│ ├── addressbook.pb.go
│ └── addressbook.proto
└── README.md
You may structure as you wish.
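To give a flavor of the redaction idea, here’s a much-simplified sketch using APIv2’s protoreflect. The announcement’s version keys off a custom field option; here I simply clear every string field, and the import path for the generated addressbook package is an assumption:

```go
package main

import (
	"fmt"

	"google.golang.org/protobuf/reflect/protoreflect"

	pb "example.com/protos" // hypothetical module path for addressbook.pb.go
)

// redactStrings clears every populated singular string field on the message.
// The announcement's example instead consults a custom field option to
// decide which fields to clear.
func redactStrings(m protoreflect.Message) {
	m.Range(func(fd protoreflect.FieldDescriptor, v protoreflect.Value) bool {
		if fd.Kind() == protoreflect.StringKind && !fd.IsList() && !fd.IsMap() {
			m.Clear(fd)
		}
		return true
	})
}

func main() {
	p := &pb.Person{Name: "Jane Doe", Email: "jane@example.com"}
	redactStrings(p.ProtoReflect())
	fmt.Println(p) // name and email have been cleared
}
```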