HTTP to gRPC Translation

August 2, 2022

This is a situation I have encountered several times in my career: there is an existing microservice that was initially designed to be internal-only, and now the portal needs access to the capabilities or data provided by this service.

And because many microservices are built with schema-reliant protocols like gRPC with protobuf, they aren’t easy to expose directly as a web service.

What are your options?

There are a few things that you can do to connect your front-end to the gRPC backend. Can you guess which option I picked?

  • You can have your JavaScript call the gRPC backend directly. But this would not help with any REST API (or Prometheus scraper) that you may want to support.
  • You can create a translating service that has an HTTP server component and have that service call the gRPC backend. This allows you to support REST API endpoints.
  • Use Envoy Proxy with its gRPC to HTTP transcoding filter. If you need to add some additional logic, you can also add that using Lua script that executes directly in Envoy.

Envoy HTTP to gRPC Transcoding

In this article, we are going to discuss how to use Envoy Proxy to translate HTTP to gRPC. Additionally, I will show you how to add Lua scripts that rewrite your payload so the incoming request and outgoing response aren’t just JSONified protobuf messages.

Explaining my use case

Why did I need this? We are developing a new runtime called Wasmflow, and one of its requirements is that metrics and statistics are enabled by default. Many of you may have heard of Prometheus, the de facto metrics collection mechanism, and I want Prometheus to be able to scrape metrics from applications running on our runtime.

Prometheus can only scrape metrics from HTTP-enabled endpoints. While we are working to formalize a metrics endpoint over HTTP directly in Wasmflow, we currently only have a Stats RPC defined on the InvocationService in our proto file, which can be queried only over gRPC. In this blog, I will explain how I used Envoy and Lua to transform my requests and responses into a format that Prometheus can consume.

Example Project Source Code

Feel free to clone the source code. You can see everything working by running docker-compose up, which starts a client, a Wasmflow server, an Envoy proxy, and Prometheus. The client automatically simulates traffic against the server so you can watch the metrics increase and be collected over time. Once the containers are up, visit localhost:9090 in your browser to explore the metrics in the local Prometheus instance.
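For example (the repository URL is elided here; use the example project linked above):

# Clone the example project; <example-repo-url> is a placeholder.
git clone <example-repo-url> && cd <example-repo>

# Start the client, Wasmflow server, Envoy proxy, and Prometheus together.
docker-compose up

# Then open the local Prometheus UI at http://localhost:9090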

Step 1: Prepare your development environment

To use Envoy’s gRPC to JSON transcoding functionality, you need to first prepare your protobuf file. You will need to download the googleapis repo and make sure you have protoc installed. Instructions are in the links.
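As a minimal sketch, one way to set up both prerequisites (the checkout location is your choice; the compile command in Step 2 expects $GOOGLEAPIS_DIR to point at it):

# Fetch the googleapis protos, which provide google/api/annotations.proto.
git clone https://github.com/googleapis/googleapis.git
export GOOGLEAPIS_DIR=$(pwd)/googleapis

# Confirm protoc is installed and on your PATH (install it via your
# package manager or a protobuf release if this fails).
protoc --version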

Step 2: Update your protobuf file

To use the gRPC to JSON transcoding, you first need to update and compile your protobuf file so Envoy Proxy knows which gRPC services will be reachable through which HTTP paths.

This is done by updating the appropriate protobuf service with a google.api.http option. Make sure you add the import "google/api/annotations.proto"; line at the top so protoc can properly compile the protobuf file. In my use case, I want Prometheus to scrape the /metrics endpoint over HTTP to reach the Stats service, while other gRPC clients can continue to call the /wasmflow.InvocationService/Stats endpoint using gRPC natively.

syntax = "proto3";
import "google/api/annotations.proto";

package wasmflow;

service InvocationService {
  rpc Invoke(Invocation) returns (stream Output);
  rpc List(ListRequest) returns (ListResponse);
  rpc Stats(StatsRequest) returns (StatsResponse) {
    option (google.api.http) = {
      get: "/metrics"
    };
  }
}

...

Once you have updated the protobuf with the appropriate HTTP endpoint, you can compile it using the following command (make sure the $GOOGLEAPIS_DIR environment variable is set as described in Step 1):

protoc -I${GOOGLEAPIS_DIR} -I. --include_imports --include_source_info --descriptor_set_out=./envoy/wasmflow.pb ./envoy/wasmflow.proto

Step 3: Update your Envoy config file to enable gRPC transcoding

In my initial Envoy config, I created two route matches. This gives me greater flexibility later when I want to control which requests are manipulated by Lua. It is possible to do a single route match for / and have Lua operate only based on the response content-type header, but I prefer these kinds of conditionals to be very explicit, so I opted for this approach.

virtual_hosts:
  - name: local_service
    domains: ["*"]
    routes:
      - match:
          prefix: "/metrics"
        route:
          cluster: wasmflow
      - match:
          prefix: "/wasmflow.InvocationService"
          grpc: {}
        route:
          cluster: wasmflow

I then enabled the grpc_json_transcoder filter, making sure to set match_incoming_request_route and preserve_proto_field_names to true. If you don’t preserve the field names, the transcoder will automatically convert any snake_case fields to camelCase. Setting match_incoming_request_route to true ensures the route match conditions above work as expected.

name: envoy.filters.http.grpc_json_transcoder
typed_config:
  "@type": type.googleapis.com/envoy.extensions.filters.http.grpc_json_transcoder.v3.GrpcJsonTranscoder
  proto_descriptor: "/etc/envoy/wasmflow.pb"
  services: ["wasmflow.InvocationService"]
  match_incoming_request_route: true
  print_options:
    add_whitespace: true
    always_print_primitive_fields: true
    always_print_enums_as_ints: false
    preserve_proto_field_names: true

Step 4: Add some Lua code to convert from JSON to Prometheus Metric

I created a Lua script that reads the response JSON and uses its values to produce text compliant with the Prometheus exposition format. I made my life simpler by importing a JSON parsing library for Lua.

The Lua script is straightforward (a sketch follows this list):

  • read the body
  • parse it as JSON
  • create a text output from the JSON values
  • set the response body and update the headers
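Here is a minimal sketch of what such a convert.lua can look like. The StatsResponse field names (stats, name, count) are hypothetical placeholders for whatever your proto actually returns, and json stands in for whichever Lua JSON library you bundle (e.g. rxi’s json.lua):

-- convert.lua: sketch of the JSON-to-exposition-format rewrite.
local json = require("json")

function envoy_on_response(response_handle)
  -- Read the buffered JSON body produced by the transcoder.
  local body = response_handle:body()
  local decoded = json.decode(body:getBytes(0, body:length()))

  -- Build Prometheus exposition-format lines from the JSON values,
  -- e.g. {"stats":[{"name":"invocations","count":42}]} becomes
  --   # TYPE wasmflow_invocations counter
  --   wasmflow_invocations 42
  local lines = {}
  for _, stat in ipairs(decoded.stats or {}) do
    table.insert(lines, "# TYPE wasmflow_" .. stat.name .. " counter")
    table.insert(lines, "wasmflow_" .. stat.name .. " " .. tostring(stat.count))
  end
  local output = table.concat(lines, "\n") .. "\n"

  -- Replace the body and fix the headers so Prometheus accepts the response.
  local content_length = response_handle:body():setBytes(output)
  response_handle:headers():replace("content-length", content_length)
  response_handle:headers():replace("content-type", "text/plain; version=0.0.4")
end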

In the Final Envoy Config, I added the envoy.filters.http.lua extension in the http_filters section above (before) the grpc_json_transcoder filter:

http_filters:
  - name: envoy.filters.http.lua
    typed_config:
      "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.Lua
      source_codes:
        convert.lua:
          filename: /lua/convert.lua
  - name: envoy.filters.http.grpc_json_transcoder
...

Then I enabled Lua for the /metrics path and disabled it for the gRPC path.

virtual_hosts:
  - name: local_service
    domains: ["*"]
    routes:
      - match:
          prefix: "/metrics"
        route:
          cluster: wasmflow
        typed_per_filter_config:
          envoy.filters.http.lua:
            "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.LuaPerRoute
            name: convert.lua
      - match:
          prefix: "/wasmflow.InvocationService"
          grpc: {}
        route:
          cluster: wasmflow
        typed_per_filter_config:
          envoy.filters.http.lua:
            "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.LuaPerRoute
            disabled: true
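With this config in place, you can sanity-check the whole pipeline with curl. The Envoy listener port below is an assumption; substitute whatever port your listener exposes:

# Should return Prometheus exposition text rather than JSONified protobuf.
curl -s http://localhost:10000/metrics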

Step 5: Done

Envoy Proxy is very powerful, and Lua lets you do some heavy lifting without much effort. If you have any questions about this or want to discuss any related topics, join our Discord chat.

Written By
Fawad Shaikh