Category Archives: gRPC

The “UNAVAILABLE: Trying to connect an http1.x server” gRPC error

I’m working with a C# client library using gRPC. This is not a web-based library where there’s a need for Envoy or another application to help work with HTTP/2. Instead this was a C# client talking directly over TCP/IP.

Everything was working fine on my machine until the machine was relocated (not that this specifically was the problem, as I later found out others were having similar problems, sometimes intermittently) at which point I found that the client would throw an exception with the message “UNAVAILABLE: Trying to connect an http1.x server”.

Note: if you turn on all exceptions within Visual Studio, you’ll first see an exception where gRPC tries to get a Value from a Nullable when it’s null (this was the call to Native.grpcsharp_batch_context_recv_message_length). This is a red herring – it’s simply that the nullable is null, which seems to be expected behaviour and maybe should be handled in the gRPC .NET library code.

Testing the client on another machine demonstrated that the problem was network related – HTTP proxy related, to be precise.

This doesn’t seem to be too well documented from what I could tell, but setting the ChannelOption grpc.enable_http_proxy to 0 (see https://grpc.github.io/grpc/core/group__grpc__arg__keys.html) fixed the problem.

// note: in Grpc.Core the class is Channel (ManagedChannel is the Java API)
var channel = new Channel(
   host, 
   port, 
   ChannelCredentials.Insecure,
   new[] { new ChannelOption("grpc.enable_http_proxy", 0) });

Errors within gRPC

Whilst we’d love our code to work perfectly, we also need to handle errors that might come from our server code.

We might simply wrap any responses in a Result class (or similar) which returns data along with a success or failure status code (or of course include a status code in the response object itself).

C#, Java etc. also use exceptions, so we might prefer to throw exceptions on the server.

gRPC supports exceptions but with some limitations…

Exceptions come with status codes

Any exception thrown on the server will surface on the client as an RpcException – this does NOT reflect the exception that was originally thrown, i.e. if your server code threw an ArgumentException the client will only ever see an RpcException with a StatusCode of Unknown.

As you can see, an RpcException includes a status code, so obviously our server can instead throw an RpcException and supply the Status and StatusCode along with it, for example

throw new RpcException(
   new Status(
      StatusCode.FailedPrecondition, 
      $"Argument {nameof(request)} is invalid"));

Of course handling such exceptions is as simple as

try
{
   var response = client.Query(new NotesRequest());
}
catch (RpcException e) 
{
   Debug.WriteLine(e.Status.Detail);
   Debug.WriteLine(e.Status.StatusCode);
}
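Since the status code is all the client gets, it’s common to branch on it using exception filters. Here’s a minimal sketch, using the same client and request types as above:

```csharp
try
{
   var response = client.Query(new NotesRequest());
}
catch (RpcException e) when (e.StatusCode == StatusCode.FailedPrecondition)
{
   // the status we chose to throw from the server
   Debug.WriteLine($"Invalid request: {e.Status.Detail}");
}
catch (RpcException e)
{
   // anything else, including unhandled server exceptions,
   // which surface as StatusCode.Unknown
   Debug.WriteLine($"RPC failed with {e.StatusCode}");
}
```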

A little more gRPC

In my last two posts I looked at using .proto files to define the IDL from which protoc.exe, along with the gRPC plugin, generated our preferred language’s data types and method calls – in our case this meant generating C# files.

Now let’s look at how we can extend our code a little further, for example to include metadata/headers for sending tokens or other state between the client and server.

What was generated for us?

When we generated our code using the gRPC plugin, we started with the IDL method written as

service MusicService {
   rpc Query(NotesRequest) returns (NotesResponse) {
   }
}

however several overloads for this method were generated (I’ve cleaned them up to remove superfluous namespaces etc. and to make them easier to read in a blog post)

public virtual NotesResponse Query(
   NotesRequest request, 
   Metadata headers = null, 
   DateTime? deadline = null, 
   CancellationToken cancellationToken = 
      default(CancellationToken))
{
   return Query(request, 
      new CallOptions(
         headers, 
         deadline, 
         cancellationToken));
}

public virtual NotesResponse Query(
   NotesRequest request, 
   CallOptions options)
{
   return CallInvoker.BlockingUnaryCall(
       __Method_Query, 
       null, 
       options, 
       request);
}

public virtual AsyncUnaryCall<NotesResponse> QueryAsync(
   NotesRequest request, 
   Metadata headers = null, 
   DateTime? deadline = null, 
   CancellationToken cancellationToken = 
       default(CancellationToken))
{
   return QueryAsync(
       request, 
       new CallOptions(
           headers, 
           deadline, 
           cancellationToken));
}

public virtual AsyncUnaryCall<NotesResponse> QueryAsync(
   NotesRequest request, 
   CallOptions options)
{
   return CallInvoker.AsyncUnaryCall(
      __Method_Query, 
      null, 
      options, 
      request);
}

We’ve got several overloads, including async versions, and there’s support for passing metadata headers, a deadline (similar to a timeout), as well as a cancellation token.
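For example, the deadline parameter gives the call a hard cut-off; here’s a sketch (the five second value is arbitrary) – note that Grpc.Core expects the deadline as a UTC DateTime:

```csharp
try
{
   // give the server five seconds; after that the call fails
   // with StatusCode.DeadlineExceeded rather than hanging
   var response = client.Query(
      request,
      deadline: DateTime.UtcNow.AddSeconds(5));
}
catch (RpcException e) when (e.StatusCode == StatusCode.DeadlineExceeded)
{
   Console.WriteLine("The call timed out");
}
```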

Metadata

We might use the Metadata argument to pass in SSO tokens or other relevant information that does not form part of the actual message, here’s an example of using the Query method in such a way

// client code
var response = client.Query(request, new Metadata
{
   new Metadata.Entry("SSO", token)
});

// server 
var token = 
   context?.RequestHeaders?.FirstOrDefault(e => e.Key == "sso");

Note: Beware, the key has been turned to lower case – gRPC normalizes metadata keys to lower case, as required by the HTTP/2 specification.

In the server method we implemented in the last post you’ll notice that along with the request there’s the ServerCallContext type which contains all the other parameters that might be sent via the client, i.e. headers, cancellation token etc.

The server can return metadata using the ServerCallContext’s ResponseTrailers. However the client must use the Async versions of the client methods to receive these extra bits of data.

Here’s an example of returning something via ResponseTrailers, from the server

context.ResponseTrailers.Add(new Metadata.Entry("SSO", "Success"));

and the client would change to use the QueryAsync overload and possibly look like this

var response = client.QueryAsync(request, new Metadata
{
   new Metadata.Entry("SSO", "abcdefg")
});

foreach (var note in response.ResponseAsync.Result.Notes)
{
   Console.WriteLine(note);
}

var rt = response
   .GetTrailers()
   .FirstOrDefault(e => e.Key == "sso")
   .Value;

CallOptions

Ultimately the methods that take Metadata and other arguments end up wrapping those arguments in a CallOptions struct. However CallOptions also supports CallCredentials as well as a couple of other types (WriteOptions and ContextPropagationToken which I will not be looking at in this post).
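To sketch what that wrapping looks like if we build the CallOptions ourselves (token and cancellationToken here are assumed to exist in the surrounding code):

```csharp
// equivalent to the convenience overload that takes these
// arguments individually
var options = new CallOptions(
   headers: new Metadata { { "SSO", token } },
   deadline: DateTime.UtcNow.AddSeconds(5),
   cancellationToken: cancellationToken);

var response = client.Query(request, options);
```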

Using gRPC with Protocol Buffers

In the last post, Using Protocol Buffers, we looked at creating .proto files and how we generate C# code from them along with how to stream the binary data created by the generated code.

Let’s now look at how we might like to use Protocol Buffers “over the wire”.

Whilst we could write our own sockets code and stream data that way, there’s a close relationship between gRPC and Protocol Buffers which allows us to generate RPC code from the .proto file.

Let’s begin by adding a service and a method to the .proto file, so add the following

service MusicService {
   rpc Query(NotesRequest) returns (NotesResponse) {
   }
}

In the last post, I mentioned the NotesResponse was designed for use in the remoting code. Here it’s our return type.

To generate the gRPC code we need to make some additions to our previously defined Pre-Build event. Before we can do that, we need some more tools installed. So using NuGet, install the Grpc.Tools package and, while you’re at it, if you’re working with the VS project previously defined, also add the Grpc package.

Now, append the following to the Pre-Build command (formatted to be a little more readable)

--grpc_out $(ProjectDir) 
--plugin=protoc-gen-grpc=$(SolutionDir)packages\Grpc.Tools.1.12.0\tools\windows_x86\grpc_csharp_plugin.exe

In my case, rebuilding my VS solution will result in a new file MusicGrpc.cs which I’ll need to include in the project.

If you’ve created a Console application already, this can act as the server, so you’ll need to create another Console application to be our client. I won’t go through all the steps for adding the files etc. but let’s just jump straight into looking at the server code.

The Server

Add a new class – mine’s MusicServer – and derive it from the gRPC-generated MusicService.MusicServiceBase, like this

using System.Threading.Tasks;
using Grpc.Core;
using PutridParrot.Music;

namespace Server
{
    public class MusicServer : MusicService.MusicServiceBase
    {
        public override Task<NotesResponse> Query(
           NotesRequest request, 
           ServerCallContext context)
        {
            if (request.Key == Note.C)
            {
                return Task.FromResult(new NotesResponse
                {
                    Name = request.Name,
                    Key = request.Key,
                    Notes =
                    {
                        Note.C, Note.E, Note.G
                    }
                });
            }
            return base.Query(request, context);
        }
    }
}

Obviously the functionality here is rather limited, but you get the idea: the Query method was generated for us by protoc, and we simply supply our implementation.

To run up the server, we change our Main method to look like this

var server = new Grpc.Core.Server
{
   Services = 
   {
      MusicService.BindService(new MusicServer())
   },
   Ports = 
   { 
      new ServerPort("127.0.0.1", 
         50051, 
         ServerCredentials.Insecure)
   }
};

server.Start();

Console.ReadKey();
server.ShutdownAsync().Wait();

This is pretty self-explanatory, we supply the Server with the Services and the ports, then start the server.

The Client

The client code looks like this

var channel = new Channel(
   "127.0.0.1:50051", 
   ChannelCredentials.Insecure);

var client = new MusicService.MusicServiceClient(channel);

var request = new NotesRequest
{
   Key = Note.C,
   Name = "Major"
};

var response = client.Query(request);

// output the results
foreach (var note in response.Notes)
{
   Console.WriteLine(note);
}
      
channel.ShutdownAsync().Wait();

As you can see, we create a Channel, which is the equivalent of a socket connection, passing in the host and port information.

Next we create an instance of the MusicServiceClient which was generated by protoc for us. Everything else is as you’d expect: we create our request object, call our RPC method passing in the request, and a response object is returned.
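For completeness, here’s a sketch of the same client using the async overload (this assumes we’re inside an async method):

```csharp
var channel = new Channel("127.0.0.1:50051", ChannelCredentials.Insecure);
var client = new MusicService.MusicServiceClient(channel);

// AsyncUnaryCall<NotesResponse> is awaitable, so we can await it directly
var response = await client.QueryAsync(new NotesRequest
{
   Key = Note.C,
   Name = "Major"
});

foreach (var note in response.Notes)
{
   Console.WriteLine(note);
}

await channel.ShutdownAsync();
```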

Code available here https://github.com/putridparrot/blog-projects/tree/master/ProtocolBuffers/CSharp

Using Protocol Buffers

I’ve written once before about using protocol buffers, using the protobuf-net library, but I didn’t go into any depth regarding the .proto file which is used for the IDL. Let’s rectify this.

Introduction

Protocol buffers are simply a way to define a “language-neutral, platform-neutral, extensible mechanism for serializing structured data”. What this really means is that it’s a specification (and tooling) for creating binary data. This data might exist as files or be streamed over HTTP or any other type of stream. Think of Protocol Buffers as an alternative to CSV, XML and the like, but, being binary, the resultant streams will generally be more compact than those text formats.

Proto file format

I’m not going to cover the .proto syntax in full as it’s already available at Language Guide (proto3), but as I build up an example .proto file I will cover the pieces that I add to the file as I go.

We’re going to create a .proto file which will be used to declare our messages/data. The latest syntax supported is “proto3” and we declare the version we’re using at the top of the .proto file. If you do not specify the syntax, it will default to proto2.

So first off create a file with the .proto extension – I’m doing this within Visual Studio which supports Protocol Buffer syntax highlighting etc.

To declare the supported syntax we start off by adding the following line (I’m going to use proto3 syntax in this post; there are several differences between proto3 and proto2)

syntax = "proto3";

Packages/Namespaces

The next thing we’ll add is a package which, whilst optional, is useful for code generation in your preferred language. For example, in Java this maps directly to the Java package name, and in C# and C++ it maps to the namespace of the code.

We can actually override the package/namespace name for Java and C# by using option java_package and/or option csharp_namespace instead of, or as well as, the package line. We might wish to have all three in our .proto file so the file can be used to generate code for Ruby, Go, C++ etc., with explicit definitions for Java and C#.

So let’s add a package

package music;

option java_package = "com.putridparrot.music";
option csharp_namespace = "PutridParrot.Music";

Types

Scalar types are supported, such as double, float, int32, int64 etc. along with the string type.

Enums are also supported, so let’s add an enum to our file

/*
 Note definitions, where two letters are used,
 the first denotes the # (sharp) and the second 
 the b (flat)
*/
enum Note {
   C = 0;
   CD = 1;
   D = 2;
   DE = 3;
   E = 4;
   F = 5;
   FG = 6;
   G = 7;
   GA = 8;
   A = 9;
   AB = 10;
   B = 11;
}

We need to define the possible values for the enum, and the first element must be assigned zero. This gives us a default value (hence zero should be your enum’s default value).

We’ve also added a multi-line comment using /* */ syntax, single line comments using // are also supported.

A message type can be viewed as a composite type, similar to a struct, i.e. we can combine types. So let’s create a request and a response type (the response will be used in my next post on gRPC)

message NotesRequest {
   Note key = 1;
   string name = 2;
}

message NotesResponse {
   Note key = 1;
   string name = 2;
   repeated Note notes = 3;
}

Notice the use of = 1 etc. these are field numbers and each field must have a unique field number.

As per the Google documentation, fields in the range 1 through 15 take one byte to encode, while fields 16 through 2047 take two bytes (field numbers can actually go as high as 536,870,911, should you really want that many).

Notice in the NotesResponse message we use the repeated keyword, which denotes that this field can be repeated – think of it as an array (or list) field.
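As a taste of what’s to come when we generate C# from this file, a repeated field maps to a read-only Google.Protobuf.Collections.RepeatedField&lt;T&gt; property, which we populate rather than assign:

```csharp
var response = new NotesResponse
{
   Key = Note.C,
   Name = "Major",
   // collection initializer syntax calls Notes.Add under the hood
   Notes = { Note.C, Note.E, Note.G }
};

// we can also add to the collection after construction
response.Notes.Add(Note.B);
```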

Code Generation

One of the key things XML gave developers was a specification which allowed them to write tools for generating code from data specifications. Protocol Buffers is no different and, of course, this makes the specification more usable to the developer.

The tool we use is protoc.exe. If you’re using Visual Studio/NuGet you can install Google.Protobuf.Tools via NuGet. This will then be installed to $(SolutionDir)packages\Google.Protobuf.Tools.3.6.0\tools\windows_x86 (or whichever OS you’re targeting).

Now we can run this tool from NAnt or other build tools, or as a pre-build event, i.e. selecting your project in Visual Studio, right clicking, then selecting Properties and Build Events.

Here’s an example command (formatted to make it readable)

$(SolutionDir)packages\Google.Protobuf.Tools.3.6.0\tools\windows_x86\protoc.exe 
$(ProjectDir)Proto\music.proto 
-I=$(ProjectDir)Proto 
--csharp_out=$(ProjectDir)

The first line is obviously the location of the installed protoc.exe. Next up we declare where the proto file(s) is/are. We can use a wildcard, i.e. *.proto, but if we have several different locations for the files we will probably need to run the command multiple times.

The -I= allows us to define import directories. This isn’t really needed in the example here as we’re not importing anything. Finally we declare that we want to generate C# code into the project folder.

Note: If you want to generate the code into another folder you’ll need to ensure it already exists, protoc will not create it for you.

Once run, this command will create a C# file which will include the types/messages as well as serialization/deserialization code.

If you’re using Visual Studio to create an application which uses Protocol Buffers, then you’ll need to install the nuget package Google.Protobuf to install the library that the generated source references.

Serializing/Deserializing

Let’s create a Visual Studio Console application, and in the project folder add a Proto folder (which will contain our *.proto files). Now add the two NuGet packages previously mentioned (Google.Protobuf.Tools and Google.Protobuf).

Next, create a file in the Proto folder named music.proto which should look like this

syntax = "proto3";

package music;

option java_package = "com.putridparrot.music";
option csharp_namespace = "PutridParrot.Music";

/*
 Note definitions, where two letters are used,
 the first denotes the # (sharp) and the second 
 the b (flat)
*/
enum Note {
   C = 0;
   CD = 1;
   D = 2;
   DE = 3;
   E = 4;
   F = 5;
   FG = 6;
   G = 7;
   GA = 8;
   A = 9;
   AB = 10;
   B = 11;
}

message NotesRequest {
   Note key = 1;
   string name = 2;
}

message NotesResponse {
   Note key = 1;
   string name = 2;
   repeated Note notes = 3;
}

Next, add to the Pre-Build event for the solution the command line (listed previously) to generate the C# from the .proto file.

Lastly, let’s just add the following using clauses to Program.cs

using System.IO;
using PutridParrot.Music;
using Google.Protobuf;

and here’s the code to place in Main

var request = new NotesRequest
{
   Key = Note.C,
   Name = "Major"
};

using (var w = File.Create(@"C:\Data\request.dat"))
{
   request.WriteTo(w);
}

NotesRequest request2;
using (var r = File.OpenRead(@"C:\Data\request.dat"))
{
   request2 = NotesRequest.Parser.ParseFrom(r);
}

This will create a file request.dat with the request instance data and then, if all goes well, load the contents of the file into the request2 variable and that’s all there is to it.

We can stream the object using the WriteTo and ParseFrom methods, but Protocol Buffers also supports gRPC, which we’ll look at in the next post.
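For example, we can round-trip a message in memory rather than via a file, using the ToByteArray extension method from Google.Protobuf:

```csharp
var request = new NotesRequest { Key = Note.C, Name = "Major" };

// serialize to a byte array (or use WriteTo with a MemoryStream)
byte[] bytes = request.ToByteArray();

// deserialize back; generated messages implement value equality
var copy = NotesRequest.Parser.ParseFrom(bytes);
Console.WriteLine(copy.Equals(request)); // True
```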

Code available here https://github.com/putridparrot/blog-projects/tree/master/ProtocolBuffers/CSharp