
Creating a Custom Layout for Xamarin Forms

First off, there’s an excellent post on this subject at Creating a Custom Layout. My intention with this post is simply to go through my experience creating a layout that organises controls into a grid of squares, i.e. laying out buttons in a 3×3 matrix for a simple TicTacToe game board.

Getting Started

Your class needs to derive from Layout or Layout<View> and override the OnMeasure method and the LayoutChildren method.

My class will be named SquareGrid and it will require the number of rows and columns that we want to display our views in. For example, here’s the XAML I want to use (excluding namespaces and view bindings)

<SquareGrid Rows="3" Columns="3">
   <Button />
   <Button />
   <Button />
   <Button />
   <Button />
   <Button />
   <Button />
   <Button />
   <Button />
</SquareGrid>

So my expectation would be that we will have three rows and three columns of equal width/height and display buttons in each “cell” of the grid.

Note: I’m not going to handle situations where the number of controls doesn’t match the expected rows/columns etc.

Here’s the code (excluding OnMeasure and LayoutChildren for now)

public class SquareGrid : Layout<View>
{
   public static readonly BindableProperty RowsProperty =
      BindableProperty.Create("Rows",
         typeof(int),
         typeof(SquareGrid),
         0,
         propertyChanged: (bindable, oldValue, newValue) =>
            ((SquareGrid)bindable).NativeSizeChanged(),
         validateValue: Validate);

   public static readonly BindableProperty ColumnsProperty = 
      BindableProperty.Create("Columns", 
         typeof(int), 
         typeof(SquareGrid),
         0,
         propertyChanged: (bindable, oldValue, newValue) => 
            ((SquareGrid)bindable).NativeSizeChanged(),
         validateValue: Validate);

   public int Rows
   {
      get => (int)GetValue(RowsProperty);
      set => SetValue(RowsProperty, value);
   }

   public int Columns
   {
      get => (int)GetValue(ColumnsProperty);
      set => SetValue(ColumnsProperty, value);
   }

   // alert the layout of changes so it re-measures/re-lays-out
   // when Rows or Columns change
   private void NativeSizeChanged() => InvalidateLayout();

   private static bool Validate(BindableObject bindable, object value)
   {
      return (int)value >= 0;
   }
}

If you’re used to WPF then, whilst the name BindableProperty doesn’t exist in WPF (the equivalent is DependencyProperty), you’ll probably understand what’s going on. For everyone else: we’ve created two properties, Rows and Columns, that are available in XAML, with static BindableProperty fields handling the storage, binding etc. for them. The default value for both Rows and Columns is 0. If either property changes we call NativeSizeChanged to alert the layout to the change, and we also supply the Validate method to check that the supplied Rows and Columns values are within certain bounds (in this case, that they’re greater than or equal to 0).

Children

The controls/views (from the example above, the Button objects) become the Children of the layout. In some (maybe most) cases we will want to ask the children for their measurements, i.e. their minimum size requirements, and hence would use code like this

foreach (var child in Children)
{
   var childSizeRequest = child.Measure(widthConstraint, heightConstraint);
   // handle any childSizeRequest
}

In our SquareGrid we’re actually going to ignore what the control/view requires and instead force it into the available space, but this gives you an example of how you might handle measurements from the children in the next section’s OnMeasure method.

OnMeasure

Here’s the method signature

protected override SizeRequest OnMeasure(
   double widthConstraint, double heightConstraint)
{
}

The OnMeasure method may be called, depending upon where our SquareGrid is placed and depending upon constraints of any outer layout. For example, if the SquareGrid is within a Grid.Row and that Row height is “*” then OnMeasure is not called. OnMeasure is called when the outer layout is asking “how much space do you require?”. In the case of “*”, we can think of it more like “this is how much space you’ve got”.

In cases where OnMeasure is called, the widthConstraint or heightConstraint might be set to infinity. For example, if the SquareGrid is within a StackLayout, the StackLayout, in portrait orientation, will not constrain the height, hence heightConstraint will be set to infinity. Likewise, in landscape orientation the widthConstraint will be set to infinity. Therefore, when you are calculating the SizeRequest to return from OnMeasure, you will need to handle infinity situations.

This SquareGrid will ignore the child controls’ measurement requirements and instead will take all the available width or height to create a square of layout space. Hence, in a scenario where this layout is within a Grid with “Auto” sizing, the SquareGrid will just say it requires an equal width and height based upon the minimum of the two.

Here’s the code

protected override SizeRequest OnMeasure(
   double widthConstraint, 
   double heightConstraint)
{
   var w = double.IsInfinity(widthConstraint) ? 
      double.MaxValue : widthConstraint;
   var h = double.IsInfinity(heightConstraint) ? 
      double.MaxValue : heightConstraint;

   var square = Math.Min(w, h);
   return new SizeRequest(new Size(square, square));
}

LayoutChildren

So OnMeasure is called when the parent wants to ask how much space we require, and LayoutChildren (as the name suggests) is called when the layout is asked to lay out its children, with x, y, width and height supplied as the bounding rectangle within which it should place them. Here’s a simple example of the code I’m using

protected override void LayoutChildren(
   double x, double y, 
   double width, double height)
{
   var square = Math.Min(width / Columns, height / Rows);

   var startX = x + (width - square * Columns) / 2;
   var startY = y;

   var rect = new Rectangle(startX, startY, square, square);
   var c = 0;
   foreach (var child in Children)
   {
      LayoutChildIntoBoundingRegion(child, rect);

      if (child.IsVisible)
      {
         rect.X += square;
         if (++c >= Columns)
         {
            rect.Y += rect.Height;
            rect.X = startX;
            c = 0;
         }
      }
   }
}

Notice we use LayoutChildIntoBoundingRegion, which ultimately calls child.Layout but applies margins etc. for us.
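To give a feel for what that saves us, here’s a rough sketch (simplified; not the actual Xamarin.Forms implementation, which also handles the alignment options and re-measuring) of the margin handling LayoutChildIntoBoundingRegion gives us for free:

```csharp
// simplified sketch only - shrink the region by the child's margin
// before positioning the child within it
static void LayoutIntoRegion(View child, Rectangle region)
{
   var margin = child.Margin;
   region.X += margin.Left;
   region.Y += margin.Top;
   region.Width -= margin.HorizontalThickness;
   region.Height -= margin.VerticalThickness;
   child.Layout(region);
}
```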

Adventures in UWP – Globalization & Localization

In a previous post I created a “default” UWP blank application, which did nothing. But we did look at some of the key parts of the template code, manifest etc.

Usually I would go the “Hello World” route and just use a TextBlock or similar to display the text “Hello World”, so let’s do that here and then look to make this post a little more useful by taking a look at globalization & localization.

Obviously if we were to actually deploy an application to the Windows store it would be good to have it available in different languages etc.

I’m not intending to spend too long on this as it’s quite a large subject in and of itself, but let’s look at the basics.

In MainPage.xaml, add a TextBlock within the Grid (here’s the code including the Grid)

<Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">
   <TextBlock Text="Hello World" 
      HorizontalAlignment="Center" 
      VerticalAlignment="Center"/>
</Grid>

In the above code we’ve effectively hard coded “Hello World” into our application.

Globalization & Localization

Internationalization (also known as i18n) is the combination of Globalization & Localization.

Globalization tends to mean the implementation of code within your application which can change automatically based upon the locale, so for example formatting of numbers or dates.

Localization tends to mean the changes to an application for a different locale, such as translation etc.

Note: I covered some i18n coding for WPF previously in my post A quick look at WPF Localization Extensions.

Let’s localize our app.

Okay, we don’t have a lot of text to translate (or images or the likes), so this is very much an overview of the process we’d undertake on our application. Let’s change our “Hello World” into a format that can be localized.

To do this we use the x:Uid attribute on our TextBlock, for example

<TextBlock x:Uid="helloWorld" 
   HorizontalAlignment="Center" 
   VerticalAlignment="Center"/>

Note: In another post I intend to cover accessibility, such as the narrator. In some instances we might wish to add AutomationProperties.Name text to a control, and we’ll also want that to be localized; so whilst x:Uid=”helloWorld” might be used for a control, in the .resw file we’d have the name/key helloWorld.AutomationProperties.Name and put our localized text there.

We don’t actually need to have the Text property, but you might find having something displayed at design time useful and so you can just as easily leave the Text=”Hello World” markup and it’ll be overwritten by our localization.

The x:Uid differs from the x:Name markup as it’s aimed specifically for localization.

Now we need to create our resources. In our project we simply create a folder (at the same level as our Assets) with the name of the locale we want to store resources for; for example we can create en for general English, or en-GB, en-CA, en-US for the British, Canadian and US variants of English.

See Supported Languages for a list of language codes.

With our folder in place we simply add a new item of type Resources file (.resw) and this gives us a simple grid entry screen for key/value pairs along with a comment for passing on to our translators (for example).

Within the grid, the key (the column is actually headed Name) is our x:Uid in the format uid.PropertyName, where the property name for our TextBlock is Text. Therefore, remember that if the UI element changes from a TextBlock to a Button (for example) this PropertyName would also need to change from Text to Content.
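Under the covers a .resw is just XML, so the entry behind that grid row would look something like this (the French value is purely my own illustration):

```xml
<data name="helloWorld.Text" xml:space="preserve">
  <value>Bonjour le monde</value>
</data>
```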

We can (and should) also store string representations of colours, such as “Red” etc. as values for those elements where the colour may need to change for the different locale.

Testing our localization

So let’s assume we’ve created several folders and localized our application strings etc. How do we test the different localizations without having to change our Windows setup?

There’s actually a Default language option in the app’s manifest, but changing this doesn’t appear to have the desired effect.

A simple way to change our locale is within the App constructor, simply place the following

ApplicationLanguages.PrimaryLanguageOverride = "fr-FR";

We could also have an option within the application for switching languages, in which case we’d need to reload the UI, but I’m not going to cover that within this post.

Using the Multilingual App Toolkit

An alternate route to translation is using the MAT (Multilingual App Toolkit) which can be used in WPF or UWP.

The Multilingual app toolkit 4.0 for VS 2015 or the Multilingual app toolkit for VS 2017 give us another way to handle translations/localization.

MAT seems a little tricky to get working, but it’s not too bad once you know what’s required (of course that statement can be used for most things).

  1. Select Tools | Multilingual App Toolkit and Enable Selection
  2. Depending upon your manifest’s Default Language (mine’s en-GB) you’ll need a folder of the same name, i.e. en-GB with a Resources.resw file in it, before you can add translations via MAT. As the default language, the keys and default strings will be taken from this file. So add our helloWorld.Text name and “Hello” as the value to this resource file.
  3. You should build the project here just to make sure all is working
  4. Now you can right mouse click on the project and select Multilingual App Toolkit and now Add translation languages will be enabled, from here you can select multiple languages.
  5. Let’s select fr-FR for some French translation. This will create another folder, fr-FR, as you’d expect, along with a Resources.resw for this translation. Do not edit any new translation files; only edit the default language file.
  6. Along with the Resources.resw, a MultilingualResources folder will appear containing .xlf files matching the appname.language format, i.e. HelloWorld.fr-FR.xlf. It is the XLF file into which we translate our text.

Whilst we can edit our XLF in Visual Studio as it’s just an XML file, we can also double click on it from File Explorer to load it via the Multilingual Editor. This editor could, of course, be used by a translator to go through each resource and supply the translation. Once saved, MAT will again sync, but this time it will automatically supply the strings and identifiers to the fr-FR Resources.resw (in my case).

Each time we finish adding new strings to our default locale (in my case the en-GB Resources.resw), rebuild to get MAT to resync, then translate when you’re ready.

In most cases we’d wait until all (or most) of our application is ready before sending the XLF to translators of course, but for us to test things, just translate and rebuild to resync.

Pseudo language with MAT

Whilst the MAT includes many locales that you’ll recognise, it also includes a Pseudo language which is generated from your default language strings but with alterations to the string, for example taking “OK” and creating a longer string with semi-random characters as well as the O and K, so it’s still readable in English (my default language).

To get translations in the pseudo language, open the *.qps-ploc.xlf file and for each string you wish to translate click the Translate button or for all strings, click the Translate button’s drop down and select Translate All. This will then create translations for testing your layouts etc.

Using resource strings in the manifest, for the Display Name

In some cases we might wish to localize the Display name, i.e. Title of the main Window.

From the manifest we can reference our resource string using

ms-resource:myTitle

where myTitle is a key/name within the Resources.
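For example, the relevant fragment of Package.appxmanifest might look something like this (a sketch with illustrative attribute values; the important part is the ms-resource: prefix on DisplayName):

```xml
<uap:VisualElements
   DisplayName="ms-resource:myTitle"
   BackgroundColor="transparent" />
```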

We can (sort of) handle this in code using

ApplicationView.GetForCurrentView().Title = "My Title";

but actually this doesn’t replace the Display Name from the manifest but instead prepends the text.

Finally, using our resource in code

Whilst we obviously want to do most of our UI work in XAML, we still need to store strings etc. for use in code, for example when displaying a message box or the likes.

To use the resources in code we can declare the strings within the .resw. Interestingly (I’m not sure if this is just me or simply how things work), for strings that will be used in code the name/key does not include the .PropertyName format. Hence “hello.Text” fails to work when used in code, whereas “hello” as a key does work.

Here’s the code that we can use to get the string from the default resources

var rl = ResourceLoader.GetForCurrentView();
var r = rl.GetString("hello");
// r will now be our translated string

References

Globalization and localization
Put UI strings into resources

Xamarin Forms TabbedPage

The TabbedPage is, as you’d probably expect, a page that hosts ContentPage and NavigationPage elements and displays “tabs” for selecting between those pages.

You cannot have a ContentPage hosting a tab control within it without using a third party control or writing your own.

Let’s create a TabbedPage based application

Let’s take a look at implementing a TabbedPage based UI in Visual Studio 2017 (the Visual Studio for Mac project template creates a tabbed page application by default).

  • In Visual Studio 2017, File | New | Project
  • In the search box, type mobile (or select Cross-Platform | Mobile App (Xamarin.Forms))
  • Give your project a name, mine’s TabbedPageExample
  • Select Blank App, I’m aiming to write for all three listed platforms and so keep all three checked, I then select .NET Standard (or Shared Project if you prefer)
  • Build the project just to get NuGet packages updated etc.

We’ve got our project created and, by default in Visual Studio 2017, you’ll have a MainPage.xaml which is a ContentPage; we’re going to change this to a TabbedPage

  • Open the MainPage.xaml and change ContentPage to TabbedPage
  • Before we leave MainPage.xaml remove the StackLayout and everything within it, we won’t be needing this
  • Open the MainPage.xaml.cs and change ContentPage to TabbedPage here also
  • Right mouse click on the shared project (the one not suffixed with a platform name) and select Add | New Item. Select the Xamarin.Forms item on the left, then Content Page (note: not the Content Page (C#), as we’re just going to be writing XAML here). Do this two more times as these will form our three tab pages. Mine are named ProjectsPage, HistoryPage and AboutPage.
  • Back in MainPage.xaml, add the following, within the TabbedPage element (i.e. where StackLayout used to reside)
    <local:ProjectsPage />
    <local:HistoryPage />
    <local:AboutPage />
    

    Note: For those not used to XAML etc. the local text is an xmlns (namespace), automatically added to our XAML files.

  • Now in each XAML file (ProjectsPage.xaml, HistoryPage.xaml etc.) just change the Label to display the same name as the page (just so we can see the page change when we click on a tab). Also add a Title=”” attribute to each ContentPage and inside the “” put the tab name, i.e. Projects, History, About

If you build and run this now (I’m using the Visual Studio Emulator for Android to view things on Windows), you should see – on Android and UWP – three tabs with the labels at the top of the screen; if you run this against an iOS emulator/simulator the tabs will be at the bottom of the screen with very small text. This is because Android and UWP, by default, show just text, whereas iOS by default shows text and an image, so the small text size accommodates the image.

Platform specific XAML

We’re trying to keep all our shared code, whether these are services, other libraries or UI, in the shared project. Xamarin.Forms allows us to conditionally add XAML based upon the platform, i.e.

<ContentPage.Icon>
   <OnPlatform x:TypeArguments="FileImageSource">
      <On Platform="iOS" Value="history.png"/>
   </OnPlatform>
</ContentPage.Icon>

For each ContentPage we add the above as a child of the ContentPage element. The x:TypeArguments refers to the type of the objects supplied as each On element’s Value, i.e. in this case they’re image file locations.

As this adds platform-specific conditions to the XAML, we would then create the various .png images, not within the shared code but instead within the TabbedPageExample.iOS project, in the Resources folder, marked as BundleResource.

Each image file should come in three sizes, so let’s suppose we create a file about.png (for the About tab). This should be 30 x 30 pixels in size. Then the next file should be about@2x.png at 60 x 60 pixels, and finally the last file should be about@3x.png at 90 x 90 pixels in size.

Once you’ve created a group of three images per tab (as outlined), ensured their Build Action is BundleResource and checked that the XAML correctly references each of the smallest images, you should find that when you run the app on iOS, text and an image are displayed on each tab, whilst Android and UWP show solely text. By only including the images in the iOS project you’re not wasting space in the projects that do not use them.

Code

Code is available on github.

Errors within gRPC

Whilst we’d love our code to work perfectly, we also need to handle errors that might come from our server code.

We might simply wrap any responses in a Result class (or similar) which returns data along with a success or failure status code (or of course include a status code in the response object).
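As an illustration (this wrapper is my own sketch, not part of gRPC), such a Result class might be as simple as:

```csharp
// hypothetical Result wrapper - the exact status/error shape is up to you
public class Result<T>
{
   public bool Success { get; set; }
   public T Data { get; set; }
   public string Error { get; set; }
}
```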

C#, Java etc. also use exceptions, so we might prefer to throw exceptions on the server.

gRPC supports exceptions but with some limitations…

Exceptions come with status codes

Any exception thrown on the server will result in an RpcException being thrown on the client – these do NOT reflect the exception that was originally thrown, i.e. if your server code threw an ArgumentException the client will only ever see an RpcException with a StatusCode of Unknown.

As you can see, an RpcException includes a status code, so obviously our server can instead throw an RpcException and supply the Status and StatusCode along with it, for example

throw new RpcException(
   new Status(
      StatusCode.FailedPrecondition, 
      $"Argument {nameof(request)} is invalid"));

Of course, handling such exceptions is as simple as

try
{
   var response = client.Query(new NotesRequest());
}
catch(RpcException e) 
{
   Debug.WriteLine(e.Status.Detail);
   Debug.WriteLine(e.Status.StatusCode);
}

A little more gRPC

In my last two posts I looked at using .proto files to define the IDL, from which protoc.exe along with the gRPC plugin generated our preferred language’s data classes and method calls – in our case this meant generating C# files.

Now let’s look at how we can extend our code a little further, for example to include metadata/headers for sending tokens or other state between the client and server.

What was generated for us?

When we generated our code using the gRPC plugin, we started with the IDL method written as

service MusicService {
   rpc Query(NotesRequest) returns (NotesResponse) {
   }
}

However several overloads for this method were generated (I’ve cleaned them up to remove superfluous namespaces etc. and to make them easier to read in a blog post)

public virtual NotesResponse Query(
   NotesRequest request, 
   Metadata headers = null, 
   DateTime? deadline = null, 
   CancellationToken cancellationToken = 
      default(CancellationToken))
{
   return Query(request, 
      new CallOptions(
         headers, 
         deadline, 
         cancellationToken));
}

public virtual NotesResponse Query(
   NotesRequest request, 
   CallOptions options)
{
   return CallInvoker.BlockingUnaryCall(
       __Method_Query, 
       null, 
       options, 
       request);
}

public virtual AsyncUnaryCall<NotesResponse> QueryAsync(
   NotesRequest request, 
   Metadata headers = null, 
   DateTime? deadline = null, 
   CancellationToken cancellationToken = 
       default(CancellationToken))
{
   return QueryAsync(
       request, 
       new CallOptions(
           headers, 
           deadline, 
           cancellationToken));
}

public virtual AsyncUnaryCall<NotesResponse> QueryAsync(
   NotesRequest request, 
   CallOptions options)
{
   return CallInvoker.AsyncUnaryCall(
      __Method_Query, 
      null, 
      options, 
      request);
}

We’ve got several overloads, including async versions, and there’s support for passing metadata headers, a deadline (similar to a timeout) as well as a cancellation token.

Metadata

We might use the Metadata argument to pass in SSO tokens or other relevant information that does not form part of the actual message; here’s an example of using the Query method in such a way

// client code
var response = client.Query(request, new Metadata
{
   new Metadata.Entry("SSO", token)
});

// server 
var token = 
   context?.RequestHeaders?.FirstOrDefault(e => e.Key == "sso");

Note: Beware, for some reason, the key has been turned to lower case.

In the server method we implemented in the last post you’ll notice that along with the request there’s the ServerCallContext type which contains all the other parameters that might be sent via the client, i.e. headers, cancellation token etc.

The server can return metadata using the ServerCallContext’s ResponseTrailers. However the client must use the async versions of the client methods to receive these extra bits of data.

Here’s an example of returning something via ResponseTrailers, from the server

context.ResponseTrailers.Add(new Metadata.Entry("SSO", "Success"));

and the client would change to use the QueryAsync overload and possibly look like this

var response = client.QueryAsync(request, new Metadata
{
   new Metadata.Entry("SSO", "abcdefg")
});

foreach (var note in response.ResponseAsync.Result.Notes)
{
   Console.WriteLine(note);
}

var rt = response
   .GetTrailers()
   .FirstOrDefault(e => e.Key == "sso")
   .Value;

CallOptions

Ultimately the methods that take Metadata and other arguments end up wrapping those arguments in a CallOptions struct. However CallOptions also supports CallCredentials as well as a couple of other types (WriteOptions and ContextPropagationToken which I will not be looking at in this post).
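As a sketch of building a CallOptions directly (the client, request and token variables are assumed from the earlier examples), we might combine headers, a deadline and a cancellation token like this:

```csharp
var cts = new CancellationTokenSource();

var options = new CallOptions(
   headers: new Metadata { { "SSO", token } },
   deadline: DateTime.UtcNow.AddSeconds(5),
   cancellationToken: cts.Token);

try
{
   var response = client.Query(request, options);
}
catch (RpcException e) when (e.StatusCode == StatusCode.DeadlineExceeded)
{
   // the call did not complete within the deadline
}
```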

Using gRPC with Protocol Buffers

In the last post, Using Protocol Buffers, we looked at creating .proto files and how we generate C# code from them along with how to stream the binary data created by the generated code.

Let’s now look at how we might like to use Protocol Buffers “over the wire”.

Whilst we could write our own sockets code and stream data that way, there’s a close relationship between gRPC and Protocol Buffers which allows us to generate RPC code using the .proto file.

Let’s begin by adding a service and a method to the .proto file, so add the following

service MusicService {
   rpc Query(NotesRequest) returns (NotesResponse) {
   }
}

In the last post, I mentioned the NotesResponse was designed for use in the remoting code. Here it’s our return type.

To generate the gRPC code we need to make some additions to our previously defined Pre-Build event. Before we can do that, we need some more tools installed. So, using NuGet, install the Grpc.Tools package; while you’re at it, if you’re working with the VS project previously defined, also add the Grpc package.

Now, append the following to the Pre-Build command (formatted to be a little more readable)

--grpc_out $(ProjectDir) 
--plugin=protoc-gen-grpc=$(SolutionDir)packages\Grpc.Tools.1.12.0\tools\windows_x86\grpc_csharp_plugin.exe

In my case, rebuilding my VS solution will result in a new file MusicGrpc.cs which I’ll need to include in the project.

If you’ve created a Console application already, this can act as the server, so you’ll need to create another Console application to be our client. I won’t go through all the steps for adding the files etc. but let’s just jump straight into looking at the server code.

The Server

Add a new class (mine’s MusicServer) and derive it from the gRPC-generated MusicService.MusicServiceBase, like this

using System.Threading.Tasks;
using Grpc.Core;
using PutridParrot.Music;

namespace Server
{
    public class MusicServer : MusicService.MusicServiceBase
    {
        public override Task<NotesResponse> Query(
           NotesRequest request, 
           ServerCallContext context)
        {
            if (request.Key == Note.C)
            {
                return Task.FromResult(new NotesResponse
                {
                    Name = request.Name,
                    Key = request.Key,
                    Notes =
                    {
                        Note.C, Note.E, Note.G
                    }
                });
            }
            return base.Query(request, context);
        }
    }
}

Obviously the functionality here is rather limited, but you get the idea, the Query method was generated for us by protoc, and we simply supply our implementation.

To run up the server, we change our Main method to look like this

var server = new Grpc.Core.Server
{
   Services = 
   {
      MusicService.BindService(new MusicServer())
   },
   Ports = 
   { 
      new ServerPort("127.0.0.1", 
         50051, 
         ServerCredentials.Insecure)
   }
};

server.Start();

Console.ReadKey();
server.ShutdownAsync().Wait();

This is pretty self-explanatory, we supply the Server with the Services and the ports, then start the server.

The Client

The client code looks like this

var channel = new Channel(
   "127.0.0.1:50051", 
   ChannelCredentials.Insecure);

var client = new MusicService.MusicServiceClient(channel);

var request = new NotesRequest
{
   Key = Note.C,
   Name = "Major"
};

var response = client.Query(request);

// output the results
foreach (var note in response.Notes)
{
   Console.WriteLine(note);
}
      
channel.ShutdownAsync().Wait();

As you can see, we create a Channel, which is the equivalent of a socket connection, passing in the host and port information.

Next we create an instance of the MusicServiceClient which was generated by protoc for us. Everything else is as you’d expect: we create our request object, call our rpc method passing in the request, and a response object is returned.

Code available here https://github.com/putridparrot/blog-projects/tree/master/ProtocolBuffers/CSharp

Using Protocol Buffers

I’ve written once before about using protocol buffers, using the protobuf-net library, but I didn’t go into any depth regarding the .proto file which is used for the IDL. Let’s rectify this.

Introduction

Protocol buffers are simply “a language-neutral, platform-neutral, extensible mechanism for serializing structured data”. What this really means is that it’s a specification (and tooling) for creating binary data. This data might exist as files or be streamed over HTTP or any other type of stream. Think of Protocol Buffers as akin to CSV, XML or the likes but, being binary, the resultant streams will generally be more compact than those other formats.

Proto file format

I’m not going to cover the .proto syntax in full as it’s already available at Language Guide (proto3), but as I build up an example .proto file I will cover the pieces that I add to the file as I go.

We’re going to create a .proto file which will be used to declare our messages/data. Currently the latest supported syntax is “proto3” and we declare the version we’re using in our .proto file. If you do not specify the syntax, it will default to proto2.

So first off create a file with the .proto extension – I’m doing this within Visual Studio which supports Protocol Buffer syntax highlighting etc.

To declare the supported syntax we start off by adding the following line (I’m going to use proto3 syntax in this post; there are several differences between proto3 and proto2)

syntax = "proto3";

Packages/Namespaces

The next thing we’ll add is a package which, whilst optional, is useful for code generation in your preferred language. For example, in Java this maps directly to the Java package name, and in C# and C++ it maps to the namespace of the code.

We can override the package/namespace name for Java and C# by using option java_package and/or option csharp_namespace instead of, or as well as, the package line. Obviously we might wish to have all three in our .proto file so that it can be used to generate code for Ruby, Go, C++ etc. as well as having explicit definitions for Java and C#.

So let’s add a package

package music;

option java_package = "com.putridparrot.music";
option csharp_namespace = "PutridParrot.Music";

Types

Scalar types are supported, such as double, float, int32, int64 etc. along with the string type.

Enums are also supported, so let’s add an enum to our file

/*
 Note definitions, where two letters are used,
 the first denotes the # (sharp) and the second 
 the b (flat)
*/
enum Note {
   C = 0;
   CD = 1;
   D = 2;
   DE = 3;
   E = 4;
   F = 5;
   FG = 6;
   G = 7;
   GA = 8;
   A = 9;
   AB = 10;
   B = 11;
}

We need to define the possible values for the enum and these must include a zero element. This gives us a default value (hence zero should be your enum’s default value).

We’ve also added a multi-line comment using /* */ syntax, single line comments using // are also supported.

A message type can be viewed as a composite type, such as structs, i.e. we can combine types, so let’s create a request and response type (the response will be used in my next post on gRPC)

message NotesRequest {
   Note key = 1;
   string name = 2;
}

message NotesResponse {
   Note key = 1;
   string name = 2;
   repeated Note notes = 3;
}

Notice the use of = 1 etc. These are field numbers, and each field within a message must have a unique field number.

As per the Google documentation, field numbers 1 through 15 take one byte to encode, whilst 16 through 2047 take two bytes, so it’s worth keeping the 1–15 range for your most frequently used fields (field numbers can actually go as high as 536,870,911, excluding the reserved 19000–19999 range).
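One thing worth knowing about field numbers: if you later delete a field, its number shouldn’t be reused by a subsequent edit. Here’s a hypothetical sketch (the Example message and tempo field aren’t part of this post’s files) showing proto3’s reserved keyword, which makes protoc enforce this for you:

```protobuf
syntax = "proto3";

message Example {
   // keep the single-byte field numbers (1-15) for
   // the most frequently populated fields
   int32 id = 1;
   string name = 2;

   // numbers and names of deleted fields can be reserved
   // so they cannot accidentally be reused later
   reserved 3, 4;
   reserved "tempo";
}
```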

Notice in the NotesResponse message we use the repeated keyword, which denotes that the field can occur multiple times; think of it like an array (or list) field.

Code Generation

One of the key things XML gave developers was a specification that allowed tools to be written to generate code from data specifications. Protocol Buffers is no different and, of course, this makes the specification far more useful to the developer.

The tool we use is protoc.exe. If you’re using Visual Studio/nuget you can install Google.Protobuf.Tools via nuget. This will then be installed to $(SolutionDir)packages\Google.Protobuf.Tools.3.6.0\tools\windows_x86 (or whichever OS you’re targeting).

Now we can run this tool from nant or other build tools, or as a pre-build event, i.e. selecting your project in Visual Studio, right-clicking, then selecting Properties and Build Events.

Here’s an example command (formatted to make it readable)

$(SolutionDir)packages\Google.Protobuf.Tools.3.6.0\tools\windows_x86\protoc.exe 
$(ProjectDir)Proto\music.proto 
-I=$(ProjectDir)Proto 
--csharp_out=$(ProjectDir)

The first line is obviously the location of the installed protoc.exe. Next up we declare where the .proto file(s) are. We can use a wildcard, i.e. *.proto, but if we have several different locations for the files we will probably need to run the command multiple times.

The -I= allows us to define import directories. This isn’t really needed in the example here as we’re not importing anything. Finally we declare that we want to generate C# code into the project folder.
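The -I option really comes into play when one .proto file imports another. As a hypothetical sketch (common.proto and the Tag message aren’t files from this post), if our .proto contained an import, protoc would resolve it against the -I directories:

```protobuf
syntax = "proto3";

// protoc searches the -I (import) directories for this file
import "common.proto";

message TaggedNote {
   // Tag is assumed to be defined in common.proto,
   // within a package named "common"
   common.Tag tag = 1;
}
```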

Note: If you want to generate the code into another folder you’ll need to ensure it already exists, protoc will not create it for you.

Once run, this command will create a C# file which will include the types/messages as well as serialization/deserialization code.

If you’re using Visual Studio to create an application which uses Protocol Buffers, then you’ll need to install the nuget package Google.Protobuf to install the library that the generated source references.

Serializing/Deserializing

Let’s create a Visual Studio Console application and, in the project folder, add a Proto folder (which will contain our *.proto files). Now add the two nuget packages previously mentioned, Google.Protobuf.Tools and Google.Protobuf.

Next, create a file in the Proto folder named music.proto which should look like this

syntax = "proto3";

package music;

option java_package = "com.putridparrot.music";
option csharp_namespace = "PutridParrot.Music";

/*
 Note definitions, where two letters are used,
 the first denotes the # (sharp) and the second 
 the b (flat)
*/
enum Note {
   C = 0;
   CD = 1;
   D = 2;
   DE = 3;
   E = 4;
   F = 5;
   FG = 6;
   G = 7;
   GA = 8;
   A = 9;
   AB = 10;
   B = 11;
}

message NotesRequest {
   Note key = 1;
   string name = 2;
}

message NotesResponse {
   Note key = 1;
   string name = 2;
   repeated Note notes = 3;
}

Next, add the command line (listed previously) to the project’s Pre-Build event to generate the C# from the .proto file.

Lastly, let’s add the following using directives to Program.cs

using System.IO;
using PutridParrot.Music;
using Google.Protobuf;

and here’s the code to place in Main

var request = new NotesRequest
{
   Key = Note.C,
   Name = "Major"
};

using (var w = File.Create(@"C:\Data\request.dat"))
{
   request.WriteTo(w);
}

NotesRequest request2;
using (var r = File.OpenRead(@"C:\Data\request.dat"))
{
   request2 = NotesRequest.Parser.ParseFrom(r);
}

This will create a file, request.dat, containing the request instance’s data and then, all being well, load the contents of the file back into the request2 variable. That’s all there is to it.

We can stream the object using the WriteTo and ParseFrom methods, but Protocol Buffers also supports gRPC, which we’ll look at in the next post.
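Incidentally, as well as streaming, the Google.Protobuf library lets us serialize to and from an in-memory byte array; here’s a minimal sketch using the generated NotesRequest type:

```csharp
var request = new NotesRequest
{
   Key = Note.C,
   Name = "Major"
};

// serialize the message to a byte array
byte[] bytes = request.ToByteArray();

// and parse it back into a new instance
var copy = NotesRequest.Parser.ParseFrom(bytes);
```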

Code available here https://github.com/putridparrot/blog-projects/tree/master/ProtocolBuffers/CSharp

It’s been a long time JDBC…

As I’m working in Java again, I’m having to reacquaint myself with various Java language features and libraries.

It’s been a long time since I’ve had to use JDBC, but here’s a snippet of code to demonstrate accessing an Oracle data source located via LDAP.

// import java.sql.*;
// import java.util.Properties;

Properties connectionProps = new Properties();
connectionProps.put("user", "YOUR_USER_NAME");
connectionProps.put("password", "YOUR_PASSWORD");

// try-with-resources ensures the connection, statement and
// result set are closed, even if an exception is thrown
try (Connection conn = DriverManager.getConnection(
        "jdbc:oracle:thin:@ldap://SOME_URL:SOME_PORT/,cn=OracleContext,dc=putridparrot,dc=com",
        connectionProps);
     Statement statement = conn.createStatement();
     ResultSet rs = statement.executeQuery("select id, blob from SOME_TABLE")) {

   while (rs.next()) {
      String id = rs.getString(1);
      String blob = rs.getString(2);
      // do something with the results
   }
}

Where to store your application data?

Actually, I don’t intend to answer the question “Where to store your application data?” outright, because the answer depends on your requirements; instead this post looks at some of the options available and will hopefully shed some light on what best suits your application.

Application data can be separated into two types, user specific data and application specific data (or if you prefer “All User” data).

Obviously multiple users might have access to a single machine and an application may be available to all users of that machine, hence we need a way to store settings specific to each user, for example user preferences. However, we may also have application specific data, maybe a list of URLs the application stores; in essence, global settings.

Let’s look at some of our options for storing data…

Within your application’s folder

Obviously we could simply store configuration data etc. alongside the application; one way to locate this folder is as follows

var folder = Path.GetDirectoryName(
   Assembly.GetExecutingAssembly().Location) +
   Path.DirectorySeparatorChar + 
   SettingsFileName;

Here we might store one file for application specific data, then create a per-user file by incorporating the username of the user logged into the machine (available via Environment.UserName).

This is a simple solution and in some cases more than adequate, plus it has the benefit that if we delete the application (i.e. it wasn’t installed via an installer) then we delete any configuration files.
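To make the per-user idea concrete, here’s a minimal sketch (the app.settings and .settings file names are just illustrative choices, not part of this post):

```csharp
// locate the application's folder
var appFolder = Path.GetDirectoryName(
   Assembly.GetExecutingAssembly().Location);

// application specific (all users) settings in one well-known file
var appSettings = Path.Combine(appFolder, "app.settings");

// per-user settings file named after the logged-in user
var userSettings = Path.Combine(appFolder,
   $"{Environment.UserName}.settings");
```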

Program Data

The ProgramData folder is on the system drive and is hidden by default (see C:\ProgramData). As can be inferred from the name, it’s generally used for settings specific to the application itself, i.e. not based upon specific users on a machine.

We can access it using the following code

var folder = Path.Combine(
   Environment.GetFolderPath(
      Environment.SpecialFolder.CommonApplicationData),
      "YourAppName");

Interestingly, you can get to ProgramData via C:\Users\AllUsers in File Explorer; although File Explorer will state that you are in C:\Users\AllUsers, it’s the same folder as C:\ProgramData.

Program Data can also be located using the environment variable %programdata%.

User Data

So we’ve seen that we can combine Environment.UserName with a file name to create per-user files, but Windows already has dedicated locations for user data. Plus, depending on how your OS is set up, this data may follow the user to any machine they log into (known as Roaming).

The default location for the following “User Data” folders is under the location C:\Users\<username>\AppData

Local

The Local folder can be located using the special folder LocalApplicationData, for example

var folder = Path.Combine(
   Environment.GetFolderPath(
      Environment.SpecialFolder.LocalApplicationData),
      "YourAppName");

and is also available via the environment variable %localappdata%.

This location contains data that cannot be stored in the Roaming folder, for example data that’s specific to the machine the user is logged into or that is too large to store in a synchronized roaming folder (i.e. where Roaming folders are synchronized with a server).

Roaming

As hinted at in the section on the Local folder, the Roaming folder can be synchronized with a server, i.e. this is a profile which is accessible from other machines that a user logs into on the same domain. Hence anything stored here will “follow” the user around, which is very useful for preferences, favourites etc. However, the space available may be limited depending upon quota settings or other space limitations.

To access this folder we simply use the ApplicationData special folder or environment variable %appdata%, for example

var folder = Path.Combine(
   Environment.GetFolderPath(
      Environment.SpecialFolder.ApplicationData),
      "YourAppName");

LocalLow

The LocalLow folder is basically a restricted version of the Local folder. The data is not synchronized with a server, hence it does not move from the machine it’s created on, and it has a lower level of access.

When I say “restricted” or “lower level of access”, this basically means that the application being run has security constraints placed upon it.

The LocalLow folder does not have an entry within the SpecialFolder enumeration, so to access the folder you need to use the following (adapted from Thomas Levesque’s answer on StackOverflow – https://stackoverflow.com/questions/4494290/detect-the-location-of-appdata-locallow)

[DllImport("shell32.dll")]
static extern int SHGetKnownFolderPath(
   [MarshalAs(UnmanagedType.LPStruct)] Guid rfid, 
   uint dwFlags, 
   IntPtr hToken, 
   out IntPtr pszPath);

// KNOWNFOLDERID for AppData\LocalLow, i.e. FOLDERID_LocalAppDataLow
static readonly Guid localLowFolderGuid =
   new Guid("A520A1A4-1780-4FF6-BD18-167343C5AF16");

public static string GetFolderLocalLow()
{
   var pszPath = IntPtr.Zero;
   try
   {
      var hr = SHGetKnownFolderPath(localLowFolderGuid, 0, IntPtr.Zero, out pszPath);
      if (hr < 0)
      {
         throw Marshal.GetExceptionForHR(hr);
      }
      return Marshal.PtrToStringAuto(pszPath);
   }
   finally
   {
      if (pszPath != IntPtr.Zero)
      {
         Marshal.FreeCoTaskMem(pszPath);
      }
   }
}

Accessing locations via environment variables

In a few places I’ve shown the environment variable for each of the locations mentioned. We can also use these variables to locate the folders, for example

var location = 
   Environment.ExpandEnvironmentVariables("%AppData%");

This will return the Roaming folder location. What’s nice is that this static method will also expand environment variables combined with file locations (as you’d probably expect), so for example

var location = 
   Environment.ExpandEnvironmentVariables(@"%AppData%\MyApp");

This will return a path along the lines of C:\Users\&lt;username&gt;\AppData\Roaming\MyApp

Source added to https://github.com/putridparrot/blog-projects/tree/master/FileLocations