Category Archives: Azure

Azure Functions, AWS Lambda Functions, Google Cloud Functions

Some companies, whether due to regulatory requirements or a desire to avoid being locked into a single cloud vendor, look towards a multi-cloud strategy. With this in mind, this post is the first of a few showing some of the same functionality (but with different names) across the top three cloud providers: Microsoft’s Azure, Amazon’s AWS and Google Cloud.

We’re going to start with the serverless technology known as Lambda Functions in AWS (which I believe may have been the first), Azure Functions and the Google Cloud equivalent, Google Cloud Functions. In truth the three may not be 100% compatible in terms of their APIs, but they’re generally close enough that we only need to worry about the request and response and can keep the same code for the function itself. Of course, if your function uses databases or queues then you’re probably starting to get tied more to the vendor than is the intention of this post.

I’ve already covered Azure Functions in the past, but let’s revisit, so we can compare and contrast the offerings.

I’m not too interested in the code we’re going to deploy, so we’ll stick with JavaScript for each provider and just write a simple echo service, i.e. we send in a value and it responds with the value preceded by “Echo: ” (we can look at more complex stuff in subsequent posts).

Note: We’re going to use the UI/Dashboard to create our functions in this post.

Azure Functions

From Azure’s Dashboard

  • Type Function App into the search box or select it from your Dashboard if it’s visible
  • From the Function App page click the Create Function App button
  • From the Create Function App screen
    • Select your subscription and resource group OR create a new resource group
    • Supply a Function app name. This is essentially our app’s name (the Function app can hold multiple functions). The name must be unique across Azure websites
    • Select Code, as we’re going to write the functions in Azure rather than supply a container image
    • Select a runtime stack, let’s choose Node.js
    • Select the version (I’m sticking with the default)
    • Select the region, look for the region closest to you
    • Select the Operating System, I’m going to leave this as the default Windows
    • I’ve left the Hosting to the default Consumption (Serverless)
  • Click Review + create
  • If you’re happy, now click Create

Once Azure has done its stuff, we’ll have a resource (and associated resources) created for our functions.

  • Click Go to resource or type Function App into the search box and navigate there from the results
  • You should see your new function app with the Status Running etc.
  • Click on the app name and you’ll navigate to the app’s page
  • Click on the Create in Azure portal button. You could choose VS Code Desktop or set up your own editor if you prefer
  • We’re going to create an HTTP trigger, which is basically a function that starts up when an HTTP request comes in for the function, so click HTTP trigger
    • Supply a New Function name; I’m naming mine Echo
    • Leave Authorization level as Function OR set it to Anonymous for a public API. Azure’s security model for functions is nice and simple, so I’ve chosen Function for this function, but feel free to change it to suit
    • When happy with your settings, click Create

If all went well you’re now looking at the Echo function page.

  • Click Code + Test
  • The default .js code is essentially an echo service, but I’m going to change it slightly to the following
    module.exports = async function (context, req) {
      const text = (req.query.text || (req.body && req.body.text));
      context.log('Echo called with ' + text);
      const responseMessage = text
        ? "Echo: " + text
        : "Pass a POST or GET with the text to echo";
    
      context.res = {
        body: responseMessage
      };
    }
    

Let’s now test this function. The easiest way is to click the Test/Run option

  • Change the Body to
    {"text":"Scooby Doo"}
    
  • Click Run and if all went well you’ll see Echo: Scooby Doo
  • To test from our browser, let’s get the URL for our function by clicking on the Get function URL
  • The URL will be in the following format and we’ve added the query string to use with it
    https://your-function-appname.azurewebsites.net/api/Echo?code=your-function-key&text=Shaggy
    

If all went well you’ll see Echo: Shaggy and we’ve basically created our simple Azure Function.

Note: Don’t forget to delete your resources when you’ve finished testing this OR use it to create your own code

AWS Lambda

From the AWS Dashboard

  • Type Lambda into the search box
  • Click the button Create function
  • Leave the default as Author from scratch
  • Enter the function name, echo in my case
  • Leave the runtime (this should be Node), architecture etc. as the default
  • Click Create function

Once AWS has done its stuff, let’s look at the code file index.mjs and change it to

export const handler = async (event, context) => {
  console.log(JSON.stringify(event));
  // queryStringParameters is only present when a query string was supplied
  const text = event.queryStringParameters && event.queryStringParameters.text;
  const response = {
    statusCode: 200,
    body: JSON.stringify('Echo: ' + text),
  };
  return response;
};

You’ll need to Deploy the function before it uses the latest code, but you’ll find that, at this point, you’ll probably get errors using the Test option. One thing we haven’t yet done is supply a trigger.

  • Either click Add trigger or from the Configuration tab click Add trigger
  • Select API Gateway, which will add an API to create an HTTP endpoint for REST/HTTP requests
  • If you don’t have an existing API then select Create a new API
    • We’ll select HTTP API from here
    • I’m not going to create a JWT authorizer, so for Security, for now, select Open
    • Click the Add button
  • From the Configuration tab you’ll see an API endpoint; paste the endpoint URL into your browser and add the query string so it looks a bit like this

    https://end-point.amazonaws.com/default/echo?text=Scooby%20Doo
    

Note: Don’t forget to delete your functions when you’ve finished testing them OR use them to create your own code

Google Cloud Functions

From the Google Cloud dashboard

  • Type Cloud Functions into the search box
  • From the Functions page, click Create Function
  • If the Enable required APIs popup appears you’ll need to click ENABLE to ensure all APIs are enabled

From the Configuration page

  • Set the Environment if required; mine defaulted to 2nd gen, which is the latest environment
  • Supply the function name, mine’s again echo
  • Set the region to one near you
  • The default trigger is HTTPS, so we won’t need to change this
  • Just to save on having to set up authentication, let’s choose Allow unauthenticated invocations, i.e. making a public API
  • Let’s also copy the URL for now, which will be something like
    https://your-project.cloudfunctions.net/echo

  • Click the Next button

This defaulted to creating a Node.js runtime. Let’s change our code to the familiar echo code

  • The code should look like the following
    const functions = require('@google-cloud/functions-framework');

    functions.http('helloHttp', (req, res) => {
      res.send(`Echo: ${req.query.text || (req.body && req.body.text)}`);
    });

  • Click the Test button and GCP will create the container etc.

Once everything is deployed, change the test payload to

{
  "text": "Scooby Doo"
}

and click Run Test. If all went well you’ll see the Echo response in the GCF Testing tab.

Finally, when ready, click Deploy and then we can test our Cloud Function via the browser, using the previously copied URL, like this

https://your-project.cloudfunctions.net/echo?text=Scooby%20Doo

Note: Don’t forget to delete your function(s) when you’ve finished testing OR use them to create your own code

Azure Functions

Azure functions (like AWS lambdas and GCP cloud functions) allow us to write serverless code literally just as functions, i.e. no need to fire up a web application or VM. Of course, just like Azure containers, there is a server component, but we, the developers, need not concern ourselves with handling configuration etc.

Azure functions will be spun up as and when required, meaning we will only be charged when they’re used. The downside of this is they have to spin up from a “cold” state. In other words the first person to hit your function will likely incur a performance hit whilst the function is started then invoked.

The other thing to remember is that Azure functions are stateless. You might store state in a DB like Cosmos DB, but essentially a function is invoked, does something, then after a timeout period it’s shut back down.

Let’s create an example function and see how things work…

  • Create a new Azure Functions project
  • When you get to the options for the Function, select Http trigger and select the Anonymous Authorization level
  • Complete the wizard by clicking the Create button

The Authorization level allows the function to be triggered without providing a key. The HTTP trigger, as it sounds, means the function is triggered by an HTTP request.

The following is basically the code that’s created from the Azure Function template

public static class ExampleFunction
{
  [FunctionName("Example")]
  public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
  {
    log.LogInformation("HTTP trigger function processed a request.");

    string name = req.Query["name"];

    var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    var responseMessage = string.IsNullOrEmpty(name)
      ? "Pass a name in the query string or in the request body for a personalized response."
      : $"Hello, {name}. This HTTP triggered function executed successfully.";

    return new OkObjectResult(responseMessage);
  }
}

We can actually run this and debug via Visual Studio in the normal way. We’ll get a URL supplied, something like this http://localhost:7071/api/Example to access our function.

As you can see from the above code, we’ll get passed an ILogger and an HttpRequest. From this we can get query parameters, so this URL above would be used like this http://localhost:7071/api/Example?name=PutridParrot

Of course the whole purpose of the Azure Function is for it to run on Azure. To publish it…

  • From Visual Studio, right mouse click on the project and select Publish
  • For the target, select Azure. Click Next
  • Select Azure Function App (Windows) or Linux if you prefer. Click Next again
  • Either select a Function instance if one already exists or create a new instance from this wizard page

If you’re creating a new instance, select the resource group etc. as usual and then click Create when ready.

Note: I chose the Consumption plan, which is the default when creating an Azure Functions instance. This is basically a “pay only for executions of your function app” plan, so should be the cheapest.

The next step is to Finish the publish process. If all went well you’ll see everything configured and you can close the Publish dialog.

From the Azure dashboard you can simply type Function App into the search textbox and you should see the published function with a status of Running. If you click on the function name it will show you the current status of the function as well as its URL, which we can access like we did with localhost, i.e.

https://myfunctionssomewhere.azurewebsites.net/api/Example?name=PutridParrot

Blazor and the GetFromJsonAsync exception TypeError: Failed to Fetch

I have an Azure hosted web api. I also have a simple Blazor standalone application that’s meant to call the API to get a list of categories to display, i.e. the Blazor app is meant to call the Azure web api, fetch the data and display it – should be easy enough, right?

The web api can easily be accessed via a web browser or a console app using the .NET HttpClient, but the Blazor code below simply kept throwing an exception with the cryptic message “TypeError: Failed to Fetch”

@inject HttpClient Http

// Blazor and other code

protected override async Task OnInitializedAsync()
{
   try
   {
      _categories = await Http.GetFromJsonAsync<string[]>("categories");
   }
   catch (Exception e)
   {
      Debug.WriteLine(e);
   }
}

What was happening was that I was actually getting a CORS error, which sadly isn’t really reported via the exception, so it’s not exactly obvious.

If you get this error interacting with your web api via Blazor then go to the Azure dashboard. I’m running my web api as a container app, so type CORS into the left search bar of the resource (in my case a Container App) and you should see the CORS subsection of the Settings section.

Add * to the Allowed Origins and click Apply.

Now your Blazor app should be able to interact with the Azure web api app.
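
If you control the web api’s code, CORS can also be allowed from the API itself rather than via the dashboard. A minimal sketch, assuming an ASP.NET Core minimal-hosting Program.cs; the AllowBlazor policy name and the origin URL are placeholder values for your own:

var builder = WebApplication.CreateBuilder(args);

// Hypothetical policy name and origin - replace with your Blazor app's URL
builder.Services.AddCors(options =>
    options.AddPolicy("AllowBlazor", policy =>
        policy.WithOrigins("https://your-blazor-app.example.com")
              .AllowAnyHeader()
              .AllowAnyMethod()));

var app = builder.Build();

app.UseCors("AllowBlazor");

// Stand-in endpoint for the categories the Blazor app fetches
app.MapGet("/categories", () => new[] { "Fruit", "Vegetables" });

app.Run();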

Azure Container apps

Azure offers a Kubernetes solution, which we looked at in the post Running and deploying to Azure Kubernetes and also a solution called simply Azure Container Apps.

In fact Container apps are built upon Kubernetes, just think of them as a simplification layer on top of k8s.

The main difference between the Kubernetes offering and Container Apps is exactly that – simplicity. You don’t have to manage the infrastructure with container apps; they’re essentially a “serverless” solution. Container apps also have the ability to scale, not just on CPU or memory usage but also on HTTP requests and events, such as those from the Azure Service Bus. So, for example, if there are no items in the service bus queue then container apps can scale down.

Let’s create our Container App…

  • Either search for Container App in the Azure dashboard or Create a resource then from the Containers category select Container App
  • As usual select or create a resource group
  • Give your container app a name, mine’s test-container
  • Set the region etc.
  • Select the Container tab and uncheck Use quickstart image, as we’re using the Azure registry that we pushed our images to in the previous post Running and deploying to Azure Kubernetes
  • Set the Registry to your Azure registry OR the Docker registry. If you get “Cannot access ACR XXX because admin credentials on the ACR are disabled” then go to your Azure registry and select Access Keys, where you can enable Admin user – if you have to do this step you’ll probably have to start the creation process over again
  • Now select an image from your registry and the image tag
  • To expose our service we’ll now select the Ingress tab and tick Enabled. Leave Limited to Container Apps Environment checked OR, if you want to expose your app to the world, ensure Accepting traffic from anywhere is checked. Now set the Target port to whatever you want; I’m going with the standard port 80

Now click Review + create then when you’re happy with the review, click the Create button.

When completed, an Application Url should be created. We set the Ingress to Limited to Container Apps Environment, so this will not be available to the outside world.

If you have more services to add then search the Dashboard for Container Apps Environments, select the environment that was created by Azure, then select the Apps | Apps option from the left-hand navigation bar. From here we can go through the same process as above and add further apps.

Once created, select the Application Url for the container and this should now be accessible internally or via the web depending on what ingress traffic option you chose.
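
As an aside, the same container app can be created from the CLI; a rough sketch, assuming the containerapp CLI extension is installed and substituting your own resource group, environment and image names:

az containerapp create --name test-container --resource-group my-resource-group --environment my-container-env --image apptestregistry.azurecr.io/myname:ver01 --target-port 80 --ingress external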

Running and deploying to Azure Kubernetes

We’re going to be deploying our web services to k8s using Docker, so first off we need to create a registry (if you don’t already have one) on Azure.

  • Go to the Azure Dashboard and select Create a resource
  • Locate and click on Container Registry
  • Click on Create
  • Supply the Resource Group you want to use, I’m using the one I created for my previous post Creating and using the Azure Service Bus
  • Create a registry name, mine’s apptestregistry
  • Select your location and SKU, I’ve gone Basic on the SKU

Now click Review + create. Review your options and if all looks correct then click Create. Now wait for the deployment to complete and go to the resource’s dashboard, where you’ll see (at least on the current Dashboard) options to Push a container image, Deploy a container image etc.

Adding a Kubernetes service

We now need to return to the main dashboard to select Containers | Kubernetes services or just type Kubernetes services into the Dashboard search bar.

  • In Kubernetes services click Create
  • Click Create a Kubernetes cluster
  • Supply a Resource Group
  • For Cluster preset configuration choose Dev/Test for now
  • Enter a Kubernetes cluster name, mine’s testappcluster
  • Fill in the rest of the options to suit your location etc.

Now click Review + create.

Stop, don’t press Create yet. Before we click Create, go to the Integrations tab and set the Container Registry to the one we created – if you don’t do this then you’ll get 401s when trying to deploy from your registry into k8s.

Note: There is a way to create this integration between k8s and your registry later, but it is so much simpler letting the Dashboard do the work for us.
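
For reference, that after-the-fact integration can be done from the CLI with something like the following (using this post’s cluster and registry names):

az aks update -n testappcluster -g test-app-cluster --attach-acr apptestregistry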

Now, review your options and if happy all looks correct, click Create.

Note: I kept getting an error around quotas on the setup above. I found that if you reduce the autoscaling slider/values (as mine showed it would be maxed out) then this should pass the review phase.

Once the deployment is complete (and it may take a little while) we’ll need something to push to it…

Creating a simple set of microservices

  • Using Visual Studio, create a new ASP.NET Core Web API, make sure to have Docker support checked
  • Delete the WeatherForecast controller and domain object
  • Right mouse click on Controllers and select Add | Controller
  • Select an empty controller

Note: We’re going to use the Azure CLI tools, so if you’ve not got the latest, go to https://learn.microsoft.com/en-us/cli/azure/install-azure-cli-windows?tabs=azure-cli#install-or-update and install from here

Now let’s dockerize our WebApi…

Note: Replace apptestregistry with your registry name; I’m not going to include the az login etc. My service within the registry is called myname, so replace that with something meaningful. Also, ver01 is the version tag I’m assigning to the image.

  • Copy the Dockerfile for your WebApi into the solution folder of your project
  • Run az acr build -t myname:ver01 -r apptestregistry .

That’s all there is to it, repeat for each WebApi you want to deploy.

To check all went well, we’re going to use the following az command to list the repositories in the Azure registry (change the registry name to yours)

az acr repository list --name apptestregistry

Or you can go to the Azure container registry dashboard and select Repositories

Deploying to Kubernetes

At this point we’ve created our WebApi, dockerized it and deployed to the Azure registry of our choice, but to get it into k8s we need to create the k8s manifests.

We create a YAML file, for example kubernetes-manifest-myname.yaml (change myname to the name of your service, both here and in the YAML below, and don’t forget to change apptestregistry to your registry name too)

apiVersion: apps/v1 
kind: Deployment 
metadata: 
  name: myname
  labels: 
    app: myname
spec: 
  selector: 
    matchLabels: 
      app: myname 
  replicas: 1 
  template: 
    metadata: 
      labels: 
        app: myname
    spec: 
      containers: 
      - name: myname
        image: apptestregistry.azurecr.io/myname:ver01 
        resources: 
          requests: 
            cpu: 100m 
            memory: 100Mi 
          limits: 
            cpu: 200m 
            memory: 200Mi 
        ports: 
        - containerPort: 80 
--- 
apiVersion: v1 
kind: Service 
metadata: 
  name: myname
spec:
  ports: 
  - port: 80 
  selector: 
    app: myname
---

We need to set up kubectl access from Terminal/PowerShell using

az aks get-credentials -n testappcluster -g test-app-cluster
kubectl apply -f .\kubernetes-manifest-myname.yaml

We can check that the pods exist using

kubectl get pods

If you notice a Status of ImagePullBackOff then it’s possible you’ve not set up the integration of k8s and your registry; to be sure, type kubectl describe pod along with your pod name to get more information.

And finally, let’s check the status of our services using

kubectl get services

In the Azure Kubernetes services dashboard you can select Services and ingresses to view the newly added service(s).

We can see the Cluster IP address, which is an internal IP address. So, for example, if our myname service has Cluster IP 10.0.40.100 then other services deployed to the cluster can interact with the myname service via this IP.

External facing service

We’ve created a service which is hosted in a pod within k8s, but we have no external interface to it. A simple way to create an external IP for our service is by declaring a service of the load balancer type, which routes calls to the various services we’ve deployed – a sketch of such a manifest follows.
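
A minimal sketch, assuming the myname service from earlier; this is the same Service shape as before but with type LoadBalancer, so Azure provisions an external IP (the myname-external name is purely illustrative):

apiVersion: v1
kind: Service
metadata:
  name: myname-external
spec:
  type: LoadBalancer
  ports:
  - port: 80
  selector:
    app: myname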

So let’s assume we created a new WebApi with Docker support, and we added the following controller, which routes operations to a single endpoint in this case, but of course it could route to any WebApi that we’ve deployed, via its Cluster IP

[ApiController]
[Route("[controller]")]
public class MathController : Controller
{
    [HttpGet("Get")]
    public async Task<string> Get(string op)
    {
        var httpClient = new HttpClient();
        using var response = await httpClient.GetAsync($"http://10.0.40.100/myname/get?operation={op}");
        return await response.Content.ReadAsStringAsync();
    }
}

Once we have an External IP associated with this load balancer type of service, we can access it from the web, i.e. http://{external-ip}/myservice/get?op=display
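
As an aside, rather than hardcoding the Cluster IP, Kubernetes’ internal DNS lets us address a service by its name (assuming the caller and the myname service are in the same namespace), so the HttpClient call above could instead be written as

// Hypothetical: kube-dns resolves the service name "myname" within the cluster
using var response = await httpClient.GetAsync($"http://myname/myname/get?operation={op}");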

Creating and using the Azure Service Bus

The Azure Service Bus is not much different to any other service bus out there, i.e. we send messages to it and other applications or services receive those messages by pulling them off the bus or monitoring it.

Let’s set up an Azure Service bus.

We’ll use the Azure Dashboard (the instructions below are correct as per the Dashboard at the time of writing).

  • Type Service Bus into the search bar of the dashboard or locate the Service Bus from the dashboard buttons etc. if available
  • Click Create, then either give the resource group a name or select an existing one. I’ll create new, and mine’s going to be called test-app. Create a namespace (mine’s test-app-bus) and set the location, pricing tier etc. as you wish.
  • Click the Review + create button.
  • Review your settings then if you’re happy, click the Create button

If all went well, you’ll see the deployment in progress. When completed, we need to set up a queue…

  • Click the Go to resource button from the deployment page
  • Click the Queue button
  • The queue name needs to be unique within the namespace, I’ve chosen test-app-queue, although it’s more likely you’ll want to choose a name that really reflects what the purpose of the queue is, for example trades, appointments, orders are some real world names you might prefer to use
  • I’m going to leave all queue options as the default for this example
  • Click the Create button and in a few seconds the queue should be created.

In the dashboard for the Service Bus Namespace you’ll see the queues listed at the bottom of the page. This page also shows request counts, message counts etc.

We’ve not completed everything yet. We need to create a SAS policy for accessing the service bus…

  • From the Service Bus Namespace dashboard, select Entities | Queues, then select the queue to view its Service Bus Queue dashboard page
  • From here select Settings | Shared access policies
  • Click the Add button
  • We’re going to set the policy up for applications sending messages, so give the policy an appropriate name, such as SenderPolicy and ensure the Send checkbox is checked
  • Finally, click the Create button

If you now click on the policy it will show keys and connection strings. We’ll need the Primary Connection String for our test application.

Note: Obviously these keys need to be kept secure otherwise anyone could interact with your service bus queues.
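
As an aside, connection strings can be avoided altogether; the client can instead be given the fully qualified namespace and a token credential. A minimal sketch, assuming the Azure.Identity NuGet package and an identity that’s been granted an appropriate Service Bus role:

using Azure.Identity;
using Azure.Messaging.ServiceBus;

// No connection string; DefaultAzureCredential picks up a suitable identity
var serviceBusClient = new ServiceBusClient(
    "test-app-bus.servicebus.windows.net",
    new DefaultAzureCredential());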

Creating a test app to send messages

This is all well and good, but let’s now create a little C# test app to send messages to our queue.

  • From Visual Studio create a new project, we’ll just create a Console application for now
  • Add a NuGet reference to Azure.Messaging.ServiceBus

In Program.cs simply copy/paste the following

using Azure.Messaging.ServiceBus;

const string connectionString = "the-send-primary-connection-string";
const string queueName = "test-app-queue";

var serviceBusClient = new ServiceBusClient(connectionString);
var serviceBusSender = serviceBusClient.CreateSender(queueName);

try
{
    using var messageBatch = await serviceBusSender.CreateMessageBatchAsync();

    for (var i = 1; i <= 10; i++)
    {
        if (!messageBatch.TryAddMessage(new ServiceBusMessage($"Message {i}")))
        {
            throw new Exception($"The message {i} is too large to fit in the batch");
        }
    }

    await serviceBusSender.SendMessagesAsync(messageBatch);
    Console.ReadLine();
}
finally
{
    await serviceBusSender.DisposeAsync();
    await serviceBusClient.DisposeAsync();
}
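
As an aside, a batch isn’t required for one-off messages; the ServiceBusSender also exposes SendMessageAsync for sending a single message, for example

await serviceBusSender.SendMessageAsync(new ServiceBusMessage("Single message"));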

Creating a test app to receive messages

Obviously we will want to receive messages from our service bus, so let’s create another C# console application and copy/paste the following into Program.cs

using Azure.Messaging.ServiceBus;

async Task MessageHandler(ProcessMessageEventArgs args)
{
    var body = args.Message.Body.ToString();
    Console.WriteLine($"Received: {body}");
    await args.CompleteMessageAsync(args.Message);
}

Task ErrorHandler(ProcessErrorEventArgs args)
{
    Console.WriteLine(args.Exception.ToString());
    return Task.CompletedTask;
}

const string connectionString = "the-listen-primary-connection-string";
const string queueName = "test-app-queue";

var serviceBusClient = new ServiceBusClient(connectionString);
var serviceBusProcessor = serviceBusClient.CreateProcessor(queueName, new ServiceBusProcessorOptions());

try
{
    serviceBusProcessor.ProcessMessageAsync += MessageHandler;
    serviceBusProcessor.ProcessErrorAsync += ErrorHandler;

    await serviceBusProcessor.StartProcessingAsync();

    Console.ReadKey();

    await serviceBusProcessor.StopProcessingAsync();
}
finally
{
    await serviceBusProcessor.DisposeAsync();
    await serviceBusClient.DisposeAsync();
}

Before this will work we also need to go back to the Azure Dashboard, go to the Queues section and click on Shared access policies. Alongside our SenderPolicy, add a new policy; we’ll call it ListenPolicy and check the Listen checkbox. Copy its Primary Connection String into the code above.

This code will listen for messages but in some cases you may wish to just get a single message, in which case you could use this code

using Azure.Messaging.ServiceBus;

const string connectionString = "the-listen-primary-connection-string";
const string queueName = "test-app-queue";

await using var client = new ServiceBusClient(connectionString);

var receiver = client.CreateReceiver(queueName);
var message = await receiver.ReceiveMessageAsync();
var body = message.Body.ToString();

Console.WriteLine(body);

await receiver.CompleteMessageAsync(message);

Anatomy of an Azure Function

In a previous post we looked at creating Azure Functions; hopefully from this we’re able to quickly and easily get things up and running. We also looked (in another post) at the various triggers and hooks which help run our functions.

For this post I want to look into functions a little deeper.

Template code

As we’ve seen, from the Azure Portal or from Visual Studio (with the relevant extensions) we can easily generate our function code. Different function types have slightly different code patterns.

This post will concentrate on C# code, but we can generate code in JavaScript, F# and more.

Azure Portal Generated Code

The Azure Portal generates code for us, which for a C# developer is actually C# scripting code. Hence, unlike via Visual Studio, to reference external assemblies which are not included by default we use #r (requires) followed by the assembly name, for example

#r "System.Data"

It’s fairly obvious that an Azure function is literally that, a function/method. Not a class with multiple methods where you create an instance of the class and interact with it. Hence Azure functions are static and may be async, or not, as required.

Depending upon the type of function we’re creating, we may return a response, void or another object type. Again, depending on the type of function, we may have a TimerInfo or an HttpRequestMessage, and there may be other arguments, whether they’re parameters for a query, connections to Blob storage (via a Stream), queue names or whatever.

With Azure Portal functions we also have the function.json file generated for us (and in some cases a README.md file).

function.json

This file declares the bindings (using JSON, as the extension suggests) which define the inputs and outputs, as well as some configuration details, such as whether the function is disabled.
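
For example, a representative function.json for an HTTP-triggered function might look something like the following; the exact bindings depend upon the trigger type:

{
  "disabled": false,
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "function",
      "methods": [ "get", "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}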

Visual Studio Generated Code

Code from Visual Studio differs from the Portal code in that we’re creating assemblies here, not uploading scripts to the Azure Portal. Hence we add references to the project in the standard way. We don’t have a function.json file, but instead a host.json file as well as a local.settings.json file (which we’ll look into later).

Our function name (i.e. the name we see in the Portal etc.) is defined using the FunctionNameAttribute on the method. The configuration that would exist in the function.json file becomes attributes on the Run method (generated by Visual Studio). So, for example, the type of trigger is often defined as an attribute on the first argument, like this

[FunctionName("TimerTest")]
public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, TraceWriter log)
{
   log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}

and this attribute will tend to hold the configuration for the trigger; for example, this TimerTriggerAttribute has the CRON format string, an HttpTriggerAttribute might have the AuthorizationLevel used, the HTTP methods supported, the Route etc., and the BlobTriggerAttribute has connection information etc. Obviously, just like the Azure generated code, other parameters exist depending upon the type of trigger and/or params passed to the method. Also included is the TraceWriter for us to log information to.
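
As a sketch of this, an HTTP-triggered function restricted to GET with a route template might be declared as follows; the Echo name and the echo/{text} route are purely illustrative:

[FunctionName("Echo")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "echo/{text}")] HttpRequestMessage req,
    string text,
    TraceWriter log)
{
    // The {text} route parameter binds to the method's text argument
    log.Info($"Echo called with {text}");
    return req.CreateResponse(HttpStatusCode.OK, "Echo: " + text);
}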

For anything other than an HttpTrigger function, you’ll need to supply AzureWebJobsStorage details etc. in the local.settings.json file. To do this, from a Visual Studio Command Prompt in your project’s folder (where its host.json and local.settings.json files reside) run

func azure functionapp fetch-app-settings (your-app-name)

Replace your-app-name with the app you will need to have created in the Azure Portal. If you have multiple projects then you have to do this in each one’s project folder. This command will fill in connection strings, account details, encrypted keys etc.
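
The resulting local.settings.json is along these lines (a hand-written sketch here, with AzureWebJobsStorage pointed at the local storage emulator rather than real account keys):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true"
  }
}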

Note: if the func exe is not in your path etc. you can run

npm i -g azure-functions-core-tools

to install the tools.

Running/Debugging locally

Obviously, for as much development as possible, it would be best if, whilst testing, we ran our functions locally. As mentioned, to integrate your Azure Portal app settings we can run

func azure functionapp fetch-app-settings (your-app-name)

and you may have noticed that when we run via the debugger, func.exe is executed. So it also makes sense that we can run the binary representations of our functions locally, outside of the debugger.

From your project’s output folder, i.e. bin\Debug\net461 (for example), and obviously after you’ve actually built your project, you can run (via a command prompt or Visual Studio command prompt where func.exe is in your path)

func host start

This will fire up an Azure emulator, read host.json, generate the host for your functions and then run a service on localhost:7071 (in my case) to host your functions.

Azure function triggers

In a previous introductory post to Azure functions I mentioned that there were more than just the HttpTrigger.

The list of triggers currently available (at least for C#/.NET developers) is

  • HttpTrigger – triggers a function based upon an HTTP request, like a REST service call
  • HttpTriggerWithParameters – triggers a function based upon an HTTP request, with route parameters bound to method parameters
  • TimerTrigger – triggers a function based upon a CRON like time interval/setup
  • QueueTrigger – triggers when a message is added to the Azure Queue Storage
  • BlobTrigger – triggers when a blob is added to the specified container
  • EventHubTrigger – triggers when a new event is received via the event hub
  • ServiceBusQueueTrigger – triggers when a message is added to the service bus queue
  • ServiceBusTopicTrigger – triggers when a message is added to the service bus topic
  • ManualTrigger – triggers when the Run button is pressed in the Azure Portal
  • Generic Webhook – triggers when a webhook request occurs
  • GitHub Webhook – triggers when a GitHub webhook request occurs

Many of these are pretty self-explanatory, but let’s have a quick look at some of them anyway.

Note: I’m not going to cover the triggers that require other services, such as queues, event hubs, webhooks etc. in this post, as they require more than a short paragraph to set up. So I’ll look to dedicate a post to each as and when I begin using them.

HttpTrigger

The HTTP trigger is analogous to a REST-like service call, i.e. when an HTTP request is received the Azure function is called, passing in the HttpRequestMessage object (and a TraceWriter for logging).

Here’s a bare bones function example.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
   log.Info("C# HTTP trigger function processed a request.");
   return req.CreateResponse(HttpStatusCode.OK, "Hello World");
}

The HttpTrigger supports a lot of HTTP methods, including the obvious GET, POST, DELETE along with HEAD, PATCH, PUT, OPTIONS, TRACE.

Note: Everything we can do with HttpTriggerWithParameters, i.e. changing the HTTP method type, adding a route to the function.json and adding parameters to our method to decode the route parameters, can be implemented with the HttpTrigger. At the time of writing this post, I’m not wholly sure of the benefit of the HttpTriggerWithParameters over HttpTrigger.

HttpTriggerWithParameters

An HttpTriggerWithParameters trigger is similar to an HttpTrigger except when we create one via the Azure Portal, by default we get the following code

public static HttpResponseMessage Run(HttpRequestMessage req, string name, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");
    return req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}

Notice that the return is not wrapped in a Task and that the method includes the string name parameter. When we create this function via the Azure Portal the HTTP method is set to GET; GET is not the only supported method for this function, it just appears that way by default. To add HTTP methods we need to go to the function’s Integrate menu option and add the supported HTTP methods; parameters can be passed in via the Azure Portal’s Query parameters. More interestingly though, a route is automatically created for us in the associated function.json file. The route looks like this

"route": "HttpTriggerCSharp/name/{name}"

This obviously means we can navigate to the function using https://(app-name).azurewebsites.net/api/HttpTriggerCSharp/name/{name}

Let’s add an age parameter to the Query using the Add parameter option in the portal. If we change our route to look like this (in the function.json file)

"route": "HttpTriggerCSharp/name/{name}/{age}"

and the source to look like this

public static HttpResponseMessage Run(HttpRequestMessage req, string name, int? age, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Fetching the name from the path parameter in the request URL
    return req.CreateResponse(HttpStatusCode.OK, "Hello " + name + " " + age);
}

we can now navigate to the function using https://(app-name).azurewebsites.net/api/HttpTriggerCSharp/name/{name}/{age}

TimerTrigger

As the name suggests, this trigger is dependent upon a supplied schedule (using a CRON-like format). Obviously this is very useful in situations where we may run reports every day at a set time, or to periodically check some source that’s not supported as a trigger, such as a file location.

The code generated by the Azure Portal looks like this

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}
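
The schedule here is a six-field NCRONTAB expression ({second} {minute} {hour} {day} {month} {day-of-week}) rather than the classic five-field CRON format, so the generated "0 */5 * * * *" fires every five minutes. A few illustrative schedules:

0 */5 * * * *      every five minutes
0 0 9 * * 1-5      09:00 on weekdays
0 30 23 * * *      23:30 every day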

ManualTrigger

The manual trigger is a trigger which is executed via the Azure Portal’s Run button. One might view it a little like running a batch file, or simply a function which you do not want others to have access to. The code generated by the portal looks like this

public static void Run(string input, TraceWriter log)
{
    log.Info($"C# manually triggered function called with input: {input}");
}

and values are supplied as input (to the string input parameter) via the Request Body input box as plain text.

Azure page “Hmmm… Looks like something went wrong” in Edge

I just tried to access my Azure account via Microsoft Edge and logged in fine but upon clicking on Portal was met with a fairly useless message

Hmmm... Looks like something went wrong

Clicking try again did nothing.

On searching the internet I found that if you press F12 in Edge to display the developer tools, the Cookies node contained the entry UE_Html5StorageExceeded.

Running localStorage.clear() from the developer tools console window is suggested here. Sadly this failed with an undefined response.

However, Edge Crashing with HTML5 storage exceeded suggests going to Edge | Settings | Clear browsing data, clicking the Choose what to clear button and selecting Cookies and saved website data, then clicking the Clear button – this did work.

Oh, and yes, sadly you will need to log in again to the various websites that you probably left logged in permanently.

Using Azure storage emulator

For all of my posts on using Azure Storage I’ve been using my online account, but of course it’s more likely we’d normally want to test our code using an offline solution; hence it’s time to use the storage emulator.

In your App.config you need to have the following

<appSettings>
   <add key="StorageConnectionString" value="UseDevelopmentStorage=true;" />
</appSettings>

Now the following code (taken from my Azure table storage post) will use the Azure storage emulator – the UseDevelopmentStorage=true shorthand points the client at the emulator’s well-known devstoreaccount1 account on localhost (blob storage on port 10000, queues on 10001 and tables on 10002).

var storageAccount = CloudStorageAccount.Parse(
   CloudConfigurationManager.GetSetting(
   "StorageConnectionString"));

var client = storageAccount.CreateCloudTableClient();
var table = client.GetTableReference("plants");

table.CreateIfNotExists();

var p = new Plant("Fruit", "Apple")
{
   Comment = "Can be used to make cider",
   MonthsToPlant = "Jan-Mar"
};

table.Execute(TableOperation.InsertOrMerge(p));

Note: If not already installed, install Azure Storage Emulator from https://azure.microsoft.com/en-gb/downloads/

and Azure storage emulator will be installed in

C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator

Microsoft Azure Storage Explorer can be used to view the storage data on your local machine via the emulator. When you open the Development account under Local and Attached Storage Accounts this will run the emulator; however, I had several problems when running my client code.

This firstly resulted in no data in the table storage and then a 400 bad request.

So I opened a command prompt at C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator (or run “Microsoft Azure Storage Emulator” from the Windows run/search textbox to get a command prompt which already has the emulator running and offers a CLI to interact with it). From here you can check the status, start, stop and create local storage using AzureStorageEmulator.exe, as shown below. All looked fine but still didn’t work. The Azure storage emulator ultimately uses a default local MSSQL DB, so it was time to check things on the DB side.
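
For reference, the main emulator commands are

AzureStorageEmulator.exe status
AzureStorageEmulator.exe start
AzureStorageEmulator.exe stop
AzureStorageEmulator.exe init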

Using Microsoft SQL Server Management Studio I connected to (local)\MSSQLLocalDb and, under Databases, deleted AzureStorageEmulatorDb52 (of course the Azure Storage Emulator needs to have been stopped first). I was using Azure Storage Emulator 5.2; the DB name may differ depending on the version of the emulator you’re running.

From the Azure Storage Emulator, I ran

AzureStorageEmulator init

to recreate the DB (you might need to use the /forceCreate switch to reinitialize the DB). Oddly I still got an error from this command (see below)

Cannot create database 'AzureStorageEmulatorDb52' : The login already has an account under a different user name.
Changed database context to 'AzureStorageEmulatorDb52'..
One or more initialization actions have failed. Resolve these errors before attempting to run the storage emulator again.
Error: Cannot create database 'AzureStorageEmulatorDb52' : The login already has an account under a different user name.
Changed database context to 'AzureStorageEmulatorDb52'..

However, in Microsoft SQL Server Management Studio everything looked to be in place, so I re-ran my code and this time it worked.

Now we can run against the emulator, work away from the internet and/or reduce traffic, and therefore the possible costs, of accessing Azure in the cloud.

References

Use the Azure storage emulator for development and testing