Category Archives: Testing

SpecFlow/Gherkin tags

We’re going to take a look at tags.

We add tags to our features and scenarios like this, using @ to prefix a name

@Calculator
Scenario: Calculate two values
# Given/When/Then steps

We can have multiple tags for a scenario; just separate them with spaces, like this

@Calculator @Math
Scenario: Calculate two values
# Given/When/Then steps

Great, so what use do I have for tags?

Tags can be used to create documentation, they can be used for start up and clean up code, and they can be used within test runners to run groups of tests via their category which, you guessed it, is denoted by the tag. For example

// run tests on anything tagged Math
dotnet test MyTests.dll --filter Category=Math

// to run tests with both Calculator and Math tags
dotnet test MyTests.dll --filter "Category=Calculator & Category=Math"

// to run tests with either Calculator or Math tags
dotnet test MyTests.dll --filter "Category=Calculator | Category=Math"
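As mentioned, tags can also be used to scope start up and clean up code to specific scenarios; as a minimal sketch (the class and method names here are just illustrative), a tag-filtered hook might look like this

[Binding]
public class MathHooks
{
  // runs only before scenarios tagged with @Math
  [BeforeScenario("Math")]
  public void BeforeMathScenario()
  {
    // start up code specific to @Math scenarios
  }
}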

Converting data to a Table in SpecFlow

The use case for this is: I have a step, Set fields on view, which takes a table like this

Scenario: With table
  Given Set fields on view 
    | A  | B  | C  |
    | a1 | b1 | c1 |
    | a2 | b2 | c2 |      
    | a3 | b3 | c3 |                            

The code for this step currently just outputs the data passed to it to the console or log file, so it looks like this

[Given(@"Set fields on view")]
public void GivenFieldsOnView(Table table)
{
   table.Log();
}

Now in some cases I want to set fields on a view using Examples. In the current case we're sending multiple rows in one go, but in some situations we may want to set fields one row at a time, so we use Examples like this

Scenario Outline: Convert To Table
  Given Set fields on view A=<A>, B=<B>, C=<C>

  Examples:
    | A  | B  | C  |
    | a1 | b1 | c1 |
    | a2 | b2 | c2 |      
    | a3 | b3 | c3 |            

We’ll then need a new step that looks like this

[Given(@"Set fields on view (.*)")]
public void GivenFieldsOnView(string s)
{
}

Of course we can now split the variable s by comma, then split by = to get our key/value pairs, just like a Table, and there's nothing wrong with this approach, but an alternative is to have this transformation as a StepArgumentTransformation. So our code above would change to

[Given(@"Set fields on view (.*)")]
public void GivenFieldsOnView2(Table table)
{
   table.Log();
}

and now in our hook class we’d have something like this

[Binding]
public class StepTransformer
{
  [StepArgumentTransformation]
  public Table TransformToTable(string input)
  {
    var inputs = input.Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
    var d = inputs.Select(s =>
      s.Split(new[] { '=' }, StringSplitOptions.RemoveEmptyEntries))
        .ToDictionary(v => v[0], v => v[1]);

    // this only handles a single row 
    var table = new Table(d.Keys.ToArray());
    table.AddRow(d.Values.ToArray());
    return table;
  }
}

Note: This is just an example with no error handling, and it will only convert a string to a single row; it's just a demo at this point.

So, what we've now done is create a transformer which understands a string syntax such as K1=V1, K2=V2… and can convert it to a Table for us.
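If we did want to handle multiple rows, one purely hypothetical option (not from the original post) is to also pick a row separator, say a semicolon, so a string such as A=a1, B=b1; A=a2, B=b2 becomes a two row table. The body of TransformToTable might then look something like this, assuming every row supplies the same keys in the same order

[StepArgumentTransformation]
public Table TransformToTable(string input)
{
  // split into rows on ';', then each row into key=value pairs on ','
  var rows = input
    .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
    .Select(row => row
      .Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries)
      .Select(pair => pair.Split(new[] { '=' }, StringSplitOptions.RemoveEmptyEntries))
      .ToDictionary(kv => kv[0].Trim(), kv => kv[1].Trim()))
    .ToList();

  // header taken from the first row; assumes all rows share the same keys
  var table = new Table(rows[0].Keys.ToArray());
  foreach (var row in rows)
  {
    table.AddRow(row.Values.ToArray());
  }
  return table;
}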

I know you're probably asking: why could we not just execute the same code in the public void GivenFieldsOnView(string s) method ourselves? You could of course do that, but now you've got a generic sort of method which makes such transformations for you.

What I really wanted to try to do is use a single step to handle this by changing the regular expression, i.e. we have one method for both situations. Sadly I’ve not yet found a way to achieve this, but at least we can reduce the code to just handle the data as tables.

Ordering of our SpecFlow hooks

In my post Running code when a feature or scenario starts in SpecFlow I showed we can use hooks to run before the feature and scenario. However, what if we have, for example, a lot of separate scenario hooks and the order they run in matters? Maybe the logging of the scenario title should run first.

The BeforeScenario attribute has an Order property which we can assign a number to, like this

[BeforeScenario(Order = 1)]
public static void BeforeScenario(ScenarioContext scenarioContext)
{
  Debug.WriteLine($"Scenario starting: {scenarioContext.ScenarioInfo.Title}");
}

This hook will run before other BeforeScenario hooks, including those with no Order property.

Beware: if you set [AfterScenario(Order = 1)] it would also run first, which you might not want in a logging situation. The only solution I've found thus far is to have an Order property on all AfterScenario attributes, i.e. explicitly state the order of all such hooks.
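For example, a sketch (the method names are just illustrative) where every AfterScenario hook has an explicit Order so that the logging hook runs last

// lower Order values run first, so the clean up runs before the logging
[AfterScenario(Order = 1)]
public static void AfterScenarioCleanup(ScenarioContext scenarioContext)
{
  // clean up that should run before the logging below
}

[AfterScenario(Order = 2)]
public static void AfterScenarioLogging(ScenarioContext scenarioContext)
{
  Debug.WriteLine($"Scenario ending: {scenarioContext.ScenarioInfo.Title}");
}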

Generic steps using regular expressions within SpecFlow

The use case I have is as follows.

Using SpecFlow for running UI automation tests, we have lots of views off of the main application window. We might write scenarios specific to each view, such as

Given Trader view, set fields
Given Middle Office view, set fields
Given Sales view, set fields

and these in turn are written as the following steps

[Given(@"Trader view, set fields")]
public void GivenTraderViewSetFields()
{
  // set the fields on the trader view
}

[Given(@"Middle Office view, set fields")]
public void GivenMiddleOfficeViewSetFields()
{
  // set the fields on the middle office view
}

[Given(@"Sales view, set fields")]
public void GivenSalesViewSetFields()
{
  // set the fields on the sales view
}

Obviously this is fine, but if all our views had the same automation steps to set fields, then the code within each would be almost exactly the same, so we might prefer to rewrite the step code to be more generic

[Given(@"(.*) view, set fields")]
public void GivenViewSetFields(string viewName)
{
  // find the view and set the fields using same automation steps
}

This is great, we've reduced our step code, but the (.*) accepts any value, which means that if we have a view which doesn't support the same steps to set fields, this might confuse the person writing the test code. So we can change the (.*) to restrict the view names like this

[Given(@"(Trader|Middle Office|Sales) view, set fields")]
public void GivenViewSetFields(string viewName)
{
  // find the view and set the fields using same automation steps
}

Now if you add a new view like the step below, your SpecFlow plugin will highlight it as not having a matching step and if you run the test you’ll get the “No matching step definition found for one or more steps.” error.

Given Admin view, set fields

We can of course write a step like the following code, and now the test works

[Given(@"Admin view, set fields")]
public void GivenAdminViewSetFields()
{
}

But this looks different in the code highlighting via the SpecFlow extension to our IDE, and also, what if the Admin and Settings views can both use the same automation steps? Then we're back to creating steps per view again.

Yes, we could reuse the actual UI automation code, but I also want to reduce the number of steps to a minimum. SpecFlow allows what we might think of as regular expression overrides, so let's change the above to now look like this

[Given(@"(Admin|Settings) view, set fields")]
public void GivenAdminViewSetFields(string viewName)
{
}

Obviously we cannot have the same method name with the same arguments in the same class, but from the Feature/Scenario design perspective it now appears that we’re writing steps for the same method whereas in fact each step is routed to the method that understands how to automate that specific view.

This form of regular expression override also means we might have the method for Trader, Middle Office and Sales in one step definition file and the Admin, Settings method in another step definition file, making the separation more obvious (and of course allowing us to then use the same method name).
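For example, a sketch of that layout (the class and file names here are just illustrative) might be

// e.g. TradingViewSteps.cs
[Binding]
public class TradingViewSteps
{
  [Given(@"(Trader|Middle Office|Sales) view, set fields")]
  public void GivenViewSetFields(string viewName)
  {
    // automation shared by the Trader, Middle Office and Sales views
  }
}

// e.g. AdminViewSteps.cs
[Binding]
public class AdminViewSteps
{
  [Given(@"(Admin|Settings) view, set fields")]
  public void GivenViewSetFields(string viewName)
  {
    // automation shared by the Admin and Settings views
  }
}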

What's also very cool about using this type of expression is that the SpecFlow IDE plugins will show, via autocomplete, that you have "Admin view, set fields", "Trader view, set fields" etc. steps.

Transforming inputs in SpecFlow

One really useful option in SpecFlow, which maybe most people will never need, is the ability to intercept input and potentially transform it. The example I have is: we want to include tokens wrapped in { }, where the token can then be replaced by a "real" value. Let's assume {configuration.XXX} means get the value XXX from the configuration file (App.config).

With this use case in mind, let’s assume we have a feature that looks like this

Feature: Transformer Tests

  Scenario: Transform a number
    Given The value {configuration.someValue1}

  Scenario: Transform a table
    Given The values 
    | One                        | Two                        |
    | {configuration.someValue1} | {configuration.someValue2} |

We're going to hard code our transformer to simply change configuration.someValue1 to Value1 and configuration.someValue2 to Value2, but of course the real data could be read from the App.config or generated in some more useful way.

Our step file for the above feature file, might look like this

[Given(@"The value (.*)")]
public void GivenTheValue(string s)
{
  s.Should().Be("Value1");
}

[Given(@"The values")]
public void GivenTheValues(Table table)
{
  foreach (var row in table.Rows)
  {
    var i = 1;
    foreach (var rowValue in row.Values)
    {
      rowValue.Should().Be($"Value{i}");
      i++;
    }
  }
}

We create a class, usually stored in the Hooks folder, which looks like the following, where we include methods with the StepArgumentTransformation attribute, meaning these methods are called for the given type to do something with the step argument.

[Binding]
public class StepTransformer
{
  private string SimpleTransformer(string s)
  {
    switch (s)
    {
      case "{configuration.someValue1}":
        return "Value1";
      case "{configuration.someValue2}":
        return "Value2";
      default:
        return s;
    }
  }
        
  [StepArgumentTransformation]
  public string Transform(string input)
  {
    return SimpleTransformer(input);
  }
        
  [StepArgumentTransformation]
  public Table Transform(Table input)
  {
    foreach (var row in input.Rows)
    {
      foreach (var kv in row)
      {
        row[kv.Key] = SimpleTransformer(kv.Value);
      }
    }
    return input;
  }
}

This is a very simple example of such code; in the real world we'd create some more advanced transformer code and inject it via the constructor, but the basic Transform methods would look much like they do here.
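For example, a sketch of that (ConfigurationTokenTransformer is a hypothetical class, supplied to the binding via SpecFlow's context injection) might look like this

public class ConfigurationTokenTransformer
{
  public string Transform(string input)
  {
    // in the real world we'd parse the {configuration.XXX} token and read the
    // value from App.config; here it's left as a placeholder
    return input;
  }
}

[Binding]
public class StepTransformer
{
  private readonly ConfigurationTokenTransformer _transformer;

  public StepTransformer(ConfigurationTokenTransformer transformer)
  {
    _transformer = transformer;
  }

  [StepArgumentTransformation]
  public string Transform(string input) => _transformer.Transform(input);
}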

Running code when a feature or scenario starts in SpecFlow

One of the useful capabilities within SpecFlow is the option to define hooks which run when a feature or scenario starts and ends.

An obvious use of these is to log information about the feature or scenario – we’ll just use Debug.WriteLine to demonstrate this.

All we need to do is create a binding and then supply static methods marked with the BeforeFeature and AfterFeature attributes, which might look like this

[Binding]
public class FeatureHook
{
  [BeforeFeature]
  public static void BeforeFeature(FeatureContext featureContext)
  {
    Debug.WriteLine($"Feature starting: {featureContext.FeatureInfo.Title}");
  }

  [AfterFeature]
  public static void AfterFeature(FeatureContext featureContext)
  {
    Debug.WriteLine($"Feature ending: {featureContext.FeatureInfo.Title}");
  }
}

As you can probably guess, we can do the same with the BeforeScenario and AfterScenario attributes like this

[Binding]
public class ScenarioHook
{
  [BeforeScenario]
  public static void BeforeScenario(ScenarioContext scenarioContext)
  {
    Debug.WriteLine($"Scenario starting: {scenarioContext.ScenarioInfo.Title}");
  }

  [AfterScenario]
  public static void AfterScenario(ScenarioContext scenarioContext)
  {
    Debug.WriteLine($"Scenario ending: {scenarioContext.ScenarioInfo.Title}");
  }
}

Creating a SpecFlow code generation plugin

Over the last few months I've been helping develop an Appium/WinAppDriver framework (or library if you prefer) to help move one of my client's applications away from CodedUI (support for which is being phased out).

One of the problems I had is that I wanted to be able to put text into the Windows clipboard and paste it into various edit controls. The pasting made things slightly quicker but also, more importantly, bypassed some controls' autocomplete which occasionally messes things up when using SendKeys.

The problem is that the clipboard uses OLE and NUnit complains that it needs to be running in an STA.

To fix this, we can take advantage of the fact that the SpecFlow generated classes are partial, so we can create a new class within another file with the same class name and namespace (marked as partial) and then add the required NUnit attribute as below

[NUnit.Framework.Apartment(System.Threading.ApartmentState.STA)]
public partial class CalculatorFeature
{
}

This is all well and good. It's hardly a big deal, but we have hundreds of classes and it's just more hassle remembering to do this each time; or you run the tests and when they hit a Paste from the clipboard they fail, wasting a lot of time before you're reminded that you need to create the partial class.

SpecFlow has the ability to intercept parameters passed into our steps, and it has tags and hooks which allow you to customize behaviour; it also comes with a way to interact with the code generation process, which we can use to solve this issue and generate the required attributes in the generated code.

The easiest way to get started is to follow the steps below, although the code I'm presenting here is available on my PutridParrot.SpecFlow.NUnitSta GitHub repo, so you can bypass this post and go straight to the code for this article if you prefer.

  • Grab the zip or clone the repo to get the following GenerateOnlyPlugin
  • You may want to update SampleGeneratorPlugin.csproj to
    <TargetFrameworks>net472;netcoreapp3.1</TargetFrameworks>
    

    I would keep to these two frameworks to begin with, as we have a few other places to change and it’s better to get something working before we start messing with the rest of the build properties etc.

  • In the build folder edit the .targets file to reflect the Core and !Core frameworks, i.e.
    <_SampleGeneratorPluginFramework Condition=" '$(MSBuildRuntimeType)' == 'Core'">netcoreapp3.1</_SampleGeneratorPluginFramework>
    <_SampleGeneratorPluginFramework Condition=" '$(MSBuildRuntimeType)' != 'Core'">net472</_SampleGeneratorPluginFramework>
    

    Again I’ve left _SampleGeneratorPluginFramework for now as I just want to get things working.

  • Next go to the .nuspec file and change the file location at the bottom of this file, i.e.
    <file src="bin\$config$\net472\PutridParrot.SpecFlow.NUnitSta.*" target="build\net472"/>
    <file src="bin\$config$\netcoreapp3.1\PutridParrot.SpecFlow.NUnitSta.dll" target="build\netcoreapp3.1"/>
    <file src="bin\$config$\netcoreapp3.1\PutridParrot.SpecFlow.NUnitSta.pdb" target="build\netcoreapp3.1"/>
    
  • Now go and change the name of the solution, the project names and the assembly namespace if you'd like to. Of course this also means changing PutridParrot.SpecFlow.NUnitSta in the above to your assembly name.

At this point you have a SpecFlow plugin which does nothing, but it builds to a NuGet package, and in your application you can set up your nuget.config with a local package source pointing to the build output from this plugin. Something like

<packageSources>
  <add key="nuget.org" value="https://api.nuget.org/v3/index.json" protocolVersion="3" />
  <add key="Local" value="F:\Dev\PutridParrot.SpecFlow.NUnitSta\PutridParrot.SpecFlow.NUnitSta\bin\Debug"/>
</packageSources>

Obviously change the Local package repository value to the location of your package builds.

Now, if you like, you can create a default SpecFlow library using the default SpecFlow templates. Ensure the SpecFlow.NUnit version is compatible with your plugin (so you may need to update the packages from the template ones). Finally just add the package you built and then build your SpecFlow test library.

If all went well, nothing will have happened, or more importantly, no errors will be displayed.

Back to our plugin; thankfully I found some code on GitHub that demonstrated adding global:: to the NUnit attributes, so my thanks to joebuschmann.

  • Create yourself a class, mine’s NUnit3StaGeneratorProvider
  • This should implement the interface IUnitTestGeneratorProvider
  • The constructor should look like this (along with the readonly field)
    private readonly IUnitTestGeneratorProvider _unitTestGeneratorProvider;
    
    public NUnit3StaGeneratorProvider(CodeDomHelper codeDomHelper)
    {
       _unitTestGeneratorProvider = new NUnit3TestGeneratorProvider(codeDomHelper);
    }
    
  • We’re only interested in the SetTestClass, which looks like this
    public void SetTestClass(TestClassGenerationContext generationContext, string featureTitle, string featureDescription)
    {
       _unitTestGeneratorProvider.SetTestClass(generationContext, featureTitle, featureDescription);
    
       var codeFieldReference = new CodeFieldReferenceExpression(
          new CodeTypeReferenceExpression(typeof(ApartmentState)), "STA");
    
       var codeAttributeDeclaration =
          new CodeAttributeDeclaration("NUnit.Framework.Apartment", new CodeAttributeArgument(codeFieldReference));
       generationContext.TestClass.CustomAttributes.Add(codeAttributeDeclaration);
    }
    

    All the other methods simply delegate to the _unitTestGeneratorProvider method that matches their method name.

  • We cannot actually use this until we register our new class with the generator plugins, so in your plugin class (mine's NUnitStaPlugin), change the Initialize method to look like this
    public void Initialize(GeneratorPluginEvents generatorPluginEvents, 
       GeneratorPluginParameters generatorPluginParameters,
       UnitTestProviderConfiguration unitTestProviderConfiguration)
    {
       generatorPluginEvents.CustomizeDependencies += (sender, args) =>
       {
          args.ObjectContainer
             .RegisterTypeAs<NUnit3StaGeneratorProvider, IUnitTestGeneratorProvider>();
       };
    }
    
  • If you changed the name of your plugin class, ensure the assembly:GeneratorPlugin attribute reflects the new name; the solution will fail to build if you haven't updated this anyway.
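    For reference, that assembly-level attribute looks something like this (assuming the plugin class is NUnitStaPlugin in the PutridParrot.SpecFlow.NUnitSta namespace)

    using TechTalk.SpecFlow.Generator.Plugins;

    // registers the generator plugin with SpecFlow; the type must be your plugin class
    [assembly: GeneratorPlugin(typeof(PutridParrot.SpecFlow.NUnitSta.NUnitStaPlugin))]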

Once all these things are completed, you can build your NuGet package again. It might be worth incrementing the version and then updating it in your SpecFlow test class library. Rebuild that and the generated .feature.cs files should now have classes with the [NUnit.Framework.Apartment(System.Threading.ApartmentState.STA)] attribute. No more writing partial classes by hand.

This plugin is very simple; all we're really doing is creating a minimal implementation of IUnitTestGeneratorProvider which just adds an attribute to a test class, and we register this with the GeneratorPluginEvents. There's a lot more we could potentially do.

Whilst this was very simple in the end, the main issues I had previously were that we need to either copy/clone or handle the build props and targets, as well as use a .nuspec, to get our project packaged in a way that works with SpecFlow.

Unit testing your MAUI project

Note: I found this didn’t work correctly on Visual Studio for Mac, I’ll update the post further if I do get it working.

This post is pretty much a duplicate of Adding xUnit Test to your .NET MAUI Project but just simplified for me to quickly repeat the steps which allow me to write unit tests against my MAUI project code.

So, you want to unit test some code in your MAUI project. It’s not quite as simple as just creating a test project then referencing the MAUI project. Here are the steps to create an NUnit test project with .NET 7 (as my MAUI project has been updated to .NET 7).

  • Add a new NUnit Test Project to the solution via right mouse click on the solution, Add | Project and select NUnit Test Project
  • Open the MAUI project (csproj) file and prepend the net7.0 to the TargetFrameworks so it looks like this
    <TargetFrameworks>net7.0;net7.0-android;net7.0-ios;net7.0-maccatalyst</TargetFrameworks>
    
  • Replace <OutputType>Exe</OutputType> with the following
    <OutputType Condition="'$(TargetFramework)' != 'net7.0'">Exe</OutputType>
    

    Yes, this is TargetFramework singular. You may need to reload the project.

  • Now in your unit test project you can reference this MAUI project and write your tests

So, why did we carry out these steps?

Our test project was targeting .NET 7.0 but our MAUI project was targeting the platform-specific implementations of .NET 7, i.e. those for Android etc. We need the MAUI project to also build a plain .NET 7.0 compatible assembly, hence we added net7.0 to the TargetFrameworks.

The change to add the OutputType ensures that we only build an EXE output for those other frameworks, and therefore for .NET 7.0 we’ll have a DLL to reference instead in our tests.

Now we can build and run our unit tests.
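As a quick illustration (MainPageViewModel is a hypothetical class in the MAUI project, so swap in one of your own types), a test might look something like this

using NUnit.Framework;

public class MainPageViewModelTests
{
  [Test]
  public void Increment_ShouldIncreaseCount()
  {
    // MainPageViewModel comes from the referenced MAUI project
    var viewModel = new MainPageViewModel();

    viewModel.Increment();

    Assert.That(viewModel.Count, Is.EqualTo(1));
  }
}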

Mocking – JustMock Lite, NSubstitute, FakeItEasy and Moq

I have been working on several projects with different teams, each having their preferred mocking framework (as well as many other differences in choice of tech). This post is not meant as a comparison of popularity, preference or even functionality; it's really more of a reminder to myself of how to use each framework for the core mocking requirements, so that when switching between teams/projects I have a little reminder of the different syntax etc.

Note: This post is not complete or particularly comprehensive and I'll update it further as I need to use other features of the various libraries, but I just wanted to get this published before I forget about it. Also, there may be better or alternative ways to do things in the different frameworks; for the purpose of this post I'm only really interested in showing the bare minimum changes to switch between frameworks.

Please note, Moq version 4.20 has introduced SponsorLink, which appears to send data to some third party. See the discussions on GitHub.

What are we mocking?

Before we get into the details of the Mock frameworks, let’s see what we’re mocking

public interface IDataProvider
{
  IList<string> GetIds(string filter);
  event EventHandler Updates;
}

So let's assume this is an interface to some data provider; whether it's remote or local or whatever the implementation is makes no difference, but the premise is we get some data using GetIds – maybe a list of trades in a trading application. The event tells us there have been updates to the data (this is a rather simplistic example as we don't tell the consumer what the updates are, but you get the idea hopefully).

We'll be passing the IDataProvider into a class which will act like a repository and will be used to get the data specific to that repository; the exact implementation isn't really important to the specifics of this post, but a rough sketch follows for context.
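Something along these lines (TradeRepository is purely illustrative, not the client code)

// a hypothetical system under test which wraps IDataProvider
public class TradeRepository
{
  private readonly IDataProvider _provider;

  public TradeRepository(IDataProvider provider)
  {
    _provider = provider;
  }

  // exposes the ids (trades) returned by the provider
  public IList<string> Trades => _provider.GetIds("/trades");
}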

Arrange, Act and Assert (AAA)

AAA is a pattern used within unit tests to

  • Arrange – initialize code including setting up any requirements for the system under test, i.e. create instances of objects to be tested, initialize values of the system under test etc.
  • Act – this is really just the action of using the code that we’re testing
  • Assert – this is the stage where we assert or verify results, i.e. did the system under test return the expected data/value

In terms of mocking frameworks, these should also follow and allow a similar flow when used. What we're after is a way to create instances of our interfaces (or in some cases classes) without having to write real implementations.

Arrange

To arrange our mocks (or unit tests in general) we aim to create the component parts of a test and set up what happens when the methods on them are called. In the case of using a mocking framework, we'll use the framework to create instances of interfaces (or actual objects in some cases), and as the methods on a mocked object don't really have implementations behind them, we will want to declare what happens when code tries to call our mocked methods.

JustMock

var provider = Mock.Create<IDataProvider>();

Mock.Arrange(() => provider.GetIds("/trades")).Returns(tradeData);

Moq

var provider = new Mock<IDataProvider>();
// or, to create the mocked instance directly (use Mock.Get(provider) to configure it)
var provider = Mock.Of<IDataProvider>();

provider.Setup(instance => 
   instance.GetIds("/trades")).Returns(tradeData);

NSubstitute

var provider = Substitute.For<IDataProvider>();

provider.GetIds("/trades").Returns(tradeData);

FakeItEasy

var provider = A.Fake<IDataProvider>();

A.CallTo(() => provider.GetIds("/trades")).Returns(tradeData);

Act

The Act phase is the use of our mock objects as if we're calling the actual implementation, hence in our code we may call

provider.GetIds(route);

See also the section on events to see the mocking frameworks raising events.

Assert

In the unit tests we would now assert the results of our system under test; this might take the form of an NUnit assert, such as

Assert.AreEqual(3, systemUnderTest.Trades.Count);

This isn’t really part of the mock, as by this time hopefully our mock objects and methods have been called but we might wish to verify that the calls to the mocked methods actually happened.

Verify

In cases where you wish to assert that a mock has been called correctly, i.e. with the correct arguments or that it's been called N times, then we need to assert this. Assertions may be quite complex; for example we might want to ensure that a mock was only called with specific arguments, was called once or many times, or maybe we don't care about the arguments at all.

Let’s look at a fairly simple example where we want to ensure our mocked method is called once and once only

Note: JustMock includes the expectations of call occurrences etc. when arranging the mock, hence we just verify, or in JustMock Assert, that all the previous Arrange definitions have been met.

JustMock

Mock.Assert(provider);

Moq

provider.Verify(instance => instance.GetIds("/trades"), Times.Once());

NSubstitute

provider.Received(1).GetIds("/trades");

FakeItEasy

A.CallTo(() => provider.GetIds("/trades")).MustHaveHappenedOnceExactly();

What about exceptions?

In some cases we want to mimic the mocked code throwing an exception; we do this as part of the Arrange phase in all of the frameworks

JustMock

Mock.Arrange(() => provider.GetIds("/trades"))
   .Throws<ArgumentNullException>();

Moq

provider.Setup(instance => instance.GetIds("/trades"))
   .Throws<ArgumentNullException>();

NSubstitute

provider.GetIds("/trades").Returns(x => throw new ArgumentNullException());

FakeItEasy

A.CallTo(() => provider.GetIds("/trades")).Throws<ArgumentNullException>();

What about events?

Of course C# has an event model which we would also want to mock, so let's see how each framework allows us to simulate events. The mock framework will already have supplied the event code, hence what we really want to do is cause an event to occur. This would therefore be part of the Act phase

JustMock

Mock.Raise(() => provider.Updates += null, null, null);

Moq

provider.Raise(instance => 
   instance.Updates += null, EventArgs.Empty);

NSubstitute

provider.Updates += Raise.Event();

FakeItEasy

provider.Updates += Raise.WithEmpty();

Code

Code is available on GitHub

The Gherkin language

Gherkin is a DSL used within BDD development. It's used along with Cucumber, which processes the DSL, or in the case of .NET we can use tools such as SpecFlow (which I posted about a long time back, see Starting out with SpecFlow) to help generate files and code.

Gherkin allows us to create the equivalent of use cases in a human readable form, using a simple set of keywords and syntax which can then be used to generate a series of method calls to undertake some action and assertion.

Getting started

We can use a standard text editor to create Gherkin’s feature files, but if you prefer syntax highlighting and intellisense (although this language we’re using is pretty simple) then install SpecFlow into Visual Studio or add a Gherkin syntax highlighter into VSCode (for example) or just use your preferred text editor.

I'm going to create a feature file (with the .feature extension) using the SpecFlow item template, so we have a starting point which we can then work through. Here's the generated file

Feature: Add a project
	In order to avoid silly mistakes
	As a math idiot
	I want to be told the sum of two numbers

@mytag
Scenario: Add two numbers
	Given I have entered 50 into the calculator
	And I have entered 70 into the calculator
	When I press add
	Then the result should be 120 on the screen

What we have here is a Feature which is meant to describe a single piece of functionality within an application. The feature name should be on the same line as the Feature: keyword.

In this example, we’ve defined a feature which indicates our application will have some way to add a project within our application. We can now add an optional description, which is exactly what the SpecFlow template did for us. The Description may span multiple lines (as can be seen above with the lines under the Feature line being the description) and should be a brief explanation of the specific feature or use case. Whilst the aim is to be brief, it should include acceptance criteria and any relevant information such as user permissions, roles or rules around the feature.

Let’s change our feature text to be a little meaningful to our use case.

Feature: Add a project

	Any user should be able to create/add a new project
	as long as the following rules are met

	1. No duplicate project names can exist
	2. No empty project names should be allowed

I’m sure with a little thought I can come up with more rules, but you get the idea.

The description of the feature is a useful piece of documentation and can be used as a specification/acceptance criteria.

Looking at the generated feature code, we can see that SpecFlow also added @mytag. Tags allow us to group scenarios. In terms of our end tests, this can be seen as a way of grouping features, scenarios etc. Multiple tags may be applied to a single feature or scenario, for example

@project @mvp
Feature: Add a project

I don’t need any tags for the feature I’m implementing here, so I’ll delete that line of code.

The Scenario is where we define each specific scenario of a feature and the steps to be taken/expected. The Scenario takes a similar form to a Feature, i.e. Scenario: and then a description of the context.

Following the Scenario line, we then begin to define the steps that make up our scenario, using the keywords Given, When, Then, And and But.

In unit testing usage, we can view Given as a precondition or setup. When, And and But as actions (where But is seen as a negation) and this leaves Then as an assertion.

Let’s just change to use some of these steps in a more meaningful manner within the context of our Scenario

Scenario: Add a project to the project list
	Given I have added a valid project name
	When I press the OK button
	Then the list of projects should now include my newly added project

Eventually, if we generate code from this feature file, each of these steps would get turned into a set of methods which could be used as follows

  • Given becomes an action to initialize the code to the expected context
  • When becomes an action to set-up any variables, etc.
  • Then becomes an assertion or validate any expectations
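As a rough illustration using SpecFlow (the method bodies are just placeholders), the steps from the scenario above might be bound like this

[Binding]
public class AddProjectSteps
{
  [Given(@"I have added a valid project name")]
  public void GivenIHaveAddedAValidProjectName()
  {
    // initialize the context, e.g. enter a unique project name
  }

  [When(@"I press the OK button")]
  public void WhenIPressTheOKButton()
  {
    // perform the action being tested
  }

  [Then(@"the list of projects should now include my newly added project")]
  public void ThenTheListOfProjectsShouldNowIncludeMyNewlyAddedProject()
  {
    // assert the expectation
  }
}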

Multiple When‘s can be defined using the And keyword, for example imagine our Scenario looked like this

Scenario: Add a project to the project list
	Given I have added a valid project name
	When I press the OK button
	And I checked the Allow checkbox
	Then the list of projects should now include my newly added project

Now in addition to the When I press the OK button step I would also get another When created, as the And keyword simply becomes another When action. In essence the And duplicates the previous keyword.

We can also include the But keyword. As we've seen with And, this is really another way of defining additional When steps in a more human readable way. But works in the same way as the And keyword by simply creating another When step in the generated code; however, But should be viewed as a negation step, for example

Scenario: Add a project to the project list
	Given I have added a valid project name
	When I press the OK button
	But the Do Not Allow checkbox is unchecked
	Then the list of projects should now include my newly added project

Finally, as stated earlier, Then can be viewed as a place to write our assertions or simply check if the results match our expectations. We can again use And after the Then to create multiple Then steps and thus assert multiple expectations, for example

Scenario: Add a project to the project list
	Given I have added a valid project name
	When I press the OK button
	But the Do Not Allow checkbox is unchecked
	Then the list of projects should now include my newly added project
	And the list of projects should increase by 1

More keywords

In the previous section we covered the core keywords of Gherkin for defining our features and scenarios. But Gherkin also includes the following

Background, Scenario Outline and Examples.

The Background keyword is used to define reusable Given steps, i.e. if all our scenarios end up requiring the application to be in edit mode we might declare a background before any scenarios, such as this

Background:
	Given the projects list is in edit mode
	And the user clicks the Add button

We've now created a sort of top-level scenario which is run before each Scenario.

The Scenario Outline keywords allow us to define a sort of scenario function or template. So, if we have multiple scenarios which only differ in terms of data being used, then we can create a Scenario Outline and replace the specific data points with variables.

For example, let's assume we have scenarios which define multiple project names to fulfil the two rules we outlined in the feature. Let's assume we always have a project named "Default" within the application and therefore we cannot duplicate this project name. We also cannot enter a "" project name.

If we write these as two scenarios, then we might end up with the following

Scenario: Add a project to the project list with an empty name
	Given the project name ""
	When I press the OK button
	Then the project should not be added

Scenario: Add a project to the project list with a duplicate name
	Given the project name "Default"
	When I press the OK button
	Then the project should not be added

If we include values within quotation marks or include numbers within our steps, then these will become arguments to the methods generated for these steps. This obviously offers us a way to reuse such steps or use example data etc.
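For example, with SpecFlow the quoted value would be captured as a method argument, something like this (a minimal sketch)

[Given(@"the project name ""(.*)""")]
public void GivenTheProjectName(string projectName)
{
  // projectName will be "" or "Default" depending on the scenario
}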

Using a Scenario Outline these could be instead defined as

Scenario Outline: Add a project to the project list with an invalid name
	Given the project name <project-name>
	When I press the OK button
	Then the project should not be added

	Examples: 
	| project-name |
	| ""           |
	| "Default"    |

The <> acts as a placeholder and the string within can be viewed as a variable name. We then define Examples which become our data inputs to the scenario.

Gherkin also includes # to mark a line as a comment. Multiple lines may be effectively excluded by wrapping them in triple quotation marks ("""), although strictly speaking this creates a doc string (an argument attached to the preceding step) rather than a true comment. Here's an example of usage

# This is a comment
	
Scenario Outline: Add a project to the project list  with an invalid name
	Given the project name <project-name>
	"""
	Given the project name <project-id>
	"""
	When I press the OK button
	Then the project should not be added

	Examples: 
	| project-name |
	| ""           |
	| "Default"    |

It should be noted that after listing these different keywords and their uses you can also create a scenario that’s a Given followed by a Then, in other words a setup step followed by an assertion, if this is all you need.

For example

Scenario: Add a project
	Given A valid project name
        Then the project list should increase by 1

SpecFlow specifics

On top of the standard Gherkin keywords, SpecFlow adds a few bits.

The @ignore tag is used by SpecFlow to generate ignored test methods.

SpecFlow also adds Scenario Template as a synonym for Scenario Outline. Likewise, Scenarios is an alternative to Examples.

Code generation

We’re not going to delve into Cucumber or the SpecFlow generated code except to point out that if you define more than one step within a scenario with the same text, this will generate a call to the same method in code. So whilst you might read a scenario as if it’s a new context or the likes, ultimately the code generated will execute the same method.

References

Gherkin Reference
Using Gherkin Language In SpecFlow