Category Archives: Java

Parameterized unit testing in Java

Occasionally (maybe even often) you’ll need some way to run the same unit test code against multiple inputs.

For example, you might have some code that iterates over a string counting certain characters (let’s say it counts the letter Z). The unit test would be exactly the same for the following scenarios:

  1. When no characters of the expected type exist
  2. When characters of the expected type exist
  3. When the string is empty
  4. When the string is null

The only difference would be the input to the unit test and the expectation for the assert. In such situations we tend to use parameterized unit tests; in C#/NUnit this would be done with the TestCase attribute, and in Java with JUnit 5 (org.junit.jupiter.*) with the @ParameterizedTest annotation.

We’re going to need to add the following dependency to our pom.xml (change the version to suit).

<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-params</artifactId>
  <version>5.8.1</version>
  <scope>test</scope>
</dependency>
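
The examples below exercise a CharacterCounter.countZ method. That class isn’t shown in this post, so purely for reference, here’s a minimal sketch of the kind of system under test being assumed (the implementation details are an assumption, not the original code)

// A minimal sketch of the assumed system under test; the real
// implementation isn't shown in this post.
public final class CharacterCounter {
    public static int countZ(String input) {
        if (input == null || input.isEmpty()) {
            return 0;
        }
        int count = 0;
        for (char c : input.toCharArray()) {
            if (c == 'Z') {
                count++;
            }
        }
        return count;
    }
}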

We could write two unit tests for the above scenarios: one for when no Z characters exist (i.e. the string is null, empty or simply contains no Z’s, where we expect 0 back from our system under test) and a second checking a known return value. For the first, since @ValueSource cannot hold a null, we can combine it with @NullAndEmptySource and write something like

@ParameterizedTest
@NullAndEmptySource
@ValueSource(strings = { "aaabbb" })
void noZExists_ShouldReturn_Zero(String input) {
    assertEquals(0, CharacterCounter.countZ(input));
}

Now we’d have another unit test for successful cases, for example

@ParameterizedTest
@ValueSource(strings = { "aaaaZZZ", "ZZZ", "ZZZaaa" })
void zExists_ShouldReturn_Three(String input) {
    assertEquals(3, CharacterCounter.countZ(input));
}

This would be far better if we wrote a single unit test and passed in both the input and the expected result, combining all the values into one test. The option I found for this was the @CsvSource annotation: we write the input, followed by a comma separator, followed by the expectation (of course we could supply more than two arguments per call of the unit test, but this is adequate for our needs). This means our test looks more like this

@ParameterizedTest
@CsvSource({ "'',0", "aaabbb,0", "aaaaZZZ,3", "ZZZ,3", "ZZZaaa,3" })
void countZ_ShouldReturn_Expectation(String input, int expectation) {
    assertEquals(expectation, CharacterCounter.countZ(input));
}

Property based testing in Java with jqwik

There are a couple (maybe more) of libraries for property based testing; I found that jqwik worked well for me.

I’m not going to go too deep into what property based testing is, except to state that for my purposes, property based testing allows me to generate random values to pass into my tests. I can generate multiple “tries” or items of data and even restrict ranges of values.

Specifically I’m using property based testing to generate values for some unit conversion code, the values get converted to a different unit and back – if the resultant value is the same as the input value we can assume that the conversion back and forth works correctly.

Anyway there are plenty of resources to read on the subject, so let’s get stuck into some code.

First off we need to add the following dependency to our pom.xml (obviously change the version to suit)

<dependency>
  <groupId>net.jqwik</groupId>
  <artifactId>jqwik</artifactId>
  <version>1.6.2</version>
  <scope>test</scope>
</dependency>

Let’s look at an example unit test

@Property(tries = 100)
public void testFromDegreesToGradiansAndBack(@ForAll @DoubleRange(min = -1E12, max = 1E12) double value) {
  final double convertTo = Angle.Degrees.toGradians(value);
  final double convertBack = Angle.Gradians.toDegrees(convertTo);
  assertEquals(value, convertBack, 0.01);
}

The @Property annotation tells the jqwik runner to supply data for the method, creating 100 items of data (or tries), and @ForAll is used to assign data to the double value. Finally, for this test I wanted to constrain the range of double values to [-1E12, 1E12].
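
For context, the Angle conversion code under test isn’t shown in this post; here’s a rough sketch of what it might look like (this is an assumption based purely on the calls used in the test and the fact that 180 degrees equals 200 gradians)

// A sketch of the assumed conversion code under test; the real Angle
// implementation isn't shown in this post.
public enum Angle {
    Degrees {
        @Override public double toGradians(double value) { return value * 200.0 / 180.0; }
        @Override public double toDegrees(double value) { return value; }
    },
    Gradians {
        @Override public double toGradians(double value) { return value; }
        @Override public double toDegrees(double value) { return value * 180.0 / 200.0; }
    };

    public abstract double toGradians(double value);
    public abstract double toDegrees(double value);
}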

The output (upon success) looks similar to this

timestamp = 2022-01-05T20:00:08.038391600, AngleTests:testFromDegreesToGradiansAndBack = 
                              |--------------------jqwik--------------------
tries = 100                   | # of calls to property
checks = 100                  | # of not rejected calls
generation = RANDOMIZED       | parameters are randomly generated
after-failure = PREVIOUS_SEED | use the previous seed
when-fixed-seed = ALLOW       | fixing the random seed is allowed
edge-cases#mode = MIXIN       | edge cases are mixed in
edge-cases#total = 7          | # of all combined edge cases
edge-cases#tried = 6          | # of edge cases tried in current run
seed = -3195058119397656120   | random seed to reproduce generated values

Errors will cause the test to fail and output the failure for you to review.

Adding certificates to the Java cacerts (or fixing PKIX path issue)

I’m back on some Java coding after a fair time away and was getting the old PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException error.

Basically Java is complaining that it didn’t recognise the HTTPS SSL certificate of the maven repository (in this case one hosted in Artifactory). Here are the steps to resolve this…

Note: Instructions are on Windows using Chrome, but should be similar in different browsers.

  • Open the HTTPS repository in Chrome (or preferred web browser)
  • Use the dev tools (ctrl+shift+i) and select the Security tab
  • Click on the View certificate button
  • Select the Details tab
  • Click the Copy to file… button
  • Click Next until you see the format selector; I used the DER format
  • Click Next etc. and save the exported cert to your hard drive

Let’s assume we saved the file as mycert.cer; now we need to import this into the cacerts keystore using the keytool…

  • Go to the location of the JDK/JRE you’re using, for example C:\Program Files\Java\jdk1.8.0_101\jre\lib\security
  • Open a command prompt and type
    keytool -import -alias mycert -keystore "C:\Program Files\Java\jdk1.8.0_101\jre\lib\security\cacerts" -file mycert.cer
    

    Replace the first occurrence of mycert with a unique name (key) for your certificate, and then obviously mycert.cer is replaced with the name of the certificate file you saved.

  • You’ll be asked for a password; the default is changeit (if this has been changed then obviously use that)
  • Type yes when prompted if you want to proceed

That’s it – the certificate should now be available to Java.
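
If you want to double check the import worked, you can list the certificate by its alias (adjust the keystore path and alias to match your own)

keytool -list -alias mycert -keystore "C:\Program Files\Java\jdk1.8.0_101\jre\lib\security\cacerts"

You’ll be prompted for the keystore password again and, if the import succeeded, the entry and its fingerprint will be shown.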

Spring boot and CORS

A while back I wrote a post Building a REST service with Spring, and today I needed to try out the CORS support to allow a similar Spring based REST service to be accessed from a React application I’m working on.

It’s super easy to allow access to all clients by simply adding @CrossOrigin to your controller, for example

package demo;

import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@CrossOrigin
@RestController
public class SampleController {

    @RequestMapping("/purchaseOrder")
    public PurchaseOrderType getPurchaseOrder() {
        return new PurchaseOrderType(1234);
    }
}

Equally we can just add the same annotation to REST methods instead of the whole controller, for example

@CrossOrigin
@RequestMapping("/purchaseOrder")
public PurchaseOrderType getPurchaseOrder() {
   return new PurchaseOrderType(1234);
}

Note: @CrossOrigin is the equivalent of @CrossOrigin(origins = "*").

If we want to limit the origins then we simply use the following instead

@CrossOrigin(origins = "http://localhost:3000")

// or an array of origins like this

@CrossOrigin(origins = {"http://localhost:3000", "http://localhost:3001"})
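
If you’d rather not annotate individual controllers or methods, Spring also supports configuring CORS globally via a WebMvcConfigurer. Here’s a minimal sketch (the CorsConfig class name is just an example; adjust the origins and mappings to suit)

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class CorsConfig {

    @Bean
    public WebMvcConfigurer corsConfigurer() {
        return new WebMvcConfigurer() {
            @Override
            public void addCorsMappings(CorsRegistry registry) {
                // apply CORS to all routes, restricted to the listed origins
                registry.addMapping("/**")
                    .allowedOrigins("http://localhost:3000", "http://localhost:3001");
            }
        };
    }
}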

Docker, spring boot and mongodb

I wanted to create a docker build to run a spring boot based application along with its mongodb database, which proved interesting. Here’s what I found out.

Dockerfile

To begin with, we need to create a docker configuration file, named Dockerfile. This will be used to create a docker image in which we will host a spring boot JAR. Obviously this requires that we base our image upon a Java image (or create our own), so let’s base it on a lightweight OpenJDK 1.8 image, openjdk:8-alpine.

Below is an example Dockerfile

FROM openjdk:8-alpine
MAINTAINER putridparrot

RUN apk update

ENV APP_HOME /home
RUN mkdir -p $APP_HOME

ADD putridparrot.jar $APP_HOME/putridparrot.jar

WORKDIR $APP_HOME

EXPOSE 8080

CMD ["java","-Dspring.data.mongodb.uri=mongodb://db:27017/","-jar","/home/putridparrot.jar"]

The above will be used to create our image, based upon openjdk:8-alpine. We then run an update (in case it’s required), create an environment variable for our application folder (we’ll simply install our application into /home, but it could be more specific, such as /home/putridparrot/app or whatever) and then create that folder.

Next we ADD our JAR, so this is going to in essence copy our JAR from our host machine into the docker image so that when we run the image it’ll also include the JAR within that image.

I’m also exposing port 8080 as my JAR will be exposing port 8080, hence when we interact with port 8080 docker will proxy it through to our JAR application.

Finally we add a command (CMD) which will run when the docker image is run. In this case we run the executable JAR, passing in some configuration to allow it to access a mongodb instance (which will be running in another docker instance).

Note: The use of the db host is important. It need not be named db, but the name needs to be the same as the one we’ll be using within the upcoming docker-compose.yml file.

Before we move onto the mongodb container we need to try to build our Dockerfile; here are the commands

docker rmi putridparrot --force
docker build -t putridparrot .

Note: These commands should be run from the folder containing our Dockerfile.

The first command will force remove any existing images and the second command will then build the docker image.

docker-compose.yml

So we’ve created a Dockerfile which will be used to create our docker image. Now we want to create a docker-compose file which will run both our newly created image and a mongodb image, wiring them together using depends_on and the name of our mongo service (which we used within the JAR execution command). Here’s the docker-compose.yml file

version: "3.1"

services:
  putridparrot:
    build: .
    restart: always
    ports: 
      - "8080:8080"
    depends_on:
      - db

  db:
    image: mongo
    volumes:
      - ./data:/data/db
    ports:
      - "27017:27017"
    restart: always

The first line simply sets the version of the docker-compose syntax, in this case 3.1. This is followed by the services which will be run by docker-compose. The first service listed is our JAR’s image; in fact we do not reference the image by name, we rebuild it (if required) via the build command – this looks for a Dockerfile in the supplied folder (in this case we assume it’s in the same folder as the docker-compose.yml file). We then set up the port forwarding to the docker image. This service depends on mongodb running, hence the depends_on option.

The next service is our mongodb image. As mentioned previously, the name here can be whatever you want, but whatever name you choose must also be used within our JAR configuration to allow our other service to connect to it. Think of it this way – this name is the hostname of the mongodb service, and docker will handle the name resolution between docker instances.

Finally, we obviously use the mongo image, and we want to expose the ports to allow access to the running instance and also store the data from mongodb on our host machine, allowing it to be reused when a new instance of this service is started.

Now we need to run docker-compose using

docker-compose up

If all goes well, this will (possibly) build a new image for our JAR and then bring up the services. As the first service depends_on the second, it will in essence be started once the mongodb service is up and running, allowing it to then connect to the database.

JSON within subclasses, across multiple programming languages

I was developing an expression tree on a project, i.e. made up of subclasses of the class Expression, such as AndExpression, MemberExpression, LiteralExpression etc. The main code is TypeScript/JavaScript but this needs to pass JSON to TypeScript/JavaScript, C# or Java code (and possibly other languages).

Now when JavaScript’s JSON.stringify does its thing we’re left with no type information, making it problematic to convert each type back to its actual type, i.e. to an AndExpression not just an Expression.

A relatively easy way to solve this, whilst not as elegant as one might hope, is to store a string representing the type within the object, for example

export class LiteralExpression extends Expression {
  public readonly $type: string = "LiteralExpression"
}

When we run the expression through JSON.stringify we get JSON with "$type":"AndExpression", for example. In JavaScript we still need to do some work to convert this back to JavaScript classes; it’s easy enough to use JSON.parse(json), then iterate over our expression objects converting each to its subclass, reviving our objects from JSON in this way.

Note the use of the variable name $type. It’s important that it’s named $type if you want it to easily be translated into C# objects with Json.NET, as this name is hard coded in that library, whereas Java’s Jackson library allows us to easily change the name/key used.

Json.NET (Newtonsoft.Json)

Sadly we don’t quite get everything for free using Json.NET because it’s expecting C# style naming for classes, i.e. assembly/namespace etc. The easiest way to deal with this is to serialize/deserialize using our own serialization binder, for example

public static class Serialization
{
  public class KnownTypesBinder : ISerializationBinder
  {
    public IList<Type> KnownTypes { get; set; }

    public Type BindToType(string assemblyName, string typeName)
    {
      return KnownTypes.SingleOrDefault(t => t.Name == typeName);
    }

    public void BindToName(Type serializedType, out string assemblyName, out string typeName)
    {
      assemblyName = null;
      typeName = serializedType.Name;
    }
  }

  private static KnownTypesBinder knownTypesBinder = new KnownTypesBinder
  {
    KnownTypes = new List<Type>
    {
      typeof(AndExpression),
      typeof(BinaryExpression),
      typeof(LiteralExpression),
      typeof(LogicalExpression),
      typeof(MemberExpression),
      typeof(NotExpression),
      typeof(OperatorExpression),
      typeof(OrExpression)
    }
  };

  public static string Serialize(Expression expression)
  {
    var json = JsonConvert.SerializeObject(
      expression, 
      Formatting.None, 
      new JsonSerializerSettings
    {
      TypeNameHandling = TypeNameHandling.Objects,
      SerializationBinder = knownTypesBinder
    });
    return json;
  }

  public static Expression Deserialize(string json)
  {
    return JsonConvert.DeserializeObject<Expression>(
      json, 
      new JsonSerializerSettings
    {
      TypeNameHandling = TypeNameHandling.Objects,
      SerializationBinder = knownTypesBinder
    });
  }
}

Don’t forget you’ll also need to mark your properties and/or constructor parameters with [JsonProperty("left")], especially if you have situations where the names are keywords.

com.fasterxml.jackson.core

In Java we can add the following dependency to our pom.xml

<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.10.0</version>
</dependency>

Now in our Expression base class we write the following

import com.fasterxml.jackson.annotation.JsonTypeInfo;

@JsonTypeInfo(
  use = JsonTypeInfo.Id.NAME, 
  include = JsonTypeInfo.As.PROPERTY, 
  property = "$type")
public class Expression {
}

This tells jackson to include the type info using the keyword $type.

We also need to add the @JsonCreator annotation to a constructor in each of our classes, and each constructor parameter requires an annotation such as @JsonProperty("left"). Finally, to serialize/deserialize we create an ObjectMapper to allow us to map our types to real objects using

package com.rbs.expressions;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.jsontype.NamedType;

public class Serialization {

  private static ObjectMapper createMapper() {
    ObjectMapper mapper = new ObjectMapper();

    mapper.registerSubtypes(
      new NamedType(AndExpression.class, "AndExpression"),
      new NamedType(BinaryExpression.class, "BinaryExpression"),
      new NamedType(OrExpression.class, "OrExpression"),
      new NamedType(LiteralExpression.class, "LiteralExpression"),
      new NamedType(LogicalExpression.class, "LogicalExpression"),
      new NamedType(MemberExpression.class, "MemberExpression"),
      new NamedType(NotExpression.class, "NotExpression"),
      new NamedType(OperatorExpression.class, "OperatorExpression")
    );

    return mapper;
  }

  public static String serialize(Expression expression) throws JsonProcessingException {
    return createMapper().writeValueAsString(expression);
  }

  public static Expression deserialize(String json) throws JsonProcessingException {
    return createMapper().readValue(json, Expression.class);
  }
}
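
For completeness, here’s a rough sketch of what one of the subclasses might look like with the annotations mentioned above; the left/right constructor parameters are an assumption about the shape of a binary expression rather than the actual model

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

// Assumed shape of a binary expression subclass; adjust the fields to your own model.
public class AndExpression extends Expression {

  private final Expression left;
  private final Expression right;

  @JsonCreator
  public AndExpression(
      @JsonProperty("left") Expression left,
      @JsonProperty("right") Expression right) {
    this.left = left;
    this.right = right;
  }

  public Expression getLeft() { return left; }

  public Expression getRight() { return right; }
}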

Spring boot Eureka server

Using Spring boot, we can very easily run up a Eureka server.

In IntelliJ create a new project using the Spring Initializr and simply select the Eureka Server dependency.

Once the application is created, I found I needed to add the @EnableEurekaServer annotation to the application class, so here’s my EurekatestApplication.java

package com.putridparrot.eurekatest;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.server.EnableEurekaServer;

@EnableEurekaServer
@SpringBootApplication
public class EurekatestApplication {

    public static void main(String[] args) {
        SpringApplication.run(EurekatestApplication.class, args);
    }
}

Next we need to add the following properties to the resources/application.properties

server.port=8761

eureka.client.register-with-eureka=false
eureka.client.fetch-registry=false

logging.level.com.netflix.eureka=OFF
logging.level.com.netflix.discovery=OFF

Now when you run the application and view the web page http://localhost:8761/, you should see the Spring Eureka web page. Running http://localhost:8761/eureka/apps from your preferred browser will list any applications registered with the server.
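
If you prefer the command line, the same registry information can be fetched with curl, for example

curl http://localhost:8761/eureka/apps

which returns an XML document describing the registered applications (empty at this point).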

At this point we’ve no applications registered with the server, so let’s add a .NET service in the next post.

Java’s Linq equivalent, Streams

I was working on a previous post on Java and GraphQL and wanted to use a Linq query. Java doesn’t have Linq but it does have something equivalent (in Java 8) called Stream. For the examples within this post we’ll create the following simple method for generating our test data

private static List<String> getData() {
   return Arrays.asList(
      "2", "2", "2", "4", "4", 
      "6", "7", "7", "9", "10");
}

Let’s look at some of the standard sort of functionality…

Distinct

We can get a distinct stream from our data using

Stream<String> distinct = getData()
   .stream()
   .distinct();

Selection/Filtering

If we want to filter our data to select only strings longer than one character, for example, then we can use the following

Stream<String> where = getData()
   .stream()
   .filter(s -> s.length() > 1);

Matching

There are several methods which return a boolean, for example checking whether all items in the Stream match a predicate, whether any match, or whether none match

boolean allMatch = getData()
   .stream()
   .allMatch(s -> s.contains("4"));

boolean anyMatch = getData()
   .stream()
   .anyMatch(s -> s.contains("4"));

boolean noneMatch = getData()
   .stream()
   .noneMatch(s -> s.contains("4"));

Map

In scenarios where we wish to transform the data returned from a stream (i.e. like Select in Linq), for example returning the length of each string as a Stream, we can use

Stream<Integer> select = getData()
   .stream()
   .map(String::length);

Reduce

If we want to take a stream and reduce it to a single value, for example taking our data and concatenating into a single String, we could use the following

String reduce = getData()
   .stream()
   .reduce("", (a, b) -> a + b);

The initial value "" is a starting value for the reduction; in this case we’re not wanting to prepend anything to our string. The result of the reduce is a String with the value 22244677910.

Creating an empty Stream

We can create an empty Stream (useful in those situations where we do not wish to return a null) using

Stream<String> empty = Stream.empty();

Iteration

We can iterate over the Stream using the forEach method, for example

getData()
   .stream()
   .forEach(s -> System.out.println(s));

Parallel Streams

We can also parallelize (is that a word?) our Stream using the parallelStream method, for example

Stream<Integer> parallel = getData()
   .parallelStream()
   .map(String::length);

As you’d expect, the order of processing of the map in this instance is indeterminate.
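
Before wrapping up, it’s worth noting that these operations chain together nicely; for example, the rough equivalent of a Linq Where/Select/ToList pipeline would be something like

// filter then map, collecting the results into a List
List<Integer> lengths = getData()
   .stream()
   .filter(s -> s.length() > 1)
   .map(String::length)
   .collect(Collectors.toList());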

Obviously there’s more to Stream than I’ve listed here, but you get the idea.

ExecutorService based multi-threading in Java

A threading feature within Java is the ExecutorService. This allows us to, in essence, create our own threadpools. This is useful because it allows us to create a limited number of threads that our code may run on, ensuring we do not end up with thread starvation or the number of threads getting out of hand.

By writing our code to use the ExecutorService we can also limit our methods to only use a single thread if we want.

Here’s a simple example of using the ExecutorService

public static void main(String[] args) throws InterruptedException, ExecutionException {

   System.out.println("Main Thread - " + Thread.currentThread().getId());

   //ExecutorService executorService = Executors.newSingleThreadExecutor();
   ExecutorService executorService =
      Executors.newFixedThreadPool(
         Runtime.getRuntime().availableProcessors());

   Future<?> f1 = executorService.submit(() -> outputInformation(1));
   Future<?> f2 = executorService.submit(() -> outputInformation(2));
   Future<?> f3 = executorService.submit(() -> outputInformation(3));

   // the equivalent of waiting on the threads
   f1.get();
   f2.get();
   f3.get();

   executorService.shutdown();
}

private static void outputInformation(int id) {
   try {
      // some arbitrary time wasting
      Thread.sleep(2000);
      System.out.println(id + " - " + Thread.currentThread().getId());
   } catch (Exception e) {
   }
}

In the code above, we are creating our own threadpool (using Executors.newFixedThreadPool) with the size based upon the number of available processors. Hence in this example (assuming you have more than one processor on your machine) the resulting output is indeterminate, i.e. we might see 2 followed by 1, followed by 3, all on different threads.

By simply changing to use Executors.newSingleThreadExecutor() the output will be in the correct order as only a single thread is used for each call.

The ExecutorService is a useful abstraction for containing and controlling multiple threads.

Self hosting Tomcat in a Java Web application

Following on from my previous post where I created a web application in Java, let’s now look at hosting the WAR within an embedded Tomcat instance.

Adding dependencies to include embedded Tomcat

Add the following to your pom.xml (after the description tag)

<properties>
   <tomcat.version>9.0.0.M6</tomcat.version>
</properties>

Note: There’s a newer version of a couple of the dependencies, but this version exists for all three of the dependencies we’re about to add.

Now add the following dependencies

<dependencies>
   <dependency>
      <groupId>org.apache.tomcat.embed</groupId>
      <artifactId>tomcat-embed-core</artifactId>
      <version>${tomcat.version}</version>
   </dependency>
   <dependency>
      <groupId>org.apache.tomcat.embed</groupId>
      <artifactId>tomcat-embed-jasper</artifactId>
      <version>${tomcat.version}</version>
   </dependency>
   <dependency>
      <groupId>org.apache.tomcat.embed</groupId>
      <artifactId>tomcat-embed-logging-juli</artifactId>
      <version>${tomcat.version}</version>
   </dependency>
</dependencies>

Time to run mvn install if not auto-importing.

Time to create the entry point/application

Let’s add a new package, com.putridparrot, to the src folder, then add a Java file; mine’s HostApp.java. Here’s the code

package com.putridparrot;

import org.apache.catalina.LifecycleException;
import org.apache.catalina.startup.Tomcat;

import javax.servlet.ServletException;
import java.io.File;

public class HostApp {
    public static void main(String[] args) throws ServletException, LifecycleException {

        Tomcat tomcat = new Tomcat();
        tomcat.setBaseDir("temp");
        tomcat.setPort(8080);

        String contextPath = "";
        String webappDir = new File("web").getAbsolutePath();

        tomcat.addWebapp(contextPath, webappDir);

        tomcat.start();
        tomcat.getServer().await();
    }
}

In the above we create an instance of Tomcat and then set up the port and the context for the web app, including the path to the web folder where our index.jsp is hosted in our WAR file.

We then start the server and then wait until the application is closed.

By the way, it’s also worth adding logging to the pom.xml. The embedded Tomcat server uses “standard” Java based logging, so we can add the following to the pom.xml dependencies

<dependency>
   <groupId>log4j</groupId>
   <artifactId>log4j</artifactId>
   <version>1.2.15</version>
</dependency>

Create a run configuration

Now select Edit Configuration and add a configuration that runs main from HostApp.

Select Application as the configuration type and set the Main class to com.putridparrot.HostApp.

Testing

Run the newly added application configuration (don’t worry about the exceptions). You should see a line similar to

INFO: Starting ProtocolHandler [http-nio-8080]

At this point the server is running, so navigate your browser to http://localhost:8080 and check that the index.jsp page is displayed.

Configuring the embedded Tomcat for gzip/compression support

I’ve been looking into compression with gzip on some web code and hence wanted to configure this embedded Tomcat server to handle compression if/when requested via Accept-Encoding: gzip etc.

So add the following to the main method (before tomcat.start())

Connector c = tomcat.getConnector();
c.setProperty("compression", "on");
c.setProperty("compressionMinSize", "1024");
c.setProperty("noCompressionUserAgents", "gozilla, traviata");
c.setProperty("compressableMimeType", "text/html,text/xml,text/css,application/json,application/javascript");
tomcat.setConnector(c);

You’ll also need the import org.apache.catalina.connector.Connector;.

Testing gzip/compression

Testing whether Tomcat is using compression is best done with something like curl. I say this because, whilst you can use a browser’s debug tools (such as Chrome’s) and see a response with Content-Encoding: gzip, I really wanted to see the raw compressed data to be sure I really was getting compressed responses; the browser automatically decompressed the responses for me.

Go to the main method and just change "on" to "force"

i.e.

c.setProperty("compression", "force");

This just forces compression to be on all the time.

Now to test our server is set up correctly. Thankfully Windows 10 seems to have curl available from a command prompt, so this works in Linux or Windows.

Run the following

curl -H "Accept-Encoding: gzip,deflate" -I "http://localhost:8080/index.jsp"

This command adds the header Accept-Encoding and then outputs the header (-I) results from accessing the URL. This should show a Content-Encoding: gzip if everything was set up correctly.

To confirm everything is as expected, we can download the content from the URL and save it (in this case saved to index.jsp.gz) and then use gzip -d to decompress the file if we wish.

curl -H "Accept-Encoding: gzip,deflate" "http://localhost:8080/index.jsp" -o index.jsp.gz

This will create index.jsp.gz which should be compressed so we can use

gzip -d index.jsp.gz

to decompress it and we should see the expected web page.