Category Archives: JavaScript

this is not the this you might expect

this within TypeScript/JavaScript is not the same as you're used to if you come from an OO language such as C#, Java, C++ or the like.

Within these languages, this is the internal reference to the instance of the object your methods are members of; within TypeScript/JavaScript, this depends upon the current "execution context". This is one of the reasons why, when writing code for an event handler in React (for example), we need code such as

this.handleClick = this.handleClick.bind(this);

In JavaScript the runtime maintains a stack of execution contexts. As such, functions (not part of a class) have access to this, which will point to the global object; in a browser this is window and in Node it's a special global object.

Yes, for those coming from an OO language it may well seem odd that functions outside of classes have access to this.

For example, running node index.js on the following index.js file will display Object [global]; however, if we add 'use strict'; to the start of index.js, this will instead be undefined.

function fn() {
    console.log(this);
}

fn();
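For comparison, here's a quick sketch of the strict mode variant described above; running it with node should log undefined rather than the global object.

'use strict';

function fn() {
    // in strict mode this is undefined for a plain function call
    console.log(this);
}

fn();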

If we now create a JavaScript class (using ECMAScript 2015 class syntax), for example

class MyClass {
    constructor(arg1, arg2) {
        this.arg1 = arg1;
        this.arg2 = arg2;
    }
}

We'll find that this no longer references the global object. Instead, the new keyword (when we create an instance of this class) causes the JavaScript runtime to create a new object which is assigned to this for that instance. Hence if you console.log(this) from within the class you'll see this within a new execution context, scoped to the class.

Let’s return to our class and add a new method, so the MyClass code should look like this

class MyClass {
    constructor(arg1, arg2) {
        this.arg1 = arg1;
        this.arg2 = arg2;
    }

    output() {
        console.log(this);
    }
}

If we now execute the following

const mc = new MyClass("Scooby", "Doo");
mc.output();

As you'd expect, the output function logs the MyClass instance (shown below) to the console.

MyClass { arg1: 'Scooby', arg2: 'Doo' }

If, however we instead have the following code

const mc = new MyClass("Scooby", "Doo");
const fn = mc.output;
fn();

the fn() call, which is ultimately just a call to mc.output, will output undefined for this. What's happening is that we're actually executing the function outside of the class instance, and this (in Node) is now undefined.

We might think that C# etc. is wrapping this in a closure or the like, whereas JavaScript is not. So how do we "bind" this to the new output function? The clue is in the use of the word "bind". Adding the following to the constructor binds the this from the class to the method, and now calling fn() will output the MyClass object.

this.output = this.output.bind(this);

Interestingly, as JavaScript functions have bind, call and apply functions on them, we could actually bind the function to a totally different instance of an object. Hence if we create a totally different class and then bind the function to it, the output will display that new object, i.e.

class PointlessClass  {
}

and now in the MyClass constructor we have

this.output = this.output.bind(new PointlessClass());

Then, whether we call the output method on the instance or assign it to fn and call it outside of the class, we'll get PointlessClass {} logged to the console.

If we go back to the code

const mc = new MyClass("Scooby", "Doo");
const fn = mc.output;
fn();

We can bind at the call site (instead of in the constructor) if we so wished, for example

const mc = new MyClass("Scooby", "Doo");
const fn = mc.output.bind(mc);
fn();

The above will now output the instance of MyClass as its this.

As previously mentioned, a function has bind, call and apply. bind sets the this on the method, so every time the method is called it's bound to that object (MyClass in our case). call executes a method against the supplied this for that single call only, as does apply, i.e.

const fn = mc.output;

fn.call(mc);
fn();

The fn.call will output an instance of MyClass but fn() will again output undefined (or the global object).

I mentioned call and apply do much the same thing, so what's the difference? The difference is in the way arguments are passed to these functions: call takes the arguments individually (a comma-separated list) whereas apply takes them as an array; other than that they do the same thing, immediately executing the method against the supplied this.
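As a quick illustration of the difference, here's a small sketch (the Greeter class and the greet parameters are made up purely for this example):

class Greeter {
    constructor(name) {
        this.name = name;
    }

    greet(greeting, punctuation) {
        console.log(`${greeting} ${this.name}${punctuation}`);
    }
}

const greeter = new Greeter("Scooby");
const greet = greeter.greet;

// call passes the arguments individually
greet.call(greeter, "Hello", "!");

// apply passes the same arguments as an array
greet.apply(greeter, ["Hello", "!"]);

Both calls log "Hello Scooby!".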

Finally, arrow (also known as fat arrow) functions are part of ECMAScript 2015 and they automatically capture the this of the scope they're declared in. Let's assume we change our MyClass to

class MyClass {
    constructor(arg1, arg2) {
        this.arg1 = arg1;
        this.arg2 = arg2;
        this.output = () => console.log(this);
    }
}

Now if we use this function like we did earlier (which, if you recall, output undefined for this)

const fn = mc.output;
fn();

with the arrow function we find this is set to the instance of MyClass it was declared within. So basically it appears to be already bound; in reality it's better to think of arrow functions as inheriting their this because, unlike non-arrow methods, we cannot rebind their this reference. For example, the following will still output the MyClass instance, not the new PointlessClass that we've attempted to bind to

const fn = mc.output.bind(new PointlessClass());
fn();

Redux saga

Redux sagas allow us to handle application side effects, such as asynchronous loading of data.

Assuming you have a React application created, we need to run the following (along with redux itself if your project doesn't already include it)

  • yarn add react-redux
  • yarn add redux-saga

To create a simple demo, we’ll change the App.js file to

import React from 'react';
import store from './store';
import { Fetch } from './rootReducer';

function App() {

  function handleDoFetch() {
    store.dispatch({type: Fetch});
  }

  return (
    <div className="App">
      <button onClick={handleDoFetch}>Do Fetch</button>
    </div>
  );
}

export default App;

So this will simply dispatch an action which will ultimately be handled by our saga. Before that happens, let's create a redux store and set up the redux-saga middleware; here's my store.js file

import { createStore, applyMiddleware, combineReducers } from "redux";
import createSagaMiddleware from "redux-saga";
import rootReducer from "./rootReducer";
import rootSaga from "./rootSaga";

export const sagaMiddleware = createSagaMiddleware();

const store = applyMiddleware(sagaMiddleware)(createStore)(
  combineReducers({
    rootReducer,
  })
);

sagaMiddleware.run(rootSaga);

export default store;

We don't need combineReducers as there's only one reducer, but it's there as an example of setting up multiple reducers. Let's now create a very basic reducer named rootReducer.js

export const Fetch = "FETCH";
export const FetchEnded = "FETCH_ENDED";

export default (state = {}, action) => {
  switch (action.type) {
    case FetchEnded:
      console.log("Fetch Ended");
      return {
        ...state,
        data: "Fetch Ended"
      }
    default:
      break;
  }   
  return state;
}

Notice we've got two actions exported, Fetch and FetchEnded, but there's nothing handling Fetch in this case. This is because the redux middleware will pass it through to the redux-saga we're about to create. We could also handle Fetch here and still handle it within the saga; the point being that the saga is going to handle this action when it sees it.
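For example, if we did want the reducer to react to Fetch as well (say to track a loading flag, which is purely an assumption for illustration), it might look like this:

export const Fetch = "FETCH";
export const FetchEnded = "FETCH_ENDED";

export default (state = {}, action) => {
  switch (action.type) {
    case Fetch:
      // the middleware still passes Fetch on to the saga; here we just note a fetch is in flight
      return {
        ...state,
        loading: true
      };
    case FetchEnded:
      return {
        ...state,
        loading: false,
        data: "Fetch Ended"
      };
    default:
      return state;
  }
}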

Now we've got everything else in place, let's add the final piece; the saga will be stored in rootSaga.js and here it is

import { put, takeLatest } from 'redux-saga/effects'
import { Fetch, FetchEnded } from "./rootReducer";

function *fetchData() {
    console.log("fetchData")
    yield put({ type: FetchEnded });
}

function* rootSaga() {
    yield takeLatest(Fetch, fetchData);
}

export default rootSaga;

Notice that the rootSaga function is a generator function and, via takeLatest, it runs fetchData each time the Fetch action is detected.

It might be that fetchData yields many values or even sits in a loop yielding data, but in this example we've kept things simple. Running the application will display a button; using the browser's dev tools you can see that when the button is pressed the Fetch action is picked up by the saga and the fetchData function runs, which in turn dispatches a FetchEnded action that is handled by the reducer.

As stated, our fetchData is very simple, but in a real-world scenario this function could be acting as a websocket client and, for every value received, yield it within a while(true) loop or the like until cancelled or an error occurred.
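As a rough sketch, a slightly more realistic fetchData in rootSaga.js might use the call effect to wait on an HTTP request; the URL, the payload field and the error handling below are assumptions for illustration only.

import { call, put, takeLatest } from 'redux-saga/effects'
import { Fetch, FetchEnded } from "./rootReducer";

function* fetchData() {
    try {
        // call lets the saga middleware invoke the async work and wait for its result
        const response = yield call(fetch, "http://localhost:4000/users");
        const data = yield call([response, response.json]);
        yield put({ type: FetchEnded, payload: data });
    } catch (error) {
        // a real application would likely dispatch a failure action here
        console.log(error);
    }
}

function* rootSaga() {
    yield takeLatest(Fetch, fetchData);
}

export default rootSaga;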

JavaScript generator functions

In C# we have iterators which might return data, such as a collection, but which can also yield return values, allowing us to produce iterable data more dynamically.

Yield basically means return a value but remember the position within the iterator, so that when the iterator is called again execution resumes at the next statement in the iterator.

These same type of operations are available in JavaScript using generator functions. Let’s look at a simple example

function* generator() {
  yield "A";
  yield "B";
  yield "C";
  yield "D";
}

The function* denotes a generator function, and it's only within a generator function that we can use the yield keyword.

When this function is called, a Generator object is returned which adheres to both the iterable protocol and the iterator protocol, which basically means the Generator acts like an iterator.

Hence executing the following code will result in each yielded value being output, i.e. A, B, C, D

for(const element of generator()) {
  console.log(element);
}
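Equally, because the returned Generator implements the iterator protocol, we can drive it manually via next(), each call returning a { value, done } pair:

const it = generator();

console.log(it.next()); // { value: 'A', done: false }
console.log(it.next()); // { value: 'B', done: false }
console.log(it.next()); // { value: 'C', done: false }
console.log(it.next()); // { value: 'D', done: false }
console.log(it.next()); // { value: undefined, done: true }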

JSON within subclasses, across multiple programming languages

I was developing an expression tree on a project, i.e. one made up of subclasses of the class Expression, such as AndExpression, MemberExpression, LiteralExpression etc. The main code is TypeScript/JavaScript, but this needs to pass JSON to TypeScript/JavaScript, C# or Java code (and possibly other languages).

Now when JavaScript's JSON.stringify does its thing we're left with no type information, making it problematic to convert each object back to its actual type, i.e. to an AndExpression rather than just an Expression.

A relatively easy way to solve this, whilst not as elegant as one might hope, is to store a string representing the type within the object, for example

export class LiteralExpression extends Expression {
  public readonly $type: string = "LiteralExpression"
}

When we run the expression through JSON.stringify we get JSON containing "$type":"AndExpression", for example. In JavaScript we still need to do some work to convert this back to JavaScript classes; it's easy enough to use JSON.parse(json), then iterate over our expression objects converting each to its subclass, reviving our objects from the JSON in this way.
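As a rough sketch of that revival step (the import path, the factory map and the assumption that the expression classes can be constructed without arguments are all illustrative rather than the project's actual code):

import { Expression, AndExpression, LiteralExpression } from "./expressions";

// map each $type value back to a factory for the matching class
const knownTypes: { [type: string]: (obj: any) => Expression } = {
  AndExpression: obj => Object.assign(new AndExpression(), obj),
  LiteralExpression: obj => Object.assign(new LiteralExpression(), obj),
  // register the remaining expression types in the same way
};

function revive(json: string): Expression {
  // the reviver runs bottom-up, so nested expressions are converted before their parents
  return JSON.parse(json, (_key, value) =>
    value && value.$type && knownTypes[value.$type]
      ? knownTypes[value.$type](value)
      : value);
}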

Note the use of the property name $type. It's important that it's named $type if you want it to be easily translated into C# objects with Json.NET, as this name is hard coded in that library, whereas Java's Jackson library allows us to easily change the name/key used.

Json.NET (Newtonsoft.Json)

Sadly we don't quite get everything for free using Json.NET because it's expecting C# style type names, i.e. including assembly/namespace etc. The easiest way to deal with this is to serialize/deserialize using our own SerializationBinder, for example

public static class Serialization
{
  public class KnownTypesBinder : ISerializationBinder
  {
    public IList<Type> KnownTypes { get; set; }

    public Type BindToType(string assemblyName, string typeName)
    {
      return KnownTypes.SingleOrDefault(t => t.Name == typeName);
    }

    public void BindToName(Type serializedType, out string assemblyName, out string typeName)
    {
      assemblyName = null;
      typeName = serializedType.Name;
    }
  }

  private static KnownTypesBinder knownTypesBinder = new KnownTypesBinder
  {
    KnownTypes = new List<Type>
    {
      typeof(AndExpression),
      typeof(BinaryExpression),
      typeof(LiteralExpression),
      typeof(LogicalExpression),
      typeof(MemberExpression),
      typeof(NotExpression),
      typeof(OperatorExpression),
      typeof(OrExpression)
    }
  };

  public static string Serialize(Expression expression)
  {
    var json = JsonConvert.SerializeObject(
      expression, 
      Formatting.None, 
      new JsonSerializerSettings
    {
      TypeNameHandling = TypeNameHandling.Objects,
      SerializationBinder = knownTypesBinder
    });
    return json;
  }

  public static Expression Deserialize(string json)
  {
    return JsonConvert.DeserializeObject<Expression>(
      json, 
      new JsonSerializerSettings
    {
      TypeNameHandling = TypeNameHandling.Objects,
      SerializationBinder = knownTypesBinder
    });
  }
}

Don't forget you'll also need to mark your properties and/or constructor parameters with [JsonProperty("left")], especially if you have situations where the names are keywords.

com.fasterxml.jackson.core

In Java we can add the following dependency to our pom.xml

<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.10.0</version>
</dependency>

Now in our Expression base class we write the following

import com.fasterxml.jackson.annotation.JsonTypeInfo;

@JsonTypeInfo(
  use = JsonTypeInfo.Id.NAME, 
  include = JsonTypeInfo.As.PROPERTY, 
  property = "$type")
public class Expression {
}

This tells jackson to include the type info using the keyword $type.

We also need to add the @JsonCreator annotation to the constructor of each of our classes, and each constructor parameter requires an annotation such as @JsonProperty("left"). Finally, to serialize/deserialize we create an ObjectMapper which allows us to map our type names to real objects, using

package com.rbs.expressions;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.jsontype.NamedType;

public class Serialization {

  private static ObjectMapper createMapper() {
    ObjectMapper mapper = new ObjectMapper();

    mapper.registerSubtypes(
      new NamedType(AndExpression.class, "AndExpression"),
      new NamedType(BinaryExpression.class, "BinaryExpression"),
      new NamedType(OrExpression.class, "OrExpression"),
      new NamedType(LiteralExpression.class, "LiteralExpression"),
      new NamedType(LogicalExpression.class, "LogicalExpression"),
      new NamedType(MemberExpression.class, "MemberExpression"),
      new NamedType(NotExpression.class, "NotExpression"),
      new NamedType(OperatorExpression.class, "OperatorExpression")
    );

    return mapper;
  }

  public static String serialize(Expression expression) throws JsonProcessingException {
    return createMapper().writeValueAsString(expression);
  }

  public static Expression deserialize(String json) throws JsonProcessingException {
    return createMapper().readValue(json, Expression.class);
  }
}

Apollo GraphQL client

Based upon our previous implementation(s) of a server, it's now time to write some client code. First, add the following packages

  • yarn add apollo-client
  • yarn add apollo-cache-inmemory
  • yarn add apollo-link-http
  • yarn add graphql-tag
  • yarn add isomorphic-fetch
  • yarn add -D @types/isomorphic-fetch

Now let’s create a simple script entry (this extends the previous post’s scripts section)

"scripts": {
  "build": "tsc",
  "client": "node client.js"
}

Now let’s create the file client.ts which should look like this

import ApolloClient from 'apollo-client';
import { InMemoryCache } from 'apollo-cache-inmemory';
import { HttpLink } from 'apollo-link-http';
import gql from 'graphql-tag';
import fetch from 'isomorphic-fetch';

const cache = new InMemoryCache();
const link = new HttpLink({
  uri: 'http://localhost:4000/',
  fetch: fetch
})

const client = new ApolloClient({
  cache,
  link,
});

client.query({
  query: gql`
    {
      users {
        firstName
        lastName
      }
    }`
  })
  .then((response: any) => console.log(response.data.users));

The response.data.users should now output the array of User objects.
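If you'd rather not type the response as any, the query can also be given a result type; here's a rough sketch (the User shape is assumed from the earlier posts):

interface User {
  firstName: string;
  lastName: string;
}

client.query<{ users: User[] }>({
  query: gql`
    {
      users {
        firstName
        lastName
      }
    }`
  })
  .then(response => console.log(response.data.users));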

Here’s an example of a mutation

client.mutate({
  mutation: gql`
    mutation {
      create(user: {
        firstName: "Miner",
        lastName: "FortyNiner"
      }) {
        firstName
        lastName
      }
    }
  `
}).then((response: any) => console.log(response.data.create));

Note: I had lots of issues around the fetch code, with errors such as "Invariant Violation: fetch is not found globally and no fetcher passed, to fix pass a fetch for your environment like https://www.npmjs.com/package/node-fetch". The addition of isomorphic-fetch solved this problem.

Writing our first Apollo GraphQL server

In the previous posts we’ve seen how to build an express based server and eventually add GraphQL to it.

express-graphql is not the only option for implementing GraphQL servers, so let's now look at Apollo. I cannot say whether one library is better than the other as I've not used either enough to comment.

Let’s go through the process of creating a new project for this, so carry out the following steps

  • Create a folder for your project
  • cd to that folder
  • Run yarn init -y
  • Run tsc --init
  • Add a folder named models and a file named user.ts (we'll use the same model/data from the previous posts); here's the code for this file
    export default class User {
      constructor(public firstName: string, public lastName: string) {
      }
    }
    
    export const stubData = [
      new User('Scooby', 'Doo'),
      new User('Fred', 'Jones'),
      new User('Velma', 'Dinkley'),
      new User('Daphne', 'Blake'),
      new User('Shaggy', 'Rogers'),
    ];
    

So that’s the basics in place for the support data and model, let’s now look at the specifics for Apollo.
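Note that server.ts below imports from apollo-server, so if the Apollo packages aren't already in your project you'll need something along the lines of

  • yarn add apollo-server graphql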

We’re going to create a file for the resolvers named resolver.ts and here’s the code

import User, { stubData } from "./models/user";

export const resolvers = {
  Query: {
    users: () => stubData,
  },
  Mutation: {
    create: (parent: any, args: any): User => {
      const data = new User(args.user.firstName, args.user.lastName)
      stubData.push(data);
      return data;
    }
  }
}

As you can see, we specify the Query and Mutation separately and the code’s pretty much the same as the express-graphql implementation except in the create mutation parameters. In this case the parent will supply any parent nodes whilst the args supplies the parameters from the mutation call.
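For clarity, the args for create arrive shaped like the UserInput type from the schema, so a lightly typed version of the resolvers might look like this (the CreateArgs interface is simply an assumption for illustration):

import User, { stubData } from "./models/user";

// shape of the mutation arguments, mirroring the UserInput type in the schema
interface CreateArgs {
  user: {
    firstName: string;
    lastName: string;
  };
}

export const resolvers = {
  Query: {
    users: () => stubData,
  },
  Mutation: {
    create: (parent: unknown, args: CreateArgs): User => {
      const data = new User(args.user.firstName, args.user.lastName);
      stubData.push(data);
      return data;
    }
  }
}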

Now we'll create the file server.ts which will have (as I'm sure you guessed) the server code as well as the schema definition (Apollo names this typeDefs).

import { ApolloServer, gql } from "apollo-server";
import { resolvers } from "./resolvers";

const typeDefs = gql`
  type User {
    firstName: String
    lastName: String
  }

  input UserInput {
    firstName: String
    lastName: String
  }

  type Mutation {
    create(user: UserInput): User
  }

  type Query {
    users: [User]
  }
`;

const server = new ApolloServer({ typeDefs, resolvers });

server.listen()
  .then(({ url }) => {
    console.log(`Server listening to ${url}`);
  });

Note that Apollo uses the gql tagged template function to declare the GraphQL schema, which, along with the resolvers, is passed into the ApolloServer; then we simply start the server.

If you now compile and run the server (for example yarn build followed by node server.js) it will start on the default port 4000. Navigating to http://localhost:4000/ will display the GraphQL playground.

So, as you can see, Apollo is very simple to set-up for use as a GraphQL server.

Adding GraphQL to our Express server

In the previous two posts we firstly created a basic server, then extended it with REST functionality; now let's add GraphQL capabilities.

First off, run the following

  • yarn add express-graphql graphql
  • Add a folder named schema

Within the schema folder add a new file userSchema.ts with the following code

import { buildSchema } from "graphql";

export const schema = buildSchema(`
  type User {
    firstName: String
    lastName: String
  }

  type Query {
    users: [User]
  }
`);

In the above code, we create a GraphQL schema based upon our User type, along with the type Query through which we can query for users.

Next we'll create a folder named resolvers along with the file userResolver.ts. GraphQL uses resolvers to interact with the data (like the code in the indexController in the previous post).

import { stubData } from "../models/user";

export const userResolver = {
  users: () => {
    return stubData;
  }
}

Now let's update the server.ts file by adding the GraphQL code; I'll include the specifics for GraphQL here (building on the server.ts from the previous post)

import graphqlHTTP from "express-graphql";
import { schema } from "./schema/userSchema";
import { userResolver } from "./resolvers/userResolver";

const server = express();

server.use('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: userResolver,
  graphiql: true,
}));

const port = 4000;
server.listen(port,
  () => console.log(`Server on port ${port}`)
);

In the above we've specified graphiql: true, which allows us to view the GraphiQL page via http://localhost:4000/graphql

Now if you type the following into the left hand (query) pane

{
  users {
    firstName
    lastName
  }
}

Running this from GraphiQL should output all our stubData.

We've essentially got the users method from our original code, but now let's add the equivalent of the create method, which in GraphQL terms is a mutation type.

In the userSchema.ts file, change the schema to look like this

export const schema = buildSchema(`
  type User {
    firstName: String
    lastName: String
  }

  input UserInput {
    firstName: String
    lastName: String
  }

  type Mutation {
    create(user: UserInput): User
  }

  type Query {
    users: [User]
  }
`);

We've added an input type UserInput along with the type Mutation. Next, within the userResolver.ts file, change the userResolver to add the create function, as below (remember to also import User, the default export from ../models/user)

export const userResolver = {
  users: () => {
    return stubData;
  },
  create: (data: any) => {
    stubData.push(new User(data.user.firstName, data.user.lastName)); 
    return data.user;
  }
}

Let's test this code by executing the following mutation via GraphiQL

mutation {
  create(user: {
    firstName: "Miner",
    lastName: "FortyNiner",
  }) {
    firstName
    lastName
  }
}

Extending our Express based server

In my previous post I showed how to create an Express server in TypeScript. Let’s now extend things…

To start with, create some folders (the names are not important, in that this is not using convention-based folder names). We're going to set things up in a similar way to how I've done for C# and Java…

  • Add a new folder controllers
  • Add a new folder models

Create a file server.ts in the root folder. Here’s the code from the previous post plus some extras

import express from "express";
import bodyParser from "body-parser";

const server = express();

server.use(bodyParser.json());

server.get("/", (request, response) => {
  response.send("<h1>Hello World</h1>");
});

const port = 4000;
server.listen(port, 
  () => console.log(`Server on port ${port}`)
);

The body-parser package exposes middleware; in this case we're adding the ability to work with a JSON content type.

Now let's change the existing route / and add some more; below is the additional code required in server.ts

import * as indexController from "./controllers/indexController";

server.get('/', indexController.index);
server.get('/users', indexController.users);
server.get('/users/create', indexController.create);

// POST implementation
// server.post('/users/create', indexController.create);

We now need to implement the model and the controller. Starting with the model, create the file user.ts within the models folder; it should look like this

export default class User {
  constructor(public firstName: string, public lastName: string) {
  }
}

export const stubData = [
  new User('Scooby', 'Doo'),
  new User('Fred', 'Jones'),
  new User('Velma', 'Dinkley'),
  new User('Daphne', 'Blake'),
  new User('Shaggy', 'Rogers'),
];

The stubData of course is just here to give us some data to start things off.

In the controllers folder, add a new file named indexController.ts and the file should look like this

import { Request, Response } from "express";
import User, { stubData } from '../models/user';

export const index = (req: Request, res: Response) => {
  res.send("<h1>Methods are</h1><ul><li>users</li><li>create</li></ul>")
};

export const users = (req: Request, res: Response) => {
  res.json(stubData);
};

export const create = (req: Request, res: Response) => {
  const newUser = new User(req.query.firstName, req.query.lastName);
  stubData.push(newUser);
  res.json(newUser);
};

// POST implementation
// export const create = (req: Request, res: Response) => {
//   const newUser = new User(req.body.firstName, req.body.lastName);
//   stubData.push(newUser);
//   res.json(newUser);
// };

The scripts for package.json should also be taken from the previous post.

If you now run yarn build then yarn start your server will be up and running. Using the following URLs

  • http://localhost:4000/
  • http://localhost:4000/users
  • http://localhost:4000/users/create?firstName=Miner&lastName=FortyNiner
These will result in: the first URL returning a simple help screen as HTML, the second listing the current array of users as JSON, and the third creating a new user; the response will show the new user as JSON, and if you call the users URL again you will see the newly added user.

I've also listed the POST implementations, which you'd probably be more likely to use for mutations, but the GET implementations are fine in this instance and easier to test via our preferred browser.
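If you do switch to the POST implementation, one quick way to exercise it from code (assuming a fetch implementation is available, e.g. a modern browser, Node 18+ or a polyfill) is something like

fetch("http://localhost:4000/users/create", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ firstName: "Miner", lastName: "FortyNiner" })
})
  .then(response => response.json())
  .then(user => console.log(user));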

Express server with TypeScript

Express is a popular web application server in JavaScript.

Let's create a really quick and simple Express server which will be a good starting point for further posts which use this library.

I'll assume you've created a folder and run yarn init and tsc --init; next up run

  • yarn add express
  • yarn add -D @types/express
  • Add a new file, mine’s named server.ts
  • Add the usual scripts
    "scripts": {
      "build": "tsc",
      "start": "node server.js"
    }
    

The code to run up a really simple server is as follows

import express from "express";

const server = express();

server.get("/", (request, response) => {
  response.send("<h1>Hello World</h1>");
});

const port = 4000;
server.listen(port, 
  () => console.log(`Server on port ${port}`)
);

Using our scripts, run yarn build followed by yarn start. The server should start on port 4000, so now navigate to http://localhost:4000 using your preferred browser and that’s it.

We've created a server instance on port 4000 and routed any root (i.e. /) calls to our handler, which in this case returns a simple HTML string.
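Adding further routes follows exactly the same pattern; for instance (a made-up route purely to illustrate)

server.get("/ping", (request, response) => {
  response.send("pong");
});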

Simple node based HTTP server

In a previous post we used webpack to run an HTTP server; in this post we're going to create a bare-bones HTTP server using the http package.

In your chosen folder run the usual commands

  • yarn init -y
  • tsc --init
  • Add a folder named public off of the folder your source will be in (this will be where we add static HTML files)

Now let's add the required packages

  • yarn add node-static @types/node-static

Add the following to the package.json

"scripts": {
  "start": "node server.js",
  "build": "tsc"
}

In the public folder add index.html with the following

<html>
  <head></head>
  <body>
  Hello World
  </body>
</html>

Now let’s add the code to start up our server (mine’s in the file server.ts)

import ns from "node-static";
import http from "http"

const file = new ns.Server("./public");

http.createServer((request, response) => {
    request.addListener("end", () => {
        file.serve(request, response);
    })
    .resume()
})
.listen(4000);

Simply run yarn build then yarn start and the server will start. Navigating your preferred browser to http://localhost:4000/ will then display the HTML file, i.e. the Hello World text.