Global Events

I am currently working on a personal project, one about which I may reveal more in the future. For this project I needed a simple way of communicating between processes. In my case, these processes are created ad-hoc, so I didn’t want to have to keep track of said processes, as I viewed this as being error-prone.

In the past I wrote redis-streams-manager, which patches node’s EventEmitter to manage reading from multiple streams. That was the kind of setup I wanted this time as well: only the native EventEmitter interface, with the specifics buried below. I wanted to be able to emit an event from one process and have it received by all listeners, including those in other processes. Because I didn’t need the extra features streams offer, I decided to use the publish/subscribe mechanism instead.

The idea worked. My project had global events. Since it’s a relatively self-contained module, I decided to extract and publish it. The result is Global Events, which does exactly what it sounds like. It even uses msgpackr to serialize and de-serialize an optional data payload.

Global Events has the same interface as redis-streams-manager, and both can be used to achieve the same thing. The APIs of both are as close to the original EventEmitter API as possible, with only slight additions or improvements.

So you want an example:

import IORedis from "ioredis";
import GlobalEvents from "@art-of-coding/global-events";

// Create a non-dedicated connection
// You can use this connection elsewhere
const connection = new IORedis();

const events = new GlobalEvents({
  // Set the Redis connection
  // This connection is duplicated internally to act as subscriber
  connection,
  // An optional prefix
  prefix: "prefix:",
  // Optional msgpackr configuration
  // See the msgpackr documentation for options
  msgpackr: { useRecords: true },
});

// Listen for an event
// Will automatically subscribe to the event if no subscription is open yet
events.on("expected-event", (data: MyDataInterface) => {
  // `data` is the event data if the event had a data payload
});

// Emit an event
events.emit("some-event");

// Emit an async event
await events.emitAsync("async-event");

// Emit an event with some data
events.emit("another-event", { some: "data" });

// Disconnect the subscriber
// After calling this the instance is no longer usable
await events.disconnect();
Signal Fire Connect

I’m happy to introduce Signal Fire Connect, a very slim client for Signal Fire Server. Connect provides a minimal interface to make working with the Server a breeze. In contrast to Signal Fire Client it ditches the sessions in favor of a more low-level approach. This makes Connect more relevant to varying use cases, and allows developers more freedom in how to handle WebRTC in their web applications.

So, what does Connect actually do? It does the minimum necessary for a signaling server interface; it allows sending and receiving of offers, answers, and ICE candidates. What the developer does with them afterward is of no concern to the module.
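That minimal interface boils down to a small set of JSON messages and a dispatcher. The envelope below is illustrative only — the field names are hypothetical, not Connect’s actual wire format:

```javascript
// An illustrative JSON envelope for signaling messages
function makeMessage (cmd, origin, data) {
  return JSON.stringify({ cmd, origin, data })
}

// Dispatch an incoming raw message to the matching handler
function dispatch (raw, handlers) {
  const { cmd, origin, data } = JSON.parse(raw)
  const handler = handlers[cmd]
  if (!handler) throw new Error(`unknown command: ${cmd}`)
  return handler(origin, data)
}

const seen = []
const handlers = {
  offer: (origin, offer) => seen.push(['offer', origin]),
  answer: (origin, answer) => seen.push(['answer', origin]),
  ice: (origin, candidate) => seen.push(['ice', origin])
}

dispatch(makeMessage('offer', 'peer-1', { type: 'offer', sdp: '...' }), handlers)
dispatch(makeMessage('ice', 'peer-1', { candidate: '...' }), handlers)
```

Sending an offer, an answer, or an ICE candidate is then just a matter of serializing the right envelope; everything beyond that is left to the application.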

Below we find an example using Signal Fire Peer, a wrapper around the native RTCPeerConnection which makes life a little easier while still remaining low-level.

import Connect, { ConnectInit, IncomingOfferEvent, IncomingAnswerEvent, IncomingICECandidateEvent } from './index'
import Peer, { OfferEvent, AnswerEvent, ICECandidateEvent } from '@signal-fire/peer'

// Initial configuration
const init: ConnectInit = {
  reconnectOnClose: false,
  reconnectOnError: true,
  reconnectInterval: 2000,
  reconnectAttempts: 2,
  urlTransform: previousUrl => previousUrl
}

// Create a new Connect instance
const connect = new Connect(init)

// Connect to the signaling server
await connect.connect('wss://')

// Create a new peer
const target = '<target id>'
const connection = new RTCPeerConnection()
const peer = new Peer(connection)

peer.addEventListener('offer', ({ detail: offer }: OfferEvent) => {
  // send the offer to the remote peer through the signaling server
  connect.sendOffer(target, offer).catch((err: Error) => { /* ... */ })
})

peer.addEventListener('answer', ({ detail: answer }: AnswerEvent) => {
  // send the answer to the remote peer through the signaling server
  connect.sendAnswer(target, answer).catch((err: Error) => { /* ... */ })
})

peer.addEventListener('icecandidate', ({ candidate }: ICECandidateEvent) => {
  if (candidate) {
    // send the ICE candidate to the remote peer through the signaling server
    connect.sendICECandidate(target, candidate).catch((err: Error) => { /* ... */ })
  }
})

// Listen for incoming offers
connect.addEventListener('offer', ({ detail: { origin, offer } }: IncomingOfferEvent) => {
  if (origin === target) {
    // handle the incoming offer, e.g. by passing it to the peer
  }
})

// Listen for incoming answers
connect.addEventListener('answer', ({ detail: { origin, answer } }: IncomingAnswerEvent) => {
  if (origin === target) {
    // handle the incoming answer
  }
})

// Listen for incoming ICE candidates
connect.addEventListener('ice', ({ detail: { origin, candidate } }: IncomingICECandidateEvent) => {
  if (origin === target) {
    // add the incoming ICE candidate to the connection
  }
})

As you can see, only the bare minimum is being handled here. Connect provides this low-level interface to a Signal Fire Server instance.

Talking to Eve

I have been playing Eve Online on and off since 2007. One of the features I liked most was the ability to interact with the world of Eve Online through their API. In the beginning, working with their API was painful for me, as documentation was lacking. More recently they switched over to a more modern API, the Eve Swagger Interface (ESI). Using ESI an application or web site can interact with Eve Online in various ways.

ESI allowed me to build Ageira Trade, a web site where characters can buy and sell in-game resources from and to me. The web site uses ESI to fetch in-game contracts and compare them against quotes submitted through the web site. The matching process happens automatically, which allows me to see which contracts are valid and accept them without having to manually check each one. I will write a blog post about Ageira Trade in the near future.


A while ago I developed a small node.js module for working with Eve Online Single Sign-On (SSO). By using SSO, developers can authenticate Eve Online characters and communicate with the API on their behalf. Eve uses scopes to implement this: scopes are requested by the developer, and the user can decide whether or not to grant them.

My first module, which handles SSO exclusively, is eve-sso. Using eve-sso developers can easily use SSO to authenticate characters and request scopes. I am using this module myself in production over at Ageira Trade. The module takes away the hassles of dealing with redirect URLs and grant requests. It supports access tokens and refresh tokens.

I built eve-esi on top of eve-sso. It is a more complete module, including account and character management, and aims to make working with ESI less painful. However, at the time of writing, the module could use an overhaul, which I intend to do in the near future. I made a wrong assumption when first building it, resulting in each character having a separate account without any way to link characters together. I intend to rectify this in the next major release.


Recently I started doing front-end development again, as you can read here. I needed a way to access the Eve Swagger Interface (ESI) from the browser. Luckily, I only needed to access endpoints which do not require authentication. This greatly simplified the task of writing a small module for this purpose.

The result is esi-browser, a simple browser module which caches results in LocalStorage and respects ESI’s Expires and ETag headers. I am using it as part of Ageira Trade. The small module allows developers to easily access, for example, info about in-game items. This removes the need to load the information on the server instead.

Next steps

My Ageira Trade project is nearly complete; some functionality in the administration panel is still painfully absent, but the customer-facing portion, as well as the back-end, works like a charm.

Be sure to check out the modules if you’re interested in developing against ESI. The source code for these modules, as well as Ageira Trade, is available on GitHub under the MIT license. Happy coding!

My first full-stack deployment in a decade

It has been over a decade since I last developed a full-stack application. Back then I wrote bad PHP and absolutely murdered the server with grotesque SQL queries. A lot has changed in the intervening decade, including which skills are required for full-stack development.

Today I launched Ageira Trade, a website where fellow Eve Online players can sell me their ore or buy minerals. The website makes use of the API of Eve Online to authenticate characters and read in-game contracts. I started with the API and made my way to the front-end, an area I have little experience with. The result is a React app which appears to be performant, although I am sure there are many optimizations to make. Mostly the front-end was a learning process for me, building skills I intend to use in the future.

Ageira Trade offered me the chance to experience the full development cycle, including Ubuntu administration. I have learned valuable lessons and gained new experience with front-end development. It has definitely been worth the (metaphorical) headaches.

Thanks for reading, hope to see you soon!

WebRTC Signaling with Signal-Fire

WebRTC is a technology which allows individual peers to talk directly to each other. This requires a signaling server.

A WebRTC signaling server communicates between peers to set up peer-to-peer audio/video and/or data channels. This allows your clients to communicate directly with each other.

Years ago I developed signal-fire, a WebRTC signaling server built for node.js. There also was a browser client available which greatly reduced the burden of setting up peer connections. Lack of maintenance led to the module’s eventual demise and I recently officially retired it.

Luckily I had some inspiration for the new and improved version, and I got to work. The result was the Signal-Fire ecosystem, starting with Signal-Fire Server, a server that does exactly the same as its predecessor did, but better!

The Server

Signal-Fire Server is based on my other quite recent module Luce. Luce is a versatile WebSocket framework for node.js. An excellent pairing for my new project.

Command-Line Interface (CLI)

If you want to get started with Signal-Fire Server without too much hassle, and you’re content with the basic features (for now), you can use the CLI to start and manage Server workers.

Install the CLI globally:

> npm i -g @signal-fire/cli

To start a worker on port 3003:

> signal-fire start -p 3003

Starting the Server

The Server can be installed through npm:

> npm i @signal-fire/server

To manage client IDs the Server requires a registry. In the example below we use LocalRegistry, an in-memory store.

import { Server } from 'http'

import createApp from './index'
import { LocalRegistry } from '@lucets/registry'

const registry = new LocalRegistry()
const app = createApp(registry)
const server = new Server()

server.on('upgrade', app.onUpgrade())
server.listen(3003, () => {
  console.log('Server listening on port 3003')
})

Congratulations, you now have a basic server running!

The Client

Signal-Fire Client is the replacement for signal-fire-client, which has also been deprecated. The Client, too, is new and improved; it is designed for the browser and uses the native EventTarget.

The Client is meant to be used with browserify.

Install the client through npm:

> npm i @signal-fire/client

Connecting to the Server is exceedingly simple:

import connect from '@signal-fire/client'

const client = await connect('ws://localhost/socket')


Sessions are requests and responses for setting up the peer connection. One peer creates a session, which its target can either accept or deny.

This example shows how to start a session:

import connect, { PeerConnection } from '@signal-fire/client'

async function run () {
  const client = await connect('ws://localhost:3003/socket')
  const session = await client.createSession('<target id>')

  session.addEventListener('accepted', async (ev: CustomEvent<PeerConnection>) => {
    console.log('Session accepted!')

    const connection = ev.detail
    const stream = await navigator.mediaDevices.getUserMedia({
      video: true,
      audio: true
    })

    stream.getTracks().forEach(track => connection.addTrack(track, stream))
  })

  session.addEventListener('rejected', () => {
    console.log('Session rejected')
  })

  session.addEventListener('timed-out', () => {
    console.log('Session timed out')
  })
}

This example shows how to accept a session:

import connect, { IncomingSession } from '@signal-fire/client'

async function run () {
  const client = await connect('ws://localhost:3003/socket')

  client.addEventListener('session', async (ev: CustomEvent<IncomingSession>) => {
    const session = ev.detail
    const connection = await session.accept()
  })
}

The Signal-Fire Server and Client are projects I intend to keep maintaining and using myself. If you’ve checked out either and found a bug, please open an issue on GitHub, or better yet, a pull request.

WebRTC signaling with Signal-Fire

In 2016 I wrote signal-fire, a WebRTC signaling server built for node.js and a client built for the browser. I had not maintained the modules since then, which unsurprisingly resulted in them no longer working.

So recently I took it upon myself to start the projects from scratch. I wrote a capable and extensible WebRTC signaling server for node.js and accompanying client module for the browser. Together they form a strong first start towards using WebRTC in any framework.

Early versions of both are already available.

The Server

The Signal-Fire Server is the main component of the ecosystem. It provides a flexible Luce application. Luce is a versatile WebSocket framework which uses asynchronous hooks to extend functionality. I developed Luce as the spiritual successor to my now deprecated module Illustriws.

At its core the Server provides each client with a unique ID, which can then be used to process the signaling necessary to set up a WebRTC peer connection. The protocol is JSON-based and simple to work with. Methods of exchanging IDs fall outside the scope of the Server, although the versatility of Luce allows many possible strategies for creating and storing IDs.

The Command-Line Interface (CLI)

To make using Signal-Fire Server as easy as possible, I have developed a command-line interface (CLI). Using the interface one can start multiple app workers and manage their lifecycle. The interface is currently a work in progress, as are all Signal-Fire modules.

The Client

The Signal-Fire Client works in combination with the Server to provide an easy to use and (almost) complete WebRTC solution. The Client abstracts away the hassle of communicating with the Server, negotiating ICE candidates, and setting up peer connections and data channels.

The Client is designed to be used in the browser. The spec has somewhat stabilized since 2016, so it is my hope the new Signal-Fire modules will be a little more future-proof.

The Future

I would like to continue development of both the Luce and Signal-Fire ecosystems. Unfortunately I lack some basic skills, like unit testing and CI. I plan to rectify the situation and refactor where necessary to get reasonable test coverage.

I intend to develop a product which includes both ecosystems as a fundamental part of its architecture. This should help me get an idea of what is actually working and important, and focus development accordingly.

It’s my hope both ecosystems will see some usage. I have deprecated a couple of modules recently and resurrected some others (like Wormhole, my IPC module). The result has been Luce and Signal-Fire. I am curious to see if they will see any use.

Back From Beyond

This blog has not been updated in a long while. So I thought it would be time to do so. This is the grand reopening of the Art of Coding blog. Welcome, grab yourself a piece of cake, and enjoy. This post will contain a summary of what I’ve been working on recently.

I have been coding Luce, the spiritual successor to Illustriws and signal-fire (both of which I have officially deprecated). Luce is a versatile WebSocket server framework built for node.js. Luce uses asynchronous hooks analogous to middleware functions in, for example, Koa. I have created the beginning of an ecosystem which I hope others can use as well.

I have also resurrected signal-fire, the WebRTC Signaling Server for node.js. I have rewritten the server and client from the ground up. Both are still works-in-progress. The WebRTC signaling server and client can be used to establish peer connections between individual peers, for the exchange of video and audio, or other data. Signal-Fire helps ease the pain of implementing them directly.

In order to track peers, I have made a Registry interface which others can extend to implement client registries with multiple back-ends. This way you can scale your messaging apps with ease. Included is an in-memory registry, which can be used as a reference implementation. I have also made a Redis registry, which is currently a work-in-progress.
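A registry back-end in the spirit of that idea can be sketched as follows. The method names are illustrative, not the actual Registry interface:

```javascript
// An in-memory registry sketch; methods are async so the same
// interface could be backed by Redis or another remote store
class MemoryRegistry {
  constructor () {
    this.peers = new Map()
  }

  async add (id, info) {
    this.peers.set(id, info)
  }

  async get (id) {
    return this.peers.get(id)
  }

  async has (id) {
    return this.peers.has(id)
  }

  async remove (id) {
    return this.peers.delete(id)
  }
}

async function demo () {
  const registry = new MemoryRegistry()
  await registry.add('peer-1', { room: 'lobby' })
  const found = await registry.has('peer-1')
  await registry.remove('peer-1')
  const gone = await registry.has('peer-1')
  return { found, gone }
}
```

Because every method returns a promise, swapping the Map for a Redis client changes the implementation but not the callers.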

I have done some work with Redis Streams, and as such developed redis-streams-manager, a streams manager built around the EventEmitter interface. Stream entries are emitted by stream name.

My Inter-Process Communication (IPC) module Wormhole had aged a little, so I have rewritten it in TypeScript. Now it’s future-proof and ready to be used (again).

That’s it for today. Thanks for coming, hope to see you again!

Enter the Wormhole; IPC goodness

This post and the example in it have been updated to match version 1.x.x.

Node’s child_process module allows you to spawn and fork new processes, optionally with a built-in Inter-Process Communication (IPC) channel. This is an easy way to communicate with the child process, and valuable on its own in many situations. But it’s not very developer-friendly.

Wait, scratch that, it’s very developer-friendly. It’s just not… easily and repeatably usable. You will need to define your own JSON-based protocol, which can become tedious if you need to do this often. Addendum: it’s not strictly necessary to define your own protocol, as primitive values can be sent too, albeit without metadata.

So, to stay in keeping with DRY and KISS I decided to make a little module that does most of the heavy lifting for me. Note that there are probably dozens of modules out there that do what I did and probably better, but that doesn’t take the fun out of building it. It’s simple and it works, so I’m sure there are applications.

I wanted a couple of things:

  • A way to notify the other end of the link of events that have happened
  • A way to call commands on the remote end and receive a result back (RPC-like behavior)

To satisfy these requirements I built wormhole. It’s designed to work with Node.JS’s child_process and process, and provides what I was looking for.

Installing it is super simple:

npm i @art-of-coding/wormhole --save

How do you use it, you ask? It’s fairly simple – just fork a child process that uses wormhole, and you can send events and call commands!

In the example below we fork a child process, and both processes use wormhole. This allows them to send events and call remote commands. All features are supported in either direction (it’s bidirectional).

The master process:

const childProcess = require('child_process')
const Wormhole = require('@art-of-coding/wormhole')

const child = childProcess.fork('./my-child.js')
const wormhole = new Wormhole(child)

// Register a `startup` event handler'startup', () => {
  console.log('received startup event!')
})

// Register an `add` command
wormhole.define('add', function (a, b) {
  return a + b
})

// Send the `quit` event to the child
setTimeout(() => wormhole.event('quit'), 5000)

The child process:

const Wormhole = require('@art-of-coding/wormhole')

// Without the `channel` argument, `process` is selected by default
const wormhole = new Wormhole()

// Register a `quit` event handler'quit', () => process.exit(0))

// Send a `startup` event to the master
wormhole.event('startup')

// Call a remote command
wormhole.command('add', 5, 6).then(result => {
  console.log(`5 + 6 = ${result}`)
})

As you can see for yourself, using it could not be easier!

You can find wormhole on GitHub or npm.

A simple procedure caller

Sometimes you just need something simple. Something that’s light-weight though capable, and does what you want – nothing more. That’s what I did with procedure-caller, a simple Node.JS module for calling, you guessed it, procedures. To be clear: in this context, a ‘procedure’ is nothing more than a function that can be called repeatedly.

Installing it using npm is child’s play:

npm i @art-of-coding/procedure-caller --save

Now that it’s installed we’ll dive right into an example:

const ProcedureCaller = require('@art-of-coding/procedure-caller')

// Create a new instance
const pc = new ProcedureCaller()

// Define a procedure named 'add', which adds two numbers
pc.define('add', function (a, b) {
  if (isNaN(a) || isNaN(b)) {
    throw new TypeError('arguments must be numbers')
  }

  return a + b
})

// Now call the procedure
const result ='add', 5, 6)

// Display the result
console.log(`5 + 6 = ${result}`)

As you can see, it’s easy to call a procedure and get the result. But what if we’re using asynchronous methods, like Promises or async/await? That’s covered too!

const ProcedureCaller = require('@art-of-coding/procedure-caller')
// We're using gh-got to talk to the GitHub API
const ghGot = require('gh-got')

const pc = new ProcedureCaller()

// Define an async procedure
pc.define('repo', async function (user, name) {
  const response = await ghGot(`repos/${user}/${name}`)
  return response.body
})

// Call the async procedure'repo', 'Art-of-Coding', 'procedure-caller').then(result => {
  console.log(`Repo description: ${result.description}`)
})

Like I said in the opening of this post, my goal was to make something that was exceedingly simple to use. I believe I have done so.

You can view the module on npm and GitHub.