
Using Nats.io as primary data storage

At Sandstorm, we recently built a dashboard showing the quality of our projects. So far, it shows the latest Google Lighthouse results. We run Google Lighthouse in our continuous integration and continuous deployment (CI/CD) pipeline, and we plan to include metrics from other tools as well.

The nice thing about this dashboard: the application itself is stateless. It has no list of projects or metrics and no database. So how does it work?

Our Google Lighthouse Metrics
Screenshot of the Google Lighthouse metrics in our new Dashboard

Recently we introduced a Nats.io high availability cluster into our infrastructure. Oversimplified, Nats.io is a message broker: applications can send messages to it or receive messages from it and no longer need to communicate with each other directly.

Communication over Nats.io is easier to configure than individual connections, easier to work with, and easier to extend. In particular, you can add applications to the listening side and to the writing side without touching the existing ones. Among other things, we use Nats.io to send metrics from our CI/CD jobs to the dashboard.
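For illustration, here is a minimal sketch of how a CI/CD job could build the subject and payload for such a metric message. The subject layout matches our stream configuration; the helper names and the payload shape are hypothetical, not our actual pipeline code.

```typescript
// Hypothetical sketch: build the NATS subject and JSON body for a metric.
// Subject layout: my-prefix.my-stream.<project>.<category>

interface MetricsPayload {
  score: number; // e.g. a Lighthouse score between 0 and 1
}

function metricSubject(project: string, category: string): string {
  return `my-prefix.my-stream.${project}.${category}`;
}

function encodeMetric(payload: MetricsPayload): Uint8Array {
  // NATS message bodies are raw bytes; we ship JSON.
  return new TextEncoder().encode(JSON.stringify(payload));
}

const subject = metricSubject("sandstorm-de", "google-lighthouse");
const body = encodeMetric({ score: 0.97 });
console.log(subject); // my-prefix.my-stream.sandstorm-de.google-lighthouse

// In the pipeline, this would then be published to the JetStream, e.g.:
//   await jetstream.publish(subject, body);
```

Because each project/category pair gets its own subject, the stream can later apply per-subject retention limits to exactly these messages.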

Nats.io allows you to record and replay messages. This feature is called JetStream. In our case, the metrics are messages on a JetStream: those messages are replayed and read to render the dashboard. In the application we use the JetStream just like a database. It is not (yet) live-updating. Another nice thing about JetStream: it allows you to configure a data retention policy. In our case we store five versions of the same metric for at most 30 days.

# we publish metrics on my-prefix.my-stream.my-project.my-category
# eg my-prefix.my-stream.sandstorm-de.google-lighthouse
➜ nats stream add quality-dashboard
? Subjects my-prefix.my-stream.*.*
? Storage file
? Replication 1
? Retention Policy Limits
? Discard Policy Old
? Stream Messages Limit -1
? Per Subject Messages Limit 5
? Total Stream Size -1
? Message TTL 30d
? Max Message Size -1
? Duplicate tracking time window 2m0s
? Allow message Roll-ups No
? Allow message deletion No
? Allow purging subjects or the entire stream Yes
Stream quality-dashboard was created
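To make the retention limits concrete, here is a small TypeScript sketch that simulates the observable behaviour of the configuration above (Per Subject Messages Limit 5, Message TTL 30d, Discard Policy Old). This is an illustration of the policy, not how NATS implements retention internally.

```typescript
// Illustration only: simulate the per-subject limits from the stream config.

interface StoredMessage {
  subject: string;
  millis: number; // publish time in ms since epoch
}

const MAX_PER_SUBJECT = 5; // Per Subject Messages Limit 5
const TTL_MILLIS = 30 * 24 * 60 * 60 * 1000; // Message TTL 30d

function applyRetention(messages: StoredMessage[], now: number): StoredMessage[] {
  // Drop messages older than the TTL.
  const fresh = messages.filter((m) => now - m.millis < TTL_MILLIS);
  // Group by subject.
  const bySubject = new Map<string, StoredMessage[]>();
  for (const m of fresh) {
    const list = bySubject.get(m.subject) ?? [];
    list.push(m);
    bySubject.set(m.subject, list);
  }
  // Keep only the newest MAX_PER_SUBJECT per subject (Discard Policy Old).
  const kept: StoredMessage[] = [];
  for (const list of bySubject.values()) {
    list.sort((a, b) => a.millis - b.millis);
    kept.push(...list.slice(-MAX_PER_SUBJECT));
  }
  return kept;
}
```

So for each project/category subject, the dashboard always sees at most the five most recent metric messages from the last 30 days.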

Since adding a new application to the consumer side of a Nats.io JetStream is very easy, local development is very easy as well. You start a local version of the app, read the events from the very same JetStream, and can work on live data locally.

The dashboard itself becomes extremely simple: no user management, no receiving or storing of data, only fetching events. It consists of 180 lines of TypeScript and 90 lines of markup.

Talking about code, the implementation is based on Deno and the corresponding JetStream client:

import * as nats from "https://deno.land/x/nats@v1.12.1/src/mod.ts";

…

const conn = await nats.connect({
  servers: natsUrl,
  authenticator,
  inboxPrefix,
});
const jetstream = conn.jetstream();
const jetstreamManager = await conn.jetstreamManager();
const streamInfo = await jetstreamManager.streams.info(streamName);
const numberOfEvents = streamInfo.state.messages;
if (numberOfEvents > 0) {
  const opts = nats.consumerOpts();
  opts.bindStream(streamName);
  opts.deliverTo(nats.createInbox(inboxPrefix));
  opts.deliverAll();
  opts.ackAll();
  opts.maxMessages(numberOfEvents);
  const eventStream = await jetstream.subscribe(">", opts);
  const messages: MetricMessage[] = [];
  let eventCount = numberOfEvents;
  const startTime = nowMillis();
  while (eventCount > 0 && nowMillis() - startTime < 3_000) {
    /**
     * This for loop might terminate before we have received all events.
     * The events stored in the JetStream are re-published on a temporary inbox topic.
     * It seems that sometimes, one out of 20 times, this script reads the inbox empty
     * before all events have been published.
     * In this case we have to repeat the for loop.
     */
    for await (const event of eventStream) {
      eventCount--;
      const subject = event.subject.split(".");
      const project = idToLabel(subject[subject.length - 2]);
      const category = idToLabel(subject[subject.length - 1]);
      try {
        const body = codec.decode(event.data);
        const payload: MetricsPayload = JSON.parse(body);
        const creationMillis = nats.millis(event.info.timestampNanos);
        messages.push({
          project,
          category,
          creationMillis,
          ...payload,
        });
      } catch (e) {
        console.error(…);
      }
    }
  }
  return messages;
} else {
  return [];
}
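The subject handling inside the loop can be tried in isolation. `idToLabel` is not shown in the snippet above; the version here is a hypothetical stand-in for what such a helper might do (turning an id like google-lighthouse into a readable label):

```typescript
// Hypothetical stand-in for the idToLabel helper used above:
function idToLabel(id: string): string {
  return id
    .split("-")
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join(" ");
}

// Extracting project and category from a subject, as in the loop above.
function parseSubject(fullSubject: string): { project: string; category: string } {
  const subject = fullSubject.split(".");
  return {
    project: idToLabel(subject[subject.length - 2]),
    category: idToLabel(subject[subject.length - 1]),
  };
}

console.log(idToLabel("google-lighthouse")); // Google Lighthouse
```

Taking the last two subject tokens (instead of fixed positions) keeps the parsing independent of how long the stream prefix is.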

As always, you can use the code snippets as you like; they are licensed under MIT. If you have comments or feedback, feel free to contact us. Thanks for reading!