Aircraft Transponder (ADS-B) data template and Aurora MCP Analytics Engineer

Published Wed, April 2, 2025 ∙ Educational, Data Products, AI, Templates ∙ by Johanan Ottensooser

A few weeks ago, I was at the ClickHouse meetup in Seattle, and I saw Alexei's incredible demo of the cool stuff you could do with aircraft transponder data loaded up into ClickHouse.

I wanted to replicate this, so with Moose and some vibe-coding I got a data engineering project up and running, sending a subset of that data (all military plane traffic) to my local ClickHouse: you can see the adventure yourself!

Since I had a Moose project that would get all this data onto my computer, there was no reason not to make it a template and release it to the world, so I did! https://github.com/514-labs/moose/tree/main/templates/ads-b

Here's how you get it running…

Requirements

Nice to haves

These will be required if you want to interrogate this data with our MCP tools:

Install Moose / Aurora

bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose,aurora

This will install Moose (our open source developer platform for data engineering) and Aurora (our AI data engineering product).

🔑 It will ask you for your Anthropic API key. Again, if you don't have one, here's the setup guide.

Create a new project using the ADS-B template configured with Claude Desktop

aurora init aircraft ads-b --mcp claude-desktop

This uses Aurora to initialize a new project called "aircraft" from the "ads-b" template, while configuring Claude Desktop with the Aurora MCP tools pointed at this project.

Then, you'll need to run a few more commands to get things ready to go:

cd aircraft
npm install

This will install the project's dependencies.

☸️ Make sure Docker Desktop is running before the next step!

moose dev

This will run the Moose local dev server, spinning up all your local data infrastructure including ClickHouse, Redpanda, Temporal and our Rust ingest servers.

Then, open a new terminal, and navigate to the same directory.

cd path/to/your/project

You'll know you are in the correct directory if it contains the moose-config.toml file.

moose workflow run military_aircraft_tracking

This will run the scripts in the named directory within the /scripts/ folder; here, that's the military_aircraft_tracking directory.

This script grabs data from adsb.lol's military aircraft tracking API and conforms certain fields to the ingestion data model (for example, the sneaky barometric altitude field, which gives you either the height as an integer or the string "ground"). The data is then ingested, enriched by our streaming function, which adds an "ordinate" derived from the latitude and longitude (useful as an index), and deposited into the ClickHouse table corresponding to the OLAP data model.
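If you're curious what that conforming step looks like in practice, here's a minimal sketch, assuming adsb.lol-style field names (hex, t, alt_baro, lat, lon). The interfaces and the conform function below are illustrative assumptions, not the template's actual code.

// Illustrative sketch only: field and type names are assumptions based on
// adsb.lol's JSON, not the template's actual ingest code.
interface RawAircraft {
  hex: string;              // ICAO hex identifier
  t?: string;               // aircraft type
  alt_baro?: number | "ground"; // integer feet, or the literal string "ground"
  lat?: number;
  lon?: number;
}

interface ConformedAircraft {
  hex: string;
  aircraftType: string;
  altBaroFt: number;        // 0 when the aircraft reports "ground"
  onGround: boolean;
  lat: number;
  lon: number;
}

function conform(raw: RawAircraft): ConformedAircraft | null {
  // Drop records without a position; they can't be enriched or indexed.
  if (raw.lat === undefined || raw.lon === undefined) return null;
  const onGround = raw.alt_baro === "ground";
  return {
    hex: raw.hex,
    aircraftType: raw.t ?? "unknown",
    altBaroFt: typeof raw.alt_baro === "number" ? raw.alt_baro : 0,
    onGround,
    lat: raw.lat,
    lon: raw.lon,
  };
}

However the template actually spells it, the idea is the same: collapse the number-or-"ground" union into a plain number (plus a flag) before the record reaches the OLAP table.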

If you go back to your original terminal running the Moose dev server, you'll see hundreds of incoming datapoints a second.

Explore the data with Claude Desktop

Now's an interesting time to go to Claude Desktop and explore the data we've ingested.

I'd start by asking an exploratory question like

tell me about the data in my clickhouse tables

or

tell me about the flow of data in my Moose project

Claude will use Aurora's MCP tools to answer these questions.

Then, we can move on to some more exciting analytics questions:

can you create a pie chart of the types of aircraft in the air right now, with reference to the data we've landed in clickhouse?

or

Can you create a visualization of aircraft type against altitude

The options are endless.
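Under the hood, a question like the pie-chart one boils down to a simple aggregation against your local ClickHouse. Here's a rough sketch using the official @clickhouse/client package; the connection settings, table name, and column names are assumptions you'd swap for whatever your Moose project actually created (check moose-config.toml and your data models).

import { createClient } from "@clickhouse/client";

// Assumed local connection details and naming; not guaranteed to match
// the infrastructure Moose spins up for you.
const client = createClient({
  url: "http://localhost:8123",
  username: "default",
  password: "",
});

async function aircraftTypeBreakdown() {
  const result = await client.query({
    query: `
      SELECT aircraft_type, count() AS sightings
      FROM military_aircraft
      GROUP BY aircraft_type
      ORDER BY sightings DESC
    `,
    format: "JSONEachRow",
  });
  return result.json();
}

aircraftTypeBreakdown().then(console.log).catch(console.error);

Claude, via Aurora's MCP tools, is doing roughly this before it draws the chart.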

We find Claude to be a creative place to explore the newly ingested data.

You can ask Claude to productionize the results by creating an egress API that corresponds to the SQL query it suggests, but for that kind of code-forward workflow, we prefer Cursor.

Explore the data and productionize your results with Cursor

So first, let's configure Cursor with the Aurora MCP tool-suite pointed at this project.

Navigate to the project directory and open Cursor

cd path/to/your/project
cursor .

(Or, just open the folder as a project in Cursor).

Then, run the Aurora command to configure the MCP for Cursor:

aurora setup --mcp cursor-project

This will create a .cursor/mcp.json file configured for Aurora's MCP; whenever this Cursor project is open, that MCP server will be started.

If you go to Cursor > Settings > Cursor Settings > MCP, you'll see the server.

Click enable and refresh, and you should be ready to go!

Now, try asking it a question that requires code generation:

could you create an egress api that, for a given aircraft type input, returns the altitude and longitude of each instance of that aircraft?

This will require the MCP host (Cursor) to use Aurora's tools to:

  1. understand the form of the data by inspecting the data models
  2. understand the form of the data by inspecting the database
  3. test SQL queries for achieving this goal
  4. create an egress api to productionize this
  5. test that egress api
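To give a feel for what the agent is productionizing, here's a rough sketch of the shape of such an egress endpoint. This is not the code Aurora generates, and it doesn't use Moose's own egress API helpers; it's a plain Node HTTP handler over @clickhouse/client, with assumed connection settings, table, and column names, just to show the parameterized query behind steps 3 through 5.

import { createServer } from "node:http";
import { createClient } from "@clickhouse/client";

// Assumptions: local ClickHouse on its default HTTP port, plus hypothetical
// table and column names. Check your own project for the real values.
const client = createClient({ url: "http://localhost:8123" });

const server = createServer(async (req, res) => {
  try {
    const url = new URL(req.url ?? "/", "http://localhost");
    if (url.pathname !== "/aircraft-positions") {
      res.writeHead(404).end();
      return;
    }
    const aircraftType = url.searchParams.get("type") ?? "";

    // Parameterized query: ClickHouse binds {aircraftType: String} server-side,
    // so user input never gets spliced into the SQL string.
    const result = await client.query({
      query: `
        SELECT altitude, longitude
        FROM military_aircraft
        WHERE aircraft_type = {aircraftType: String}
      `,
      query_params: { aircraftType },
      format: "JSONEachRow",
    });

    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify(await result.json()));
  } catch (err) {
    console.error(err);
    res.writeHead(500).end("query failed");
  }
});

server.listen(4001, () => console.log("sketch egress API listening on :4001"));

The version the agent writes lives inside your Moose project as a proper egress API rather than a standalone server, so it ships with the rest of your infrastructure.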

The possibilities are endless, but this should give you a sense of the kind of work you can do with the Analytics Engineer agent: discover your data, then build scalable APIs off the back of that discovery.
