Aircraft Transponder (ADS-B) Data Template and Aurora MCP Analytics Engineer
*Aurora has been rebranded as Sloan. See the Sloan docs at docs.fiveonefour.com/sloan.*
A few weeks ago, I was at the ClickHouse meetup in Seattle, and I saw Alexey's incredible demo of the cool stuff you can do with aircraft transponder data loaded into ClickHouse.
I wanted to replicate this, so with Moose plus some vibe-coding, I had a data engineering project up and running, sending a subset of that data (all military plane traffic) to my local ClickHouse. You can see the adventure yourself!
Since I had a Moose project that would get all this data onto my computer, there was no reason not to turn it into a template and release it to the world, so I did! https://github.com/514-labs/moose/tree/main/templates/ads-b
Here's how you get it running…
Requirements
Nice to haves
These will be required if you want to interrogate this data with our MCP tools:
- Claude Desktop or Cursor, or both!
- An Anthropic API key (if you don't have one, here's the setup guide)
Install Moose / Aurora
This will install Moose (our open source developer platform for data engineering) and Aurora (our AI data engineering product).
🔑 It will ask you for your Anthropic API key; again, if you don't have one, here's the setup guide.
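The exact install command didn't survive into this page; assuming the standard Fiveonefour installer script, it looks something like this (verify the URL and arguments against the current docs before running):

```bash
# Assumed: Fiveonefour's installer script, installing both tools.
# Check docs.fiveonefour.com for the current command before running.
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose,aurora
```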
Create a new project using the ADS-B template configured with Claude Desktop
This will use Aurora to initialize a new project called "Aircraft" based on the "ADS-B" template, while configuring Claude Desktop with the Aurora MCP tools pointed at that project.
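As a sketch, the init command looks something like the following; the exact subcommand and flag names are my assumption, so check the Aurora (Sloan) docs for the current syntax:

```bash
# Hypothetical invocation: create an "Aircraft" project from the
# ads-b template and wire up Claude Desktop's MCP config.
aurora init Aircraft ads-b --mcp claude-desktop
cd Aircraft
```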
Then, you'll need to run a few more commands to get things ready to go:
This will install the project's dependencies.
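From inside the project directory, that's the usual package-manager install; which one depends on whether the template is TypeScript or Python:

```bash
# From the project root: install dependencies.
npm install                        # TypeScript template
# pip install -r requirements.txt  # or, for a Python template
```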
☸️ Make sure Docker Desktop is running before the next step!
This will run the Moose local dev server, spinning up all your local data infrastructure including ClickHouse, Redpanda, Temporal and our Rust ingest servers.
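The dev-server command is simply (depending on your install, this may be `moose dev` or `npx moose-cli dev`):

```bash
# Start the Moose local dev server from the project root.
# Spins up ClickHouse, Redpanda, Temporal, and the ingest servers.
moose dev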
Then, open a new terminal, and navigate to the same directory.
You'll know you are in the correct directory if the moose-config.toml is in the directory.
This will run the scripts in a named directory in the /scripts/ folder, here, the military_aircraft_tracking directory.
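Assuming Moose's workflow runner, that looks something like this; the exact subcommand may differ, so see the template README:

```bash
# Hypothetical: kick off the military aircraft tracking ingest
# workflow defined in /scripts/military_aircraft_tracking.
moose workflow run military_aircraft_tracking
```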
This script grabs data from adsb.lol's military aircraft tracking API and conforms certain fields to the ingestion data model (for example, the sneaky barometric altitude field, which gives you either the height as an integer or the string "ground"). The data is then ingested, enriched by our streaming function with an "ordinate" derived from the latitude and longitude (useful as an index), and deposited into the ClickHouse table corresponding to the OLAP data model.
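As a sketch, the conforming and enrichment steps might look like the following. The altitude handling mirrors the adsb.lol quirk described above; the Morton-code (Z-order) ordinate is my assumption about what a lat/long-derived index key could look like, and the template may derive its ordinate differently:

```python
from typing import Union


def conform_altitude(alt_baro: Union[int, str]) -> int:
    """Normalize the barometric altitude field: adsb.lol returns either
    the altitude in feet as an integer, or the literal string "ground"
    for aircraft on the tarmac."""
    if alt_baro == "ground":
        return 0
    return int(alt_baro)


def ordinate(lat: float, lon: float, bits: int = 16) -> int:
    """Hypothetical Z-order (Morton) ordinate: interleave the bits of
    quantized latitude and longitude so that nearby aircraft end up
    with nearby integer keys, which sorts well in an OLAP table."""
    # Quantize each coordinate into the range [0, 2**bits).
    y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
    x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)      # even bit positions: longitude
        z |= ((y >> i) & 1) << (2 * i + 1)  # odd bit positions: latitude
    return z
```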
If you go back to your original terminal running the Moose dev server, you'll see hundreds of incoming datapoints a second.
Explore the data with Claude Desktop
Now's an interesting time to go to Claude Desktop and explore the data we've ingested.
I'd start by asking an exploratory question like
tell me about the data in my clickhouse tables
or
tell me about the flow of data in my Moose project
Claude will use Aurora's MCP tools to answer these questions.
Then, we can move on to some more exciting analytics-type questions:
can you create a pie chart of the types of aircraft in the air right now, with reference to the data we've landed in clickhouse?
or
Can you create a visualization of aircraft type against altitude
The options are endless.
We find Claude to be a creative place to explore the newly ingested data.
You can ask Claude to productionize the results by creating an egress API that corresponds to the SQL query it suggests, but for that kind of code-forward workflow, we prefer Cursor.
Explore the data and productionize your results with Cursor
First, let's configure Cursor with the Aurora MCP tool suite pointed at this project.
Navigate to the project directory and open Cursor
(Or, just open the folder as a project in Cursor).
Then, run the Aurora command to configure the MCP for Cursor.
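The command here is my assumption; check the Aurora (Sloan) docs for the exact subcommand:

```bash
# Hypothetical: generate the Cursor MCP config for this project.
aurora setup --mcp cursor
```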
This will create a /.cursor/mcp.json file configured for Aurora's MCP; whenever this Cursor project is open, the MCP server will be started.
If you go to Cursor > Settings > Cursor Settings > MCP, you'll see the server.
Click enable and refresh, and you should be ready to go!
Now, try asking it a question that requires code generation:
could you create an egress api that, for a given aircraft type input, returns the altitude, and longitude of each instance of that aircraft?
This will require the host to use Aurora tools to:
- understand the form of the data by inspecting the data models
- understand the form of the data by inspecting the database
- test SQL queries for achieving this goal
- create an egress api to productionize this
- test that egress api
The possibilities are endless, but this should spark the kind of work you can do with the Analytics Engineer agent: discover your data, then build scalable APIs off the back of that discovery.
Interested in learning more?
Sign up for our newsletter — we only send one when we have something actually worth saying.