Tiny Wren
This advert inspired me, as I often ask myself "Where is that plane going?" while generally pottering about and looking out of the window. Perhaps I'm a five-year-old at heart.
So I thought about building something that could answer that question for me. I suspect the neighbours might not like me building something of that scale on the bungalow, so I opted for a simple ambient screen / display in the office instead.
Here is what it's showing.
For the nearest plane that is less than 5 miles away.
Then, when there are no local planes, it shows some temperature stuff.
It's been running for over a year now, and it's rather stable!
Fun to learn from. I wanted to play around with SDR, Raspberry Pis, microcontrollers, smart homes. Perhaps 3D printing?
Offline. It would be simple to throw a load of cash at an aviation API and just display it. That, however, would be costly, and wouldn't offer the up-to-the-second info I can get by doing this stuff locally.
Unobtrusive. I don't want bright screens everywhere. If I had displays, I wanted them to fit into my general smart home, so dimmable via a smart dimmer. I want them to be liveable with, not just a random bunch of wires on a desk.
Extendable. I wanted to be able to add any number of devices which could show what is flying overhead. Think of a small screen in the conservatory, one in the living room. Perhaps a local website. More in the future.
Easy to build. I'm not a radio or aviation expert, nor do I know much about building electronics. Nor do I really want to go deep into those topics. I have a light interest in all three, but you aren't going to get me talking on my self-built ham radio set anytime soon.
A play area. My day job means I don't get to write code anymore, but I do love it so. Part of this project is about giving me space to scratch that itch.
We have a little Raspberry Pi with a USB software-defined radio that listens to plane transponders. It enriches this transponder data from other sources, hosts a web interface (above), and also broadcasts the nearest plane to an MQTT server.
I have this installed in the garage, with the aerial in the loft. It gives OK range; if you wanted, you could go to town with mounting this outside, high up and away from obstructions. I'm not too bothered about chasing perfect reception/coverage here.
Home Assistant is great for many things. In this case I happened to already have it running a load of things in the smart home and, importantly, a Mosquitto MQTT server instance. So rather than having multiple message queues on the network, it made sense to just open it up to other clients on the network to host the nearest-plane messages.
I got one to mess around with, then added another, so it's an odd setup. Now they have bigger ones, so if I was doing this again I would grab one of those. The code is ropey, but it's using MicroPython and listening to the local plane and smart lamp topics. They also listen to a few climate temperature topics (inside, outside, conservatory) populated from Home Assistant integrations. I also hooked it up to the doorbell topic to make a noise when it fires, but that is slightly redundant now.
This sits in the living room for now. I'm working on my 3D CAD skills to print off a nicer enclosure for it. But this crappy box does for now.
So really the job is to
Thankfully, loads of folks have already done the hard work of turning a tiny computer and a cheap USB FM/TV tuner into a device that can listen to plane transponder transmissions. Indeed, thousands of folks have them set up around the world, and they populate services like flightradar24.com and flightaware.com. FlightRadar24 have a great tutorial over here, which is basically what I followed.
So how do we script this up to be used in our little project? Well, part of the FlightRadar24 installation is the setup of dump1090, an open-source library that listens to and decodes plane transponder broadcasts from a software-defined radio device. I actually use the version which powers the FlightAware feeder, dump1090-fa, which is slightly different; FlightAware installation details are here.
Once installed, you can get a list of the aircraft your device is tracking over here.
http://127.0.0.1/dump1090-fa/data/aircraft.json
Now, there are probably better ways of getting this out of dump1090 than relying on a JSON feed over HTTP. However, when looking into the nitty-gritty of the possible outputs of dump1090, my eyes start to glaze over and I remember I don't particularly want to go into the weeds on this.
So this JSON endpoint has the aircraft, with callsigns, lat/longs, headings, altitude and a whole load of other data, updated every second. Plus JSON over HTTP is easy for a hack like me to play with.
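To give a flavour, here's a minimal Python sketch of polling that endpoint. It assumes the default dump1090-fa URL above and the requests library; nothing clever, it just grabs the JSON and keeps the aircraft that actually have a position and a callsign.

import requests

# Default dump1090-fa feed from the install above
AIRCRAFT_URL = "http://127.0.0.1/dump1090-fa/data/aircraft.json"

def fetch_aircraft():
    data = requests.get(AIRCRAFT_URL, timeout=2).json()
    # Entries can be missing fields while the decoder builds up a picture,
    # so only keep aircraft with a position and a callsign
    return [a for a in data.get("aircraft", [])
            if "lat" in a and "lon" in a and a.get("flight")]

if __name__ == "__main__":
    for aircraft in fetch_aircraft():
        print(aircraft["flight"].strip(), aircraft["lat"], aircraft["lon"])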
The transponders on the planes won't tell you their entire flight plans, but you do get the plane's callsign.
A callsign is the unique identifier used in radio communications between an aircraft and air traffic control, or between aircraft. What is interesting is that in the case of scheduled flights, which are the vast majority of flights near me, the format of these callsigns denotes the company and a flight number. e.g.
BA1383 = British Airways, flight number 1383.
That flight number typically corresponds to a regular route, in this case Manchester to Heathrow. Now these routes don't change much, only when things go wrong or when the company decides to use the flight number for a different route.
So how do we convert a callsign to a route, all locally? Step in the folks over at Virtual Radar Server, who built the following database for their virtual ATC software.
vradarserver/standing-data: Dumps of aviation data from Virtual Radar Server's SDM site.
(Interestingly, this is the same dataset that the adsb.lol API uses, which I only found out when writing this up!)
This is stored as CSVs, but for my needs it's better in a DB, so I have a simple script that downloads this info and puts it into a simple SQLite DB. It's scheduled to update once a week or so.
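The import itself is nothing fancy. Here's a rough sketch of the idea; the real standing-data repo splits routes, airports and operators across lots of CSV files, so the URL and column names below are placeholders rather than the real layout.

import csv, io, sqlite3, urllib.request

# Placeholder URL and columns; the real repo splits this data across many CSVs
ROUTES_CSV_URL = "https://example.com/standing-data/routes.csv"

def refresh_routes(db_path="planes.db"):
    raw = urllib.request.urlopen(ROUTES_CSV_URL).read().decode("utf-8")
    rows = csv.DictReader(io.StringIO(raw))
    con = sqlite3.connect(db_path)
    con.execute("DROP TABLE IF EXISTS routes")
    con.execute("CREATE TABLE routes (callsign TEXT PRIMARY KEY, from_airport TEXT, to_airport TEXT)")
    con.executemany(
        "INSERT INTO routes VALUES (?, ?, ?)",
        ((r["Callsign"], r["FromAirport"], r["ToAirport"]) for r in rows),
    )
    con.commit()
    con.close()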
So I now have my callsign along with the airports the plane has departed from and is arriving at.
I also wanted a little bit of detail about the planes themselves. Think who the operator is, and the aircraft type. Which the folks over at adsbexchange have.
http://downloads.adsbexchange.com/downloads/basic-ac-db.json.gz
As with the flight data, I import the airframe info into a local DB.
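Same idea as the routes import. A sketch, assuming the download is a gzipped file of newline-delimited JSON objects keyed on the aircraft's hex code; I haven't documented the exact field names here, so treat icao, icaotype and ownop as my guesses.

import gzip, json, sqlite3, urllib.request

AC_DB_URL = "http://downloads.adsbexchange.com/downloads/basic-ac-db.json.gz"

def refresh_airframes(db_path="planes.db"):
    raw = gzip.decompress(urllib.request.urlopen(AC_DB_URL).read()).decode("utf-8")
    con = sqlite3.connect(db_path)
    con.execute("DROP TABLE IF EXISTS airframes")
    con.execute("CREATE TABLE airframes (icao TEXT PRIMARY KEY, icaotype TEXT, ownop TEXT)")
    for line in raw.splitlines():
        if not line.strip():
            continue
        ac = json.loads(line)
        # Field names are assumptions, check the file for the real ones
        con.execute("INSERT OR REPLACE INTO airframes VALUES (?, ?, ?)",
                    (ac.get("icao"), ac.get("icaotype"), ac.get("ownop")))
    con.commit()
    con.close()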
Now we need to pull the flight data and the airframe information together, and ping it over to the MQTT server. So I built a simple Python script that does the following.
The first step is pretty simple. Open the dump1090-fa aircraft feed, take the plane's lat/long and my lat/long, calculate the distance, and sort the aircraft array by nearest. Add the distance to the object. Select the nearest.
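Something along these lines; the home coordinates are obviously placeholders, and the haversine maths is the only vaguely interesting bit.

from math import radians, sin, cos, asin, sqrt

HOME_LAT, HOME_LON = 53.0, -2.3  # placeholder home coordinates

def distance_miles(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in statute miles
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

def nearest_aircraft(aircraft):
    if not aircraft:
        return None
    for a in aircraft:
        a["distance"] = round(distance_miles(HOME_LAT, HOME_LON, a["lat"], a["lon"]), 2)
    return min(aircraft, key=lambda a: a["distance"])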
Then, for the nearest plane, simply take the callsign and query the two local copies of the data sources for airframe and flight info in our local DB.
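Continuing the sketch, and reusing the hypothetical routes and airframes tables from the import scripts above:

import sqlite3

def enrich(plane, db_path="planes.db"):
    # Look the nearest plane up in the local route and airframe tables
    con = sqlite3.connect(db_path)
    callsign = (plane.get("flight") or "").strip()
    route = con.execute(
        "SELECT from_airport, to_airport FROM routes WHERE callsign = ?", (callsign,)
    ).fetchone()
    airframe = con.execute(
        "SELECT icaotype, ownop FROM airframes WHERE icao = ?", (plane.get("hex"),)
    ).fetchone()
    con.close()
    if route:
        plane["FromAirport"], plane["ToAirport"] = route
    if airframe:
        plane["icaotype"], plane["OperatorName"] = airframe
    return plane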
Then it's simply a case of pinging it over MQTT, via a library like paho-mqtt.
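Something like this; the topic name is just my own convention, and I'm assuming the broker is the Mosquitto instance running on the Home Assistant box.

import json
import paho.mqtt.publish as publish

def publish_nearest(plane, host="homeassistant.local"):
    # Push the enriched nearest-plane object for the displays to pick up
    publish.single("home/nearestplane", json.dumps(plane), hostname=host)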
The final MQTT message we send to our Home Assistant server looks like this if a plane is within 5 miles, and is updated every few seconds. In this case, a Boeing 737-800 operated by ASL Airlines UK, going from Manchester Airport to Charles de Gaulle. We even get detailed stuff such as roll, speed, and the rather interesting squawk codes.
{
"hex": "407f6e",
"flight": "ABV9AN",
"alt_baro": 13975,
"alt_geom": 14075,
"gs": 332.6,
"ias": 288,
"tas": 354,
"mach": 0.56,
"track": 142.8,
"track_rate": 0.06,
"roll": -0.2,
"mag_heading": 149.8,
"baro_rate": 3520,
"geom_rate": 3488,
"squawk": "6301",
"emergency": "none",
"category": "A3",
"nav_qnh": 1013.6,
"nav_altitude_mcp": 19008,
"nav_altitude_fms": 37008,
"nav_heading": 149.8,
"lat": 53.108329,
"lon": -2.271908,
"nic": 8,
"rc": 186,
"seen_pos": 0.4,
"version": 2,
"nic_baro": 1,
"nac_p": 10,
"nac_v": 2,
"sil": 3,
"sil_type": "perhour",
"gva": 2,
"sda": 2,
"mlat": [],
"tisb": [],
"messages": 5412,
"seen": 0.3,
"rssi": -2.9,
"distance": 4.98,
"inflated": true,
"RouteId": 560806,
"Callsign": "ABV9AN",
"OperatorId": 1256,
"OperatorIcao": "ABV",
"OperatorIata": "",
"OperatorName": "ASL Airlines UK",
"FlightNumber": "9AN",
"FromAirportId": 286,
"FromAirportIcao": "EGCC",
"FromAirportIata": "MAN",
"FromAirportName": "Manchester",
"FromAirportLatitude": 53.3536987304688,
"FromAirportLongitude": -2.27495002746582,
"FromAirportAltitude": 257,
"FromAirportLocation": "Manchester",
"FromAirportCountryId": 191,
"FromAirportCountry": "United Kingdom",
"ToAirportId": 82,
"ToAirportIcao": "LFPG",
"ToAirportIata": "CDG",
"ToAirportName": "Charles de Gaulle",
"ToAirportLatitude": 49.0127983093,
"ToAirportLongitude": 2.54999995232,
"ToAirportAltitude": 392,
"ToAirportLocation": "Paris",
"ToAirportCountryId": 65,
"ToAirportCountry": "France",
"icaotype": "B738"
}
I'm not going to go into super detail on this, because my code here is hacky as hell. The key thing is, though, the MicroPython Galactic Unicorn libraries are really detailed, so it's mostly just a case of editing their examples and using a simple MQTT client. For the smaller OLED display it's similar, but as it's running on a normal Raspbian instance I'm using Python 3 running as a service.
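To give a flavour of the Unicorn side, here's a very rough MicroPython sketch. It assumes Pimoroni's Pico W firmware (with the galactic, picographics and umqtt.simple modules available), that Wi-Fi is already connected elsewhere in the script, and that the broker address and topic name are placeholders for mine.

import json
from umqtt.simple import MQTTClient
from galactic import GalacticUnicorn
from picographics import PicoGraphics, DISPLAY_GALACTIC_UNICORN

gu = GalacticUnicorn()
graphics = PicoGraphics(display=DISPLAY_GALACTIC_UNICORN)
gu.set_brightness(0.5)

def on_message(topic, msg):
    # Show the callsign of the nearest plane
    plane = json.loads(msg)
    graphics.set_pen(graphics.create_pen(0, 0, 0))
    graphics.clear()
    graphics.set_pen(graphics.create_pen(255, 255, 255))
    graphics.text(plane.get("flight", "").strip(), 0, 2, scale=1)
    gu.update(graphics)

client = MQTTClient("unicorn", "homeassistant.local")  # placeholder broker address
client.set_callback(on_message)
client.connect()
client.subscribe(b"home/nearestplane")  # same topic the Pi publishes to

while True:
    client.wait_msg()

The real thing also subscribes to the lamp topics described below, so it can be switched off and dimmed.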
I wanted to be able to treat the displays as smart lamps so I can switch them off with Hue dimmer switches, or via Siri. To do this, I added the following configuration to the Home Assistant config YAML, basically creating lights which, when interacted with, publish messages to specific MQTT topics that the displays can listen to and act upon accordingly.
mqtt:
  - light:
      unique_id: "unicorn_id"
      name: "Unicorn light"
      state_topic: "office/unicorn/status"
      command_topic: "office/unicorn/switch"
      payload_on: "ON"
      payload_off: "OFF"
      brightness_state_topic: 'office/unicorn/brightness'
      brightness_command_topic: 'office/unicorn/brightness/set'
      qos: 0
      optimistic: true
  - light:
      unique_id: "planedisplay_id"
      name: "Plane display"
      state_topic: "livingroom/planedisplay/status"
      command_topic: "livingroom/planedisplay/switch"
      payload_on: "ON"
      payload_off: "OFF"
      brightness_state_topic: 'livingroom/planedisplay/brightness'
      brightness_command_topic: 'livingroom/planedisplay/brightness/set'
      qos: 0
      optimistic: true
I added these to rooms in the Home Assistant interface. This means the Unicorn is now turned off with the same Hue dimmer switch I use to turn off the lights at night.
I have had loads of fun on this one. The Unicorns sit behind me in meetings at work and seem to interest folks. For me, it's a nice little useful project that means I can add to it, tinker and play, whilst it remains genuinely useful as a thing/product just for me.
Future things I will probably do with it.
06/02/2026
My Sony Bravia 49XE9005 has a great picture and the size is just right.
Its software, however, is infuriating. I suffer from two main issues.
Thankfully, rather than spend a small fortune to buy a new TV only to have another set of problems, I've overcome them with Home Assistant!
Below roughly outlines what I did.
This was relatively straightforward — all the details can be found in the Integration guide. Before setting that up, I would highly recommend setting your TV's IP address as static in your router's DHCP settings.
Once set up, Home Assistant should look something like this.
Now, this is the key bit that is easy to miss. Your TV can receive a multitude of remote commands, many more than are on your remote. The best way to find out what these commands are is to use the integration and send a "Test" command to the TV.
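A hedged sketch of such a script; the entity ID is an assumption based on what the Bravia integration created for my TV, so swap in your own.

script:
  sony_tv_test_command:
    sequence:
      - service: remote.send_command
        target:
          entity_id: remote.sony_bravia_tv  # assumed entity name
        data:
          command: "Test"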
If you run that script you should see something like this in your Home Assistant Core logs.
Now you have a list of all the commands to fix your silly TV!
In my list of commands from Home Assistant there is one called "AudioOutput_AudioSystem". Sending this command forces the TV to select my soundbar when it fails to do so from time to time.
To ensure this happens automatically I set up an automation that, after the TV starts up, calls a script which sends that command.
The automation looks like this. The 10-second delay is just to allow the TV to start some services so it doesn't ignore the request.
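Roughly like this; again, the entity IDs are placeholders for mine.

alias: "TV on - force soundbar output"
trigger:
  - platform: state
    entity_id: media_player.sony_bravia_tv  # assumed media player entity
    to: "on"
action:
  - delay: "00:00:10"  # let the TV start its services first
  - service: remote.send_command
    target:
      entity_id: remote.sony_bravia_tv  # assumed remote entity
    data:
      command: "AudioOutput_AudioSystem"
mode: single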
Now this works for me — no more fumbling around the Sony interface to do what it should have done in the first place.
I tend to only watch content from either my Apple TV or a YouView PVR connected over HDMI. For reasons I don't understand, the Apple TV, when started correctly, forces the TV to switch to that HDMI channel. However, the YouView box doesn't.
Switching the source is annoying using the Sony TV remote. To switch from my Apple TV to the YouView box I have to press the source button seven times, then OK — while dealing with input lag due to this TV's woeful performance. The fact they don't have HDMI select buttons but do have a Google Play button sums up how a business focus on profit over decent UX makes TVs terrible.
I digress. As above, I made two scripts that send the HDMI channel commands to the TV.
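Something like the following, assuming the HDMI inputs show up as commands named "Hdmi1" and "Hdmi2" in your TV's command list; check your own logs for the exact names.

script:
  tv_source_apple_tv:
    sequence:
      - service: remote.send_command
        target:
          entity_id: remote.sony_bravia_tv  # assumed remote entity
        data:
          command: "Hdmi1"  # assumed command name
  tv_source_youview:
    sequence:
      - service: remote.send_command
        target:
          entity_id: remote.sony_bravia_tv
        data:
          command: "Hdmi2"  # assumed command name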
So now we have scripts that change the channel. But how to press the buttons?
Obviously I put them in a dashboard as a starting point.
But opening Home Assistant in an app or website is even slower than using the TV remote. So I opted to configure a single Aqara Zigbee button.
Then I mapped the button controls as follows:
This is an admittedly extreme set of workarounds to get around issues I have with this TV. I'm really happy with the results, and naturally give massive thanks to the developers of Home Assistant and the integrations I've used here.
05/10/2025
It’s obvious to say that the world shrank during the Covid lockdowns. For a Product Manager and recovering Engineer / Team Leader, it meant working from home and pottering around the house talking to plants far more than is probably healthy.
An aspect I didn’t expect was how much my technical inspiration had changed. In the before times, I would hunt the web, write loads on Twitter, post on blogs, make little scripts/sites and share them. Heck, I even used to chat on IRC groups. Covid somehow changed that. The world became a scarier place — harder to engage with, harder to read about, harder to interact with. Just… less nice, overall.
I’d also become slightly jaded with web technologies. There are probably a few reasons for this. After years of developing mobile apps, it’s easy to fall out of the loop with current web tech. I’m sure there are still wonderful places online, but I’d lost the joy of building simple websites. I found it baffling how many modern sites had become slower, harder to use, and oddly complex to get back into. Plus, I became more hesitant to run projects on the open web — especially with all the security concerns that come with it. These days, setting up a VM and installing WordPress just to play around with stuff doesn't feel right.
So I became more obsessed with my small little world. The world on my LAN — where I could play with hacky things, learn, and tinker freely. A space to explore without worrying about Covid, or the outside world. My digital potting shed.
This might sound a little desperate. Or dull. Or maybe you’re picturing a bloke slowly retreating from society. Don’t despair though. I actually found the opposite to be true — and strangely, this little dive into my offline world inspired me more than I expected. So much, in fact, that it felt right to write some of it down. Which I guess... brings me here.
So what did this foray into my digital shed result in? A few things I’ll no doubt be posting about soon:
So, in the coming “insert time frame”, I’ll probably post a bit more about all of that.
16/04/2025
My name is Andy, I build Apps and things for a living in the UK. This blog is where I’ll share a few thoughts and topics around tech, links I find interesting, and the occasional side project I create from time to time.
I love wrens. Their lovely round little bodies. Their oddly loud song. The absolute joy of seeing them bob around the garden between flower pots.
Perhaps a Tweet of the Day puts it best:
"[Their song] is the extrovert side of a normally introverted bird."
I’m a bit of an introverted bird myself — with a slightly round body.
I'm not a professional web developer, but I like to dabble. Here were my main requirements when choosing a platform:
I want to control where I host my stuff. I want to own it — and be able to move it around if I choose to.
In the past, I’ve used WordPress or some kind of Ruby-based platform. Frankly, it was a pain to maintain and a security nightmare.
These days, I don't want the hassle. I'm happy to skip more complex features (like comments or search) if it means less maintenance.
I despise slow, confusing, cumbersome websites that take forever to load. I want both the server and the site itself — HTML, JS, CSS — to be lightning fast.
I suck at JavaScript, and I want to keep it that way.
This is a blog, not a fancy web app. I don’t want to fight JS just to render something that HTML and CSS were built to handle.
No WYSIWYG editors, thanks. I deal with enough of those at work in messy internal wikis.
Markdown is simple to use for me.
Here’s what I ended up with:
That's enough for now. I will at some point add some way to contact me. But all in good time, I think.
15/04/2025