Following on from my previous post showing my fancy little rain radar monitor (which itself followed on from my dashboard monitor), I’ve made some further updates. Not out of a desire to add features, but out of necessity: the Bureau of Meteorology has kept tweaking their website, to the consternation of the public, while the cost of the website refresh continues to rise and the quality of service continues to fall. Eventually that broke my simple home project built on open data.
Now, in case anyone is in a similar situation of wanting to use the BoM rain radar PNG images as GIS for a cute little dashboard, I’ll cover what I had to change before getting to the rant.
Rain Radar Via FTP
My previous example made HTTP calls for the rain radar images, but after the BoM’s website refresh I found it would often crash due to various website outages, changes to their CDN, and their WAF suddenly deciding that one request every five minutes was a DoS attack. Of course, none of this would have been an issue if they provided easy and open methods for accessing data.
While I could try using the domain for their “old website”, I have about as much faith in that staying up as I have in the competency of any Big Four consultancy handed a fat government handout. However, the BoM does offer an FTP service with a bunch of products, including access to those rain radar images (you do not need to pay $2266/year for them). This was a method I considered in my first iteration but skipped so I could avoid adding FTP handling, while foolishly assuming a critical government department would have a website that works.
Below you’ll find a little Python class I built to handle pulling the BoM radar images and weather observations from the FTP server:
```python
import numpy as np
from io import BytesIO, StringIO
from PIL import Image
from ftplib import FTP
import xml.etree.ElementTree as ET
from weather import Weather
from perth import Perth

HOSTNAME = "ftp.bom.gov.au"
STATIONNAME = "PERTH METRO"


class BomFtp:
    def __init__(self):
        self.connection = FTP(HOSTNAME)
        self.connection.login()
        self.latest_names = []

    def download_weather(self) -> Weather:
        self.connection.cwd("/anon/gen/fwo")
        sp = StringIO()
        self.connection.retrlines("RETR IDW60920.xml", sp.write)
        root = ET.fromstring(sp.getvalue())
        observation = {}
        observation["name"] = STATIONNAME
        # Attribute predicates in ElementTree XPath need quoted values
        for element in root.findall(
            f"./observations/station[@stn-name='{STATIONNAME}']/period/level/element"
        ):
            observation[element.get("type")] = element.text
        weather = Weather(observation)
        return weather

    def new_name(self, line: str):
        if line.startswith("IDR70A.T") and line.endswith(".png"):
            self.latest_names.insert(0, line)

    def rainfall_image_names(self) -> list[str]:
        self.connection.cwd("/anon/gen/radar")
        self.connection.retrlines("NLST", self.new_name)
        self.latest_names = self.latest_names[0:4]
        self.latest_names.reverse()
        return self.latest_names

    def download_rainfall_image(self, filename: str) -> list:
        bp = BytesIO()
        self.connection.retrbinary(f"RETR /anon/gen/radar/{filename}", bp.write)
        array = np.asarray(Image.open(bp)).copy()
        perth = array[Perth.x1:Perth.x2, Perth.y1:Perth.y2]
        rows = perth.shape[0]
        cols = perth.shape[1]
        low = [[], []]
        high = [[], []]
        for x in range(rows):
            for y in range(cols):
                # Light-rain pixels get their own series; everything
                # non-transparent else counts as heavier rain
                if np.array_equiv(perth[x, y], [127, 254, 255, 255]):
                    low[1].append(rows - x)
                    low[0].append(y)
                elif not np.array_equiv(perth[x, y], [255, 255, 255, 0]):
                    high[1].append(rows - x)
                    high[0].append(y)
        return [low, high]
```
This is pretty similar to the code that I discussed in my previous post and was designed so I could replace the previous HTTP code without having to completely change how the rest fits together. It is purposefully barebones. I don’t handle errors, I don’t ensure FTP disconnection, I hardcode the stuff I want. This is not professional code or ready-to-use FOSS; this is just so I can see if it’s gonna start raining while I head to the kebab shop.
The Weather class is used by the graph to turn observations into data plotted in display boxes at the bottom and the Perth class contains the bounding box (in pixels) of the area I want to slice out of the image (that whole “GIS when it’s not GIS” thing).
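For illustration, a hypothetical stand-in for that Perth class might look like the sketch below. The pixel coordinates are made up (the real values depend on the IDR70A image layout), but the slicing works the same way as in download_rainfall_image above:

```python
import numpy as np

class Perth:
    # Hypothetical bounding box, in pixels, of the area to crop out of
    # the radar PNG. These numbers are illustrative only.
    x1, x2 = 120, 280   # row range of the crop (top to bottom)
    y1, y2 = 150, 360   # column range of the crop (left to right)

# Slice a dummy 512x512 RGBA "radar image" the same way the class does
image = np.zeros((512, 512, 4), dtype=np.uint8)
perth = image[Perth.x1:Perth.x2, Perth.y1:Perth.y2]
print(perth.shape)  # (160, 210, 4)
```

Since the class only carries constants, plain class attributes are enough; there’s no need to instantiate anything.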
Because the FTP directory lists each radar product with the timestamp in the name, I can simply grep for the relevant product, grab four file names, and then put them in reverse order to fit the existing display code. That let me delete a whole bunch of time-handling code that would construct correct filenames, test for image presence, and add incrementing delay counters if a new image wasn’t yet available. We love deleting code.
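As a standalone sketch, the selection logic boils down to a filter, a sort, and a slice. The filenames below are made up for illustration (only the IDR70A.T prefix and .png suffix matter); since the timestamp is embedded in the name, lexical order is chronological:

```python
def latest_four(listing: list[str]) -> list[str]:
    """Filter a directory listing to one radar product and return the
    four newest frames, oldest first (as the display code expects)."""
    names = sorted(
        n for n in listing if n.startswith("IDR70A.T") and n.endswith(".png")
    )
    return names[-4:]  # four newest; already in oldest-first order

# Hypothetical directory listing, deliberately out of order
listing = [
    "IDR70A.T.202501010500.png",
    "IDR70A.T.202501010520.png",
    "IDR70A.T.202501010505.png",
    "IDR00004.gif",                # some other product, ignored
    "IDR70A.T.202501010515.png",
    "IDR70A.T.202501010510.png",
]
print(latest_four(listing))
# ['IDR70A.T.202501010505.png', 'IDR70A.T.202501010510.png',
#  'IDR70A.T.202501010515.png', 'IDR70A.T.202501010510.png'] -- newest four
```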
To avoid making unnecessary requests, I keep a copy of the four filenames currently in use and only check every five minutes for new files. If a new one exists, I download only that one and kick out the oldest. Maybe it’s overkill ensuring I don’t download more than 10 KB every five minutes, but I like to avoid causing extra hassle for people on the other end. *frustrated glare at The Bureau*
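The refresh check is essentially a set difference on filenames. A minimal sketch (with hypothetical names, not the real BoM ones):

```python
def plan_refresh(cached: list[str], latest: list[str]) -> tuple[list[str], list[str]]:
    """Compare the freshly listed names against the cached four and
    return (names to download, new cache). Anything already cached is
    skipped, so a typical five-minute tick fetches at most one frame."""
    to_fetch = [n for n in latest if n not in cached]
    return to_fetch, latest  # the latest listing becomes the new cache

cached = ["a.png", "b.png", "c.png", "d.png"]
latest = ["b.png", "c.png", "d.png", "e.png"]   # one new frame arrived
to_fetch, cached = plan_refresh(cached, latest)
print(to_fetch)  # ['e.png'] -- only the new frame gets downloaded
```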
Status
It works!
It did crash the next day. I was fed up, so I waited 24 hours before looking into it, only to discover that all the products for IDR70 (Perth - Serpentine rain radar) had been removed from the FTP server. There was nothing on the website about service outages or radar problems, and when I viewed their website rain radar it did show data on their fancy new map.
I sent the BoM an enquiry with specific details to see if it was an issue or purposeful, but then decided to Google the specific radar product ID, which led me to the “reg.bom.gov.au” old version of that radar page. It had a handy notice saying that the radar was down due to maintenance and would be up within 24hrs. Truly just a victim of coincidence. However, that feels like the kind of thing you should have on your public rain radar so users know that the results they are seeing are interpolated from much more distant radars and thus are inaccurate.1
Thankfully the radar is online again and my dashboard is working wonderfully getting data via FTP.
I thought you hated FTP?
I still think my previous comments about how fucked it is that a government department focused on critical data dissemination relies entirely on FTP in the Year Of Our Lord 2026 are relevant. Especially given that for a lot of products they have the gall to charge such outrageous prices.
But if this is the only way I can reliably access BoM data I guess I’ll take it.
The New Website
It’s been available for a while now and to their credit the BoM have listened to user feedback and made changes, like swapping back to Rain reflectivity (dBZ) after public outcry when the new website defaulted to Rain rate (mm/h).
I think with enough time people have gotten used to the redesign, and if you are a layperson using it to check the forecast it is sufficient. It mostly2 does its job. I feel like a lot of the attention and negative press it received was due to two factors:
- It was a complete change in design, navigation, and experience.
- It was debuted at the worst time during extreme weather events.
The latter is absolutely a major cockup on the department’s part. The former is a common experience when there is a major UX change to any service, and it takes an adjustment period for people to adapt. However, there is a difference between something like Facebook circa 2010 (a website run by a private company that is not used by people to receive life-saving advice) making sweeping website changes and the BoM (the opposite). A private company might want to do a refresh to push users towards new features or prevent themselves from fading into banality. A government department shouldn’t invest $96.5 million of taxpayer money into a website refresh because “it was looking tired”.
Value For Money? No.
The BoM attempted to explain the eye-watering cost:
“The $96.5 million that we’re talking about was not just the front end of the website, the tip of the iceberg that the public sees, but the back end, which sees data flowing from tens of thousands of pieces of equipment in the field, to the supercomputer that does all the modelling, right through to systems that actually forecast the weather and put it through to the website” - BoM CEO Stuart Minchin
But all of that was covered under the BoM’s $866m Robust transformation project, including the supercomputer, modelling/forecasting, IoT data collection, asset maintenance/upgrades, and more. It is fair to say that the BoM website would cost a lot more than a GovCMS website, because there is a lot of integration with the BoM’s backend data platforms and their masses of frequently updating data, but that should still be possible for under $10m. And I’m pretty confident, because I have worked on more complex, public-facing platforms across government and the private sector that all came in under eight figures. While I (usually) consider it improper to speak definitively on a project I wasn’t involved in, the fact that all the data on their website is also (and has always been) available on their FTP server does feel like they’ve just pissed our money up the wall.
Back in 2010 I remember submitting feedback to the BoM about getting “bom.gov.au” added as an option to visit the website (back then, it was “www.bom.gov.au” or nothing) and received a response outlining the difficulties due to running everything from DNS to webservers on their own hardware (in pre-cloud days) while receiving over 10,000,000 hits a day. But with the government’s push to commercial cloud, using platforms that handle auto-scaling, CDNs for burst traffic, and a bunch of ACSC default security templates to apply to services, those sorts of issues are not what caused the price tag.
Consultancy Driven Development
So why did it cost so much? Because the actual dole bludgers sucking on the teat of taxpayer-funded welfare are supermassive consultancy firms.
Whether they be “professional services” (Deloitte, KPMG, PwC, EY) or “technology providers” (Accenture, IBM) - they are able and content to demand an outrageous price and then be handed the tender because they’re a reliable and safe firm. They’re then happy to blow out the costs and timeline by 40% because it’s too late for the government to back out only to eventually produce a half-baked solution that doesn’t meet the client’s needs.
Having been required to work alongside these types many times over my career, I’ve seen that they have “ready to go” generic template solutions for common digital transformation projects, despite selling them each time as custom-built from the ground up. I would be okay with this (why reinvent the wheel?) if it meant they delivered on time, or on budget, or priced it less than what it would cost a small/specialist consultancy or internal development team to build from scratch. Instead, every time, we are given a square wheel with a massive markup. Relying on generic templates also means they have no desire to confirm it’s fit for purpose (“we didn’t realise a department that tracks offshore fishing would have a need for GIS data”) and no willingness to use the existing systems (“we don’t care that your entire infrastructure is on Azure, we need to deliver this on AWS”).
We were sold the lie that it’s cheaper for the private sector to build something than the government, which is why we slashed every department’s capability to build or develop internally and castigated them if they dared step out of line. But constantly handing fistfuls of cash to companies that can screw Australian citizens over with impunity, knowing very well that they’ll still get the next tender, has left the Australian public far worse off.
What Can I Do?
- If you work for a major consultancy, quit. Take a pay cut and work for a smaller company that’s not run by ghouls before it turns you into one.
- If someone you know works for a major consultancy? Make fun of them until they quit and work somewhere that doesn’t exist to screw us all over.
- If a project is going out to tender, see if it can be broken down into smaller, well-defined tenders that can go out to more reliable specialised consultancies.
- If you’re on an assessment panel for a tender, make sure to review responses and factor in a lack of specificity or understanding of the project/domain, as well as a history of previous fuckups.
- File Freedom Of Information requests for projects by big consultants and publish your analysis/findings.
- When fancy new govt IT projects (rolled out over budget and over schedule) ask for feedback3, give it frankly and critically. Make sure you point out that the government department performs important functions, but this specific project underdelivers for how much it cost.
- When it comes time to vote at the state or federal level, remember which parties have a minister-to-consultancy job pipeline.
- Try to find workarounds for crappy systems. If you can find ways around having to use some sub-par, overpriced platform that’s taken away what you used to use, look for other ways the department provides data to make a neat little dashboard so you and your spousey-boo can glance at the rain, weather, and forecast before you leave home.
1. My amazing co-CEO @shmouflon showed me that if you’re on the mobile app (and only on the mobile app) looking at the radar, it will have a message down the bottom saying if there is maintenance at one of the towers. ↩︎
2. Not completely. The aforementioned lack of warnings about radar outages, for example. ↩︎
3. Also, why does a government website or app keep popping up modals requesting I leave a review? Because massive consultancies want to juice review numbers as a false metric of a project’s success. Due to self-selection bias, people are more likely to seek out the opportunity to review something when they have a major positive or negative experience. With a government app that just tells you the weather, the former is less likely as it is purely functional. So developers use dark patterns like pressure imposing/nagging (p100) designed to get more feedback. ↩︎