Finding the Cheapest Petrol in Britain, One API Call at a Time

The UK government quietly runs one of the most useful APIs nobody talks about. Every petrol station in the country reports its prices to the GOV.UK Fuel Finder, and all of it is exposed through a public REST API. Around 7,400 stations, updated within 30 minutes of price changes at the pump. Free. No commercial tier. Just... there.

Naturally, I had to do something with it.

The Idea

I wanted to know which petrol station near me had the cheapest diesel without opening five browser tabs and squinting at comparison sites that haven't updated since last Tuesday. And I wanted it tracked over time - are prices going up? Is Costco actually always cheapest? Does that Shell on the A5 ever drop below the Tesco Express?

The plan: pull every station's prices every four hours, store them in InfluxDB, build Grafana dashboards, and throw a live public page on sillymoo.dev for anyone to check. The whole thing open source so anyone can clone it and track their own area.

Getting Into the API

The Fuel Finder API uses OAuth 2.0, which sounds straightforward until you realise their implementation is... creative. The token endpoint isn't at any of the standard paths. It's not /oauth/token or /token or even /auth. It's POST /api/v1/oauth/generate_access_token. And it wants a JSON body, not form-encoded. Every OAuth library I tried needed convincing.
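For the record, a token request ends up looking something like this. The path and the JSON-body quirk are as described above; the field names (`client_id`, `client_secret`) and the response shape are my assumptions from a standard client-credentials flow, so check them against the developer portal:

```python
import json
import urllib.request

TOKEN_URL = "https://www.fuel-finder.service.gov.uk/api/v1/oauth/generate_access_token"

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the non-standard token request: a JSON body, not the
    form-encoded body every OAuth library expects by default."""
    body = json.dumps({"client_id": client_id, "client_secret": client_secret}).encode()
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},  # not x-www-form-urlencoded
    )

# Usage (network call omitted here):
# with urllib.request.urlopen(build_token_request(cid, secret)) as resp:
#     token = json.load(resp)["access_token"]
```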

The prices come in batches of 500 stations. You increment a batch number until the API returns a 404, which is how it tells you there are no more batches. Fifteen batches later, you've got the whole country.

async def fetch_all_prices(self) -> list[Station]:
    all_stations = []
    batch = 1
    while True:
        stations = await self._fetch_batch(batch)
        if not stations:
            break
        all_stations.extend(stations)
        batch += 1
        await asyncio.sleep(2.0)  # rate limit is 30 req/min, so pause 2s between calls
    return all_stations

Simple enough. Except for the rate limit - 30 requests per minute, one concurrent request. So you wait politely between batches and hope your token doesn't expire mid-run.

The Region Problem

Here's the thing the API doesn't give you: location data. No postcodes, no coordinates, no county. Just a station name like "SHELL MILTON KEYNES SOUTH" and a phone number. That's it.

So how do you figure out which stations are near you? You pattern-match on the trading name. "MILTON KEYNES" in the name? That's Milton Keynes. "BLETCHLEY"? Also Milton Keynes. "SOUTHAM"? Leamington area.

This works surprisingly well until it doesn't. "SOUTHAMPTON" matches "SOUTHAM". "CARLISLE WARWICK ROAD" matches "WARWICK". The fix was adding exclusion keywords - if the name contains "SOUTHAMPTON", skip the Southam region even though "SOUTHAM" is in there.

regions:
  - name: "leamington-southam"
    label: "Leamington Spa / Southam"
    keywords:
      - "LEAMINGTON"
      - "SOUTHAM"
      - "WARWICK"
    exclude_keywords:
      - "SOUTHAMPTON"
      - "CARLISLE"

Not elegant, but it handles the edge cases. And the whole thing is configurable - swap in your own keywords and you're tracking whatever area you want.
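In code, the matching comes down to a pair of any() checks, with exclusions tested first so "SOUTHAMPTON" can never fall through to "SOUTHAM". A minimal sketch (the function name is mine, not FuelTrack's actual API):

```python
def matches_region(station_name: str, keywords: list[str], exclude_keywords: list[str]) -> bool:
    """A station belongs to a region if its trading name contains any
    keyword and none of the exclusion keywords."""
    name = station_name.upper()
    # exclusions win: "SOUTHAMPTON" contains "SOUTHAM", so check these first
    if any(bad in name for bad in exclude_keywords):
        return False
    return any(kw in name for kw in keywords)

# matches_region("ESSO SOUTHAMPTON WEST", ["SOUTHAM"], ["SOUTHAMPTON"])  -> False
# matches_region("SHELL SOUTHAM ROAD",    ["SOUTHAM"], ["SOUTHAMPTON"])  -> True
```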

The Stack

It follows the same pattern as my energy tracker: Python async collector on the ansible box, InfluxDB for storage, Grafana for dashboards, systemd timer for scheduling.

GOV.UK Fuel Finder API
        |
  FuelTrack collector (every 4 hours)
        |
  InfluxDB (365-day retention)
        |
  Grafana dashboards + Ghost page (sillymoo.dev/uk-fuel-prices)
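Each reading lands in InfluxDB as a single point. A sketch of the line-protocol encoding, with an illustrative schema (the measurement, tag, and field names here are my guesses, not necessarily FuelTrack's actual schema):

```python
def to_line_protocol(station: str, brand: str, region: str, fuel: str,
                     pence: float, ts_ns: int) -> str:
    """Render one price reading as InfluxDB line protocol:
    measurement,tag=value,... field=value timestamp
    Commas, spaces, and equals signs in tag values must be escaped."""
    def esc(v: str) -> str:
        return v.replace(",", r"\,").replace(" ", r"\ ").replace("=", r"\=")
    tags = f"station={esc(station)},brand={esc(brand)},region={esc(region)},fuel={esc(fuel)}"
    return f"fuel_price,{tags} price_pence={pence} {ts_ns}"
```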

The Grafana dashboard has the works - stat panels for cheapest E5 and diesel, a full price table sorted cheapest first, time series trends per station, and bar charts comparing brands and regions. Template variables let you filter by region, fuel type, and brand.

The Live Page

The part I'm most pleased with is the Ghost page. After each collection run, FuelTrack generates an HTML page with a UK top 20 cheapest for unleaded and diesel, then pushes it to sillymoo.dev via the Ghost Admin API. No manual intervention. The page at sillymoo.dev/uk-fuel-prices just... stays current.

Getting Ghost to accept HTML programmatically was its own adventure (see my previous post about the MCP server bugs), but the publisher itself is straightforward - build HTML tables, POST to the Admin API with ?source=html, done.
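For anyone reproducing the publishing step: the Ghost Admin API authorises requests with a short-lived HS256 JWT built from an "id:secret" Admin API key. A stdlib-only sketch (the five-minute expiry and "/admin/" audience match current Ghost versions, but treat the details as an assumption and check Ghost's docs for your version):

```python
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def ghost_admin_jwt(admin_api_key: str) -> str:
    """Ghost Admin API keys are 'id:secret' with a hex-encoded secret.
    The JWT carries the key id in the 'kid' header and is signed with
    the decoded secret bytes."""
    key_id, secret = admin_api_key.split(":")
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT", "kid": key_id}).encode())
    now = int(time.time())
    payload = b64url(json.dumps({"iat": now, "exp": now + 300, "aud": "/admin/"}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(bytes.fromhex(secret), signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

# The page update itself is then (sketch, not executed here):
#   PUT /ghost/api/admin/pages/<page_id>/?source=html
#   Authorization: Ghost <jwt>
#   body: {"pages": [{"html": table_html, "updated_at": current_updated_at}]}
```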

It filters out obviously wrong prices too. Some stations report prices like 1.3p per litre, which would be incredible but is probably meant to be 130p. Anything below 50p or above 300p gets quietly dropped.

Making It Searchable

The original page only showed stations from two configured regions - Milton Keynes and Leamington Spa. If you lived anywhere else, the page was useless. But the collector was already pulling data from all 7,400 stations. It seemed a waste to throw 99% of it away.

So I added client-side search. Every station's name, brand, and a best-effort location extraction gets bundled into a JSON file (~650KB, covering 6,300+ stations with valid prices) and served as a static asset from Ghost's content directory. The page loads it via fetch() and lets you search by typing any combination of station name, brand, or location.
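Generating that bundle is a straightforward serialisation pass. A sketch, where the record fields and fuel codes (E10 for unleaded, B7 for diesel) are illustrative rather than FuelTrack's exact schema:

```python
import json

def build_search_bundle(stations: list[dict]) -> str:
    """Serialise searchable station records into the static JSON the
    page fetches. Stations with no valid price at all are dropped."""
    records = [
        {
            "name": s["name"],
            "brand": s.get("brand", ""),
            "location": s.get("location", ""),
            "e10": s["prices"].get("E10"),
            "b7": s["prices"].get("B7"),
        }
        for s in stations
        if any(v is not None for v in s["prices"].values())
    ]
    # compact separators keep the payload small (~650KB for 6,300+ stations)
    return json.dumps(records, separators=(",", ":"))
```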

The location extraction was an interesting subproblem. Station names like "SHELL TRUMPINGTON" are easy - strip the brand prefix, you get "Trumpington". But "MFG MORRISONS MILTON KEYNES WESTCROFT" needs two rounds of prefix stripping to get to "Westcroft". And "SAINSBURYS LOCAL ASHTON MOSS SERVICE STATION" needs both prefix and suffix stripping. A loop over known brand prefixes and common suffixes ("SERVICE STATION", "FILLING STATION", "GARAGE LTD") handles most cases.
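A simplified version of that loop (the brand and suffix lists are truncated here, and the real extractor evidently also drops town names, which is how "MILTON KEYNES WESTCROFT" ends up as just "Westcroft"):

```python
BRAND_PREFIXES = ["MFG", "MORRISONS", "SHELL", "SAINSBURYS", "LOCAL", "ESSO", "BP", "TESCO"]
SUFFIXES = ["SERVICE STATION", "FILLING STATION", "GARAGE LTD"]

def extract_location(name: str) -> str:
    """Strip known brand prefixes (repeatedly, for compound names like
    'MFG MORRISONS ...') and trailing boilerplate to leave a place name."""
    words = name.upper().split()
    while words and words[0] in BRAND_PREFIXES:
        words = words[1:]
    result = " ".join(words)
    for suffix in SUFFIXES:
        if result.endswith(suffix):
            result = result[: -len(suffix)].strip()
    return result.title()

# extract_location("SHELL TRUMPINGTON")                          -> "Trumpington"
# extract_location("SAINSBURYS LOCAL ASHTON MOSS SERVICE STATION") -> "Ashton Moss"
```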

Ghost made this harder than expected. It strips <script>, <style>, <input>, and <button> tags from HTML content when converting to its Lexical editor format. The workaround: use Ghost's per-page code injection fields for the CSS and JavaScript (they bypass Lexical sanitisation), and have the JS dynamically create the search UI elements. The JSON data was too large for code injection (~650KB vs a ~65KB limit), so it gets uploaded as a static file to Ghost's /content/files/ directory via SCP.

The result: type "shell milton keynes" and you instantly see every Shell station in MK with current prices. Sort by cheapest unleaded or diesel. Paginated to 100 results at a time so it doesn't melt your phone. All running client-side with zero backend beyond the static JSON file that updates every four hours.

What I Learned

The GOV.UK Fuel Finder is genuinely one of the better government APIs I've used. It's fast, it's reliable, and the data is remarkably fresh. The documentation could be better - I spent longer than I'd like to admit discovering that the base URL is www.fuel-finder.service.gov.uk and not the api.fuelfinder.service.gov.uk that the docs imply - but once you're in, it just works.

The lack of location data is the biggest limitation. If the API returned postcodes or coordinates, you could do proper radius searches and map visualisations. As it stands, keyword matching on station names is a hack that works 95% of the time. Good enough for a homelab project. Maybe not good enough for a commercial product.

Try It Yourself

The whole thing is on GitHub at beaglemoo/fueltrack. Clone it, drop in your API credentials from the Fuel Finder developer portal, add your regions to the config, and you've got your own local fuel price tracker.

Or just visit sillymoo.dev/uk-fuel-prices and let mine do the work.