Flooding in Ireland is becoming increasingly unpredictable. Heavy winter rainfall overwhelms rivers with little warning. In flood-prone towns like Midleton, Co. Cork, real-time situational awareness can mean the difference between preparation and crisis.
This winter I built a real-time water level monitoring dashboard aggregating live data from multiple Irish government sources. It answers a straightforward question: what is the current flood risk status for Midleton, right now?
The Problem
Midleton sits at the confluence of two rivers (the Owennacurra and Dungourney) with a documented history of flooding. During rainy periods, residents and local authorities need to know the current river levels relative to dangerous thresholds, what weather forecasters predict for rainfall, whether official warnings are active, and how tide timing affects local discharge.
This data exists: Ireland maintains open data infrastructure across multiple agencies, but the datasets are scattered across different government websites and APIs, with no single view for a specific location.
The Solution: A Spatial Data Aggregation Platform
The Midleton Water Level Monitor aggregates data from four independent Irish government sources and fetches fresh readings every 15 minutes:
- OPW Water Level Stations (Ballyedmond and Townparks river gauges)
- Met Éireann Weather Warnings (county-level alerts)
- Met Éireann Precipitation Forecast (24-hour rainfall for Midleton's exact coordinates)
- Marine Institute Tide Predictions (high tide timing and heights for nearby coastal stations)
All incoming data is stored in a PostGIS-enabled PostgreSQL database, which enables trend analysis and benchmarking against historical river behavior. When you visit the dashboard, the API combines all four data streams (current river levels against historical percentiles, rainfall intensity, active warnings, and tidal effects) into a situational picture without making explicit risk predictions. Results render on an interactive map with time-series charts, benchmark comparisons, and detailed modals for each data source.
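To make the combination step concrete, here is a minimal sketch of how the four streams might be merged into one payload. The field names and `buildStatusPayload` function are illustrative assumptions, not the project's actual API contract; note that the summary carries descriptive flags only, in keeping with the no-predictions approach.

```typescript
// Illustrative only: field names are hypothetical, not the real /api/status shape
interface StatusPayload {
  river: { levelMeters: number; p90: number; stale: boolean }
  warnings: { severity: "Yellow" | "Orange" | "Red" }[]
  rainfall24hMm: number
  summary: { aboveP90: boolean; highestWarning: string | null }
}

function buildStatusPayload(
  river: StatusPayload["river"],
  warnings: StatusPayload["warnings"],
  rainfall24hMm: number,
): StatusPayload {
  // Descriptive flags only: the dashboard reports facts, not risk predictions
  const order = { Red: 3, Orange: 2, Yellow: 1 }
  const highest = [...warnings].sort((a, b) => order[b.severity] - order[a.severity])[0]
  return {
    river,
    warnings,
    rainfall24hMm,
    summary: {
      aboveP90: river.levelMeters > river.p90,
      highestWarning: highest ? highest.severity : null,
    },
  }
}
```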
Technical Architecture
The codebase is a monorepo with two main components:
Backend: Node.js API with Postgres
The API runs on Express and exposes endpoints like /api/status for current conditions and /api/benchmarks/* for historical context. Data lives in PostgreSQL with PostGIS extensions, storing river levels, weather warnings, precipitation forecasts, and tide observations. Scheduled pollers written with node-cron fetch fresh data from upstream APIs every 15 minutes; OPW explicitly requests no more than one call per 15 minutes, and we respect that exactly.
All incoming data gets validated against Zod schemas before insertion, catching API changes early. Heavy computation (benchmark calculations, historical aggregations) runs offline via scheduled jobs rather than on-demand, keeping endpoints fast and cacheable.
Frontend: Vite + TypeScript + Leaflet
The frontend is a single-page application built with Vite and TypeScript. No framework, just vanilla DOM optimized for performance. Leaflet renders the map showing river station locations and weather warning zones as interactive overlays. Time-series charts display river level trends at daily and weekly scales. The design is responsive for mobile use during emergencies, with dark mode support for extended viewing.
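One small piece of the overlay logic can be sketched without the map itself: mapping warning severity to a Leaflet path style. This helper and its colour values are illustrative assumptions, not the project's actual styling.

```typescript
// Hypothetical helper for the warning-zone overlays: maps a Met Éireann
// severity to a Leaflet path style. Colour and opacity values are illustrative.
type Severity = "Yellow" | "Orange" | "Red"

function warningStyle(severity: Severity): { color: string; fillOpacity: number } {
  const colors: Record<Severity, string> = {
    Yellow: "#f1c40f",
    Orange: "#e67e22",
    Red: "#c0392b",
  }
  // Higher severities get a more opaque fill so they dominate the map
  const opacity: Record<Severity, number> = { Yellow: 0.2, Orange: 0.35, Red: 0.5 }
  return { color: colors[severity], fillOpacity: opacity[severity] }
}
```

In the real app a function like this would feed Leaflet's `style` option, e.g. `L.geoJSON(zone, { style: () => warningStyle(zone.severity) })`.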
The GIS Angle: Spatial Data Integration
PostGIS handles geographic queries so I can ask "what precipitation is forecast for a 5km radius around Midleton?" The Met Éireann API requires precise lat/lon coordinates; I fetch data for Midleton's location at 52.0°N, 8.5°W. Each OPW water level station is a point feature with associated metadata: station ID, river name, catchment area.
Understanding watershed context matters for flood risk. Historical OPW data shows which stations affect Midleton: the Owennacurra and Dungourney river systems. Tide predictions for Ballycotton and Ringaskiddy help predict backwater effects where high tides prevent river discharge and raise flood risk locally.
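The backwater check can be reduced to a simple timing question: is a predicted high tide close enough to "now" that river discharge may be impeded? The sketch below assumes a 2-hour window; both the function and that threshold are illustrative, not the project's tuned values.

```typescript
// Illustrative only: flags when a predicted high tide falls within a
// window around the current time. The 2-hour default is an assumption.
function withinHighTideWindow(now: Date, highTide: Date, windowHours = 2): boolean {
  const deltaHours = Math.abs(highTide.getTime() - now.getTime()) / 3_600_000
  return deltaHours <= windowHours
}
```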
Flood monitoring is inherently spatial. Different data sources have different geographic coverage, and synthesizing them requires understanding which locations matter and why.
Data Sources & Attribution
All data comes from open Irish government sources. The Office of Public Works operates the water level monitoring network at http://waterlevel.ie/. Met Éireann provides both weather warnings and precipitation forecasts via openaccess.pf.api.met.ie. The Marine Institute supplies tide predictions.
These are read-only public APIs requiring no authentication, though all use respects the source agencies' terms of service.
Key Technical Challenges & Solutions
Challenge 1: Handling Partial Data Outages
Sometimes a river gauge goes offline for maintenance. Solution: Freshness thresholds. If the primary station (Ballyedmond) hasn't reported in 60 minutes, fall back to the secondary station (Townparks) with a "stale data" flag in the UI.
Technical Deep Dive: Failover Logic & Data Staleness
The buildStatus() function in routes/api.ts implements a tiered fallback:

```typescript
// Pseudocode of the failover strategy
function getPrimaryRiverLevel(stations) {
  const primary = stations.ballyedmond
  const secondary = stations.townparks
  const FRESHNESS_THRESHOLD = 60 * 60 * 1000 // 60 minutes in ms

  if (primary.isFresh(FRESHNESS_THRESHOLD)) {
    return {
      value: primary.level,
      station: "ballyedmond",
      stale: false,
      asOf: primary.timestamp,
    }
  }

  if (secondary.isFresh(FRESHNESS_THRESHOLD)) {
    return {
      value: secondary.level,
      station: "townparks",
      stale: false,
      asOf: secondary.timestamp,
      note: "Primary station offline; using secondary",
    }
  }

  // Both stale: return the last known value with a prominent warning
  return {
    value: primary.lastKnownLevel,
    station: "ballyedmond",
    stale: true,
    asOf: primary.lastUpdate,
    staleSinceMinutes: (Date.now() - primary.lastUpdate.getTime()) / 60000,
  }
}
```

Why this matters: the API always returns a headline number (for the big card on the dashboard), but the UI can show the asOf, stale, and station metadata so users understand data freshness. During maintenance windows, the dashboard degrades gracefully rather than showing errors.
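On the UI side, turning that failover metadata into a user-facing label is straightforward. This is a hypothetical sketch; the field names mirror the pseudocode but the label format is my own invention.

```typescript
// Hypothetical UI helper: turns failover metadata into a freshness badge.
interface Reading {
  stale: boolean
  station: string
  staleSinceMinutes?: number
}

function freshnessLabel(r: Reading): string {
  if (!r.stale) return `Live · ${r.station}`
  const mins = Math.round(r.staleSinceMinutes ?? 0)
  return `Stale · last reading ${mins} min ago (${r.station})`
}
```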
Database schema:

```sql
CREATE TABLE river_levels (
  id SERIAL PRIMARY KEY,
  station_id INTEGER NOT NULL,
  level_m NUMERIC(8,3),
  timestamp TIMESTAMP NOT NULL,
  recorded_at TIMESTAMP DEFAULT NOW(),
  FOREIGN KEY (station_id) REFERENCES stations(id)
);

CREATE INDEX idx_river_levels_station_recent
  ON river_levels(station_id, timestamp DESC);
```

With this composite index, the "latest reading for a station" freshness query is a quick index lookup rather than a full table scan.
Challenge 2: Different Data Formats
Met Éireann returns JSON; OPW returns GeoJSON; Marine Institute returns XML. Solution: an adapter pattern; each data source has a thin module under adapters/ that normalizes responses into consistent internal types.
Technical Deep Dive: Adapter Pattern & Zod Validation
Each external API has its own adapter with two responsibilities:
- Fetch and parse the remote API response
- Validate and transform into an internal type using Zod
Example: The OPW adapter (adapters/opw.ts):

```typescript
import { z } from "zod"
import { fetchJson } from "../utils/fetch"

// Define the expected OPW response shape
const OPWFeatureSchema = z.object({
  properties: z.object({
    STATION_ID: z.string(),
    STATION_NAME: z.string(),
    VALUE: z.number(),
    DATE_TIME: z.string().datetime(),
    UNITS: z.literal("m"), // Meters
  }),
  geometry: z.object({
    type: z.literal("Point"),
    coordinates: z.tuple([z.number(), z.number()]), // [lon, lat]
  }),
})

const OPWResponseSchema = z.object({
  type: z.literal("FeatureCollection"),
  features: z.array(OPWFeatureSchema),
})

export async function fetchRiverLevels() {
  const response = await fetchJson("http://waterlevel.ie/geojson/latest/")

  // Validate shape
  const validated = OPWResponseSchema.parse(response)

  // Transform to internal type
  return validated.features.map(feature => ({
    stationId: feature.properties.STATION_ID,
    levelMeters: feature.properties.VALUE,
    timestamp: new Date(feature.properties.DATE_TIME),
    source: "OPW",
  }))
}
```

The Met Éireann adapter is different: it returns a JSON array of warnings with different fields:
```typescript
const MetEireannWarningSchema = z.object({
  type: z.enum(["Wind", "Rain", "Temperature"]),
  severity: z.enum(["Yellow", "Orange", "Red"]),
  issued: z.string().datetime(),
  onset: z.string().datetime(),
  expiry: z.string().datetime(),
  description: z.string(),
})

export async function fetchWeatherWarnings(countyCode: string) {
  const response = await fetchJson(`https://www.met.ie/Open_Data/json/warning_${countyCode}.json`)
  const warnings = z.array(MetEireannWarningSchema).parse(response)
  return warnings.map(w => ({
    type: w.type,
    severity: w.severity,
    issued: new Date(w.issued),
    description: w.description,
    source: "MetEireann",
  }))
}
```

Benefits:
- Type safety: Zod catches API changes at runtime with clear error messages
- Single transform point: All Met Éireann data flows through one adapter
- Testable: Easy to mock adapter responses in tests
- Extensible: Adding a new data source is just a new adapter file
This is a classic adapter pattern applied to external API integration.
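The "easy to mock" claim follows from keeping the parse/transform step pure: tests can feed it a canned payload with no network involved. The shapes below are illustrative simplifications, not the project's exact types.

```typescript
// Sketch of why adapters are testable: the transform is a pure function.
// Shapes are illustrative, not the project's real types.
interface RawOpw {
  features: { properties: { STATION_ID: string; VALUE: number } }[]
}

function normalizeOpw(raw: RawOpw) {
  return raw.features.map(f => ({
    stationId: f.properties.STATION_ID,
    levelMeters: f.properties.VALUE,
  }))
}

// A "mock response" is then just an object literal (values are made up):
const canned: RawOpw = {
  features: [{ properties: { STATION_ID: "12345", VALUE: 2.1 } }],
}
```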
Challenge 3: Respecting Upstream Rate Limits
OPW explicitly requests no more than one request per 15 minutes. Solution: a hard polling schedule; the poller runs exactly once every 15 minutes via node-cron, and recent results are cached.
Technical Deep Dive: Polling Orchestration & Caching
The polling system in pollers/pollers.ts uses node-cron with a fixed 15-minute schedule:

```typescript
import cron from "node-cron"
import * as db from "../db"
import * as adapters from "../adapters"

// Wrap each poller so one failed fetch never crashes the process
const guard = (name: string, fn: () => Promise<void>) => async () => {
  try {
    await fn()
  } catch (err) {
    console.error(`[POLLER] ${name} failed`, err)
  }
}

export function startPollers() {
  // River levels: exactly once per 15 minutes
  cron.schedule("*/15 * * * *", guard("river-levels", async () => {
    console.log("[POLLER] Fetching river levels...")
    const levels = await adapters.opw.fetchRiverLevels()
    await db.river_levels.insertBatch(levels)  // Bulk insert into DB
    await db.river_levels.pruneOlderThan(30)   // Keep only the last 30 days
  }))

  // Weather warnings: every 30 minutes
  cron.schedule("*/30 * * * *", guard("weather-warnings", async () => {
    const warnings = await adapters.metEireann.fetchWarnings("EI04") // Cork
    await db.weather_warnings.upsertBatch(warnings)
  }))

  // Precipitation: every 15 minutes
  cron.schedule("*/15 * * * *", guard("precipitation", async () => {
    const forecast = await adapters.precipitation.fetch(52.0, -8.5) // Midleton coords
    await db.precipitation_forecasts.upsert(forecast)
  }))

  // Tide predictions: daily at 3am
  cron.schedule("0 3 * * *", guard("tides", async () => {
    const tides = await adapters.tide.fetchMonthAhead()
    await db.tide_observations.upsertBatch(tides)
  }))
}
```

Request caching happens at two levels:
- Database-level: the most recent fetch is always in Postgres. If the poller crashes, a restart picks up from the last successful write.
- HTTP-level: the API adds `Cache-Control: max-age=600` headers (10 minutes) to `/api/status` responses, so repeated requests during the same 15-minute window don't hit the database multiple times.
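The HTTP-level cache can be expressed as a tiny Express-style middleware. This is a sketch of the idea rather than the project's actual code; no framework import is needed to illustrate the shape, since Express middleware is just a `(req, res, next)` function.

```typescript
// Sketch of the HTTP-level cache as a plain Express-style middleware.
// The Res interface is a minimal stand-in for Express's Response.
interface Res { setHeader(name: string, value: string): void }

function cacheControl(maxAgeSeconds: number) {
  return (_req: unknown, res: Res, next: () => void) => {
    // Let browsers and proxies reuse the response within the window
    res.setHeader("Cache-Control", `public, max-age=${maxAgeSeconds}`)
    next()
  }
}
```

In the real app this would attach as, e.g., `app.get("/api/status", cacheControl(600), handler)`.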
Rate limit handling:

```typescript
// Staggered cron minutes so the pollers never fire simultaneously
// and hammer upstream APIs at the top of the hour
const POLLER_SCHEDULES = {
  OPW: "0,15,30,45 * * * *",           // On the quarter hour
  MetEireann: "2,32 * * * *",          // Offset by 2 minutes
  Precipitation: "4,19,34,49 * * * *", // Offset by 4 minutes
}
```

Why this design:
- Deterministic: Pollers run at predictable times, not on random intervals
- Fault-tolerant: If a fetch fails, the next scheduled run (in 15 minutes) will retry
- Upstream-friendly: We respect OPW's 15-minute minimum interval explicitly
- Observable: Cron logs show exactly when each poller ran and whether it succeeded
The trade-off is that the live API is at most 15 minutes stale, but that's acceptable for flood monitoring (the water level doesn't change that fast).
Challenge 4: Computing Meaningful Benchmarks
Raw numbers (e.g., "4.2 meters") are meaningless without context. Solution: historical percentiles and anomalies; I compute P25, P50, P75, and P90 benchmarks from years of historical OPW data, then display current levels relative to these baselines.
Technical Deep Dive: Percentile Computation & Historical Aggregation
Benchmarks are precomputed offline via a scheduled job (jobs/computeBenchmarks.ts) to keep the API fast:
```typescript
export async function computeRiverBenchmarks(stationId: number) {
  // Fetch all historical data for this station
  const allReadings = await db.river_levels.getAllByStation(stationId)
  if (allReadings.length < 100) {
    return { available: false, reason: "Insufficient historical data" }
  }

  // Group by calendar day, e.g. all "03-15" readings across every year
  const byMonthDay = groupByMonthDay(allReadings)

  // For each calendar day, compute percentiles across all years
  const benchmarks: Record<string, object> = {}
  for (const [monthDay, readings] of Object.entries(byMonthDay)) {
    const sorted = readings.map(r => r.level_m).sort((a, b) => a - b)
    benchmarks[monthDay] = {
      p25: percentile(sorted, 25),
      p50: percentile(sorted, 50), // Median
      p75: percentile(sorted, 75),
      p90: percentile(sorted, 90),
      ath: Math.max(...sorted), // All-Time High
      min: Math.min(...sorted),
      sampleCount: sorted.length,
      asOf: new Date(),
    }
  }

  // Store in derived_benchmarks table
  await db.derived_benchmarks.upsert(stationId, benchmarks)
}

// Linear interpolation between the two nearest ranks
function percentile(sorted: number[], p: number) {
  const index = (p / 100) * (sorted.length - 1)
  const lower = sorted[Math.floor(index)]
  const upper = sorted[Math.ceil(index)]
  return lower + (upper - lower) * (index % 1)
}
```

Why month-day matters:
- Seasonal context: March 15 typically has higher baselines than August 15 (winter rainfall)
- Climate signal: When current level is above the P90 for that calendar day, it's genuinely unusual
- Comparison fairness: "4.2m" on March 15 vs "4.2m" on July 15 have different meanings
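The groupByMonthDay helper referenced above isn't shown in the deep dive; here is a minimal sketch of what it might look like, assuming each reading carries a timestamp and a level in meters (the shape is an assumption).

```typescript
// Minimal sketch of a month-day grouper; HistoricalReading is an
// illustrative shape, not the project's actual row type.
interface HistoricalReading { timestamp: Date; level_m: number }

function groupByMonthDay(readings: HistoricalReading[]): Record<string, HistoricalReading[]> {
  const groups: Record<string, HistoricalReading[]> = {}
  for (const r of readings) {
    // Key like "03-15" so the same calendar day groups across years (UTC)
    const key = r.timestamp.toISOString().slice(5, 10)
    if (!groups[key]) groups[key] = []
    groups[key].push(r)
  }
  return groups
}
```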
Example output in the API:
```json
{
  "station": "Ballyedmond",
  "current": {
    "level": 4.2,
    "timestamp": "2026-03-29T14:30:00Z"
  },
  "benchmark": {
    "forDate": "2026-03-29",
    "p25": 1.8,
    "p50": 2.1,
    "p75": 2.8,
    "p90": 3.5,
    "ath": 4.67,
    "sampleCount": 15,
    "currentVsP90": "above",
    "description": "Current level is above the 90th percentile for March 29"
  }
}
```

PostGIS integration (for future work):
```sql
-- Historical river data table with spatial index
CREATE TABLE river_levels (
  id SERIAL PRIMARY KEY,
  station_id INTEGER NOT NULL,
  station_geom GEOMETRY(Point, 4326),
  level_m NUMERIC(8,3),
  timestamp TIMESTAMP NOT NULL,
  FOREIGN KEY (station_id) REFERENCES stations(id)
);

CREATE INDEX idx_river_geometry ON river_levels USING GIST(station_geom);

-- Query all stations within a 10km radius of Midleton
-- (cast to geography so ST_DWithin measures in meters, not degrees)
SELECT station_id, level_m, timestamp
FROM river_levels
WHERE ST_DWithin(
  station_geom::geography,
  ST_GeomFromText('POINT(-8.5 52.0)', 4326)::geography,
  10000 -- 10km in meters
)
ORDER BY timestamp DESC;
```

This spatial query enables watershed-aware alerts: if water is rising at upstream stations, we can anticipate downstream impact.
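From the API side, a query like this would typically be built as a parameterized statement. The following is a hedged sketch in node-postgres style; the function name and exact SQL are assumptions, not the project's code.

```typescript
// Hypothetical query builder for the radius search, node-postgres style.
// Casting to geography makes ST_DWithin's distance argument meters.
function stationsWithinRadius(lon: number, lat: number, radiusMeters: number) {
  return {
    text: `
      SELECT station_id, level_m, timestamp
      FROM river_levels
      WHERE ST_DWithin(
        station_geom::geography,
        ST_SetSRID(ST_MakePoint($1, $2), 4326)::geography,
        $3
      )
      ORDER BY timestamp DESC`,
    values: [lon, lat, radiusMeters],
  }
}
```

Usage with pg would look like `const { rows } = await pool.query(stationsWithinRadius(-8.5, 52.0, 10000))`.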
Deployment & Operations
The dashboard is deployed on Fly.io:
- API: Runs on a shared-cpu-1x instance (€2/month)
- Database: Fly Postgres starter plan with automated backups
- Frontend: Built assets served via the API or could move to a CDN
The stack is designed to be cheap to operate while staying reliable, which matters for a personal project that might actually be consulted during a flood event.
What's Next?
The dashboard is live at https://midleton-flood-dash.fly.dev/. Immediate work in the tracker includes tide surge visualization (comparing actual vs. predicted tide heights), reviewing the benchmark threshold methodology, and automating the historical backfill process on new deployments. Longer-term, I want to correlate the risk thresholds against documented historical flood events to validate whether the scoring model reflects real-world outcomes.
Why This Matters
Ireland has excellent open geospatial data scattered across multiple agencies. The OPW, Met Éireann, and Marine Institute publish quality datasets, yet without integration, individuals and small communities can't easily get a complete picture of flood risk at their location.
GIS work extends far beyond map-making. Flood monitoring is a good example of taking data from disparate sources and synthesising it into something communities can actually use.
Live Dashboard: https://midleton-flood-dash.fly.dev/
Source Code: Available on GitHub (link coming soon)
Data Attribution: See the dashboard footer for full attribution to OPW, Met Éireann, and Marine Institute
