PostGIS Day 2025 Recap: AI, Lakehouses and Geospatial Community

On Nov. 20, the day after GIS Day, Elizabeth Christensen and I hosted the 7th annual PostGIS Day, a celebration of the Little Spatial Database That Could. Brought to you this year by Snowflake, the event featured an amazing collection of speakers from around the globe — from India to Africa, Europe to North America.
The themes this year were, if you can forgive the pun, all over the map! Moving beyond the generic hype of AI into the practical reality of agents, we saw tools from Felt, Carto and Bunting Labs that can actually write, execute and debug spatial SQL on the fly. We also saw the lakehouse architecture take center stage, with PostGIS acting less as a data silo and more as a high-performance connector to the vast world of object storage, Iceberg and GeoParquet via tools such as pg_lake and Apache Sedona.
But it wasn't all bleeding edge; we grounded the day in massive, industrial-scale success stories. From enabling mapping at IGN France for more than 20 years to powering State Farm’s critical disaster response platforms, PostGIS remains the bedrock of modern geospatial infrastructure. A full playlist of PostGIS Day 2025 is available on the Snowflake YouTube channel. Here's a deeper look at what we learned.
Spatial AI agents: The real deal
We've all heard the buzzwords, but this year, we saw the code. The concept of AI agents was everywhere, but crucially, it wasn't just chatbots hallucinating SQL. We saw systems designed to iterate.
Brendan Ashworth from Bunting Labs drilled down to what "agent" actually means in the geospatial context. It isn't just asking a question and getting an answer. It's a loop. The agent writes a query, runs it against PostGIS, reads the error message if it fails, corrects itself, and runs it again. He showed us Mundi, an AI-native WebGIS that treats PostGIS as its execution engine, solving complex spatial problems without a human holding its hand.
Jaime Sanchez and Mamata Akella from Felt showed how they are teaching Claude to think spatially. They aren't just throwing schemas at an LLM; they are building system prompts that handle the ambiguity of spatial language (e.g., "near," "inside") and interpret errors as "schema tips" to guide the model back to a working query.

Ryan Miller from Carto and Brian Timoney rounded out the theme. Ryan demonstrated a full AI workflow in Snowflake where an agent could identify vacant land, filter by environmental constraints, and then "save" the results back to a Postgres database for downstream use. Brian, in his inimitable style, showed us how to call LLMs directly from the database to analyze images and return structured JSON data. The call is coming from inside the house!

Shoaib Burq from Geobase.app took us into the world of geospatial embeddings. He showed how extending PostGIS with pgvector enables semantic search over maps — allowing users to search for "damaged buildings" or "dumping grounds" in satellite imagery by vector similarity rather than mere location.
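The pattern he described can be sketched in a few lines of SQL. This is a minimal illustration, not Geobase's actual schema: the table, column names, and embedding dimension are hypothetical, and `:query_embedding` stands in for a vector computed client-side from the search phrase. pgvector's `<=>` operator is cosine distance.

```sql
-- Hypothetical table of satellite imagery tiles with precomputed embeddings.
-- Requires: CREATE EXTENSION postgis; CREATE EXTENSION vector;
CREATE TABLE imagery_tiles (
    id        bigserial PRIMARY KEY,
    geom      geometry(Polygon, 4326),
    embedding vector(512)
);

-- Semantic search: rank tiles by similarity to the query embedding,
-- optionally restricted to a spatial area of interest.
SELECT id, geom
FROM imagery_tiles
WHERE ST_Intersects(geom, ST_MakeEnvelope(44.7, 41.6, 45.0, 41.8, 4326))
ORDER BY embedding <=> :query_embedding
LIMIT 10;
```

The interesting part is the combination: the `WHERE` clause is ordinary PostGIS, while the `ORDER BY` is pure vector similarity, so "damaged buildings near the river" becomes one query.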

The lakehouse and modern scale
For years, the pattern was simple: Load your data into PostGIS, then query it. That pattern is changing. We are seeing a shift toward PostGIS as a high-performance query engine for data that lives elsewhere or scales beyond a single node.
Elizabeth Christensen introduced pg_lake, a new open source extension that connects Postgres directly to object storage. We saw how to query GeoParquet and Iceberg tables sitting in S3 as if they were local tables. No ETL required. She showed a live demo connecting a local PostGIS instance to the massive Overture Maps data set in the cloud, filtering competitor locations on the fly.
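The shape of the demo looks roughly like the following. Note that this is illustrative only: the exact pg_lake DDL may differ from what is shown here (consult the pg_lake documentation), and the bucket path and table names are hypothetical.

```sql
-- Illustrative sketch, not exact pg_lake syntax: expose a GeoParquet
-- file in S3 as a queryable table, with no ETL step.
CREATE FOREIGN TABLE overture_places ()
SERVER pg_lake
OPTIONS (path 's3://example-bucket/overture/places.geoparquet');

-- Then join the remote data against local PostGIS tables
-- with ordinary spatial SQL.
SELECT p.*
FROM overture_places p
JOIN my_trade_areas t
  ON ST_Contains(t.geom, p.geom);
```

The point is the workflow, not the DDL: the cloud data set stays in object storage, and PostGIS acts as the query engine over it.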

Matt Forrest from Wherobots walked us through the interplay between PostGIS and Apache Sedona. He made a compelling case for using the right tool for the job: Sedona for the massive, distributed heavy lifting (i.e., processing 2.5 billion buildings in an hour) and PostGIS for the high-speed, transactional retrieval. The future isn't one or the other; it's a pipeline where they talk to each other seamlessly.

Fawad Qureshi from Snowflake echoed this, showing how the boundaries between transactional and analytical systems are fading. He demonstrated how Snowflake and PostGIS can share data without copying, allowing heavy analytical workloads to run alongside operational apps without resource contention.
Darafei Praliaskouski from Maumap and Alper Dincer brought the hexagons. Darafei walked us through building real-world pipelines using PostGIS and H3 to model Meshtastic LoRa mesh network coverage in Georgia, handling road networks and admin boundaries at scale. Alper showed how his platform, drought.uk, uses H3 to aggregate massive climate data sets, turning heavy raster processing into fast, index-based SQL queries.
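The core H3 trick both talks relied on is simple: assign each observation to a hexagonal cell, then aggregate by cell with a plain `GROUP BY`. A minimal sketch using the h3-pg extension follows; the table and column names are hypothetical, and you should check the h3-pg docs for the exact function names in your version.

```sql
-- Sketch: bin point observations into resolution-7 H3 hexagons
-- and aggregate. Requires the h3-pg extension alongside PostGIS.
SELECT
    h3_lat_lng_to_cell(geom::point, 7) AS cell,      -- hexagon index
    avg(soil_moisture)                 AS avg_moisture,
    count(*)                           AS n_obs
FROM climate_observations
GROUP BY cell;
```

Because the H3 index is just a value in a column, it can be indexed and joined like any other key, which is what turns "heavy raster processing" into ordinary SQL.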

Krishna Lodha joined us from the future (literally, it was 2 a.m. in New Zealand!) to talk about the modern OGC microservices stack. He showed how pg_featureserv and pg_tileserv allow developers to expose PostGIS data directly to the web without the weight of legacy middleware.
Industrial-grade engineering
Amidst the new shiny tools, we were reminded why PostGIS runs the world. The "boring" stuff is still the most critical.
Cedric Duprez from IGN France gave us a history lesson that felt like a victory lap. IGN adopted PostGIS 20 years ago — back when it was version 0.8! He walked us through two decades of migration, from proprietary systems to a fully open source stack that now powers the French national mapping infrastructure. It was a testament to the long-term stability and maturity of the project.
Michael Keller from State Farm pulled back the curtain on the company’s massive internal geospatial platform. This isn't a toy app; it's the system that responds when hurricanes and wildfires strike. He showed how State Farm uses PostGIS to handle everything from real-time claims adjustment to analyzing catastrophe risk, scaling up infrastructure on AWS to meet demand during disasters.
Lars Aksel Opsahl from NIBIO showed us the extreme end of data modeling with PostGIS Topology. He isn't just storing polygons; he's managing a seamless, topological coverage of Norway's land use. He shared the gritty details of managing millions of shared edges and the custom tools NIBIO has built to keep its data clean and consistent.
Adam Kipkemei from Telkom Kenya shared a fascinating case study on optimizing sales distribution. He used PostGIS to track "Sales Commandos" in the field, ensuring they stick to their sales territories and optimizing their routes to maximize SIM card and airtime sales.

Bruce Rindahl from the Mile High Flood District brought us back to the metal with linear referencing. He demonstrated how he uses PostGIS to snap hydraulic model nodes to stream centerlines, converting messy, disconnected text files into a clean, visualizable stream network.
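The snapping step at the heart of this workflow maps directly onto PostGIS's linear referencing functions. Here is a minimal sketch, assuming hypothetical `model_nodes` and `stream_centerlines` tables: for each node, find the nearest centerline with a KNN search, compute its fractional measure along the line, and interpolate the snapped position.

```sql
-- Sketch: snap each hydraulic model node to the nearest stream
-- centerline. ST_LineLocatePoint returns a 0..1 fraction along the
-- line; ST_LineInterpolatePoint turns that fraction back into a point.
SELECT
    n.node_id,
    ST_LineLocatePoint(s.geom, n.geom) AS measure,
    ST_LineInterpolatePoint(
        s.geom,
        ST_LineLocatePoint(s.geom, n.geom)
    ) AS snapped_geom
FROM model_nodes n
CROSS JOIN LATERAL (
    SELECT geom
    FROM stream_centerlines
    ORDER BY geom <-> n.geom      -- index-assisted nearest-neighbor
    LIMIT 1
) s;
```

The `LATERAL` subquery with the `<->` operator keeps the nearest-line lookup fast even against a large centerline table, provided a spatial index exists on `stream_centerlines.geom`.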
The PostGIS community
PostGIS isn't just code; it's a community of cartographers, maintainers and storytellers. This year, we saw the breadth of that community in full force.
Regina Obe, a member of the Project Steering Committee, took us on a journey through time — literally. She explored the history of temporal support in databases, from the early days of Ingres to modern extensions like MobilityDB, showing how we can model time alongside space.
Jochen Topf, the maintainer of osm2pgsql, showed us how the tool has evolved beyond simple imports. With the new "flex" output, you can now perform complex data generalization and cleaning on the fly as you load OpenStreetMap data into PostGIS.
Michele Tobias from UC Davis DataLab gave us a masterclass in QGIS cartography. She reminded us that making a map is about storytelling, communication and simplification. Her tips on visual hierarchy and font management are essential for anyone trying to make their spatial SQL results actually look good.
Bonny P McClain dazzled us with 3D data storytelling. Using Blender and PostGIS together, she turns dry data tables into immersive, visual narratives about biodiversity, pollution and urban planning. It was a reminder that data doesn't exist in a vacuum; it has to tell a story to drive change.
Samuel Mather, a high school student, closed the loop on what this community is all about. He presented his epidemiology class project, where he used PostGIS to generate isochrones for emergency resource distribution. Seeing the next generation pick up these tools and immediately apply them to real-world problems was the highlight of the day.
Paul Ramsey, the event host, closed the day with a recap of the big features that have come to PostGIS (and the libraries it leverages) over the past five years. Robust overlay, faster predicates, polygonal coverages, coverage cleaning, and a whole lot more!
The verdict
It was an honor to host PostGIS Day again. What struck me most this year was the continued growth of the database as a platform. PostGIS is not just a bucket for geometry or even just an engine of spatial SQL. It is the brain of AI agents, the query engine for the data lake and the bedrock of enterprise infrastructure.
Thanks to everyone who participated in the chat and Q&A. It was a lively experience — all 11 hours of it!
