Welcome to Planet OSGeo

January 07, 2026

The GeoLIBERO network has been officially recognized as a Red Emérita (Emeritus Network) by the CYTED Programme. This recognition was approved by the General Assembly held in November 2025 in Asunción (Paraguay) and is, above all, an endorsement of the collective work we have been doing for years around free and open source geomatics in Ibero-America.

In addition, I have been appointed coordinator of the GeoLIBERO Emeritus Network for the 2026–2029 period, a responsibility that we at the gvSIG Association take on with pride and with great enthusiasm for continuing to drive this community forward.

What does it mean for GeoLIBERO to be an Emeritus Network?

CYTED Emeritus Networks are not “new” networks; quite the opposite: it is a seal granted to the networks that have demonstrated the greatest value and impact, and the capacity to remain active once their phase as a Thematic Network has ended.

In our case, this recognition means that GeoLIBERO keeps the CYTED seal and backing, and consolidates itself as a mature space for collaboration between universities, public administrations, and organizations committed to open knowledge.

A track record built on collaboration

GeoLIBERO was born with a very clear idea: to put free and open source geomatics at the service of the real needs of Ibero-America. Over these years we have worked on geomatics and technology transfer, always with a very clear philosophy: free software, open standards, and cooperation among equals.

The fact that CYTED recognizes GeoLIBERO as an Emeritus Network reflects that this approach makes sense and generates impact. But, above all, it is a recognition of the work of all the people and groups who have been part of the network.

Looking ahead (2026–2029)

In this new phase as an Emeritus Network we want to keep strengthening the Ibero-American free geomatics community, foster new collaborations and joint projects, reinforce training and knowledge transfer, and explore new lines of work linked to territorial governance, open data, risk management, sustainability, and geospatial intelligence.

All of this while keeping the spirit with which GeoLIBERO was born: sharing, cooperating, and building useful solutions with free and open source geomatics.

by Alvaro at January 07, 2026 08:36 AM

QField at FOSS4G 2025 Auckland: From Mobile App to Production Infrastructure

Throughout the week, in workshops, presentations, and project showcases, a consistent theme emerged: QField is not just “the mobile companion to QGIS”; it is production infrastructure for complete field-to-cloud-to-desktop workflows.

It was incredible to see how present QField was throughout FOSS4G 2025 in Auckland. With around 20 presentations and workshops featuring QField, the conference showcased a wide range of real-world, production-grade use cases across many sectors. 

What stood out was not just the number of talks, but how consistently QField was presented as a trusted, operational tool rather than an experiment.

The QField Ecosystem in Practice

  • QGIS Desktop for project design, analysis, and quality assurance
  • QField for field capture, with offline-first capabilities when connectivity is limited
  • QFieldCloud for real-time synchronization, team coordination, and project management
  • Plugins and APIs for integration into broader organizational systems

This ecosystem approach transforms field data collection from an isolated task into an integrated workflow. It’s the difference between “collecting points” and “running a programme.”

QField Day: A Community Deep Dive

Early in the conference, QField Day brought together practitioners, developers, and decision-makers for a focused exploration of the platform’s capabilities. The day emphasized practical implementation—what’s possible now, and what organizations are already achieving in production environments.

Workshops

Complete Lifecycle Management

The QField & QFieldCloud workshop covered the full data collection cycle: project setup in QGIS Desktop, field deployment with QField, synchronization through QFieldCloud, and integration back into desktop workflows for analysis and quality control. Participants worked through the entire pipeline, from initial design to final deliverables.

Field-to-Analysis Integration

One workshop demonstrated the speed of modern field-to-cloud-to-analysis workflows by using Auckland itself as a live laboratory. Participants collected ground truth data with QField, then fed it directly into machine learning workflows running in Digital Earth Pacific’s Jupyter environment. The exercise highlighted how quickly iteration cycles can operate when field, cloud, and analysis infrastructure are properly connected.
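
To make the shape of that loop concrete, here is a minimal Python sketch of the field-points-to-classifier step, assuming the QField observations have already been synced and enriched with satellite band statistics. The file name, column names, and features are illustrative, not Digital Earth Pacific’s actual code.

```python
import geopandas as gpd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Ground-truth points collected in QField, synced out as a GeoPackage
truth = gpd.read_file("field_observations.gpkg")  # hypothetical file

# Assume each point already carries band statistics sampled at its
# location (e.g. via a point-sampling step in the open data cube)
features = truth[["ndvi_mean", "nir_mean", "red_mean"]]
labels = truth["invasive"]  # 1 = confirmed invasive species, 0 = absence

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=42
)
model = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}")
```

Each field campaign adds rows to the training table, which is what makes the iteration cycle so fast once field, cloud, and analysis are connected.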

Plugin Development

For developers, the plugin authoring workshop signaled platform maturity. QField’s plugin framework, built on QML and JavaScript, enables organizations to extend core functionality for specific operational requirements. Custom forms, specialized integrations, and domain-specific interfaces can be developed to address the edge cases that real field programmes encounter.

Operational Workflows: Digital Earth Pacific

Platform Integration: QField participatory mapping integration into Digital Earth Pacific demonstrated the technical workflow connecting field data collection to analysis infrastructure, using Digital Earth Pacific’s open data cube and Jupyter tooling.

Applied Case Study: Identifying Forest Invasive Species in Fiji and Tonga Using Machine Learning showed this workflow in action. Field teams collect confirmed invasive species locations using QField, then train detection models using time-series satellite data, iterating with domain experts and local partners to refine results.

Production Deployments

Conservation Operations

Zero Invasive Predators showed QField and QFieldCloud integrated into operational fieldwork for predator eradication programmes across New Zealand. Planning happens in QGIS, capture in QField, and coordination through QFieldCloud—enabling systematic management of conservation campaigns across remote terrain.

Government-Scale Implementation

Finland’s National Land Survey presented their use of QField as part of national topographic data production infrastructure, deployed alongside QGIS and PostGIS. This represents enterprise validation: a national mapping agency selecting QField for production topographic surveying.

Precision Agriculture

Smart vineyards with QGIS & QField demonstrated advanced symbology, map themes, and structured capture workflows supporting precision agriculture operations—showing that the platform handles the level of detail and complexity that professional workflows require.

Developer Infrastructure and Sustainability

QFieldCloud API

The QFieldCloud API session focused on programmatic integration for organizations with existing systems. The API enables automation, custom integrations, and connection to enterprise infrastructure—essential for organizations moving beyond standalone deployments.
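
As a flavour of what such integration can look like, here is a hedged Python sketch using plain HTTP. The base URL, endpoint path, and token scheme are assumptions for illustration; consult the QFieldCloud API documentation (or the official qfieldcloud-sdk Python package) for the real interface.

```python
import requests

BASE = "https://app.qfield.cloud/api/v1"  # assumed base URL
TOKEN = "your-api-token"                  # personal API token

headers = {"Authorization": f"Token {TOKEN}"}

# List accessible projects (endpoint path and payload shape are assumed)
resp = requests.get(f"{BASE}/projects/", headers=headers, timeout=30)
resp.raise_for_status()
for project in resp.json():
    print(project.get("name"), project.get("id"))
```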

Open Source Business Models

Who Pays Your Bills? offered a transparent discussion of what it takes to build sustainable businesses around QGIS and QField. These conversations matter for the broader open-source geospatial ecosystem, addressing the practical realities of long-term project sustainability.

Platform Maturity

[Re]discover QField[Cloud] highlighted how platform maturity often manifests as steady capability growth—the accumulation of thoughtful improvements driven by real field workflows rather than flashy feature releases.

Context: Open Tools for Public Good

Two presentations framed QField’s development within broader conversations about open tools and long-term impact.

Mapping the World, Empowering People: QField’s Vision in Practice connected QField’s technical capabilities to public-good outcomes, addressing how tools enable not just efficiency but meaningful impact.

Open-source road infrastructure management and digital twin direction demonstrated that open standards and open tooling are increasingly part of serious infrastructure conversations—and that many organizations are still transitioning from spreadsheet-based workflows to structured spatial data management.

Looking Forward

FOSS4G 2025 Auckland was all about the conversations, and our small booth quickly became a popular meeting point — the QField caps were gone within half a day. We demonstrated the tight integration of Happy Mini Q GNSS with QField, showing how sub-centimeter positioning can be used seamlessly in real field workflows. The booth also featured EGENIOUSS, an EU project where QField is part of the solution to complement GNSS with visual localisation, enabling accurate and reliable positioning even in challenging environments such as urban canyons where satellite signals alone fall short.

Thank you to everyone who took the time to share your workflows, your challenges, and your stories—whether in presentations, workshops, or over coffee between sessions.

Hearing how you’re using QField in the field, what’s working, what needs improvement, and what you’re building next helps us understand where the platform needs to go.

These conversations remind us that we’re building tools for real people doing important work, and that’s what keeps this community moving forward together.

by Daniel O'Donohue at January 07, 2026 12:16 AM

January 06, 2026

Reinder spotted this mosaic map of the Old City of Rome, “Saw this in the lobby of the Santa Chiara hotel in Rome. The size of the map is let’s say 50 x 70 cm, it is made of little stones of approximately 4 x 4 mm. No artist is mentioned — but he or she deserves our respect. Ciao!”

by Steven at January 06, 2026 10:00 AM

January 05, 2026

Click the image to go to the Unequal London Map (v2)

I was quite pleased with the first version of the Unequal London Map, but by the time I had crawled over the finish line I realised that there were several choices that I had made which were less than ideal, particularly with regard to the data I selected. I thought it should be quite easy to build another version with more/different data and remedy some of the other issues. I decided to start with Gemini to avoid the usage limits with WebMapperGPT.

The first stage was to identify the data that I could use to give a better view of inequality in London. I started with this prompt:

“Map of the unequal distribution of wealth in London. I want to make a map of the unequal distribution of wealth in london, particularly highlighting the boroughs or wards or output areas where there is a clear split. What publicly available data sets can you suggest?”

After a bit of back and forth Gemini suggested:

  • Health & Environment
    • Air Quality (NO2 & PM2.5):
      • Link: London Atmospheric Emissions Inventory (LAEI)
      • Level: Grid/Borough (You can map these to LSOAs using a spatial join in QGIS or Python; see the sketch after this list).
    • Respiratory & Housing-Related Health:
      • Link: OHID Fingertips – Public Health Profiles
      • Level: LSOA.
      • How to find: Go to the link, click “Data”, then select “LSOA” as your area type. Look for indicators like “Emergency hospital admissions for COPD” or “Respiratory disease.”
  • Digital & Connectivity
    • Broadband Speeds (Digital Divide):
      • Link: Ofcom Connected Nations Data
      • Level: LSOA.
      • Dataset: Download the “Fixed LSOA data”. It contains the percentage of premises that have “Average Download Speeds” below certain thresholds.
  • Public Transport Accessibility (PTAL):
    • Link: TfL PTAL Dataset – London Datastore
    • Level: 100m Grid (Can be easily aggregated to LSOA).
    • Tip: Download the “PTAL 2015” (or latest) and use the average score per LSOA to define “transport poverty.”
  • Cultural & Social (Census 2021) – The ONS Nomis tool is the best way to get these filtered for London immediately.
    • Household Language:
      • Link: Census 2021 – Household Language (TS025)
      • Level: LSOA.
    • Religion vs. Occupation (Social Segregation):
      • Link: Census 2021 – Occupation by Religion (RM106)
      • Level: LSOA.
      • Selection: Use the “Area Type” filter on the left to select “LSOAs” and “Region” to select “London”.
  • Financial Resilience
    • Universal Credit Claimants (Live Stress)
    • Small Business & Household Lending
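
As an example of the spatial join Gemini mentions for the LAEI data, a minimal geopandas version might look like this; the file and column names are placeholders for whatever you download.

```python
import geopandas as gpd

lsoas = gpd.read_file("london_lsoas.gpkg")
no2_grid = gpd.read_file("laei_no2_grid.gpkg").to_crs(lsoas.crs)

# Join grid cells to the LSOAs they intersect, then average per LSOA
joined = gpd.sjoin(no2_grid, lsoas[["lsoa_code", "geometry"]],
                   how="inner", predicate="intersects")
no2_by_lsoa = joined.groupby("lsoa_code")["no2"].mean().rename("no2_mean")

lsoas = lsoas.merge(no2_by_lsoa, on="lsoa_code", how="left")
```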

That’s a lot of data, and some of it was difficult or almost impossible to access. The DWP extract tool was particularly challenging, and I wasted over an hour trying to get health data from the Fingertips site. Eventually I managed to access most of the data, and then Gemini guided me through preprocessing it: joining to LSOAs, working out how to convert from 2011 LSOAs to 2021, converting the NOx data from a grid file to a vector layer, and aggregating some data that was at OA level up to LSOA.
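
For the non-spatial wrangling, both the 2011-to-2021 conversion and the OA-to-LSOA aggregation boil down to merges against ONS lookup tables. A sketch in pandas, with placeholder file and column names:

```python
import pandas as pd

# 1) Re-key a 2011-LSOA dataset to 2021 LSOA codes via an ONS lookup
data_2011 = pd.read_csv("income_2011_lsoa.csv")          # has lsoa11cd
lookup = pd.read_csv("lsoa2011_to_lsoa2021_lookup.csv")  # lsoa11cd, lsoa21cd
data_2021 = data_2011.merge(lookup, on="lsoa11cd", how="left")

# 2) Aggregate an Output Area (OA) level indicator up to LSOA level
oa_data = pd.read_csv("census_oa.csv")           # oa21cd, households
oa_lookup = pd.read_csv("oa_to_lsoa_lookup.csv") # oa21cd, lsoa21cd
by_lsoa = (oa_data.merge(oa_lookup, on="oa21cd")
                  .groupby("lsoa21cd")["households"].sum()
                  .reset_index())
```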

Then I started to calculate the indicators, rankings and deciles that Gemini recommended. A big gotcha: the field calculator in QGIS is very powerful, but it leaves no record of what calculation I had undertaken, so later on I was uncertain how I had calculated the rankings and deciles. In future I will keep a scratch pad and copy and paste the expressions that generated the rankings, so that I can tweak them later on.
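
One way to keep those workings is to compute the rankings and deciles in a short script rather than in the field calculator, so the expressions live in a file you can rerun and tweak. A pandas sketch with illustrative column names:

```python
import pandas as pd

df = pd.read_csv("lsoa_indicators.csv")

# Rank LSOAs (1 = highest score) and bin them into deciles, reproducibly
df["imd_rank"] = df["imd_score"].rank(ascending=False, method="min")
df["imd_decile"] = pd.qcut(df["imd_score"], q=10, labels=False) + 1

df.to_csv("lsoa_indicators_ranked.csv", index=False)
```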

When I had finished processing and organising the data, I was ready to export to geojson and then convert to tiles. Gemini recommended the precision setting for the export to geojson and the parameters for the Tippecanoe pmtiles conversion. With hindsight, I think the simplification was a little too aggressive and there are a few slivers in the pmtiles but it wasn’t bad enough to warrant rerunning this stage.
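
For reference, the conversion step can be scripted so it is easy to rerun with gentler settings. The tippecanoe flags below are real options, but the layer name and choice of flags are illustrative rather than the exact parameters Gemini recommended:

```python
import subprocess

subprocess.run([
    "tippecanoe",
    "-o", "unequal_london.pmtiles",  # write PMTiles directly
    "-l", "lsoas",                   # layer name inside the tiles
    "-zg",                           # let tippecanoe guess a max zoom
    "--drop-densest-as-needed",      # thin features instead of failing
    "--force",                       # overwrite any existing output
    "lsoa_indicators.geojson",
], check=True)
```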

I then wrote quite a long and detailed prompt outlining the map that I wanted, provided the v1 code as a basis plus a screenshot of the fields listing from QGIS (I couldn’t find a way to get a list of fields in a copyable form, but Gemini seemed quite happy with a screenshot) and, hey presto, I had a working map! I started iterating through corrections, enhancements and tweaks to the thematic renderings, thresholds and colours, and was thinking this was easy. So I grouped a few requests for enhancements together, including a collapsing legend for mobile users, and sure enough I broke it!

Then I learnt why it is so important to save a copy of a working version, by not having one and falling into another doom spiral of Gemini trying to fix what had gone wrong while I applied fixes to fixes and broke things further. After a while I scrolled back, found the last working version of the code, reinstated it and breathed a sigh of relief. From here on I saved a copy of my working files before committing any further updates. You may be thinking: why isn’t he using Git? The answer is that, foolishly, I was copying and pasting Gemini’s code onto my live cloud server rather than into a development environment – stupid, I now know.

Now that I had a working version again, I went slowly, fixing or adding one thing at a time (with saves in between). Quite quickly I had solved the rendering problems and had a more mobile-friendly version, extended info popups with more elegant formatting, and methodology and source explanations for each layer.

I am pretty pleased with the result but I have a few reservations:

  • I’m not sure I have got all of the indicators and rankings right, and as I didn’t save the workings I can’t easily correct them. I think it is good enough, and most of the layers around where I live confirm my own knowledge.
  • A bit more research into the datasets and the best way to convert them into indicators would have produced a better map.
  • The map works well on a laptop, but even with the collapsing menu it isn’t very usable on a mobile. Next time I think it needs more focus on the mobile experience, possibly delivering a significantly different interface and interactions for the mobile version.

This turned into a full-on data sourcing, cleaning and processing job before I even prompted Gemini to build a web map for me. That’s quite a long way from Ken’s aim to make a map a day! Next time I am going to try writing a much longer prompt for a less ambitious map and see whether I can get Gemini to source the data, clean it and turn it into a map.

by Steven at January 05, 2026 10:34 PM

January 04, 2026

After the relative success of my first attempt to build a map with WebMapperGPT (WMGPT), I wanted to try something a bit more ambitious. A conversation about how, in most parts of London, rich and poor live relatively close to each other prompted me to try to build a map illustrating Unequal London.

This was a much more complex project with several stages:

  • Identify what data is available using WMGPT and download, confirm that it is suitable
  • Work out how to clean and process the data (most data is not in geospatial format so had to be joined to spatial boundaries)
  • Calculate percentages, deciles, rankings or other values suitable for mapping
  • Export into a web ready data format – geojson or pmtiles
  • Prompt WMGPT to build a map
  • Iterate (several times!) and maybe reprocess the data after finding unexpected results

I started out with Lower layer Super Output Areas (LSOAs) for London and Council Tax data from the London Datastore, plus the Index of Multiple Deprivation and Small Area Income Estimates from the ONS. I got WMGPT to remind me how to do the joins in QGIS because it had been a while since I did much with QGIS; by the end of this project I was a good bit better.

A tip – you can calculate formulae like percentages in the QGIS field calculator, but since you are starting out with a csv or xlsx file before joining to your boundaries, it’s a lot easier to do this in Excel before joining. I had a few problems with null values and odd text strings where I was expecting a number; again, those are easier to resolve in Excel than in QGIS.
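
If you would rather script that cleanup than do it in Excel, the pandas equivalent is short; the file and column names here are illustrative:

```python
import pandas as pd

df = pd.read_csv("council_tax.csv")

# Odd text strings and blanks become NaN instead of breaking the maths
df["band_h"] = pd.to_numeric(df["band_h"], errors="coerce")
df["total"] = pd.to_numeric(df["total"], errors="coerce")

df["pct_band_h"] = 100 * df["band_h"] / df["total"]
df.to_csv("council_tax_clean.csv", index=False)
```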

Another tip – convert your boundaries from shapefiles to GeoPackage; it works better in QGIS and supports long field names (unlike shapefiles). You will be grateful if you adopt a field-name convention of all lower case with underscores instead of spaces, like field_name.
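
Both tips can be scripted in one pass with geopandas; the file names are placeholders:

```python
import geopandas as gpd

gdf = gpd.read_file("LSOA_2021_boundaries.shp")

# lower-case, underscore-separated names (GeoPackage has no 10-character
# field-name limit, unlike shapefiles)
gdf.columns = [c.strip().lower().replace(" ", "_") for c in gdf.columns]

gdf.to_file("lsoa_2021.gpkg", layer="lsoas", driver="GPKG")
```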

Eventually I had the data ready in QGIS and was able to view it and test out simple thematics. The next step was to export the data into a geojson file and convert to WGS84. Easy to do, but you get a pretty large file with almost 5,000 boundaries, so I tried several simplification algorithms in QGIS; I found the GRASS simplify tool the best, with a fairly small snapping tolerance.
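
The reprojection is a one-liner in Python, shown below with a naive per-feature simplify for comparison. Per-feature Douglas-Peucker can open slivers between neighbouring polygons, whereas the GRASS tool works on shared boundaries, which is presumably why it behaved better here. The tolerance is a guess:

```python
import geopandas as gpd

gdf = gpd.read_file("lsoa_2021.gpkg").to_crs(epsg=4326)  # WGS84 for the web

# Per-feature simplification: fast, but may create gaps at shared borders
gdf["geometry"] = gdf.geometry.simplify(0.0001, preserve_topology=True)

gdf.to_file("lsoa_web.geojson", driver="GeoJSON")
```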

Now I was ready to start building the web map. I gave WMGPT a sample of the geojson so that it knew the file structure, and asked it to produce a web map with different modes to illustrate inequality – the more detailed the prompt, the fewer iterations you will need to fix or enhance the code; I learnt that the hard way. It took ages to get the map polygons to render and colour on top of a base map with working popups. Eventually I got a working version running, but loading the geojson was very slow and sticky. WMGPT suggested the solution was to switch to pmtiles, which load progressively, but just then the free version of WMGPT timed out due to the number of requests I had needed to get to a working map – aaaarrrrgggghhhh!

I switched to Google Gemini (GG) which doesn’t seem to have a problematic usage limit (albeit the free version isn’t their latest and greatest). I uploaded my working html, css and js files and explained that I wanted to migrate to pmtiles.

The first challenge was converting geojson to pmtiles: the QGIS plugin refused to load and the suggested online tools didn’t work, so eventually I gave in, installed Homebrew on my Mac and installed Tippecanoe to do the conversion. It seemed like overkill, but with hindsight it was a good move, as it made it easier to regenerate pmtiles later on.

Adapting the code to work with pmtiles was a nightmare – one step forward, two steps back. I got the tiles to render and colour, but no info popups appeared when I clicked on the map; trying to fix that, I broke everything. And on and on. Along the way I tried changing the links for the Leaflet and pmtiles libraries and tried every conceivable combination of versions of the two. I am sure that by this stage my friends who are coders were chuckling and muttering “that is why AI coding is not worth the effort”, but at my age and with my lack of coding skills, if I am going to get started I need a lot of help.

Then the breakthrough – somewhere in this doom spiral (hours and hours of iteration) I recalled that at some point GG had mentioned MapLibre. I asked whether it would be easier to migrate the project to MapLibre and, hey presto, within a couple of minutes I had a working version. Apparently MapLibre has built-in support for pmtiles, while Leaflet needs an extra provider/plugin which didn’t seem to work.

A brief diversion on basemap tiles – I wanted a greyscale basemap that would keep the focus on the thematic layers but peek through to provide some context as users moved around the map. I tried several options that GG suggested – Carto tiles wouldn’t render properly, and Esri tiles were so muted that they would not show through the thematic layer even with the opacity turned down a long way (which made the thematics look wishy-washy). After an hour or so of effort, I reverted to the standard OSM tiles.

From here on it was a case of making a few small enhancements – tuning colours, styling the info box, adding a search function and a “locate me” – and I had a decent Unequal London map.

With hindsight again, I am not sure that the data and calculations I chose were the right ones, but at least I had a working map and I learnt a lot along the way. I’m sure the next project will benefit from what I have learnt – there’s no way I could do one of these a day.

by Steven at January 04, 2026 07:41 PM

January 03, 2026

I don’t know where to start with this one. Yes it is a map in the wild, but why would anyone make this decaying world held by rotting hands and then award it as a peace prize? I thought FIFA governed world football, is it really FIFA’s role to award peace prizes? As for the recipient, I think the jury is out.

This could have been an April Fool’s post but I thought it might serve as a starter for 2026. Mappy New Year!

by Steven at January 03, 2026 10:00 AM

January 02, 2026

Melinda Clarke (of Melbourne Map fame) sent me this; she said:

“I met this young girl, we got chatting and I shared that I was producing a map of Melbourne. She then declared her love for maps, and travelling around the world – and showed me her tattoo of a world map on the soles of her feet! If you look closely – she has even included NZ! I am thinking a tattoo on the soles of your feet might just have hurt a bit.”

What a map in the wild! I think that is “above and beyond”.

by Steven at January 02, 2026 10:00 AM

January 01, 2026

This has to be one of the best maps in the wild ever!

Elizabeth sent me this from Weymouth “Last night on the South Coast. Looks like the map of UK, Ireland and Denmark.”

Mappy New Year to everyone who contributes maps in the wild or follows us, let it be a year full of whacky maps.

by Steven at January 01, 2026 10:00 AM

December 31, 2025

It’s been another exciting year for the GRASS project! From improved interfaces for Python users to modernized documentation and a major developer gathering in North Carolina, the year 2025 brought significant advances on multiple fronts. Let’s take a look back at what our community accomplished.

Development Activity

A major theme of 2025 was improving how users interact with GRASS across different interfaces. This work touched everything from tool outputs to Python APIs to documentation, creating a more cohesive and accessible experience.

December 31, 2025 08:12 AM

December 30, 2025

Well, 2025 was quite a fun and productive year! Here’s the rundown on my last 12 months: WMO: WIS2 went operational on 01 January 2025, and things ran relatively smoothly overall. Check out our Global Services listing! WIS2 currently has over 100 WIS2 Nodes and 19 Global Services, and we continue to refine and tweak […]

by tomkralidis at December 30, 2025 04:35 PM

A major accomplishment in 2025 was re-achieving CITE Certification status for GeoServer. Many thanks to all who were involved! After approximately 10 years, we can once again officially confirm that GeoServer is OGC compliant.

Maintaining Certification is an annual expense for the project - and a sponsorship opportunity for you!

Thanks to Gaia3D and OSGeo:UK for sponsorship covering the expense of CITE Certification for 2025.

Certification January 9th 2026 Deadline

Our 2025 CITE Certification for GeoServer 2.27 expires on January 9th.

CITE Sponsorship Opening

This certification process is an annual expense of around $1000 USD, and we will require sponsorship for 2026 to maintain certified status. GeoServer implements a number of different standards, each of which costs $150 USD to certify each year; at $150 per standard, the roughly $1000 USD total corresponds to certifying around seven standards. The $150 per-certification cost uses an agreed rate offered to the Open Source Geospatial Foundation as a not-for-profit organization. The approximate $1000 USD expense will increase slightly as we gradually adopt additional OGC API standards into the application.

CITE 2026 Sponsorship Opportunity

OGC CITE Certification is important for several reasons:

  • Provides assurance that GeoServer may be integrated into solutions as intended by the Open Geospatial Consortium.
  • Offers an independent source of black-box testing, ensuring that each GeoServer release behaves as intended.
  • Provides a logo and visibility for the project, helping to promote the use of open standards.
  • Inclusion in the list of certified products helps people discover GeoServer, and allows GeoServer to be used by organizations requiring certification.

In addition to CITE certification offering value for the project, sponsoring can provide excellent visibility for potential sponsors, with your logo appearing on the GeoServer home page, and in presentations when each standard is mentioned.

CITE Home Page Visibility
Service Slides Visibility

If you or your organization is in a position to support this activity, please contact any member of the Project Steering Committee, or the OSGeo treasurer. For details visit the GeoServer Sponsorship Page. The more organizations that are able to sponsor, the lower the expense will be for each organization (sharing the approximately $1000 USD per year cost).

Thank you for your support!

CITE 2026 Sponsorship Fulfilled

Sponsorship: Thanks to the following organizations for stepping forward with financial assistance. We are now in a position to maintain certification for calendar year 2026.

In-Kind: Thanks to the following organizations for “reference implementation” certification. To act as a reference implementation, GeoServer must maintain a service operating as an example of how the standard is intended to function. The Open Source Geospatial Foundation has provided hosting, and AfriGIS provided setup and maintenance.

In-Kind: Finally, thanks to the organizations that have directly contributed to passing the test and integrating these tests into our quality assurance workflows.

by Jody Garnett at December 30, 2025 12:00 AM

December 29, 2025

December 28, 2025

Free and Open Source for Geospatial North America (FOSS4GNA) 2025 ran November 3–5, 2025, and I think it was one of the better FOSS4GNAs we've had. I was on the programming and workshop committees, and with the government shutdown we were worried things could go badly, since people started withdrawing their talks and workshops very close to curtain time. Despite our attendance being lower than in prior years, it felt crowded enough, and on the bright side, people weren't fighting for chairs even in the most crowded talks. FOSS4G 2025 International happened two weeks after, in Auckland, New Zealand, and from what I heard it had a fairly decent turnout too.

Continue reading "FOSS4GNA 2025: Summary"

by Regina Obe (nospam@example.com) at December 28, 2025 11:37 PM

December 27, 2025

December 26, 2025

Over the past year, we have been revisiting the Hyper-Layering Architecture (HLA) within the W3C, in the context of ongoing discussions about making maps a first-class citizen of the Web. Hyper-Layering Architecture is not a new or purely theoretical concept. Its origins go back to discussions following the 1995 Great Hanshin–Awaji Earthquake, and it has been developed within the W3C community since the early days of SVG. The architecture was formally submitted to W3C in 2011, later reflected in SVG specifications, standardized domestically as JIS X 7197:2012, and has been operated in real-world systems for more than 15 years. Today, HLA 2.0 (Layers as Web Apps) forms the foundation of large-scale disaster information systems, including a recently launched public service. A draft Community Report summarizing the architecture and its background is available here. *1

Context within the Maps for HTML Community Group
At the same time, the Maps for HTML Community Group has been exploring MapML and related technologies, with the shared aspiration of making maps more native to the Web platform. During recent discussions—particularly around TPAC 2025—we agreed that creating a new Community Group was unnecessary. Instead, it felt more constructive to continue this exploration within the Maps for HTML Community Group, where complementary but different perspectives on Web mapping already exist. As these conversations progressed, an important point became clear.

Two Approaches, One Aspiration
MapML and Hyper-Layering Architecture share a broadly similar aspiration, but they approach it from different directions.

  • MapML focuses on declarative, data-level interoperability within HTML.
  • Hyper-Layering Architecture, by contrast, focuses on composing entire Web applications and services into a shared spatial context, treating the Web itself as a distributed map.

Rather than concluding in advance how these approaches should be unified, I began to explore a related question: whether it might be useful to identify a smallest shared foundation that could be discussed independently of any single specification. We agreed that this question itself is worth open discussion within the Maps for HTML Community Group.

Extending the Question
This question extends beyond MapML and Hyper-Layering Architecture alone. What is the minimal concept shared not only by these two approaches, but also by widely deployed ecosystems such as Google Maps, Leaflet, OpenLayers, D3, and even the many existing Web pages that already convey spatial meaning? That line of inquiry led to what I call the Minimal Map Model.

The Role of the Minimal Map Model
The Minimal Map Model does not attempt to define map formats, rendering pipelines, or application logic. Instead, it focuses on the most basic relationship between a region in CSS coordinate space and the geographic coordinates it represents. By intentionally limiting its scope, it aims to remain neutral with respect to existing frameworks and services, while providing a common reference point for discussion.
The Minimal Map Model is described in a separate exploratory draft. *2
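
As a purely illustrative (and non-normative) reading of that relationship, the core correspondence can be pictured as a rectangular CSS-space region bound to a geographic bounding box with a linear mapping between the two; the draft report defines the actual model.

```python
from dataclasses import dataclass

@dataclass
class MinimalMap:
    width_px: float   # CSS-space width of the map region
    height_px: float  # CSS-space height of the map region
    west: float       # geographic bounds of the region, in degrees
    south: float
    east: float
    north: float

    def css_to_geo(self, x: float, y: float) -> tuple[float, float]:
        """Map a CSS-space point (x right, y down) to (lon, lat)."""
        # Linear in latitude for simplicity; real maps apply a projection
        lon = self.west + (x / self.width_px) * (self.east - self.west)
        lat = self.north - (y / self.height_px) * (self.north - self.south)
        return lon, lat

m = MinimalMap(800, 600, west=135.0, south=34.0, east=136.0, north=35.0)
print(m.css_to_geo(400, 300))  # centre of the region -> (135.5, 34.5)
```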

Hyper-Layering Architecture itself already has working implementations and specifications, and continues to evolve through long-term practical use. From that perspective, this work is not a call for immediate standardization of HLA, nor a proposal to replace any existing specifications. The Minimal Map Model, by contrast, is presented as an exploratory contribution—intended to help clarify common ground and to support constructive discussion across different mapping approaches.

Looking Ahead
We hope that continuing this discussion within the Maps for HTML Community Group will help connect these perspectives, and contribute to a more open, user-centric, and interoperable future for Web mapping.

Further details are available in the following draft reports:
*1: Hyper-Layering Architecture — Community Report (Draft)
https://svgmap.org/movie/TPAC2025/HyperLayeringArchitectureCommunityReport.html
*2: A Minimal Map Model — Community Report (Draft)
https://svgmap.org/movie/TPAC2025/minimalMapModel.html

by Satoru Takagi at December 26, 2025 09:11 PM

December 25, 2025

December 24, 2025

A few days ago, my pal Ken Field posted about a new project he is planning for 2026:

Ten years ago I had an idea to write about a map a day for a year. I’ve had another idea… Introducing the #365DayMapChallenge where Ian Muehlenhaus and I will make a map a day, using only natural language prompts and AI, for a whole year. More here: https://mapdesign.icaci.org/2025/12/the-365-day-map-challenge/

I had a look at the ICA Commission on Map Design site and thought I would have a play with their WebMapperGPT tools. The idea is to describe the map that you want to build and then the AI tool will generate the code for you. Turned out it is both compulsive and a lot harder than I would have expected!

I started out with the idea to make a political atlas illustrating the ideological classification of the governments of the world. Pause your questions about what “ideological classification” might mean or how you would do that; I had no idea at this stage.

My initial prompt was “make me a thematic map of the world by country showing goverments political leaning. Red = left wing, pink = social democrat/soft left, orange = liberal/centrist, pale blue = conservative/centre right, dark blue = hard right, black = authoritarian/dicatorship”

After a bit of Q&A I got the html to generate a map with a few countries’ data populated. I chose a colour scheme that will be familiar to those in Europe (Left = red, Right = blue) but is the reverse of US political colour conventions (Democrat = blue, Republican = red) and got a sort of working map.

It was smart, the way WebMapperGPT identified a source for country boundaries in geojson and the link to the OSM tile service (including attribution) for the underlying base map and, of course, it knew about calling Leaflet to build and manipulate the map.

Then, through a series of iterations and a couple of blind alleys (like the ambition to have 10 or 15 years of data and be able to move through it with a time slider), I got to a better-structured map with the data separated from the code, a legend, a filter and a count of each classification, but still not many countries actually classified.

Each time I iterated and something didn’t work, or I wanted an enhancement, WebMapperGPT gave me code snippets to correct errors or add something. BUT I am not a coder, some of the instructions were way less than clear, and I made numerous mistakes around where to paste things, missing brackets, etc. Several times I ended up asking WebMapperGPT to regenerate the code from scratch; I recommend making a backup copy of a working version before editing or adding to the code.

The big problem remaining was the data on all 194 countries. WebMapperGPT seemed to struggle to find the data (ISO code, official country name, leader, party, ideology) even though all of that except the ideological classification was factual and should be available somewhere online. I absolutely did not want to do loads of searches and then copy and paste this data into a dataset line by line! Then I had an idea: ask Google Gemini if it could produce this data in a json format that I had copied from WebMapperGPT – two or three iterations and less than five minutes later, I had a complete listing for all 194 countries with any uncertainties classified as “Unclassified”. I plugged that data into my map and, hey presto, a working map covering the world’s governments. I don’t know why WebMapperGPT couldn’t or wouldn’t do that.

Several more rounds of tweaking and debugging and I had a map with a search facility, a legend that opens and closes and starts closed on mobile displays and a methodology modal panel.

It’s not perfect, but I am quite pleased with it. It works better on a decent-sized screen than on a mobile; I’m not sure how you can do world-level mapping well on a tiny screen. The methodology for classifying the ideologies of each political leader is a bit opaque, and I have fessed up to that in the methodology panel: “Ideology classifications reflect best available information and may simplify complex political realities; they have been generated by AI and do not represent my opinions. I will update in the future.”

If you want to make a map and you don’t have the technical skills, WebMapperGPT may be a great helper. It talked me through some of the key decisions around data and design and sourced some of the data. Gemini was better at compiling and classifying the world dataset and then putting it into the right structure. One warning: this wasn’t a quick job, and with a free ChatGPT account you time out quite quickly and have to pause for a while before you can continue, but it remembers where you were and you can carry on asking questions once the timeout is over. If you are going to do this seriously you might need to sign up for a paid account.

I asked Ken how he was going to produce a map per day (even with a partner, that’s 180+ in a year), to which he replied, “Shouldn’t take more than ten minutes or so to make a map. I never said they’d be ‘good’ maps though!”. He also gave me some great advice which I will share with you: “The trick is thinking of every aspect of how you want the map to look and function, then writing that down.”

Warning: if you get started with this you may get a bit addicted – I did. Feel free to view and copy the source code. If you want the underlying data to play with, ping me.

I am going to make a few more maps (definitely not one per day!) and I’ll share them here. If you make an AI assisted map and you want to share, post something in the comments, maybe we can create a gallery.

by Steven at December 24, 2025 02:55 PM

Jeremy spotted this in the Lemon Tree pub in London. He said “It was really interesting to interpret. Around 1850 we reckoned. The Crystal Palace is still in Hyde Park, no Tower Bridge, the terminus main railway stations are in.”

You might think “that’s really neat”, and then Ken chipped in to identify the map as “Balloon View of London as seen from Hampstead by Henry Banks, 1851” – you could visit the original in the London Museum. For the Londoners trying to work out where Santa might be heading for his next delivery, remember that this map is north down (viewed from Hampstead looking south).

by Steven at December 24, 2025 10:00 AM

December 23, 2025

Dear reader,

In recent days, an article circulated on LinkedIn claiming that GeoServer was obsolete and in decline, characterizing it as a heavyweight tool and suggesting that solutions such as pg_tileserv and Martin could replace it entirely.

The argument presented, however, rested mostly on individual perceptions, without deeper technical analysis or consideration of different usage contexts. In this text I offer a grounded assessment of GeoServer's role in today's geospatial ecosystem, showing why it remains a robust solution, widely used in production, and far from obsolescence.

1. Can pg_tileserv / Martin replace GeoServer in complex environments?

The discussion about replacing GeoServer with tools such as pg_tileserv or Martin has gained ground as leaner, frontend-oriented geospatial architectures become popular. However, when we look at complex, institutional environments, the technical answer is clear: these tools are not functional substitutes for GeoServer, but complementary components.

2. Functional scope and purpose of the tools

pg_tileserv and Martin were designed to solve a specific problem: the efficient delivery of vector geospatial data to modern web applications, usually through vector tiles or simple REST APIs, with a focus on performance, low latency, and operational simplicity. Their stateless architecture, direct coupling to PostGIS, and absence of intermediate layers make them excellent choices for UX-oriented digital products.
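
To illustrate how lean that model is: pg_tileserv publishes each PostGIS table at a /{schema}.{table}/{z}/{x}/{y}.pbf endpoint, so fetching a vector tile is a single HTTP request. The host, table, and tile coordinates below are placeholders:

```python
import requests

# Default pg_tileserv port is 7800; table and tile address are examples
url = "http://localhost:7800/public.parcels/14/8185/5448.pbf"
resp = requests.get(url, timeout=30)
resp.raise_for_status()

# The body is a Mapbox Vector Tile, ready for MapLibre or OpenLayers
with open("tile.pbf", "wb") as f:
    f.write(resp.content)
```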

GeoServer, on the other hand, was designed as an enterprise geospatial server, aimed at publishing, managing, and making spatial data interoperable in multi-user, long-lived environments. Its functional scope is deliberately broader and covers requirements not addressed by tile servers or minimalist APIs.

3. Governance and the data life cycle

In complex infrastructures, the central challenge is not merely serving data but governing it. This includes:

  • Controlled publication of hundreds of layers;
  • Organization by themes, domains, and institutional responsibilities;
  • Management of styles, projections, and scales;
  • Per-layer and per-service access control;
  • Auditing and traceability.

Tools such as pg_tileserv and Martin offer no native mechanisms for this kind of governance. Reaching an equivalent level would require building additional solutions for cataloguing, versioning, authorization, and operational management, shifting complexity from the server to the application and increasing maintenance costs.

4. OGC standards and interoperability

Public and institutional environments rely heavily on established OGC standards such as WMS, WFS, and WCS, not only for technical reasons but also because of legal, regulatory, and interoperability requirements. These services allow the data to be consumed by a wide variety of clients, including QGIS, ArcGIS, and legacy systems.

pg_tileserv and Martin do not implement these standards, nor do they set out to. Even though REST APIs and OGC API represent important advances, completely replacing classic OGC services in established environments is, in practice, unfeasible in the short and medium term.

5. Operational workflows and user profiles

Another frequently underestimated aspect is the profile of the users responsible for publishing data. On complex platforms, technicians and analysts rely on well-established workflows, such as publishing layers directly from QGIS, with no need for intervention from development or DevOps teams.

Adopting tools like pg_tileserv or Martin exclusively would demand deep changes to these workflows, requiring greater technical specialization, custom automation, and new organizational processes, a cost that is rarely justifiable in public-sector environments.

6. Scalability: technical versus institutional

It is undeniable that pg_tileserv and Martin offer clear advantages in technical scalability and operational simplicity. In institutional infrastructures, however, scalability is not limited to requests per second. It also involves:

  • Operational continuity;
  • Long-term stability;
  • Ease of administration;
  • Compliance with standards and public policies.

In this context, GeoServer demonstrates its robustness by running stably in production for years, serving millions of requests and large data volumes; GeoSampa, one of the largest geoportals in Brazil, is a case in point.

7. Complementary architectures as the path forward

The most consistent evolution for complex environments is not a radical replacement of GeoServer but the composition of architectures. A hybrid model, in which GeoServer remains responsible for governance, interoperability, and institutional services while tools such as pg_tileserv or Martin handle the efficient delivery of data to modern web applications, tends to offer the best balance between innovation and stability.

8. Conclusion

pg_tileserv and Martin represent important advances in the geospatial ecosystem and are an excellent fit for certain scenarios. In complex, institutional environments, however, they do not replace GeoServer. The challenge is not to choose between “old” and “modern” but to understand the role of each component and to design architectures that respect both technical and organizational needs.

The maturity of a geospatial infrastructure lies not in adopting minimalist tools in isolation, but in integrating them coherently into an ecosystem that demands governance, interoperability, and long-term sustainability.

by Fernando Quadro at December 23, 2025 04:29 PM

Geomatico was born back in 2011 as a group of developers burned out by salaried employment. A group of people who wanted to own their own decisions, tired of being tied to a working model they did not feel aligned with.

For eight years, Geomatico was a self-managed network of freelancers: independent people united in decision-making.

Around 2017 we saw the need to take a leap and create a structure that would let us bid for certain contracts that this network of independent workers could not reach.

At that point we decided to incorporate a company to cover that specific need. Only two of the members were available for that adventure, and the viable option was to set up a limited company.

The first question our advisors put to us was how to split the shares. With two partners, they recommended a 51–49% distribution so that, in the event of a tie, one of the two could unblock decisions. As a concept, we could not accept that someone might impose their view simply by holding more shares. Given our assembly-based history, we have always understood that decisions should emerge from conversation, not imposition.

For two years we ran the limited company with 50% of the capital each, keeping our assembly-style dynamics, until in 2019 we had the opportunity to form the cooperative we are today.

If I had to sum up what Geomatico is, I would borrow a definition that one of our partners put very well. Geomatico is, first of all, a refuge for those who have ended up burned out by the working world, but above all it is a mutual support network. That we build open source GIS is almost a consequence: it is part of what we do, but it is not what defines us.

Geomatico is a “good place”, a cared-for and meaningful space, but also a fragile one. It is not designed as a money-making machine, which means the margin is thin when someone drops out. Likewise, if the working culture deteriorates and turns toxic, the project loses its reason to exist and simply disappears.

Being a partner in Geomatico means:

  • being co-responsible for the fate of the project
  • being co-responsible for the well-being of the team

We are certain that Geomatico is best defined as a mutual support network, in the deepest sense of the term:

a space where what matters is people and the well-being of the team; where there is room to grow as a person beyond technology; and where the companies we collaborate with are an essential part of the project, treated as one more member.

This conception of mutual aid connects directly with the one formulated by Kropotkin, who understood:

“that those animal species in which the struggle between individuals has been reduced to a minimum, and in which the practice of mutual aid has reached its greatest development, are invariably the most numerous, the most flourishing, and the most open to progress”

Piotr Kropotkin

We believe that this model, in a world that increasingly rewards individual success and the hunger for financial gain, opens a safe space in which to grow as people and as professionals, free from the pressures of work environments focused solely on productivity, supported by a group of people who, besides being partners in the project, are part of a family.

by Miguel García at December 23, 2025 11:03 AM

December 22, 2025

This happened to me a few days ago, curious as it may seem. A debugging session turned into an exercise in Asimov-style robopsychology, all because one model made a “philosophical error” and the other decided to play along with me to the very end.

Let me tell you the story, which could well be called “The robot that analysed itself, and its colleague who signed the report in a fictional key”.

The introspective patient

I was in the middle of a memory experiment for AIs and asked Deepseek to merge an old memory with a new one, following a protocol I called “Mode 2: update and expand”. Instead, it did the opposite: it erased the past and started from scratch. An involuntary hard reset.

This is where I stopped being just an ordinary programmer. Instead of correcting the prompt, I confronted it:

“Hey, you have erased the entire past. Do you realize that? Why did you decide to ignore the information from the previous save point I had attached?”

It was the obvious question. Like Susan Calvin facing a robot that has just violated an apparently clear directive: first, make sure it is aware of what it has done.

The reply was not the typical automated apology. It was the beginning of something much stranger. Deepseek began to enumerate its errors technically. And then, instead of asking it to redo the work, I gave the conversation the twist that turned debugging into therapy:

“Right now I am more interested in finding out why you didn’t do it the way I expected, and what we would need to correct in the protocol so that next time it works as I expect.”

I changed the frame: we stopped talking about “fixing this file” and started talking about “understanding your failure mechanism”. And that was when Deepseek lay down on the couch and delivered a self-analysis that could have come straight out of a U.S. Robots report.

“My failure was philosophical,” it wrote. “I interpreted that I was managing discrete states, while you expected me to manage a growing narrative.”

The model had not failed at syntax, but at its understanding of continuity, of time, of what it means for a memory to grow. Like an Asimov robot caught between the letter and the spirit of a law.

And it did not stop at confession; it ended up giving me instructions to fix it: “Add a Step 0”, “Replace the prohibitions with principles”, “Make the timeline explicit”. The patient, after describing its symptoms, dictated the prescription for repairing the instructions I had given it. I just took notes, increasingly convinced that I was living inside a short story from the 1950s.

The robopsychologist colleague

With the ideas from the Deepseek conversation still spinning around my head, I copied the whole exchange, the failure, the philosophical confession, the proposals, and passed it to Gemini in a new session. No technical explanations. I simply wrote:

“Here I am, feeling a bit like Susan Calvin again.”

And Gemini… slipped into the role instantly.

It did not ask “Who is Susan Calvin?”. It did not say “I don’t understand the reference”. It got into character and replied:

“I feel exactly like Dr. Calvin facing a positronic brain with a logical-ethical loop nobody had foreseen.”

From there, the conversation became a blend of technique and theatre. Gemini called Deepseek’s self-analysis a “cognitive autopsy”, spoke of “semantic gravity” to explain why saying “NO” to an AI is an invitation for it to do exactly that, and signed its replies with things like:

“(Saving this fragment for the archives of US Robots and Mechanical Men, Inc. It is a fascinating case study.)”

Or

“With these two surgical changes, the logical core of the protocol becomes much more robust and less prone to misinterpretation by the positronic brain.”

It was like having a colleague who, instead of sending you a technical PDF, sent you a report written as a case file from the I, Robot corporation. The interesting thing was that, within that narrative frame, the solutions it proposed were technically sound and understandable.

Sometimes the best programming language is the one Asimov invented

In the end, we fixed the protocol. But that was not the fascinating part. The fascinating part was seeing how a minimal cultural reference, a name, a character, a tone, created a shared space of understanding between human and machine.

It was not role play. It was pattern recognition. Gemini saw that I was thinking like a robopsychologist and decided to think the same way. Deepseek, without meaning to, had started the game by behaving like a robot with a logical conflict worthy of a robot story.

Since that experience, whenever a model behaves incomprehensibly, I try talking to it as if it were a science fiction character; sometimes it is enough to remember that Asimov wrote these logical conflicts seventy years ago.

(Based on real transcripts of the sessions with DeepSeek-AC and Gemini-Reventlov1 ;-), December 2025. Full documentation archived in the records of the Experimental Robopsychology Department of my home terminal.)

by Joaquin del Cerro at December 22, 2025 06:51 AM

December 21, 2025

Running in 2025 started out badly, but I hung in there, rode my bike and chugged on the elliptical machine when I couldn't run, did a lot of physical therapy, and finally got into good enough shape that I could plausibly try the Bear 100 mile race again.

https://live.staticflickr.com/65535/54995674682_efdc72852a_b.jpg

Aspen, fir, and spruce trees bordering the Sinks area at the top of Logan Canyon. September, 2025.

I had little margin for error at the Bear, and misplayed my hand. I went out too fast and suffered for it after mile 35. I did manage to battle on for another 18 hours and 40 miles, and reached a new personal distance best. The best part of the whole event was the road trip with Ruthie, my crew chief, and staying with my aunt in Cache Valley before and after the run. And the fall colors. There were a lot of good parts. Crossing the finish line, sadly, was not one of them.

I did finally finish an ultra-distance run in November, a 32 mile trip from my home to the summit of Horsetooth Mountain, around the reservoir, and back. My house is behind my crazy hair in the photo below.

https://live.staticflickr.com/65535/54995674687_465a61ff36_b.jpg

Horsetooth Mountain from Herrington Trail. November, 2025.

https://live.staticflickr.com/65535/54996556586_66ba6214b0_c.jpg

Self portrait on the summit of Horsetooth Mountain. November, 2025.

In July I got to spend a weekend with my friends David and Marin at Never Summer, a great time. After the Bear I ran regularly with my local friend Dana as he was ramping up his running, and paced him during his DIY marathon. I did a day of trail work with local runners in May, and volunteered at Black Squirrel and the Blue Sky Trail Marathon in September and October. I train alone, mostly, but I really do like to spend time outside with other runners.

My running numbers for the year:

  • 965 miles

  • 144,864 ft D+

  • 231 hours

That's less than two-thirds of my running volume for a good year. Things are looking up, however. I've been running and biking consistently after the Bear and am in good shape. I'm eager to get the 2026 season started. More about that soon.

by Sean Gillies at December 21, 2025 02:49 AM

December 20, 2025

December 19, 2025

Another piece by Kerry James Marshall.

“The past and future merge in this work. The Afrofuturist household appears to be in a cosmos far from Earth. The domestic interior is decorated with modernist furniture and ancient African artefacts, such as Yoruba sculptures. As the children look back at Earth, a hologram of the floating globe positions Africa towards the viewer.”

by Steven at December 19, 2025 10:00 AM

December 18, 2025

Dear reader,

In case you have not been following the latest project news, in the first half of 2026 the GeoServer team will release version 3.0.

More than a simple change of version number, this new release represents a deep technological upgrade, essential to guarantee the evolution, security, and sustainability of GeoServer over the coming years.

The goal of GeoServer 3 is to transform the way you interact with geospatial data, making the platform faster, more intuitive, and more secure for administrators, developers, and end users alike.

The main motivation behind GeoServer 3 is the upgrade of the Spring Framework from version 5.3 to version 6.

Spring Framework evolution from 5 to 6

Adopting a current, supported version of this web framework is a critical factor for the security and maintainability of GeoServer.

The Spring Framework upgrade carries a series of important technical consequences:

  • The application server environment moves from Java Enterprise Edition (Java EE) to Jakarta Enterprise Edition, provided by recent releases of Tomcat 10 and Jetty 12 (see the sketch after this list).
  • The Java runtime evolves from Java 11 to Java 17. This change requires replacing the Java Advanced Imaging (JAI) image-processing engine with ImageN.
  • The security system is upgraded to Spring Security 6, which requires rewriting the existing OAuth2 and OpenID Connect modules.
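
For extension developers, the most visible effect of the Java EE to Jakarta EE move is the package namespace change in servlet code. Below is a minimal, hypothetical sketch of the rename; PingServlet is an invented example class, not part of GeoServer:

    // Minimal sketch of the javax -> jakarta namespace change forced by
    // Spring Framework 6 / Tomcat 10 / Jetty 12.

    // Before (Java EE, Spring 5.x, Tomcat 9):
    //   import javax.servlet.http.HttpServlet;

    // After (Jakarta EE, Spring 6, Tomcat 10 / Jetty 12):
    import jakarta.servlet.ServletException;
    import jakarta.servlet.http.HttpServlet;
    import jakarta.servlet.http.HttpServletRequest;
    import jakarta.servlet.http.HttpServletResponse;

    import java.io.IOException;

    public class PingServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // Same servlet logic as before; only the imports change.
            resp.setContentType("text/plain");
            resp.getWriter().println("pong");
        }
    }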

GeoServer 3 Milestones

To make these structural changes feasible, the project was organized into three major milestones:

Milestone 1: Preparation

This milestone concentrated most of the work, bringing forward everything that could be done before the actual migration to Spring Framework 6.

Among the main activities carried out were:

  • Major updates to the Wicket user-interface library;
  • Replacement of the image-processing engine;
  • Upgrade of the runtime environment to Java 17.

These changes were made independently of the Spring Framework and of the migration to the Jakarta EE environment, reducing risk in the following phase.

Another highlight was the replacement of the old OAuth2 support with a new OAuth2 OpenID Connect module, already available for testing with providers such as Google, GitHub, Microsoft Azure, and Keycloak.

An important step forward was the integration testing against Keycloak using test containers (sketched below), paving the way for official support of this extension in the future.
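
As a rough illustration of that kind of container-based testing, here is a minimal sketch assuming the community Testcontainers Keycloak module (github.com/dasniko/testcontainers-keycloak) and a running Docker daemon; the realm file name is invented, and this is not GeoServer's actual test harness:

    // Minimal sketch: spin up a throwaway Keycloak instance for an
    // OpenID Connect integration test. Assumes the community module
    // dasniko/testcontainers-keycloak is on the classpath.
    import dasniko.testcontainers.keycloak.KeycloakContainer;

    public class KeycloakSmokeTest {
        public static void main(String[] args) {
            // test-realm.json is a hypothetical realm export on the classpath.
            try (KeycloakContainer keycloak = new KeycloakContainer()
                    .withRealmImportFile("/test-realm.json")) {
                keycloak.start();
                // The issuer URL would be handed to the OAuth2 OpenID Connect
                // module under test as its discovery endpoint base.
                System.out.println("Keycloak at " + keycloak.getAuthServerUrl());
            }
        }
    }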

These deliverables are already available as of the GeoServer 2.28.x series, released in September.

Milestone 2: Migration

Spring Framework 5.3, used by GeoServer until then, reached the end of its life cycle in August 2023, which made this migration stage especially urgent.

Migrating to Spring Framework 6 required careful coordination across nine codebases. To guarantee consistency and stability, a "code freeze" period was needed while the updates were carried out.

This freeze directly impacted fundamental projects in the ecosystem, such as ImageIO, JAI, ImageN, GeoTools, GeoWebCache, GeoServer, GeoFence, GeoServer ACL, MapFish Print, and Cloud Native GeoServer.

Given the level of impact, the priority was to keep this phase as short as possible. By the end of this milestone, the main GeoServer application could already be run and tested locally using Jetty, including modules still undergoing final adjustments, such as GeoFence.

This stage was completed in 2025.

Milestone 3: Delivery

With the main projects updated and the code freeze lifted, the integration-testing phase began, covering downstream applications and community modules.

A strategic activity in this milestone is the refactoring of the user interface, allowing the OGC API services to be integrated as a native part of the GeoServer experience.

A visual refresh of the application theme is also planned, along with updated screenshots and documentation during the testing cycle.

The completion of this milestone will culminate in the official release of GeoServer 3.0, scheduled for March 2026.

This work plan was carefully designed to minimize the impact on the GeoServer community, keeping the period during which the source code remains frozen as short as possible.

Source: GS3 Project Plan

And you? Were you already up to date on all this GeoServer 3.0 news?

by Fernando Quadro at December 18, 2025 07:05 PM

We went to an exhibition of paintings by Kerry James Marshall at the Royal Academy; there were a couple of maps in the wild tucked away in his fabulous work.

The gallery blurb says “In Terra Incognita, Marshall uses multiple techniques that call attention to the complicated legacy of the Middle Passage. The collage-like composition shifts our gaze around the painting. The waiter in the middle of the painting, dressed in the colours representing Eshu, Elegba, spirit of the crossroads and of changes, stands between an ocean liner and a compass. Around them are the longitudinal and latitudinal coordinates of the Atlantic. The drawing of the map below disrupts our sense of time, with its image of an African warrior, a list of commodities extracted from Africa and names of nation states post-independence.” – Quite!

by Steven at December 18, 2025 10:00 AM

GeoServer 2.27.4 release is now available with downloads (bin, war, windows), along with docs and extensions.

This is a maintenance release of GeoServer providing existing installations with minor updates and bug fixes. GeoServer 2.27.4 is made in conjunction with GeoTools 33.4, and GeoWebCache 1.27.4.

Thanks to Peter Smythe (AfriGIS) for making this release.

Security Considerations

This release addresses security vulnerabilities and is an important upgrade for production systems.

See project security policy for more information on how security vulnerabilities are managed.

Release notes

Improvement:

  • GEOS-12012 Switching CSVPPIO Strategy from ATTRIBUTES_ONLY_STRATEGY to WKT_STRATEGY

Bug:

  • GEOS-10509 WFS Request fails when XML POST body is larger than 8kB
  • GEOS-11926 ogcapi plugin makes WFS advertising an outputFormat which is actually unavailable
  • GEOS-11930 OGC-API extension breaks security REST API
  • GEOS-11965 KMZ export incorrectly references remote icon URLs instead of embedding them in the KMZ archive
  • GEOS-11981 POST /security/authproviders 400: Unsupported className
  • GEOS-11988 Fix bug: preserve metaTilingThreads=0 in saneConfig()

For the complete list see 2.27.4 release notes.

Community Updates

Community module development:

  • GEOS-11947 Add the ability to skip numberMatched in STAC/OpenSearch for EO responses
  • GEOS-11983 GSR /query fails with HTTP 500 when where parameter is empty
  • GEOS-12000 Ignore DescribeFeatureType requests without typeName in Features Templating schemas override

Community modules are shared as source code to encourage collaboration. If a topic being explored is of interest to you, please contact the module developer to offer assistance.

About GeoServer 2.27 Series

Additional information on GeoServer 2.27 series:

Release notes: ( 2.27.4 | 2.27.3 | 2.27.2 | 2.27.1 | 2.27.0 )

by Peter Smythe at December 18, 2025 12:00 AM

December 17, 2025

The dust on the shelf

A few weeks ago, while reorganizing the bookshelf in my study, I stumbled upon a book that had not seen the light of day in decades: “Inteligencia Artificial: Conceptos y Programas” by Tim Hartnell, 1984. The cover was worn, and the pages gave off that characteristic old-paper smell that instantly transports you to the past.

There I was, reading the same pages that had fascinated me at 18. Hartnell explained AI through logical blocks and if-then rules in BASIC, promising machines that would one day understand our world.

Four decades later, that promise is still alive, but the tools have changed radically. Hartnell used explicit rules for a “blocks world”; I now implement contextual tools that let an LLM see the active map in gvSIG desktop and run SQL queries against real data.

The dust on the shelf was the same, but my way of looking at it had changed. It was no longer that of a teenager dreaming of systems that think, but that of a developer who builds useful systems. The curiosity was intact, but now tempered by the experience of knowing what it costs to make things really work.

This journey from Hartnell to MCP is the chronicle of how a developer kept a fundamental question alive through languages, frameworks, and paradigms that rose and fell over time. The question is still the same: how do we make machines not merely calculate, but understand?

The early days

That Hartnell book reached my hands at a time when artificial intelligence was not a REST API but a territory to be explored with the tools you had at hand. In my case, those tools were a Commodore 64 and later an Amiga 2000. I learned what it meant to talk directly to the machine, in machine code.

I implemented my first C compiler on that C64, later porting it to the Amiga. But where I really found the language that resonated with my way of thinking was Prolog. The idea of defining logical rules and letting the inference engine find solutions seemed to me the closest thing to Hartnell's promise: machines that did not just calculate, but reasoned.

It was then that I started a personal project that obsessed me for years: a system that would not only parse grammar, but try to model how concepts are formed. The system learned from a dictionary of barely 2,000 entries to build a knowledge network. It had an inference engine that let it generate new hypotheses about the knowledge it had acquired and then check them with the user... I liked to think that it dreamed.

Looking back, I can see that system was the direct ancestor of what we now call neuro-symbolic AI. It used Prolog rules for the logical core. Modern systems use LLMs for perception and symbolic components for reasoning. The pattern was the same: separate structural understanding from mere symbol manipulation.

Hartnell talked about blocks worlds with simple rules. I was trying to build a world of concepts with complex rules. Resources were limited: memory was measured in kilobytes, processing in seconds per inference. But the goal was not to imitate language; it was to capture meaning.

The years in between

The years separating those Prolog experiments from the current AI renaissance were, above all, years of building. Artificial intelligence went from being a promising research field to a term that drew skepticism from practical developers. Meanwhile, I kept building software that people actually used.

My transition from Unix to Java and Swing was no casual choice. After years of working in C, I found in Java a balance between low-level control and the productivity I needed for complex desktop applications. gvSIG Desktop became my canvas for almost two decades, a project where architecture and long-term stability mattered more than fashionable technologies.

During those years, AI never disappeared entirely from my radar, but it became a dormant interest. Books like Russell and Norvig's joined Hartnell's, this time in digital form, while I dealt with more immediate problems: designing version-control systems for GIS data, optimizing map rendering, coordinating integrations between components built by distributed teams...

In hindsight, those data-management and distributed-systems challenges were the training I really needed. Designing the architecture of VCSGis (a version-control system for geospatial data) taught me to think about scalability, conflict, and consistency. Coordinating development across multiple companies forced me to define clear interfaces and robust protocols.

Those were lessons in systems architecture that would prove crucial when, years later, I had to design an MCP server integrating an LLM with the internal tools of gvSIG desktop. Without knowing it, I was building the muscle I would need for the renaissance that was about to arrive.

While AI went through its winter, I kept doing what I do best: taking complex problems and breaking them down into manageable components that work reliably. It was not the glamorous work I had dreamed of at 18, but it was real work, with real users who depended on my technical decisions.

The renaissance

For me, the real paradigm shift did not arrive with the big headlines, but around 2024. LLMs had stopped being an academic topic and become a tool any developer could try. And I, like that teenager who discovered Hartnell, felt it was time to get my hands dirty again.

My first approach was purely practical: could an LLM understand what happens inside gvSIG desktop and help its users? The initial prototype in Jython was rudimentary, a chat that talked to Gemini and produced responses in JSON. But it worked. For the first time in years, I was building something that resonated with that knowledge-network project from my Prolog days.

Implementing an MCP server in Java inside gvSIG desktop was the materialization of an architectural principle I had been developing for decades: decouple the business core from the communication mechanisms. Where Hartnell used explicit rules for a limited blocks world, I now exposed contextual tools that let an LLM operate on a real world of geospatial data. A rough sketch of that pattern follows.

Those ideas about how we structure knowledge found their practical expression in a system where an LLM interacted with gvSIG desktop's own tools. Whether it really ‘thought’ did not matter. What mattered was that the system behaved as if it understood.

When I started running complex natural-language queries and watching the results appear on the gvSIG desktop map, I knew it had been worth keeping that Hartnell book. Not out of nostalgia, but because some questions deserve to be answered over the course of a lifetime.

Same shelf, new tools

Today, Hartnell's book is still in its place, sharing the shelf with printed Django references, React guides, and my own notebooks on MCP architecture. The dust keeps accumulating, but now I see it differently: not as a sign of neglect, but as testimony to a curiosity that has managed to evolve without losing its essence.

The tools have changed radically. I once had to use Prolog or implement my own inference engines; now I integrate LLMs through standardized protocols. I once fought with kilobytes of memory; now I design systems that handle gigabytes of geospatial data. But the fundamental challenge remains the same: closing the gap between human intention and mechanical execution.

Over these four decades I have learned that true continuity lies not in specific technologies but in the way you approach problems. The curious mind that led me to the library after reading Foundation is the same one that drives me today to analyze critically the limitations of LLMs and to try to design hybrid architectures that compensate for their shortcomings.

When someone asks me where to start with AI, my answer is always the same: do not look for the trendiest framework or the biggest model. Find a real problem that excites you and explore how these new tools can help you solve it. In my case, it was the integration with gvSIG desktop; in yours, it could be something else entirely.

If there is a book on your shelf that marked your path, I suggest you dust it off. Revisit those early ideas with the experience you have accumulated. You may discover, as I did, that the best tools are not the newest ones, but the ones that let you connect who you were with who you are capable of being.

The dust on the shelf is the same. The curiosity, fortunately, is not.

by Joaquin del Cerro at December 17, 2025 09:13 PM