Welcome to Planet OSGeo

April 23, 2026

We want to share some updates we have made to the QGIS Plugin Repository. In January 2026 we shared QEP 409, a proposal that seeks to improve general working practices around QGIS plugins by adding some optional and some mandatory checks to every plugin published in the QGIS plugin repo. This builds on initial work (see PR) to run ‘soft’ checks on every plugin when it is published.

We also retroactively ran the new security checks on every existing plugin in the repository (latest versions only) and assigned each a security badge, without blocking or removing any plugin.

Now if your plugin has flagged issues you will see a badge like this (in red below):

If your plugin passes all checks, you will see a green badge like this:

If you see a small ‘i’ on the left, there may still be some non-blocking checks to look at.

If you are the owner of a plugin, you can log in to https://plugins.qgis.org and review the issues that have been flagged for your plugin:

If you expand the detail blocks, you can see the individual issues that were flagged:

There are two blocking issue categories (that will prevent you from publishing your plugin) and additional non-blocking issue categories (that are advisories only). You can see all the details at the information page here:

https://plugins.qgis.org/docs/security-scanning

We would like to note that these security advisories and badges are only shown on the plugins website, the plugin manager in QGIS Desktop does not yet provide any indication of the security scan results.

What to do if you have a red badge on a plugin you manage?

Firstly, don’t panic. Almost all plugins initially have this badge, but we expect the repository to fill with ‘green badged’ plugins over time as developers publish their updates. Next, review the issues listed in the report and fix them systematically; refer to https://plugins.qgis.org/docs/security-scanning for the specific tools we use on the server if you want to run them locally too.

What to do if you see a red badge on your favourite plugin?

Again, don’t panic. In a year’s time when most plugins have been updated we expect green badges to be the norm, but for now, just know that we are working on improving the security of our plugin ecosystem.

What if my plugin has a flagged issue for something that is a feature?

We know that in some cases you may genuinely need to embed API keys or credentials, or to do other things that raise a flag. QGIS does not play an enforcement role beyond requiring that all newly uploaded plugins are flagged green, and you can use pragmas / overrides where needed. What we are trying to do is ensure that plugin developers have visited each reported issue, considered it, and either consciously chosen to ignore it or fixed it.

What if I still have questions?

Please file a ticket at https://github.com/qgis/QGIS-Plugins-Website/issues

I have an issue with XXX

We are aware that there are some teething problems with our ruleset (e.g. the hashlib.md5 and xml library flags). Please raise an issue if you think the rules are too strict and we will update them accordingly. If you want to review how the scanning is implemented, see https://github.com/qgis/QGIS-Plugins-Website/blob/master/qgis-app/plugins/security_scanner.py
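For the hashlib.md5 case specifically, here is a minimal Python sketch of the two usual remedies, assuming the scanner applies Bandit-style rules (the `usedforsecurity` keyword is standard Python 3.9+, but the `# nosec B324` pragma syntax below is Bandit's, shown as an illustration rather than a statement of what the QGIS scanner accepts):

```python
import hashlib

# If the hash is genuinely not used for security (e.g. a cache key),
# Python 3.9+ lets you declare that explicitly, which satisfies
# Bandit-style weak-hash rules such as B324:
safe_digest = hashlib.md5(b"layer-cache-key", usedforsecurity=False).hexdigest()

# Alternatively, Bandit-style scanners accept an inline suppression pragma
# on the offending line, recording a conscious decision to ignore the flag:
digest = hashlib.md5(b"layer-cache-key").hexdigest()  # nosec B324
```

Either way, the reported issue has been visited and consciously resolved, which is the stated goal of the checks.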

by Tim Sutton at April 23, 2026 01:29 PM

April 22, 2026

gvSIG Desktop 2.7 includes a new tool for converting coordinates between different reference systems. It makes it possible, for example, to look up the coordinates of points held in a system other than the view's directly from the application, without resorting to external tools. For geographic coordinates, you can choose decimal format or degrees, minutes and seconds.

This tool complements the coordinate capture tool, available in earlier versions of gvSIG Desktop, which lets you obtain the coordinates of a point on the view in a chosen reference system, even when the view is in a different one. That tool also lets you save those points for use in certain geoprocesses.

The following video shows how both tools work:
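The post notes that geographic coordinates can be displayed either in decimal degrees or in degrees, minutes and seconds. As a rough illustration of that conversion (this is not gvSIG code, just a minimal Python sketch):

```python
def decimal_to_dms(value):
    """Split a positive decimal-degree value into (degrees, minutes, seconds).

    A real implementation would also carry the hemisphere/sign separately,
    since e.g. -0.5 degrees has no signed integer part to hold the sign.
    """
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = (value - degrees) * 3600 - minutes * 60
    return degrees, minutes, seconds


def dms_to_decimal(degrees, minutes, seconds):
    """Combine degrees, minutes and seconds back into decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600
```

For example, 39.4699° converts to 39° 28′ and roughly 11.64″, and converting back recovers the decimal value.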

by Mario at April 22, 2026 03:10 PM

April 21, 2026

If you already work with geospatial data, you have probably mastered analysis. But let me challenge you:

👉 Can you turn that into an accessible web solution?

Because there is a huge difference between:

✔ Generating maps
✔ And delivering a platform that other people actually use

And that is exactly where WebGIS comes in.

Today, those who stand out are not just the ones who analyze data… They are the ones who can:

✔ Centralize information
✔ Publish standardized (OGC) services
✔ Build applications accessible from a browser
✔ Control access and users
✔ Scale the use of data

👉 In other words: move beyond the desktop and onto the web.

Now comes the point that holds many people back:

Do I need to know how to program to do this?

❌ No.

With the right tools, you can build a complete WebGIS using:

🗄 PostgreSQL + PostGIS
🌍 GeoServer
📊 GeoNode

All integrated, using open source technologies already well established in the market. More importantly, you learn the complete workflow, not just isolated tools.

Data → Service → Application → User

🔥 In practice, this means you will be able to:

✔ Structure spatial data professionally
✔ Publish maps and services on the web
✔ Build complete geospatial portals
✔ Develop dashboards and GeoStories
✔ Manage permissions and access

This is not a programming course. It is a course for those who want applied results, using powerful, ready-made tools. If you want to stop delivering files and start delivering accessible, scalable, professional solutions…

👉 This is the next step.

The course is taught live online, with synchronous classes; the classes are also recorded and remain available to students for 12 months on our student portal.

Secure your place:

🌐 https://geocursos.com.br/webgis
📱 https://whats.link/geocursos

by Fernando Quadro at April 21, 2026 02:42 PM

GeoServer 3.0-RC is now available, and with it we can celebrate something bigger than a release candidate.

This milestone is the concrete outcome of a successful community crowdfunding campaign.

When we launched the GeoServer 3 crowdfunding initiative in September 2024, the goal was ambitious. GeoServer needed more than incremental maintenance. It needed a full platform modernization, including a new generation user experience, a stronger security foundation, a modern Java stack, improved raster processing, and the engineering effort required to carry those changes across the broader GeoServer ecosystem.

That work is now visible in GeoServer 3.0-RC.

From campaign to release candidate

The GeoServer 3 crowdfunding effort set a total target of 550,000 €. Camptocamp, GeoCat, and GeoSolutions each committed 50,000 €, establishing a community funding goal of 400,000 €. In May 2025, the campaign surpassed that goal.

That achievement mattered because GeoServer 3 was never a small upgrade. It required coordinated investment in core platform work that is essential for users, but often difficult to fund through routine maintenance alone:

  • migration to a modern Spring and Jakarta based platform
  • alignment with JDK 17 and current deployment environments
  • replacement of aging raster processing components with ImageN
  • stronger security and vulnerability management
  • documentation updates and broad compatibility testing
  • user interface and usability improvements across the administration experience

The consortium of Camptocamp, GeoCat, and GeoSolutions provided coordination, delivery capacity, and co-funding. Sponsors, community members, and individual donors made it possible to move from planning into implementation.

What GeoServer 3.0-RC shows

With GeoServer 3.0-RC, the results of that investment are now ready for public testing.

This release candidate introduces a modernized platform with:

  • a new context-driven user experience
  • a responsive administration interface
  • a new full-screen layer preview
  • updated documentation in Markdown
  • support for modern servlet containers including Tomcat 11 and Jetty 12.1
  • a straightforward upgrade path from GeoServer 2.28.x, with no changes to the GeoServer data directory

GeoServer 3.0-RC is also released together with GeoTools 35-RC and GeoWebCache 2.0-RC, making this an important ecosystem milestone, not just a version bump.

GeoServer 3

Why this matters for open source sustainability

Crowdfunding is often discussed in theory as a way to support open source. GeoServer 3 offers a practical example of what that support can achieve.

This campaign did not fund a narrow feature request. It funded the kind of foundational work that keeps a critical open source project healthy: technical modernization, security upgrades, ecosystem testing, documentation improvements, and long-term maintainability.

That is exactly the kind of work communities depend on, and exactly the kind of work that is hardest to finance unless users and organizations step forward together.

GeoServer 3.0-RC proves that this model can work.

Help us finish strong

The arrival of GeoServer 3.0-RC is also a call for community testing.

We encourage everyone to try the release candidate in their own environment, especially for:

  • upgrade workflows from GeoServer 2.28.x
  • the new user interface and administration workflows
  • deployments on Tomcat 11 and Jetty 12
  • raster-heavy and tiling-heavy workloads
  • extension compatibility and operational edge cases

You can download GeoServer 3.0-RC from the release page, review the upgrade instructions, or quickly test the Docker image:

docker run -p 8080:8080 docker.osgeo.org/geoserver:3.0-RC

Please share your feedback on the GeoServer 3.0-RC discourse thread.

New full screen layer preview

Thank you

GeoServer 3.0-RC is an important technical milestone, but it is also a community milestone.

Thank you to the organisations, individual donors, developers, testers, and sponsors who helped make this happen. And thank you to the consortium teams at Camptocamp, GeoCat, and GeoSolutions for carrying the work from campaign to release candidate.

GeoServer 3.0-RC is here because the community decided this work was worth funding.

That is worth celebrating.

GeoServer 3 is supported by the following organisations:


Individual donations: Abhijit Gujar, Hennessy Becerra, Ivana Ivanova, John Bryant, Jason Horning, Jose Macchi, Peter Smythe, Sajjadul Islam, Sebastiano Meier, Stefan Overkamp.

by Emmanuel Belo at April 21, 2026 12:00 AM

April 20, 2026

I needed my training to begin to peak in week 14. Quad Rock is in 20 days (at this writing), and I won't get much adaptation to workout loading in the last 13 days. Weeks 14 and 15 would be my last opportunities to get faster and stronger before the race. Fortunately, a return to good health and favorable weather helped make this my best week yet.

  • 13 hours, 8 minutes all training

  • 42.7 miles running

  • 8,550 feet D+ running and treadmill

The first block of my training was dedicated to power and pure speed, the second to intense aerobic efforts. This last block is about going up and down technical mountain trails at my race pace or a bit faster. In practice, I push pretty hard for half of each climb, run the downhills as fast as I can, and otherwise keep it easy, but not slow.

I did three workouts like this, plus two shorter tempo runs at Pineridge Open Space, which is flatter than the Quad Rock course. Five days of comfortably hard to just plain hard running, a recovery ride on Tuesday, and a full day off to recover on Friday.

Today I went for a loop in Lory State Park that I did five weeks ago. The loop includes the last seven-mile stanza of the 25-mile QR poem: a 1,000 foot climb up from the Arthurs Rock aid station, some rolling terrain, and a 1,100 foot descent to the finish line. I did the loop in the same time as I did in March, but at a noticeably lower level of effort. I'm counting on being able to run at an even faster pace at a higher level of effort in May.

Thursday I ran at elevation for the first time this season, a loop around Lumpy Ridge in Rocky Mountain National Park that begins just below 8,000 feet and tops out just above 9,000 feet. There is no snow to speak of at Lumpy Ridge. If the aspen in the Cow Creek drainage on the more remote north side of Lumpy Ridge had more leaves, you might think it was mid-summer.

https://live.staticflickr.com/65535/55216331884_c8a56e2e40_b.jpg

A row of tall white aspen stems with just a few leaves, backed by dark green Douglas fir and blue spruce.

I loved seeing water in Cow Creek, even if it was only a July-level flow. At least the birds and mammals have something to drink.

https://live.staticflickr.com/65535/55216331879_53d6fe3815_c.jpg

A footbridge made of rough-hewn timbers spans a small mountain creek. Much local rock appears yellowish when wet, and our shallow mountain creeks appear golden.

by Sean Gillies at April 20, 2026 01:52 AM

GeoServer 3.0-RC is now available, with downloads (bin, war), along with docs and extensions. We are working with OSGeo on the Windows installer download and will update this post when it is available; in the meantime, Windows users are asked to test the bin download. The release is also available as the Docker image docker.osgeo.org/geoserver:3.0-RC.

This is a release candidate intended for public review and feedback. GeoServer 3.0-RC is made in conjunction with GeoTools 35-RC, and GeoWebCache 2.0-RC.

Thanks to Jody Garnett (GeoCat), Andrea Aime (GeoSolutions), and Peter Smythe (AfriGIS) for making this release.

Please Test GeoServer 3.0-RC

We encourage everyone to try GeoServer 3.0-RC in their own environment, especially for upgrade workflows, the new user interface, and deployment on Tomcat 11 and Jetty 12. Real-world testing is the best way to catch regressions and compatibility issues before the final 3.0 release.

You may also quickly test the docker image using:

docker run -p 8080:8080 docker.osgeo.org/geoserver:3.0-RC

Please share your success, feedback, questions, and any issues you encounter on the user forum GeoServer 3.0-RC Release Candidate discourse thread.

Welcome to GeoServer 3

We are overjoyed to share this update with our community. This is the final stretch of a long road: a year of development, and a great deal of planning and support to make it all happen.

There will be more technical details in the final release announcement - but for now we wish to say thank you.

GeoServer 3

Straightforward upgrade

We have taken great pains to make the upgrade process seamless from GeoServer 2.28.x.

  1. Important: We have made no changes to the GeoServer Data Directory.

    Download and try GeoServer 3.0-RC today!

  2. A few modules have migrated from core to extensions:

    The pure Java H2 database is no longer provided.

  3. The log file location setting is now managed using the GEOSERVER_LOG_LOCATION application property.

  4. The NetCDF index support has been simplified and is now self-contained. With this improvement, NetCDF no longer needs a database or local .idx files to operate.

    Instructions are provided for how to clean up these now unused files.

Please see the upgrade instructions for details.

New Context-Driven User Experience

GeoServer 3 features a new “context-driven” user experience, which we really hope you enjoy.

  • Search: Use the search field on the left-hand side to find information. Autocomplete results are shown as you type, and results are listed in a tree that can be navigated below.

    User Interface Search

  • Context: Clicking on a search item establishes the context which is shown as breadcrumbs along the top of the page. A drop-down context menu provides quick access to actions that can be performed.

    User Interface Context Menu

  • Page: Page content adjusts to the current context. The welcome page adjusts to show the layer title and description, along with preview links, sample data downloads, and any configured metadata and data links.

    User Interface Welcome Layer Page

  • Menu: The menu bar at the top of the page provides login on the right hand side, and access to the familiar GeoServer top-level menus. Many of these pages now adjust their content to reflect the current context.

    User Interface Top Level Menus

  • Feedback: Admins are provided additional context-menu commands, and per-layer feedback and shortcuts, making the application easier and faster to use.

    User Interface Feedback

For more information see the user guide.

Thanks to Stefano Bovio (GeoSolutions), Jody Garnett (GeoCat), and others for this major improvement.

New User Interface Responsive Design Theme

GeoServer now provides a responsive-design theme:

  • Navigation: Navigation is reduced to a hamburger menu when using a narrow width display.

    Responsive Theme: Menus

  • Forms: Forms have adopted a two-column layout adapting to page width.

    Responsive Theme: Form two-column layout

Details coming soon to the developers guide!

Thanks to Stefano Bovio (GeoSolutions) for leading this frequently requested improvement, the entire GeoServer 3 team for implementing and checking, and testers at AfriGIS and GeoCat for verifying and updating screenshots.

New Layer Preview

A new full-screen layer preview is provided using the latest OpenLayers library.

New full screen layer preview

Thanks to Stefano Bovio (GeoSolutions) for this welcome improvement.

Updated Environment

GeoServer 3 is overjoyed to support Tomcat 11.0.x and Jetty 12.1 application servers after completing our transition to Spring Framework 7 and Jakarta EE Servlet API 6.1.

We have been extensively testing GeoServer 3 with Java 17 and Java 21, maintaining the same Java runtime baseline as GeoServer 2.28.x. Java 25 is subject to automated testing, but we are going to hold off recommending it until the user community has had an opportunity to try it out and report back.

If you are wondering about the compatibility between the Java web stack and GeoServer, here is a table showing the various supported options:

GeoServer          Java    Tomcat          Jetty        Java EE          Jakarta EE
GeoServer 3.0      17, 21  Tomcat 11.0.x   Jetty 12.1                    Servlet API 6.1
Not supported              Tomcat 10.1.x   Jetty 12.0                    Servlet API 6.0
Not supported              Tomcat 10.0.x   Jetty 11.0                    Servlet API 5.0
GeoServer 2.28.x   17, 21  Tomcat 9.x                   Servlet API 4
GeoServer 2.28.x   17, 21                  Jetty 9.4    Servlet API 3.1

For more information see container considerations.

Thanks to the entire GeoServer 3 team and crowdfunding campaign for this major accomplishment, representing the completion of Milestone 3.

New Documentation

The long-awaited transition to Markdown documentation has finally arrived. Welcome to our new User Manual. The older GeoServer 2.x documentation is available at Docs Archive or via the version switcher. Please help out by fixing any remaining small issues or logging an issue for Peter to address.

The new user manual

Thanks to Peter Smythe (AfriGIS) and Jody Garnett (GeoCat) for working on this activity which ended up being an incredible amount of work.

Thanks to the GeoServer 3 Sponsors

GeoServer 3 would not exist without the organizations and individuals who supported the GeoServer 3 crowdfunding campaign. Their sponsorship made this work possible.

GeoServer 3 is supported by the following organisations:


Individual donations: Abhijit Gujar, Hennessy Becerra, Ivana Ivanova, John Bryant, Jason Horning, Jose Macchi, Peter Smythe, Sajjadul Islam, Sebastiano Meier, Stefan Overkamp.

Release notes

New features:

  • GEOS-12063 [GSIP-238] GeoServer 3 UI / UX Refresh

Improvements:

  • GEOS-11886 Sort entries in all .properties files alphabetically
  • GEOS-12015 Switch tests using H2 to GeoPackage
  • GEOS-12023 Improve developer logging during catalog resources loading and WMS capabilities requests
  • GEOS-12024 Add Git branch name in GEOSERVER_NODE_OPTS
  • GEOS-12072 Remove deprecated REST endpoint on the DataStoreFileController
  • GEOS-12077 Remove H2/DB based index and binary index from CoverageMultidim/NetCDF stores
  • GEOS-12081 Update MapML.js ( custom element suite) to v0.17.0
  • GEOS-12082 CoverageStore - quick fail for incorrect files
  • GEOS-12083 Skip brute force login delays when checking for default administrator password

Bugs:

  • GEOS-10509 WFS Request fails when XML POST body is larger than 8kB
  • GEOS-11903 WPS does not respect raw response output selection when there are multiple outputs
  • GEOS-11916 Data directory migration performed on built-in default security configuration
  • GEOS-11926 ogcapi plugin makes WFS advertising an outputFormat which is actually unavailable
  • GEOS-11930 OGC-API extension breaks security REST API
  • GEOS-11942 ImagePPIO does not run any longer
  • GEOS-11964 Metadata Bulk Operations: wicket error
  • GEOS-11965 KMZ export incorrectly references remote icon URLs instead of embedding them in the KMZ archive
  • GEOS-11981 POST /security/authproviders 400: Unsupported className
  • GEOS-11988 Fix bug: preserve metaTilingThreads=0 in saneConfig()
  • GEOS-11999 The version of Jetty (12) no longer supports web.xml CORS configuration
  • GEOS-12065 WMS Layer REST PUT always returns 500 due to Collections.emptySet() in getRemoteStyleInfos()
  • GEOS-12073 Remove log location configuration from Admin Console and REST API
  • GEOS-12084 TemplateController REST endpoints accept non-existent workspace, store, and resource names
  • GEOS-12085 LocalSettingsController does not validate workspace existence

Tasks:

  • GEOS-11987 ImageN 0.9.1 migration requires renaming of registryFile.jai to registryFile.imagen
  • GEOS-12004 Make WMS independent of WFS
  • GEOS-12005 Remove GeoServer H2 extension
  • GEOS-12006 GWC, removal of leftover H2 references
  • GEOS-12011 Move KML module to extension
  • GEOS-12016 Move WCS 1.1 module to extension
  • GEOS-12017 Move WCS 1.0 to extension
  • GEOS-12018 Switch GeoServer tests away from H2
  • GEOS-12019 Turn arcgrid and worldimage formats into plugins
  • GEOS-12025 Split WMS 1.1 and 1.3
  • GEOS-12040 Update BouncyCastle libraries to LTS 2.73.10
  • GEOS-12041 Update Spring LDAP to 4.0.1
  • GEOS-12071 Remove the WPS remote module
  • GEOS-12064 CSS: add documentation for localized @title and @abstract metadata

Sub-tasks:

For the complete list see 3.0-RC release notes.

Community Updates

Community module development:

  • GEOS-11904 OGC API Processes: add support for envelope input/output
  • GEOS-11905 OGC API processes status response lacks jobid and links to self
  • GEOS-11906 OGC API Processes: use correct error code for access to results when execution is not complete
  • GEOS-11907 OGC API Processes: support multiple raw responses
  • GEOS-11908 OGC API Processes page should be pageable
  • GEOS-11909 Add support for OGC API Echo process
  • GEOS-11915 OGC API Processes: improve support for binary input and output
  • GEOS-11972 GSIP 233 - Community Pending Release Profile
  • GEOS-11980 Add support for uploading a single parquet file to GeoServer via REST
  • GEOS-11983 GSR /query fails with HTTP 500 when where parameter is empty
  • GEOS-12000 Ignore DescribeFeatureType requests without typeName in Features Templating schemas override
  • GEOS-12002 hz-cluster: homepage pop-up fails
  • GEOS-12007 Add AWS credential chain authentication UI and documentation for GeoParquet
  • GEOS-12013 Support vector datasets ingestion in VectorMosaic via REST
  • GEOS-12044 STAC search endpoint should report invalid collection names as invalid parameters instead of internal errors
  • GEOS-12061 New Community Module for PNG-WIND output format for wind datasets
  • GEOS-12062 Add DuckDB datastore community extension (gs-duckdb)
  • GEOS-12069 Align the hazelcast version in hz-cluster to the rest of GeoServer
  • GEOS-12074 Remove activeMQ-broker community module
  • GEOS-12089 GWC sqlite community module breaks legend preview in style page

Community modules are shared as source code to encourage collaboration. If a topic being explored is of interest to you, please contact the module developer to offer assistance.

About GeoServer 3.0.x Series

Additional information on the GeoServer 3.0.x series:

Release notes: ( 3.0-RC )

by Jody Garnett at April 20, 2026 12:00 AM

April 19, 2026

It's official! It is with great joy and enthusiasm that I confirm my participation in FOSS4G 2026 in Hiroshima! 🇯🇵

Promotional banner for the FOSS4G 2026 Hiroshima event. The image shows a large origami crane (tsuru) in shades of pink and purple flying over an aerial view of the city of Hiroshima, with its rivers and mountains in the background. In the top left corner, the official FOSS4G logo with the text "HIROSHIMA 2026".

For those unfamiliar with it, FOSS4G is the world's largest event dedicated to free and open source geospatial software. Organized by OSGeo, it is where developers, users and enthusiasts come together to shape the future of geospatial technologies.

Gratitude to the Community

Before talking about the presentations, I want to express my deep gratitude to the organizing committee. We know that putting on an event of this magnitude demands almost superhuman dedication. Your work is fundamental to strengthening the global free geospatial software ecosystem. Thank you for making this possible!

The Brazilian Flag in Japan 🇧🇷🤝🇯🇵

At this edition, I will have the honor of bringing some of our community's strength to Japanese soil. Brazil has one of the most vibrant geospatial communities on the planet, and it is time to show how we are turning data into real decisions. I will present four talks, covering everything from the community's grassroots to applications in public administration:

1. QGIS Brasil: Breaking the Language Barrier

Title: Bridging the Language Gap: How the QGIS Brazil Community Drives Open Source Adoption in the Global South.

In this talk, I discuss how translation and locally produced content are fundamental to democratizing access to geospatial technologies.

2. QGIS Brasil: 16 Years of History and the Path to QGIS LATAM 2024

Title: Scaling Geospatial Communities: 16 Years of QGIS Brazil and the Path to LATAM 2024.

A retrospective of our journey and how we are expanding our influence across Latin America.

3. The Journey of OSGeo Brazil

Title: The Map of Our History: The Journey of OSGeo Brazil.

The trajectory of our national chapter and the strengthening of local free software.

4. Public-Sector Innovation at Sefin Caucaia

Title: From Legacy Data to National Standards: Preparing a Brazilian City for Federal Interoperability with FOSS4G.

Perhaps one of my favorite stories: how we are using open source tools to modernize the Caucaia Department of Finance (Sefin), turning legacy data into national interoperability standards.


What to expect?

Beyond the talks, FOSS4G is about exchange. I can't wait to learn from the masters, reconnect with friends from the global community and, of course, bring back plenty of insights (and maybe a few rare stickers) to share with you here on the blog.

Top-down photo of a dark gray laptop whose lid is completely covered with dozens of varied, colorful stickers. The stickers represent a vast collection of events, communities and organizations from the geospatial and free software (FOSS4G) world, including prominent logos from FOSS4G Prizren 2023, FOSS4G Auckland 2025 (upcoming), FOSS4G Belém 2024, QGIS, INPE, IBAMA Prevfogo, YouthMappers UFV, GRASS GIS, GeoChicas, Meninas da Geo and TomTom. The laptop rests on a desk mat with a world-map pattern visible at the edges. The next sticker will be FOSS4G Hiroshima 2026!

If you will also be in Hiroshima, or want to know more about any of these topics, leave a comment below!

See you in Japan! Or, as they say there: Hiroshima de aimashou! 🗾🗺️

by Narcélio de Sá at April 19, 2026 06:15 PM

Week 13 started out pretty strong. I returned to my favorite Monday evening yoga class, did a fun run with strides at Pineridge on Tuesday, and then a hard running interval workout on Towers Trail in Horsetooth Open Space on Wednesday. Thursday I had cold symptoms again and shifted to dog walking and bike riding for the rest of the week. The running numbers for the week are nothing much.

  • 11 hours, 35 minutes all training

  • 14.6 miles running

  • 2,041 feet D+ running

By Saturday afternoon I felt much better, which gave me hope for a solid week 14.

by Sean Gillies at April 19, 2026 02:41 AM

April 16, 2026

What is the QGIS Sustainability Initiative?

At OPENGIS.ch, we believe that the long-term health of the QGIS ecosystem depends on more than just adding new features. Critical work like bugfixing, code reviews, codebase maintenance, and quality assurance often goes unnoticed, yet it is essential to delivering the stable, reliable software that thousands of organisations depend on every day. That is why we launched the QGIS Sustainability Initiative (#sustainQGIS). For every support contract of more than 10 days, we donate development time to the initiative. In addition, all unused hours at the end of the year of each contract are also donated. This ensures that buying an OPENGIS.ch support contract directly helps enable the long-term, sustainable development of the QGIS and QField ecosystem.


2025 at a glance

In 2025, our team invested a total of 168 hours into the QGIS Sustainability Initiative, spread across five key areas of work. On the wider QGIS project, we contributed 553 comments and 294 merged pull requests throughout the year.

*In addition to these sustainability hours, OPENGIS.ch dedicated 105 hours to QGIS bugfixing funded by QGIS.org.


Sustainability work by category – 168 hours


Our team


OPENGIS.ch on QGIS in 2025

Beyond the sustainability initiative, OPENGIS.ch had a significant presence in the QGIS codebase throughout 2025. In total, our team contributed 773 commits, 221 merged pull requests, 384 PR reviews, and helped close 122 bugs, plus 126 hours of dedicated bugfixing (21h from the sustainability initiative + 105h funded by QGIS.org).


Why It Matters

Every hour invested in the QGIS Sustainability Initiative strengthens the foundation that thousands of organisations rely on. By choosing an OPENGIS.ch support contract, you are not only getting expert support for your projects, you are directly contributing to a healthier, more sustainable open-source GIS ecosystem.


Thank you for being part of this journey.

by Denis at April 16, 2026 01:48 PM


by Anja Ottiger at April 16, 2026 11:30 AM

April 14, 2026

From the Asociación QGIS España we want to share an open reflection with the community and, above all, launch a survey that will be key to this year's decision-making.

As you know, the Jornadas de SIG Libre in Girona will not be held in 2026. For years this event has been the main meeting point for the free-and-open-source GIS community and also served as the venue for QGIS Camp España, where association members and QGIS users gathered to share experiences and to discuss both the tool and the direction of the Association itself.

We believe the community should not go without its annual gathering, which is why the Association is considering holding QGIS Camp España in 2026, keeping the participatory spirit that has always characterised this event.

Two possible venues: Madrid or Granada

Two possible locations are currently being considered for the QGIS Camp:

  • Madrid, for its accessibility and central location.
  • Granada, for its university tradition and its historical ties to the GIS community.

Both options are viable, but the choice of venue has a direct impact on attendance numbers, logistics and the resources required. We therefore want to base this decision on real participation data, not just theoretical preferences.

Proposed format for the event

As a preliminary outline, QGIS Camp España 2026 could be structured as follows:

Morning:

  • Institutional welcome
  • Presentation of QGIS 4, the project's next major version
  • Lightning talks
  • Hands-on applied workshops

Afternoon:

Unconference sessions, keeping the open format:

  • Anyone can propose a talk
  • Proposals are voted on collectively
  • The programme is built by everyone together

Why this survey?

Organising an event of this kind requires a considerable coordination, logistics and outreach effort from the Association's Board. Before moving forward, we want to answer two key questions:

  • Is there enough community support to hold the QGIS Camp this year?
  • In which city would there be a larger real quorum of attendees: Madrid or Granada?

That is why we are launching this initial survey, which will let us measure real interest in holding the event, gauge the potential distribution of attendees by venue, and thus make informed, responsible decisions aligned with the community.

👉 We encourage you to take the survey whether or not you are interested in attending. Every response matters for getting an accurate picture of the situation.

https://cloud.montera34.org/index.php/apps/forms/s/4pPR9gyYBnBw5eTrgT7Pj8xW

We would also appreciate you sharing this survey with QGIS users, organisations, companies, public administrations and training institutions connected to free-and-open-source GIS.

The Board of the Asociación QGIS España thanks you in advance for the time spent answering and sharing this survey. QGIS is community, and we believe that important decisions should be made collectively and transparently.

Thank you for taking part. We're counting on you!

April 14, 2026 04:51 PM

It’s my pleasure to announce the release of PROJ 9.8.1!

The release includes a few updates, bug fixes and a major regression fix for ETRS89-related coordinate operations.
See the release notes below.

Download the archives here:

https://download.osgeo.org/proj/proj-9.8.1.tar.gz
https://download.osgeo.org/proj/proj-9.8.1.zip

/Even

## 9.8.1

### Warning

It was discovered after the PROJ 9.8.0 release that several EPSG updates introduced
after EPSG v12.033 - notably the introduction of national realizations of ETRS89
(ETRS89-XXX […] where XXX is the 3-letter ISO country code) - caused backward
incompatibilities in some workflows involving the ETRS89 CRS.

In particular, transformations between ETRS89 and national CRSs based on other
datums are known to be affected for Austria, Belgium, Catalonia, the Netherlands,
Romania, and Serbia. See #4736 for more details.

While we intend to resume tracking the latest EPSG releases in future PROJ
versions, the safest solution identified so far to address these regressions is to
**revert the EPSG-related content of the PROJ database from EPSG v12.049 to v12.029**,
where v12.029 was the version distributed with PROJ 9.7.1.

As a consequence of this revert, the EPSG datum and CRS records introduced in
PROJ 9.8.0, which are mostly related to the new ETRS89-XXX datum and CRS, are no
longer available in PROJ 9.8.1.

### Updates

* Database: **Revert content from EPSG v12.049 to v12.029** (#4741).
See above warning for more details.

* CMake: handle deprecated SQLite::SQLite3 target in CMake 4.3 (#4694)

### Bug Fixes

* Make sure that epoch is set in more scenarios of time-dependent transformations (#4688)

* pj_obj_create: use database context if already open for grid name resolution (#4703)

* Chain vertical CRS transformations through intermediate same-datum vertical CRS (#4711)
Helps for example for **EPSG:5705** (Baltic 1977 height) to **EPSG:5706** (Caspian depth)
by using intermediate operation from Baltic 1977 height to Caspian *height*

* gie: various fixes around crs_src/crs_dst support and bootstrap
test/gie/epsg_grid.gie and test/gie/epsg_no_grid.gie (#4740)


by jsanz at April 14, 2026 02:24 PM

At OPENGIS.ch, we create open-source software.
We are contributors, maintainers, and in the case of QField, the team that builds it.

That comes with a responsibility we take seriously: giving back.

“Give back” is not a slogan. It is our first core value, and the very reason the sustainability initiative exists.

Open-source is a garden. If you eat from it, water it, and keep seeding.

“The Importance of Seeding”, opening keynote, FOSS4G 2023

What is the #sustainQGIS initiative?

Open-source software has a well-known problem: the work that keeps it healthy is largely invisible. Bug fixes, code reviews, refactoring, test coverage, onboarding new contributors: none of these appear in a feature list, but without them, the software eventually degrades. Proprietary projects can budget for this work directly. Open-source projects mostly rely on whoever finds the time.

We wanted to change that, at least in our corner of the ecosystem.


The model is simple. For every support contract we sign that exceeds 10 days, we donate a portion of those days to the initiative. Any unused contract hours at year-end also flow in. That pool of time gets spent on exactly those invisible tasks: triaging and fixing bugs that affect stability, reviewing pull requests so good contributions actually land in the codebase, and doing the unglamorous maintenance work that keeps QGIS’s core solid.

Why we do it

We built a successful company around QGIS and QField. We write code (custom features, plugins, processing algorithms, entire applications) on top of these platforms every day. When a client needs something that cannot be done out of the box, we build it. And we build it inside the project whenever that makes sense, not in a private fork that nobody else benefits from.

Pushing changes upstream instead of maintaining private forks, sponsoring the QGIS project financially, and donating hours are all expressions of the same logic: the ecosystem is a shared asset, and shared assets need shared investment.

I chair the QGIS.org foundation, so I see directly how much the project depends on companies like ours showing up. A bug that slips through costs every QGIS user time. A code review that never happens means a useful feature sits in limbo for months. And a small group of core maintainers carrying the full load eventually burns out. These are not abstract problems. They affect users and the community on a daily basis.

What this means when you work with us

When you sign a support contract with OPENGIS.ch, you are not just buying expert help with QGIS and QField. A slice of that contract goes back into the project itself. Your investment in solving your own GIS challenges also helps keep the platform reliable for everyone.

We think that is a good deal. It is the way we want to do business.

If you want to know more about the initiative or are ready to make a difference, get a support contract.

Open-source is a garden.
If you eat from it, water it, and keep seeding.

by Marco Bernasocchi at April 14, 2026 11:02 AM

April 13, 2026

QGIS.org is pleased to announce that we will be using some of the funding that is donated to us by our users and sustaining members to fund a dedicated administrative role for the project. 

Principal duties:

  • Support the PSC in activities such as organisation of the annual QGIS user conference, logistics for project initiatives, dealing with partner organisations, dealing with recurring queries about licensing, trademarks, use restrictions, initiatives from our country user groups etc. 
  • Support the treasurer in capturing financial transactions, following up email correspondence with sustaining members and potential sustaining members, tracking expenditures, preparing and consolidating our annual budget, preparing annual reports and anything else that can support the treasurer.
  • Manage and maintain parts of our web site content, such as our country user groups, key contact information, any statutory requirements etc.
  • Help the PSC with the organisation of our virtual annual general meeting, the voting process, grant proposal calls, keeping track of regular events through the calendar year and making sure we don’t drop any balls.
  • Be a pleasant, polite and professional first point of contact for individuals and organisations wishing to engage with the QGIS project outside of the normal community group activities.

Key personal attributes needed:

  • Highly articulate, especially in English which is the primary language of communication for the QGIS project. The ability to speak additional languages in order to communicate well with our diverse international contributor base would be an advantage.
  • Highly empathetic – we have a diverse community of people with different backgrounds, ethnicities, economic access, neurodiversity etc. so you need to be able to put yourself into the mindset of the people you are interacting with and find ways to communicate effectively regardless of who you are dealing with.
  • Numeracy – you will be asked to help track expenses, funding income, assemble budgets, do forecasting etc.
  • Organised – our project is bustling with daily activities and many of the key people in the project are volunteers and their time is precious, so the more organised, able to take initiative and work independently you are in your approach to your work the better!

Technical skills:

  • Word processing – you need to be able to craft gorgeous looking documents for every communication interaction that follow the QGIS Brand Guidelines and make the reader feel like they are engaging with a professional, well organised project.
  • Spreadsheets – you don’t need to be the best spreadsheet user on the planet but you should have the basic ability to create spreadsheets with formulas and defendable results of the calculations you make.
  • Email – everyone can do email right? Well we want someone who is especially good at crafting emails so that they are short and to the point whilst still being friendly (and not AI slop). We want to make sure that queries to QGIS.org are responded to promptly and questions you can answer yourself are dealt with there and then. For those that you can’t, you need to explore and build a clear framework of how to deal with different queries. For example, escalating to the PSC, the Security group, key developers, the broader community or specific individuals as the situation requires.
  • Basic accounting / bookkeeping skills – you should be able to capture invoices, emit invoices and keep a clear, auditable accounting trail of every transaction.

Job Hours:

  • This is intended as a part-time position, ideally 20 hours per week to start (or more), e.g. mornings daily

Questions about this role:

  • Can I work remotely?: This is in fact a requirement. Most of the people you will interact with are in one of the European timezones, so if you are in a similar longitude, that would be a big plus.
  • I am not a QGIS expert, can I still apply? Yes, however if you have experience with / an understanding of working in open, mainly volunteer based communities that would be a huge asset.
  • What are your feelings about hiring someone who is (you fill in the blank here): We really, really do not care about who you are, we care about your work ethic, your professionalism, your ability to foster a friendly and welcoming environment. We would love to see more diversity in our project, so please don’t be scared to apply even if you think you don’t ‘fit’.
  • Will I be supported with healthcare / social service contributions if I am hired? Yes!
  • Is there a probationary period? There is a mandatory 3 month probationary period during which you have to prove that you are competent and capable of performing the work. We reserve the right to terminate our agreement with just cause if you are not able to perform your role to the PSC’s satisfaction.
  • I think I can do this with a lot of mentoring, will that work? This is probably not the right role for you. Your colleagues are going to be generally volunteers with their own day jobs and limited time for hand holding. We do not plan to leave you ‘high and dry’ but you need to be very comfortable with figuring things out for yourself, taking initiative and preferring well-thought-out action over inaction.

Submitting your application:

Please use this form to submit your application. Deadline for applications is midnight wherever you live, 30 April 2026.

AI Statement:

No LLM / AI was used in writing this job description.

by Tim Sutton at April 13, 2026 08:06 AM

April 08, 2026

So after finally getting MapGuide Open Source 4.0 out the door, I took a self-imposed hiatus from all things mapping/GIS for several months to mentally recharge and savor the relief of having the major burden of releasing MGOS 4.0 finally lifted off my back. I also permanently moved from Windows to Linux as my daily driver OS, just in time before the end of Windows 10 support.

I now return with a renewed vigor and some rough roadmaps for things going forward in MapGuide and my other various projects. Part of that renewed vigor is due to the advent of ...

GitHub Copilot

In the past few months in my day job, I have been exposed to GitHub Copilot and it has changed the way I build and ship software, some changes bad, some changes good. Say what you will about AI (or AI-generated code/content) in general, but GitHub Copilot (or any other AI coding assistant) has ultimately been a net positive for me.

What separates how I use AI coding assistants from most depictions of "vibe coding" is that I know the technical and architectural fundamentals of what I am actually after. So I know when GH Copilot is generating what I'm after and when it's generating garbage, and knowing the right prompts to guide it back on track if it starts going off the rails, or know when to cut my losses if the situation is un-salvageable.

So with several months of GH Copilot usage at work, I have been thoroughly convinced that I should get a GH Copilot pro subscription for my own personal use. So last month, I finally bought a GH Copilot pro subscription and ... proceeded to blow my monthly allowance of usage credits in 2 weeks! 😂

But in those 2 weeks, I was able to make some major progress in mapguide-react-layout, knocking off some long standing technical debt and feature requests, some of which you'll see in future dev diary updates on this blog. The productivity gains were massive and turnaround times were quick! A new month has rolled over and with that, a reset of my monthly allowance and I have since learned to judiciously use GH Copilot in a less wasteful manner.

So, now armed with GH Copilot, I have a rough roadmap of things I want to achieve in my various projects, which are all outlined below. So let's start with MapGuide.

MapGuide Open Source

The next release of MapGuide Open Source will be 4.0.1. It will be a bug fix release and include any updated web tier components and upstream FDO fixes since the 4.0 release.

Since permanently moving to Linux as my daily driver OS and understanding that I will still need to produce Windows builds of MapGuide and FDO, I've been re-establishing my windows dev environments for MapGuide/FDO inside a virtual machine and reinstalling all the necessary dev tools. 

One surprise that caught me off-guard was installing Visual Studio 2026 Community Edition. MSVC 2019 is the standardized compiler for building MapGuide/FDO on Windows, but newer releases of Visual Studio can still install older compiler workloads. So as part of reinstalling all the required dev tools, I thought it would be a simple case of installing VS 2026 with the MSVC 2019 compiler workloads, and everything would be all good, right?

Well, near the tail end of my first MapGuide windows build inside this new VM, I hit a snag on the Windows Installer portion. It turns out our WiX 3.x installer projects are no longer supported in Visual Studio 2026! I was not going to waste time and disk space to install VS 2022, so I looked at what options we have for Visual Studio 2026. The answer was to migrate our installer projects to WiX 6.x.

Before GH Copilot, the migration process would've been tedious, having to read migration docs, assess the impact of any breaking changes, etc, etc. But with GH Copilot, I was able to migrate and iterate rapidly towards a WiX 6.x windows installer project that successfully builds and produces a MSI installer.

Now here comes the tedious part that GH Copilot cannot help me with. I still have to manually test this Windows installer, making sure all the feature toggles, custom actions, registry and start menu registrations still work as before and I expect this to take several weeks.

Once this updated Windows installer has been verified as working, there'll be a 1-2 week window of incorporating any bug fixes and updated web tier components before putting out the 4.0.1 release.

mapguide-rest

We will start wrapping up various loose ends on this project. The next release will be 1.0 RC7 and will:
  • Drop support for versions of MapGuide Open Source older than 4.0.
  • Represent mapguide-rest as being feature complete. RC7 to final will be bug fixes only.
I expect GH Copilot to do some heavy lifting in helping me knock off all of these long-standing items.

mapguide-react-layout

The next release will be 0.15. It will have some exciting new features that you'll see in future dev diary entries on this blog, but the main feature will be to fully decouple ourselves from the Blueprint UI toolkit. Blueprint gave us a nice cohesive set of UI building blocks, but our production bundle sizes have paid a price for this convenience, not to mention that integrating this viewer into other projects that use other UI libraries is problematic as Blueprint is always included, whether you like it or not.

As part of this dev cycle, I've been taking inventory of everything in Blueprint that we actually use and refactoring their usages through a layer of indirection of "abstract UI components" so that we can supply an alternative implementation that is more bare metal and whose appearance and styling can be customized through good ol' CSS.

0.15 will be on the horizon upon the completion of this Blueprint replacement. GH Copilot has been paying massive dividends in terms of progress and momentum on this particular project, so I am hoping for a fast turnaround on these remaining items.

MapGuide Maestro

Finally we come to MapGuide Maestro. I will finally bite the bullet and put an end to this endless series of 6.0mSomeNumber releases by putting out a final MapGuide Maestro 6.0 release. This 6.0 final release represents the end of MapGuide Maestro in its current form as a Windows Forms-based MapGuide authoring and administration tool.

After the 6.0 final release is out, MapGuide Maestro will be rebuilt from the ground up as a true cross-platform MapGuide authoring and administration tool. It will be built on modern .NET (and all of its patterns and practices), with a true cross-platform UI achieved through the use of Avalonia as our UI toolkit.

Thanks to a little experiment some years ago, plus experience from building and maintaining a separate personal project that uses Avalonia for the UI (not yet announced or revealed on this blog; maybe I will someday), and once again armed with GH Copilot, I have great confidence in pulling off a ground-up rewrite.



And that is a peek into what's in store in the near future. Exciting times are ahead!

by Jackie Ng (noreply@blogger.com) at April 08, 2026 03:11 PM

April 07, 2026

NCRM: Generative AI Tools for Quantitative Research

David Bann and Liam Wright have put together a great guide to Generative AI Tools for Quantitative Research on the NCRM resources site. This is a great overview of what Generative AI is, how it works and all of the potential different models available, both commercial and open source, as well as how to run some models locally rather than relying on the cloud.

They are also very focused on the practical elements of how to actually use the tools in your work, discussing the different approaches as well as highlighting the importance of making sure you do not share sensitive data with cloud services.

They also have a great selection of videos for setting up both cloud based and local LLMs for working with Stata and R scripts in a number of tools including VS Code:

  • Video 1: Obtaining code via chatbots in R and Stata languages
  • Video 2: Illustration of using LLMs to edit R scripts in VS Code
  • Video 3: Agentic analyses using command line interfaces

Video 4: an applied example

Video 4 in their series had a thumbnail of a map, so of course that got me interested! Anyone reading this blog should know that I am a big fan of maps :-)

This was a great example of using Positron IDE (produced by the same people who make RStudio) to help write code that creates a map of crime rates across England and Wales. They give a great overview of the process of making this map, and say:

  • Neither of us is a geographer, but we generated the map in just a few minutes! What a time to be alive. We hope this gives researchers inspiration to increase the ambitiousness of their research.

It really shows the potential this technology has for making a wide range of tools much more widely available and used.

Limitations

However, it also shows some of the limitations of working with generative AI. It is missing some key subject-specific knowledge with which you could turn this reasonable map into an excellent one without a lot more work.

I would recommend watching the video for more details, but I will summarise the key bits here (thanks Claude.ai for the summary, which I tweaked!)

  • Overview - Liam demonstrates how to use Claude (an LLM) integrated within the Positron IDE to perform geographic data analysis in R — even without prior experience in that area.
  • Data setup – He has two datasets downloaded: crime rate data from the ONS (at community safety partnership level) in an Excel file, and the community safety partnership boundary shapefiles from the Ordnance Survey.
  • Importing the data – He prompts Claude to write R code to import the Excel spreadsheet, specifying details like the sheet name, relevant columns, and the need to drop missing rows and rename variables. When the code throws an error, he simply pastes the error back into Claude, which fixes it successfully.
  • Reading the shapefile – He asks Claude for code to load the geographic boundary data and plot a basic outline of England and Wales to verify it loaded correctly.
  • Merging and mapping – He asks Claude to merge the crime data with the shapefile and plot the data on a map. Notably, Claude correctly identified the matching column names in the shapefile without being told — because Positron automatically sends session metadata to the API in the background.
  • Improving the visualisation – The initial map was hard to read due to a skewed distribution of crime rates. He prompts Claude to address this, and it produces a histogram and an adjusted plot. Claude also spots a major outlier (Westminster, with a crime rate of 446 vs. a mean of 81) from the console output.
  • Removing outliers – A final prompt asks Claude to remove outliers using the interquartile range method and replot. Five areas are removed (Westminster, Camden, Manchester, Kensington & Chelsea, Middlesbrough), resulting in a much clearer map with better regional differentiation.
  • Suggesting further analyses – He ends by asking Claude to suggest follow-up analyses (e.g. spatial autocorrelation), showing how it can also guide next steps.
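
The merge step in the workflow above is, at its core, an attribute join on the shared area code, and mismatches between boundary versions surface as unmatched codes. A minimal pure-Python sketch with hypothetical area codes and made-up values (the real ONS/OS field names differ):

```python
# Hypothetical crime rates keyed by area code, as read from the Excel file
crime_rates = {"E23000001": 84.2, "E23000002": 61.7}

# Hypothetical boundary features from the shapefile, each carrying its area code
features = [
    {"code": "E23000001", "name": "Area A"},
    {"code": "E23000002", "name": "Area B"},
    {"code": "E23000003", "name": "Area C"},  # no matching rate in the spreadsheet
]

# Left join: attach the rate to each feature; None flags unmatched area codes
merged = [dict(f, rate=crime_rates.get(f["code"])) for f in features]
print([f["rate"] for f in merged])  # [84.2, 61.7, None]
```

Checking for those `None` values explicitly is worthwhile, since unmatched codes are exactly how a boundary-version mismatch (e.g. data and boundaries from different release years) would show up in practice.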

So - very good as a first effort.

Just to be clear, I am not trying to be critical of Liam or the resources he has created - these are amazing and it is great that they are out there. I am trying to highlight some of the limitations of relying exclusively on GenAI for working in an area new to you. Liam explicitly acknowledges this and has, in fact, signed up to one of my upcoming courses to learn more about GIS :-)

So, what do I think Claude missed?

  • Data setup – Liam already did most of the hard work in a) finding the crime rate data and b) finding the relevant spatial data to display it. So Claude had an easy time here.
    • The first step here is knowing you need both files - the data and the boundaries. It’s also important to make sure the Excel file (rows of data) and the boundary data match, both in terms of what boundary they use (Community Safety Partnership boundaries in this case) but also what year or version of boundary they use.
    • These change over time, and searching shows me that there are at least two versions of these on the ONS GeoPortal - December 2021 and December 2023. I’m not sure which version Liam is using in his example.
  • Importing the data – This went really well, and I am impressed how Claude handled a very common issue in R - with receiving all the columns but only needing 4.
  • Reading the shapefile – Again, this works really well.
  • Merging and mapping – This works well, and Liam specifies to Claude where the codes are in the Excel file to join the data, but Claude works out where they are in the shapefile.
    • Of note, R on its own (without AI) can do a merge without the column names being specified - if they are the same in both data sets. I don’t think they are in this case though, so well done Claude.
  • Improving the visualisation – The initial visualisation isn’t that great. Liam picks Claude up on this and asks for improvements. Looking for skewed data is a great idea - and I would agree that the Westminster data should be removed.
    • The interesting question is why the crime rate is so high there. I would suggest this is because Westminster has a relatively low resident population - i.e. very few people live there. However a lot of people work there, so there is still crime, but the low denominator makes the rate very high. This is something that in the GIS world we would call critical spatial thinking - being critical of your data and asking why you see certain values and patterns. Claude won’t tell us this - experience and training are needed to make an informed guess as to why.
  • Removing outliers - The other element of removing the skewed data is interesting. Claude suggests the IQR method - which I presume is a standard approach from a statistics point of view? I am not a statistician so I don’t know whether this is a reasonable thing to do or not. However it does mean we lose a total of 5 values - which is not something we would normally do in GIS. The other element of the visualisation is that we would usually classify the data before showing them on a map - i.e. group them together into 5 or 6 groups. This type of map is what we would call a choropleth map.
    • In fact, when I asked Claude to summarise the transcript, it included the term choropleth map in the initial summary it created - despite the fact that that word wasn’t in the transcript at all.
    • So, with a choropleth map it is standard practice to classify the data - usually using the Natural Breaks classification. Again, this is something that training will teach you - along with the reasons why and why this makes the map easier to interpret.
    • Another very minor point is that I would usually suggest using the tmap package rather than the ggplot2 package for creating maps in R. ggplot2 is a generic graphics package - it can do maps, but can do other graphics as well. tmap is a specific mapping package, and the defaults for the maps it creates are better than the defaults ggplot2 uses - in my opinion. A lot of preference is down to individual style - there is no categorical right or wrong here. David O’Sullivan wrote a very nice comparison of the differences on his blog.
  • Suggesting further analyses – Here Claude makes some useful suggestions for further analysis (e.g. spatial autocorrelation and Local Moran’s I) which are good suggestions, and there is clearly lots more potential to explore here.
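
Two of the points above (Tukey's IQR rule for outliers, and classifying values before mapping them) can be sketched in plain Python. The rates below are made up for illustration, and quantile classes are used as a simple stand-in for Natural Breaks, which normally needs a dedicated package (e.g. jenkspy in Python or classInt in R):

```python
from bisect import bisect_right
from statistics import quantiles

def iqr_outliers(values, k=1.5):
    """Return values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule)."""
    q1, _, q3 = quantiles(values, n=4)  # quartiles (Python 3.8+)
    span = k * (q3 - q1)
    return [v for v in values if v < q1 - span or v > q3 + span]

def quantile_classes(values, n_classes=5):
    """Assign each value a 0-based class index using quantile breaks."""
    breaks = quantiles(values, n=n_classes)  # n_classes - 1 cut points
    return [bisect_right(breaks, v) for v in values]

# Illustrative crime rates with one extreme value, echoing the Westminster case
rates = [62.0, 75.0, 81.0, 70.0, 90.0, 85.0, 66.0, 78.0, 446.0]
print(iqr_outliers(rates))                # [446.0]
print(quantile_classes(list(range(10))))  # [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
```

Quantile classes are not Natural Breaks, but the shape of the workflow is the same: compute break points, then bin each area into a class before symbolising the map.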

Summary

So overall, generative AI is a fantastic tool, and thanks so much to David and Liam for putting together this resource, and thanks to the NCRM for hosting it. It’s a great starting point for any new method, but it has some clear limitations. If you know the field, then you already know what the limitations are, but if you are new to the field beware - generative AI will not tell you that it does not know things or what it might be missing. It’s well known for being over-confident, so remember to bring your critical thinking when making use of these technologies!

If you want to learn more about GIS, and using R as a GIS, check out my upcoming training courses with NCRM in April and May this year. If you have any questions, please do contact me.

by Nick Bearman at April 07, 2026 11:00 PM

In week 12, I started to get things back on track after being sick. I ran five days, biked one day, and took Saturday off to take my family to a Nuggets game in Denver.

  • 12 hours, 9 minutes all training

  • 34.8 miles running

  • 5,800 ft D+ running (and treadmill)

My energy level was low to mid until Friday, when I rallied for a good interval workout on a 12% incline treadmill indoors and a sauna session after. My fitness didn't advance much in week ten, but I didn't lose a step. According to the machine, I went "up" beyond any of the Quad Rock climbs, and at a pace that I'd love to hit on race day.

Saturday, as I mentioned previously, I sat on my butt in a car and in Ball Arena, with Ruthie and our kids, and then we all met my folks in Denver for an early dinner. It was a wonderfully easy and sociable day. Some days I think we should be living in Denver instead of Fort Collins, and this was one of those days.

Sunday I went out for three hours on the rolling and punchy dirt trails east of Horsetooth Reservoir, intending to get 90 minutes of Z2-Z3 running. It was a great run. My legs felt lively during miles 2-12, and I pushed my gas pedal with enthusiasm. If I'd brought another two gels, I could have avoided bottoming out at mile 14. I'm hoping to feel that good and pain free at Quad Rock.

https://live.staticflickr.com/65535/55194044783_1d2c216c79_b.jpg

A dry, yellow-brown landscape under a blue and partly cloudy sky.

by Sean Gillies at April 07, 2026 02:22 AM

April 06, 2026

Listening to the news about the impact of the current war and the closure of the Strait of Hormuz, I was prompted to think about critical minerals that could also have massive disruptive impact in geopolitical turmoil.

I asked Claude “Can you build a dataset of the most critical minerals, where they are mined (mine sites or countries), shares of world production, price changes over 5-10 years, main usages so that we can build a map of these commodities. Build it in a way that we can extend if we find more data” and it found me some data using these sources:

I then asked for a map “let’s build a map using my standard map app settings that you should have stored in memory. Users should be able to see the map symbols showing percentages of world production for each mineral. a selector to choose an individual mineral or an overall view. an option to show mineral supply by usage. Notes to appear on minerals and use cases (popups or side panels)” and after a few iterations and some data cleanup I ended up with the map at the top of this post. Not perfect but good enough to explore the data and to understand which minerals are used in key aspects of modern life and which countries may have a stranglehold on those minerals.

I am learning how to navigate and better prompt Claude; it has stored some of my favourite settings, so there is some consistency in what it generates. The whole process is getting faster. One thing I am beginning to realise is that when you use an AI to do your research and extract or build your data, you really need some domain knowledge to do a sanity check; otherwise you may just be mapping hallucinations.

by Steven at April 06, 2026 10:00 AM

April 04, 2026

Introduction

Cloud masking is one of those steps you cannot really avoid when working with satellite data. Yet, it is often more cumbersome than it needs to be: different sensors come with different workflows, tools, and preprocessing requirements.

That is why I like OmniCloudMask. It is a Python library for cloud and cloud shadow detection in high to moderate resolution satellite imagery. Instead of relying on sensor-specific approaches, it uses a single model that generalizes across platforms.

According to the documentation, it supports resolutions from 10 m to 50 m and works with imagery from Sentinel-2, Landsat, PlanetScope, Maxar, and other sensors with Red, Green, and NIR bands. In practice, however, it also performs well on higher-resolution data, such as the 1.2 m Pléiades NEO imagery used in this example.

Standalone workflow

To illustrate how OmniCloudMask works in practice, I use a scene acquired on 25-04-2025 covering an area just south of Rotterdam. The data comes from the satellietdataportaal.

Using OmniCloudMask is straightforward. After installing the required modules (see the documentation), you only need to define the input scene and specify the band order:

from functools import partial
from pathlib import Path
from omnicloudmask import predict_from_load_func, load_multiband

wd = Path("/path/to/scene")
scene_paths = [wd / "scene.tif"]

# Band order: [Red, Green, NIR]
loader = partial(load_multiband, band_order=[1, 2, 4])

pred_paths = predict_from_load_func(scene_paths, loader)

The model then produces a categorical cloud mask with four classes: clear, thick cloud, thin cloud, and cloud shadow. The example below shows the result alongside the original image. Click the tabs to switch between the original and masked scene.
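The categorical mask is a regular single-band raster, so it can be summarized with standard tools. As a minimal sketch, assuming the OmniCloudMask class coding of 0 = clear, 1 = thick cloud, 2 = thin cloud, 3 = cloud shadow (check the library's documentation for your version), a small helper can turn a mask array into per-class percentages:

```python
import numpy as np

# Assumed class coding for the OmniCloudMask categorical output.
CLASS_NAMES = {0: "clear", 1: "thick cloud", 2: "thin cloud", 3: "cloud shadow"}

def summarize_mask(mask: np.ndarray) -> dict:
    """Return the percentage of pixels per class in a categorical mask."""
    values, counts = np.unique(mask, return_counts=True)
    total = mask.size
    return {
        CLASS_NAMES.get(int(v), f"class {int(v)}"): round(100 * c / total, 1)
        for v, c in zip(values, counts)
    }

# Example with a tiny synthetic mask; a real mask would be read from
# pred_paths[0] with e.g. rasterio.
demo = np.array([[0, 0, 1], [0, 3, 0]])
print(summarize_mask(demo))  # {'clear': 66.7, 'thick cloud': 16.7, 'cloud shadow': 16.7}
```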

What stands out here is the simplicity: as long as the required bands are available, the same workflow can be applied regardless of the sensor. This is particularly useful when working with different remote sensing products.

Integration with GRASS

While the standalone workflow is already convenient, many spatial analysis workflows benefit from tighter integration within a geospatial processing environment. This is where GRASS comes in.

GRASS provides a powerful computational framework for raster, vector, and time series analysis, along with a rich set of remote sensing tools. It furthermore provides an efficient mechanism for working on spatial subsets through the GRASS region concept.

To bring OmniCloudMask into this environment, I developed a lightweight addon module: i.omnicloudmask. This module acts as a simple interface to the underlying Python library, allowing cloud masking to be embedded directly into GRASS-based workflows. While it works well in my own analyses, it has undergone limited testing, so some caution is warranted.

Running OmniCloudMask in GRASS

Check the omnicloudmask documentation about how to install the library and its dependencies. Next, install the i.omnicloudmask addon. For now you will have to do this manually. See the GRASS wiki page for pointers.

Once everything is set up, running the module is straightforward. I already imported a Pléiades-NEO scene for an area south of Rotterdam, with the base name Pneo_Rhoon. OmniCloudMask requires the red, green and near infrared (NIR) bands.

import grass.script as gs

gs.run_command(
    "i.omnicloudmask",
    red="Pneo_Rhoon.1",
    green="Pneo_Rhoon.2",
    nir="Pneo_Rhoon.4",
    output="Pneo_Rhoon_cloud",
    memory=20000,
)

This produces the same four-class cloud mask as before. The comparison below shows the OmniCloudMask output alongside the cloud mask provided with the original dataset.

From here, the results can be analysed using standard GRASS tools. For example, we can quantify the proportion of cloud and shadow cover:

gs.run_command("r.stats", flags="pln", input="Pneo_Rhoon_cloud", format="csv")

For this scene, the model estimates that 5% of the area is covered by clouds, with an additional 0.8% classified as thin clouds. A further 2.6% of the area is affected by cloud shadows.

class percent
Clear 94.2%
Thick Cloud 5.0%
Thin Cloud 0.8%
Cloud Shadow 2.6%
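To use these numbers downstream, the r.stats output can be captured with gs.read_command and parsed. A minimal sketch, assuming CSV rows of the form `category,label,percent` (the exact column layout may differ between GRASS versions, so inspect your own output first):

```python
def parse_percentages(stats_csv: str) -> dict:
    """Parse assumed 'category,label,percent' rows into {label: percent}."""
    result = {}
    for line in stats_csv.strip().splitlines():
        _, label, percent = line.split(",")
        result[label] = float(percent.rstrip("%"))
    return result

# In a GRASS session the text would come from:
# stats = gs.read_command("r.stats", flags="pln",
#                         input="Pneo_Rhoon_cloud", format="csv")
sample = "0,Clear,94.2%\n1,Thick Cloud,5.0%\n2,Thin Cloud,0.8%\n3,Cloud Shadow,2.6%"
print(parse_percentages(sample))
```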

The estimated cloud cover is slightly higher than the 4.6% reported in the dataset metadata. Based on visual comparison, the OmniCloudMask result appears more consistent. It also captures cloud shadows quite accurately.

Cloud probabilities

In addition to categorical outputs, OmniCloudMask can generate class probability maps, which provide a more nuanced view of the model predictions.

gs.run_command(
    "i.omnicloudmask",
    flags="c",
    red="Pneo_Rhoon.1",
    green="Pneo_Rhoon.2",
    nir="Pneo_Rhoon.4",
    output="Pneo_Rhoon_cloud",
    memory=20000,
)

Using the -c flag, the module outputs four rasters, suffixed _clear, _thick_cloud, _thin_cloud, and _cloud_shadow after the output basename. Each raster represents the per-pixel probability for the corresponding class.

These probability maps are useful for applying custom thresholds or identifying areas of uncertainty, for example when focusing on high-confidence cloud masking in ecological analyses.
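For example, a high-confidence cloud layer could be derived from the thick-cloud probability raster with r.mapcalc. A sketch, where the 0.9 threshold is a hypothetical choice and the `_thick_cloud` suffix follows the naming described above:

```python
# Hypothetical threshold; tune to your tolerance for false positives.
basename = "Pneo_Rhoon_cloud"
threshold = 0.9
expression = f"cloud_hiconf = if({basename}_thick_cloud > {threshold}, 1, null())"

# Inside a GRASS session this would create the binary mask:
# import grass.script as gs
# gs.run_command("r.mapcalc", expression=expression)
print(expression)
```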

Takeaways

What makes OmniCloudMask useful is not just its ability to capture clouds and shadows, but that it’s sensor-agnostic. The same workflow can be applied across datasets without constant adjustments, reducing preprocessing effort and improving consistency.

Integrating it into GRASS further streamlines things, turning cloud masking into just another step in a reproducible, GRASS-centred workflow rather than a separate preprocessing task.

by Paulo van Breugel at April 04, 2026 10:00 PM

I had big plans for week eleven and then came down with a cold. My Wednesday workout's mediocre feeling was the first indication. The rest of the week I shifted into recovery rides and easy runs.

  • 10 hours, 23 minutes all training

  • 19.2 miles running

  • 2,306 ft D+ running

It wasn't a terrible week, to be clear. I didn't fall apart physically, or anything like that. My concern is that it was the first week where I didn't progress very much in my Quad Rock training season.

by Sean Gillies at April 04, 2026 02:13 AM

April 03, 2026

Having knocked up a simple fuel finder based on a CSV download from the government site at the beginning of the week, I thought, "How difficult could it be to make something more functional and elegant and connect to the government fuel finder API?" Answer: the "more functional and elegant" part was not too difficult, about three hours of vibe coding. Connecting to the API and refreshing regularly was pretty damn hard, even with Claude helping me.

For comparison here is the first version using a downloaded file

The map pins are simple, the pop-up has limited info and worst of all there are all those miscoded petrol stations in the North Sea.

I asked Claude to refactor the app and to improve its appearance: "I want to build a fuel price search app using the government api https://www.developer.fuel-finder.service.gov.uk/dev-guideline. I have built a rough prototype to explore the data at https://knowwhereconsulting.co.uk/maps/fuel-finder-v1/ this is not a particularly good implementation and it uses a static file rather than the api. I have uploaded the code and data file. before building anything can you review the government api spec and consider how to connect to the api. Polling the api every hour is probably ok, we should note when we polled it and show the update time for each fuel point"

I got back pretty much the front end that you see at the top of this post; it took me a couple of hours to tweak the styling. I had to wrestle with the data to exclude some of the petrol stations that were in the North Sea, and also prices that were clearly wrong, e.g. petrol at 1.4 pence or £14 per litre, but we got to a workable solution. (I think it will need adjusting if prices go over £2.50 per litre, and who knows when that might happen!)
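The price filtering boils down to a plausibility range. A minimal sketch, treating prices as pence per litre, with hypothetical bounds (a 50p floor and a 250p ceiling, matching the £2.50 caveat above):

```python
# Hypothetical plausibility bounds in pence per litre.
MIN_PENCE = 50.0
MAX_PENCE = 250.0

def is_plausible_price(pence_per_litre: float) -> bool:
    """Reject obvious miscodings like 1.4p or £14 (1400p) per litre."""
    return MIN_PENCE <= pence_per_litre <= MAX_PENCE

print(is_plausible_price(145.9))   # typical forecourt price -> True
print(is_plausible_price(1.4))     # pounds entered as pence? -> False
print(is_plausible_price(1400.0))  # £14 per litre -> False
```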

Then it was time to try to connect to the government API. I created an account and got my client ID and secret. I was now in an area where I knew nothing and was totally dependent on Claude working out how to access the API. This was difficult; maybe it was the documentation, maybe it was Claude, but it took a while to work out how to download the latest version of the data and how to write a log file tracking the successes or failures of each download. I tried CLI and cURL approaches for the cron job but kept getting blocked somewhere (either my hosting service, Cloudflare or the API). At one stage I switched to using an external cron service but hit a few more obstacles. I think it took something like 4-5 hours to get this bit running, not helped by my lack of understanding of how to set up a cron job, API endpoints and authorisation.

These are the sites referenced by the government web site, I think my Find Cheap Fuel Near Me stacks up pretty well.

Next steps are to speed up downloads by only downloading changed data and try to find a way to speed up the data load and then maybe a rethink of the mobile interface.

by Steven at April 03, 2026 01:28 PM

April 02, 2026

https://www.osgeo.org/foundation-news/osgeo-ambassador-programme-call-for-participation/

In line with its commitment to increase the impact of the organization, OSGeo is excited to announce its new Ambassador Programme.

What is an OSGeo Ambassador?

Someone who helps OSGeo to grow its financial resources through fundraising. These are some fundraising activities that we would love for an ambassador to explore:
• Engage with organizations such as the European Commission (EC) or the United Nations (UN) to seek grants and other opportunities.
• Liaise with commercial companies and acquire new sponsors.

We acknowledge that it may not be easy for everyone to approach many of these organizations. An ideal candidate would be someone who already has a strong professional network that they can leverage.

As the ambassador will be representing OSGeo, it is critical for them to be aligned with the Foundation's values and vision, as well as to operate within the OSGeo Code of Conduct.

We are very thankful for all the resources brought by our Ambassadors, and we are happy to offer a 10% recommendation bonus based on the amount they bring to OSGeo.

Why increase OSGeo’s financial resources?

Our budget enables us to support OSGeo's mission by providing funding to projects and committees and seeding the FOSS4G conference. By increasing our budget we would be able, for instance, to improve the marketing of the organization and to provide more travel grants to attend the conferences. We could also support more projects in getting compliance certification for the standards they implement (for example with the Open Geospatial Consortium [OGC]).

Sound interesting?
Please drop us an email to board-priv@osgeo.org with an expression of interest. In addition, please feel free to circulate this to your network in order to increase and widen the OSGeo Ambassador program.

1 post - 1 participant

Read full topic

by jsanz at April 02, 2026 09:29 PM

This is a reply to David Gasquez's blog post Atmospheric Data Portals. As there's so much in it and much of it overlaps with future plans, I thought it made sense to write a proper public reply instead of following up in a private conversation.

First of all, read his blog post and follow the many links, there is so much to discover.

One recurring thing in the documents linked from the "issues on the earlier stages of the Open Data pipeline" section is that for most portals a static site should be sufficient. I fully agree with that. When it's done properly, an automated rebuild of some parts when new data is added should work well. These days even powerful client-side search is possible.

It's a bit off-topic, but David's Barefoot Data Platforms page links to Maggie Appleton's Home Cooked Software and Barefoot Developers talk. I highly recommend watching it; it was one of my favourite talks at the Local-first Conference 2024. I always wanted to blog about it, but never found the time.

But now to the concrete points David mentions. If anyone has ideas on how to make those things happen with Matadisco, please open issues on the main Matadisco repo.

Take inspiration from existing flexible standards like Data Package, Croissant, and GEO ones for the core fields. Start with the smallest shared lexicon while leaving room for specialized extensions (sidecars?).

I don’t think Matadisco should go into too much detail on specifying what the metadata should look like. Making one metadata standard to rule them all is destined to fail from my experience (ISO 19115/19139 anyone?). Though there might be a lowest common denominator, similar to what Standard.site is doing for long-form publishing. In order to find out what that looks like, I propose that individual communities start by specifying Lexicons for their own needs. This could be done through tags, which I’ve outlined in the Matadisco issue “Introducing tags for filtering and extension point”.

Split datasets from “snapshots”. Say, io.datonic.dataset holds long-term properties like description and points to io.datonic.dataset.release or io.datonic.dataset.snapshot, which point to the actual resources.

Some kind of hierarchical relationship would be useful. FROST, which Matadisco drew a lot of inspiration from, is centred around IceChunk, which also has the concept of snapshots. But I don’t think we should stop at the concept of snapshots. In my original demo, I scrape a STAC catalogue for Sentinel-2 imagery. Every new image is a new record. They are all part of the same STAC collection, so we could use a similar concept in Matadisco as well.

Add an optional DASL-CID field for resources so we “pin” the bytes.

Yes, that’s something @mosh is keen to have. It’s not only useful for pinning things to a specific version, but also to make it possible to verify that the data you received is the one you expected. It sounds trivial, but the problem would be where to put it. Do you only hash the metadata record it points to? Do you hash the data container (if there’s one)? Or each resource a metadata record points to?
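As a rough illustration of the idea (not the actual DASL-CID format, which is built on CBOR encoding and multiformats), a content identifier can be derived by hashing a canonical serialization of a record, so the same metadata always yields the same ID and received data can be verified against it:

```python
import hashlib
import json

def content_id(record: dict) -> str:
    """Sketch of content addressing: sha256 over canonical JSON.
    Real DASL CIDs use DAG-CBOR and multihash instead of this."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Key order does not change the ID, so verification is reproducible.
a = content_id({"name": "scene-42", "size": 1024})
b = content_id({"size": 1024, "name": "scene-42"})
print(a == b)  # True
```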

Core lexicon should be as agnostic as possible!

As mentioned above, it might be out of scope for Matadisco and for now it’s left to the individual communities.

Bootstrap the catalog. There are many open indexes and organizations. Crawl them!

Indeed! My first two Matadisco producers are sentinel-to-atproto crawling Element 84’s Earth Search STAC catalogue and gdi-de-csw-to-atproto crawling the GeoNetwork instance of the official German geo metadata catalogue.

Integrate with external repositories. E.g., a service that creates JSON-LD files from the datasets it sees appearing on the Atmosphere so Google Datasets picks them up. The same cron job could push data into Hugging Face or any other tool that people are already using in their fields.

At first this would need to happen for each individual type of record, see the tags proposal above.

Convince and work with high quality organizations doing something like this! I’d definitely collaborate with source.coop for example.

That surely is the goal!

by Volker Mische at April 02, 2026 09:02 PM

April 01, 2026

This is a short write-up on the FOSSGIS 2026 conference. It's a German-speaking conference on free and open source geographic information systems and OpenStreetMap. So maybe a blog post in English spreads the word even wider.

Despite being the biggest edition ever (1000 registrations on-site, 300 online), it was as well run and organized as every year. It didn't even feel larger than usual. The CCC video team streamed live and published the cut videos the same day, in outstanding quality as always.

I split this post into two sections: one about interesting talks for the geo world in general, and one about follow-up discussions on my Matadisco talk and ATProto in general.

Talks

I spent most of my time in the hallway chatting with people, as this is what matters most to me when I'm attending a conference in person. Nonetheless, I still managed to see some excellent talks.

Panel discussion on digital sovereignty in the cloud

The conference started with a high-class panel discussion on digital sovereignty in the cloud. The public discussion on that topic is often centered around where servers are located. But that alone doesn't actually matter: US companies can be forced by their government to give access to the data regardless of its physical location.

Other topics touched on were best practices for switching from proprietary to open source systems.

Barrier-free travelling thanks to paid mappers

Public transport in Germany must be accessible to disabled individuals (reality is far from that). For routing, you need the underlying data. This talk went into the details of how Baden-Württemberg, a federal state in southern Germany, is working to enable barrier-free travelling. They decided to add that information for all of their 1100 train stations directly to OpenStreetMap. To achieve the required high quality, they hired several experienced mappers from the community through a third-party company.

I really like the idea that OpenStreetMap can now be used as a source of truth for that data set. I hope other federal states follow this lead.

Routing talks

I saw two talks about routing. The one about the Valhalla routing engine with MapLibre Native was interesting because it covered a special case: re-routing bus lines during construction. Although the resulting system is not open source, they contributed upstream to Valhalla to make it work well with MapLibre Native. Such contributions can be more valuable than a one-time source code dump of forked repositories just to call something open source.

The other was about real-time mobility analytics for disaster relief operations; it was interesting to see how routing is used in such cases, what its limitations are, and how such systems really help on the ground.

Matadisco and ATProto

My talk on Matadisco was about the current status of metadata catalogues, their problems, and how ATProto can make things better. What I should have made clearer is what Matadisco actually is: just a schema/convention people would use to announce their data on ATProto. It could have been mistaken for a piece of software or a service. You would use Matadisco to implement something for your own pipeline.

Nonetheless, people got the idea and I had good conversations afterwards. I talked with Olivia Guyot about possible ways to integrate Matadisco record publishing into GeoNetwork, and with Christian Willmes about creating a portal for combining paleoenvironmental and archaeological data.

While chatting about ATProto at one of the social events, Klaus Stein described how he would like a social network to be: users would just put static files somewhere. I agree that having static webspace somewhere, without any server component, is not only cheap but also the easiest to get. He is not bothered about other components, e.g. for indexing, being operated by other parties. That kept me thinking about how far ATProto is from that. I'd like to build a prototype that is like a static site generator for ATProto records. It wouldn't be able to act as a full PDS; you would need a WebSocket connection to get the data to a relay. But there could be a minimal service, operated by a third party, that polls those static PDSes for updates and forwards them to a relay.

by Volker Mische at April 01, 2026 11:24 AM

March 31, 2026

This morning there was a lot of amused banter amongst my geo-pals about the UK Government’s Fuel Finder API based on an article in The Times (possible paywall) on the problems with the data. You can download the data or access the api from the gov.uk site.

I downloaded the data for this morning (31st March 2026, 11am) and spun up this fuel finder app in a few minutes. It's nothing special, and some of the government-signposted sites are a lot better, but it shows how poor the raw data is. Surely the developer of the government api could have introduced a simple check to ensure that all coordinates were in the UK, not in the sea or in France, Belgium or Holland?
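Such a check can be as simple as a bounding-box test. A crude sketch with approximate UK bounds (the values here are rough assumptions, and the box still admits some sea and bits of Ireland, so it is only a first-pass filter):

```python
# Approximate bounding box for the UK in degrees (rough assumptions).
UK_LAT = (49.8, 60.9)
UK_LON = (-8.7, 1.8)

def in_uk_bbox(lat: float, lon: float) -> bool:
    """First-pass filter for obviously miscoded station coordinates."""
    return UK_LAT[0] <= lat <= UK_LAT[1] and UK_LON[0] <= lon <= UK_LON[1]

print(in_uk_bbox(51.5, -0.12))  # central London -> True
print(in_uk_bbox(54.0, 3.5))    # out in the North Sea -> False
print(in_uk_bbox(48.85, 2.35))  # Paris -> False
```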

Remember this will only be up to date for a day or so before prices have shifted. I suppose I could have tried to access the api and built something that would have stayed up to date, but given the number of sites already doing this there isn't a good reason to duplicate their efforts.

by Steven at March 31, 2026 04:29 PM

I was thinking about what map to make next and I thought about flags, national symbols like birds or animals.

I started out with a very simple prompt to see what Claude would come up with:

I want to make an interactive map of national symbols: Flags, Trees, Flowers, Animal, Symbols,Anything else you can suggest?

It chundered away for a few minutes and came up with this monster

Yes, it is truly awful! You can click on the map above to see it in action. It sort of works, but the map part is rubbish (though it probably represents Claude's limited sense of world geography) and the linked data is pretty flaky as well. When I challenged Claude, it explained that it was using "training data". I guess that means "any old crap that I could scrape".

First up, I needed to find a source of data for national symbols. A quick search of Wikipedia turned up this List of National Symbols, with a wide range of topics including flags, birds, animals, sports etc. and quite good geographic coverage. Claude suggested using some world boundaries from D3 and gave me something a lot better looking.

A couple more tries to get all of the data and image links from Wikipedia, tidy up the interface and change the colour scheme, and I had a release version. If you are viewing on a mobile you won't see the map, just a country search and the side panel.

The finished version – click map to view

This whole project took a bit over an hour from start to finish. It's pretty simple and relies on masses of links to Wikipedia, which may get broken in the future. I think it is a much nicer way to view all of the data from the Wikipedia lists than the source web page.

I know someone will dispute the data; don't tell me what's wrong with it, update the Wikipedia page, and if your update doesn't get overwritten I will pick it up in a few weeks' time (the map is not live-linked to the Wikipedia pages).

by Steven at March 31, 2026 09:15 AM

March 28, 2026

https://www.osgeo.org/foundation-news/thank-you-angelos-outgoing-osgeo-president/

After years of service, Angelos Tzotsos is stepping down as President of OSGeo and will continue to serve on the Board of Directors for the remainder of his term. We want to take this opportunity to thank him for his leadership on behalf of the community.

Angelos joined the OSGeo Board in 2016 and has served as President since 2019. Over that period, he has been a consistent and active contributor, not only in his governance role, but also as a developer and project leader across several key OSGeo projects.

He has been a regular presence at code sprints, FOSS4G events and other international conferences, actively representing and advocating for OSGeo and open geospatial software.

Angelos is a true leader by example, as demonstrated during his time as President of OSGeo.

Angelos, thank you for the time and effort you have put into OSGeo over the years as Board member, as project leader, and as President. It is genuinely appreciated.

We look forward to your continued involvement, both as a Board member and a leader in the OSGeo community.

The OSGeo Board of Directors

1 post - 1 participant

Read full topic

by jsanz at March 28, 2026 05:33 PM

March 27, 2026

The recent webinar organized by KAN Territory & IT brought together the Spanish-speaking geospatial community to present the main new features of GeoNode 5 and to explore how the ecosystem is evolving towards modern cloud-based architectures.

In this article, we summarize the main concepts and advances shared during the event.

What is GeoNode and why does it remain key?

GeoNode is an open source platform designed for the management, publication and analysis of geospatial data, widely used to build Spatial Data Infrastructures (SDIs) and GIS systems.

It allows organizations to:

  • Centralize and catalogue geospatial data
  • Publish maps and services easily
  • Create dashboards and geographic narratives (GeoStories)
  • Manage access and permissions over information

One of its great differentiators is that it democratizes the use of GIS, allowing users without a technical background to upload, visualize and analyse data.

GeoNode 5: an evolution focused on usability and scalability

The new version marks a turning point for the platform.

New interface and greater customization

GeoNode 5 introduces a complete redesign focused on:

  • Usability
  • Visual clarity
  • User experience

It also allows administrators to create custom pages and sections, adapting the portal to each project or institution.

New metadata module

Metadata management sees a significant improvement:

  • Simpler interfaces
  • Greater control over information
  • Better organization of datasets

This facilitates data documentation and discovery, which is key in any SDI.

Data lifecycle management

GeoNode 5 improves the way data is updated:

  • Simpler replacement and versioning
  • Direct updating of datasets
  • Better control over changes

Data quality and validation

It is now possible to:

  • Define constraints on attributes
  • Validate data at upload time
  • Improve data quality at the source

More flexible permission system

One of the most significant changes:

  • New dynamic, rule-based permission system
  • Ability to define access by contextual or temporal conditions
  • Greater granularity in user management

This makes it possible to build portals that are much more secure and adapted to different user profiles.

New formats and exports

Data access capabilities are extended:

  • Export to CSV and Excel
  • Greater interoperability
  • Integration with other systems

Performance and processing improvements

The backend was optimized for:

  • Faster processing
  • Better handling of asynchronous tasks
  • Greater stability in demanding environments

GeoServer Cloud: towards cloud-native architectures

One of the central themes of the webinar was the evolution towards GeoServer Cloud.

The approach is clear:

"If we want truly cloud-native systems, all of their components must be cloud-native."

Main characteristics:

  • Microservices architecture
  • Deployment on Kubernetes
  • Distributed geospatial processing
  • Horizontal scalability

This enables building infrastructures that are more resilient, flexible and ready for large volumes of data.

GeoNode Cloud: the new generation of geospatial platforms

GeoNode Cloud represents the next step:

An implementation of GeoNode designed specifically for the cloud, optimized for modern environments.

Key benefits:

  • Automatic scaling
  • Reduced operating costs
  • No need for physical infrastructure
  • Access from anywhere

Notable integrations:

  • GeoServer Cloud
  • Kubernetes
  • QGIS Desktop

Together, this enables a fully decoupled architecture that is ready to grow.

A paradigm shift for SDIs

What we saw in GeoNode 5 is not just an update.

It is a paradigm shift:

  • From monolithic platforms → to microservices
  • From local deployments → to cloud-native
  • From rigid systems → to configurable solutions

Conclusion

GeoNode 5 marks a before and after in the evolution of open source geospatial platforms.

With improvements in usability, data management, security and architecture, it positions itself as a key tool for organizations looking to:

  • Scale their Spatial Data Infrastructures
  • Modernize their GIS systems
  • Adopt cloud technologies

Want to watch the full webinar or evaluate GeoNode for your organization?

At KAN we support agencies and companies in the design and implementation of scalable geospatial platforms.

👉 Contact us to find out more.

by KAN Territory & IT at March 27, 2026 05:30 PM

It's not easy to begin telling this story, so I'll start from wherever comes to mind. Do you know what a gasometer is? Have you ever seen a gasometer in person, or in photos, on video, on television? Until a few years ago it was a somewhat strange word to me, and yet it has become one of the places I deal with most often in my daily life. In Ventimiglia, inside a fairly famous and important Roman archaeological area, there are two gasometers. A few years ago I began working on these two gasometers and on everything around them, which is called the Officina del Gas: a fairly large plant of 12,000 m² that operated from 1906 to 1993 to supply the city with gas.

Aerial photo of an archaeological area with the remains of Roman buildings and two metal gasometers. The ground is a large, intensely green lawn. On the left a road with cars passes by, and beyond the road the archaeological area continues. At the edges you can see the tiled roofs of modern buildings.

Before really starting to work on the project, together with some very knowledgeable people I began to study the history of this place and to ask the people who live next to it what it means to them.

But one day I also shared some pictures of these gasometers and of others, including the one on Corso Farini in Turin, and what tends to happen on the fediverse happened: someone commented, throwing a bridge towards another, very different world, the world of music.

The metal skeleton of a gasometer photographed looking upwards; the photo is not black and white, but the light grey sky in the background makes it look almost so. The structure is a series of horizontal, vertical, and diagonal beams arranged to form a large, empty cylinder.

It may seem almost obvious, but there are certain genres of music that seem to have a deep connection with the image and the very essence of a gasometer and a gasworks. One of these is what we could label industrial rock, a genre especially widespread in the United Kingdom and Germany. Another, completely different genre with a deep connection to a gasometer and to an industrial plant is certainly electronic music: techno and industrial techno. So, to temporarily close the circle, let us look at some albums, groups, and artists who have, more or less recently, produced a series of musical works tied to the gasometer. Almost none of this is the result of my own research: these are all suggestions I received both from the fediverse and from colleagues who have shared this journey with me, and it is interesting to see that in certain cases the gasometer even appears on the cover of a record.

Naturally, this very varied musical binge made me think that the gasometer, now that it has stopped working, remains a good place to explore this musical connection, so I have started working to make it possible for sounds and music to return beneath the gasometers themselves.

The albums

This is the selection on the rock side

On the techno side, I have these tracks:

  • The Bells by Jeff Mills
  • Orbit by Underground Resistance (again with Jeff Mills)

And finally, the incredible performance by Jeff Mills, Jean-Phi Dary and Prabhu Edouard inside the archaeological site of Delos, Tomorrow Comes the Harvest. I am putting it here because it closes the circle admirably, bringing back together classical archaeology, electronic music, and the industrial archaeology I started from.

by Stefano Costa at March 27, 2026 12:22 PM

I'm squishing four weeks' worth of recap into this one post.

First, the numbers.

  • 40 hours, 32 minutes all training

  • 99 miles running

  • 15,600 ft D+ running (and treadmill)

I'm less concerned with miles than I used to be, but I'm still writing these numbers down for continuity's sake.

I'm running four days a week and riding or other cross-training 2-3 days. Two of my runs are easy, but not slow. One has some high intensity intervals or hill sprints. The other is a 2-3 hour run with 45-60 minutes of tempo pace in the middle. My top speed hasn't increased in the past four weeks, but my easy pace has improved a lot. With five more weeks of training ahead before I begin to taper off, I'm looking forward to getting even faster at zones 2 and 3.

I've been switching between potential race shoes on my faster and longer runs. While the Hoka Tecton X 3 are growing on me, and the Kjerag are fun, I'm 99% sure that I'll run Quad Rock 25 in La Sportiva Prodigio Pros. They're well suited to the course, fit me well, and feel stable, fast, and adequately cushioned.

My favorite run of March was a Friday afternoon outing in Lory State Park with my friend, Dana. He's not training for any event, but is naturally faster than me, so our runs are a great opportunity to go a little harder. On this occasion we did two warm up miles in the valley and then went rapidly up Quad Rock climb no. 3 and quickly down descent no. 3 to what will be the finish on race day. I recorded segment times that were only a few seconds off my personal bests, and wasn't wrecked afterwards.

Last Sunday I went out by myself and ran through the first and second Quad Rock climbs (and matching descents). My calves and hamstrings cramped after a long, hard push in the middle of the run, and I struggled for the last four miles. That's the first cramping this season, and a good reminder to fuel better on my long runs.

30 minutes of mobility and core strength exercises every morning are keeping my body in good shape. I've no Achilles tendinitis. My hips and back are pain free. No foot trouble. Inflammation and swelling in my right knee are troubling, but I'm keeping them in check with ice and, sometimes, ibuprofen. I hear that some people benefit from tart cherry extract as a supplement, and I'm going to give that a try.

After a couple years of being injured, I'm grateful to be close to 100 percent. It feels good.

by Sean Gillies at March 27, 2026 12:52 AM

At FOSSGIS 2026 in Göttingen, Sourcepole covered a range of topics with its talks:

  • Data catalogues with STAC and OGC API Records
  • Full-text search in real-time data with pg_search
  • Collaborative GIS with Jupyter Notebooks and JupyterGIS
  • QGIS Web Client (QWC) – news from the project

March 27, 2026 12:00 AM

March 26, 2026