Welcome to Planet OSGeo

April 02, 2025

As long-time sponsors of FOSSGIS, we stepped up our game this year and became Platinum Sponsors of FOSSGIS 2025. We are proud to be part of a thriving open-source GIS community and to contribute to such a great conference. Here’s a recap of everything we were involved in:


🚀 Talks & Presentations

🌍 QField: New Strategy and Application Potential
Berit and Marco presented how QField, with over 1 million downloads and 350,000 active users, is now recognized as a Digital Public Good aligned with the UN Sustainable Development Goals. Marco also shared the vision and mission behind QField’s development, highlighting our commitment to empowering field teams across the globe with open, user-friendly tools for data collection.
Real-world stories illustrated how QField helps bridge data gaps to support informed, sustainable decision-making.
👉 View talk

🌐 When Web Meets Desktop
Matthias demonstrated how Django can be used to build consumable geodata layers via OGC API – Features endpoints. His talk covered how to use Python and Django ORM to elegantly define data models and business logic, offering an alternative to complex database logic.
👉 View talk
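To illustrate what an OGC API – Features "items" endpoint serves, here is a minimal, framework-free sketch of the kind of payload a Django view could serialize from ORM objects. All names here (`build_items_response`, the `trees` collection, the sample row) are hypothetical illustrations, not code from the talk:

```python
# Hypothetical sketch: wrap (id, lon, lat, properties) tuples as a
# GeoJSON FeatureCollection, the core of an OGC API - Features response.

def build_items_response(rows, collection_id):
    features = [
        {
            "type": "Feature",
            "id": fid,
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props,
        }
        for fid, lon, lat, props in rows
    ]
    return {
        "type": "FeatureCollection",
        "features": features,
        "numberReturned": len(features),
        "links": [{"rel": "self", "href": f"/collections/{collection_id}/items"}],
    }

response = build_items_response(
    [(1, 7.63, 51.96, {"species": "oak"})], "trees"
)
```

In a Django view, the `rows` would come from an ORM queryset and the dict would be returned as a `JsonResponse`.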

☁ Extending QFieldCloud – Ideas and Practical Examples
Michael showed how QFieldCloud can be extended with Django apps, sharing practical implementations such as automated project generation and integration of remote sensing workflows.
👉 View talk

🔌 QField Plugins – Examples and Possibilities
In a lightning talk, Michael introduced useful QField plugins, explained how to install and use them, and explored how they can enhance your mobile GIS workflows.
👉 View talk

🧪 Hands-on qgis-js: Building Interactive QGIS-Based Web Maps
In this practical workshop, Michael guided participants through using qgis-js, an exciting new project that brings QGIS functionality directly into the browser.
👉 View talk

💬 QGIS AMA Expert Session
Matthias and Marco hosted a live Q&A session where attendees could ask everything about QGIS development, best practices, organisation and real-world applications.


🤝 At the Booth

Our QField booth was buzzing with activity all week – from plugin demos and project showcases to deep dives into QFieldCloud and field mapping workflows. We had great conversations, received valuable feedback, and met many enthusiastic users.


💚 Supporting Open Source

We were proud to be Platinum Sponsors of FOSSGIS 2025. Supporting open-source events like this is essential for fostering innovation, collaboration, and community-driven growth in the GIS world.


👋 Looking Ahead

Thank you to the organisers, speakers, and everyone who joined us in Münster. We left the event full of ideas, motivation, and appreciation for this community – and we’re already looking forward to the next FOSSGIS!

#QField #QFieldCloud #FOSSGIS2025 #OpenSourceGIS #QGIS #SupportOpenSource

by Marco Bernasocchi at April 02, 2025 05:53 AM

April 01, 2025

There are now nearly 3,000 Maps in the Wild on Mappery. If you are new to the site, it may be a little overwhelming to find your way around our back catalogue (the editors speak English not American btw), but there is a great way to browse our collection: using a map, of course! Who knew that a map was a great way to organise a massive collection of data? Go on, enjoy a few maps in the wild on our Mapping Maps in the Wild page; there’s enough there to keep you occupied for a few hours/days/weeks!

This Map in the Wild appeared a year or so ago. It still makes me smile, and I wonder whether it is a real tattoo disaster or an April Fool’s joke.

by Steven at April 01, 2025 10:45 AM

Matt Malone spotted this, he said “Babe…wake up, the new South Amerfrica globe just dropped!”

I thought this could be a fun game on April Fools’ Day, sometimes known as Trump This Day. Loads of possibilities open up – Amerada, Ameriland, Ameranama, Amerikraine, Russikraine and so on. Post your global mix-ups in the comments.

by Steven at April 01, 2025 09:00 AM

March 30, 2025

A productive week six is done!

  • 22.8 miles running

  • 11 hours, 39 minutes all training

  • 2,277 ft D+

That's not a lot of running, but it's the most I've done in a week since last July. I did two hill workouts outside on a 10% grade stretch of single track above Pineridge open space, Tuesday and Thursday. Today, Sunday, I did an easy long run from my house to the same dirt climb, and went up to the bench one time. My left Achilles, which has been nagging me, feels better. Weather permitting, I'll run 3-4 days next week, and increase my mileage to 25-26.

by Sean Gillies at March 30, 2025 09:18 PM

Raf shared this – a cardboard cube containing 3 litres of Dolmens wine from the winemaking region of Empordà in Catalunya, by Celler Cooperatiu d’Espolla.

The title of this post is a little homage to “Box of Rain” by Phil Lesh of the Grateful Dead, who passed away a few months ago (a bit off track, I know, but hey!)

by Steven at March 30, 2025 10:00 AM

March 29, 2025

At the end of yesterday’s TimeGPT for mobility post, we concluded that TimeGPT’s training set probably included a copy of the popular BikeNYC timeseries dataset and that, therefore, we were not looking at a fair comparison.

Naturally, it’s hard to find mobility timeseries datasets online that are publicly available but haven’t been widely disseminated, and therefore may have slipped past the scrapers of foundation model builders.

So I scoured the Austrian open government data portal and came up with a bike-share dataset from Vienna.

Dataset

SharedMobility.ai dataset published by Philipp Naderer-Puiu, covering 2019-05-05 to 2019-12-31.

Here are eight of the 120 stations in the dataset. I’ve resampled the number of available bicycles to the maximum hourly value and made a cutoff in mid-August (before a larger data collection gap and the less busy autumn and winter seasons):
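The hourly-maximum resampling step can be sketched with pandas (the column name and timestamps below are made up for illustration):

```python
# Sketch: reduce raw availability records to the maximum number of
# free bikes per hour. "free_bikes" is a hypothetical column name.
import pandas as pd

raw = pd.DataFrame(
    {"free_bikes": [5, 7, 6, 2, 3]},
    index=pd.to_datetime(
        [
            "2019-05-05 10:05",
            "2019-05-05 10:25",
            "2019-05-05 10:55",
            "2019-05-05 11:10",
            "2019-05-05 11:40",
        ]
    ),
)

# one value per hour: the maximum observed within that hour
hourly_max = raw["free_bikes"].resample("1h").max()
```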

Models

To benchmark TimeGPT, I computed different baseline predictions. I used statsforecast’s HistoricAverage, SeasonalNaive, and AutoARIMA models and computed predictions for horizons of 1 hour, 12 hours, and 24 hours.

Here are examples of the 12-hour predictions:

We can see how HistoricAverage is pretty much a straight line at the average past value. A little more sophisticated, SeasonalNaive assumes that the future will be a repeat of the past (i.e. the previous day), which results in the shifted curve we can see in the above examples. Finally, there’s AutoARIMA, which seems to do a better job than the first two models but also takes much longer to compute.
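The two simpler baselines are easy to sketch in pure NumPy (these are illustrative re-implementations, not the statsforecast code used in the experiment; AutoARIMA is omitted since order selection and fitting don’t reduce to a few lines):

```python
import numpy as np

def historic_average(y, horizon):
    """Predict the mean of all past values for every future step."""
    return np.full(horizon, y.mean())

def seasonal_naive(y, horizon, season_length=24):
    """Repeat the last observed season (here: the previous day)."""
    last_season = y[-season_length:]
    reps = int(np.ceil(horizon / season_length))
    return np.tile(last_season, reps)[:horizon]

history = np.arange(48, dtype=float)  # two fake "days" of hourly data
ha = historic_average(history, 12)
sn = seasonal_naive(history, 12)
```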

For comparison, here’s TimeGPT with a 12-hour horizon:

You can find the full code at https://github.com/anitagraser/ST-ResNet/blob/570d8a1af4a10c7fb2230ccb2f203307703a9038/experiment.ipynb

Results

In the following table, the best model overall is AutoARIMA at the 1-hour horizon (unsurprisingly, the shortest one). TimeGPT achieves the best results for the 12- and 24-hour horizons.

Model            Horizon  RMSE
HistoricAverage  1        7.0229
HistoricAverage  12       7.0195
HistoricAverage  24       7.0426
SeasonalNaive    1        7.8703
SeasonalNaive    12       7.7317
SeasonalNaive    24       7.8703
AutoARIMA        1        2.2639
AutoARIMA        12       5.1505
AutoARIMA        24       6.3881
TimeGPT          1        2.3193
TimeGPT          12       4.8383
TimeGPT          24       5.6671
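The RMSE values in the table follow the usual definition: the root of the mean squared difference between observed and predicted availability. A minimal sketch (with made-up numbers):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between observations and predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# illustrative numbers only, not values from the experiment
error = rmse([3, 5, 7], [3, 5, 10])
```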

AutoARIMA and TimeGPT are pretty closely tied. Interestingly, the SeasonalNaive model performs even worse than the very simple HistoricAverage, which is an indication of the irregular nature of the observed phenomenon (probably caused by irregular restocking of stations, depending on the system operator’s decisions).

Conclusion & next steps

Overall, TimeGPT struggles much more with the longer horizons than in the previous BikeNYC experiment: the error more than doubles between the 1-hour and 12-hour predictions. TimeGPT’s prediction quality only narrowly out-competes AutoARIMA’s for 12 and 24 hours.

I’m tempted to test AutoARIMA for the BikeNYC dataset to further complete this picture.

Of course, the SharedMobility.ai dataset has been online for a while, so I cannot be completely sure that we now have a fair comparison. For that, we would need a completely new / previously unpublished dataset.

by underdark at March 29, 2025 09:38 PM

March 28, 2025

The problem

GRASS GIS offers powerful tools for working with temporal data. You can create space-time raster or vector datasets, and register these in a temporal database that’s automatically managed by GRASS. A key feature of this temporal framework is that the temporal database is mapset-specific. So, space-time datasets and registered time series maps in a mapset are stored in a temporal database inside the same mapset.

The way GRASS handles spatial data is intuitive and powerful. Yet, I ran into a problem after I renamed a mapset. As it turns out, the mapset name is an integral part of how space-time datasets and data layers are registered in the temporal database, and changing the mapset name doesn’t automatically update those references. So renaming the mapset rendered my space-time datasets inaccessible. As far as I could tell, there’s no built-in mechanism in GRASS to resolve this.

A possible solution

By default, GRASS stores the temporal database as a SQLite3 file located in the tgis folder inside the mapset. This means that, in principle, you could manually open that database and replace all references to the old mapset name with the new mapset name.

Caution

It is generally not advisable to make any manual changes to a GRASS database. Only do this when you are really sure what you are doing, and always make a backup first.

Still, I decided to give it a go. Rather than modifying the SQLite database directly, I opted for a safer approach. I dumped the contents of the database to a text file, made the changes there, and then restored the database from the modified dump.

The first step, obviously, is to make a backup of the SQLite file. Next, I exported the entire SQLite database using the .dump command, which creates a text-based SQL script of the database.

> cd path_to_the_temporal_db
> cp sqlite.db backup-location/sqlite_backup.db
> sqlite3 sqlite.db
> .output temp_dump.sql
> .dump
> .exit

I then opened the temp_dump.sql file in a text editor and used a simple search-and-replace to update all occurrences of the old mapset name to the new one. Finally, I recreated the SQLite database by reading the updated temp_dump.sql file back in with the .read command.

> cd path_to_the_temporal_db
> rm sqlite.db
> sqlite3 sqlite.db
> .read temp_dump.sql
> .exit

The result: all the space-time datasets are available again from within GRASS :-).

Wrapping it up

To make this repeatable (and reduce the chance of messing up manual steps), I wrapped the process in a simple Python script. The script backs up the database, dumps its contents, performs the replacement, and restores the modified version. You can optionally specify a backup location, but if you don’t, it will create the backup in the same folder.

import sqlite3
import subprocess
import os
import shutil


def replace_string_in_qlite(input_db, old_string, new_string, backup_name=None):
    # Step 1: Define backup path
    if not backup_name:
        backup_name = input_db + ".backup"

    # Step 2: Create backup
    if os.path.exists(backup_name):
        raise FileExistsError(
            f"Backup file '{backup_name}' already exists. Aborting to prevent overwrite."
        )

    shutil.move(input_db, backup_name)
    print(f"Original database backed up to: {backup_name}")

    # Step 3: Dump the SQL from backup DB
    dump_file = "temp_dump.sql"
    with open(dump_file, "w", encoding="utf-8") as f:
        subprocess.run(["sqlite3", backup_name, ".dump"], stdout=f, check=True)

    # Step 4: Read, modify, and write the dump
    with open(dump_file, "r", encoding="utf-8") as f:
        sql_content = f.read()

    modified_sql = sql_content.replace(old_string, new_string)

    with open(dump_file, "w", encoding="utf-8") as f:
        f.write(modified_sql)

    # Step 5: Restore the modified dump to the original filename
    with open(dump_file, "r", encoding="utf-8") as f:
        conn = sqlite3.connect(input_db)
        cursor = conn.cursor()
        cursor.executescript(f.read())
        conn.commit()
        conn.close()

    os.remove(dump_file)
    print(
        f"Replaced '{old_string}' with '{new_string}' and saved new DB as: {input_db}"
    )

As an example, suppose I have a GRASS database with a project called Climate and, inside it, a mapset named Bioclim. After renaming the mapset to bioclim_variables, the space-time datasets became inaccessible. Running the script solves that:

db2replace = "/home/paulo/GRASSdb/Climate/tgis/sqlite.db"
db2backup = "/home/paulo/Desktop/sqlite_backup.db"
replace_string_in_qlite(db2replace, "Bioclim", "bioclim_variables")

Crisis averted. And, as a bonus, this little exercise gave me a somewhat better understanding of how GRASS handles spatial and temporal data under the hood.

That said, as mentioned earlier, directly modifying the GRASS database is generally discouraged. So, as a disclaimer, this post is mostly a note to my future self. You’re welcome to use it, but do so at your own risk! And, if you know a better way, or if I overlooked a standard way to deal with this in GRASS, please let me know.

by Paulo van Breugel at March 28, 2025 11:00 PM

tl;dr: Maybe. Preliminary results certainly are impressive.

Introduction

Crowd and flow predictions have been very popular topics in mobility data science. Traditional forecasting methods rely on classic machine learning models like ARIMA, later followed by deep learning approaches such as ST-ResNet.

More recently, foundation models for timeseries forecasting, such as TimeGPT, Chronos, and Lag-Llama, have been introduced. A key advantage of these models is their ability to generate zero-shot predictions, meaning that they can be applied directly to new tasks without requiring retraining for each scenario.

In this post, I want to compare TimeGPT’s performance against traditional approaches for predicting city-wide crowd flows.

Experiment setup

The experiment builds on the paper “Deep Spatio-Temporal Residual Networks for Citywide Crowd Flows Prediction” by Zhang et al. (2017). The original repo referenced on the homepage no longer exists. Therefore, I used the fork https://github.com/topazape/ST-ResNet as a starting point.

The goals of this experiment are to:

  1. Get an impression of how TimeGPT predicts mobility timeseries.
  2. Compare TimeGPT to classic machine learning (ML) and deep learning (DL) models.
  3. Understand how different forecasting horizons impact predictive accuracy.

The paper presents results for two datasets (TaxiBJ and BikeNYC). The following experiment only covers BikeNYC.

You can find the full notebook at https://github.com/anitagraser/ST-ResNet/blob/079948bfbab2d512b71abc0b1aa4b09b9de94f35/experiment.ipynb

First attempt

In the first version, I applied TimeGPT’s historical forecast function to generate flow predictions. However, there was an issue: the built-in historical forecast function ignores the horizon parameter, making it impossible to control the horizon and ensure a fair comparison.

Refinements

In the second version, I therefore added backtesting with customizable forecast horizon to evaluate TimeGPT’s forecasts over multiple time windows.
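The rolling-origin backtesting idea can be sketched generically in a few lines (this is not TimeGPT’s API; `last_value` is a stand-in forecaster and all names are illustrative):

```python
# Sketch of rolling-origin backtesting: slide a cutoff over the series,
# forecast `horizon` steps from each cutoff, and collect (actual,
# predicted) pairs for later error metrics.
import numpy as np

def backtest(y, forecast_fn, horizon, start, step):
    actuals, preds = [], []
    for cutoff in range(start, len(y) - horizon + 1, step):
        history = y[:cutoff]
        actuals.append(y[cutoff:cutoff + horizon])
        preds.append(forecast_fn(history, horizon))
    return np.array(actuals), np.array(preds)

# naive "repeat the last value" forecaster standing in for TimeGPT
last_value = lambda history, h: np.full(h, history[-1])

y = np.arange(10, dtype=float)
actual, pred = backtest(y, last_value, horizon=2, start=6, step=2)
```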

To reproduce the original experiments as faithfully as possible, both inflows and outflows were included in the experiments.

I ran TimeGPT for different forecasting horizons: 1 hour, 12 hours, and 24 hours. (In the original paper (Zhang et al. 2017), only one-step-ahead (1-hour) forecasting is performed, but it is interesting to explore the additional challenge that longer forecast horizons pose.) Here’s an example of the 24-hour forecast:

The predictions pick up on the overall daily patterns but the peaks are certainly hit-and-miss.

For comparison, here are some results for the easier 1-hour forecast:

Not bad. Let’s run the numbers! (And by that I mean: let’s measure the error.)

Results 

The original paper provides results (RMSE, i.e. smaller is better) for multiple traditional ML models and DL models. Adding our experiments to these results, we get:

Model                 RMSE
ARIMA                 10.56
SARIMA                10.07
VAR                    9.92
DeepST-C               8.39
DeepST-CP              7.64
DeepST-CPT             7.56
DeepST-CPTM            7.43
ST-ResNet              6.33
TimeGPT (horizon=1)    5.70
TimeGPT (horizon=12)   7.62
TimeGPT (horizon=24)   8.93

Key takeaways

  • TimeGPT with a 1-hour horizon outperforms all ML and DL models.
  • For longer horizons, TimeGPT’s accuracy declines but remains competitive with DL approaches.
  • TimeGPT’s pre-trained nature means that we can make predictions immediately, without any prior training.

Conclusion & next steps

These preliminary results suggest that timeseries foundation models, such as TimeGPT, are a promising tool. However, a key limitation of the presented experiment remains: since the BikeNYC data has been public for a long time, it is quite possible that TimeGPT saw this dataset during its training. This raises questions about how well such models generalize to truly unseen datasets. To address this, the logical next step would be to test TimeGPT and other foundation models on an entirely new dataset to better evaluate their robustness.

We also know that DL model performance can be improved by providing more training data. It is therefore reasonable to assume that specialized DL models will outperform foundation models once they are trained with enough data. But in the absence of large-enough training datasets, foundation models can be an option.

In recent literature, we also find more specific foundation models for spatiotemporal prediction, such as UrbanGPT https://arxiv.org/abs/2403.00813, UniST https://arxiv.org/abs/2402.11838, and UrbanDiT https://arxiv.org/pdf/2411.12164. However, as far as I can tell, none of them have published the model weights.

If you want to join forces, e.g. add more datasets or test other timeseries foundation models, don’t hesitate to reach out.

by underdark at March 28, 2025 10:59 PM

March 26, 2025

It never ceases to make me smile how many bottles of wine or beer end up here as maps in the wild. This one comes via our friend Raf in Barcelona.

“Terra Seca is a red wine from Terra del Priorat cellar, made with garnatxa negra and carinyena, in a bottle dressed with contour lines”

by Steven at March 26, 2025 10:00 AM

My week 5 was a light one. I did some indoor workouts early in the week, some telemark skiing on Friday, and then short and easy trail runs Saturday and Sunday. Saturday's was my first run above 8000 ft elevation this season, on some very nice trails outside Nederland, Colorado. Here are the numbers, not including my skiing, which I didn't record.

  • 8.5 miles running

  • 5 hours, 6 minutes all training

  • 761 ft D+ running

[Image: A sign next to a gravel trail through pine trees – https://live.staticflickr.com/65535/54411148790_ed99e98cf8_b.jpg]

by Sean Gillies at March 26, 2025 03:02 AM

March 25, 2025

Dear OTB community,

We are happy to announce that OTB version 9.1.1 has been released! Ready-to-use binary packages are available on the package page of the website. The Docker images are available with different Python versions (3.8 by default, 3.10 for 9.1.1_ubuntu22, 3.12 for 9.1.1_ubuntu24): docker pull orfeotoolbox/otb:9.1.1 It is also possible to checkout […]

by Tristan Laurent at March 25, 2025 10:50 AM

The “Security project for QGIS” is now public! Pledge now!

The goal of this project is to mutualize funding to improve QGIS security to the highest levels.

Oslandia and other involved partners, especially OPENGIS.ch are OpenSource “pure players” and main contributors to QGIS. This project is an initiative by Oslandia and is endorsed by the QGIS.org association. We work closely with the community of developers, users and stakeholders of QGIS. This project involves QGIS core committers willing to advance QGIS security.

Global context

New regulations, like NIS2 and the CRA in Europe, as well as other international or local regulations, will come into force within the next couple of years. They require software and software producers to improve their cybersecurity practices. Open-source software, while usually receiving special treatment, is concerned too. The estimated cost of the CRA’s impact on an open-source project amounts to +30%.

As for QGIS, we consider that the project falls short of what would be sufficient to comply with these regulations. We also do not fulfill requirements coming from our end-users in terms of overall software quality regarding security, processes in place to ensure trust in the supply chain, and overall security culture in the project.

We have been discussing this topic with clients having large deployments of QGIS and QGIS server, and they stressed the issue, stating that cybersecurity is one of their primary concerns, and that they are willing to see the QGIS project move forward in this area as soon as possible. QGIS faces the risk of IT departments blocking QGIS installations if they consider the project not having enough consideration for security.

Also, requests to security@qgis.org have grown significantly.

Project goals

Oslandia, with other partners and backed by clients and end-users, is launching the “Security project for QGIS”: we identified key topics where security improvements can be achieved, classified them, and created work packages, with budget estimations.

  • The main goal is simple: raise the cybersecurity level of the QGIS project
  • Fulfill cybersecurity requirements from regulations and end-users
  • Make QGIS an example of a security-aware open-source project, helping other OSGeo projects improve

While QGIS and QGIS Server are the main components on which this project focuses, improving QGIS security as a whole also means considering underlying libraries (e.g. GDAL/OGR, PROJ, GEOS…).

This project is a specific effort to raise the level of security of QGIS. Maintaining security in the long term will need further efforts, and we encourage you to sponsor QGIS.org, becoming a sustaining member of QGIS.

Memory safety, signing binaries, supply chain management, contributing processes, plugin security, cybersecurity audits and many more topics are included in this project. You can see all items as well as work packages on the dedicated website:

https://security.qgis.oslandia.com

Project organization – Pledge !

Any organization interested in improving QGIS security can contribute to funding the project. We are looking for an estimated total amount of 670K€, divided into 3 work packages. ➡ Pledge now!

Once funded, Oslandia and partners will start working on Work Package 1 in 2025. We intend to work closely with the QGIS community, QGIS.org, interested partners and users. Part of the work consists of improvements to the current system; other parts require changes to processes or developers’ habits. Working closely with the user and developer community to raise our security awareness is fully part of the project.

We will deliver improvements in 2025 and until 2027. You can see the full list of topics, work packages and estimated budget on the project’s dedicated page: security.qgis.oslandia.com. You are invited to participate, and also to help spread the word and recruit other contributors!

We want to especially thank Orange France for being a long-time supporter of OpenSource in general and QGIS particularly, and the first backer of the Security Project for QGIS !

Should you have any question, or need further material to convince other stakeholders, get in touch !

by Vincent Picavet at March 25, 2025 09:21 AM

This year, OPENGIS.ch celebrated its 10th anniversary in Bern with an afternoon full of workshops attended by clients, long-term friends, and colleagues. Here we give a glimpse of QField: its vision, where it’s headed, and the exciting features users can look forward to.

QField was created on June 8th, 2011, with its first commit titled “added first script”. Since then, it has grown into a powerful tool with a clear vision for the future: to empower people to map and understand the world, tackle daily challenges, and address global issues. Over the next ten years, QField aims to make this vision a reality for everyone, everywhere.

Layers of cake, each designed by a leading member of the QField ecosystem team, were presented with 2034 in mind: a QField that is intuitive and accessible to anyone wanting to map our world, while pioneering an innovative and collaborative app for the geospatial community. And last but not least: building strong and engaged communities to drive further adoption of the QField ecosystem.

This vision is taking shape through strategic partnerships with geospatial stakeholders: hardware manufacturers, ambassadors, trainers and technology partners. For the team, it is clear that good collaboration is key to building a healthy and sustainable ecosystem. Community as well as financial sustainability can become a strong reality with user groups, sponsors and crowdfunding.

And then it became reality: In the summer of 2024, heavy rains caused severe flooding in Switzerland and, suddenly, QField became a vital tool for supporting emergency response through data surveys and photo documentation.

After this emotional story, technical lead Mathieu took over and shared other QField success stories and several mapping use cases with partners in Finland and Tonga. But seamless fieldwork wouldn’t be possible without QFieldCloud, so Ivan provided an insight into the QFieldCloud enhancements of recent years before diving into the busy server-side roadmap for 2025, which includes many new features related to authentication, security, internationalization and performance. Finally, Zsanett shared QField product news and updates, including new storage options like WebDAV and new packaging capabilities. Last but not least, the new Fangorn version introduces features developed by the growing QField community.

Building communities by sharing thoughts and ideas for the ecosystem is now possible through the ideas.qfield.cloud platform, open to everybody to suggest new ideas for QField.

The final topping of the (layer) cake: the new QField plugin framework was presented to the workshop attendees by Mathieu, who explained how field workflows can be enhanced and optimized through the development of plugins – unique extensions that further personalize QField. For example, with the Routing Plugin, users can compute optimal routes between locations directly in QField using an external API. To make team efforts in the field even more efficient, the Live Location Plugin lets each team member see the location of the other members on the QField map, making coordination in the field easier.

Last but not least, the workshop ended with a Q&A session, where several topics were addressed such as virtual reality, AI, machine learning, etc. This was followed by a happy and cheerful welcome drink with OPENGIS.ch partners. 🍻

by Anja Ottiger at March 25, 2025 09:00 AM

March 24, 2025

Yves van Goethem shared this a while ago.

“I love @openstreetmap. I have heavily relied on it for all my bike trips & long travels, I have modestly contributed to it over the years & I have used tons of apps for navigation, layers, drinking water, etc. Never would I have guessed that one day I would design a PLAY MAT with OSM data and cartoonist layers and objects. Well here it is, a play mat of my hometown.”

by Steven at March 24, 2025 10:00 AM

March 23, 2025

This post is about recent updates to the USGS Spectral Library Data Download.

Recently, the USGS removed the previous Spectral Library download service; the new download information is at https://www.usgs.gov/media/files/usgs-spectral-library-data-download-information-docx, which describes downloading the whole archive as a .zip file.

Consequently, the Semi-Automatic Classification Plugin tool to Download USGS Spectral Library no longer works.

However, it is still possible to import the USGS libraries using the Import tool.
The main steps to import a library are:
  1. download the main archive ASCIIdata_splib07a.zip from https://www.sciencebase.gov/catalog/item/586e8c88e4b0f5ce109fccae
  2. create a directory and move one spectral library file (name ending with _AREF.txt) and the wavelengths file (splib07a_Wavelengths_ASD_0.35-2.5_microns_2151_ch.txt) into it
  3. create a .zip file of this directory, which can then be imported in SCP.
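Steps 2 and 3 can be scripted; here is a small sketch that copies the two files into a directory and zips it (the demo uses stand-in files and a hypothetical `build_scp_archive` helper; with real data, point it at the files extracted from ASCIIdata_splib07a.zip):

```python
import shutil
import tempfile
import zipfile
from pathlib import Path

def build_scp_archive(library_file, wavelengths_file, out_dir):
    """Copy the two files into out_dir and zip that directory (steps 2-3)."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    for src in (library_file, wavelengths_file):
        shutil.copy(src, out_dir)
    # creates out_dir.zip next to the directory and returns its path
    return shutil.make_archive(str(out_dir), "zip", root_dir=out_dir)

# Demo with stand-in files instead of the real USGS downloads.
work = Path(tempfile.mkdtemp())
lib = work / "example_spectrum_AREF.txt"
wav = work / "splib07a_Wavelengths_ASD_0.35-2.5_microns_2151_ch.txt"
lib.write_text("reflectance data")
wav.write_text("wavelengths")

archive = build_scp_archive(lib, wav, work / "scp_bundle")
members = zipfile.ZipFile(archive).namelist()
```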

For any comment or question, join the Facebook group or GitHub discussions about the Semi-Automatic Classification Plugin.

by Luca Congedo (noreply@blogger.com) at March 23, 2025 12:36 PM

March 22, 2025

March 21, 2025

Description

The QGIS graphical modeler makes it straightforward to link together different operations and algorithms from the toolbox, allowing you to build custom models for a variety of spatial tasks. One practical example is creating a model to compare two raster layers by displaying their values in a scatter plot.

I developed this function while teaching a course on spatial multi-criteria decision analysis (MCDA), part of the second-year Naturally Geographic program at the HAS green academy. In MCDA, we often need to compare indicators, such as the distance to a wind turbine versus the cost of the land. Because each indicator may have different units, it’s helpful to normalize them onto a common scale (e.g., 0–1 or 0–100). In QGIS, you can do this easily using the Fuzzify functions.

Once layers are rescaled, I like to compare the original and transformed versions in a scatter plot. While there isn’t a built-in tool specifically for this, you can simply create a point layer, extract each raster’s values at those points, and use the Vector layer scatterplot function for visualization. Although this manual approach is straightforward, I wanted to automate the intermediate steps. Hence the custom model, presented below.

Example

Use the Create random raster layer algorithm to generate a raster with random values (e.g., between 0 and 100) in a specified extent and cell size.

Run the Fuzzify raster (Gaussian membership) function on the random raster. This will produce a new layer with values between 0 and 1, based on a Gaussian fuzzy membership curve.
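The Gaussian membership behind this step can be sketched in NumPy. QGIS documents the function as μ(x) = exp(−f1·(x−f2)²), with f1 the spread and f2 the midpoint; the parameter values below are illustrative, not QGIS defaults:

```python
import numpy as np

def fuzzify_gaussian(values, f1, f2):
    """Gaussian fuzzy membership: 1 at the midpoint f2, decaying with f1."""
    values = np.asarray(values, dtype=float)
    return np.exp(-f1 * (values - f2) ** 2)

# illustrative cell values from a 0-100 random raster
cells = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
fuzzy = fuzzify_gaussian(cells, f1=0.001, f2=50.0)
```

Plotting `cells` against `fuzzy` gives exactly the smooth, Gaussian-like shape described below.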

Grab the model from GitHub and open it in QGIS. Provide your two rasters as inputs (e.g., the original random raster and its fuzzified version).

After running the model, an HTML file (e.g., random_gaussian.html) will be generated, displaying a scatter plot. The X-axis shows the original raster values (0–100), while the Y-axis shows the corresponding fuzzified values (0–1). You’ll notice a smooth, Gaussian-like shape that illustrates the relationship between the original and transformed raster layers.

A simple solution if you want to quickly compare the values of two raster layers :-).

by Paulo van Breugel at March 21, 2025 11:00 PM

Location: Remote, preferably with at least 4h overlap to CEST office hours

Employment Type: Full-time (80-100%)

About OPENGIS.ch:

OPENGIS.ch is a team of Full-Stack GeoNinjas offering personalized open-source geodata solutions to Swiss and international clients. We are dedicated to using and developing open-source tools, providing flexibility, scalability, and future-proof solutions, and playing a key role in the free and open-source geospatial community. We pride ourselves on our agile and distributed nature, which allows us to have a motivated and multicultural team that supports each other in working together.

Job Description:

We are seeking a passionate and skilled Django Full-Stack Engineer with a strong affinity for DevOps to join our team. The ideal candidate will work primarily on QFieldCloud, our cutting-edge cloud-based solution that brings QGIS projects to the field. You will help develop and maintain the full stack of the QFieldCloud platform, ensuring high performance and stability and implementing new features.

Responsibilities:

  • Develop, test, and maintain the QFieldCloud platform using Django, Python, PostgreSQL and other modern web technologies.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Ensure the performance, quality, and responsiveness of the application.
  • Identify and correct bottlenecks and fix bugs.
  • Help maintain code quality, organization, and automation.
  • Work closely with the DevOps team to manage and optimize deployment pipelines, including Docker, Kubernetes, and other containerization and orchestration technologies.
  • Provide technical guidance and support to clients regarding deployment and usage of the platform.

Qualifications:

  • Strong experience with Django and Python in a full-stack capacity.
  • Proficiency in front-end technologies, including JavaScript, HTML5, and CSS3.
  • Experience with Linux, Docker (including Compose), Kubernetes, Git, and PostgreSQL.
  • Familiarity with geospatial concepts and web GIS applications is a plus.
  • Good understanding of software deployment, containerization, and continuous integration practices.
  • Excellent problem-solving skills and ability to work independently.
  • Strong communication skills and ability to work in a distributed team environment.
  • Fluency in English; knowledge of German, French, Italian, Spanish, or Romansh is a plus.

Perks:

At OPENGIS.ch, we enjoy a variety of perks that make our work experience rewarding. Here’s what we get:

  • Flexible Work Hours: We have the freedom to set our own schedules, which helps us better manage our personal and professional lives.
  • Remote Work Opportunities: We can work from anywhere, giving us the flexibility to choose our work environment.
  • Learning and Development: We are encouraged to grow professionally with access to training programs and workshops.
  • Innovative Environment: We thrive in an atmosphere that’s at the forefront of GIS technology, which keeps our work exciting.
  • Collaborative Team: We value teamwork and the exchange of ideas, making our workplace dynamic and supportive.

Questions for Applicants:

  • What’s your experience with software deployment and containers?
  • What is your favourite Django app? Why? Have you ever upstreamed a patch to Django or an app? If so, please provide a link to the pull request.
  • What did you last learn out of interest?

How to Apply:

If you are excited about this opportunity and meet the qualifications, please submit an application at opengis.ch/jobs

Join us at OPENGIS.ch and become a part of our mission to provide innovative open-source geospatial solutions! 🌍💻🚀

by Marco Bernasocchi at March 21, 2025 04:06 PM

March 20, 2025

The FOSSGIS Conference 2025 takes place from 26 to 29 March 2025 at Schloss Münster and online. Only a few days remain until the conference; the countdown is running and preparations are in full swing! The conference is organized by the non-profit FOSSGIS e.V. and the OpenStreetMap community in cooperation with the Institute for Geoinformatics of the University of Münster.

Once again, we are delighted by the great interest in the conference. The one-thousand mark will be passed once more: 750 participants are expected on site in Münster, and more than 250 will join online.

FOSSGIS Conference 2025 at Schloss Münster

No ticket yet?

Online tickets are still available at https://www.fossgis-konferenz.de/2025/anmeldung/

FOSSGIS 2025 Program

This year, the FOSSGIS team is once again looking forward to an exciting and varied program with numerous talks, expert Q&A sessions, demo sessions, BoFs, user meetings, and 21 workshops.

https://www.fossgis-konferenz.de/2025/programm/

There are still places available in the workshops. Feel free to book one and take the opportunity to build up knowledge on a topic in a short time.

FOSSGIS connects

There are numerous opportunities to network around the conference: the user meetings (online participation is possible), the evening events, OSM Saturday, and the community sprint on Saturday.

https://www.fossgis-konferenz.de/2025/socialevents/

Job board

Take advantage of the job board around the conference: https://www.fossgis-konferenz.de/2025#Jobwand

FOSSGIS - a team event

Heartfelt thanks at this point to the conference sponsors, whose support contributes significantly to the success of the event.

FOSSGIS Conference 2025 sponsors

The conference would also not be possible without the dedication of the many volunteer helpers. Thank you very much!

Archive of FOSSGIS conferences

In the FOSSGIS archive you will find interesting contributions from past conferences: https://fossgis-konferenz.de/liste.html

Stay informed about the conference

Information about FOSSGIS can be found under the hashtag #FOSSGIS2025.

The FOSSGIS Team 2025 wishes everyone a safe journey and looks forward to an exciting conference in Münster.

March 20, 2025 12:00 AM

We are excited to announce the release of EOxCloudless 2024, our latest update to the Sentinel-2 cloudless imagery collection. This year's dataset brings enhanced clarity, improved consistency, and a more seamless visual experience for all your mapping and geospatial needs. Accessing Sentinel-2 dat ...

March 20, 2025 12:00 AM

March 19, 2025

March 18, 2025

Week 4 was light on trail running, because I'm letting my irritated left Achilles tendon settle down, but it was still a pretty good training week. I logged 34 minutes of max intensity intervals on an elliptical trainer, a new weekly high for me. I also did the usual yoga, weight training, pool exercise, and some biking.

  • 4.5 miles running

  • 10 hours, 3 minutes all training

  • 36 ft D+ running

The highlight was a long ride along creeks and rivers in Fort Collins and through the valley trails of Lory State Park. I'm getting more comfortable on the steep road descents between my house and Horsetooth Reservoir and am feeling more fit on the steepest climbs. I didn't set any records climbing up from the Blue Sky trailhead or up Centennial from the reservoir, but I was able to stay below my aerobic threshold. Last fall I was blowing apart on the same climbs.

https://live.staticflickr.com/65535/54394002276_d7935aedaf_b.jpg

A blue gravel bike next to the orange colored dirt singletrack of Colorado's Lory State Park.

by Sean Gillies at March 18, 2025 03:23 AM

March 17, 2025