Welcome to Planet OSGeo

July 02, 2022

I had leftover ground beef and lamb from making burgers last night and turned that into no-recipe Korean-style sloppy joes for lunch on the deck with Ruth and Bea. I started by making a generous sofrito of onion, celery, yellow bell pepper, garlic, and ginger in a cast-iron skillet. When that had softened and started to color, I pushed it to one side and browned the meat (half a pound of beef, half a pound of lamb), stirring all of it together when the meat had cooked through and crisped a bit. For the sauce, I stirred in three tablespoons of vegetarian oyster sauce, a splash of black vinegar, a teaspoon of sesame oil, a quarter cup of gochujang, a teaspoon of salt, a tablespoon of sugar, half a teaspoon of black pepper, and half a cup of water. When it had reduced, I checked the salt, added a bit more, and then spooned it onto toasted buns. Sesame seeds, green onion (from my garden), and cucumbers dressed with rice wine vinegar provided the finishing touch.

https://live.staticflickr.com/65535/52189042076_522108e18e_b.jpg

Sloppy joe on a toasted brioche bun with quick pickles and watermelon

I like the traditional sloppy joe, too. But I like it even more with ginger and sesame oil. The mild, sweet funkiness of gochujang goes very well with lamb.

by Sean Gillies at July 02, 2022 08:33 PM

July 01, 2022

The GeoTools team is pleased to share the availability of GeoTools 26.5:

  • geotools-26.5-bin.zip
  • geotools-26.5-doc.zip
  • geotools-26.5-userguide.zip
  • geotools-26.5-project.zip

This release is also available from the OSGeo Maven Repository and is made in conjunction with GeoServer 2.20.5.

Fixes and improvements:

  • Grid aggregation in the ElasticSearch community module has been improved
  • Fix aggregate queries

by Andrea Aime (noreply@blogger.com) at July 01, 2022 09:34 AM

We are happy to announce GeoServer 2.20.5 release is available with downloads (bin, war, windows), along with docs and extensions.

This is a maintenance release of the 2.20.x series recommended for production systems. This release was made in conjunction with GeoTools 26.5 and GeoWebCache 1.20.3.

Improvements and Fixes

  • The request logger is now configurable from the UI (from the “Global settings” panel).
  • Importer improvements to support REPLACE mode on raster layers (in addition to the existing support for vector ones).
  • The KML-PPIO module has graduated to extension (allows KML encoding of feature collections in WPS processes). It’s now included in the WPS plugin download.
  • WPS fetching of remote inputs can be disabled.
  • Allow controlling usage of headers in proxy base URL expansion at the workspace level.

For the full list of fixes and improvements, see 2.20.5 release notes.

About GeoServer 2.20

Additional information on GeoServer 2.20 series:

Release notes: ( 2.20.2 | 2.20.1 | 2.20.0 | 2.20-RC )

by Andrea Aime at July 01, 2022 12:00 AM

June 29, 2022

Today marks the 18th anniversary of IOSA, Internet and Open Source in Archaeology, the project/working group that I got started with Giovanni Luca Pesce in 2004. Luca Bianconi would join a few years later and give a substantial contribution to the development of the oldest active software project under the IOSA umbrella, Total Open Station.

It seems an appropriate time to announce that the iosa.it website is now available with a revamped look, and has become the single container for all content previously available in separate websites, such as the Quantitative Archaeology Wiki.

There will be no substantial improvements to the website, but I consider it a “living archive” so I’m going to add more content as I find it and have the time to organize it properly.

Please find it at https://www.iosa.it/ as usual, and browse like it’s 2004 again.

by Stefano Costa at June 29, 2022 10:41 AM

This article is part of our series on Delimitation of hydrographic sub-basins using QGIS and a DEM:

  1. Delimitation of hydrographic sub-basins using QGIS and a DEM
  2. Digital elevation models for hydrological studies
  3. Technical criteria for the delimitation of sub-basins (This post)
  4. Sub-basin delimitation process using QGIS

To delimit the hydrographic sub-basins of a country, it is necessary to establish a series of criteria that allow uniform results to be obtained throughout the study area.

Basic criteria

In this case, the following basic criteria were agreed with the Regional Water Administrations (ARAs):

  • Sub-basins are not delimited in basins with an area of less than 3500 km2.
  • Only the areas of the direct tributaries of the main river, which have a length equal to or greater than 55 km, are delimited as sub-basins.
  • Only the areas of the direct tributaries of the main river that have an area equal to or greater than 300 km2 are delimited as sub-basins.
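Expressed as code, the basic criteria boil down to a simple filter over candidate tributaries. Here is a minimal Python sketch; the field names and sample values are hypothetical (not from the actual Mozambique dataset), and the two tributary thresholds are interpreted as jointly required:

```python
# Basic criteria from the article: sub-basins are delimited only in
# basins of at least 3500 km2, and only for direct tributaries of the
# main river that are at least 55 km long and drain at least 300 km2.
MIN_BASIN_AREA_KM2 = 3500
MIN_TRIBUTARY_LENGTH_KM = 55
MIN_TRIBUTARY_AREA_KM2 = 300

def qualifying_tributaries(basin_area_km2, tributaries):
    """Return the tributaries to delimit as sub-basins under the basic criteria."""
    if basin_area_km2 < MIN_BASIN_AREA_KM2:
        return []  # no sub-basins are delimited in basins under 3500 km2
    return [
        t for t in tributaries
        if t["length_km"] >= MIN_TRIBUTARY_LENGTH_KM
        and t["area_km2"] >= MIN_TRIBUTARY_AREA_KM2
    ]

tributaries = [
    {"name": "A", "length_km": 80, "area_km2": 350},  # meets both thresholds
    {"name": "B", "length_km": 40, "area_km2": 350},  # too short
    {"name": "C", "length_km": 80, "area_km2": 100},  # drains too little area
]
print([t["name"] for t in qualifying_tributaries(4000, tributaries)])  # ['A']
```

The "other criteria" areas (source, mouth, residual sub-basins) would then be delimited separately for each basin.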

Other criteria

In addition to the sub-basins resulting from the application of the basic criteria, the following areas were delimited for each river basin:

  • Source area of the main river. In the case of transboundary rivers, that is, rivers that cross two or more states, the source may be outside the study area. In the case of Mozambique, there are several examples in which the source of the main river of the river basin is in another country (Zambezi, Limpopo, Incomati…). In these cases this area was not delimited.
  • Area of the mouth of the main river.
  • Area that includes the rest of the sub-basins, which do not meet the basic criteria, normally attached to the bed of the main river.

Exceptions

It was decided to delimit some sub-basins that did not meet the previous basic criteria and that we will call exceptions because they are sub-basins considered of great hydrological importance by the ARAs. An example of this is the delimitation of the sub-basins of the Umbelúzi River. Although the area of this basin is less than 3500 km2, it is of great hydrological importance because it contains a high percentage of large farms, with a very high demand for water.

In the next article, the last of the series, we will see how to do the whole process using QGIS.

The post Technical criteria for the delimitation of sub-basins was first published on iCarto.

by iCarto at June 29, 2022 08:17 AM

June 27, 2022

Week thirteen was my biggest week yet. I did a non-hilly speed workout on the dirt trails of Pineridge Natural Area, an uphill workout on Towers Trail, a tempo run at Maxwell, and two long runs on the weekend. I didn't feel great Friday, Saturday, or Sunday, but exceptionally nice running weather helped me survive the weekend miles.

  • 12 hours, 15 minutes

  • 57.0 miles

  • 7,664 ft D+

I didn't take any photos during my runs this week. I'll try to do better this week.

by Sean Gillies at June 27, 2022 03:56 AM

June 24, 2022

We are pleased to announce the release of QGIS 3.26 ‘Buenos Aires’!

Installers for all supported operating systems are already out. QGIS 3.26 comes with tons of new features, as you can see in our visual changelog. QGIS 3.26 Buenos Aires is named after last year’s FOSS4G host city.

We would like to thank the developers, documenters, testers and all the many folks out there who volunteer their time and effort (or fund people to do so). From the QGIS community we hope you enjoy this release! If you wish to donate time, money or otherwise get involved in making QGIS more awesome, please wander along to qgis.org and lend a hand!

QGIS is supported by donors and sustaining members. A current list of donors who have made financial contributions large and small to the project can be seen on our donors list. If you would like to become a sustaining member, please visit our page for sustaining members for details. Your support helps us fund our six monthly developer meetings, maintain project infrastructure and fund bug fixing efforts.

QGIS is Free software and you are under no obligation to pay anything to use it – in fact we want to encourage people far and wide to use it regardless of what your financial or social status is – we believe empowering people with spatial decision making tools will result in a better society for all of humanity.

by underdark at June 24, 2022 01:40 PM

June 23, 2022

I have a blog post up today at Crunchy Data on some of the mechanisms that underlie the PostgreSQL query planner, it’s pretty good if I do say so myself.

I was motivated to write it by a conversation over coffee with my colleague Martin Davis. We were talking about a customer with an odd query plan case and I was explaining how the spatial statistics system worked and he said “you should do that up as a blog post”. And, yeah, I should.

One of the things that is striking as you follow the PostgreSQL development community is the extent to which a fairly mature piece of technology like PostgreSQL is stacks of optimizations on top of optimizations on top of optimizations. Building and executing query plans involves so many different paths of execution, that there’s always a new, niche use case to address and improve.

I worked a political campaign a few years ago as a “data science” staffer, and our main problem was stitching together data from multiple systems to get a holistic view of our data.

That meant doing cross-system joins.

The first cut is always easy: pull a few records out of System A with a filter condition and then go to System B and pull the associated records. But then inevitably a new filter condition shows up and applied to A it generates so many records that the association step on B gets overloaded. But it turns out if I start from B and then associate in A it’s fast again.

And thus suddenly I found myself writing a query planner and executor.
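The decision described above (which system to drive the join from) can be sketched in a few lines. This is a toy illustration of the idea, not the actual campaign code:

```python
def plan_join(estimate_a, estimate_b):
    """Pick the cheaper direction for a cross-system join.

    estimate_a / estimate_b: estimated number of records the filter
    produces on each system; each driving record triggers one
    association lookup against the other side.
    """
    if estimate_a <= estimate_b:
        return ("A->B", estimate_a)  # drive from A, probe B
    return ("B->A", estimate_b)      # drive from B, probe A

# A selective filter on A: drive from A.
print(plan_join(100, 1_000_000))   # ('A->B', 100)
# A new filter blows up A's estimate: flip direction.
print(plan_join(500_000, 2_000))   # ('B->A', 2000)
```

A real planner does this across many tables at once, with statistics instead of hand-fed estimates, which is exactly the magic being described.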

It’s only when dumped into the soup of having to solve these problems yourself that you really appreciate the magic that is a mature relational database system. The idea that PostgreSQL can take a query that involves multiple tables of different sizes, with different join cardinalities, and different indexes and figure out an optimal plan in a few milliseconds, and then execute that plan in a streaming, memory efficient way…?

Magic is really the best word I’ve found.

June 23, 2022 08:00 AM

June 22, 2022

This article is part of our series on Delimitation of hydrographic sub-basins using QGIS and a DEM:

  1. Delimitation of hydrographic sub-basins using QGIS and a DEM
  2. Digital elevation models for hydrological studies (This post)
  3. Technical criteria for the delimitation of sub-basins
  4. Sub-basin delimitation process using QGIS

A digital elevation model (DEM) is a visual and mathematical representation that describes the altimetry of an area through a set of terrain elevations. Vegetation and man-made infrastructure (buildings, bridges, power lines, etc.) are not included; a DEM represents only the relief of the bare ground.

DEMs are particularly useful in hydrological modeling (delimitation of basins, calculations of flow accumulations, flow directions), soil mapping and territorial planning.
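All of these hydrological derivations start from comparing each DEM cell with its eight neighbours. As a toy illustration of the classic D8 flow-direction idea (real tools such as the GRASS and SAGA algorithms available in QGIS are far more robust, handling pits, flats and edges), here is a minimal Python sketch:

```python
def d8_direction(dem, row, col):
    """Return the (drow, dcol) offset of the steepest-descent neighbour
    of an interior DEM cell, or None if the cell is a pit or flat."""
    center = dem[row][col]
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            # diagonal neighbours are sqrt(2) cell-widths away
            dist = (dr * dr + dc * dc) ** 0.5
            drop = (center - dem[row + dr][col + dc]) / dist
            if drop > best_drop:
                best, best_drop = (dr, dc), drop
    return best

# Tiny hypothetical elevation grid (metres); the centre cell drains east.
dem = [
    [10, 9, 8],
    [10, 9, 5],
    [11, 10, 9],
]
print(d8_direction(dem, 1, 1))  # (0, 1)
```

Flow accumulation then follows by counting, for every cell, how many upstream cells drain through it, and basins are delimited by tracing those directions upstream from an outlet.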

There are currently quite a few sources from which DEMs with worldwide coverage can be downloaded for specific areas, for free and without restrictions on use. The growing availability, regular updating, and improving quality and precision of these DEMs are a great asset for such studies.

This article does not cover other options that require a paid license for download and/or use.

Global DEM

Some of the most widely used global DEMs for hydrological analysis worldwide and freely available are the following:

  • SRTMGL1 v003 (1 arc-second, 30 meters) and SRTMGL3 v003 (3 arc-seconds, 90 meters): The Shuttle Radar Topography Mission (SRTM) data sets are the result of a collaboration between NASA and the National Geospatial-Intelligence Agency (NGA), with the participation of the German and Italian space agencies. The purpose of SRTM was to generate a digital elevation model. These DEMs are the best known and most widely used globally. Version 3 is derived from the SRTM v2 data but removes the gaps that were present in previous versions of the SRTM data. The gaps are filled using the ASTER Global Digital Elevation Model (GDEM) Version 2.0, the Global Multi-resolution Terrain Elevation Data 2010 (GMTED2010), and the National Elevation Dataset (NED). It offers resolutions of 1 and 3 arc-seconds, that is, approximately 30 and 90 meters.
  • NASADEM HGT v001 (1 arc-second, 30 meters): NASADEM is distributed in 1-degree tiles (a little over 110 kilometers on a side) covering the Earth’s surface between 60º North and 56º South latitude, with a spatial resolution of 1 arc-second, approximately 30 meters. This DEM is derived from SRTM data, refined with ASTER GDEM v2 data, incorporating GLAS ground control points and additional backscatter and radiometric correction layers. Other enhancements include the use of the GDEM and PRISM AW3D30 DEMs and gap-fill interpolation.
  • ASTER GDEM v3 (1 arc-second, 30 meters): ASTER GDEM Version 3 data products offer substantial improvements in coverage and a reduction in the occurrence of artifacts, as well as improved spatial resolution and higher horizontal and vertical accuracy. Version 3 shows significant improvements over the previous version; however, the data still contains anomalies and artifacts that reduce its effectiveness for certain applications.
  • ALOS PALSAR: This DEM is one of the resources available among the Japan Aerospace Exploration Agency (JAXA) ALOS satellite products. ALOS (Advanced Land Observing Satellite), also known as DAICHI, carries three sensors on board: PRISM for panchromatic images, the PALSAR synthetic aperture radar, and the AVNIR radiometer. It has a higher spatial resolution than the SRTM and ASTER models: a native resolution of 30 meters, resampled to 12.5 meters.

Selection criteria

There is no perfect model for delimiting every kind of river basin; we need to select the one that best suits our use case.

To carry out the delimitation of the sub-basins of Mozambique, the iCarto team decided to use NASADEM. The decision to select this model over others was based on several criteria:

  • The starting basin layer had originally been produced from SRTM data. The sub-basin layer had to be adjusted to this layer, so it was more appropriate to use a DEM also derived from Shuttle Radar Topography Mission data.
  • NASADEM is the most up-to-date and corrected DEM among the models derived from SRTM data.
  • After reviewing different studies, it was concluded that for the delimitation of these sub-basins there would be no significant differences in the results obtained with any of the products with a resolution of 30 meters. Several studies have been published evaluating the vertical accuracy of these products. One of them is that of González-Morada and Viveen, who compared the ASTER GDEM, SRTM, AW3D30 and TanDEM-X DEMs with a set of 139 measurements collected by a dual-frequency Trimble 5800 GNSS receiver. The root mean square error (RMSE) was below 7 meters for all models.
  • Some tests were carried out using various models and it was concluded that a resolution of 30 m was sufficient for this study. A higher resolution would imply excessively high processing times in exchange for an insignificant increase in precision for this analysis.
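The RMSE cited in these accuracy studies is straightforward to compute once you have DEM elevations paired with GNSS checkpoint measurements. A minimal sketch (the elevation values below are made up for illustration):

```python
import math

def rmse(dem_values, gnss_values):
    """Root mean square error between DEM elevations and GNSS checkpoints."""
    errors = [d - g for d, g in zip(dem_values, gnss_values)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical elevations (metres) at the same four checkpoints.
dem = [103.0, 98.5, 251.2, 80.0]
gnss = [101.0, 99.0, 249.0, 83.0]
print(round(rmse(dem, gnss), 2))  # 2.13
```

An RMSE below 7 m, as reported for these 30 m products, is well within tolerance for delimiting sub-basins of hundreds of square kilometres.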

Where to download NASADEM products

NASADEM products can be downloaded from different platforms, two of the best known are:

In both cases, the download is free, only prior registration is required.

To download the NASADEM, for example, from the EarthExplorer search engine, it is only necessary to indicate the study area in the “Search Criteria” tab, search for and select “NASADEM_HGT” in the “Data Sets” tab, go to results and download all the files.

In the following articles of this series we will work on the DEM with QGIS and we will define the criteria for the delimitation of the sub-basins.

The post Digital elevation models for hydrological studies was first published on iCarto.

by iCarto at June 22, 2022 08:16 AM

June 21, 2022

JTS 1.19 has just been released!  There is a great deal of new, improved and fixed functionality in this release - see the GitHub release page or the Version History for full details.

This blog has several posts describing new functionality in JTS 1.19:

New Functionality

Improvements


Many of these improvements have been ported to GEOS and will appear in the upcoming version 3.11. In turn, this has provided the basis for new and enhanced functions in the next PostGIS release, and they will likely become available on other platforms via the many GEOS bindings and applications.

by Dr JTS (noreply@blogger.com) at June 21, 2022 10:34 PM

The question of why organizations are shy about their use of open source is an interesting one, and not completely obvious.

Open source luminary Even Rouault asks:

is there some explanation why most institutions can’t communicate about their PostGIS use ? just because it is a major hurdle for technical people to get their public relationship department approve a communication ? people afraid about being billed about unpaid license fees 🤣 ?

There’s really very little upside to publicizing open source use. There’s no open source marketing department to trumpet the brilliance of your decision, or invite you to a conference to give you an award. On the other hand, if you have made the mistake of choosing an open source solution over a well-known proprietary alternative, there is surely a local sales rep who will call your boss to tell them that you have made a big mistake. (You do have a good relationship with your boss, I hope.)

These reverse incentives can get pretty strong. Evendiagram reports:

Our small group inside a large agency uses postgis. We don’t talk about it, even internally, to avoid the C-suite forcing everyone back to oracle. RHEL repos allow us a lot of software that would otherwise be denied.

This reminds me of my years consulting for the British Columbia government, when technical staff would run data processing or even full-on public web sites from PostgreSQL/PostGIS machines under their desktops.

They would tell their management it was “just a test system” or “a caching layer”, really anything other than “it’s a database”, because if they uttered the magic word “database”, the system would be slated for migration into the blessed realm of enterprise Oracle systems, never to be heard from again.

Logos

Meanwhile, Daryl Herzmann reminds us that the Iowa Mesonet has been on Team PostGIS since 2003.

Iowa Environmental Mesonet, Iowa State University

  • Data being managed in the database
    Meteorological Data, “Common” GIS datasets (roads, counties), Current and Archived NWS Tornado/Flash Flood/Thunderstorm Warnings, Historical Storm Reports, Current and Archived precipitation reports. Climate data
  • How the data is being accessed / manipulated
    From mapserver! Manipulated via Python and PHP.
  • Why you chose to use PostGIS for the application
    Open-Source. Uses my favorite DB, Postgres. Easy integration with mapserver. The support community is fantastic!

Further afield, the GIS portals of governments throughout Ukraine are running on software built on PostGIS.

Jørgen Larsen de Martino notes that:

The Danish Agency for Data Supply and Infrastructure uses PostGIS extensively - and have been using it for the last 10 years - we would not have had the success we have was it not for @PostGIS.

The Utah Geospatial Resource Center uses PostGIS to provide access to multiple spatial layers for direct access in a cloud-hosted PostGIS database called the “Open SGID”. (I can hear DBA heads exploding around the world.)

Counterpoint

While self-reporting is nice, sometimes just a little bit of dedicated searching will do. Interested in PostGIS use in the military? Run a search for “postgis site:mil” and see what pops up!

The 108th wing of the Air Force! Staff Sgt. Steve De Leon is hard at it!

“I’m taking all the data sources that AMC and A2 compile and indexing them into the PostgreSQL/PostGIS data and then from there trying to script Python code so the website can recognize all the indexed data in the PostgreSQL/PostGIS database,” said De Leon.

The Canadian Department of National Defense is building Maritime Situational Awareness Research Infrastructure with a PostgreSQL/PostGIS standard database component.

PostgreSQL with its PostGIS extension is the selected DBMS for MSARI. To ease maintenance and access, if more than one database are used, PostgreSQL will be selected for all databases.

The Coast Guard’s “Environmental Response Management Application (ERMA)” is also running PostGIS.

The application is based on open source software (PostgreSQL/PostGIS, MapServer, and OpenLayers), that meet Open Geospatial Consortium (OGC) specifications and standards used across federal and international geospatial standards communities. This ensures ERMA is compatible with other commercial and open-source GIS applications that can readily incorporate data from online data projects and avoids licensing costs. Open-source compatibility supports data sharing, leverages existing data projects, reduces ERMA’s maintenance costs, and ensures system flexibility as the technology advances. Because ERMA is open source, it can easily be customized to meet specific user requirements.

More logos?

Want to appear in this space? Email me!

June 21, 2022 08:00 AM

June 20, 2022

In my training program the last week of every four week block is dedicated to rest and recovery. I did much less running in week twelve, more bike riding, and some yoga. I did more household stuff, re-watched Stranger Things season three and started season four with my family, and got some excellent nights of sleep. Here are the running numbers.

  • 3 hours, 12 minutes

  • 15.0 miles

  • 2,231 ft D+

My Saturday run at Lory State Park unexpectedly coincided with the XTERRA Lory Triathlon but I did manage to find a parking spot at Arthur's Rock trailhead and do a loop that only barely intersected the mountain bike portion of the event. A lot of local runners were in Wyoming for the Bighorn Trail Run, so I had the remote parts of the park all to myself.

by Sean Gillies at June 20, 2022 11:16 PM

Last week, I wrote that getting large organizations to cop to using PostGIS was a hard lift, despite the fact that, anecdotally, I know there is massive use of PostGIS in every sector, at every scale of institution.

Simple Clues

Here’s a huge tell that PostGIS is highly in demand: despite the fact that PostGIS is a relatively complex extension to build (it has numerous dependencies) and deploy (the upgrade path between versions can be complex) every single cloud offering of PostgreSQL includes PostGIS.

AWS, Google Cloud, Azure, Crunchy Bridge, Heroku, etc, etc. Also forked not-quite-Postgres things like Aurora and AlloyDB. Also not-Postgres-but-trying things like Cockroach and Yugabyte.

If PostGIS was a niche hobbyist project…? Complete the sentence any way you like.

Logos

True to form, I received a number of private messages from people working in or with major institutions you have heard of, confirming their PostGIS use, and the fact that the institution would not publicly validate it.

However, I also heard from a couple medium sized companies, which seem to be the only institutions willing to talk about how useful they find open source in growing their businesses.

Hailey Eckstrand of Foundry Spatial writes to say:

Foundry Spatial uses PostGIS in development and production. In development we use it as our GIS processing engine and warehouse. We integrate spatial data (often including rasters that have been loaded into PostGIS) into a watershed fabric and process summaries for millions of watersheds across North America. We often use it in production with open source web tooling to return results through an API based on user input. One of our more complex usages is to return raster results within polygons and along networks within a user supplied distance from a click location. We find the ease and power of summarizing and analyzing many spatial datasets with a single SQL query to be flexible, performant, efficient, and… FUN!

Dian Fay of Understory writes in:

We use PostGIS at Understory to track and record storms, manage fleets of weather stations, and optimize geographic risk concentration for insurance. PostGIS lets us do all this with the database tools we already know & love, and without severing the connections between geographic and other categories of information.

More logos?

Want to appear in this space? Email me!

June 20, 2022 08:00 AM

June 19, 2022

The latest v0.10 release is now available from conda-forge.

This release contains some really cool new algorithms:

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our recently opened discussion forum.

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

Besides other examples, the movingpandas-examples repo contains the following tech demo: an interactive app built with Panel that demonstrates different MovingPandas stop detection parameters.

To start the app, open the stopdetection-app.ipynb notebook and press the green Panel button in the Jupyter Lab toolbar:
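For intuition, the two stop-detection parameters the app exposes map onto a simple idea: a stop is a stretch of track whose positions stay within some maximum diameter for at least some minimum duration. The following pure-Python sketch illustrates that idea; it is not the actual MovingPandas implementation, which operates on trajectory objects:

```python
def find_stops(points, max_diameter, min_duration):
    """points: list of (t, x, y) sorted by time (t in seconds, x/y in metres).
    Return (start_t, end_t) windows where all positions fit in a bounding
    box no wider than max_diameter, lasting at least min_duration."""
    stops, i = [], 0
    while i < len(points):
        j = i
        xs, ys = [points[i][1]], [points[i][2]]
        while j + 1 < len(points):
            x, y = points[j + 1][1], points[j + 1][2]
            if (max(xs + [x]) - min(xs + [x]) <= max_diameter
                    and max(ys + [y]) - min(ys + [y]) <= max_diameter):
                xs.append(x); ys.append(y); j += 1
            else:
                break
        if points[j][0] - points[i][0] >= min_duration:
            stops.append((points[i][0], points[j][0]))
            i = j + 1
        else:
            i += 1
    return stops

# A track that lingers near the origin for 3 minutes, then jumps away.
track = [(0, 0, 0), (60, 2, 1), (120, 1, 2), (180, 3, 1), (240, 120, 80)]
print(find_stops(track, max_diameter=10, min_duration=120))  # [(0, 180)]
```

Shrinking the diameter or lengthening the duration makes the detector stricter, which is exactly the trade-off the app lets you explore interactively.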

by underdark at June 19, 2022 12:04 PM

June 18, 2022

In Marzano, San Terenziano (San Ransiàn) is celebrated on September 1st or the first Sunday of September, even though the church of Marzano is dedicated to San Bartolomeo (whose feast falls a week earlier, on August 24th).

Granted that it is in any case a generic “end-of-summer festival”, it is interesting to note that both saints are protomartyrs.

There is a recurring association between San Terenziano and San Bartolomeo, albeit in neighboring localities: Rezzoaglio (GE), where San Bartolomeo is celebrated at Magnasco, Leivi (GE), and Cavriago (RE). However, the cult of San Bartolomeo is widespread throughout Italy, whereas that of San Terenziano is much rarer.

The website www.encyclocapranica.it is no longer online, but years ago I found there this interesting list, taken from a printed publication (unfortunately it does not seem to be archived even on the Wayback Machine). The topic is taken up again on the Capranica Storica website, also edited by Massimo Brizzolara. The only correction made concerns the locality of Rosso, which is in the municipality of Davagna, not Lavagna.

  1. Recco (Municipality of Recco, Archdiocese of Genoa): a chapel dedicated to the Saint is recorded, changed to San Rocco between the 15th and 16th centuries after a severe plague (concurrent cult of San Rocco);
  2. Premanico (formerly part of the Municipality of Apparizione, now Municipality of Genoa, Archdiocese of Genoa): ruins remain of a small church dedicated to the Saint (concurrent cult of San Rocco);
  3. Pino (Municipality of Genoa, Archdiocese of Genoa): parish church dedicated to San Michele and Terenziano, patron saint of the place. The so-called oil of San Terenziano is remembered here: blessed and distributed to the population during the feast, it was held to have healing powers against rheumatism and arthritis. There is also a confraternity named after Santi Michele e Terenziano;
  4. Rosso (Municipality of Davagna, Archdiocese of Genoa): locality named after the Saint;
  5. Teriasca (Municipality of Sori, Archdiocese of Genoa): the local church of San Lorenzo was originally built under the title of Santi Lorenzo e Terenziano (concurrent cult of San Rocco);
  6. Fumeri (Municipality of Mignanego, Archdiocese of Genoa);
  7. Rezzoaglio (Diocese of Bobbio-Piacenza): patron saint of the place (concurrent cult of San Rocco);
  8. Leivi (Diocese of Chiavari): with a chapel dedicated to saints Terenziano and Desiderio;
  9. San Terenziano (Municipality of Leivi): patron saint of the place;
  10. Nicorvo (Diocese of Vigevano): patron saint of the place, with its own parish church and a rural church;
  11. Rompeggio (Municipality of Ferriera, Diocese of Piacenza-Bobbio): patron saint of the place;
  12. Ebbio (Municipality of Bettola, Diocese of Piacenza-Bobbio): patron saint of the place, with its own parish church;
  13. Groppo Ducale (Municipality of Bettola, Diocese of Piacenza-Bobbio);
  14. Isola di Compiano (Municipality of Compiano, Diocese of Parma): patron saint of the place; a “Fiera Millenaria di San Terenziano” is held there during the feast;
  15. Gorro (Municipality of Borgotaro, Diocese of Parma): patron saint of the place;
  16. Fraore (Municipality of Parma, Diocese of Parma): patron saint of the place, with its own parish church;
  17. Soragna (Diocese of Parma): a fair in honor of the Saint inside the Rocca is recorded;
  18. San Terenziano (Municipality of Cavriago, Diocese of Reggio Emilia): with its own parish church;
  19. Raiano di Cornio (Diocese of Sulmona-Valva): with an 11th-century hermitage dedicated to the Saint;
  20. Capua (Archdiocese of Capua): a church dedicated to the Saint, attested as early as the 13th century, is recorded; Frederick II of Swabia had it demolished to make room for the construction of the city gate;
  21. Tortona (Diocese of Tortona): a bishop Terenziano, martyred in 186 AD, is remembered there, but it is not certain that he is the same Terenziano of Todi;
  22. San Terenziano (Municipality of Gualdo Cattaneo, Diocese of Todi): with a parish church and the tomb of the Saint;
  23. Teano (Diocese of Teano): with relics (an arm) and an approved office;
  24. Capranica (Diocese of Civitacastellana): patron saint of the place, with relics (the skull and an arm), an approved office and its own church. There is a confraternity named after Santi Terenziano e Rocco;
  25. Todi (Diocese of Todi): office with its own lessons and double rite of the second class; the feast is celebrated in the city and throughout the diocesan territory on September 1st each year.

I first came across a synthesis with historical hypotheses about this diffusion in connection with Rezzoaglio, a document from a few years ago but still very interesting.

The cult is concentrated in central Italy and, above all, in the Ligurian-Emilian Apennines and the Genoa area. From the list of 25 places, to which Marzano is added, I created a digital map. Whoever produced this video certainly used the same list. Other references, though without lists, can be found on the santiebeati.it website. A fairly complete list is on CathoPedia. All these lists are incomplete in any case, but they have the merit of recording information that is otherwise hard to find, a bit like the note associating San Terenziano with Marzano.

Map of the diffusion of the cult of San Terenziano in Italy

From a geographical point of view, Marzano falls squarely within the attested zone; indeed, it is almost a connecting point between the occurrences on the Ligurian side and those on the Po side.

Map of the diffusion of the cult of San Terenziano in Liguria and the northern Apennines

As we have already had occasion to see, historical information about Marzano is rather scant, but the first chapel, which preceded the church, must already have been dedicated to San Terenziano.

by Stefano Costa at June 18, 2022 01:23 PM

June 16, 2022

Once again I've fallen behind on blogging about running, but will catch up. Week ten started out pretty well, but hay fever and lethargy were dragging me down at the end. I struggled on steep trails at Greyrock Saturday and then opted for more mellow long miles on Sunday. I've blogged about the route to Greyrock before, it's amazing.

  • 9 hours, 33 minutes

  • 44.0 miles

  • 6,932 ft D+

https://live.staticflickr.com/65535/52129550517_14ab194fdf_b.jpg

View south over the Greyrock meadows on a hot Saturday.

I felt much better in week eleven. I did some hard hill intervals on Wednesday and back-to-back long runs with plenty of climbing on the weekend.

  • 11 hours, 45 minutes

  • 50.0 miles

  • 9,633 ft D+

Saturday I ran in toasty warm conditions at Lory and Horsetooth. Even though I cut it short, this run had the most elevation gain of my season. After I got home and saw that more hot weather was in store for Sunday, I decided to head to the high country on Sunday. I succeeded in getting a timed entry reservation for Rocky Mountain National Park (RMNP), an hour away by car, and did a long loop from the Cub Lake trailhead.

https://live.staticflickr.com/65535/52152415846_de0a8e33c5_b.jpg

Cub Lake trailhead in RMNP.

Sunday's run didn't have as much D+ as Saturday's, but it did start above 8,000 ft and went for more than 3 miles above 10,000 ft. I was traveling exclusively on snow for about two hours. Very comfortable, but very slow, and in one spot a bit sketchy. On the descent to Odessa Lake, the well-engineered trail crosses a steep gully, no problem in summer, but in spring the gully and trail are completely covered with snow and a slip on this traverse could result in a long downhill ride. It was exciting! The party ahead of me on the trail almost turned back.

https://live.staticflickr.com/65535/52152415851_ccc89eb97f_b.jpg

Approaching the pass between Bear Lake and Odessa Lake.

In the last mile of the run I saw a big ole male moose just a few yards off the trail. This is normal for an outing in Moraine Park. On the drive home I saw a chonky black bear on the right bank of the Big Thompson River. I would have pulled over to get a photo to share here except the shoulder was completely filled with cars and my phone was dead.

by Sean Gillies at June 16, 2022 10:22 PM

This post aims to show you how to create quick interactive apps for prototyping and data exploration using Panel.

Specifically, the following example demos how to add geocoding functionality based on Geopy and Nominatim. As such, this example brings together tools we’ve previously touched on in Super-quick interactive data & parameter exploration and Geocoding with Geopy.

Here’s a quick preview of the resulting app in action:

To create this app, I defined a single function called my_plot which takes the address and desired buffer size as input parameters. Using Panel’s interact and servable methods, I’m then turning this function into the interactive app you’ve seen above:

import panel as pn
from geopy.geocoders import Nominatim
from utils.converting import location_to_gdf
from utils.plotting import hvplot_with_buffer

locator = Nominatim(user_agent="OGD.AT-Lab")

def my_plot(user_input="Giefinggasse 2, 1210 Wien", buffer_meters=1000):
    location = locator.geocode(user_input)
    geocoded_gdf = location_to_gdf(location, user_input)
    map_plot = hvplot_with_buffer(geocoded_gdf, buffer_meters, 
                                  title=f'Geocoded address with {buffer_meters}m buffer')
    return map_plot.opts(active_tools=['wheel_zoom']) 

kw = dict(user_input="Giefinggasse 2, 1210 Wien", buffer_meters=(0,10000))

pn.template.FastListTemplate(
    site="Panel", title="Geocoding Demo", 
    main=[pn.interact(my_plot, **kw)]
).servable();

You can find the full notebook in the OGD.AT Lab repository or run this notebook directly on MyBinder:

To open the Panel preview, press the green Panel button in the Jupyter Lab toolbar:

I really enjoy building spatial data exploration apps this way, because I can start off with a Jupyter notebook and – once I’m happy with the functionality – turn it into a pretty app that provides a user-friendly exterior and hides the underlying complexity that might scare away stakeholders.

Give it a try and share your own adventures. I’d love to see what you come up with.

by underdark at June 16, 2022 04:19 PM

Background

Understanding which regions QGIS is being used in, which versions are in active use, which platforms it is being used on, and how many users we have is hugely beneficial to our ability as a project to serve our users. Back in 2017 at the bi-annual QGIS hackfest in Nødebo, Denmark, we had a long discussion about key project goals and the need to better understand our user base in order to plan the future direction of the project and allocate funding and resources to where they are needed most.

Typically proprietary software vendors have ready access to detailed user data through telemetry code which they embed in their software. This telemetry code ‘phones home’ key metrics, which together with other techniques such as license sales analysis gives them a very detailed insight into their user base. The data these vendors collect is typically not shared, so their users do not benefit from being able to understand how their data is used.

For QGIS.org, resorting to what are generally considered nefarious and privacy-invading techniques of siphoning data from our users goes against the ethos we try to promote as an open project. Further, since QGIS is freely available and doesn’t require any self-registration, we do not have a user database we can consult for such analytics. Additional factors make understanding usage levels hard. For example, a single user can download a copy of a QGIS installer and distribute it to many other users, and conversely web crawlers and bots can download many copies of QGIS installers and never install them. Because of this, simply counting the number of downloads from our website does not give a useful picture of our user base.

So we needed to come up with an approach that:

  1. Does not invade our users’ privacy
  2. Does not require including telemetry code in QGIS which exfiltrates user information from their system
  3. Does not store any user-identifiable data on our servers
  4. Is open and transparent in the data collection methodology
  5. Openly shares the insights we gain from our analytics to the broader community

The most obvious privacy-respecting way we could find to understand more about our users was to collect metrics of access to the QGIS News Feed. In order to display the latest news on startup, QGIS Desktop makes a request to https://feed.qgis.org when it is opened. On the server that hosts the feed, we can then use the web server logs to understand which operating system and version of QGIS made the news feed request, since these details are included in the User-Agent header sent by QGIS. Additionally, using the GeoIP library we can resolve each request to the country from which it originated.

This process is anonymous, transparent, and simple to disable. It does not identify unique machines. Only one event is logged per unique network per hour. Only one event is logged per QGIS installation per day, and the event is only triggered when the user opens the QGIS Desktop application.

Operating system statistics are derived from QGIS version information, and no system fingerprinting or telemetry is implemented.

Location information is derived from the request source IP address, which is immediately discarded on the server after resolving it to the country of origin.

No logging occurs on the QGIS News Feed server for legacy installations that lack the news feed feature, for offline usage of QGIS, or for installations where feed collection is disabled (see below for how to disable it). Statistics will also be skewed where atypical networking infrastructure is in use, such as a virtual private network.

Despite these caveats, the statistics should provide a good high-level overview of how QGIS is being used, such as the breakdown of QGIS across operating systems and versions – information that is incredibly useful to the QGIS developer team. Only the following four pieces of information are collected:

  • The date (aggregated by day)
  • The QGIS version
  • The Operating System
  • Country (based on IP which is immediately discarded)
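The server-side processing amounts to a small log-parsing step. The sketch below is illustrative only: the log line and User-Agent shape are made up for the example (QGIS sends a User-Agent of roughly the form "Mozilla/5.0 QGIS/<version>", though the exact format can vary by build), and the country resolution is only noted in a comment since GeoIP requires its own database.

```python
import re

# Illustrative access-log line; the User-Agent shape here is an assumption,
# not the exact production format.
LOG_LINE = ('203.0.113.7 - - [16/Jun/2022:08:00:01 +0000] "GET / HTTP/1.1" '
            '200 512 "-" "Mozilla/5.0 QGIS/32205/Windows 10 Version 2009"')

UA_PATTERN = re.compile(r'QGIS/(?P<version>\d+)(?:/(?P<os>[^"]*))?')

def parse_event(line):
    """Extract the QGIS version and OS from one log line. The source IP
    would be resolved to a country with GeoIP and then discarded."""
    match = UA_PATTERN.search(line)
    if match is None:
        return None  # not a QGIS client (e.g. a crawler or bot)
    return {
        "version": match.group("version"),       # e.g. "32205" -> QGIS 3.22.5
        "os": match.group("os") or "unknown",
    }

print(parse_event(LOG_LINE))
```

Aggregating these parsed events by day, version, OS and country yields exactly the four fields listed above, with nothing user-identifiable retained.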

Opting out

If you wish to opt-out of this data collection, simply disabling the feed retrieval, using QGIS offline, or blocking access to the QGIS RSS feed address (feed.qgis.org) on your network will exclude you from this process. QGIS Desktop provides options for disabling version checking and feed access under Settings ➔ Options ➔ General ➔ Application. Note that by default this setting is specific to each individual user profile.

Viewing the analytics

We have made a dashboard publicly available at https://analytics.qgis.org. The dashboard was made using the fantastic open-source Metabase analytics package.

Credits: This post was written by Charles Dixon-Paver and Tim Sutton

by Tim Sutton at June 16, 2022 02:15 PM


Lizmap on the French interministerial open source software database

Lizmap is now referenced in the French interministerial open source software database (Socle interministériel de logiciels libres, SILL). You can consult the Lizmap software entry. It was created by a SILL referent: a public servant who uses Lizmap.

Before this entry could be created, two other entries had to exist: Lizmap on the Comptoir du libre and Lizmap on Wikidata.

French interministerial open source software database

The French interministerial open source software database (Socle interministériel de logiciels libres, SILL) is the reference catalog of free software recommended by the French State for the whole administration.

It is published by Etalab's open source software division (DINUM) on the code.gouv.fr website. Its management interface is accessible on the website sill.etalab.gouv.fr.

It is built collaboratively by a community of public servants, the SILL referents. As a public servant using Lizmap, you can become a Lizmap referent.

You will also find the sheets:

The comptoir du libre

The Comptoir du Libre lists free business software useful to public services, along with its users and service providers.

It is a marketplace for free business software. Its purpose is to share information about this software and to put interested parties in contact. Public authorities browsing this platform will find:

  • their counterparts using this free software, with their testimonials,
  • companies providing services (maintenance, training, support...),
  • quantitative indicators for comparing solutions.

The Comptoir du Libre is a service of Adullact. As a Lizmap user, you can:

  • declare yourself a user
  • leave a testimonial
  • add screenshots
  • list software working with Lizmap

You will also find the sheets:

Wikidata

Wikidata is a free, collaborative, multilingual, secondary database, collecting structured data to provide support for Wikipedia, Wikimedia Commons, the other wikis of the Wikimedia movement, and anyone in the world.

Wikidata is a central storage repository that can be accessed by others, such as the wikis maintained by the Wikimedia Foundation. Content loaded dynamically from Wikidata does not need to be maintained in each individual wiki project. For example, statistics, dates, locations, and other common data can be centralized in Wikidata.

The Lizmap entry on Wikidata contains the main data and a description in English and French. This entry can be extended and the description translated into other languages.

You will also find the sheets:

René-Luc D'Hont

by René-Luc D'HONT at June 16, 2022 09:00 AM

Today we want to congratulate Carles Martí Montolío, who has been awarded the runner-up prize in the sixth edition of the Pedro R. Muro-Medrano awards, which recognize the best final-year projects in the field of Geographic Information Infrastructures and the open standards that support them.

The project Carles developed, "Implementación del plugin ETL para la plataforma gvSIG online", consists of a series of developments aimed at automating data transformation tasks, whether repetitive or not, so that data need not be manipulated through code. Any user can thus manipulate data (geometrically or otherwise) or homogenize data coming from different sources and formats. This is made possible by a canvas that graphically represents the data transformation process in an easy, intuitive way.

More information about the award here.

Congratulations, Carles!

by Alvaro at June 16, 2022 07:11 AM

This is part of a series of blog posts updating the QGIS community on the outcome of the funding we raised in late 2021 to improve elevation and point cloud support, in collaboration with North Road and Hobu. For other updates, see part 1 and part 2.

Profile tool

With the new integrated profile tool, you can generate cross sections of point clouds, raster, vector and mesh data. For more information on this tool, you can see the excellent video introduction by North Road who implemented this part of the project.

To be able to view profiles from different data types, there is now a dedicated Elevation section under layer properties. Users can set the elevation source, style and a few other options there. You can then open the elevation profile widget from the main menu in QGIS: View > Elevation Profile.

Elevation Profile tool in QGIS

Support for COPC

Cloud Optimized Point Cloud (COPC) is a new format for point cloud data and QGIS 3.26 comes with support for it (for both local files and data hosted on remote servers).

COPC is a very exciting addition to the ecosystem, because it is “just” a LAZ file (a format well established in the industry) that brings some interesting extra features. This means all software supporting LAZ file format will also be able to read COPC files without any extra development. If you are familiar with Cloud Optimized GeoTIFF (COG) for rasters, COPC is an extension of the same concept for point cloud data. Read more at https://copc.io/

Ordinary LAS/LAZ files have the problem that it is not possible to efficiently read a subset of the data without reading the entire file. This matters less when processing point cloud data, but much more for point cloud viewers, which typically show only a small portion of the data (e.g. zoomed in to a particular object or zoomed out to show the entire dataset). For that reason, viewers need to index (pre-process) the data before being able to show it - QGIS also does this indexing when a point cloud file is first loaded. The new feature COPC brings is that the data is re-organized so that reading just parts of it is efficient and easy. When loading COPC files, QGIS can therefore show them immediately, without any indexing (which takes time and extra storage).

In addition to that, COPC files can also be used efficiently directly from remote servers - clients such as QGIS can request just the small portions of data they need, without having to download the entire file (which can be many gigabytes in size). This makes dissemination of point cloud data easier than before - just make COPC files available through a static server and clients are ready to stream the data.
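The "fetch only what you need" behaviour can be illustrated with plain byte-range reads. In a real COPC file, hierarchy pages map each octree node to an (offset, size) pair inside the LAZ file; the node keys and offsets below are made up purely for illustration, and a remote client would perform the same read as an HTTP Range request instead of a local seek.

```python
import io

# Toy stand-in for a .copc.laz payload. In a real file, each octree node's
# compressed point chunk lives at an (offset, size) recorded in the COPC
# hierarchy; these values are invented for the sketch.
data = bytes(range(256)) * 4
hierarchy = {
    "0-0-0-0": (0, 64),    # hypothetical root node: offset 0, 64 bytes
    "1-0-0-0": (64, 32),   # hypothetical child node
}

def read_node(stream, node_key):
    """Fetch only the byte slice for one octree node - the equivalent of
    an HTTP Range request when the file sits on a remote server."""
    offset, size = hierarchy[node_key]
    stream.seek(offset)
    return stream.read(size)

chunk = read_node(io.BytesIO(data), "1-0-0-0")
print(len(chunk))  # 32 - only this node's bytes were read, not the whole file
```

A viewer repeats such reads for just the octree nodes visible at the current zoom level, which is why no up-front indexing is needed.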

A small note: until now, QGIS indexed point cloud files to EPT format upon first load. From QGIS 3.26 we have switched to indexing to COPC - it has the advantage of being a single file rather than lots of small files in a directory. If you already have point cloud data indexed in EPT format, QGIS will keep using the EPT index (rather than re-indexing to COPC).

Display of a remote COPC file


Classified renderer improvements

The classified renderer for point clouds has been improved to:

  • Show only the classes present in the dataset (instead of a hard-coded list), including non-standard classes
  • Show the percentage of points in each class
  • Work with other attributes too (return number, number of returns, point source ID and a few others)

Point cloud classification

Vector transparency in 3D scene

This improvement is not part of the crowdfunding campaign and was exclusively funded by the Swedish QGIS user group, but it is certainly relevant to the audience of this blog post!

With this feature, you can set polygon transparency in 3D scenes.

3D vector transparency

Want to see more features?

We are trying to improve how QGIS handles point clouds for visualisation and analysis. If you would like to see certain features added to QGIS, do not hesitate to contact us at info@lutraconsulting.co.uk with your idea(s).

June 16, 2022 06:00 AM

June 15, 2022

This year the course-workshop "Tecnologías de la Información Geográfica y gvSIG Batoví" goes one step further... To broaden the Uruguayan experience, this time the call is extended to teachers in Colombia and Mexico. The course is aimed at pre-university teachers of Geography and other areas related to geographic, environmental and social knowledge.

gvSIG Batoví is a Geographic Information System for educational environments, created as an adaptation of the free software developed by the gvSIG Association.

The training will take place from June 30 to July 29 in b-learning mode (platform + videoconference workshop).

As for the contest, it will run from September 12 to November 30 (approximately). Teams will be made up of students and at least one reference teacher, who must have taken part in one of the editions of the course. Each team must present a project that identifies and addresses a problem of local interest, has a territorial dimension, and falls within one of the UN 2030 Sustainable Development Goals.

Organizers, collaborators and participants:

Colombia: Instituto Geográfico Agustín Codazzi and Gobernación de Cundinamarca. Mexico: Instituto Politécnico Nacional and Comisión Mexicana de Cooperación con la UNESCO (CONALMEX). Uruguay: Plan Ceibal, Dirección Nacional de Topografía (MTOP) and Inspección Nacional de Geografía y Geología (ANEP-DGES).

Links of interest:

Registration

Full information (PDF)

by Alvaro at June 15, 2022 08:17 AM

In our previous post we briefly explained how we improved our processing pipeline for the new Sentinel-2 cloudless 2021 release. In this post we will go deeper into the details of what changed and where we are technologically. Parts of these developments are results of the LOOSE project, funded by ...

June 15, 2022 12:00 AM

June 14, 2022

Freshwater is essential for life and development. For this reason, it is vitally important to carry out correct planning, management and analysis of this natural resource. The national and regional water administrations of each country carry out this work (hydrological plans, characterization studies, monitoring of the water balance...). And to achieve it, the study, delimitation and preservation of river basins is essential.

The EU Water Framework Directive defines a hydrographic basin as "the area of land from which all surface run-off flows through a sequence of streams, rivers and, possibly, lakes into the sea at a single river mouth, estuary or delta". The river basin, as a resource management unit, is considered indivisible. Each basin is in turn divided into sub-basins, defined as the area of land from which all surface run-off flows through a series of streams, rivers and, possibly, lakes to a particular point in a water course (normally a lake or a river confluence).

Work context

At iCarto we have been working with the National Administration of Water Resources Management (DNGRH) and with the Regional Water Administrations (ARAs IP) of Mozambique since 2012, first on the SIXHIARA project and later through the Blue Deal program. Among these actions, the creation of SIRH: Water Users and Licenses stands out: the Water Resources Information System that all the ARAs currently use to manage their licenses and water users.

This system used information from the river and basin layers created by UDC's Water and Environmental Engineering Group (GEAMA), but it did not have a sub-basin layer. Nor did this type of information exist in any of the national or regional administrations, even though it is essential for proper planning and territorial management of water, especially in the country's large basins. The need to delimit the sub-basins of Mozambique was therefore evident, and the Dutch water authorities, through the Blue Deal program, decided to fund iCarto to carry out this activity.

The delimitation of sub-basins can be carried out through a hydrological analysis with a GIS application on a DEM (Digital Elevation Model). In our case we have used QGIS, a free open source GIS software, and the NASADEM global coverage model, since Mozambique does not have its own high-resolution model.
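The kind of hydrological analysis those GIS tools perform can be illustrated at toy scale. In the D8 method, each DEM cell drains to its steepest-descent neighbour, and a (sub-)basin is the set of cells whose flow path reaches a chosen outlet. The sketch below is a minimal pure-Python illustration on a made-up 4x4 DEM, not the production QGIS workflow (which also fills sinks and hydrologically conditions the DEM first):

```python
# Toy 4x4 DEM: elevations generally decrease toward the outlet at (3, 3).
DEM = [
    [9, 8, 7, 6],
    [8, 7, 6, 5],
    [7, 6, 5, 4],
    [6, 5, 4, 1],
]
ROWS, COLS = len(DEM), len(DEM[0])
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(r, c):
    """Return the steepest-descent neighbour of cell (r, c), or None at a pit."""
    best, best_drop = None, 0.0
    for dr, dc in NEIGHBOURS:
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS:
            dist = (dr * dr + dc * dc) ** 0.5   # diagonal steps are longer
            drop = (DEM[r][c] - DEM[nr][nc]) / dist
            if drop > best_drop:
                best, best_drop = (nr, nc), drop
    return best

def catchment(outlet):
    """All cells whose D8 flow path reaches the outlet cell."""
    basin = set()
    for r in range(ROWS):
        for c in range(COLS):
            cell, seen = (r, c), set()
            while cell is not None and cell not in seen:
                if cell == outlet:
                    basin.add((r, c))
                    break
                seen.add(cell)
                cell = d8_downstream(*cell)
    return basin

print(len(catchment((3, 3))))  # 16 - every cell drains to the outlet in this toy DEM
```

Delimiting a sub-basin is the same operation with the outlet placed at an interior point of the drainage network, such as a river confluence.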

If you want to know more about Digital Elevation Models and the process of delimiting sub-basins with QGIS, you will find more information in the following articles in this series:

  1. Delimitation of hydrographic sub-basins using QGIS and a DEM (this post)
  2. Digital elevation models for hydrological studies
  3. Technical criteria for the delimitation of sub-basins
  4. Sub-basin delimitation process using QGIS

The post Delimitation of hydrographic sub-basins using QGIS and a DEM appeared first on iCarto.

by iCarto at June 14, 2022 10:14 AM

June 13, 2022


The question of “who uses PostGIS” or “how big is PostGIS” or “how real is PostGIS” is one that we have been wrestling with literally since the first public release back in 2001.

There is no doubt that institutional acceptance is the currency of … more institutional acceptance.

Ouroboros

So naturally, we would love to have a page of logos of our major users, but unfortunately those users do not self-identify.

As an open source project PostGIS has a very tenuous grasp at best on who the institutional users are, and things have actually gotten worse over time.

Originally, we were a source-only project and the source was hosted on one web server we controlled, so we could literally read the logs and see institutional users. At the time mailing lists were the only source of project communication, so we could look at the list participants, and get a feel from that.

All that’s gone now. Most users get their PostGIS pre-installed by their cloud provider, or pre-built from a package repository.

So what do we know?

IGN

In the early days, I collected use cases from users I identified on the mailing list. My favourite was our first major institutional adopter, the Institut Géographique National, the national mapping agency of France.

IGN

In 2005, they decided to move from a desktop GIS paradigm for their nation-wide basemap (of 150M features), to a database-centric architecture. They ran a bake-off of Oracle, DB2 and PostgreSQL (I wonder who got PostgreSQL into the list) and determined that all the options were similar in performance and functionality for their uses. So they chose the open source one. To my knowledge IGN is to this day a major user of PostgreSQL / PostGIS.

GlobeXplorer

Though long gone as a brand, it's possible the image management system that was built by GlobeXplorer in the early 2000s is still spinning away in the bowels of Maxar.

MAXAR

GlobeXplorer was both one of the first major throughput use cases we learned about, and also the first one where we knew we'd displaced a proprietary incumbent. GlobeXplorer was one of the earliest companies explicitly serving satellite imagery to the web and via web APIs. They used a spatial database to manage their catalogue of images and prepared products. Initially it was built around DB2, but DB2 was a poor scaling choice. PostGIS was both physically faster and (more importantly) massively cheaper as scale went up.

RedFin

RedFin was a rarity: a use case found in the wild that we didn't have to track down ourselves.

RedFin

They described in some detail their path from MySQL to PostgreSQL, including the advantages of having PostGIS.

Using PostGIS, we could create an index on centroid_col, price, and num_bedrooms. These indexes turned many of our “killer” queries into pussycats.

Google

Google is not that big on promoting any technology they haven’t built in house, but we have heard individual Google developers confirm that they use core open source geospatial libraries in their work, and that PostGIS is included in the mix.

Google

The biggest validation Google ever gave PostGIS was in a press release that recognized that the set of “users of spatial SQL” was basically the same as the set of “PostGIS users”.

Our new functions and data types follow the SQL/MM Spatial standard and will be familiar to PostGIS users and anyone already doing geospatial analysis in SQL. This makes workload migrations to BigQuery easier. We also support WKT and GeoJSON, so getting data in and out to your other GIS tools will be easy.

They didn’t address their new release to “Esri users” or “Oracle users” or “MySQL users”, they addressed it to the relevant population: PostGIS users.

More!

Getting permission to post logos is hard. Really hard. I’ve watched marketing staff slave over it. I’ve slaved over it myself.

Major automaker? Check. Major agricultural company? Check. Major defence contractor? Check, check, check. National government? Check. State, local, regional? Check, check, check. Financial services? Check. Management consulting? Check.

Yes, PostGIS is real.

At some point, for a project with a $0 price point, you just stop. If a user can’t be bothered to do the due diligence on the software themselves, to reap all the advantages we offer, for free, I’m not going to buy them a steak dinner, or spoon feed them references.

That said! If you work for a major government or corporate institution and you are allowed to publicize your use of PostGIS, I would love to write up a short description of your use, for the web site and our presentation materials.

Email me!

June 13, 2022 08:00 AM

June 10, 2022

We will soon be teaching a free UNAM course on geomatics and injustices.

CONTENTS:

Conceptual Module
a) Spatial inequities: basic concepts from geography.
b) Territorial justice: basic concepts from the legal sciences.
c) The JUST-SIDE conceptual and methodological framework.
d) Technological sovereignty and free software.

Practical Module: open source tools for spatial analysis and map production:

a) The gvSIG platform.
b) Using R for spatial analysis.
c) Map editing and online

Open registration (until June 16): https://docs.google.com/forms/d/e/1FAIpQLSfRE3Y8fuaw7zWbQW2GNHjpVzlXJza1e9XpYE_FGV3F7D31Jg/viewform

by Alvaro at June 10, 2022 09:40 AM

June 09, 2022