Welcome to Planet OSGeo

August 18, 2022

Oscar Wilde said that a map of the world that does not include the territories of utopia is not worth even a glance, for when Humanity sights better lands in the distance, it always sets sail for them. Progress, the writer said, is the realisation of utopias.

This year brings the 18th International gvSIG Conference, a project that in its early days was dismissed as utopian and unachievable, and that eighteen years later is more active than ever. After two years in which the event had to be held online, it returns to an in-person format in València, the city that has become, in its own right, one of the reference centres for geomatics, the science and technology applied to territorial management.

We return in the best possible way, joining forces with the GeoLIBERO Network, promoted by CYTED, the Ibero-American Programme of Science and Technology for Development. It is a network that brings together some of the leading institutions and people in free geomatics research. Coordinators of the various GeoLIBERO research groups, from all over Ibero-America, will present their work at the conference.

"Boosting the knowledge economy", the motto of the event, highlights one of the main pillars of the gvSIG project. Recent times have made us more aware of the need to be independent in critical sectors such as energy, healthcare, defence and, of course, technology. Technological sovereignty, one of gvSIG's watchwords, must be intrinsically linked to economic sovereignty.

Today more than ever, we need to support projects that embrace new business models based on collaboration, solidarity and shared knowledge. It is time to break definitively with technological dependence, with models that generate no economy, only expense. It is time to maintain and strengthen technologies built on the concepts of cooperation and sustainability. It is the moment of gvSIG and free geomatics.

by Alvaro at August 18, 2022 08:31 AM


August 14, 2022

Week twenty was supposed to be a rest week. I took it pretty easy and did just one long high-altitude run on Friday. From the Bear Lake Road Park and Ride lot in Rocky Mountain National Park, I ran past Bierstadt Lake, up to Flattop Mountain (12,324 ft), Ptarmigan Pass, and back. 16 miles and 4,000 ft of elevation gain in all. I saw moose grazing in Bierstadt Lake, elk at Ptarmigan Pass, and pika all over the Flattop Mountain trail. This run dominated my numbers for the week.

  • 7 hours, 3 minutes

  • 26.1 miles

  • 5,020 ft D+

https://live.staticflickr.com/65535/52285769639_c86ca04d94_b.jpg

Moose in Bierstadt Lake

https://live.staticflickr.com/65535/52285502106_991695d13a_b.jpg

Tonahutu Trail at Ptarmigan Pass

https://live.staticflickr.com/65535/52285502136_1d6e89d0b0_b.jpg

Odessa Lake, Lake Helene, Two Rivers Lake (left to right)

I want to get ~110 miles and 15,000 ft of climbing in over the next two weeks. I hope weather and my health continue to stay good.

by Sean Gillies at August 14, 2022 06:33 PM

Cartographers use all kinds of tricks to make their maps look deceptively simple. Yet, anyone who has ever tried to reproduce a cartographer’s design using only automatic GIS styling and labeling knows that the devil is in the details.

This post was motivated by Mike Hall’s retro map style.

There are a lot of things going on in this design but I want to draw your attention to the labels – and particularly their background:

Detail of Mike’s map (c) Mike Hall. You can see that the rail lines stop right before they would touch the A in Valencia (or any other letters in the surrounding labels).

This kind of effect cannot be achieved by good old label buffers because no matter which color we choose for the buffer, there will always be cases when the chosen color is not ideal, for example, when some labels are on land and some over water:

Ordinary label buffers are not always ideal.

Label masks to the rescue!

Selective label masks enable more advanced designs.

Here’s how it’s done:

Selective masking has actually been around since QGIS 3.12. There are two things we need to take care of when setting up label masks:

1. First we need to enable masks in the label settings for all labels we want to mask (for example the city labels). The mask tab is conveniently located right next to the label buffer tab:

2. Then we can go to the layers we want to apply the masks to (for example the railroads layer). Here we can configure which symbol layers should be affected by which mask:

Note: The order of steps is important here since the “Mask sources” list will be empty as long as we don’t have any label masks enabled and there is currently no help text explaining this fact.
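If you prefer scripting your styling, step 1 can also be done in PyQGIS. Here is a minimal sketch (QGIS ≥ 3.12), assuming `cities` is a vector layer with simple labeling already configured; assigning the mask sources from step 2 is still easiest to do in the Layer Styling panel as shown above:

    from qgis.core import QgsVectorLayerSimpleLabeling

    settings = cities.labeling().settings()  # QgsPalLayerSettings
    fmt = settings.format()                  # QgsTextFormat
    mask = fmt.mask()                        # QgsTextMaskSettings (since 3.12)
    mask.setEnabled(True)                    # same as ticking "Enable mask"
    mask.setSize(1.5)                        # mask buffer size in mm
    fmt.setMask(mask)
    settings.setFormat(fmt)
    cities.setLabeling(QgsVectorLayerSimpleLabeling(settings))
    cities.triggerRepaint()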

I’m also using label masks to keep the inside of the large city markers (the ones with a star inside a circle) clear of visual clutter. In short, I’m putting a circle-shaped character, such as ◍, over the city location:

In the text tab, we can specify our one-character label and – later on – set the label opacity to zero.
To ensure that the label stays in place, pick the center placement in “Offset from Point” mode.

Once we are happy with the size and placement of this label, we can then reduce the label’s opacity to 0, enable masks, and configure the railroads layer to use this mask.

As a general rule of thumb, it makes sense to apply the masks to dark background features such as the railways, rivers, and lake outlines in our map design:

Resulting map with label masks applied to multiple labels including city and marine area labels masking out railway lines and ferry connections as well as rivers and lake outlines.

If you have never used label masks before, I strongly encourage you to give them a try next time you work on a map for public consumption because they provide this little extra touch that is often missing from GIS maps.

Happy QGISing! Make maps not war.

by underdark at August 14, 2022 04:50 PM

August 13, 2022

I spent most of week nineteen on the hot and humid coast of North Carolina. I did three hour-long barefoot runs on the beach, some bodyweight strength training, and three hour-long surfing lessons. Back at home on the weekend, I did one long run at Horsetooth Open Space. Two trips to the summit from the parking lot. That one run accounted for 3,900 ft of my week's elevation gain. There aren't any hills on Hatteras Island. Here are the numbers for the week.

  • 7 hours, 16 minutes

  • 29.5 miles

  • 3,973 ft D+

https://live.staticflickr.com/65535/52282318222_b03485c7da_b.jpg

by Sean Gillies at August 13, 2022 06:41 PM

August 12, 2022

The latest v0.11 release is now available from conda-forge.

This release contains some really cool new algorithms:

  • New minimum and Hausdorff distance measures #37
  • New functions to add a timedelta column and get the trajectory sampling interval #233 

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

The new distance measures are covered in tutorial #11:

Computing distances between trajectories, as illustrated in tutorial #11

Computing distances between a trajectory and other geometry objects, as illustrated in tutorial #11
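To try the new functions outside the notebooks, here is a minimal sketch; the method names follow the release notes and tutorial #11, so treat it as an illustration rather than the canonical API:

    import pandas as pd
    import geopandas as gpd
    import movingpandas as mpd
    from shapely.geometry import Point

    def make_traj(coords, traj_id):
        times = pd.date_range("2022-08-12 12:00", periods=len(coords), freq="1min")
        gdf = gpd.GeoDataFrame(
            {"geometry": [Point(x, y) for x, y in coords]},
            index=times, crs="EPSG:32633")   # projected CRS, distances in meters
        return mpd.Trajectory(gdf, traj_id)

    traj_a = make_traj([(0, 0), (10, 0), (20, 0)], 1)
    traj_b = make_traj([(0, 5), (10, 5), (20, 5)], 2)

    print(traj_a.distance(traj_b))            # new minimum distance (#37)
    print(traj_a.hausdorff_distance(traj_b))  # new Hausdorff distance (#37)
    traj_a.add_timedelta()                    # new timedelta column (#233)
    print(traj_a.get_sampling_interval())     # sampling interval (#233)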

But don’t miss the great features covered by the other notebooks, such as outlier cleaning and smoothing:

Trajectory cleaning and smoothing, as illustrated in tutorial #10

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.

by underdark at August 12, 2022 02:00 PM

August 08, 2022

When we talk about geolocation, we refer to the ability of a device to obtain the geographical position where it is located. To do this, it can use the WiFi access point it is connected to, the mobile phone tower it is linked to (in the case of phones or other mobile devices with a SIM card), or GPS, when the device has one. These options can even be combined with each other to improve the precision that each of them offers separately.

On mobile devices, this ability to obtain the geolocation greatly widens the range of applications that can be developed: apps that inventory information including the geographical position of each element, record routes, retrieve information about what is nearby, and so on. The possibilities are almost endless.

Recently, at iCarto, we needed to develop a mobile application to record the route followed while collecting wild mushrooms. We opted to build this application using web technologies adapted to mobile development, specifically the Ionic framework with ReactJS. The project had particular requirements regarding precision and the ability to keep collecting route points even while the application was in the background and the device was being used for something else. That made us look at the different geolocation plugins available for Ionic. In this article we want to share this analysis, as well as some of the conclusions and lessons learned along the way.

When building a mobile application with web technologies, by default we can use the W3C Geolocation API that browsers implement. However, using a dedicated geolocation plugin is recommended, which we can enable if we use Cordova to package the mobile application or, in our particular case, Capacitor, the tool provided by Ionic. This is good practice in general, since it ensures the functionality is available regardless of the browser version on the mobile device. Remember that with this type of technology, our application ultimately runs inside a web browser on the device. In our case a plugin was, moreover, mandatory, because we needed specific functionality that the Geolocation API does not offer, such as support for continuing to obtain the location while the application is in the background.

Cordova vs. Capacitor

Before going into the analysis of geolocation plugins, it is worth pausing to review the two main alternatives for packaging mobile applications built with web technologies.

Apache Cordova was one of the first environments for developing mobile applications using web technologies (CSS, HTML and JavaScript) instead of platform-specific APIs. This has its advantages and disadvantages, which we are not going to discuss now, since that would take several posts. The Ionic project also started out using Cordova as its application packaging tool, but more recently released its own: Capacitor. On the project website they explain the reason for this change and the advantages Capacitor brings.

One of the advantages of Capacitor is that it is fully compatible with Cordova, so we can use plugins that are only available for the latter, or even prefer the Cordova version of a plugin when an equivalent exists, if it suits us for any reason. This, for example, allowed us to widen the range of plugins to review.

Precisely because Cordova has been around much longer, the number of plugins available for it, both official and community-developed, is much greater today than for Capacitor which, despite also having quite a few official and community plugins, still has fewer due to its youth.

Geolocation plugins

Both Cordova and Capacitor have a default geolocation plugin based on the browser API. They are easy-to-use plugins, available without adding anything extra, that basically let us obtain the coordinates of the device's position. Their API gives us two functions:

  • getCurrentPosition: Gives us the coordinates when we call it.
  • watchPosition: Lets us register a watcher that returns the position's coordinates every time the position changes, that is, every time we move.

Neither of them can obtain coordinates while the application is in the background, which was one of our requirements: when recording a long route, it had to be possible to use the phone for other tasks without the route recording stopping.

Luckily, thanks to the community, we have more options available. In our case, we tried two other plugins that allow geolocation to keep working in the background.

  • BackgroundGeolocation by Capacitor-community: the most recent plugin with support for background geolocation, specific to Capacitor and with TypeScript support.
  • Cordova Plugin Background Geolocation: the classic plugin used in Cordova environments when background geolocation is needed. Its last release was 3 years ago, although it still works fine on recent versions of Android.

In the tests we did, both plugins achieved sufficient precision for the needs of the project, in most cases within a few meters. Their options allow us to tune various parameters to balance the precision of the coordinates obtained against the battery life of the device, for example.

Although the Capacitor plugin is more recent, the reality, at least in our case, is that we had problems with more modern phones (Android 10 and later), while it seemed to work without problems on older versions. The Cordova plugin, however, behaved well from the start on both older and newer phones. For this and other small details, we ended up choosing the latter for our project.

Some problems detected along the way

Once the plugin was selected, and despite a fairly exhaustive analysis and various tests, we still had to solve some problems during development. The first has to do with the abstraction layers some manufacturers add on top of Android to customize their devices. It is a fairly well-known topic and is well documented on the Internet. The problems are usually related to battery management: in search of battery savings, some of these customizations kill background services, which is what caused ours to stop. The Don’t kill my app! website documents most of these cases and explains how to configure the device to prevent battery-saving management from cancelling this type of process.

Another problem, not so easy to identify and solve, involves Android 10. Once the application was put in the background, points kept being collected, but after a certain time, approximately 30 seconds, collection stopped. When the application was brought back to the foreground, point collection resumed as if nothing had happened. Clearly this was not the same problem as the previous one, since the application was not being killed by battery management. The real issue is the new permission management introduced in Android 10, which lets the user choose whether a permission is granted always or only while the app is in use. The second option is the one that causes problems, since "while the app is in use" actually means while the application is visible on screen, in the foreground, and not while it is running in the background.

Our solution was to explicitly add the android:foregroundServiceType="location" attribute to the AndroidManifest.xml. In this way we declare that our app runs a location service and always needs access to GPS. The complete line in the AndroidManifest.xml looks like this:

<service android:enabled="true" android:exported="false" android:foregroundServiceType="location" android:name="com.marianhello.bgloc.service.LocationServiceImpl" />

You also need to explicitly grant ACCESS_BACKGROUND_LOCATION permission as detailed in the documentation.

Final considerations

When choosing a plugin, or any software library in general, the functionality it offers is as important as its evolution and maintenance. In our case, we bet on a plugin whose repository has not been active for some time, which implies that it is not maintained and no future improvements are expected; on the other hand, the plugin that did meet these criteria did not offer the functionality we needed. Perhaps ours was an extreme case, and that is why we made that decision. In any case, our recommendation is always to take into account the activity of a solution's repository, checking that it is and will continue to be maintained over time, that there is a community around it, and that it is a safe bet. When that is not possible, you have to weigh other factors to make the choice or even, if feasible, create your own solution.

The post Geolocation with GPS in Ionic was first published on iCarto.

by iCarto at August 08, 2022 10:45 PM

August 03, 2022

In the interest of getting back into the habit of releasing things again, and to line up authoring expectations/experience for another upcoming MapGuide Open Source 4.0 preview release, here's a long overdue release of MapGuide Maestro. Here's a summary of what's changed since the last release. (Wow! Has it really been 4 years since the last one?)

MapGuide Open Source 4.0 authoring support

This release of Maestro takes advantage of features/capabilities introduced in the upcoming MapGuide Open Source 4.0. For all these demonstrated features, we assume you have the current Preview 3 release of MGOS 4.0 or newer installed.

A new Layer Definition resource template based on the v4.0.0 schema is now available.



What features/capabilities does this offer? A new toggle option to determine if QUERYMAPFEATURES requests that hit this layer should include bounding box information or not. When bounding box data is not included, client-side viewer tools like zooming to selection will not work due to the lack of this information.




The WMS metadata UI now has support for exposing or hiding the geometry field data from WMS GetFeatureInfo responses.



The basic label style editor finally has the missing support for editing advanced placement settings



MapGuide Open Source 4.0 introduced bulk coordinate transformation capabilities in the mapagent, and Maestro will now take advantage of this feature for places in the UI that require transforming coordinates, such as setting the map extents.



MapGuide Open Source 4.0 now also removes the restriction that you cannot CREATERUNTIMEMAP or MgMap.Create() a Map Definition that links to an XYZ tile set, so the Map Definition editor will no longer throw this warning and block you from linking to an XYZ tile set definition if you are connected to an MGOS 4.0 instance.


Notable UI Changes

Your published WFS and WMS layers are now visible as top-level nodes in the Site Explorer! This allows for quick visual verification that you have correctly applied the appropriate WFS/WMS publishing metadata to your Feature Source or Layer Definition.


These new nodes aren't just for show. For the published WMS layers, there are context menu actions to follow back to the source Layer Definition or, for the more exciting option, the ability to preview this WMS layer through the new OpenLayers-driven content representation for WMS layers.



The local map viewer component (used for local map previews) now has a panel to show selection attributes


MgCooker is no more. Replaced with MgTileSeeder

The venerable MgCooker tool for pre-seeding tilesets in MapGuide has been removed. MgTileSeeder is now the full replacement for MgCooker and is capable of more things than MgCooker (like being a generic XYZ tileset seeder). All existing Maestro UI integrations with the MgCooker tool have also been removed as a result.

Maestro API package is now SourceLink-enabled

If you use the Maestro API to build your own MapGuide client applications, the Maestro API NuGet package is now built with SourceLink support, meaning that when you go to the definition of any class/type of the Maestro API, you will now see the full source code of that class/type instead of an outline inferred from .net metadata.

Similarly, when debugging you can now step into the source code for any method in the Maestro API!


To take advantage of SourceLink requires some small Visual Studio settings changes, which are covered in detail here.

Maestro is now a self-contained .net 6.0 Windows application

Aside from being able to take advantage of the new capabilities and performance improvements of .net 6.0, the other upside of this move is that you no longer have to download/install the .net Framework before installing Maestro. Being a self-contained application means that the support files needed to run a .net 6.0 application are now included with the installer itself.


by Jackie Ng (noreply@blogger.com) at August 03, 2022 05:10 PM


August 01, 2022

GeoServer 2.21.1 release is now available with downloads (bin, war, windows), along with docs and extensions.

This is a stable release of the GeoServer 2.21.x series, made in conjunction with GeoTools 27.1 and GeoWebCache 1.21.1.

Thanks to Jody Garnett (GeoCat) for making this release.

Server Status

The server status page has been cleaned up with a few quality of life improvements:

  • Units supplied for numbers, such as “7 threads” or “30,000 ms”
  • Number of items held in the resource cache is shown, so there is visual feedback when using the Clear button.
  • Documentation has been updated to cover all the status field descriptions and document the available actions

For more information see Server Status page.

Server status

JVM Console

A new JVM Console tab has been added to the server status page, allowing summaries of memory use and active threads to be reviewed and downloaded.

For more information see JVM Console.

JVM Console

Workspace headers for proxy url

A checkbox Use headers for Proxy URL has been added to the workspace page.

This setting lets an individual workspace use headers for the proxy URL (even when the default in global settings has been disabled).

Improvements and Fixes

Improvement:

GEOS-10580 Server status page improvements for status, modules and docs

GEOS-10521 Allow GetFeatureInfo over raster layers to identify both original raster and transformed vectors

GEOS-10514 Better capture catalog configuration issues: layergroup with a misconfigured layer

GEOS-10501 GetMap: support auth headers forwarding to remote SLD urls

GEOS-10495 Request Logger Memory Buffer Limits

GEOS-10489 Add options to LDAP Role Service to configure prefixes and enforce capitalization

GEOS-10464 Improve logging and check for NPEs and other issues in Importer Module

Bug:

GEOS-10584 Enabling logging of request body results in stream closed errors in tomcat environment

GEOS-10570 Deleting a style in a Hazelcast cluster renames the styles directory

GEOS-10553 Importer replace fails with schema mismatch

GEOS-10548 GeoFence layer group handling is inconsistent

GEOS-10546 Invalid time expressions used in WCS 2.0 subset return a code 200 with generic exception

GEOS-10545 Layer Group cache not initialized

GEOS-10539 DescribeLayer typeName is no longer workspace qualified

GEOS-10535 WFS Update request throw NPE on bad namespace

GEOS-10534 a badly formed delete transaction will get a NPE instead of an informative error message

GEOS-10533 Review startup logging INFO and WARN updates

GEOS-10526 Parallel REST API calls failures

GEOS-10522 REST API Failure in @ExceptionHandler No input String specified

GEOS-10518 Partial RELINQUISH_LOG4J_CONTROL regression with WildFly

GEOS-10516 WMS GetCapabilities dimension representations ignores the end attribute

GEOS-10496 Using the REST API to purge NetCDF granules causes a seemingly infinite loop

GEOS-10487 Custom logging configuration not respecting log location setting

GEOS-10468 (virtually) Impossible to turn off “Enable All Statistics” in > Server status > System Status

Tasks:

GEOS-10588 Build structure gs-sec-oauth2-core is duplicated in the reactor

GEOS-10585 Upgrade to Jetty from 9.4.44 to 9.4.48

GEOS-10579 Bump oshi-core from 6.2.0 to 6.2.1

GEOS-10562 Bump oshi-core from 5.8.6 to 6.2.0

GEOS-10551 Refactor commons-httpclient usage in the WPS module

GEOS-10532 FreemarkerTemplateManager API changes for easier subclassing

GEOS-10529 Use Awaitility to replace waits for condition in tests

GEOS-10525 Centralize and simplify management of common test dependencies

About GeoServer 2.21

Additional information on GeoServer 2.21 series:

Release notes: ( 2.21.1 | 2.21.0 | 2.21-RC )

by Jody Garnett at August 01, 2022 12:00 AM


July 25, 2022

Dear readers,

Geocursos will hold a workshop showing how to create your database, import your shapefiles and, at the end, publish them on a map server.

The event will be 100% online and free, taking place on August 29, 30 and 31 and September 1 at 8 p.m. (Brasília time).

Registration is open at https://workshop.geocursos.com.br

📍 Workshop program:

✅ Class 01 (AUGUST 29): Learn how to create your first database.

✅ Class 02 (AUGUST 30): Learn how to import your shapefiles into the database.

✅ Class 03 (AUGUST 31): Learn how to publish your geospatial data with GeoServer.

✅ Bonus class (SEPTEMBER 01): Starting from a data model, we will create the table structure of a WebGIS.

Any questions?
Just get in touch by e-mail: workshops@geocursos.com.br

We are sure it will be an event full of learning!
See you at the workshop!

by Fernando Quadro at July 25, 2022 01:20 PM

July 24, 2022

After 3 weeks of little training, I'm back at it in week 17 with 5 trail runs and plenty of hills.

  • 8 hours, 35 minutes

  • 38 miles

  • 7,493 ft D+

I spent 2:30 riding 30 miles on my bike on top of that. About half of that was riding to and from Pineridge Open Space, but I also did a longer ride on Saturday instead of running. I'm going to try to run 55 miles next week and 65 in week 19. That's down quite a bit from my peak volume in the past two years.

by Sean Gillies at July 24, 2022 08:23 PM

An important concept in spatial data modelling is that of a coverage.  A coverage models a two-dimensional region in which every point has a value out of a range (which may be defined over one or a set of attributes).  Coverages can be represented in both of the main physical spatial data models: raster and vector.  In the raster data model a coverage is represented by a grid of cells with varying values.  In the vector data model a coverage is a set of non-overlapping polygons (which usually, but not always, cover a contiguous area).  

This post is about the vector data coverage model, which is termed (naturally) a polygonal coverage. These are used to model regions which are occupied by discrete sub-regions with varying sets of attribute values.  The sub-regions are modelled by simple polygons.  The coverage may contain gaps between polygons, and may cover multiple disjoint areas.  The essential characteristics are:

  • polygon interiors do not overlap
  • the common boundary of adjacent polygons has the same set of vertices in both polygons.

There are many types of data which are naturally modelled by polygonal coverages.  Classic examples include:

  • Man-made boundaries
    • parcel fabrics
    • political jurisdictions
  • Natural boundaries
    • vegetation cover
    • land use

A polygonal coverage of regions of France

Topological and Discrete Polygonal Coverages


There are two ways to represent polygonal coverages: as a topological data structure, or as a set of discrete polygons.

A coverage topology consists of linked faces, edges and nodes. The edges between two nodes form the shared boundary between two faces. The coverage polygons can be reconstituted from the edges delimiting each face.

The discrete polygon representation is simpler, and aligns better with the OGC Simple Features model. It is simply a collection of polygons which satisfy the coverage validity criteria given above.

Most common spatial data formats support only a discrete polygon model, and many coverage datasets are provided in this form. However, the lack of inherent topology means that datasets must be carefully constructed to ensure they have valid coverage topology. In fact, many available datasets contain coverage invalidities. A current focus of JTS development is to provide algorithms to detect this situation and provide the locations where the polygons fail to form a valid coverage.
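JTS itself is a Java library (with the algorithms later ported to GEOS), but the discrete-polygon validity criteria are easy to state in any Simple Features implementation. Here is a hypothetical, naive sketch in Python with Shapely illustrating the first criterion via the DE-9IM matrix; it is the definition spelled out, not the JTS validation algorithm:

    from itertools import combinations
    from shapely.geometry import Polygon

    def interiors_overlap(polygons):
        """Naive O(n^2) test of coverage criterion 1: no two polygon
        interiors may intersect in an area."""
        for a, b in combinations(polygons, 2):
            # relate() returns the DE-9IM matrix as a string; position 0 is
            # the interior/interior entry, and '2' means an area overlap.
            if a.relate(b)[0] == "2":
                return True
        return False

    # two unit squares sharing one edge: a valid (trivial) coverage
    coverage = [Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
                Polygon([(1, 0), (2, 0), (2, 1), (1, 1)])]
    print(interiors_overlap(coverage))  # False: they share only a boundary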

Polygonal Coverage Operations

Operations which can be performed on polygonal coverages include:

  • Validation - check that a set of discrete polygons forms a valid coverage
  • Gap Detection - check if a polygonal coverage contains narrow gaps (using a given distance tolerance)
  • Cleaning - fix errors such as gaps, overlaps and slivers in a polygonal dataset to ensure that it forms a clean, valid coverage
  • Simplification - simplify (generalize) polygon boundary linework, ensuring coverage topology is preserved
  • Precision Reduction - reduce precision of polygon coordinates, ensuring coverage topology is preserved
  • Union - merge all or portions of the coverage polygons into a single polygon (or multipolygon, if the input contains disjoint regions)
  • Overlay - compute the intersection of two coverages, producing a coverage of resultant polygons 

Implementing polygonal coverage operations is a current focus for development in the JTS Topology Suite.  Since most operations require a valid coverage as input, the first goal is to provide Coverage Validation.  Cleaning and Simplification are priority targets as well.  Coverage Union is already available, as is Overlay (in a slightly sub-optimal way).  In addition,  a Topology data structure will be provided to support the edge-node representation.  (Yes, the Topology Suite will provide topology at last!).  Stay tuned for further blog posts as functionality is rolled out.

As usual, the coverage algorithms developed in JTS will be ported to GEOS, and will thus be available to downstream projects like PostGIS.

by Dr JTS (noreply@blogger.com) at July 24, 2022 04:14 PM

This week I became a Fort Collins municipal internet utility customer. 1 Gbps up and down for $60 a month.

https://live.staticflickr.com/65535/52233090219_e3263282a9_c.jpg

Buh-bye coaxial cable

I called the cable company on Tuesday to cancel and was told that they were going to refund me $25 in the process of squaring up my account. Sweet! Then today I got an email announcing my next month's bill. The cable company is terrible at their business. I wonder how many times I will have to cancel my account before it sticks?

This is a great upgrade. The house my family was renting in France got fiber in 2017. I'm just saying.

by Sean Gillies at July 24, 2022 03:07 AM

July 23, 2022

I had some health troubles during weeks 14-16 and got very little training done while feeling generally crappy and worried. On June 16 (end of week 13) I went for a 20 mile run in the hills and struggled on the climbs. I was unusually out of breath and after I finished I was a bit dizzy. The next day I was mildly feverish and I continued to have an elevated temperature and noticeable lethargy during my run on Tuesday, the 28th. I took a COVID test and it was negative. On the 29th I went out for an interval workout and quit after my warmup. I felt dizzy, achy, without energy, and was seeing a heart rate on my Garmin watch that conflicted with what I was feeling: anomalously low at times. I took another COVID test, again negative. Friday, July 1, I had a virtual visit with my doctor, who recommended a PCR test for flu and COVID and some blood work. The PCR test was negative and the lab report said I was normal on all counts. About this time I became aware of twitching in my chest, which I noticed most when I was lying down before falling asleep, or in the middle of the night. At first I chalked this up to anxiety, but after several days of no relief and some very confusing heart rate measurements on my run on July 5 (after which I joked "Getting confusing HR signals from my watch. Either it or I am about to die.") I got a live, in-person visit with my doctor and an ECG, which revealed premature atrial and ventricular complexes (PACs and PVCs). My heart really was malfunctioning.

I got fitted with a wearable ECG to collect more data and had a generally crappy week of heart twitches, poor sleep, anxiety, and no running while waiting to get an echo scan of my heart and a consultation with a cardiologist. In week 16 (starting July 11), I began to feel a little better. I found that I could hike and run at a super easy pace and not feel terrible, so I began to treat it like an ordinary recovery week (every 4th week of my training blocks is a recovery week). On July 14 I drove to the UC Health hospital in Greeley, which has some extra capacity, and got an echo scan (sonogram) of my heart. The initial assessment said that I had an enlarged right ventricle. That didn't sound good. My heart palpitations continued to subside, but I still had five stressful days of waiting before my cardiology appointment on Tuesday of this week (week 17). The cardiologist disagreed with the initial assessment of my echo scan and didn't recommend any other scans. I don't have an enlarged ventricle. Other than the PACs and PVCs, which continue to diminish, my heart is in good shape. I had a treadmill stress test on Friday and passed. We measured only a few PVCs during the test.

This week I started running a little harder and have been feeling fine. It seems like I only had a temporary episode of premature contractions that were likely triggered by an unknown virus. My watch's measurement of my heart rate is back to normal, too. Neither it nor I am going to die soon.

Here are the numbers for weeks 14-16.

Week 14:

  • 3 hours, 11 minutes

  • 15.1 miles

  • 3,428 ft D+

Week 15:

  • 1 hour, 57 minutes

  • 9.6 miles

  • 1,056 ft D+

Week 16:

  • 3 hours, 45 minutes

  • 18.1 miles

  • 1,631 ft D+

I had been aiming for 120 miles of running and 20,000 feet of climbing in weeks 14 and 15 and got nowhere near that. I missed two big weeks of training, but I'm trying not to sweat it. I've had enough worrying in the past three weeks; I don't need to add worry about training to my problems. I'm still on track to run the Superior 50 in 7 weeks.

by Sean Gillies at July 23, 2022 09:36 PM


July 09, 2022

The BEV (Austrian Bundesamt für Eich- und Vermessungswesen) has recently published the Austrian cadastre as open data:

The URLs for vector tiles and styles can be found on https://kataster.bev.gv.at under Guide – External

The vector tile URL is:

https://kataster.bev.gv.at/tiles/{kataster | symbole}/{z}/{x}/{y}.pbf

There are 4 different style variations:

https://kataster.bev.gv.at/styles/{kataster | symbole}/style_{vermv | ortho | basic | gis}.json

When configuring the vector tiles in QGIS, we specify the desired tile and style URLs, for example:

For example, this is the “gis” style:

And this is the “basic” style:

The second vector tile source I want to mention is basemap.at. It has been around for a while; however, early versions suffered from a couple of issues that have now been resolved.

The basemap.at project provides extensive documentation on how to use the dataset in QGIS and other GIS, including manuals and sample projects:

Here’s the basic configuration: make sure to set the max zoom level to 16, otherwise the map will not be rendered when you zoom in too far.
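The connection can also be scripted from the QGIS Python console. Here is a minimal sketch using QgsVectorTileLayer (available since QGIS 3.14) with the BEV kataster tile URL from above; the zmin/zmax values are the ones to adjust, e.g. zmax=16 for basemap.at as just described:

    from qgis.core import QgsProject, QgsVectorTileLayer

    uri = ("type=xyz"
           "&url=https://kataster.bev.gv.at/tiles/kataster/{z}/{x}/{y}.pbf"
           "&zmin=0&zmax=16")
    layer = QgsVectorTileLayer(uri, "BEV Kataster")
    if layer.isValid():
        QgsProject.instance().addMapLayer(layer)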

The level of detail is pretty impressive, even if it cannot quite keep up with the basemap raster tiles:

Vector tile details at Resselpark, Vienna
Raster basemap details at Resselpark, Vienna

by underdark at July 09, 2022 06:58 PM


July 06, 2022

This article is part of our series on Delimitation of hydrographic sub-basins using QGIS and a DEM:

  1. Delimitation of hydrographic sub-basins using QGIS and a DEM
  2. Digital elevation models for hydrological studies
  3. Technical criteria for the delimitation of sub-basins
  4. Sub-basin delimitation process using QGIS (This post)

In this article we are going to explain the process to delimit the river sub-basins of a certain region using QGIS and a DEM. In this case the region is the entire Republic of Mozambique.

The starting data we need is a mosaic of Digital Elevation Models (DEMs) of the entire area and the watershed layer. As we commented in the previous articles of the series, we will use the NASADEM HGT mosaics (here we explain how to download them) and the basin layer prepared by GEAMA.

Create single DEM (Mosaic)

The first step is the union of the DEM mosaics to create a single DEM for all of Mozambique.

QGIS provides us with different options for joining raster files. In this case we are going to use the tool “Merge”. To do this, select “Raster” from the menu, display the “Miscellaneous” options and click “Merge”.

In the pop-up window we add all the input layers, that is, all the DEMs of our working area downloaded from the web. We give a path and a name for the output file, check the option “Open the output file after running the algorithm” and execute.
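The same step can be run from the QGIS Python console. A minimal sketch, assuming the downloaded NASADEM tiles sit in a single folder (all paths here are hypothetical):

    import glob
    import processing

    tiles = glob.glob("/data/nasadem/*.hgt")
    processing.run("gdal:merge", {
        "INPUT": tiles,
        "OUTPUT": "/data/mozambique_dem.tif",
    })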

DEM reprojection

The merged DEM is in the WGS 84 geographic coordinate system (EPSG:4326). We reproject it to a projected coordinate system, in our case WGS 84 / UTM zone 37S (EPSG:32737). To reproject the DEM, from the menu we select “Raster”, then “Projections” and “Warp (reproject)”.

In the input layer we select the combined DEM and indicate both the source CRS and the target CRS. We also indicate a path and a name for the output file and run the algorithm.
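The scripted equivalent is a one-call sketch with the CRS codes named above:

    import processing

    processing.run("gdal:warpreproject", {
        "INPUT": "/data/mozambique_dem.tif",
        "SOURCE_CRS": "EPSG:4326",
        "TARGET_CRS": "EPSG:32737",   # WGS 84 / UTM zone 37S
        "OUTPUT": "/data/mozambique_dem_32737.tif",
    })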

DEM clipping

To reduce the computational load of the algorithms that we will use in the following steps, we now clip the DEM for each river basin. Another reason for making these cuts is the difference in size and shape of the basins we are going to analyze. Better results are obtained by analyzing each basin separately, so that the algorithm parameters can be adjusted to each particular case instead of using the same parameters for all of Mozambique.

We create the masks for each basin from the basins layer. In the following image you can see the DEM in the background, in white the administrative limit of Mozambique and in blue the mask of the Lurio river basin.

Next, we clip the DEM with each created mask. To clip the DEM, in the main menu we select “Raster”, then “Extraction” and “Clip Raster by Mask Layer”.

In the pop-up window we select the reprojected DEM as the input layer and the basin mask as the mask layer. Also, so that the clip fits exactly to the edge of the basin and no black area is visible, we specify a value for “no data”, for example -9999. Finally, we check the following two options:

  • Match the extent of the clipped raster to the extent of the mask layer.
  • Keep resolution of input raster.

We specify the path and name for the output file and run the algorithm.

Perform this process for all basins. Batch processing can be used for this instead of doing it one by one.
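Alternatively, a small loop in the QGIS Python console can stand in for the batch dialog. A sketch, assuming one mask file per basin (file names hypothetical):

    import glob
    import os
    import processing

    for mask in glob.glob("/data/masks/*.gpkg"):
        basin = os.path.splitext(os.path.basename(mask))[0]
        processing.run("gdal:cliprasterbymasklayer", {
            "INPUT": "/data/mozambique_dem_32737.tif",
            "MASK": mask,
            "NODATA": -9999,            # avoid black areas outside the basin
            "CROP_TO_CUTLINE": True,    # match extent to the mask layer
            "KEEP_RESOLUTION": True,    # keep resolution of the input raster
            "OUTPUT": f"/data/clipped/{basin}_dem.tif",
        })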

Filling depressions

Although the latest versions of the DEMs are already processed with different corrections and gap filling, they may still contain artifacts such as depressions. Artifacts must be removed before using a DEM in hydrologic analysis. There are several algorithms for filling gaps (GDAL, GRASS, SAGA, …). The incorporation of the GRASS hydrological toolset in QGIS is very useful. We will therefore use the GRASS algorithm both for this step and for others that we will see later.

To fill depressions we will use the Processing Toolbox. If it is not visible on the right side of the QGIS window, it can be enabled from the main menu by selecting “Processing” and “Toolbox”.

In the toolbox find or type the following algorithm: r.fill.dir. In the dialog window select as input (Elevation) the clipped DEM and uncheck Flow direction and Problem areas. We are only interested in obtaining the DEM without depressions. Indicate the path and name for the output file and execute.

This geoprocess can take a long time to finish depending on the size of the raster and the power of the computer. Perform this geoprocess for all basins. Batch processing can be used for this instead of doing it one by one.
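The same loop pattern works here. A sketch for a single basin (parameter keys follow the grass7 provider; the flow direction and problem-area outputs are discarded, as in the dialog):

    import processing

    processing.run("grass7:r.fill.dir", {
        "input": "/data/clipped/lurio_dem.tif",
        "output": "/data/filled/lurio_dem_filled.tif",  # depression-free DEM
        "direction": "TEMPORARY_OUTPUT",                # flow direction, not needed
        "areas": "TEMPORARY_OUTPUT",                    # problem areas, not needed
    })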

Drainage direction, accumulated flow, stream segments and sub-basins

Now that we have a corrected DEM, the next step is to calculate the drainage directions, accumulated flow, stream segments and sub-basins. To do this we look for the following algorithm in the toolbox: r.watershed

In the dialog window we select the corrected raster of the basin and establish the most appropriate “Minimum size of exterior watershed basin” for each case.

Because of the wide difference between the size of some basins and others, it is necessary to play with the “Minimum size of exterior watershed basin” to obtain the optimal results for each of them.

An average size can be obtained by reviewing the characteristics of the raster. The pixel size is 30x30 m, i.e. 900 m² per pixel, so a minimum sub-basin area of about 300 km² (300,000,000 m²) corresponds to 300,000,000 / 900 ≈ 333,333 pixels, which we round down to 300,000 pixels for the “Minimum size of exterior watershed basin”.

But in many cases we decided to set 100,000 pixels or less to solve problems detected in some flat areas or river mouths. Also check the following two options:

  • Enable Single Flow Direction (D8) flow
  • Allow only horizontal and vertical flow of water

Also check the output boxes shown in the following image, indicate the paths and names of the output files, and execute the algorithm.

We thus obtain for each basin:

  • Accumulated flow
  • Drainage direction
  • Sub-basins
  • Stream segments
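For reference, the equivalent r.watershed call from the Python console; a sketch, with the flag and parameter keys as exposed by the grass7 provider:

    import processing

    processing.run("grass7:r.watershed", {
        "elevation": "/data/filled/lurio_dem_filled.tif",
        "threshold": 300000,   # minimum size of exterior watershed basin (cells)
        "-s": True,            # enable Single Flow Direction (D8) flow
        "-4": True,            # allow only horizontal and vertical flow of water
        "accumulation": "/data/watershed/lurio_accum.tif",
        "drainage": "/data/watershed/lurio_drain.tif",
        "basin": "/data/watershed/lurio_subbasins.tif",
        "stream": "/data/watershed/lurio_streams.tif",
    })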

The only layer we are going to work on is the sub-basin layer, the rest are calculated to serve as support for small manual adjustments that we will make later.

Transform to vector layer

Once we have the sub-basins layer, the next step is to transform this raster layer into a vector layer. To do this we are going to look for the following geoprocess in the toolbox: r.to.vect. In the window, select the sub-basins layer as the input raster layer.

Also select these options and run (a scripted sketch follows the list):

  • Feature type: area
  • Smooth corners of area features
  • v.out.ogr output type: area
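The scripted sketch mentioned above; the enum indices mirror the dialog options and are worth double-checking in your QGIS version:

    import processing

    processing.run("grass7:r.to.vect", {
        "input": "/data/watershed/lurio_subbasins.tif",
        "type": 2,                         # index of "area" in the Feature type list
        "-s": True,                        # smooth corners of area features
        "GRASS_OUTPUT_TYPE_PARAMETER": 3,  # v.out.ogr output type: area
        "output": "/data/vector/lurio_subbasins.gpkg",
    })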

Review and manual adjustments

Once we have the vector layer of sub-basins, we have to review the result, join the necessary polygons so that we keep only the sub-basins that meet the agreed criteria mentioned in the previous article, and make small manual corrections where necessary.

The union of the polygons is also necessary because the algorithm creates sub-basins for tributaries that do not flow directly into the main river, that is, tributaries of a tributary of the main river. Merging these polygons solves the problem.

These algorithms work well in sloping areas, but in flat areas they can give undesirable results, and it is necessary to review them. For example, in coastal areas with very flat basins, a more detailed review and adjustments to sub-basin boundaries were required based on other information such as drainage directions, rivers, etc.

Smoothing and simplification

When vectorizing a raster, the edges of the polygons are generated following the shape of the pixels; for this reason it is necessary to smooth them.

This introduces another problem: smoothing creates many vertices, which increases the layer size quite a bit. To avoid this, you can run a simplification after smoothing.

With appropriate parameters, smoothing and simplification can be carried out with practically no loss of definition in the delimitation of the sub-basins. In the image below you can see the edges of the sub-basins before and after (red) smoothing and simplification.

Sub-basins merge

Since we have created a sub-basin layer for each basin, we perform a vector layer union to create a single sub-basin layer for the entire country, using the “Merge vector layers” algorithm (mergevectorlayers).

In the dialog window we select all the sub-basin layers as input layers, specify the Coordinate Reference System and the path and name of the output file, and execute.
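Scripted, this step is a single call over all per-basin files (a sketch; paths hypothetical):

    import glob
    import processing

    processing.run("native:mergevectorlayers", {
        "LAYERS": glob.glob("/data/vector/*_subbasins.gpkg"),
        "CRS": "EPSG:32737",
        "OUTPUT": "/data/vector/mozambique_subbasins.gpkg",
    })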

Geometric and topological checks and corrections

Once we have the desired sub-basin layer, overlaps, duplicates, gaps, invalid geometries, etc. must be checked and corrected.

This can be done with QGIS plugins (installed via Plugins > Manage and Install Plugins), such as the core Topology Checker.

Modify attribute table as needed

In addition to the geometry of the sub-basins, we must also add the desired attributes and order the attribute table as needed. Some examples of useful attributes in a sub-basin layer are the name of the river, the area, the length of the main river, or the basin to which it belongs.

Final Results

Following these steps we obtain our layer of sub-basins.

With this article we close our series on the delimitation of sub-basins. Did you find them interesting? Do you use other techniques? Need help? Get in touch with us or write to us on Twitter.

The post Sub-basin delimitation process using QGIS was first published on iCarto.

by iCarto at July 06, 2022 09:30 PM
