Oscar Wilde said that a map of the world that does not include the territories of utopia is not worth looking at, for when Humanity glimpses better lands in the distance, it always sets sail for them. Progress, the writer said, is the realization of utopias.
This year marks the 18th International gvSIG Conference, for a project that in its beginnings was dismissed as utopian and unachievable, and that eighteen years later is more active than ever. After two years in which the event had to be held online, it returns to an in-person format in València, a city that has become, in its own right, one of the reference centres of geomatics, the science and technology applied to territorial management.
We return in the best possible way, joining forces with the GeoLIBERO Network, promoted by CYTED, the Ibero-American Programme of Science and Technology for Development. It is a network that brings together some of the leading institutions and researchers in the field of free and open source geomatics. Coordinators of the various GeoLIBERO research groups, from across Ibero-America, will present their work during the Conference.
"Boosting the knowledge economy", the motto of this edition, highlights one of the main pillars of the gvSIG project. Recent times have made us more aware of the need to be independent in critical sectors such as energy, healthcare, defence and, of course, technology. Technological sovereignty, one of gvSIG's mottos, must be intrinsically linked to economic sovereignty.
Today more than ever, it is necessary to support projects that embrace new business models based on collaboration, solidarity and shared knowledge. It is time to break definitively with technological dependence and with models that generate no local economy, only expense. It is time to maintain and strengthen technologies built on the concepts of cooperation and sustainability. It is the time of gvSIG and free geomatics.
I'm solo parenting next week while Ruth is traveling. Between work, running,
and the first week of school for the kids, I'm going to be busy. Apologies in
advance for delayed replies to emails and open source project issues.
Week twenty was supposed to be a rest week. I took it pretty easy and did just one
long high-altitude run on Friday. From the Bear Lake Road Park and Ride lot in
Rocky Mountain National Park, I ran past Bierstadt Lake,
up to Flattop Mountain (12,324 ft), Ptarmigan Pass, and back. 16 miles and 4,000 ft
of elevation gain in all. I saw moose grazing in Bierstadt Lake, elk at
Ptarmigan Pass, and pika all over the Flattop Mountain trail. This run
dominated my numbers for the week.
7 hours, 3 minutes
26.1 miles
5,020 ft D+
Moose in Bierstadt Lake
Tonahutu Trail at Ptarmigan Pass
Odessa Lake, Lake Helene, Two Rivers Lake (left to right)
I want to get ~110 miles and 15,000 ft of climbing in over the next two
weeks. I hope weather and my health continue to stay good.
Cartographers use all kind of tricks to make their maps look deceptively simple. Yet, anyone who has ever tried to reproduce a cartographer’s design using only automatic GIS styling and labeling knows that the devil is in the details.
This post was motivated by Mike Hall’s retro map style.
New print designs on the way: these are some snippets from a project I began last year to map the provinces of Spain in a retro style, which I've decided to revisit this summer. Prints will be available later in the year! pic.twitter.com/gUJirBFv0x
There are a lot of things going on in this design but I want to draw your attention to the labels – and particularly their background:
Detail of Mike’s map (c) Mike Hall. You can see that the rail lines stop right before they would touch the A in Valencia (or any other letters in the surrounding labels).
This kind of effect cannot be achieved by good old label buffers because no matter which color we choose for the buffer, there will always be cases when the chosen color is not ideal, for example, when some labels are on land and some over water:
Ordinary label buffers are not always ideal.
Label masks to the rescue!
Selective label masks enable more advanced designs.
Here’s how it’s done:
Selective masking has actually been around since QGIS 3.12. There are two things we need to take care of when setting up label masks:
1. First we need to enable masks in the label settings for all labels we want to mask (for example the city labels). The mask tab is conveniently located right next to the label buffer tab:
2. Then we can go to the layers we want to apply the masks to (for example the railroads layer). Here we can configure which symbol layers should be affected by which mask:
Note: The order of steps is important here since the “Mask sources” list will be empty as long as we don’t have any label masks enabled and there is currently no help text explaining this fact.
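If you prefer to script this setup, step 1 can also be reproduced from the QGIS Python console. This is only a minimal sketch, assuming a point layer called “cities” labelled by a hypothetical “name” field; step 2 (choosing the mask sources on the railroads layer) is still configured in the layer styling panel as described above.
from qgis.core import QgsProject, QgsPalLayerSettings, QgsVectorLayerSimpleLabeling

# Hypothetical layer and field names -- adapt to your own project
layer = QgsProject.instance().mapLayersByName("cities")[0]

settings = QgsPalLayerSettings()
settings.fieldName = "name"

fmt = settings.format()
mask = fmt.mask()
mask.setEnabled(True)   # same as ticking the mask option in the mask tab
mask.setSize(2.0)       # mask size, in the current size unit
fmt.setMask(mask)
settings.setFormat(fmt)

layer.setLabeling(QgsVectorLayerSimpleLabeling(settings))
layer.setLabelsEnabled(True)
layer.triggerRepaint()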
I’m also using label masks to keep the inside of the large city markers (the ones with a star inside a circle) clear of visual clutter. In short, I’m putting a circle-shaped character, such as ◍, over the city location:
In the text tab, we can specify our one-character label and – later on – set the label opacity to zero. To ensure that the label stays in place, pick the center placement in “Offset from Point” mode.
Once we are happy with the size and placement of this label, we can then reduce the label’s opacity to 0, enable masks, and configure the railroads layer to use this mask.
As a general rule of thumb, it makes sense to apply the masks to dark background features such as the railways, rivers, and lake outlines in our map design:
Resulting map with label masks applied to multiple labels, including city and marine area labels, masking out railway lines and ferry connections as well as rivers and lake outlines.
If you have never used label masks before, I strongly encourage you to give them a try next time you work on a map for public consumption because they provide this little extra touch that is often missing from GIS maps.
I spent most of week nineteen on the hot and humid coast of North Carolina.
I did three hour-long barefoot runs on the beach, some bodyweight strength
training, and three hour-long surfing lessons. Back at home on the weekend,
I did one long run at Horsetooth Open Space. Two trips to the summit from the
parking lot. That one run accounted for 3,900 ft of my week's elevation gain.
There aren't any hills on Hatteras Island. Here are the numbers for the week.
New functions to add a timedelta column and get the trajectory sampling interval #233
As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:
The new distance measures are covered in tutorial #11:
Computing distances between trajectories, as illustrated in tutorial #11
Computing distances between a trajectory and other geometry objects, as illustrated in tutorial #11
But don’t miss the great features covered by the other notebooks, such as outlier cleaning and smoothing:
Trajectory cleaning and smoothing, as illustrated in tutorial #10
If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.
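As a quick illustration of the new timedelta and sampling-interval helpers mentioned above, here is a minimal sketch; the method names add_timedelta() and get_sampling_interval() follow the wording of the release notes, so double-check them against the API reference. The data is purely illustrative.
import pandas as pd
import geopandas as gpd
import movingpandas as mpd
from shapely.geometry import Point

# Tiny trajectory built from three timestamped points
gdf = gpd.GeoDataFrame(
    {"geometry": [Point(0, 0), Point(0, 0.01), Point(0, 0.03)]},
    index=pd.to_datetime(["2022-08-01 10:00", "2022-08-01 10:01", "2022-08-01 10:03"]),
    crs="EPSG:4326",
)
traj = mpd.Trajectory(gdf, traj_id=1)

traj.add_timedelta()                 # presumably adds a column with the gap to the previous observation (#233)
print(traj.get_sampling_interval())  # typical time between observations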
Did you know it’s now been over 15 years since Google Street View was launched? It’s been a truly revolutionary technology, allowing users to immerse themselves in places a...
When we talk about geolocation, we refer to the ability of a device to determine the geographical position where it is located. To do this, it can use the WiFi access point it is connected to, the mobile phone tower it is linked to (in the case of phones or other mobile devices with a SIM card), or GPS technology, when available. These options can even be combined with one another to improve the precision that each of them offers on its own.
On mobile devices, this ability to obtain the location greatly widens the range of applications that can be developed: apps to inventory information including the geographical position of each element, to store routes, to get information about what is nearby, and so on. The possibilities are almost endless.
Recently, at iCarto, we needed to develop a mobile application to record the route followed while collecting wild mushrooms. We decided to build this application using web technologies adapted to mobile development, specifically the Ionic framework with ReactJS. The project had particular requirements regarding precision and the ability to keep collecting route points while the application was in the background and the device was being used for something else. That led us to look at the different geolocation plugins available for Ionic. In this article we share that analysis, as well as some of the conclusions and lessons learned along the way.
When building a mobile application with web technologies, by default we can use the W3C Geolocation API that browsers implement. However, using a specific geolocation plugin is recommended; we can enable one if we use Cordova to package the mobile application or, in our particular case, Capacitor, the tool provided by Ionic. This is good practice in general, since it ensures the functionality is available regardless of the browser version on the mobile device. Remember that, with this kind of technology, our application ultimately runs inside a web browser on the device. In our case a plugin was also mandatory, because we needed specific features that the Geolocation API does not offer, such as support for continuing to obtain the location while the application is in the background.
Cordova vs. Capacitor
Before going into the analysis of geolocation plugins, it is good to stop and review the two main alternatives for packaging mobile applications when we work with web technologies for their development.
Apache Cordova was one of the first environments for developing mobile applications using web technologies (CSS, HTML and JavaScript) instead of platform-specific APIs. This approach has its advantages and disadvantages, which we are not going to discuss now, since that would take several posts. The Ionic project started out using Cordova as its application packaging tool as well, but more recently released its own, Capacitor. The project website explains the reason for this change and the advantages Capacitor brings.
One of the advantages of Capacitor is that it is fully compatible with Cordova, so we can use plugins that are available for the latter even if they do not exist for Capacitor or, even when an equivalent version exists, use the Cordova one if it suits us better for any reason. This, for example, allowed us to widen the range of plugins to review.
Precisely because Cordova has been around much longer, the number of plugins available, both official and community-developed, is much greater today than for Capacitor, which, despite already having quite a few official and community plugins, still has fewer due to its youth.
Geolocation plugins
Both Cordova and Capacitor have a default geolocation plugin based on the browser API. They are easy-to-use plugins that are available without adding anything extra and basically allow us to obtain the coordinates of the device's current position. Their API gives us two functions:
getCurrentPosition: Gives us the coordinates when we call it.
watchPosition: Allows us to connect to a watcher that returns the coordinates of the position every time there is a change, that is, every time we move.
Neither of them can obtain coordinates while the application is in the background, which was one of our application's requirements: when recording a long route, the user needs to be able to use the phone for other tasks at the same time, without the route recording stopping.
Luckily, thanks to the community we have more options available. In our case, we tried two other plugins that allow geolocation to work in the background.
BackgroundGeolocation by Capacitor-community: the most recent plugin with support for background geolocation, specific to Capacitor and with TypeScript support.
Cordova Plugin Background Geolocation: the classic plugin used in Cordova environments when background geolocation is needed. The last release was 3 years ago, although it still works fine on recent versions of Android.
In the tests we did with these two plugins, both achieved sufficient precision for the needs of the project, in most cases a few meters. The plugin options let us play with various parameters to balance, for example, the precision of the coordinates obtained against the battery life of the device.
Although the Capacitor plugin is more recent, the reality, at least in our case, is that we had problems with more modern phones, from Android 10 onwards, while it seemed to work without problems on older versions. The Cordova plugin, however, behaved well from the start on both older and newer phones. For this and other small details we ended up choosing the latter for our project.
Some problems detected along the way
Once the plugin was selected, and despite having done a fairly thorough analysis and several tests, we still had to solve some problems during development. The first has to do with the abstraction layers that some manufacturers add on top of Android to customize their devices. It is a fairly well-known topic and well documented on the Internet. The problems are usually related to battery management: some of these customizations aggressively try to save battery, and that is what was killing our background service. The Don’t kill my app! website documents most of these cases and explains how to configure the device so that battery-saving management does not cancel this type of process.
Another problem, not so easy to identify and solve, has to do with Android 10. Once the application was put in the background, points kept being collected, but after roughly 30 seconds they stopped. When the application was brought back to the foreground, the collection of points continued as if nothing had happened. It was clear this was not the same problem as the previous one, since the application was not being killed by battery management. The real issue is the new permission model introduced in Android 10, which lets the user choose whether a permission is granted always or only while the app is in use. This second option is the one that causes problems, since "only while the app is in use" actually means while the application is visible on the screen, in the foreground, and not while it is running in the background.
Our solution was to explicitly add the android:foregroundServiceType="location" option to the AndroidManifest.xml. In this way we indicate that our app is a location service and always needs access to GPS. The complete line in the AndroidManifest.xml looks like this:
<service android:enabled="true" android:exported="false" android:foregroundServiceType="location" android:name="com.marianhello.bgloc.service.LocationServiceImpl" />
You also need to explicitly grant ACCESS_BACKGROUND_LOCATION permission as detailed in the documentation.
Final considerations
When choosing a plugin, as with any software or library in general, the functionality it offers matters as much as its evolution and maintenance. In our case we bet on a plugin whose repository has not been active for some time, which implies it is not maintained and no future improvements are expected. On the other hand, the plugin that did meet these criteria did not offer the functionality we needed. Perhaps ours was an extreme case, and that is why we made that decision. In any case, our recommendation is to always take into account the activity of a solution's repository, checking that it is and will continue to be maintained over time, that there is a community around it and that it is a safe bet. When that is not possible, you have to weigh other factors to make the choice, and even consider building your own solution.
The Great Resignation, The Big Quit or The Big Reshuffle? The employment landscape has been given many names since early 2021, but they all have one thing in common. They a...
The GeoTools team is pleased to share the availability of GeoTools 27.1:
geotools-27.1-bin.zip
geotools-27.1-doc.zip
geotools-27.1-userguide.zip
geotools-27.1-project.zip
Improvements and fixes in this release
Bug:
GEOT-7182
TransformFeatureSource can lose paging information while transforming query
GEOT-7170
In the interest of getting back into the habit of releasing things again and to line up authoring expectations/experience for another upcoming MapGuide Open Source 4.0 preview release, here's a long overdue release of MapGuide Maestro. Here's a summary of what's changed since the last release (Wow! Has it really been 4 years since the last one?)
MapGuide Open Source 4.0 authoring support
This release of Maestro takes advantage of features/capabilities introduced in the upcoming MapGuide Open Source 4.0. For all these demonstrated features, we assume you have the current Preview 3 release of MGOS 4.0 installed or newer.
A new Layer Definition resource template based on the v4.0.0 schema is now available.
What features/capabilities does this offer? A new toggle option to determine if QUERYMAPFEATURES requests that hit this layer should include bounding box information or not. When bounding box data is not included, client-side viewer tools like zooming to selection will not work due to the lack of this information.
The basic label style editor finally has the missing support for editing advanced placement settings
MapGuide Open Source 4.0 introduced bulk coordinate transformation capabilities in the mapagent, and Maestro will now take advantage of this feature in places in the UI that require transforming coordinates, such as setting the map extents.
MapGuide Open Source 4.0 also removes the restriction that you cannot CREATERUNTIMEMAP or MgMap.Create() a Map Definition that links to an XYZ tileset, so the Map Definition editor will no longer throw this warning and block you from linking to an XYZ tile set definition if you are connected to a MGOS 4.0 instance.
Notable UI Changes
Your published WFS and WMS layers are now visible as top-level nodes in the Site Explorer! This allows for quick visual verification that you have correctly applied the appropriate WFS/WMS publishing metadata to your Feature Source or Layer Definition.
These new nodes aren't just for show. For the published WMS layers, there are context menu actions to navigate back to the source Layer Definition or, more excitingly, to preview the WMS layer through the new OpenLayers-driven content representation for WMS layers.
The local map viewer component (used for local map previews) now has a panel to show selection attributes
MgCooker is no more. Replaced with MgTileSeeder
The venerable MgCooker tool for pre-seeding tilesets in MapGuide has been removed. MgTileSeeder is now the full replacement for MgCooker and is capable of more things than MgCooker (like being a generic XYZ tileset seeder). All existing Maestro UI integrations with the MgCooker tool have also been removed as a result.
Maestro API package is now SourceLink-enabled
If you use the Maestro API to build your own MapGuide client applications, the Maestro API nuget package is now built with SourceLink support. This means that when you go to the definition of any class/type in the Maestro API, you will now see the full source code of that class/type instead of an outline inferred from .net metadata.
Similarly, when debugging you can now step into the source code for any method in the Maestro API!
To take advantage of SourceLink requires some small Visual Studio settings changes, which are covered in detail here.
Maestro is now a self-contained .net 6.0 windows application
Aside from being able to take advantage of the new capabilities and performance improvements of .net 6.0, the other upside of this move is that you no longer have to download/install the .net Framework before installing Maestro. Being a self-contained application means that the support files needed to run a .net 6.0 application are now included with the installer itself.
On the 26th July it was announced that the United Kingdom would be hosting 2023’s Eurovision Song Contest. No fewer than 13 cities are expected to bid for the honor of host...
I continue to feel fine and finished a hard hill interval workout and some
tempo running in week 18. Saturday was a travel day but I did get in a long run
out here on Hatteras Island yesterday. The training volume for the week
is a little lower than I planned, but that's okay. I'm happy to be at the
seaside with my family.
A new JVM Console tab has been added to the server status page, allowing summaries of memory use and active threads to be reviewed and downloaded.
The Circular Economy is a cornerstone of sustainability. It’s defined as a model of production and consumption where the lifecycle of materials and products is extended. Th...
We talked about Mergin Maps in the MapScaping podcast: QGIS Offline And In The Field
Peter Petrik was a guest in the QGIS Offline And In The Field episode. He talked with Daniel O’Donohue about collecting spatial data in the field.
Mergin Maps is a field data collection app based on QGIS. It makes field work easy with its simple interface and cloud-based sync. Available on Android, iOS and Windows.
Geocursos will hold a workshop showing how to create your database, import your shapefiles and, at the end, publish them on a map server.
The event will be 100% online and free of charge, and will take place on 29, 30 and 31 August and 1 September at 8 pm (Brasília time).
After 3 weeks of little training, I'm back at it in week 17. 5 trail runs with
plenty of hills.
8 hours, 35 minutes
38 miles
7,493 ft D+
I spent 2:30 riding 30 miles on my bike on top of that. About half of that was
riding to and from Pineridge Open Space, but I also did a longer ride on Saturday
instead of running. I'm going to try to run 55 miles next week and 65 in week 19. That's
down quite a bit from my peak volume in the past two years.
An important concept in spatial data modelling is that of a coverage. A coverage models a two-dimensional region in which every point has a value out of a range (which may be defined over one or a set of attributes). Coverages can be represented in both of the main physical spatial data models: raster and vector. In the raster data model a coverage is represented by a grid of cells with varying values. In the vector data model a coverage is a set of non-overlapping polygons (which usually, but not always, cover a contiguous area).
This post is about the vector data coverage model, which is termed (naturally) a polygonal coverage. These are used to model regions which are occupied by discrete sub-regions with varying sets of attribute values. The sub-regions are modelled by simple polygons. The coverage may contain gaps between polygons, and may cover multiple disjoint areas. The essential characteristics are:
polygon interiors do not overlap
the common boundary of adjacent polygons has the same set of vertices in both polygons.
There are many types of data which are naturally modelled by polygonal coverages. Classic examples include:
Man-made boundaries
parcel fabrics
political jurisdictions
Natural boundaries
vegetation cover
land use
A polygonal coverage of regions of France
Topological and Discrete Polygonal Coverages
There are two ways to represent polygonal coverages: as a topological data structure, or as a set of discrete polygons.
A coverage topology consists of linked faces, edges and nodes. The edges between two nodes form the shared boundary between two faces. The coverage polygons can be reconstituted from the edges delimiting each face.
The discrete polygon representation is simpler, and aligns better with the OGC Simple Features model. It is simply a collection of polygons which satisfy the coverage validity criteria given above.
Most common spatial data formats support only a discrete polygon model, and many coverage datasets are provided in this form. However, the lack of inherent topology means that datasets must be carefully constructed to ensure they have valid coverage topology. In fact, many available datasets contain coverage invalidities. A current focus of JTS development is to provide algorithms to detect this situation and provide the locations where the polygons fail to form a valid coverage.
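Until that dedicated functionality lands, a crude check for the first condition (no overlapping polygon interiors) can already be scripted against GEOS via Shapely. This is only a sketch with toy polygons, not the upcoming JTS/GEOS coverage API:
from itertools import combinations
from shapely.geometry import Polygon

polygons = {
    "A": Polygon([(0, 0), (2, 0), (2, 2), (0, 2)]),
    "B": Polygon([(2, 0), (4, 0), (4, 2), (2, 2)]),      # shares an edge with A: fine
    "C": Polygon([(1.5, 1), (3, 1), (3, 3), (1.5, 3)]),  # overlaps A and B: invalid
}

for (name_a, a), (name_b, b) in combinations(polygons.items(), 2):
    # DE-9IM pattern "2********": the interiors intersect in a two-dimensional area
    if a.relate_pattern(b, "2********"):
        print(f"Not a valid coverage: interiors of {name_a} and {name_b} overlap")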
Polygonal Coverage Operations
Operations which can be performed on polygonal coverages include:
Validation - check that a set of discrete polygons forms a valid coverage
Gap Detection - check if a polygonal coverage contains narrow gaps (using a given distance tolerance)
Cleaning - fix errors such as gaps, overlaps and slivers in a polygonal dataset to ensure that it forms a clean, valid coverage
Precision Reduction - reduce precision of polygon coordinates, ensuring coverage topology is preserved
Union - merge all or portions of the coverage polygons into a single polygon (or multipolygon, if the input contains disjoint regions)
Overlay - compute the intersection of two coverages, producing a coverage of resultant polygons
Implementing polygonal coverage operations is a current focus for development in the JTS Topology Suite. Since most operations require a valid coverage as input, the first goal is to provide Coverage Validation. Cleaning and Simplification are priority targets as well. Coverage Union is already available, as is Overlay (in a slightly sub-optimal way). In addition, a Topology data structure will be provided to support the edge-node representation. (Yes, the Topology Suite will provide topology at last!). Stay tuned for further blog posts as functionality is rolled out.
As usual, the coverage algorithms developed in JTS will be ported to GEOS, and will thus be available to downstream projects like PostGIS.
I called the cable company on Tuesday to cancel and was told that they were going to
refund me $25 in the process of squaring up my account. Sweet! Then today I got an
email announcing my next month's bill. The cable company is terrible at their
business. I wonder how many times I will have to cancel my account before it
sticks?
This is a great upgrade. The house my family was renting in France got fiber in
2017. I'm just saying.
I had some health troubles during weeks 14-16 and got very little training done
while feeling generally crappy and worried. On June 16 (end of week 13) I went
for a 20 mile run in the hills and struggled on the climbs. I was unusually out
of breath and after I finished I was a bit dizzy. The next day I was mildly
feverish and I continued to have an elevated temperature and noticeable
lethargy during my run on Tuesday, the 28th. I took a COVID test and it was
negative. On the 29th I went out for an interval workout and quit after my
warmup. I felt dizzy, achy, without energy, and was seeing a heart rate on my
Garmin watch that conflicted with what I was feeling: anomalously low at times.
I took another COVID test, again negative. Friday, July 1, I had a virtual
visit with my doctor, who recommended a PCR test for flu and COVID and some
blood work. The PCR test was negative and the lab report said I was normal on
all counts. About this time I became aware of twitching in my chest, which
I noticed most when I was lying down before falling asleep, or in the middle
of the night. At first I chalked this up to anxiety, but after several days of
no relief and some very confusing heart rate measurements on my run on July
5 (after which I joked "Getting confusing HR signals from my watch. Either it
or I am about to die.") I got a live, in-person visit with my doctor and an ECG,
which revealed premature atrial and ventricular complexes (PACs and PVCs). My
heart really was malfunctioning.
I got fitted with a wearable ECG to collect more data and had a generally
crappy week of heart twitches, poor sleep, anxiety, and no running while
waiting to get an echo scan of my heart and a consultation with a cardiologist.
In week 16 (starting July 11), I began to feel a little better. I found that
I could hike and run at a super easy pace and not feel terrible, so I began to
treat it like an ordinary recovery week (every 4th week of my training blocks
is a recovery week). On July 14 I drove to the UC Health hospital in Greeley,
which has some extra capacity, and got an echo scan (sonogram) of my heart. The
initial assessment said that I had an enlarged right ventricle. That didn't
sound good. My heart palpitations continued to subside, but I still had five
stressful days of waiting before my cardiology appointment on Tuesday of this
week (week 17). The cardiologist disagreed with the initial assessment of my
echo scan and didn't recommend any other scans. I don't have an enlarged
ventricle. Other than the PACs and PVCs, which continue to diminish, my heart
is in good shape. I had a treadmill stress test on Friday and passed. We
measured only a few PVCs during the test.
This week I started running a little harder and have been feeling fine. It
seems like I only had a temporary episode of premature contractions that were
likely triggered by an unknown virus. My watch's measurement of my heart rate
is back to normal, too. Neither it, nor I am going to die soon.
Here are the numbers for weeks 14-16.
Week 14:
3 hours, 11 minutes
15.1 miles
3,428 ft D+
Week 15:
1 hour, 57 minutes
9.6 miles
1,056 ft D+
Week 16:
3 hours, 45 minutes
18.1 miles
1,631 ft D+
I had been aiming for 120 miles of running and 20,000 feet of climbing in weeks
14 and 15 and got nowhere near that. I missed two big weeks of training, but
I'm trying not to sweat about that. I've had enough worrying in the past three
weeks, I don't need to add worry about training to my problems. I'm still on
track to run the Superior 50 in
7 weeks.
PostgreSQL is one of the most popular open source relational database systems available with + 30 years in development, 1,500,000 lines of submitted code and more than 650 ...
One of the defining elements of the COVID epidemic has been how quickly the situation seems to change. One moment things feel positive with the number of infections slowing...
Introduction
If we asked you to picture a UK Retail Centre, what would come to mind? A bustling city centre boulevard or a quaint village high street? A sprawling out-of-to...
This blog post is the third part of a multi-post series. Introduction The first and second parts of this blog post series introduced the Area Monitoring System (AMS) and the EO-WIDGET project. Furthermore, it covered user interface components, APIs (Application Programming Interface) and backend ser ...
When configuring the vector tiles in QGIS, we specify the desired tile and style URLs, for example:
For example, this is the “gis” style:
And this is the “basic” style:
The second vector tile source I want to mention is basemap.at. It has been around for a while, however, early versions suffered from a couple of issues that have now been resolved.
The basemap.at project provides extensive documentation on how to use the dataset in QGIS and other GIS, including manuals and sample projects:
Here’s the basic configuration: make sure to set the max zoom level to 16, otherwise, the map will not be rendered when you zoom in too far.
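The same source can be added from the QGIS Python console. A rough sketch follows; the tile URL is an assumption, so copy the exact one from the basemap.at documentation:
from qgis.core import QgsProject, QgsVectorTileLayer

uri = (
    "type=xyz"
    "&url=https://mapsneu.wien.gv.at/basemapv/bmapv/3857/tile/{z}/{y}/{x}.pbf"  # assumed URL, check the docs
    "&zmin=0&zmax=16"  # max zoom 16, as noted above
)
layer = QgsVectorTileLayer(uri, "basemap.at vector tiles")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)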
The level of detail is pretty impressive, even if it cannot quite keep up with the basemap raster tiles:
Vector tile details at Resselpark, Vienna
Raster basemap details at Resselpark, Vienna
Electric vehicles are becoming one of the true success stories in sustainable transport, both environmentally and commercially. Their popularity has grown exponentially, wi...
As we approach the midpoint of the year, it is time for us to share with you some of the exciting new enhancements we have developed for the most recent release of the CART...
Sub-basin delimitation process using QGIS (This post)
In this article we are going to explain the process to delimit the river sub-basins of a certain region using QGIS and a DEM. In this case the region is the entire Republic of Mozambique.
The starting data we need is a mosaic of Digital Elevation Models (DEMs) of the entire area and the watershed layer. As we commented in the previous articles of the series, we will use the NASADEM HGT mosaics (here we explain how to download them) and the basin layer elaborated by GEAMA.
Create single DEM (Mosaic)
The first step is the union of the DEM mosaics to create a single DEM for all of Mozambique.
QGIS provides us with different options for joining raster files. In this case we are going to use the “Merge” tool. To do this, select “Raster” from the menu, expand the “Miscellaneous” options and click “Merge”.
In the pop-up window we add all the input layers, that is, all the DEMs downloaded from the web of the area in which we are going to work. We give a path and a name for the output file, check the option “Open the output file after running the algorithm” and execute.
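The same step can be run from the QGIS Python console; a minimal sketch in which the file names are placeholders:
import processing

processing.run(
    "gdal:merge",
    {
        "INPUT": ["nasadem_tile_1.hgt", "nasadem_tile_2.hgt"],  # all downloaded DEM tiles
        "OUTPUT": "dem_mozambique.tif",
    },
)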
DEM reprojection
The merged DEM is in the WGS 84 Geographic Coordinate System (EPSG:4326). We reproject the DEM to a Projected Coordinate System. In our case to WGS 84 / UTM zone 37S (EPSG:32737). To reproject the DEM from the menu we select “Raster”, then “Projections” and “Warp (reproject)”.
In the input layer we select the combined DEM and indicate both the source CRS and the target CRS. We also indicate a path and a name for the output file and run the algorithm.
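The equivalent console call (paths are placeholders):
import processing

processing.run(
    "gdal:warpreproject",
    {
        "INPUT": "dem_mozambique.tif",
        "SOURCE_CRS": "EPSG:4326",
        "TARGET_CRS": "EPSG:32737",   # WGS 84 / UTM zone 37S
        "OUTPUT": "dem_mozambique_utm37s.tif",
    },
)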
DEM clipping
To reduce the computational load of the algorithms that we will use in the following steps, we will now clip the DEM for each river basin. Another reason for making these clips is the difference in size and shape of the basins that we are going to analyze. Better results are obtained by analyzing each basin separately, adjusting the parameters of the algorithms to each particular case, instead of using the same parameters for all of Mozambique.
We create the masks for each basin from the basins layer. In the following image you can see the DEM in the background, in white the administrative limit of Mozambique and in blue the mask of the Lurio river basin.
Next, we clip the DEM with each created mask. To clip the DEM, in the main menu we select “Raster”, then “Extraction” and “Clip Raster by Mask Layer”.
In the pop-up window we select the reprojected DEM as the input layer and the basin mask as the mask layer. Also, so that the clip fits exactly to the edge of the basin and no black area is visible, we specify a “no data” value, for example -9999. Finally, we mark the following two options:
Match the extent of the clipped raster to the extent of the mask layer.
Keep resolution of input raster.
We specify the path and name for the output file and run the algorithm.
Perform this process for all basins. Batch processing can be used for this instead of doing it one by one.
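As an alternative to the batch dialog, a small loop in the Python console can clip the DEM with every basin mask. This is a sketch with hypothetical basin names and file paths:
import processing

basins = ["lurio", "zambeze", "limpopo"]  # hypothetical basin names
for name in basins:
    processing.run(
        "gdal:cliprasterbymasklayer",
        {
            "INPUT": "dem_mozambique_utm37s.tif",
            "MASK": f"mask_{name}.gpkg",
            "NODATA": -9999,
            "CROP_TO_CUTLINE": True,   # match the extent of the mask layer
            "KEEP_RESOLUTION": True,   # keep the resolution of the input raster
            "OUTPUT": f"dem_{name}.tif",
        },
    )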
Filling depressions
Although the latest versions of the DEMs are already processed with different corrections and gap filling, they may still contain artifacts such as depressions. Artifacts must be removed before using a DEM in hydrologic analysis. There are several algorithms for filling gaps (GDAL, GRASS, SAGA, …). The incorporation of the GRASS hydrological toolset in QGIS is very useful. We will therefore use the GRASS algorithm both for this step and for others that we will see later.
To fill depressions we will use the Processing Toolbox. If it is not visible on the right side of the QGIS window, it can be enabled from the main menu by selecting “Processing” and then “Toolbox”.
In the toolbox, find or type the name of the following algorithm: r.fill.dir. In the dialog window, select the clipped DEM as the input (Elevation) and uncheck “Flow direction” and “Problem areas”, since we are only interested in obtaining the DEM without depressions. Indicate the path and name for the output file and execute.
This geoprocess can take a long time to finish depending on the size of the raster and the power of the computer. Perform this geoprocess for all basins. Batch processing can be used for this instead of doing it one by one.
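A scripted version of this step for a single basin follows. It is only a sketch; the GRASS parameter keys may differ slightly in your QGIS version and can be confirmed with processing.algorithmHelp("grass7:r.fill.dir").
import processing

processing.run(
    "grass7:r.fill.dir",
    {
        "input": "dem_lurio.tif",
        "output": "dem_lurio_filled.tif",   # depressionless DEM, the only output we keep
        "direction": "TEMPORARY_OUTPUT",
        "areas": "TEMPORARY_OUTPUT",
    },
)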
Drainage direction, accumulated flow, stream segments and sub-basins
Now that we have a corrected DEM, the next step is to calculate the drainage directions, accumulated flow, stream segments and sub-basins. To do this we look for the following algorithm in the toolbox: r.watershed
In the dialog window we select the corrected raster of the basin and establish the most appropriate “Minimum size of exterior watershed basin” for each case.
Because of the wide difference between the size of some basins and others, it is necessary to play with the “Minimum size of exterior watershed basin” to obtain the optimal results for each of them.
A reference value can be obtained from the characteristics of the raster: with a pixel size of 30x30 m (900 m² per cell), a minimum sub-basin area of about 300 km² corresponds to roughly 333,333 cells, which we round down to 300,000 cells for the “Minimum size of exterior watershed basin”.
But in many cases we decided to set 100,000 cells or fewer to solve problems detected in some flat areas or river mouths. Also check the following two options:
Enable Single Flow Direction (D8) flow
Allow only horizontal and vertical flow of water
Also check the output boxes shown in the following image, indicate the paths and names of the output files, and execute the algorithm.
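For reference, the same run scripted for one basin. This is a hedged sketch; the parameter keys follow the GRASS option names and should be checked with processing.algorithmHelp("grass7:r.watershed").
import processing

processing.run(
    "grass7:r.watershed",
    {
        "elevation": "dem_lurio_filled.tif",
        "threshold": 100000,                      # minimum size of exterior watershed basin, in cells
        "-s": True,                               # Single Flow Direction (D8)
        "-4": True,                               # only horizontal and vertical flow
        "accumulation": "lurio_accumulation.tif",
        "drainage": "lurio_drainage.tif",
        "basin": "lurio_subbasins.tif",
        "stream": "lurio_streams.tif",
    },
)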
We thus obtain for each basin:
Accumulated flow
Drainage direction
Sub-basins
Stream segments
The only layer we are going to work with is the sub-basin layer; the rest are calculated to serve as support for the small manual adjustments that we will make later.
Transform to vector layer
Once we have the sub-basins layer, the next step is to transform this raster layer into a vector layer. To do this we are going to look for the following geoprocess in the toolbox: r.to.vect. In the window, select the sub-basins layer as the input raster layer.
Also select these options and run:
Feature type: area
Smooth corners of area features
v.out.ogr output type: area
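Scripted, the call looks roughly like this; the index for the “area” feature type should be verified with processing.algorithmHelp("grass7:r.to.vect"):
import processing

processing.run(
    "grass7:r.to.vect",
    {
        "input": "lurio_subbasins.tif",
        "type": 2,          # feature type "area" (verify the enum order in your QGIS version)
        "-s": True,         # smooth corners of area features
        "output": "lurio_subbasins.gpkg",
    },
)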
Review and manual adjustments
Once we have the vector layer of sub-basins we have to review the result, join the necessary polygons to have only the sub-basins that meet the agreed criteria mentioned in the previous article and make small manual corrections where necessary.
Merging polygons is also necessary because the algorithm creates sub-basins for tributaries that do not flow directly into the main river, that is, tributaries of a tributary of the main river. Merging those polygons solves this problem.
These algorithms work well in sloping areas, but in flat areas they can give undesirable results, and it is necessary to review them. For example, in coastal areas with very flat basins, a more detailed review and adjustments to sub-basin boundaries were required based on other information such as drainage directions, rivers, etc.
Smoothing and simplification
When vectorizing a raster, the edges of the polygons follow the shape of the pixels; for this reason it is necessary to smooth them.
This leads to another problem: smoothing creates many vertices, which increases the layer size considerably. To avoid this, we can also apply a simplification after smoothing.
With appropriate parameters, smoothing and simplification can be carried out with practically no loss of definition in the delimitation of the sub-basins. In the image below you can see the edges of the sub-basins before and after (in red) smoothing and simplifying.
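Both operations are available as processing algorithms. This sketch uses illustrative parameters only; the right offset and tolerance depend on your data.
import processing

smoothed = processing.run(
    "native:smoothgeometry",
    {
        "INPUT": "lurio_subbasins.gpkg",
        "ITERATIONS": 3,
        "OFFSET": 0.25,
        "OUTPUT": "TEMPORARY_OUTPUT",
    },
)["OUTPUT"]

processing.run(
    "native:simplifygeometries",
    {
        "INPUT": smoothed,
        "METHOD": 0,          # distance-based (Douglas-Peucker)
        "TOLERANCE": 30,      # roughly one pixel (30 m) in this case
        "OUTPUT": "lurio_subbasins_smooth.gpkg",
    },
)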
Sub-basins merge
Since we have created a sub-basin layer for each basin, we merge the vector layers to create a single sub-basin layer for the entire country, using the “Merge vector layers” algorithm (mergevectorlayers).
In the dialog window we select all the sub-basin layers as input layers, specify the Coordinate Reference System, the path and the name of the output file, and execute.
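The console equivalent, with placeholder layer names:
import processing

processing.run(
    "native:mergevectorlayers",
    {
        "LAYERS": ["lurio_subbasins_smooth.gpkg", "zambeze_subbasins_smooth.gpkg"],
        "CRS": "EPSG:32737",
        "OUTPUT": "mozambique_subbasins.gpkg",
    },
)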
Geometric and topological checks and corrections
Once we have the desired sub-basin layer, overlaps, duplicates, gaps, invalid geometries, etc. must be checked and corrected.
This can be done with plugins (via “Manage and Install Plugins”).
In addition to the geometry of the sub-basins, we must also add the desired attributes and order the attribute table as desired. Some examples of useful attributes in a sub-basin layer are the name of the river, the area, length of the main river or the basin to which it belongs.
Final Results
Following these steps we obtain our layer of sub-basins.
With this article we close our series on the delimitation of sub-basins. Did you find them interesting? Do you use other techniques? Need help? Get in touch with us or write to us on Twitter.