Welcome to Planet OSGeo

May 11, 2021

People often ask whether the Mergin service and the Input app can deal with multiple team members doing edits at the same time.

They happily find out that the answer is yes - it is one of the core features that greatly simplifies the lives of our users. In this post we will shed some light on how things work behind the scenes.

Mergin promo

Getting started

Let’s think of a simple project directory that needs to be synchronised between multiple desktop or mobile users, containing just two files:

  • a QGIS project file my-project.qgz that sets up map layers, styling, …
  • a GeoPackage file my-data.gpkg containing all GIS data

Our sample GIS data will contain a tree survey table with the location, species and age of various trees:

tree survey table

When users edit data in my-data.gpkg, traditional cloud storage solutions (such as Dropbox, Google Drive, Microsoft OneDrive and others) simply copy the modified files there. They do not understand the file content though - so if two people modify the same file, they have no way of knowing how to merge the changes together. In the worst case, when two versions of the same file are uploaded, they keep just the version which was synchronised last. Or, slightly better, they resort to creating conflicting copies which need to be manually merged later. As one can imagine, merging and consolidating modifications from multiple GeoPackages back into one copy is a slow, tedious and error-prone job.

The Mergin service has been designed to understand spatial data, especially GeoPackages, which are becoming the most popular format to store vector and attribute data. This is thanks to the open source geodiff library that we have developed while working on Mergin.

Synchronising data using “diffs”

The first trick is that synchronisation of GeoPackage files between Mergin server and clients (Input app, QGIS or other apps) only transfers actual changes in tables (“diffs” in technical jargon).

Our Mergin project with the tree survey has been prepared and downloaded by users. Jack did a field survey and he added or updated some rows in the survey table (changes highlighted in yellow and green):

Jack table

After pressing the sync button, his changes are detected and uploaded to Mergin, encoded as a list of changes to the survey table:

Jack diff
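
Conceptually, such a changeset is just a compact list of per-row operations. As a purely illustrative Python sketch (geodiff's actual changeset format is binary, and these table and column values are made up), Jack's upload could be rendered like this:

# Hypothetical, human-readable rendering of Jack's changeset
changeset = [
    {"table": "survey", "op": "update", "fid": 2,
     "changes": {"age": {"old": 5, "new": 6}}},
    {"table": "survey", "op": "insert", "fid": 4,
     "values": {"species": "oak", "age": 2}},
]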

Another user, Jill, also downloaded the tree survey project to her mobile device prior to Jack’s changes. When Jill synchronises the project to get the latest version, the changes uploaded by Jack are downloaded and applied to her local copy of the project, so she sees the same data as Jack.

At this point, the advantage of uploading/downloading only changes in tables may not seem obvious besides saving some network bandwidth… Read on to learn how this is used to support multi-user editing.

Merging changes from multiple users

So far we have assumed that Jill does not have any pending changes to sync, so that was easy. Now let’s assume that Jill has also made some changes on her device:

Jill table

Here comes the trickier part - how do we merge changes from Jack and Jill back into a single table:

Merging Jack and Jill table

In Mergin, cases that require merging changes from multiple users are handled by the “rebase” operation, a concept we have borrowed from version control systems for source code.

Let’s assume that Jack has synchronised his changes first. Later, when Jill synchronises her changes, a couple of things happen on her device before uploading the changes: Jill’s changes will be temporarily undone, Jack’s changes get applied, and finally Jill’s changes are re-applied after being rebased on top of Jack’s changes.

What does it mean to rebase someone’s changes? There are a couple of possible edit conflicts that could happen between rows of a database table with matching IDs (insert/insert, update/delete, delete/delete, update/update). These edit conflicts need to be resolved before the two sets of changes can be combined into a single consistent table.

In our example, both Jack and Jill have added a row with ID = 4. This is not allowed, and therefore Jill’s new row ID will get changed to ID = 5 (any unused ID would do). As a result, here’s how the merged table will look at the end - combining changes of both users:

Final table
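
The insert/insert remapping can be sketched in a few lines of Python. This is purely illustrative of the idea; Mergin's geodiff library does the real work at the changeset level:

def rebase_inserts(their_inserts, my_inserts):
    """Remap IDs of my inserted rows that collide with rows
    already inserted upstream (the insert/insert conflict)."""
    used_ids = {row["fid"] for row in their_inserts}
    next_id = max(used_ids, default=0) + 1
    rebased = []
    for row in my_inserts:
        if row["fid"] in used_ids:
            row = {**row, "fid": next_id}  # any unused ID would do
            next_id += 1
        used_ids.add(row["fid"])
        rebased.append(row)
    return rebased

# Jack's sync arrived first, so Jill's new row 4 is remapped to 5
jack = [{"fid": 4, "species": "oak"}]
jill = [{"fid": 4, "species": "lime"}]
print(rebase_inserts(jack, jill))  # [{'fid': 5, 'species': 'lime'}]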

If both Jack and Jill modified the same row (the update/update edit conflict), we can only accept one edit automatically. The conflicting edit of the other user is written to a special conflict file and uploaded to Mergin, so no data gets lost, and the conflict can be later inspected by the project admin. Fortunately, this kind of conflict does not happen often if the survey work is well planned to avoid users simultaneously modifying the same features within the GeoPackage data.

What if conflict files appear

There are some cases when automatic merging is not supported. In those cases, Mergin is unable to find out details about changes in the data file(s) and has to resort to creating a conflicting copy, which gets uploaded to the Mergin project alongside the original data file(s). In particular, problems may appear when:

  • A format other than GeoPackage is used for data storage (e.g. shapefiles)
  • Database or table structure is changed (e.g. adding new columns or new tables)

In the future, these limitations may be removed, but at this point it is good to keep them in mind.

If you plan to change the structure of the survey tables and the project is already being used on multiple devices, it may be a good idea to create a new Mergin project with the modified database structure and instruct users to switch to the new project. Otherwise conflict files may keep popping up as long as some users have an older version of the project, adding more manual work to collate data.

You may also like...

Input, a field data collection app based on QGIS. Input makes field work easy with its simple interface and cloud-based sync. Available on Android and iOS.

May 11, 2021 05:00 AM

May 10, 2021

We are happy to announce that OTB version 7.3.0 has been released! Ready-to-use binary packages are available on the package page of the website:

  • OTB-7.3.0-Darwin64.run (Mac OS)
  • OTB-7.3.0-Linux64.run (Linux)
  • OTB-7.3.0-Win32.zip (Windows 32 bits)
  • OTB-7.3.0-Win64.zip (Windows 64 bits)

It is also possible to checkout the branch with git. The documentation for OTB 7.3.0 can […]

by Cédric Traizet at May 10, 2021 03:08 PM

May 06, 2021

Yesterday I talked about all-things-GDAL (or at least all the things that fit in 30 minutes) with MapScaping Podcast’s Daniel O’Donohue.

In the same way that Linux is the under-appreciated substrate of modern computing, GDAL is the under-appreciated substrate of modern geospatial data management. If the compute is running in the cloud, it’s probably running on Linux; if the geospatial data are flowing through the cloud, they’re probably flowing through GDAL.

GDAL

At the same time as it has risen to being the number one spatial data processing tool in the world (by volume anyways), GDAL has maintained an economic support model from the last century. One maintainer (currently Even Rouault) earns a living with new feature development, and does all the work of code quality, integration, testing, documentation, and promotion as a loss leader. This model burns out maintainers, and it doesn’t ask the organizations that gain the most value from GDAL (the ones pushing terabytes of pixels through the cloud) to contribute commensurate with the value they receive.

With the new GDAL sponsor model, the organizations who receive the most value are stepping up to do their share. If your organization uses GDAL, and especially if it uses it in volume, consider joining the other sponsors in making sure GDAL remains high quality and cutting edge by sponsoring.

Thanks Daniel, for having me on!

May 06, 2021 08:00 AM

May 05, 2021

Don’t miss the upcoming conference organised by the GeoLIBERO research network: “Uso de las Tecnologías Libres de Información Geográficas en Educación Básica – experiencias iberoamericanas” (Use of Free Geographic Information Technologies in Basic Education – Ibero-American experiences), on 12 and 13 May 2021.
It will present various experiences from the wonderful project called gvSIG Batoví, which has brought open-source GIS into classrooms. What they have achieved is truly worth seeing. For this event, the organisers have brought together several of the teams that have taken part in the gvSIG Batoví project over the years (more than one of them has since left basic education behind).
The programme, where you can register for the talks you wish to attend, can be consulted here:
http://www.gvsig.com/es/eventos/tec-libres-ig-iberoamericanas-2021/programa

by Alvaro at May 05, 2021 08:24 AM

May 04, 2021

So, this happened:

Tweet about Indexes

Basically a GeoDjango user posted some workarounds for poor performance in spatial queries; the original query was truly awful and the workaround not a lot better, so I snarked, and the GeoDjango maintainer reacted in kind.

Sometimes a guy just wants to be a prick on the internet, you know? But still, I did raise the red flag of snarkiness, so it seems right and proper to pay the fine.

I come to this not knowing entirely the contract GeoDjango has with its users in terms of “hiding the scary stuff”, but I will assume for these purposes that:

  • Data can be in any coordinate system.
  • Data can use geometry or geography column types.
  • The system has freedom to create indexes as necessary.

So the system has to cover up a lot of variability in inputs to hide the scary stuff.

We’ll assume a table name of the_table, a geometry column name of geom, and a geography column name of geog.

Searching Geography

This is the easiest, since geography queries conform to the kind of patterns new users expect: the coordinates are in lon/lat but the distances are provided/returned in metres.

Hopefully the column has been spatially indexed. You can check in the system tables.

SELECT * 
FROM pg_indexes 
WHERE tablename = 'the_table';

Yes, there are more exact ways to query the system tables for this information; I give the simple example for space reasons.

If it has not been indexed, you can make a geography index like this:

CREATE INDEX the_table_geog_x 
  ON the_table
  USING GIST (geog);

And then a “buffered” query, that finds all objects within a radius of an input geometry (any geometry, though only a point is shown here), looks like this:

SELECT *
FROM the_table
WHERE ST_DWithin(
    the_table.geog,
    ST_SetSRID(ST_MakePoint(%lon, %lat), 4326),
    %radius
    );

Note that there is no “buffering” going on here! A radius search is logically equivalent and does not pay the cost of building up buffers, which is an expensive operation.

Also note that, logically, ST_DWithin(A, B, R) is the same as ST_Distance(A, B) < R, but in execution the former can leverage a spatial index (if there is one) while the latter cannot.
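
For completeness, here is how those placeholders might be bound from application code. A minimal Python sketch assuming psycopg2 and the the_table/geog names used above (any driver with parameter binding works the same way):

import psycopg2

conn = psycopg2.connect("dbname=gisdb")  # hypothetical connection string
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT *
        FROM the_table
        WHERE ST_DWithin(
            the_table.geog,
            ST_SetSRID(ST_MakePoint(%(lon)s, %(lat)s), 4326)::geography,
            %(radius)s)
        """,
        {"lon": -123.1, "lat": 49.25, "radius": 1000},  # radius in metres
    )
    rows = cur.fetchall()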

Indexable Functions

Since I mention that ST_DWithin() is indexable, I should list the functions that can make use of a spatial index. The most commonly used are:

  • ST_Intersects()
  • ST_DWithin()
  • ST_Contains()
  • ST_ContainsProperly()
  • ST_Within()
  • ST_Covers()
  • ST_CoveredBy()
  • ST_Crosses()
  • ST_Overlaps()
  • ST_Touches()
  • ST_Equals()

And for a bonus there are also a few operators that access spatial indexes.

  • geom_a && geom_b returns true if the bounding box of geom_a intersects the bounding box of geom_b in 2D space.
  • geom_a &&& geom_b returns true if the bounding box of geom_a intersects the bounding box of geom_b in ND space (an ND index is required for this to be index-assisted).

Searching Planar Geometry

If the data are planar, then spatial searching should be relatively easy, even if the input geometry is not in the same coordinate system.

First, is your data planar? Here’s a quick-and-dirty query that returns true for geographic data and false for planar.

SELECT srs.srtext ~ '^GEOGCS' 
FROM spatial_ref_sys srs
JOIN geometry_columns gc
ON srs.srid = gc.srid
WHERE gc.f_table_name = 'the_table';

Again, do you have a spatial index on your geometry column? Check!

CREATE INDEX the_table_geom_x 
  ON the_table
  USING GIST (geom);

Now, assuming query coordinates in the same planar projection as the data, a fast query will look like this:

SELECT *
FROM the_table
WHERE ST_DWithin(
    geom,
    ST_SetSRID(ST_MakePoint(%x, %y), %srid),
    %radius
    );

But what if users are feeding in queries in geographic coordinates? No problem, just convert them to planar before querying:

SELECT *
FROM the_table
WHERE ST_DWithin(
    geom,
    ST_Transform(
        ST_SetSRID(ST_MakePoint(%lon, %lat), 4326),
        %srid),
    %radius
    );

Note that here we are transforming the geography query to planar, not transforming the planar column to geographic!

Why? There’s only one query object, and there are potentially thousands of rows of table data. And also, the spatial index has been built on the planar data: it cannot assist the query unless the query is in the same projection.

Searching Lon/Lat Geometry

This is the hard one. It is quite common for users to load geographic data into the “geometry” column type. So the database understands them as planar (that’s what the geometry column type is for!) while their units (longitude and latitude) are in fact angular.

There are benefits to staying in the geometry column type:

  • There are far more functions native to geometry, so you can avoid a lot of casting.
  • If you are mostly working with planar queries you can get better performance from 2D functions.

However, there’s a huge downside: questions that expect metric answers or metric parameters can’t be answered. ST_Distance() between two geometry objects with lon/lat coordinates will return an answer in… “degrees”? Not really an answer that makes any sense, as cartesian math on angular coordinates returns garbage.
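
A quick back-of-the-envelope illustration of why “degrees” are not a distance. This Python sketch compares the naive Cartesian number with a spherical great-circle distance (PostGIS geography actually computes on a spheroid, so its figures differ slightly):

import math

# Two points one degree of longitude apart, at latitude 60 north
lon1, lat1 = 0.0, 60.0
lon2, lat2 = 1.0, 60.0

# What Cartesian math on lon/lat effectively gives: 1.0 "degrees"
naive_degrees = math.hypot(lon2 - lon1, lat2 - lat1)

# Haversine great-circle distance in metres on a sphere
R = 6371000.0
dphi = math.radians(lat2 - lat1)
dlam = math.radians(lon2 - lon1)
a = (math.sin(dphi / 2) ** 2
     + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
     * math.sin(dlam / 2) ** 2)
metres = 2 * R * math.asin(math.sqrt(a))

print(naive_degrees)  # 1.0 -- not a distance in any usable unit
print(round(metres))  # ~55597; the same one degree is ~111195 m at the equator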

So, how to get around this conundrum? First, the system has to recognize the conundrum!

  • Is the column type “geometry”?
  • Is the SRID a long/lat coordinate system? (See test query above.)

Both yes? Ok, this is what we do.

First, create a functional index on the geography version of the geometry data. (Since you’re here, make a standard geometry index too, for other purposes.)

CREATE INDEX the_table_geom_geog_x
ON the_table
USING GIST (geography(geom));

CREATE INDEX the_table_geom
ON the_table
USING GIST (geom);

Now we have an index that understands geographic coordinates!

All we need now is a way to query the table that uses that index efficiently. The key with functional indexes is to ensure the function you used in the index shows up in your query.

SELECT *
FROM the_table
WHERE ST_DWithin(
    geography(geom),
    geography(ST_SetSRID(ST_MakePoint(%lon, %lat), 4326)),
    %radius
    );

What’s going on here? By using the “geography” version of ST_DWithin() (where both spatial arguments are of type “geography”) I get a search in geography space, and because I have created a functional index on the geography version of the “geom” column, I get it fully indexed.
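
To double-check that the planner actually picks up the functional index, inspect the query plan. With the index in place, the plan should include an index scan on the_table_geom_geog_x (a sketch under the same assumptions as above):

import psycopg2

conn = psycopg2.connect("dbname=gisdb")  # hypothetical connection string
with conn.cursor() as cur:
    cur.execute("""
        EXPLAIN
        SELECT *
        FROM the_table
        WHERE ST_DWithin(
            geography(geom),
            geography(ST_SetSRID(ST_MakePoint(-123.1, 49.25), 4326)),
            1000)
    """)
    for (line,) in cur.fetchall():
        print(line)  # look for: Index Scan using the_table_geom_geog_x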

Random Notes

  • The user blog post asserts incorrectly that their best performing query is much faster because it is “using the spatial index”.
SELECT 
    "core_searchcriteria"."id", 
    "core_searchcriteria"."geo_location"::bytea,
     "core_searchcriteria"."distance",
     ST_DistanceSphere(
        "core_searchcriteria"."geo_location", 
        ST_GeomFromEWKB('\001\001\000\000 \346\020\000\000\000\000\000\000\000@U@\000\000\000\000\000\000@@'::bytea)) AS "ds" 
     FROM "core_searchcriteria" 
        WHERE ST_DistanceSphere(
            "core_searchcriteria"."geo_location", 
            ST_GeomFromEWKB('\001\001\000\000 \346\020\000\000\000\000\000\000\000@U@\000\000\000\000\000\000@@'::bytea)
        ) <= "core_searchcriteria"."distance";
  • However, the WHERE clause is just not using any of the spatially indexable functions. Any observed speed-up is just because it’s less brutally inefficient than the other queries.

  • Why was the original brutally inefficient?

SELECT 
    "core_searchcriteria"."id", 
    "core_searchcriteria"."geo_location"::bytea, 
    "core_searchcriteria"."distance", 
    ST_Buffer(
        CAST("core_searchcriteria"."geo_location" AS geography(POINT,4326)), "core_searchcriteria"."distance"
        )::bytea AS "buff" FROM "core_searchcriteria" 
    WHERE ST_Intersects(
        ST_Buffer(
            CAST("core_searchcriteria"."geo_location" AS geography(POINT,4326)), "core_searchcriteria"."distance"), 
        ST_GeomFromEWKB('\001\001\000\000 \346\020\000\000\000\000\000\000\000@U@\000\000\000\000\000\000@@'::bytea)
    )
  • The WHERE clause converts the entire contents of the data column to geography and then buffers every single object in the system.
  • It then compares all those buffered objects to the query object (what, no index? no).
  • Since the column objects have all been buffered… any spatial index that might have been built on the objects is unusable. The index is on the originals, not on the buffered objects.

May 04, 2021 08:00 AM

TLDR: JTS can now fix invalid geometry!

The JTS Topology Suite implements the Geometry model defined in the OGC Simple Features specification.  An important part of the specification is the definition of what constitutes valid geometry.  These are defined by rules about the structural and geometric characteristics of geometry objects.  Some validity rules apply to all geometry; e.g. vertices must be defined by coordinates with finite numeric values (so that NaN and Inf ordinates are not valid).  In addition, each geometric subtype (Point, LineString, LinearRing, Polygon, and Multi-geometry collections) has its own specific rules for validity.

The rules for Polygons and MultiPolygons are by far the most restrictive.  They include the following constraints:

  1. Polygon rings must not self-intersect
  2. Rings may touch at only a finite number of points, and must not cross
  3. A Polygon interior must be connected (i.e. holes must not split a polygon into two parts)
  4. MultiPolygon elements may touch at only a finite number of points, and must not overlap

These rules guarantee that:

  • a given area is represented unambiguously by a polygonal geometry
  • algorithms operating on polygonal geometry can make assumptions which provide simpler implementation and more efficient processing

Valid polygonal geometry is well-behaved

Given the highly-constrained definition of polygonal validity, it is not uncommon that real-world datasets contain polygons which do not satisfy all the rules, and hence are invalid. This occurs for various reasons:

  • Data is captured using tools which do not check validity, or which use a looser or different definition than the OGC standard
  • Data is imported from systems with different polygonal models
  • Data is erroneous or inaccurate 

Because of this, JTS does not enforce validity on geometry creation, apart from a few simple structural constraints (such as rings having identical first and last points).  This allows invalid geometry to be represented as JTS geometry objects, and processed using JTS code.  Some kinds of spatial algorithms can execute correctly on invalid geometry (e.g. determining the convex hull).  But most algorithms require valid input in order to ensure correct results (e.g. the spatial predicates) or to avoid throwing exceptions (e.g. overlay operations).  So the main reason for representing invalid geometry is to allow validity to be tested, to take appropriate action on failure.

Often users would like "appropriate action" to be Just Make It Work.  This requires converting invalid geometry to be valid.  Many spatial systems provide a way to do this.

But this has been a conspicuous gap in the JTS API.  While it is possible to test for validity, there has never been a way to fix an invalid geometry.  To be fair, JTS has always had an unofficial way to make polygonal geometry valid.  This is the well-known trick of computing geometry.buffer(0), which creates a valid output which is often a good match to the input. This has worked as a stop-gap for years (in spite of an issue which caused some problems, now fixed - see the post Fixing Buffer for fixing Polygons). However, using buffer(0) on self-intersecting "figure-8" polygons produces a "lossy" result.  Specifically, it retains only the largest lobes of the input linework.  This is undesirable for some uses (although it is advantageous in other situations, such as trimming off small self-intersections after polygon simplification).

Buffer(0) of Figure-8 is lossy 
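
You can reproduce the lossiness with GEOS (the C++ port of JTS) through Shapely. A small Python sketch, assuming Shapely 1.8 or newer for make_valid; exact outputs can vary by GEOS version:

from shapely.geometry import Polygon
from shapely.validation import make_valid

# A symmetric "figure-8" / bow-tie: the ring crosses itself at (1, 1)
bowtie = Polygon([(0, 0), (2, 2), (2, 0), (0, 2)])
print(bowtie.is_valid)  # False

fixed_buffer = bowtie.buffer(0)   # the classic stop-gap
fixed_valid = make_valid(bowtie)  # validity-preserving repair

# buffer(0) keeps only one lobe (area 1.0), while make_valid
# keeps both lobes as a MultiPolygon (area 2.0)
print(fixed_buffer.area, fixed_valid.area)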

So, it's about time that JTS stepped up to provide a supported, guaranteed way of fixing invalid geometry. This should handle all geometry, although polygonal geometry repair is the most critical requirement.

This raises the question of what exactly the semantics of repairing polygons should be.  While validity is well-specified, there are no limits to the complexity of invalid polygons, and a variety of possible approaches to fixing them.  The most significant decision is how to determine the interior and exterior of a polygonal geometry with self-intersections or overlaps. (This is the classic "bow-tie" or "figure-8" - although self-intersecting polygons can be far more complex.) The question comes down to whether the geometry linework or structure is used to determine interior areas.  

If linework is used, the approach is to node the constituent linework to form a topologically-valid coverage.  This coverage is then scanned with an alternating even-odd strategy to assign areas as interior or exterior.  This may result in adjacent interior or exterior areas, in which case these are merged.

Alternatively, the structure of the polygonal geometry can be taken as determinative. The shell and hole rings are assumed to accurately specify the nature of the area they enclose (interior or exterior).  Likewise, the (potentially overlapping or adjacent) elements of a MultiPolygon are assumed to enclose interior area.  The repair operation processes each ring and polygon separately.  Holes are subtracted from shells.  Finally, if required the repaired polygons are unioned to form the valid result.  

PostGIS MakeValid and the ESRI Java Geometry API makeSimple both use the linework approach.  However, for some relatively simple invalid geometries this produces results which seem overly complex.  
Complex output from ST_MakeValid

For this reason I have implemented the structure-based approach in JTS.  It provides results that are closer to the existing buffer(0) technique (and conveniently allows using the existing buffer code).  This made it a relatively simple matter to implement the repair algorithm as the GeometryFixer class.

Here are some examples of how GeometryFixer works.  First, the example above, showing the (arguably) simpler result that arises from using the structure information:

Figure-8s are handled as desired (keeping all area):

Self-overlapping shells have all interior area preserved:

Of course, GeometryFixer also handles simple fixes for all geometry types, such as removing invalid coordinates.

One further design decision is how to handle geometries which are invalid due to collapse (e.g. a line with a single point, or a ring which has only two unique vertices).  GeometryFixer provides an option to either remove collapses, or to return them as equivalent lower dimensional geometry. 

To see the full range of effects of GeometryFixer, the JTS TestBuilder can be used to view and run GeometryFixer on the set of 75 test cases for invalid polygonal geometry in the file TestInvalidA.xml.

It's been a long time coming, but finally JTS can function as a full-service repair shop for geometry, no matter how mangled it might be.




  


by Dr JTS (noreply@blogger.com) at May 04, 2021 05:27 AM

April 29, 2021

Building off the work from my previous post on vector tiles, I wanted to develop a second process for use in Openlayers. There are many tutorials out there demonstrating the use of vector tiles with predefined styles and tutorials explaining how to build a fully vector tile map online. My aim instead is to demonstrate the processes to build:

  1. A vector tile set up overlaying a raster tile cache
  2. How to integrate a custom style.json
  3. How to do all of this in a custom projection using Openlayers

As always, there is more than one method to do any of this. What is shown below can definitely be improved upon. Please take this and make it better.

GitHub repository is here.

Website example is here.

This example uses Openlayers6. The map is served in the NZTM projection (EPSG:2193). No effort has been made in ordering the labels. The labels display as the Openlayers decluttering orders them. Included in the JS code is a basic pop-up window demonstrating how to get information from the vector tile.

The project is built as a static set up. The vector tile cache is built directly into the website. THIS IS NOT OPTIMAL, but does demonstrate the principle. Ideally, you would have a location like AWS S3 to serve your tile cache from.

In order to use a custom projection, you will need to build an XYZ tile cache. MBTiles do not handle projections other than Web Mercator (EPSG:3857).

Basic steps

  1. Download or reproject the shapefile in NZTM
  2. Upload shapefile to PostgreSQL database with PostGIS extensions
  3. Tile PostgreSQL table into NZTM (EPSG:2193) XYZ tile cache using TRex
  4. Construct Openlayers6 JS for tile consumption

Sample Data

https://data.linz.govt.nz/layer/50280-nz-geographic-names-topo-150k/

Note:

The Geographic Names layer is clipped and filtered for this example. I clipped only to the Wellington Region and filtered the data only to use:

desc_code = BAY, METR, LOC, POP, TOWN, SBRB

Upload to PostgreSQL

shp2pgsql -s 2193 /data/wellyRegion_townBay.shp public.wellyRegion_townBay_nztm | psql -h localhost -d <yourDatabaseHere> -U <yourUserNameHere>

TRex Tiling

TRex will create an XYZ tile cache in the projection of your choosing. You will need to know the resolutions and bounding box of your projection in order to make this work. I was fortunate to have this information at hand thanks to a great tutorial from LINZ.

TRex uses a config file for tiling. The config used in this example is here

The command used to run TRex:

t_rex generate --progress true --maxzoom=14 --minzoom=0 --extent=174.627603,-41.613839,176.259896,-40.737190  --config /configpsql_points.toml

TRex will generate gzipped pbf’s. If you prefer to unzip them:

find . -type f | xargs -n1 -P 1 -t -I % gzip -d -r -S .pbf %
find . -type f | xargs -n1 -P 1 -t -I % mv % %.pbf
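
Equivalently, a small Python sketch that walks the cache and decompresses each tile in place (run from the cache root; assumes every .pbf file is gzip-compressed):

import gzip
import pathlib

for tile in pathlib.Path(".").rglob("*.pbf"):
    data = gzip.decompress(tile.read_bytes())  # gunzip the tile...
    tile.write_bytes(data)                     # ...and write it back in place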

Openlayers JS

This example uses Openlayers version 6. The <script> tags needed are:

<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/openlayers/openlayers.github.io@master/en/v6.5.0/css/ol.css" type="text/css">

<script src="https://cdn.jsdelivr.net/gh/openlayers/openlayers.github.io@master/en/v6.5.0/build/ol.js"></script>

<script src="//cdnjs.cloudflare.com/ajax/libs/proj4js/2.3.15/proj4.js"></script>

<script src="https://unpkg.com/ol-mapbox-style@6.3.2/dist/olms.js" type="text/javascript"></script>

For the full JS example, see the GitHub repository linked above.

NZTM Construct in Openlayers

Building the projection for Openlayers

// set NZTM projection extent so OL can determine zoom level 0 extents.
// Define NZTM projection
proj4.defs("EPSG:2193","+proj=tmerc +lat_0=0 +lon_0=173 +k=0.9996 +x_0=1600000 +y_0=10000000 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs");

// Register projection with OpenLayers
ol.proj.proj4.register(proj4);

// Create new OpenLayers projection
var proj2193 = new ol.proj.Projection({
	code: 'EPSG:2193',
	units: 'm',
	extent: [827933.23, 3729820.29, 3195373.59, 7039943.58]
});

// NZTM tile matrix origin, resolution and matrixId definitions.
var origin = [-1000000, 10000000];
var resolutions = [ 8960, 4480, 2240, 1120, 560, 280, 140, 70, 28, 14, 7, 2.8, 1.4, 0.7, 0.28, 0.14, 0.07 ];
var matrixIds = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16];

Applying to Raster and Vector Tiles

Raster Tiles

Another great tutorial from LINZ regarding the set-up of an XYZ tile cache for raster tiles.

// Tile Services Map
var urlTemplate =
  "https://tiles.maps.linz.io/nz_colour_basemap/NZTM/{z}/{x}/{y}.png";

// Set raster layer
var layer = new ol.layer.Tile({
  source: new ol.source.XYZ({
    url: urlTemplate,
    projection: proj2193,
    attributions: ['<a href="http://data.linz.govt.nz">Data from LINZ. CC BY 4.0</a>'],
    tileGrid: new ol.tilegrid.TileGrid({
      origin: origin,
      resolutions: resolutions,
      matrixIds: matrixIds,
      extent: [827933.23, 3729820.29, 3195373.59, 7039943.58]
    })
  })
});
Vector Tiles

Set up the vector layer to use custom projection:

// Set vector layer
var placeSource = new ol.source.VectorTile({
  cacheSize: 0,
  overlaps: true,
  tilePixelRatio: 1, // oversampling when > 1
  tileGrid: new ol.tilegrid.TileGrid({ 
    origin: [-1000000, 10000000],
    maxZoom: 16,
    tileSize: 4096,
    extent: [827933.23, 3729820.29, 3195373.59, 7039943.58],
    resolutions: resolutions,
  }),
  extent: [827933.23, 3729820.29, 3195373.59, 7039943.58],
  format: new ol.format.MVT(),
  projection: ol.proj.get('EPSG:2193'),
  url: 'https://xycarto.github.io/static.vector.tiles.openlayers.nztm/tiles/wellyRegion_townBay_nztm/{z}/{x}/{y}.pbf'
});

var vectorMap = new ol.layer.VectorTile({
  declutter: true,
  source: placeSource,
  renderMode: 'vector',
  zIndex: 10
  })

Styling

For the style file example

  1. The method in this example loads the vector tile and overlays it on a raster tile cache. In order to accomplish this, the vector tile cache must be loaded first to the map, THEN the rules from the style JSON are applied using:
fetch('./styleText.json').then(function(response) {
  response.json().then(function(glStyle) {
    olms.applyStyle(vectorMap, glStyle, 'wellyRegion_townBay_wgs');
  });
});
  2. The above uses olms.applyStyle. To access this function you will need to add the script tag to your HTML:
<script src="https://unpkg.com/ol-mapbox-style@6.3.2/dist/olms.js" type="text/javascript"></script>

Notes

  1. Not fully complete. Working example only
  2. Runs slow in Safari
  3. This is in no way the only way to do this. My hope is someone takes this and makes it better.
  4. Still needs label work for use on mobile devices.

by xycarto at April 29, 2021 07:59 PM

April 28, 2021

This post is reposted from Mappery

Yesterday the open source and open data geo communities were devastated by the tragic news of the passing of our friend and colleague Malena Libman. There was an outpouring of grief and celebration of her zest for life, humour, warmth and contribution to open geo.

Jody Garnett posted this picture of Malena’s tattoo which is truly a Map in the Wild today.

I met Malena in 2018 in Dar es Salaam for FOSS4G. The first time you met her, you knew that she was something different, a force of nature. She wanted some advice about chairing a global FOSS4G event because she had a plan to host FOSS4G in Buenos Aires in 2021; by the end of the conference she had convinced me to advise her.

I’m struggling to find the words to express how tragic it is that Malena will not be with us to celebrate FOSS4G 2021, but I know that her co-chair and close friend María Arias de Reyna and the rest of the organising team will make this a special event to celebrate her memory.

by Steven at April 28, 2021 03:32 PM

In this video we will learn how to use the r.surf.area function to calculate the area of a three-dimensional surface. In other words, we will calculate the area of a region taking its topography into account.

Notes on the r.surf.area calculation

As we can see in the algorithm's documentation (link below), the calculation depends heavily on the resolution of the data (think of it as a coastline fractal problem: the more resolution, the more detail, the more area, and so on).

r.surf.area works best when the surface being evaluated extends to the edges of the current region and the cell resolution is small. Surfaces that are especially long and thin and have highly irregular boundaries tend to have underestimated surface areas.

And remember: setting a high cell resolution (small area) will greatly reduce this impact, but will lead to longer processing times.

Useful links

Function documentation: https://grass.osgeo.org/grass74/manua…

Article on estimating the error in 3D area determination using r.surf.area: https://joseguerreroa.wordpress.com/2012/04/09/estimar-el-error-en-la-determinacion-del-area-3d-de-una-montana-o-elevacion-con-el-modulo-r-surf-area-qgis-grass/


What did you think of this tutorial,
QGIS - Cálculo de área tridimensional com o r.surf.area?

I await your comments, tips and suggestions.

💵 Make a donation to my channel via Picpay: https://app.picpay.com/user/narceliodesa
⭐ Follow us on all platforms: https://linktr.ee/narceliodesa
📬 Business contact: narceliosapereira@gmail.com

The post QGIS – Cálculo de área tridimensional com o r.surf.area appeared first on Narcélio de Sá.

by Narcélio de Sá at April 28, 2021 11:00 AM

April 23, 2021

Participants of a GRASS GIS workshop contribute new tutorials. The workshop Geospatial data analysis with GRASS GIS, organized by the Gulich Institute in Argentina, concluded last week with participants from different Latin American countries defending their work. The workshop went through different topics within the GRASS ecosystem, but mostly covered remote sensing, Object Based Image Analysis (OBIA) and time series analysis, making use of GRASS GIS extensions to obtain and process Landsat, Sentinel and MODIS data.

April 23, 2021 09:12 PM

The GeoTools team is pleased to share the availability of GeoTools 24.3:

  • geotools-24.3-bin.zip
  • geotools-24.3-doc.zip
  • geotools-24.3-userguide.zip
  • geotools-24.3-project.zip

This release is published to the osgeo maven repository, and is made in conjunction with GeoServer 2.18.3. This is a maintenance release and is a recommended upgrade for all users of the GeoTools library. We would like to thank […]

by Ian Turton (noreply@blogger.com) at April 23, 2021 01:13 PM

We are pleased to announce the release of GeoServer 2.18.3 with downloads ( war | bin ), documentation and extensions.

This release is made in conjunction with GeoTools 24.3 and GeoWebCache 1.18.3. This is a maintenance release recommended for production systems.

Thanks to everyone who contributed, and Ian Turton (Astun) and Jody Garnett (GeoCat) for making this release.

Improvements and Fixes

This release includes a number of enhancements and fixes in core and extensions:

  • Add LayerGroup support to Geofence
  • ColorMap labels in raster layer GetFeatureInfo response
  • New web-service-auth community module
  • Add GeoServer environment parametrization support to Base Proxy configuration
  • Include layer name(s) in WMS ServiceException for GetMap

For more information check the 2.18.3 release notes.

About GeoServer 2.18

Additional information on GeoServer 2.18 series:

by Ian Turton at April 23, 2021 12:00 AM

April 22, 2021

You have probably already heard the news, one way or another, because the issue is causing a stir (at an international level). An essay of which Álvaro Anguix is a co-author has been published, which uncovers and demonstrates, with an avalanche of reasons and data, that Munera is the name of the place Cervantes refers to in what is possibly the most famous phrase in literature: “Somewhere in la Mancha, in a place whose name I do not care to remember”.

The essay, published in Spanish with the title “Un lugar de la Mancha. La patria de don Quijote al descubierto” (“Somewhere in La Mancha, Don Quixote’s homeland brought to light”), answers the two questions that had not been satisfactorily resolved for more than 400 years: Which is the famous place where the nobleman of La Mancha lived? And why didn’t Cervantes want to remember its name?

Those of you who can read the 166-page essay will see that it contains all kinds of reasons, from the most literary to the geographical. And precisely the latter is what matters to us on this blog: the geographical analysis has been fundamental to corroborate that everything said by Cervantes fitted, with surprising precision, once Munera was identified as the place. Distances and routes, by the different means of transport that appear in Don Quixote, geographical characteristics, evidence of the existence of certain elements due to their appearance in the cartographies closest to Cervantes’ time, road crossings, etc. The tool used to analyse all of this has been gvSIG, an open-source Geographic Information System that everybody who follows this blog knows. Starting from the location, points that feature so heavily in the novel, such as the hostelries, have been placed, and the routes that can be mapped have been traced. gvSIG, used all over the world for all types of uses related to the management and mapping of geographic information, in all types of institutions, has also served to resolve the greatest literary enigma in centuries.

Cervantes launched a challenge and it has been solved by means of gvSIG.

by Mario at April 22, 2021 02:14 PM

You have probably already heard the news, one way or another, because the subject is making a lot of noise (already at an international level). An essay co-authored by Alvaro Anguix has been published which uncovers and demonstrates, with an avalanche of reasons and data, that Munera is the name of the place Cervantes was referring to in what is perhaps the most famous sentence in literature: “Somewhere in La Mancha, in a place whose name I do not care to remember”.

The essay, published in Spanish under the title “Un lugar de la Mancha. La patria de don Quijote al descubierto” (“Somewhere in La Mancha, Don Quixote’s homeland brought to light”), answers the two questions that for more than 400 years had not been satisfactorily resolved: Which is the famous place where the hidalgo of La Mancha lived? And why did Cervantes not want to remember its name?

Those of you who have been able to read the 166-page essay will see that it contains all kinds of reasons, from the most literary to the most geographical. And precisely, on this blog, that is what matters to us: the geographical analysis was fundamental to corroborate that everything Cervantes said matched, with surprising precision, once Munera was identified as the famous place.

Distances and routes, by the different means of transport that appear in Don Quixote, geographical features, evidence of the existence of certain elements through their appearance in the cartographies closest to Cervantes’ time, road crossings, and so on. To analyse all of this, the tool used was gvSIG, an open-source Geographic Information System that all of you who frequent this blog know very well.

Starting from the place, points that are so relevant in the novel, such as the inns, were located, and the routes that can be mapped were traced. gvSIG, used all over the world for all kinds of uses related to the management and processing of geographic information, in all kinds of institutions, has also helped to solve the greatest literary enigma of all time.

Cervantes posed a challenge and, thanks to gvSIG, it has been solved.

by mateoboudon at April 22, 2021 08:54 AM

It can happen that when you collect data in the field and then check it in the office on your desktop using QGIS, the points are misplaced or shifted. This blog post explains the possible root causes of this issue. (Read time: 5 min)

This article builds on the help document about projections and coordinate reference system handling in QGIS.

The accuracy of the captured points is affected by two main factors:

  • GPS receiver and accuracy
  • Selection of coordinate reference systems used in the project

Let’s take a closer look at both problems in the context of field surveys.

GPS accuracy

GPS accuracy depends on the quality of your GPS receiver and the number of visible GPS satellites your receiver can use at the moment of capturing the point.

When you load the map in Input app, the bottom bar contains the GPS marker with a coloured dot. The dot can have the following three colours:

  • green: The actual accuracy is below the threshold
  • orange: The actual accuracy is above the threshold
  • red: The GPS is unavailable

GPS accuracy low

Acceptable accuracy differs from one use case to another, so the threshold for the colour scheme can be adjusted in the Input app settings. Always adjust the settings based on your project needs and check the GPS accuracy when capturing data.

Input app GPS settings

The GPS receiver itself can be improved either by using a device with better internal hardware or by using a powerful external GPS receiver. If you want to use an external GPS receiver, read the following help article for setup.

Projection problems

First of all, there could be multiple problems with the setup of coordinate reference systems and transformations. To solve them, you need to load the project in QGIS using the Mergin plugin and check for the various situations described in this help article.

The Input app checks for missing datum shift files and other transformation problems and notifies you at project load with the following message.

shift of the point

To be able to use the datum grid shift in your mobile app:

  • Copy the proj folder from your QGIS working directory (see help page for different platforms)
  • Paste the folder to your survey project downloaded from the Mergin server
  • Sync the changes

For our example with the British national grid on Windows, the grid file is called OSTN15_NTv2_OSGBtoETRS.gsb and if the Mergin project is in C:\users\qgis\myproject then it should be copied to C:\users\qgis\myproject\proj directory and synchronised by the Mergin service to the Input app.
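
As a scripted version of the copy step, a minimal Python sketch using the example paths above (the QGIS proj location is a placeholder; see the help page for your platform):

import shutil

qgis_proj_dir = r"C:\path\to\qgis\proj"      # placeholder: proj folder in your QGIS working directory
mergin_project = r"C:\users\qgis\myproject"  # the example Mergin project above

# dirs_exist_ok requires Python 3.8+
shutil.copytree(qgis_proj_dir, mergin_project + r"\proj", dirs_exist_ok=True)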

Notes:

  • This is a one-off process. The Input app will transfer the datum grid shift file to its own working directory on your mobile device.
  • There might be more than one datum grid shift file in your QGIS working directory. Use the appropriate one for your survey project.

You can find a short summary of this article on the Input app help pages.

April 22, 2021 12:00 AM

April 21, 2021

The news has probably already reached you, one way or another, because the subject is causing quite a stir (now at an international level). An essay of which I am a co-author has been published, which uncovers and demonstrates, with an avalanche of reasons and data, that Munera is the name of the place Cervantes was referring to in what is perhaps the most famous sentence in literature: “En un lugar de la Mancha de cuyo nombre no quiero acordarme” (“Somewhere in La Mancha, in a place whose name I do not care to remember”).

The essay, published in Spanish under the title “Un lugar de la Mancha. La patria de don Quijote al descubierto”, answers the two questions that for more than 400 years had not been satisfactorily resolved: Which is the famous place where the gentleman of La Mancha lived? And why did Cervantes not wish to remember its name?

Those of you who can read the 166-page essay will see that it contains all kinds of reasons, from the most literary to the geographical. And precisely the latter is what matters to us on this blog: the geographical analysis has been fundamental to corroborate that everything Cervantes said fitted, with surprising precision, once Munera was identified as the place. Distances and journeys, by the different means of transport that appear in Don Quixote, geographical features, evidence of the existence of certain elements through their appearance in the cartographies closest to Cervantes’ time, crossroads, and so on. The tool used to analyse all of this was gvSIG, an open-source Geographic Information System that everyone who frequents this blog knows well. Starting from the place, points that are so relevant in the novel, such as the inns, were located, and the routes that can be mapped were traced. gvSIG, used all over the world for all kinds of purposes related to the management and processing of geographic information, in all kinds of institutions, has also served to solve the greatest literary enigma in centuries.

Cervantes posed a challenge and, by means of gvSIG, it has been solved.

by Alvaro at April 21, 2021 05:12 PM

April 20, 2021

In this post, we will explore the new feature in the Input app which allows you to record and display a photo’s geotag information (e.g. direction, longitude and latitude).

Photos’ metadata

When users take photos in the field, they often would like to record extra information which can help convey a better understanding and awareness of the surroundings. Recent mobile devices allow users to record photo direction, location, time stamp, camera settings, etc. when taking photos. This information is often optional or restricted by default (due to privacy) within the mobile settings.

For surveying and data collection, this information can add an extra dimension and context. For example, if you are surveying a bat nesting site, it is useful to know the directions of the photos you have taken. This will help identify the site easily on a subsequent site visit.

Geotag information is also useful metadata to have attached to your photos. There are other GIS and non-GIS applications which can read and interpret the information.

In the recent version of the Input app, we have added a feature which allows you to record and display the geotag information. Combined with QGIS styling, you can create a very informative symbology to display the information while you are capturing it in the field.

Before you start

To be able to capture geotag information, you need to enable your Camera app to use location. For that:

  • in Android, from Settings > Apps and notifications > (see all) Camera and under Permission ensure you allow Location

  • in iOS, from Settings > Privacy > Location Services > Camera, ensure Precise Location is enabled.

Quick start

If you want to record photos with directions as shown below, you can follow these steps:

  • Log into Mergin (or sign up if you have not yet registered)
  • Browse to the photo survey project
  • At the top right, click on Clone to make a copy of the project in your account. You can choose a different project name:

  • Download and install the Input app

  • Select My projects and log in using your Mergin credentials

  • Download the newly created project from the earlier step

  • Start recording points and add photos, you should see photo direction

More details

The project configuration is done in QGIS. The form has been set up to allow you to take photos using the Attachment widget. To learn more about the different form edit tools in the Input app and how to set them up in QGIS, see the Input app help pages.

There are multiple fields to extract geotag information from the photo, as soon as you add the photo either using the camera or the gallery:

  • Direction: read_exif_img_direction(@project_home + '/' + "photo")
  • Latitude: read_exif_latitude(@project_home + '/' + "photo")
  • Longitude: read_exif_longitude(@project_home + '/' + "photo")

As noted, all the above functions take the path to the photo (@project_home + '/' + "photo") and return different metadata related to the image.
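
Outside of QGIS, the same EXIF tags can be inspected with plain Python, for example via Pillow. A sketch with a hypothetical photo path (older-style _getexif(); newer Pillow versions also offer getexif()):

from PIL import Image
from PIL.ExifTags import GPSTAGS

GPSINFO_TAG = 34853  # standard EXIF tag id of the GPS IFD

exif = Image.open("photo.jpg")._getexif() or {}
gps = {GPSTAGS.get(k, k): v for k, v in exif.get(GPSINFO_TAG, {}).items()}

# e.g. GPSImgDirection, GPSLatitude, GPSLongitude (as EXIF rationals)
print(gps.get("GPSImgDirection"), gps.get("GPSLatitude"), gps.get("GPSLongitude"))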

In addition to the form, the layer styling has been modified to resemble the direction of the camera and field of view.

To set the camera icon and direction of the camera:

  • For the point styling, select SVG Marker
  • From the list of SVGs, select the camera icon. (Make sure to embed the SVG in the project, otherwise it will appear as a question mark in the Input app.)
  • For the Rotation, select direction field

To add field of view:

  • Add another symbol layer to your point
  • For Symbol layer type select Geometry Generator
  • For Geometry type select Polygon / MultiPolygon
  • For the expression, type: wedge_buffer(center:=$geometry,azimuth:= "direction",width:=45,outer_radius:=0.0008)
  • For Symbol layer type, select Shapeburst Fill

Feedback and suggestions

Input, a field data collection app based on QGIS. Input makes field work easy with its simple interface and cloud-based sync. Available on Android and iOS.

If you would like to add a new feature or suggestions to improve the app, do not hesitate to contact us on info@lutraconsulting.co.uk

April 20, 2021 09:00 AM

Remote sensing images can be used in many different environmental studies. They can be used to analyse the presence of algae and sediment in watercourses, delimit land use and assess floods.

The possibilities are countless.

On top of that, several satellites have been operating, or operated, for long periods of time, making it possible to build historical series of images.

Check out our course on Geoprocessing for Environmental Studies using QGIS.

In this tutorial, we will learn how to use Google Earth Engine (GEE) inside QGIS to assess the temporal evolution of the municipality of Siderópolis and the surrounding region from 1990 to the present day, using Landsat 5 and 8 imagery.

How to install GEE in QGIS?

To use GEE in QGIS, you need to install the Google Earth Engine Plugin, but it is not enough to simply go to Plugins and enable it; a few steps are required beforehand.

You need to have Python installed on your computer and to be registered with Google Earth Engine.

A more detailed procedure can be found on Jorge Santos’ website (Instrutor GIS – QGIS 3.14: Instalação dos Plugins Google Earth Engine e Sentinel-2); we suggest you use it to install the GEE plugin.

Landsat Cloud Mask

Now that we have the GEE plugin installed, let’s start writing our Python code. You can write directly in the QGIS script editor or work in a text editor and paste it into QGIS afterwards.

First of all, let’s import the GEE and plugin libraries; if you have not yet authenticated or initialized GEE, uncomment the commented lines below.

import ee
from ee_plugin import Map

# ee.Authenticate()
# ee.Initialize()

Now we will create a function to remove clouds from the Landsat images.

Cloud removal is performed using the ‘pixel_qa’ band, which contains bit values classified according to what was captured by the sensor. For example, bits 3 and 5 represent the presence of cloud shadow and clouds in Landsat 8.

Remember that these values differ for each sensor.

def maskL8SR(imagem):
    '''Filter to remove clouds and shadows
    from LANDSAT 8 images with a 100 m buffer'''
    
    cloudShadowBitmask = int(2**3)
    cloudBitmask = int(2**5)
    qa = imagem.select('pixel_qa')
    mask = qa.bitwiseAnd(cloudShadowBitmask).eq(0).And(\
           qa.bitwiseAnd(cloudBitmask).eq(0))\
        .focal_min(radius = 100, units = 'meters')
    
    return imagem.updateMask(mask)

def maskL5SR(imagem):
    '''Filter to remove clouds and shadows
    from LANDSAT 5 images with a 100 m buffer'''
    
    qa = imagem.select('pixel_qa')
    cloud = qa.bitwiseAnd(int(2**5)).And(qa.bitwiseAnd(int(2**7)))\
        .Or(qa.bitwiseAnd(2**3)).focal_max(radius = 100, units = 'meters')
        
    mask2 = imagem.mask().reduce(ee.Reducer.min())
    
    return imagem.updateMask(cloud.Not()).updateMask(mask2)

In Python, a function definition starts with ‘def’, followed by the function name and the input arguments in parentheses, ending with a colon. Everything to be executed by the function must be indented.

Since we will apply these functions to a GEE image collection through the map function (which applies the desired function to every image in the collection), we must keep only the image itself as the input argument.

Also, note that we use the .select() method to select the ‘pixel_qa’ band and ask for the pixels containing the cloud and cloud-shadow bits to be flagged.

Note as well that the Landsat 8 function marks the good pixels (i.e. cloud-free), leaving them with a value of 1 while clouds are 0, which makes it possible to apply the buffer via focal_min (a moving window that selects the lowest-valued pixels).

The Landsat 5 function is different: it marks the bad pixels (i.e. with clouds), identifying them with a value of 1, while cloud-free pixels are 0. We therefore apply focal_max to expand the mask with a buffer; the Not() function is then used to invert these values.

Finally, updateMask() is called to remove the pixels flagged with 0.
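
As a plain-Python illustration of the bit test these masks rely on (independent of GEE; the pixel_qa value here is made up):

CLOUD_SHADOW_BIT = 1 << 3  # bit 3: cloud shadow
CLOUD_BIT = 1 << 5         # bit 5: cloud

pixel_qa = 0b101000  # hypothetical pixel flagged with both bits

is_clear = (pixel_qa & CLOUD_SHADOW_BIT) == 0 and (pixel_qa & CLOUD_BIT) == 0
print(is_clear)  # False -> this pixel would be masked out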

Defining Input Parameters

Now let’s define parameters such as the polygon covering our study area and the image collections for the years under study (in this case, 1990 and 2020).

Although the polygon is not used in QGIS itself, it is worth keeping it in the code in case you use the script outside QGIS.

coordenadas = [[-49.45, -28.58], [-49.40, -28.58], [-49.40, -28.60], [-49.45, -28.60], [-49.45, -28.58]]
ee_pol = ee.Geometry.Polygon(coordenadas)

img_1990 = ee.ImageCollection('LANDSAT/LT05/C01/T1_SR')\
	.filterDate('1989-01-01', '1991-12-31')\
	.filterBounds(ee_pol)\
	.map(maskL5SR)\
	.mean()
	
visParams5 = {'bands': ['B3', 'B2', 'B1'], 'min': 0, 'max': 3000, 'gamma': 1.4}

Map.addLayer(img_1990, visParams5, 'IMG_1990')

img_2020 = ee.ImageCollection('LANDSAT/LC08/C01/T1_SR')\
	.filterDate('2020-01-01', '2020-12-31')\
	.filterBounds(ee_pol)\
	.map(maskL8SR)\
	.median()
	
visParams8 = {'bands': ['B4', 'B3', 'B2'], 'min': 0, 'max': 3000, 'gamma': 1.4}

Map.addLayer(img_2020, visParams8, 'IMG_2020')

In the first lines of the code, we define the coordinates of the vertices of the polygon that will be used to filter the images available for the years 1990 and 2020.

Note that for 1990 we widened the date window used to build the image; if we used only 1990 for this location, the image would look ‘blurry’, still containing cloud-contaminated pixels.

Next, we load the images with the ImageCollection() function and apply the date and location filters. We then map the cloud-removal function over the collection and reduce it so that all the images are merged, with their median (or mean) values prevailing.

Adding GEE images in QGIS

To open the images in QGIS, we need to use the Map.addLayer() function.

The first argument of this function is the image itself, followed by a dictionary indicating which bands should be used and the minimum and maximum values. Finally, we have the layer name.

Now that you have the images, just build a layout and generate your map. The figure below shows the result of this post.

Historical analysis of the Siderópolis region in Santa Catarina.

It is interesting to see in this image the growth of the urban area to the south, as well as the recovery of areas degraded by coal mining.

You can check out the full code on GitHub.

Now you can adapt the code to add other years to your map. You can also apply spectral indices such as NDVI to analyse the development of vegetation, or modify the minimum and maximum values to improve the contrast of the images.

Check out our course on Geoprocessing for Environmental Studies using QGIS.

And we have reached the end of our post; if you still have any questions, use the comments and we will reply as soon as possible.

The post Como criar mapas de evolução temporal no QGIS usando Google Earth Engine? first appeared on Blog 2 Engenheiros.

by Fernando BS at April 20, 2021 06:00 AM

April 19, 2021

The City Council of La Garriga (Barcelona, Spain), through Asecrim, has produced its Local Security Plan using gvSIG Desktop. In it they have created heat maps of crime in the municipality and carried out a temporal analysis of known incidents, in order to deploy police resources at the times and in the places where they are needed, maximizing the efficiency of public resources.

The full document, in Catalan, is the following:

Click to access Pla_Local_Seguretat_La_Garriga_2021-2024.pdf

Thanks to gvSIG Desktop’s crime-analysis tools, city councils can carry out the complex task of drawing up security plans graphically.

by Mario at April 19, 2021 12:14 PM

April 16, 2021

Mt. Ruapehu, Ngauruhoe, and Tongariro

The following is about how to build an elevation file to work with procedural-gl.js. I just really wanted to build my own elevation dataset and thought it would be helpful to share how I did it.

Procedural-gl.js has changed the game in attainable 3D online mapping. When I came across this project, I was very excited and wanted to get to mapping ASAP.  There is so much potential in just a few lines of JS code; just add a base map, an elevation file and boom, you have a smooth functioning 3D map, mobile ready. I cannot thank Felix Palmer enough for putting this out there. You can check out the source code and some really great maps he developed here.

I’m not one to leave well enough alone. I wanted to develop a method to use New Zealand specific elevations, with my goal being to eventually incorporate New Zealand’s LiDAR elevations. However, before I get to building elevation models for LiDAR, I wanted to test building an elevation model with the LINZ 8m DEM. This dataset covers the entire nation and was lightweight enough to test on the ol’ home Linux box.

Method

As mentioned above, to run procedural-gl.js in its most basic form, you’ll need a tile cache for your base map and a tile cache for your elevation file. The base map I had covered, but I was unfamiliar with what was needed for the elevation tile cache. Fortunately, I came across the SyncPoints tutorial on how to do this.

I am not going to rewrite their entire process. They did a really good job explaining the what, why, and how. Instead, I will lay out what I did for the NZ specific data. Much of this process can be further refined, and this is really just the proof of concept.

Two quick notes before I get started:

  1. For now, this process is only done in Web Mercator.  I always want to push the limits and do my online work in NZTM, but for now, I am going to work with what I have.
  2. I am not going to talk about building a raster tile cache for the base map. I was fortunate to have a Web Mercator tile cache for NZ available.

We want to create an elevation file, in PNG rgb format, rendered into an XYZ tile cache directory, with 512x512px tiles.

The basic steps are this:

  1. Remove all errant noData pixels from a geotiff.  
  2. Convert your elevation geotiff to rgb (single band to three band), in Byte format.
  3. Render your rgb geotiff into xyz raster tiles. Use a directory format.
  4. Consume through procedural-gl.js

List of Tools

  • GDAL 3.1+ (gdalbuildvrt, gdalwarp, gdal2tiles.py)
  • rio-rgbify (Rasterio plugin)
  • Docker (optional, for a recent GDAL build)
  • npm http-server (for localhost testing)

Data

The data I used for this was the full LINZ 8m DEM elevation. Make sure your data is in Web Mercator projection (EPSG:3857). See some earlier blogs about how to reproject across all the tiles using BASH and gdalwarp.
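If you still need to do that bulk reprojection, here is a minimal Python sketch of the idea (the earlier posts used a BASH loop; the dem_tiles path here is hypothetical):

import glob
import subprocess

# Reproject every source tile to Web Mercator with gdalwarp
for tif in glob.glob('dem_tiles/*.tif'):
    out = tif.replace('.tif', '_3857.tif')
    subprocess.run(
        ['gdalwarp', '-t_srs', 'EPSG:3857',
         '-co', 'COMPRESS=DEFLATE', tif, out],
        check=True)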

LINZ 8m DEM

Process

1. Build a VRT to gather all the elevation tiles under one file name.

gdalbuildvrt elevation.vrt *.tif

2. Remove nodata

Using the VRT, we can perform the noData conversion across all the elevation tiles as if they were a mosaic. Better yet, the output from this command will produce a mosaic out the other side. GDAL is magic.

gdalwarp -t_srs EPSG:3857 -dstnodata None -co TILED=YES -co COMPRESS=DEFLATE -co BIGTIFF=YES elevation.vrt elevation_noData_mosaic.tif

3. RGB-ify the elevation mosaic.

rio rgbify -b -10000 -i 0.1 elevation_noData_mosaic.tif elevation_noData_mosaic_rgb.tif
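The -b -10000 -i 0.1 flags follow the usual Terrain-RGB convention: elevation = -10000 + (R × 256² + G × 256 + B) × 0.1, packing 0.1 m steps into three 8-bit bands. A quick Python sanity check of that encoding:

# Decode a Terrain-RGB pixel back to metres (base -10000, interval 0.1)
def decode_elevation(r, g, b):
    return -10000 + (r * 256 * 256 + g * 256 + b) * 0.1

# e.g. a pixel of (1, 173, 40) decodes to roughly 986.4 m
print(decode_elevation(1, 173, 40))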

4. Create your XYZ tile cache in EPSG:3857

GDAL2Tiles is a quick way to render XYZ tile caches if you have a single layer and are working in Web Mercator. Be sure to use GDAL 3.1 or greater to get both XYZ caching and tile-size control; you need them both. In the end, I used the GDAL Docker image developed by perrygeo, with a modification (--with-python), and got access to GDAL 3.2. It was a lifesaver; however, I did need to modify the Dockerfile to add GDAL's Python bindings and rebuild the image.

gdal2tiles.py --s_srs=EPSG:3857 --zoom=0-16 --xyz --tilesize=512 --processes=7 elevation_noData_mosaic_rgb.tif /data/tile-cache

Localhost Testing

Once I had built the tile cache, I put together a quick site on my localhost to test it. I embedded the tile cache into the site (placed it in the same directory as my web files), then ran it under a local server. The advantage of embedding your tile cache for testing is that the local server serves out the raster tiles as well - there is no need to set up a separate tile server.

I switched it up this time and used the npm http-server to avoid some CORS issues I was encountering.

http-server . --cors -p 8000

Code and Website

You can view the code for the HTML and JS here:

https://github.com/xycarto/3D_Aotearoa/

The running website can be found here:

https://xycarto.github.io/3D_Aotearoa/

It works great on mobile devices, but will happily display on your desktop.

by xycarto at April 16, 2021 07:47 AM



April 12, 2021

Taking inspiration from the @NexoJornal article on the length of the borders between Brazilian states, I decided to replicate the analysis for the municipalities of Ceará.

The Government of the State of Ceará published Lei Nº 16.821, of January 9, 2019, which describes the boundaries between the 184 municipalities of Ceará.

Recently, in a collective effort, the OpenStreetMap community of Ceará finished adjusting these boundaries in the OSM database.

Top 5 longest boundaries between municipalities of Ceará.

1st – Boundary between Parambu and Tauá: 124.24 km

2nd – Boundary between Pereiro and Jaguaribe: 120.99 km

3rd – Boundary between Independência and Tauá: 111.05 km

4th – Boundary between Ibicuitinga and Morada Nova: 97.53 km

5th – Boundary between Santa Quitéria and Catunda: 88.24 km

Top 5 shortest boundaries between municipalities of Ceará.

1st – Boundary between Horizonte and Guaiúba: 803.21 m

2nd – Boundary between Limoeiro do Norte and São João do Jaguaribe: 849.08 m

3rd – Boundary between Canindé and Madalena: 937.71 m

4th – Boundary between Aquiraz and Fortaleza: 1.71 km

5th – Boundary between Pedra Branca and Quixeramobim: 1.73 km

The analysis was carried out in @qgis, where a model was built that takes the municipal boundary data and returns a layer of the boundaries between municipalities with their respective lengths.
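A rough sketch of the same idea in plain Python with geopandas (the original was a QGIS graphical model; the municipios.gpkg file and the nome column here are hypothetical):

import geopandas as gpd

munis = gpd.read_file('municipios.gpkg')
# Use a metric CRS so lengths come out in metres
# (EPSG:31984, SIRGAS 2000 / UTM zone 24S, covers most of Ceará)
munis = munis.to_crs(epsg=31984)

pairs = []
for i, a in munis.iterrows():
    for j, b in munis.iterrows():
        if i >= j:
            continue  # visit each pair of municipalities once
        shared = a.geometry.boundary.intersection(b.geometry.boundary)
        if not shared.is_empty:
            # 'nome' is a hypothetical municipality-name column
            pairs.append((a['nome'], b['nome'], shared.length / 1000))

# Longest shared boundaries first
pairs.sort(key=lambda p: p[2], reverse=True)
for nome_a, nome_b, km in pairs[:5]:
    print(f'{nome_a} x {nome_b}: {km:.2f} km')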

Also in QGIS, it was possible to produce all 467 boundary maps using the QGIS Atlas feature. We will soon publish a video showing how we arrived at this result.

The boundary data for the state of Ceará can be obtained from the IPECE portal or directly from OSM; here is an overpass turbo link where the data can be viewed and downloaded: overpass-turbo.eu/s/160y

To learn a bit more about the OpenStreetMap community and how to contribute to the project, see this Twitter thread:


The post A extensão dos limites dos municípios do Ceará appeared first on Narcélio de Sá.

by Narcélio de Sá at April 12, 2021 02:07 PM

April 06, 2021

Just a few weeks ago, QGIS 3.18 was released - the first version to include support for point cloud data, thanks to the great support of the QGIS community in our joint crowdfunding campaign with North Road and Hobu.

We have received a lot of feedback from users and we would like to summarise the most common problems people have faced, and add a couple of tips on useful features that are otherwise easy to overlook.

1. I am unable to load LAS/LAZ files

If your QGIS installation does not recognize .las or .laz files (the two point cloud formats we currently support in QGIS), the most likely cause is that the PDAL library needed for reading LAS/LAZ is missing from your installation. Unfortunately, not all installers include it at this point. On Windows in particular, there are several options for installing QGIS, and only one of them is the right choice for PDAL support. At the time of writing (April 2021), you need to download the installer from the section called “Standalone installers from OSGeo4W testing packages (MSI)”; here’s a screenshot of what to look for:

On macOS the official all-in-one installers include point cloud support. On Linux, PDAL support depends on the particular distribution/packages, but most of them probably include the PDAL library.

2. Point cloud is completely flat in 3D

It may happen that you load a point cloud layer, then open the 3D map view and the point cloud is displayed as a flat surface, like in this screenshot:

The reason for this is that the 3D renderer is not enabled for your point cloud layer. To enable it, open the Layer Styling panel (there’s an F7 shortcut to open it!), then switch to the second tab (“3D View” - with a cube icon) and change the “No Rendering” option to something else - for example “Classification”, in case your point cloud is classified. You should then see your point cloud in 3D.

3. Point cloud is rendered twice - in 3D and in 2D (“flat”)

Commonly, when people open the 3D view with a point cloud, they may see the point cloud rendered twice, like in the following screenshot:

The reason is that both 3D rendering and 2D rendering of the point cloud are enabled, so the layer is also rendered as a 2D map texture on top of the terrain (which is flat by default). An easy way to fix this is to set 2D rendering of the point cloud layer to “Extent Only” in the Layer Styling panel (in the first tab):

If the dashed rectangle marking the extent is still bothering you, it is possible to change the line symbol to use white (or transparent) colour.

Hopefully in the near future we will address this unexpected behaviour, so that layers with a 3D renderer defined are no longer also rendered in 2D.

4. I still can’t see my point cloud in 3D view

If your point cloud covers a small area but the elevation of its points is relatively high, then when you first open the 3D view or click the “Zoom Full” button, the view may zoom in too close and the actual point cloud data may end up behind the camera. Try zooming out a bit to see if that helps. (This is a bug in QGIS - at this point “zoom full” ignores extra entities and only takes the terrain into account.)

5. Enable “Eye Dome Lighting” in 3D view

For a much better 3D perception of your point cloud, try clicking the “Options” button (with a wrench icon) in the toolbar of the 3D view and enabling “Show Eye Dome Lighting” in the pop-up menu. This will apply extra post-processing that adds slight shading based on the positions of nearby points, and adds silhouettes where there is a sudden change in depth:

As soon as you zoom into your point cloud far enough that individual points can be seen, the eye dome lighting effect will start to disappear. You can try experimenting with the point size (in the Layer Styling panel, 3D View tab) - increasing the point size will help.

- A view with point size 2.0 pixels

- The same view with point size 6.0 pixels

6. Try the new camera navigation mode in 3D view

In QGIS 3.18 we have added a new “Walk Mode” camera navigation mode that is much better suited to inspecting point clouds than the default “Terrain Based” camera navigation mode. Open the 3D map view configuration dialogue, pick the “Camera & Skybox” tab and set it there:

Control          Action
Mouse move       Rotate camera
Mouse wheel      Change movement speed
W / Up           Move forward
S / Down         Move backward
A / Left         Move left
D / Right        Move right
Q / Page Up      Move up
E / Page Dn      Move down

7. Use elevation scaling and offset to your advantage

Sometimes it is useful to modify the offset and/or scale of the elevation of points (Z values) - for example, if the point elevations do not match your other 3D data, or maybe you have source data where the X,Y coordinates are in meters and the Z coordinate is in feet!

Another case where this is useful is when your point cloud data sit far from the terrain and the default “Terrain Based” navigation mode does not work nicely - it expects the data to be near the terrain, and the camera rotates around a point on the terrain, which may feel strange when browsing point clouds. A workaround is to apply an offset to the point cloud layer to move the points near the terrain. For example, this is a point cloud at roughly 200 meters elevation (the grey plane is the terrain):

When an offset of -200 is applied to the point cloud in the Layer Styling panel, the data show up much closer to the terrain and camera navigation feels more natural:
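For the scripting-minded, the same offset can likely be applied from the QGIS Python console - a hedged sketch, assuming a loaded point cloud layer named 'lidar' and the elevation-properties API that point cloud layers gained in 3.18:

from qgis.core import QgsProject

# 'lidar' is a hypothetical layer name; adjust to your project
layer = QgsProject.instance().mapLayersByName('lidar')[0]

# Shift the whole cloud 200 m down so it sits near the terrain
layer.elevationProperties().setZOffset(-200)
layer.triggerRepaint()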

8. Try circular points in 2D maps

By default, QGIS draws points as squares, as this is the fastest option. For higher quality output you may want to change the point symbol style to circle in the Layer Styling panel, which makes things look a little less jagged:

– using squares
– using circles

At this point we always use square points in 3D views - in the future we will likely offer circular points in 3D views as well.

9. Give us your feedback

We would love to hear from you about your experience with point clouds in QGIS so far, what features you are missing or what problems you have encountered - feel free to drop us a mail at info@lutraconsulting.co.uk. If you think you have found a bug, best to file an issue for it in the QGIS GitHub repository.

With the QGIS 3.18 release, we know we are only at the beginning of a long journey to provide great point cloud support to users. The decreasing cost of laser scanning (LIDAR) hardware and the increasing availability of photogrammetric methods mean that we will be seeing point cloud data in GIS more and more often. There is still a lot of functionality missing for efficient work with raw point clouds and for easy processing of data, and we are still using only a fraction of what the PDAL point cloud library offers. We are dedicated to providing first-class support for point clouds in QGIS - however, this cannot be done without funding. Therefore, if you would like to help us push point clouds in QGIS to the next level, please do not hesitate to contact us at info@lutraconsulting.co.uk.

You may also like...

Input, a field data collection app based on QGIS. Input makes field work easy with its simple interface and cloud-based sync. Available on Android and iOS.

April 06, 2021 06:00 AM

April 01, 2021

WTF (what2figures) is a revolutionary new global addressing system that can pinpoint your position on Earth with just two numbers.

In the past, if you wanted to share your location with someone else, you had to remember three whole words. Obviously, having to remember three unrelated words to share your location is needlessly complicated - except for Goku, who uses ‘carne, leite, pão’ even to synchronise a Kamehameha. Now you only need to remember two simple numbers!

Yes, my friend: just two numbers.

And that's not all… WTF is recognised by every major mapping company in the world. Enter your two-number WTF address into Google Maps, OpenStreetMap, Bing Maps or Apple Maps and it will be instantly recognised and shown on the map.

Finding your what2figures address could not be easier. Just click on the interactive WTF map and you will receive a unique two-number address. This two-number address pinpoints your location on Earth with millimetre precision. You can now share your WTF address with whoever you like. Click the Google Maps or OSM links and your WTF address will be shown on Google Maps or OpenStreetMap. If you want to share your current location with the rest of the world, you can also click the ‘Tweet’ link to post your two-number WTF address to your Twitter stream.

Use WTF and never get lost again.

Source: Maps Mania

The post What2figures um sistema de endereçamento global revolucionário appeared first on Narcélio de Sá.

by Narcélio de Sá at April 01, 2021 12:52 PM

Sometimes you just have to work with binary in your PostgreSQL database, and when you do, the bytea type is what you’ll be using. There are all kinds of reasons to work with bytea:

  • You’re literally storing binary things in columns, like image thumbnails.
  • You’re creating a binary output, like an image, a song, a protobuf, or a LIDAR file.
  • You’re using a binary transit format between two types, so they can interoperate without having to link to each other’s internal format functions. (This is my favourite trick for creating a library with optional PostGIS integration, like ogr_fdw.)

Today I was doing some debugging on the PostGIS raster code, testing out a new function for interpolating a grid surface from a non-uniform set of points, and I needed to be able to easily see what the raster looked like.

Interpolated raster surface grid

There’s a function to turn a PostGIS raster into a GDAL image format, so I could create image data right in the database, but in order to actually see the image, I needed to save it out as a file. How to do that without writing a custom program? Easy! (haha)

Basic steps:

  • Pipe the query of interest into the database
  • Access the image/music/whatever as a bytea
  • Convert that bytea to a hex string using encode()
  • Ensure psql is not wrapping the return in any extra cruft
  • Pipe the hex return value into xxd
  • Redirect into final output file

Here’s what it would look like if I was storing PNG thumbnails in my database and wanted to see one:

echo "SELECT encode(thumbnail, 'hex') FROM persons WHERE id = 12345" \
  | psql --quiet --tuples-only -d dbname \
  | xxd -r -p \
  > thumbnail.png

Any bytea output can be pushed through this chain; here’s what I was using to debug my ST_GDALGrid() function.

echo "SELECT encode(ST_AsGDALRaster(ST_GDALGrid('MULTIPOINT(10.5 9.5 1000, 11.5 8.5 1000, 10.5 8.5 500, 11.5 9.5 500)'::geometry, ST_AddBand(ST_MakeEmptyRaster(200, 400, 10, 10, 0.01, -0.005, 0, 0), '16BSI'), 'invdist' ), 'GTiff'), 'hex')" \
  | psql --quiet --tuples-only grid \
  | xxd -r -p \
  > testgrid.tif 
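And if you would rather pull the bytes out in code than in the shell, a minimal Python sketch of the same retrieval (using psycopg2; the persons/thumbnail names match the thumbnail example above):

import psycopg2

conn = psycopg2.connect(dbname='dbname')
with conn.cursor() as cur:
    cur.execute("SELECT thumbnail FROM persons WHERE id = %s", (12345,))
    data = cur.fetchone()[0]  # psycopg2 returns bytea as a memoryview

with open('thumbnail.png', 'wb') as f:
    f.write(bytes(data))
conn.close()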

April 01, 2021 08:00 AM

March 29, 2021

We are sharing a final degree thesis that may be very interesting for expanding your knowledge of gvSIG as a hydrological analysis tool.

An introduction to what you will find in this thesis, written by its author:

«Piura is a city that every so often is hit hard by the consequences of the intense rainfall that occurs in our region, especially during El Niño events. This thesis was born from the intention of understanding the physical characteristics and the flow contribution of one of the sub-basins that contributes significantly to the discharge of the Piura river during El Niño periods, and also of finding out how well free software such as gvSIG can perform compared with licensed software, whose limitation lies in the high cost of its licence, a requirement not everyone can meet.

Finally, the author hopes that the material made available will be useful to those who intend to carry out projects involving in-depth hydrological study».

The thesis is available as a PDF at this link: https://pirhua.udep.edu.pe/handle/11042/4810

by Alvaro at March 29, 2021 09:31 AM

dmorissette shows mobile app

Daniel Morissette, president of Mapgears, shows the mobile app used by over 90,000 snowmobilers and ATVers across North America every year.

After two years of a partnership that led to the conquest of the North American snowmobile and ATV trail management market, Mapgears has acquired Ondago, a transaction that will push the Canadian firm to new heights by adding solid mobile development expertise to the team.

A natural match

the new team

Daniel Morissette, Simon Mercier and Julien-Samuel Lacroix from Mapgears’ management team with their new collaborators from Ondago, Benoit Racine and Martin Guay

Mapgears is an expert in web mapping that has developed numerous tools to simplify the monitoring of operations and decision-making process of its clients around the world. The firm currently has customers in a wide variety of sectors, including marine navigation, public works, recreational trails and many others.

In order to meet the evolving needs of its clients and solidify its presence in the market, Mapgears quickly turned to mobile technologies and sought a partner with solid experience and a proven product in this market.

This is where the collaboration with Igloo Creations, the company behind the Ondago product which is already extremely popular in the Quebec tourism industry, was born. This product consists of a mapping platform and a mobile application available on iPhone, iPad and Android which allows users to easily locate themselves during their expeditions.

This relationship has evolved very naturally over the past two years due in large part to the complementary nature of the technologies and business processes, the similar needs of the customers and the in-depth knowledge of each team in their field.

With a combined total of more than a quarter of a million active users for mobile applications and hundreds of clients in the public and private sectors, the new Mapgears team now stands as a key player in mapping-based solutions in North America.

Improvements and new products to come

More concretely, existing Mapgears customers will have access to a well-rounded range of solutions and increased portability of their mapping applications, which will now be available on mobile. The development of new functionality will also be faster thanks to the addition of new talent to the team.

Customers of the mapping platform and users of the Ondago mobile application will be reassured to know that the original team will continue to provide the excellent level of service they are used to. They will also benefit from Mapgears’ expertise, which will greatly help in improving existing products and will lead to many new features.

We are hiring!

In order to support the rapid growth that is expected following this transaction, Mapgears is actively looking to recruit experienced mobile application developers who are looking for a challenge!

by Mapgears Team at March 29, 2021 09:00 AM