Welcome to Planet OSGeo

August 19, 2018

The PostGIS development team is pleased to release PostGIS 2.5.0rc1.

This release is a work in progress. Remaining time will be focused on bug fixes and documentation until PostGIS 2.5.0 release. Although this release will work for PostgreSQL 9.4 and above, to take full advantage of what PostGIS 2.5 offers, you should be running PostgreSQL 11beta3+ and GEOS 3.7.0rc1 which were released recently.

Best served with PostgreSQL 11beta3 which was recently released.

Changes since PostGIS 2.5.0beta2 release are as follows:

  • 4146, Fix compilation error against Postgres 12 (Raúl Marín).
  • 4147, 4148, Honor SOURCE_DATE_EPOCH when present (Christoph Berg).

View all closed tickets for 2.5.0.

After installing the binaries or after running pg_upgrade, make sure to do:


ALTER EXTENSION postgis UPDATE;
-- if you use the other extensions packaged with postgis,
-- make sure to upgrade those as well
ALTER EXTENSION postgis_topology UPDATE;
ALTER EXTENSION postgis_tiger_geocoder UPDATE;

If you use legacy.sql or legacy_minimal.sql, make sure to rerun the version packaged with these releases.


by Regina Obe at August 19, 2018 12:00 AM

August 16, 2018

Using synced data streams and CARTO VL, this map shows active fire perimeters and their growth since the beginning of July.

The initial goal of the map was to animate the growth of fires over time. Through the design and development process, other patterns began to emerge like the speed at which some fires are growing compared to others, and the spread pattern of fires over different geographic areas.

Read on to learn more about how we are leveraging synced data feeds, powerful visualization capabilities, and dynamic UI elements to power this near real-time fire map.

Link to live map


The fire polygons displayed on the map are derived from two datasets from GeoMAC Wildland Fire Support: active fire perimeters, and all (active and inactive) perimeters for the 2018 fire year. To display only active fires, the two datasets are intersected: each perimeter tagged as “active” in the first dataset selects only the polygons within its boundary from the all-2018 perimeter dataset. Once a fire perimeter is declared inactive, it no longer displays on the map.

In the background, smoke patterns produced from the fires are displayed using smoke perimeter data from NOAA’s Fire Products.

Each dataset syncs to the source every hour.

This means any time the data are updated, so is the map.

Cool, right?

Keep reading for more… :)

Animation and Visualization

There are so many visualization capabilities inside of CARTO VL, and lately one of my favorites to experiment with is animation.

In the active fires map, there are two different polygon animations happening.

First is the visualization of how fires grew over time inside their current perimeter. The polygons draw in according to the date the perimeter was collected (datecrnt), and each is colored based on its reported acreage (gisacres). The animation begins on July 1, 2018 and plays through until it reaches the polygon with the most recent date in the table, cycling through all of the data in 30 seconds. The polygons have no fade-in effect, but fade out based on reported acres. This means larger polygons stay on the map longer than smaller ones; since most fires spread from a small area to a larger one, this gradually fades out where a fire began and brings the larger, subsequent polygons to the foreground.

filter: animation(linear($datecrnt,time('2018-07-01T00:00:00Z'),globalMAX($datecrnt)),30,fade(0,$gisacres))

Each polygon is colored according to its reported acreage using one of our CARTOColor schemes (OrYel), where yellow is assigned to smaller areas through orange to red for the largest. The opacity of a polygon blends between 0.1 and 0.6 based on its area.

color: opacity(ramp(linear($gisacres,viewportMIN($gisacres),viewportMAX($gisacres)),oryel),blend(0.1,0.6,linear($gisacres)))
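As a simplified mental model (not CARTO's actual implementation), the linear normalization and blend used in the expression above can be sketched in plain Python:

```python
def linear(value, vmin, vmax):
    """Normalize value into [0, 1] over [vmin, vmax], clamping outside it."""
    return max(0.0, min(1.0, (value - vmin) / float(vmax - vmin)))

def blend(a, b, t):
    """Linearly interpolate between a and b by t in [0, 1]."""
    return a + (b - a) * t

# e.g. a 5,000-acre polygon in a viewport where fires span 100..20,000 acres
t = linear(5000, 100, 20000)
opacity = blend(0.1, 0.6, t)
print(round(opacity, 3))  # 0.223
```

The same pair of operations drives the color ramp: the normalized value picks a position along the OrYel scheme instead of an opacity range.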

Combining these animation and visualization techniques gives the visual impact of a growing and burning fire:

Animation and Visualization

The second animation, of smoke plumes, is more subtle, and so is its style. Unlike the fires, the smoke is not animated over time (since NOAA serves a new dataset each day). Given that smoke has a natural, flow-like movement, I wanted an effect where, on any given day, we could see which fires are generating the most smoke and get a sense of how the plumes are traveling. This is a fun technique to experiment with.

Smoke Animation

Viewport Based Styling

Another favorite CARTO VL feature of mine is the ability to style based on the current viewport rather than globally over an entire dataset.

You might have noticed that in the color styling above, we are ramping linearly across the viewportMIN and the viewportMAX.

With viewport-based styling we get a better understanding of fires at both the national and more localized levels.

In the image below, when the map is zoomed out to the western United States, the fires that really pop out are the Mendocino Complex and Carr fires in Northern California. We can see there are many other fires burning in different states, but since those are two of the largest in the current view, and polygons are colored according to their size, the other, comparably smaller fires are colored more along the yellow to orange range.

Viewport zoomed out

If we zoom into the series of fires in southern Oregon, removing both Mendocino and Carr from the view, the symbology dynamically updates to take into account only the fires in the current viewport. In this case we can see that the Klondike and Taylor Creek fires are the largest burning fires in this area.

Viewport zoomed in

Legend and Interactivity

Of course no map is complete without a legend and hover or pop-up components.

Our Head of Design Emilio created an informative yet unobtrusive legend that hierarchically presents the different levels of information. The dynamic time stamp that cycles through the animation and the color scale legend are key to understanding the patterns seen on the map.

And since we can’t fit all of the information in a legend or map symbol, Front-End Developer Jesus created hover-based interactivity that dynamically fetches the name and acreage of the current fire polygon.

Hover interactivity

Take a look!

You can find this map and associated code here!

We hope this map gets you even more excited about CARTO VL and we can’t wait to see the maps that you make!

Happy mapping!

August 16, 2018 10:00 AM

Anyone who has programmed geospatial software has eventually come to a conclusion about data formats: there is only one truly de facto standard for geospatial data, the shape file, and the shape file sucks.

  • It requires at least three files to define one spatial layer (more if you want to specify coordinate reference system, or character encoding, or spatial indexing).
  • It only supports column names of 10 or fewer characters.
  • It lacks a time or timestamp data type.
  • It is limited to 2GB in file size.
  • It only supports homogeneous spatial types for each layer.
  • It only supports text fields of up to 255 characters.

Almost since inventing it, Esri has been trying to come up with replacements.

The “personal geodatabase” used the Microsoft Access MDB format as a storage layer, and stuffed geospatial information into that. Sadly, the format inherited all the limitations of MDB, which were substantial: file size, bloat, occasional corruption, and of course Windows-only platform dependencies.

File Geodatabase

The “file geodatabase” (FGDB) got around the MDB limitations by using a storage engine Esri wrote themselves. Unfortunately, the format was complex enough that they never fully specified it in a released document (and never seemed to want to). As a result, official support has only been available through a proprietary binary API.

The FGDB format is very close to a shape file replacement.

  • It supports multiple layers in one directory.
  • It has no size limitations.
  • It has a rich set of data types.
  • It includes useful metadata about coordinate reference system and character encoding.

Since it shipped with ArcGIS 10 in 2010, the FGDB format has become popular in the Esri ecosystem, and it’s not uncommon to find FGDB files on government open data sites, or to receive them from GIS practitioners when requesting data.

CARTO has supported the shape file format since day 1, but we only recently added support for FGDB. We were able to support FGDB because the GDAL library we use in our import process has an open source read-only FGDB driver. Using the “open FGDB” driver allows us to use a stock build of GDAL without incorporating the proprietary Esri API libraries.

The file geodatabase format is a collection of files inside a directory named with a .gdb extension. In order to transfer that structure around, FGDB files are first zipped up. So, any FGDB data you receive will be a zip file that unzips to a .gdb directory.
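Since a .gdb is just a directory, the zip step can be scripted; here is a minimal Python sketch (the function name and paths are illustrative):

```python
import os
import zipfile

def zip_gdb(gdb_dir, zip_path):
    """Zip a .gdb directory (including the directory itself) for transfer."""
    parent = os.path.dirname(os.path.abspath(gdb_dir))
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(gdb_dir):
            for name in files:
                full = os.path.join(root, name)
                # store paths relative to the parent directory so the
                # archive unpacks back to a Foo.gdb/ directory
                zf.write(full, os.path.relpath(full, parent))

# zip_gdb("Election.gdb", "Election.zip")
```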

FGDB data are loaded to CARTO just like any other format.

  • Use the “New Dataset” option and either browse to your FGDB .zip file or drag’n’drop it in.
  • Or, just drag’n’drop the .zip file directly into the datasets dashboard.

After loading, you will have one new dataset in your account for each layer in the FGDB, named using a datasetname_layername pattern.

For example, the Elections.zip file from Clark County, Nevada, includes 11 layers, as we can see by looking at the ogrinfo output for the file.

INFO: Open of `Election.gdb'
      using driver `OpenFileGDB' successful.
1: senate_p (Multi Polygon)
2: school_p (Multi Polygon)
3: regent_p (Multi Polygon)
4: precinct_p (Multi Polygon)
5: ward_p (Multi Polygon)
6: congress_p (Multi Polygon)
7: pollpnts_x (Point)
8: educat_p (Multi Polygon)
9: township_p (Multi Polygon)
10: commiss_p (Multi Polygon)
11: assembly_p (Multi Polygon)

After upload, the file has been converted to 11 datasets with the standard naming pattern.

Multiple FGDB Layers
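As a rough sketch of that naming pattern (the exact normalization CARTO applies may differ), you can preview the resulting dataset names from an ogrinfo-style layer listing:

```python
import re

def carto_dataset_names(filename, ogrinfo_layers):
    """Preview datasetname_layername names from an ogrinfo layer
    listing with lines like '1: senate_p (Multi Polygon)'."""
    base = filename.lower().rsplit(".", 1)[0]
    names = []
    for line in ogrinfo_layers.splitlines():
        match = re.match(r"\s*\d+:\s*(\w+)", line)
        if match:
            names.append("%s_%s" % (base, match.group(1)))
    return names

layers = """1: senate_p (Multi Polygon)
2: school_p (Multi Polygon)"""
print(carto_dataset_names("Election.gdb", layers))
# ['election_senate_p', 'election_school_p']
```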


If the FGDB format is so much better than shape files, why doesn’t the story end there?

Because FGDB still has a couple major problems:

  • There is no open source way to write to an FGDB file: that requires the proprietary Esri API libraries.
  • The FGDB format is a directory, so shipping it around involves annoying extra zip/unzip steps each time.
  • The FGDB format is closed, so there is no way to extend it for special use cases.

A couple of years after FGDB was released, the Open Geospatial Consortium (OGC) took on the task of defining a “shape file replacement” format that learned all the lessons of shape files, personal geodatabases, and file geodatabases.

  • Use open source SQLite as the storage engine: more reliable and platform independent than MDB, with the advantage of easy, language-independent read/write access via SQL.
    • The SQLite engine is open source and multi-platform, so no Windows dependency.
    • The SQLite engine stores data in a single file, so no need to zip/unzip all the time.
  • Leverage existing OGC standards like the WKT standard for spatial reference systems, and the WKB standard for binary geometry representation.
  • Document the format and include an extension mechanism so it can evolve over time and so third parties can experiment with new extensions.

The result is the GeoPackage (GPKG) format, which has become widely used in the open source world, and increasingly throughout the geospatial software ecosystem.

Loading GeoPackage into CARTO now works exactly the same as FGDB: use the “New Dataset” page, or just drag the file into the dataset dashboard. All the layers will be imported, using the filename_layername pattern.

You can also now use GeoPackage as an export format! Click the export button and select the GPKG format, and you’ll get a single-layer GeoPackage with your table inside, ready for sharing with the world.

Download GPKG Layers

Thanks, GDAL!

All this works because of the wonderful multi-format tools in the GDAL library, which we use as part of our import process. You can exercise the power of GDAL yourself to directly solve your CARTO ETL problems using the ogr2ogr and ogrinfo tools in GDAL, check it out!

August 16, 2018 09:30 AM

August 15, 2018

This is Emit #5, in a series of blog-posts around the Smart Emission Platform, an Open Source software component framework that facilitates the acquisition, processing and (OGC web-API) unlocking of spatiotemporal sensor-data, mainly for Air Quality and other environmental sensor-data like noise.

Summer holidays and a heat-wave strike The Netherlands. Time for some lighter material, mainly told in pictures. As highlighted in Emit #2, I have the honor of doing a project for the European Union Joint Research Centre (EU JRC): deploying five AirSensEUR (ASE) boxes within The Netherlands and attaching them to the Smart Emission Platform in cooperation with RIVM (the National Institute for Public Health and the Environment). The ASE boxes measure four Air Quality (AQ) indicators: NO2 (Nitrogen Dioxide), NO (Nitrogen Monoxide), O3 (Ozone) and CO (Carbon Monoxide), plus meteo (Temp, Humidity, Air Pressure) and GPS. Read more on ASE in this article.

ASE Architecture

The ASE is an open hardware/software platform that can be configured with multiple brands and types of sensors. In the current case, all four AQ sensors mentioned above are from AlphaSense. As these are relatively cheap sensors (under $100), the challenge is to have them calibrated before final deployment. This calibration is done by first placing the ASE boxes at an RIVM station, gathering data for a month or two, and then calibrating the sensors against official RIVM reference measurements at the same location. Using both the raw ASE data and the RIVM reference data, the calibration “formulae” can be determined before placing the ASEs at their final deployment locations around The Netherlands, where the Smart Emission Platform will assemble/publish the (calibrated) data for the next year or so. More info on AirSensEUR via this Google Search.
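The calibration step essentially amounts to fitting raw sensor readings against co-located reference measurements. As an illustration only (a simple least-squares line on synthetic numbers, not the actual JRC/RIVM calibration models, which are more elaborate):

```python
def fit_linear_calibration(raw, reference):
    """Least-squares fit of reference = a * raw + b from paired samples."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# synthetic example: reference NO2 happens to be 2 * raw + 1
raw = [1.0, 2.0, 3.0, 4.0]
ref = [3.0, 5.0, 7.0, 9.0]
a, b = fit_linear_calibration(raw, ref)
print(round(a, 3), round(b, 3))  # 2.0 1.0
```

Once a and b are known, every subsequent raw reading can be mapped to a calibrated value with `a * reading + b`.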

Ok, picture time! Explanatory text is below each picture.

1. ASEs unboxed

Picture 1: Boxes arrived from EU JRC Italy on June 12, 2018. Assembling: upper left shows the (total of 20) AlphaSense sensors like “blisters” (Dutch “doordrukstrips”), the ASE box (with screwdrivers on top) and the protecting metal outer shield on the right.

2. placing AlphaSense sensors in sockets

Picture 2: Very carefully placing the AlphaSense sensors in the ASE Sensor Shield (an Arduino-like board) without touching the top-membrane!

3. All sensors firmly in their sockets

Picture 3: all sensors placed; next up: attach power, connect to the network, and continue configuring!

4. Boxes humming and connected via WIFI to the LAN

Picture 4: On default startup (via touch buttons) the ASE exposes a default WIFI access point. This can be used to connect and log in to the “ASE Host Board”, a Raspberry Pi-like board running standard Linux Debian. SSH into each box and configure it further, e.g. change the WIFI settings so it becomes a WIFI client, first having all boxes connect to the local office WLAN.

5. configured for InfluxDB Data Push visualized via Grafana

Picture 5. Each box runs a Data Aggregator and can be configured to push data to a remote InfluxDB database. In our case we have set up a Smart Emission InfluxDB Data Collector where the (raw) data is received. This InfluxDB datastore is visualized using the Grafana panel shown in the picture. We see the five boxes ASE_NL_01-05 sensing and pushing data!


6. All packed and in trunk of my car

Picture 6. A good start, but next we need to go out and place the boxes at the RIVM station for a period of calibration. So: tearing down, packing, and everything into the trunk of my car. Off to the RIVM station! July 30, 2018, still 35 degrees C outside.

7. The RIVM sensor station, right near the highway

Picture 7. July 30, 2018, 13:00. Arrived at the RIVM station. Now to figure out how to attach the five boxes. The lower horizontal iron pole seems the best option. Put all soft/hardware knowledge away, now real plumbing is required!

8. Could not have made this without the great help of Jan Vonk (RIVM)

Picture 8. Jan Vonk of RIVM, who has also deployed about 12 ASEs, placing the first boxes on the horizontal pole; so far so good.

9. All five boxes attached!

Picture 9. All five boxes strapped to the pole. Jan Vonk doing the hard work. Next challenge: they need power and WIFI…

10. Connecting to power…

Picture 10. One cannot have enough power sockets.

11. Power supplies covered under plastic box.

Picture 11. Covering all power supply stuff under tightened box shielded from rain.

12. Moment of truth starting up and attaching to local WIFI

Picture 12. July 30, 2018, 17:00. Last challenge: booting up the boxes and having them connect to the local RIVM station’s WIFI. I had pre-configured the WLAN settings in each box, but this is always a moment of truth: will they connect? If they do, they will start sampling and push their raw data to the Smart Emission Platform, and we can start the calibration period. And success… they connected!

13. All boxes connected and sampling and pushing data.

Picture 13. Now, on August 15, 2018, after minor hiccups, and with great help from the JRC folks Marco Signorini and Michel Gerboles, we have all five boxes sampling and pushing data for the calibration period. The above plot shows raw NO2 data, yet to be calibrated.

A next step for the RIVM Program “Together Measuring Air Quality”.

So, a good start! The heatwave is over; the next hard work is calibration. Why are we doing this? Well, as with meteorology, RIVM and others are encouraging basically anyone, from groups of citizens to individuals, to measure Air Quality. To this end RIVM has set up the program “Samen meten aan Luchtkwaliteit” (“Together measuring air quality”). Measuring Air Quality is not an easy task. We need to learn by doing, make mistakes, and spread the knowledge gained. Both AirSensEUR and Smart Emission are therefore Open. Below are some further links:

Smart Emission: GitHub, WebSite, Documentation, and Docker Images.

by Just van den Broecke at August 15, 2018 09:01 PM

August 14, 2018

There are only three certainties in life: death, taxes, and the constant growth in data sizes.

To deal with the latter, we have introduced a new mode to our SQL API: copy mode.

The new /sql/copyfrom and /sql/copyto end points are direct pipes to the underlying PostgreSQL COPY command, allowing very fast bulk table loading and unloading.

With the right use of the HTTP protocol, data can stream directly from a file on your local disk to the CARTO cloud database.

What’s The Difference?

Import API

When you import a file using the dashboard, or the Import API, we first pull a copy up to the cloud, so we make one copy.

Then, we analyze your file a bit. If it’s a text file, we’ll try and figure out what columns might be geometry. Once we’re pretty sure what it is, we’ll run an OGR conversion process to bring it into the CARTO database. So we’ve made another copy (and we get rid of the staging copy).

Once it is in the database, we still aren’t quite done yet! We need to make sure the table has all the columns the rest of the CARTO platform expects, which usually involves making one final copy of the table, and removing the intermediate copy.

Import API Process

That’s a lot of copying and analysis!

On the upside, you can throw almost any old delimited text file at the Import API and it will make a good faith effort to ensure that at the end of the process you’ll have something you can put on a map.

The downside is all the analyzing and staging and copying takes time. So there’s an upper limit to the file size you can import, and the waiting time can be long.

Also, you can only import a full table; there's no way to append data to an existing table, so for some use cases the Import API is a poor fit.


SQL API

In order to achieve a “no copy” stream from your file to the CARTO database, we make use of HTTP chunked transfer encoding to send the body of a POST message in multiple parts. We also accept non-chunked POST messages, but for streaming large files, chunked encoding lowers the load on our servers and speeds up the process. Ambitious clients can even use a compressed encoding for more efficient use of bandwidth.
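As a sketch of the client side of this (illustrative only; the generator below is ours, not part of CARTO's tooling), a file can be compressed and cut into chunks on the fly, ready to hand to an HTTP client that supports chunked uploads:

```python
import gzip
import io

def gzip_chunks(fileobj, chunk_size=64 * 1024):
    """Read a file object piece by piece, compressing on the fly, and
    yield compressed chunks as the compressor emits them."""
    buf = io.BytesIO()
    gz = gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=1)
    while True:
        piece = fileobj.read(chunk_size)
        if not piece:
            break
        gz.write(piece)
        if buf.tell():  # the compressor emitted some bytes
            yield buf.getvalue()
            buf.seek(0)
            buf.truncate()
    gz.close()  # flushes remaining data and the gzip trailer
    if buf.tell():
        yield buf.getvalue()
```

Passing such a generator as the body of a POST (for example, `data=gzip_chunks(open("upload.csv", "rb"))` with the Python requests library, together with the Content-Encoding and Transfer-Encoding headers shown below) results in a chunked, compressed upload without buffering the whole file.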

SQL API Copy Process

At our SQL API web service, we accept the HTTP POST payload chunks and stream them directly into the database as a PostgreSQL COPY, using the handy node-pg-query-stream module.

The upside is an upload that can be ten or more times faster than using the Import API, and supports appending to existing tables. You also have full control of the upload process, to tweak to your exact specifications.

The downside is… that you have full control of the upload process. All the work the Import API usually does is now delegated to you:

  • You will have to create your target table manually, using a CREATE TABLE call via the SQL API before running your COPY upload.
  • If your upload file doesn’t have a geometry column, you’ll have to compose one on your side for optimum performance.

    • You can use a post-upload SQL command to, for example, generate a point geometry from a latitude and longitude column, but that will re-write the whole table, which is precisely what we’re trying to avoid.
  • You will have to run CDB_CartodbfyTable() yourself manually to register your uploaded table with the dashboard so you can see it. For maximum speed, you’ll want to ensure your table already contains the required CARTO columns or the “cartodbfy” process will force a table rewrite to fill them in for you.
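For instance, composing the geometry client-side can be as simple as prepending an EWKT column built from existing longitude/latitude columns before upload (the column names here are hypothetical):

```python
import csv
import io

def add_ewkt_geom(in_csv, lon_col, lat_col):
    """Prepend a the_geom column in EWKT form, built from existing
    longitude/latitude columns, so geometry is composed client-side."""
    reader = csv.DictReader(io.StringIO(in_csv))
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(["the_geom"] + reader.fieldnames)
    for row in reader:
        ewkt = "SRID=4326;POINT(%s %s)" % (row[lon_col], row[lat_col])
        writer.writerow([ewkt] + [row[f] for f in reader.fieldnames])
    return out.getvalue()

src = "name,lon,lat\nNorth West,-126,54\n"
print(add_ewkt_geom(src, "lon", "lat"))
```

The resulting file can then be streamed straight into a geometry column via COPY, with no post-upload rewrite.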

For Example…

Suppose you had a simple CSV file like this:

SRID=4326;POINT(-126 54),North West,89
SRID=4326;POINT(-96 34),South East,99
SRID=4326;POINT(-6 -25),Souther Easter,124

You would create a table using this DDL:

CREATE TABLE upload_example (
    the_geom geometry,
    name text,
    age integer
);
Then “cartodbfy” the table so it is visible in the dashboard:

SELECT CDB_CartodbfyTable('upload_example');

And finally, upload the file:

COPY upload_example (the_geom, name, age)
FROM STDIN WITH (FORMAT csv);

A copy call consists of two parts: an invocation of the COPY SQL command to specify the target table and the format of the input file, and the file payload itself.

For example, this shell script pipes a CSV file through a compressor and then to a streamed curl POST upload, so the data moves directly from the file to CARTO.



# $copy_url is assumed to be the SQL API /sql/copyfrom URL, carrying
# the COPY command and your api_key as query parameters
filename=$1
copy_url=$2

cat "$filename" \
| gzip -1 \
| curl \
  -X POST \
  -H "Content-Encoding: gzip" \
  -H "Transfer-Encoding: chunked" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @- \
  "$copy_url"

Note that the COPY command specifies the format of the incoming file, so the database knows where to route the various columns:

  • The tablename (column1, column2, column3) portion tells the system what database column to route each column of the file to. In this way you can load files that have fewer columns than the target table.
  • FORMAT CSV tells the system that the format is a delimited one (with comma as the default delimiter).
  • HEADER TRUE tells the system that the first line is a header, not data, so it should be ignored. The system will not use the header to route columns from the file to table.

Also note that for upload compression, on a fast network, a light compression (see the -1 flag on the gzip command) works best, because it balances the performance improvement of smaller size with the cost of decompressing the payload at the server, for the fastest overall speed.
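You can measure that tradeoff yourself; a quick sketch comparing gzip levels on a repetitive CSV-like payload:

```python
import gzip
import time

# A repetitive CSV-like payload, similar in spirit to the upload above.
payload = b"SRID=4326;POINT(-126 54),North West,89\n" * 50000

for level in (1, 6, 9):
    start = time.perf_counter()
    size = len(gzip.compress(payload, compresslevel=level))
    elapsed = time.perf_counter() - start
    print("level %d: %d bytes in %.3fs" % (level, size, elapsed))
```

On a fast link, the extra bytes saved by level 9 are usually cheaper to send than the extra CPU time is to spend, which is why the script above uses `gzip -1`.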

Next Steps

  • If you’re interested in using the SQL API COPY infrastructure for your uploads or ETL, start with the SQL API documentation for COPY. There are some basic examples for streaming data with Python and using curl for uploads and downloads.
  • You can do lots of basic ETL to and from CARTO using the ogr2ogr utility. The next release will include support for COPY, for a 2x speed-up, but even without it, the utility offers a convenient way to upload data and append to existing tables without going through the Import API.

August 14, 2018 04:50 PM

Every map needs retouching.

Either because your first attempt left it looking bad, or because someone asked you to change some data. Usually, it's the latter.

In some situations these changes are quick: tweak a color here, another there, adjust the scale settings, and everything is done.

However, not all requests are that easy. Some require modifying our entire database, that is, our whole Attribute Table.

Imagine the following situation, which we will explore in this tutorial: you made a land use and land cover map of a given study area and divided the uses into:

  • Pastagem (pasture);
  • Área Urbana (urban area);
  • Reflorestamento (reforestation);
  • Agricultura (agriculture);
  • Vegetação Secundária Estágio Inicial (early-stage secondary vegetation);
  • Vegetação Secundária Estágio Médio (mid-stage secondary vegetation); and
  • Vegetação Secundária Estágio Avançado (late-stage secondary vegetation).

So far, so good.

But suppose the client wants to change the map and, instead of the terms Pastagem, Área Urbana, Reflorestamento, and Agricultura, wants to use “Áreas Antropizadas” (anthropized areas).

If your land-use shapefile has only a few polygons, this can be quick, but what if you have more than 100? 1,000? Are you going to edit them one by one? No.

Follow this post and learn how to use the Field Calculator to solve this problem in ArcGIS and QGIS.

Preparing a File for the Tutorial

In ArcGIS, a shapefile can be created through ArcToolbox. There, look for the Create Feature Class tool, found under Data Management Tools > Feature Class.

In it, you provide information such as the location of the shapefile to be created (Feature Class Location), the file name (Feature Class Name), and the geometry type (Geometry Type).

The other fields are optional.

With the shapefile created, draw a few polygons and enter the classes listed above in the attribute table. The figure below shows the shapefile we created and its attribute table.

Shapefile created for the Blog 2 Engenheiros tutorial.

In QGIS, shapefiles are created via the menu Layer > Create Layer > New Shapefile Layer.

A window will open where you fill in the geometry type (point, line, or polygon), the coordinate system, and the fields of the attribute table.

When you click OK, QGIS asks where you want to save the new file. The figure below shows the result.

Shapefile created for the Blog 2 Engenheiros tutorial.

Replacement in ArcGIS

Now, with our shapefile in hand, we will make the change requested by our client.

Remember that changes made to the attribute table without enabling edit mode are permanent.

Right-click the shapefile and select Open Attribute Table. Next, we will create a new column so we don't lose the original information we are about to change.

We will call the new column uso_solo2 (new columns are created by clicking Table Options and then Add Field).

A new window will open asking for the name of the new column and the type of data it will hold. In our case, the data type is Text.

Once you have created the new column, right-click it and select Field Calculator. There we will use a Python function to replace many manual steps with just a few lines of code.

Before writing the Python code, remember to select the “Python” option (at the top of the window) and check “Show Codeblock”, which enables the “Pre-Logic Script Code” box, where we enter the code below.

def TrocarB2E( Valor ):
  if Valor == "Pastagem" or Valor == "Área Urbana" or Valor == "Reflorestamento" or Valor == "Agricultura":
    return u"Área Antropizada"
  return Valor

This code receives a given Valor and, if it equals Pastagem, Área Urbana, Reflorestamento, or Agricultura, returns the text Área Antropizada; otherwise, it returns the input value unchanged.

After entering this code in the “Pre-Logic Script Code” box, in the next text box, which bears the name of the column (in our case, uso_solo2), we write our function, open parentheses, and pass in the uso_solo column whose values are to be converted, as in the code below.

TrocarB2E( !uso_solo! )
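Outside ArcGIS, the same replacement logic can be tried in standalone Python (sample values only; the set-based form below is an equivalent of the codeblock's chained comparisons):

```python
# -*- coding: utf-8 -*-
# Land-use classes to be merged into a single anthropized class.
ANTROPIZADAS = {"Pastagem", "Área Urbana", "Reflorestamento", "Agricultura"}

def trocar_b2e(valor):
    """Return 'Área Antropizada' for anthropized land-use classes,
    otherwise return the input value unchanged."""
    return "Área Antropizada" if valor in ANTROPIZADAS else valor

valores = ["Pastagem", "Agricultura", "Vegetação Secundária Estágio Inicial"]
print([trocar_b2e(v) for v in valores])
# ['Área Antropizada', 'Área Antropizada', 'Vegetação Secundária Estágio Inicial']
```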

The result, along with the fields filled in for the Field Calculator, is shown in the figure below.

Python code and results of the Field Calculator operation in ArcGIS.

Replacement in QGIS

In QGIS, with our shapefile already added, open its attribute table by right-clicking the layer and selecting Open Attribute Table.

In the attribute table there is a button with an abacus icon; click it (or use the shortcut Ctrl+I). This opens the QGIS Field Calculator.

Remember to check the Create New Field box so that the process generates a new column with the new values.

In the expression window, use the code below, where each line represents a condition and its expected result.

CASE
  WHEN "uso_solo" = 'Pastagem' THEN 'Área Antropizada'
  WHEN "uso_solo" = 'Área Urbana' THEN 'Área Antropizada'
  WHEN "uso_solo" = 'Reflorestamento' THEN 'Área Antropizada'
  WHEN "uso_solo" = 'Agricultura' THEN 'Área Antropizada'
  ELSE "uso_solo"
END

After clicking OK, a new column is generated with the new classes. The figure below shows the procedure and the result in QGIS.

Field Calculator procedure and result in QGIS.

Note that in the QGIS Field Calculator, column names (in double quotes) are distinguished from text values (in single quotes).

In the code we used, we created several “cases”: when (“WHEN”) the field value equals a given value, the corresponding result (“THEN”) follows.

Remember that this procedure can also be applied to integers, making it possible to replace values in one range with others.

If you have any questions, feel free to leave them in the comments and we will answer as soon as possible.

by Fernando BS at August 14, 2018 06:49 AM

August 13, 2018

August 11, 2018

The PostGIS development team is pleased to release PostGIS 2.5.0beta2.

This release is a work in progress. Remaining time will be focused on bug fixes and documentation until PostGIS 2.5.0 release. Although this release will work for PostgreSQL 9.4 and above, to take full advantage of what PostGIS 2.5 offers, you should be running PostgreSQL 11beta3+ and GEOS 3.7.0beta2 which were released recently.

Best served with PostgreSQL 11beta3 which was recently released.

Changes since PostGIS 2.5.0beta1 release are as follows:

  • 4115, Fix a bug that created MVTs with incorrect property values under parallel plans (Raúl Marín).
  • 4120, ST_AsMVTGeom: Clip using tile coordinates (Raúl Marín).
  • 4132, ST_Intersection on Raster now works without throwing TopologyException (Vinícius A.B. Schmidt, Darafei Praliaskouski).
  • 4109, Fix WKT parser accepting and interpreting numbers with multiple dots (Raúl Marín, Paul Ramsey).
  • 4140, Use user-provided CFLAGS in address standardizer and the topology module (Raúl Marín).
  • 4143, Fix backend crash when ST_OffsetCurve fails (Dan Baston).
  • 4145, Speedup MVT column parsing (Raúl Marín).

View all closed tickets for 2.5.0.

After installing the binaries or after running pg_upgrade, make sure to do:


ALTER EXTENSION postgis UPDATE;
-- if you use the other extensions packaged with postgis, make sure to upgrade those as well
ALTER EXTENSION postgis_topology UPDATE;
ALTER EXTENSION postgis_tiger_geocoder UPDATE;

If you use legacy.sql or legacy_minimal.sql, make sure to rerun the version packaged with these releases.


by Regina Obe at August 11, 2018 12:00 AM

August 09, 2018

In a tender's technical specifications I read, and not for the first time, that if bidders dare to present a solution based wholly or partly on free software, they must justify, in addition to the technical proposal, the stability, robustness and degree of market penetration of the free software components. Likewise, they must argue that these components are backed by a community of users and developers large enough to guarantee their future evolution and viability.

Which does not seem wrong to me, quite the opposite. But I wonder why this is only required when the proposal involves free software.

It would be good if, in those cases where bidders consider proposing proprietary software, the same were demanded of them. Not only robustness, stability and market presence, but also a community of users and developers large enough to guarantee the evolution and future of the technology. And of course, that developer community would need access to the source code to fulfil the latter. Otherwise, what happens is what happens: one day we have ArcIMS and the next we wake up with ArcGIS and are left stranded (in other words, pay up!), or the company changes its commercial policies or, and there are quite a few such cases, the company simply disappears or abandons a given technology.

Maybe that is why they do not ask for it. Because if the same were demanded of proprietary software as of free software, it would stand no chance. What is questionable is the legality, not to mention the ethics, of this kind of one-sided condition.

by Alvaro at August 09, 2018 09:38 AM

Dear reader,

Are you interested in learning how to work with spatial databases, and do you already have experience with some database? Then this is your opportunity!

GEOCURSOS has just launched Class 4 of the PostGIS DBA Course. This online course offers a complete overview, from a review of PostgreSQL to advanced PostGIS topics, showing how to work in full with this powerful spatial extension of the PostgreSQL database.

This course is made up of our PostGIS Basic (16 hours online) and PostGIS Advanced (20 hours online) courses and will take place between September 22nd and December 8th (on Saturdays).

If you bought the courses separately they would cost R$ 900.00. However, the course is on a special promotion and costs only R$ 599.00.

For more information and to see the full course syllabus, visit:


by Fernando Quadro at August 09, 2018 05:37 AM

August 07, 2018

This blog turns 12 years old today. Yes, we haven't posted much lately, but we are still around, busy with our projects, families, and lives. We keep sharing some content on our Twitter and LinkedIn networks and will maintain this place for sure. 12 years is already a lot, but our passion for geospatial content and for sharing it is no less than when we started this space.

Leave us a comment if you remember any content we created that you liked, or whatever comes to your mind!

We love you!!

Photo by Audrey Fretz on Unsplash

by Jorge at August 07, 2018 09:22 AM

August 04, 2018

Postgraduate course at the Universidad de Río Cuarto, October 22nd to 26th, 2018

Much of today's research in ecology and the environment requires technical skills in the advanced processing of large spatio-temporal datasets. There is thus an urgent need to train potential users in efficient ways of handling and analyzing the large volumes of data that space agencies and various other institutions make publicly available every day. This course will cover the processing and analysis of spatio-temporal data with one of the most popular open source Geographic Information Systems (GIS): GRASS GIS 7.


During this course participants will get an overview of the capabilities of GRASS GIS and hands-on experience in processing raster and vector data and time series of satellite products for ecological and environmental analyses. In addition, examples of spatial data analysis using the GRASS-R interface through the rgrass7 package will be presented.

The course, taught by Dr. Verónica Andreo, will be held from October 22nd to 26th, 2018 in the postgraduate classroom of the Facultad de Ciencias Exactas, Físico-Químicas y Naturales of the Universidad Nacional de Río Cuarto (Córdoba, Argentina).

Information and registration: cecilia.provensal@gmail.com / cprovensal@exa.unrc.edu.ar

Places are limited, don't miss it!


by veroandreo at August 04, 2018 02:31 PM

August 03, 2018

The Horton Machine, previously known as Jgrasstools, is a set of tools available in the latest gvSIG Desktop version. Thanks to The Horton Machine we have a lot of very diverse functionality in gvSIG, especially useful for working in areas such as hydrology (but not only!).

As the 'About Hydrology' blog has done, we want to help spread the presentation given at FOSS4G-Europe in Guimarães by the HydroloGIS company, together with the documentation used in the workshop, which will help you learn how to use this powerful set of tools.

You can find the presentation here.

And about the workshop material, you have:

by Mario at August 03, 2018 10:18 AM

The Horton Machine, previously known as Jgrasstools, is a set of tools available in the latest gvSIG Desktop version. Thanks to The Horton Machine we have a lot of very diverse functionality in gvSIG, especially useful for working in areas such as hydrology (but not only!).

As the 'About Hydrology' blog has done, we want to help spread the presentation given at FOSS4G-Europe in Guimarães by the HydroloGIS company, and the documentation used in the workshop, which will help you learn how to use this powerful set of tools.

You can find the presentation here.

And as for the workshop material, you have:

by Alvaro at August 03, 2018 09:42 AM

During the 14th International gvSIG Conference, a GODAN AgriGIS session will be held with presentations related to agriculture, and we invite you to present your projects.

Nearly 800 million people struggle with hunger and malnutrition in every corner of the world. That is one in every nine people, most of them women and children. We are convinced that the solution to Zero Hunger lies in existing agriculture and nutrition data, which is often unavailable. Global Open Data for Agriculture and Nutrition (GODAN) is an initiative that seeks to support global efforts to make relevant agricultural and nutrition data available and accessible, so that it can be used without restriction worldwide.

The initiative focuses on building high-level policy as well as public and private institutional support for open data. Open geospatial data and open geo tools are key to supporting and achieving the 2030 agenda for global food security.

We invite you to present your projects on AgriGIS, and research and examples on food security. To do so, just send your abstracts through the Communications section of the website before September 11th, 2018.


by Mario at August 03, 2018 08:24 AM

During the 14th International gvSIG Conference there will be a GODAN AgriGIS session with presentations in Agriculture theme, where we invite you to present your projects.

Nearly 800 million people struggle with debilitating hunger and malnutrition in every corner of the globe. That’s one in every nine people, with the majority being women and children. We are convinced that the solution to Zero Hunger lies within existing, but often unavailable, agriculture and nutrition data. Global Open Data for Agriculture and Nutrition (GODAN) is an initiative that seeks to support global efforts to make agricultural and nutritionally relevant data available, accessible, and usable for unrestricted use worldwide.

The initiative focuses on building high-level policy as well as public and private institutional support for open data. Open Geospatial data and open geo tools are key in supporting and achieving the 2030 agenda for Global Food Security.

Your contributions and inputs on AgriGIS, FoodSecurity research and examples are welcome. Please send your abstracts through the Communications section of the website by 11th September 2018.

by Mario at August 03, 2018 08:21 AM


State of the Map (SotM) is an annual international conference which brings together stakeholders in the OpenStreetMap ecosystem. The attendees range from enthusiastic mappers, software developers, academics and open-data evangelists to NGOs and companies using the geodata in their applications. But what connects all of them is a passion for open maps.

State of the Map 2018

This year, SotM took place from July 28th to 30th in Milan, Italy. The event was hosted by the Polytechnic University of Milan on their beautiful campus on Piazza Leonardo da Vinci. Our presentation about OpenMapTiles became a vanguard for one of the most discussed topics of the whole conference: vector tiles.

Vector tiles are the future

The common thread of the whole conference was vector tiles, a topic started by our presentation and followed by others showing how vector tiles are used in real-world deployments, like the use case of the Helsinki Regional Transport Authority.

However, the most important discussion was about vector tiles on the main page OpenStreetMap.org. While nothing is set in stone yet, there was a general agreement that the main page should slowly switch to vector tiles.

Vector tiles used by the Helsinki Regional Transport Authority


Data quality

Finding errors, preventing vandalism, reverting malicious edits. The search for an ideal system for keeping the data quality high is still ongoing. Currently, there are many useful tools and we can expect many other automated mechanisms for preventing data degradation, but none of them is fully capable of replacing a human editor (yet?).

A new data model for OpenStreetMap

A topic touched on by a few speakers as well as discussed in the OpenHistoryMap debate. The current data model was created some time ago and has some issues, like the inability to use two values for one key, a complicated way of working with relations, and redundancy.

While some speakers suggest just a simplification, using the phrase "Evolution, no revolution", others call for a radically different model similar to the one on Wikidata.

Evolution, no revolution approach of Jochen Topf


Public transport

Since the beginning of OpenStreetMap, public transport infrastructure was mapped on nodes. This was simple but failed to describe complex situations and caused problems for routing. Therefore schema v2 was created, which fixed those issues, but the price was high complexity for mappers. Consequently, the new public transport schema v3 is in the proposal stage right now.

However, the issue of public transport is more complicated because of informal systems in the global south or former Soviet Union countries. A few talks raised the topic of mapping in developing countries, and many discussions focused on how to map such systems and even whether public transport data belongs in OpenStreetMap.

Humanitarian mapping

Unlike commercial maps, OpenStreetMap is not focused just on the most profitable parts of the world. There are projects like the Humanitarian OpenStreetMap Team, Missing Maps and others which map the developing world to give a voice to underrepresented communities. Thanks to grants from the OpenStreetMap Foundation, many representatives of local communities were able to make it to the conference and increased its diversity.

Talking about diversity, the topic of gender representation in OpenStreetMap was reflected as well, with a few proposals on how to improve the situation, but the solution is a long-distance run.

Photo ©Francesco Giunta CC-BY-SA 4.0


Social event

Probably the most relaxing part of the whole conference: a great atmosphere, delicious food, good music and a lot of friendly people open to chatting about maps and beyond.

A big thank you to the organizers, and see you next year!

Now that this year's SotM is over, we would like to say a big thank you to all the organizers, the OpenStreetMap Foundation and all the speakers for making such an outstanding event possible.

In case you missed any interesting talk, you can find most of them on YouTube, including ours about generating your own vector tiles from OpenStreetMap data, with slides on SlideShare.

See you all next year at SotM 2019 in Heidelberg!

by Petr Pridal (info@klokantech.com) at August 03, 2018 07:00 AM

August 02, 2018

This announcement is a double-whammy because they are both somewhat intertwined.

Firstly, we'll start with a new build of Fusion that cleans up several aspects of the viewer framework. These changes in Fusion are slated for inclusion in a future release of MapGuide.

New PHP entry point for fusion templates

This new build of Fusion introduces a new PHP entry point for the 5 fusion templates.

Instead of loading, say, the slate template like so:


You can now load it like so:


This entry point supports all the same query string parameters as the original templates but with support for the following additional parameters:
  • template (required) - The name of the template to load 
  • debug (optional) - If set to 1, the entry point will use fusion.js instead of fusionSF-compressed.js, making the whole debug/development process much simpler.
The entry point also fetches the application definition and writes it out to the underlying template as JSON, avoiding the initial round trip to fetch this document, among other things.
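As an illustration of those parameters (the host, entry-point file name and ApplicationDefinition resource id below are hypothetical placeholders; only `template` and `debug` come from the post), a URL for the new entry point could be assembled like this:

```python
# Assembling a query string for the new Fusion entry point.
# Host, PHP file name and resource id are placeholders, not real values.
from urllib.parse import urlencode

params = {
    "ApplicationDefinition": "Library://Samples/Slate.ApplicationDefinition",
    "template": "slate",  # required: which of the 5 templates to load
    "debug": 1,           # optional: serve fusion.js instead of fusionSF-compressed.js
}
url = "http://localhost/mapguide/fusion/index.php?" + urlencode(params)
print(url)
```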

Fixing external base layer support 

The way external base layers are currently supported in Fusion is a bit crufty, which this build also addresses:
  • Fusion no longer attempts to append script tags to support Google Maps / Bing / OSM / Stamen. The way we do it is unsafe according to modern browsers. That is now the responsibility of the new entry point.
  • For Bing / OSM / Stamen, we no longer load external helper scripts at all. This is because such support is already present in OpenLayers itself and the external scripts are merely convenience wrappers that we can easily implement ourselves in Fusion.
  • Finally, XYZ layers are now properly supported in Fusion. This means you can do things like consuming OSM tiles from your own OSM tile server, use non-watermarked CycleMap/TransportMap layers, and finally consume your own MapGuide XYZ Tile Set Definitions.
Now, the reason this announcement is a double-header is that these changes in Fusion really need tooling assistance to take best advantage of them, so here's also a new release of MapGuide Maestro. Here are the notable changes.

Fusion XYZ layer editor support

Now that we have proper XYZ tile layer support in Fusion, we also have support for consuming external XYZ tile sets.

Because MapGuide XYZ Tile Set Definitions can now be consumed with Fusion, you can specify a tile set URL for such a Tile Set Definition by clicking the new Add from XYZ Tile Set Definition toolbar button.

If you want to consume OSM tiles from your own OSM tile server you can use this editor and point it to your OSM tile server with the requisite ${x}, ${y} and ${z} placeholders.
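Those ${x}, ${y} and ${z} placeholders happen to match Python's `string.Template` syntax, so the expansion can be sketched like this (the tile server URL is a made-up placeholder):

```python
# Expanding an XYZ tile URL template; the server address is illustrative.
from string import Template

url_template = Template("https://tiles.example.com/osm/${z}/${x}/${y}.png")
url = url_template.substitute(z=3, x=4, y=6)
print(url)  # https://tiles.example.com/osm/3/4/6.png
```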

RtMapInspector tool improvements 

The other major feature of this new release is improvements to the RtMapInspector.

For a refresher, the RtMapInspector tool was introduced way back in a beta release of Maestro 5.0, and its purpose was to easily inspect the state of any runtime map if you know its session id and map name, so you can easily debug map manipulation and state updates. Did your MapGuide application code correctly add your new layer? You can use this tool to inspect the map state and find out.

Having looked at this tool more recently, I've come to realise that I was only skimming the surface of what it is capable of, and with this release its capabilities have been significantly expanded.

For example, the tool now lets you see the map image for the current map state.

But it also occurred to me if we're inspecting a map, we can (and should be able to) also inspect its layers (and their feature sources) too! So when you select any layer in this tool, a new Inspect Layer button is enabled.

Clicking it will bring up an inspection dialog that shows the layer's XML and lets you interact with its feature source using the same components as the local feature source preview.

Other Changes
  • Now requires .NET Framework 4.7.1. The Windows installer will check for this (and install it if required)
  • The Maestro API now uses the latest stable releases of NetTopologySuite and GeoAPI
  • The MgTileSeeder tool now targets .NET Core 2.1

Download MapGuide Maestro
Download test build of Fusion

by Jackie Ng (noreply@blogger.com) at August 02, 2018 04:44 PM

by Alvaro at August 02, 2018 09:46 AM

Do you want to change the appearance of gvSIG Desktop? Then keep reading …

One of the novelties included in gvSIG 2.4 was a plugin that allows you to generate your own icon sets and apply them to the gvSIG Desktop interface. It allows changing the gvSIG style as well as having icons in different sizes. Apart from the 'classic' icon theme (16×16 pixels) in gvSIG Desktop 2.4, from the Add-ons Manager we could already install an icon set made by TreCC, available in 16×16 and 22×22 pixels.

Currently the gvSIG Association team is working on the upcoming 2.4.1 version (already in the stabilization phase) and, in parallel, on version 3.0, a version that will bring important changes, including some improvements related to the usability and aesthetics of the application. In relation to these issues we have been reviewing aspects such as the icon distribution, the icons used by several tools, etc., and the best way to do it has been to apply a new icon set, which, as a proof of concept, has allowed us to review the current status of this area of gvSIG Desktop. And although the motivation was to perform this test, the result is a new icon theme ready to be used in the application, which you can already find in the Add-ons Manager. The name of this plugin is 'gvSIG Black' and the icon resolution is 24×24 pixels.

How to install it:

  • Go to the 'Add-ons Manager' and mark the 'Installation from URL' option. Then search for and install the 'gvSIG Black 24×24 Icon Theme' plugin. Once installed, restart gvSIG.
  • From 'Preferences', 'General' section, 'Icon set' subsection, select 'gvSIG Black'. Restart gvSIG and you will see that it has been applied.
  • If you want to return to the 'classic' icon set, just repeat the process, installing the 'Classic Icon Theme' plugin.

In addition, we share a document that will help those who want to generate their own icon set to give gvSIG Desktop a look of their own. The document contains the main icons used in your favourite GIS, with images of the 3 icon sets currently available, and the path where each icon is stored.

And if you want to see all the gvSIG Desktop icons, you just have to launch the tool that generates the corresponding report from ‘Tools / Development / Show icon theme information’.

by Mario at August 02, 2018 08:57 AM

Do you feel like changing the look of gvSIG Desktop? Then keep reading…

One of the novelties introduced in gvSIG Desktop 2.4 was a plugin that allows you to generate your own icon sets and apply them to the gvSIG Desktop interface. This makes it possible to change the gvSIG style as well as to have icons in different sizes. In addition to the 'classic' icon theme (16×16 pixels) in gvSIG Desktop 2.4, through the Add-ons Manager we could already install an icon set made by the TreCC company, available in 16×16 and 22×22 pixels.

Currently the gvSIG Association team is working on the upcoming 2.4.1 version (already in the stabilization phase) and, in parallel, on version 3.0, a version that will bring important changes, among them some related to the usability and aesthetics of the application. In relation to the latter, we have been reviewing aspects such as the icon distribution, the icons used by various tools, etc., and the best way to do it has been to apply a new icon set which, as a proof of concept, allowed us to review the current status of this area of gvSIG Desktop. And although the motivation was to carry out that test, the result is a new icon theme ready to be used in the application, which you can already find in the Add-ons Manager. Its name is 'gvSIG Black' and its resolution is 24×24 pixels.

Steps to install it:

  • Go to the 'Add-ons Manager', select the 'Installation from URL' option, then search for and install the 'gvSIG Black 24×24 Icon Theme' plugin. Once installed, restart gvSIG.
  • From 'Preferences', 'General' section, 'Icon set' subsection, select 'gvSIG Black'. Restart gvSIG and you will see that it has been applied.
  • If you want to return to the 'classic' one, just repeat the process, installing the 'Classic Icon Theme' plugin.

We also share a document that will help those who want to generate their own icon set to give gvSIG Desktop a look of their own. The document contains the main icons used in your favourite GIS, with images of the 3 icon themes currently available, and the path where each icon is stored.

And if you want to see all the gvSIG Desktop icons, just launch the tool that generates the corresponding report from 'Tools / Development / Show icon theme information'.

by Alvaro at August 02, 2018 07:35 AM

One of the things I have been working on with the LocationTech crew is the JTS Topology Suite project.

JTS logo

We have had an exciting couple of months, with James Hughes and myself taking a code sprint to:

The other bit of R&D I have been working on has been explicit support for XYZM coordinates. Previously this was left as an exercise for the reader: designed for in the API, but something for downstream applications to figure out. The motivation to address this in JTS is to support the WKTReader / WKTWriter enhancements Felix Obermaier has been working on, but now that it has started, a wide range of projects have expressed interest.

The design trade-off is between treating the JTS Coordinate as a data object, managed in arrays and created and deleted as required, and managing coordinates in bulk. JTS itself only makes use of the coordinate.x and coordinate.y fields (which are public, as befits a data structure).

Here is what adding XYZM support looks like for Coordinate:

Methods have been added for getX(), getY(), getZ() and getM(), and they return NaN for anything they do not support.

The cloud projects at LocationTech, such as GeoMesa, run into a bit of trouble with this approach, as constantly creating and deleting Coordinate objects during processing causes a lot of memory pressure. The alternative is to manage information as a CoordinateSequence, such as the packed CoordinateSequence, which is backed by an array of doubles. There is lots of room for JTS to improve in this area, and we have watched downstream projects come up with all kinds of fascinating CoordinateSequence implementations we can learn from.

Here is what adding XYZM support looks like for CoordinateSequence:

The key addition here is clarifying the definition of dimension (the number of ordinates in each coordinate) and adding measures (the number of measures included in dimension for each coordinate). For XY data, dimension is 2 and measures is 0. For XYZM data, dimension is 4 and measures is 1.
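JTS is Java, but the accessor pattern plus the dimension/measures bookkeeping can be sketched in Python (the class and method names below merely mirror the ones mentioned above and are not part of JTS):

```python
# Sketch of the XYZM accessor pattern: getters return NaN for
# ordinates a coordinate does not carry. Illustrative only; JTS is Java.
import math

class CoordinateXY:
    dimension = 2  # number of ordinates per coordinate
    measures = 0   # number of measures included in dimension

    def __init__(self, x, y):
        self.x, self.y = x, y

    def get_x(self): return self.x
    def get_y(self): return self.y
    def get_z(self): return math.nan  # not stored for XY data
    def get_m(self): return math.nan  # not stored for XY data

c = CoordinateXY(1.0, 2.0)
print(c.get_x(), math.isnan(c.get_m()))
```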

FOSS4G 2018

If you are interested in hearing more about the JTS Topology Suite, join Rob Emanuele and myself at FOSS4G 2018 later this month for State of JTS 2018.

by Jody Garnett (noreply@blogger.com) at August 02, 2018 05:29 AM

August 01, 2018

The registration period for the 14th International gvSIG Conference is now open. The conference will be held from October 24th to 26th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain).

Registration is free of charge (limited capacity) and must be done through the application form on the conference web page.

The conference program and all the information about registration for workshops, and about the conference rooms will be available at the event website soon.

We also remind you that the deadline for submitting communication proposals has been extended to September 11th. You can find all the information about submitting communications on the conference website.

We look forward to your participation!

by Mario at August 01, 2018 11:12 AM

The registration period for the 14th International gvSIG Conference, to be held from October 24th to 26th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain), is now open.

Registration is completely free (limited capacity) and must be done through the form on the conference web page.

In a few weeks the program will be published with all the planned presentations and workshops, both for users and developers, with all the information about how to register for them.

We also remind you that the deadline for submitting abstracts has been extended, the new deadline being September 11th. The rules for submitting communications can be found on the event website.

We look forward to your participation!

by Mario at August 01, 2018 11:10 AM

I was running into some errors when installing Anaconda, Miniconda specifically.


I think part of the reason is that I have quite a few Python installs due to OSGeo4W.

My error:

Fatal Python error: Py_Initialize: unable to load the file system codec
ModuleNotFoundError: No module named 'encodings'

Current thread 0x00002554 (most recent call first):

The solution:

Update the activate.bat file that is called when launching from the Start menu. For me it is located in:


We need to add the following to clear out and reset the Python environment before launching Anaconda:

@SET PYTHONHOME=C:\ProgramData\Anaconda
@PATH C:\ProgramData\Anaconda;C:\ProgramData\Anaconda\Scripts;%PATH%

So, editing the file from:

@REM Test first character and last character of %1 to see if first character is a "
@REM   but the last character isn't.
@REM This was a bug as described in https://github.com/ContinuumIO/menuinst/issues/60
@REM When Anaconda Prompt has the form
@REM   %windir%\system32\cmd.exe "/K" "C:\Users\builder\Miniconda3\Scripts\activate.bat" "C:\Users\builder\Miniconda3"
@REM Rather than the correct
@REM    %windir%\system32\cmd.exe /K ""C:\Users\builder\Miniconda3\Scripts\activate.bat" "C:\Users\builder\Miniconda3""
@REM this solution taken from https://stackoverflow.com/a/31359867
@set "_args1=%1"
@set _args1_first=%_args1:~0,1%
@set _args1_last=%_args1:~-1%
@set _args1_first=%_args1_first:"=+%
@set _args1_last=%_args1_last:"=+%
@set _args1=

@if "%_args1_first%"=="+" if NOT "%_args1_last%"=="+" (
    @CALL "%~dp0..\Library\bin\conda.bat" activate
    @GOTO :End
)

@CALL "%~dp0..\Library\bin\conda.bat" activate %*

:End
@set _args1_first=
@set _args1_last=


@REM Test first character and last character of %1 to see if first character is a "
@REM   but the last character isn't.
@REM This was a bug as described in https://github.com/ContinuumIO/menuinst/issues/60
@REM When Anaconda Prompt has the form
@REM   %windir%\system32\cmd.exe "/K" "C:\Users\builder\Miniconda3\Scripts\activate.bat" "C:\Users\builder\Miniconda3"
@REM Rather than the correct
@REM    %windir%\system32\cmd.exe /K ""C:\Users\builder\Miniconda3\Scripts\activate.bat" "C:\Users\builder\Miniconda3""
@REM this solution taken from https://stackoverflow.com/a/31359867
@set "_args1=%1"
@set _args1_first=%_args1:~0,1%
@set _args1_last=%_args1:~-1%
@set _args1_first=%_args1_first:"=+%
@set _args1_last=%_args1_last:"=+%
@set _args1=

@SET PYTHONHOME=C:\ProgramData\Anaconda
@PATH C:\ProgramData\Anaconda;C:\ProgramData\Anaconda\Scripts;%PATH%

@if "%_args1_first%"=="+" if NOT "%_args1_last%"=="+" (
    @CALL "%~dp0..\Library\bin\conda.bat" activate
    @GOTO :End
)

@CALL "%~dp0..\Library\bin\conda.bat" activate %*

:End
@set _args1_first=
@set _args1_last=

Updating the paths as required.

This just clears out the Python and Windows environment variables before launching, similar to what OSGeo4W does.
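As a quick, illustrative sanity check (run in the Python that Anaconda launches), the `encodings` import is exactly what fails when PYTHONHOME points at the wrong install:

```python
# If PYTHONHOME points at the wrong install, 'import encodings' is what
# fails with "No module named 'encodings'", so importing it is a quick check.
import os
import sys

print(os.environ.get("PYTHONHOME"))  # should be the Anaconda dir, or unset
import encodings                     # succeeds only when the stdlib is found
print(sys.prefix)                    # the install the interpreter is using
```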

by Heikki Vesanto at August 01, 2018 08:00 AM

July 31, 2018

And we have reached the final part of our tutorial on creating QGIS plugins with Python and Qt Designer. In the first post, we showed how to create the plugin's base files and how to ask the user for temporary points to be inserted into the map.

In the second post, we saw how to save the shapefile of the inserted point, how to evaluate in which region of a given zoning the point is located, and how to fix errors in Python.

If you have not read the first and second parts, you can read them by clicking the links below.

In this third part of the tutorial, we will show how to create a map in the print composer, so that we can export the inserted point and the evaluated zoning to other users in PDF and PNG formats.

Preparing the Print Composer with PyQGIS

Before we start adding the functions related to the print composer, we need to import them. Insert the following lines at the top of the Python (.py) file:

from PyQt4.QtGui import QAction, QIcon, QFileDialog, QPrinter, QPainter, QImage, QColor # QPrinter, QPainter, QImage and QColor were added to this line
from PyQt4.QtCore import QSizeF, QSize # add this line: QSizeF and QSize are used when exporting
from qgis.utils import iface # add this line to import iface

Now that we have imported the necessary functions, let's build our map items step by step.

The next lines of code will be inserted at the end of our Python file, right after the check of whether the study area checkboxes are selected or not.

First, we will create a QgsComposition, which will receive the visual information of our map. See the code below.

# Creating a QgsComposition
mapRenderer = iface.mapCanvas().mapRenderer()
c = QgsComposition(mapRenderer)

The next step is to define where our map will be printed on an A4 sheet (QGIS defaults to A4 output at 300 dpi). Note that after the variables are created, the items are added to our composition using the addItem() function.

# Adding the map and defining where it will be placed on the page
x, y = 0, 0
w, h = c.paperWidth(), c.paperHeight()
composerMap = QgsComposerMap(c, x, y, w, h)
c.addItem(composerMap)

Now let's create an item that will hold our text and place it at a given position on the print.

Veja que dentro da função setText() devemos inserir o texto que será apresentado e na função setItemPosition() recebe a posição do nosso texto.

# Adds a text label to the generated map
composerLabel = QgsComposerLabel(c)
composerLabel.setText("Blog 2 Engenheiros")
composerLabel.setItemPosition(100, 0)
c.addItem(composerLabel) # Restored line: adds the label to the composition

Finally, let's now add the legend of our map.

# Adds a legend to the map
legend = QgsComposerLegend(c)
legend.model().setLayerSet(mapRenderer.layerSet()) # Fills the legend with the layers of the current map
c.addItem(legend) # Restored line: adds the legend to the composition

We now have the elements of our map in Python code. We could also include other items, such as a scale bar (QgsComposerScaleBar) and point-based polygons (QgsComposerPolygon), but we will leave them out of this tutorial.

Items such as pictures, arrows and tables are not yet supported, i.e. they cannot be inserted via Python in QGIS (QGIS 2.18 Documentation).

With the items in place, we now need to export the result to PDF or PNG.

Exporting the map to PDF

To export the created map, use the code below, remembering to replace (in the setOutputFileName function) the path where the PDF file will be generated.

In this example, I used the same path where the shapefile was saved.

# Exporting the result to PDF
printer = QPrinter()
printer.setOutputFormat(QPrinter.PdfFormat) # Restored line: without it QPrinter will not write a PDF
printer.setOutputFileName(localSalvo + "_mapa.pdf") # Path where the PDF will be generated
printer.setPaperSize(QSizeF(c.paperWidth(), c.paperHeight()), QPrinter.Millimeter)
printer.setFullPage(True)
printer.setResolution(c.printResolution())

pdfPainter = QPainter(printer)
paperRectMM = printer.pageRect(QPrinter.Millimeter)
paperRectPixel = printer.pageRect(QPrinter.DevicePixel)
c.render(pdfPainter, paperRectPixel, paperRectMM)
pdfPainter.end() # Restored line: finishes painting and flushes the PDF to disk

Note that we can change several options, such as the paper size (setPaperSize), whether printing fills the whole page (setFullPage) and the resolution (setResolution).

Exporting the map to PNG

Now, if you want the map in PNG format, you should use the code below instead of the one previously presented.

# Exporting the map to PNG
dpi = c.printResolution()
dpmm = dpi / 25.4
width = int(dpmm * c.paperWidth())
height = int(dpmm * c.paperHeight())

# Creates an output image and initializes it
image = QImage(QSize(width, height), QImage.Format_ARGB32)
image.setDotsPerMeterX(dpmm * 1000)
image.setDotsPerMeterY(dpmm * 1000)

# Renders the composition
imagePainter = QPainter(image)
c.renderPage( imagePainter, 0 )
imagePainter.end() # Restored line: finishes painting before saving

image.save(localSalvo+"_mapa.png", "png")
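As a quick sanity check on the dpi-to-pixel arithmetic above, the same conversion can be run in plain Python, outside QGIS (the 210 × 297 mm page size is an assumption, matching the default A4 composition):

```python
# Millimetre-to-pixel conversion used by the PNG export
dpi = 300                    # QGIS default print resolution
dpmm = dpi / 25.4            # dots per millimetre (25.4 mm per inch)

paper_w_mm, paper_h_mm = 210.0, 297.0  # assumed A4 page size

width = int(dpmm * paper_w_mm)
height = int(dpmm * paper_h_mm)

print(width, height)  # 2480 3507 pixels for an A4 page at 300 dpi
```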

If you followed the steps correctly and no error occurred, you will get the following map.

Map generated after following the procedures we presented.

The appearance of our map is not the best, so let's tweak the code we presented to make it more presentable, also adding which planning region and watershed the point falls in.

Improving the appearance of our map

The legend is the first item we will improve. See the code below (the lines with comments are the ones that were added).

# Adds a legend to the map
legend = QgsComposerLegend(c)
legend.setTitle("Legenda: ") # Adds or changes the legend title
legend.setFrameOutlineWidth(0.5) # Sets the width of the line around the legend
legend.setFrameEnabled(1) # Draws a frame around the legend

With the legend fixed, let's change the text that was inserted.

Besides showing “Blog 2 Engenheiros”, we will make it show where the point inserted by the user fell (according to watershed and planning region).

# Adds a text label to the generated map
composerLabel = QgsComposerLabel(c)
composerLabel.setText("Blog 2 Engenheiros: ")
composerLabel.setItemPosition(c.paperWidth() - 100, c.paperHeight() - 24)

# Adds a label with the location of the point that was added
regiaoLabel = QgsComposerLabel(c)
if self.dlg.checkBoxRP.isChecked():
    if rp_selection: # Restored branch: the intersection found a planning region
        regiaoLabel.setText(pnt_selection[0] + u" esta localizado em " + rp_selection[0])
    else:
        regiaoLabel.setText(u"O ponto fornecido esta fora da área de interesse!!")
else:
    regiaoLabel.setText(u"O item Região de Planejamento não foi selecionado.")
regiaoLabel.setItemPosition(c.paperWidth() - 100, c.paperHeight() - 19)

baciaLabel = QgsComposerLabel(c)
if self.dlg.checkBoxBH.isChecked():
    if bh_selection: # Restored branch: the intersection found a watershed
        baciaLabel.setText(pnt_selection[0] + " esta na " + bh_selection[0])
    else:
        baciaLabel.setText(u"O ponto fornecido esta fora da área de interesse!!")
else:
    baciaLabel.setText(u"O item Bacias Hidrográficas não foi selecionado.")
baciaLabel.setItemPosition(c.paperWidth() - 100, c.paperHeight() - 14)

Now let's add a coordinate grid and a border to our map, remembering that, for this, we must change the position where our map was placed.

# Adding the map and defining where it will be placed on the sheet
x, y = 25, 25 # Variables giving the origin of our map on the print sheet
w, h = c.paperWidth() - 50, c.paperHeight() - 50 # Variables giving the size of our map
composerMap = QgsComposerMap(c, x, y, w, h)

composerMap.setFrameEnabled(1) # Turns on the map border
composerMap.setGridEnabled(True) # Turns on the coordinate grid
composerMap.setGridIntervalX(20000) # Sets the grid interval along the X axis
composerMap.setGridIntervalY(20000) # Sets the grid interval along the Y axis
composerMap.setShowGridAnnotation(True) # Shows the coordinate values
#composerMap.setGridAnnotationPrecision(0) # Coordinate precision (zero = no decimal places) (for some reason, this call has no effect)
composerMap.setGridStyle(QgsComposerMap.Cross) # Grid style

# Sets the position and direction of the coordinates at the top of the map
composerMap.setGridAnnotationPosition(QgsComposerMap.OutsideMapFrame, QgsComposerMap.Top)
composerMap.setGridAnnotationDirection(QgsComposerMap.Horizontal, QgsComposerMap.Top)

# Disables coordinates at the bottom of the map
composerMap.setGridAnnotationPosition(QgsComposerMap.Disabled, QgsComposerMap.Bottom)

# Disables coordinates on the left side of the map
composerMap.setGridAnnotationPosition(QgsComposerMap.Disabled, QgsComposerMap.Left)

# Sets the position and direction of the coordinates on the right side of the map
composerMap.setGridAnnotationPosition(QgsComposerMap.OutsideMapFrame, QgsComposerMap.Right)
composerMap.setGridAnnotationDirection(QgsComposerMap.Vertical, QgsComposerMap.Right)

#composerMap.setAnnotationFrameDistance(1) # Distance between the coordinates and the map frame (for some reason, this call has no effect)
composerMap.setAnnotationFontColor(QColor(0, 0, 0)) # Coordinate color

The result of our changes is shown in the figure below.

After opening the shapefile with the RJ municipalities, we ran our plugin and this is the result.

Before using the plugin, remember to set the coordinate reference system correctly and to adjust the legend and the colors of your map, so that the final map comes out right.

You can download the Python file by clicking on ponto_exato_final.

Check out the video (here) we made showing how the plugin works (and take the chance to subscribe to our YouTube channel); if you want to see the previous posts about this plugin, check the links below:

Still have questions? Leave them in the comments and we will answer as soon as possible.

References:

Documentation QGIS 2.18: https://docs.qgis.org/2.18/en/docs/pyqgis_developer_cookbook/composer.html

GIS StackExchange “Setting an item position (variable) from LowerRight?”: https://gis.stackexchange.com/questions/206150/setting-an-item-position-variable-from-lowerright

GIS StackExchange “How can I add grid lines to a print composition using pyqgis?”: https://gis.stackexchange.com/questions/85724/how-can-i-add-grid-lines-to-a-print-composition-using-pyqgis

by Fernando BS at July 31, 2018 06:07 AM

July 29, 2018

Today marks 35 years of GRASS GIS development – with frequent releases the project keeps pushing the limits in terms of geospatial data processing quality and performance.

GRASS (Geographic Resources Analysis Support System) is a free and open source Geographic Information System (GIS) software suite used for geospatial data management and analysis, image processing, graphics and map production, spatial modeling, and 3D visualization. Since the major GRASS GIS 7 version, it also comes with a feature rich engine for space-time cubes useful for time series processing of Landsat and Copernicus Sentinel satellite data and more. GRASS GIS can be either used as a desktop application or as a backend for other software packages such as QGIS and R. Furthermore, it is frequently used on HPC and cloud infrastructures for massive parallelized data processing.

Brief history
In 1982, under the direction of Bill Goran at the U.S. Army Corps of Engineers Construction Engineering Research Laboratory (CERL), two GIS development efforts were undertaken. First, Lloyd Van Warren, a University of Illinois engineering student, began development on a new computer program that allowed analysis of mapped data.  Second, Jim Westervelt (CERL) developed a GIS package called “LAGRID – the Landscape Architecture Gridcell analysis system” as his master’s thesis. Thirty five years ago, on 29 July 1983, the user manual for this new system titled “GIS Version 1 Reference Manual” was first published by J. Westervelt and M. O’Shea. With the technical guidance of Michael Shapiro (CERL), the software continued its development at the U.S. Army Corps of Engineers Construction Engineering Research Laboratory (USA/CERL) in Champaign, Illinois; and after further expansion version 1.0 was released in 1985 under the name Geographic Resources Analysis Support System (GRASS). The GRASS GIS community was established the same year with the first annual user meeting and the launch of GRASSnet, one of the internet’s early mailing lists. The user community expanded to a larger audience in 1991 with the “Grasshopper” mailing list and the introduction of the World Wide Web. The users’ and programmers’ mailing lists archives for these early years are still available online.
In the mid 1990s the development transferred from USA/CERL to The Open GRASS Consortium (a group who would later generalize to become today’s Open Geospatial Consortium — the OGC). The project coordination eventually shifted to the international development team made up of governmental and academic researchers and university scientists. Reflecting this shift to a project run by the users, for the users, in 1999 GRASS GIS was released under the terms of the GNU General Public License (GPL). A detailed history of GRASS GIS can be found at https://grass.osgeo.org/history/.

Where to next?
The development on GRASS GIS continues with more energy and interest than ever. Parallel to the long-term maintenance of the GRASS 7.4 stable series, effort is well underway on the new upcoming cutting-edge 7.6 release, which will bring many new features, enhancements, and cleanups. As in the past, the GRASS GIS community is open to any contribution, be it in the form of programming, documentation, testing, and financial sponsorship. Please contact us!


The Geographic Resources Analysis Support System (https://grass.osgeo.org/), commonly referred to as GRASS GIS, is an Open Source Geographic Information System providing powerful raster, vector and geospatial processing capabilities in a single integrated software suite. GRASS GIS includes tools for spatial modeling, visualization of raster and vector data, management and analysis of geospatial data, and the processing of satellite and aerial imagery. It also provides the capability to produce sophisticated presentation graphics and hardcopy maps. GRASS GIS has been translated into about twenty languages and supports a huge array of data formats. It can be used either as a stand-alone application or as backend for other software packages such as QGIS and R geostatistics. It is distributed freely under the terms of the GNU General Public License (GPL). GRASS GIS is a founding member of the Open Source Geospatial Foundation (OSGeo).

The GRASS Development Team, July 2018

The post Celebrating 35 years of GRASS GIS! appeared first on GFOSS Blog | GRASS GIS and OSGeo News.

by neteler at July 29, 2018 05:16 PM

July 27, 2018

Msasani Peninsula, Dar es Salaam

I attended my first FOSS4G 5 years ago; I really had no idea what to expect, even though I'd agreed to chair the organising committee! I'd been to lots of geo conferences before and never experienced the passion, friendship and technical depth that a FOSS4G has. Since Nottingham, I've been to 2 more global FOSS4Gs, 2 UK FOSS4Gs and this year's European FOSS4G, and I can't wait to go to Dar in August.

The IT landscape is being transformed at a staggering pace as we shift to consuming services, and nowhere is this more evident than in geo. Open source is powering these services – there are probably over a billion people accessing geo services powered by the technologies that you will find at FOSS4G, and there are over a million users of open source desktop GIS. If you are already using OSGeo projects, FOSS4G presents an opportunity to learn from the developers of the software and from other users. If you are new to open source, FOSS4G will welcome you and help you evaluate whether the software projects and supportive community will be the best solution for your future needs.

In 2016 the Bill and Melinda Gates Foundation said:

Lack of capacity can cause countries to be slow to adopt methodologies to produce, maintain, and more importantly, benefit from robust geospatial data and analysis systems. Lack of new tools or approaches to geospatial data can inhibit many opportunities for growth and innovation. Given the rapid recent advances in data and geospatial technology, now is a critical time to invest in ensuring that Sub-Saharan Africa is mapped and has the tools to use geospatial data effectively and sustainably. Building better data systems in the long-term can contribute to more informed policy making or improved decision making to reduced inequity while in the short-term provide rapid information to prevent disasters such as large-scale epidemics.
https://www.gatesfoundation.org/How-We-Work/General-Information/Grant-Opportunities/Geospatial-Data-in-Africa

Open Source and Open Data are key to enabling this vision for African Geospatial.

What better time could there be to bring FOSS4G to Africa? This year FOSS4G is in Dar es Salaam (The Bay of Peace), Tanzania, it’s the first time a major geospatial global tech conference has been hosted in East Africa. We will be joined by the Humanitarian OpenStreetMap Team, the Consultative Group for International Agricultural Research, Understanding Risk Tanzania, Sahara Sparks to make this the most exciting FOSS4G program yet!

Thanks to the generous support of our many sponsors we are expecting close to 1,000 people from across Africa and much further afield to join together in Dar to share, learn and make new friends.

I really do believe that this could be the most important FOSS4G ever, if you haven’t booked yet there is still time.

Tutaonane Dar es Salaam

by Steven at July 27, 2018 04:21 PM

We have launched a Facebook group called Polska Grupa Użytkowników QGIS (Polish QGIS Users Group), in response to the growing demand for such a place on social networks. The aim of the group is to promote QGIS as free and open source software and to exchange knowledge, skills and experience. Everyone with a Facebook account who uses QGIS is welcome to join, and we encourage active participation.

by robert at July 27, 2018 01:06 PM

July 26, 2018

The Web Mercator grid as used by OSM, Google, etc. is the de-facto standard for serving web maps. But not many people maintain data in a Web Mercator reference system. People usually use either a locally optimized SRS or store their data in WGS84.

For publishing, data has to be transformed to Web Mercator. If the result is good enough for your purpose, you can take this route. You lose the possibility of using base maps served in your local tile grid, but you can use OSM or other base maps.

If you don’t want to retransform your data to Web Mercator, but still want to use tools like Mapbox GL JS, which only support Mercator, there is a way to do that. I call this method “Fake Mercator”. You handle the coordinates in your local reference system as if they were Web Mercator coordinates. At first glance, this looks like a terrible hack, but after using this method in some projects, I think there are legitimate uses for this approach. We’re just using a display grid in another projection, and as long as we stay in our area of interest, why should we care?

Assuming your data is stored in a PostGIS database, there is a simple way to convert it to Web Mercator:

SELECT UpdateGeometrySRID('public', 'mytable', 'wkb_geometry', 3857);

What happens to my country when we do that?


Everything is shifted to the same coordinates, but now in the Web Mercator grid. We still have the original projection, only placed on another part of the Web Mercator world. From then on you can use all tools specialized in Web Mercator while losing only one thing: you can’t use other base maps served as tiles. What you can do is use WMS services or raster images with the same shift applied. And you have to take care, when displaying WGS84 coordinates, to calculate them from the original projection of the data.

If you don’t want to change the SRS of your data permanently, you can also apply ST_SetSRID on the fly. In the case of MVT vector tiles, t-rex does that for you when you pass the option no-transform=true. It then handles the correct (non-)projection of request BBOX coordinates and calculated extents.

t_rex serve --datasource ch_epsg_2056.gpkg --no-transform true


You’ll see your data in their original projection on a Web Mercator grid. You’ll only notice that t-rex adds the original coordinates when generating a configuration file:

name = "mytable"
# Real extent: [5.96438, 45.81937, 10.55886, 47.77210]
extent = [22.32683, 9.61388, 25.45680, 11.56229]
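Those “fake” extent values can be reproduced with nothing more than the inverse spherical-Mercator formulas. A small sketch (plain Python; the sample easting/northing is an assumed Swiss LV95/EPSG:2056 point, to match the ch_epsg_2056.gpkg example above):

```python
import math

R = 6378137.0  # radius of the Web Mercator (EPSG:3857) sphere, in metres

def mercator_to_lonlat(x, y):
    """Inverse spherical Web Mercator: metres -> degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2.0 * math.atan(math.exp(y / R)) - math.pi / 2.0)
    return lon, lat

# An LV95 (EPSG:2056) coordinate near Bern, read as if it were EPSG:3857
lon, lat = mercator_to_lonlat(2600000.0, 1200000.0)
print(lon, lat)  # roughly 23.4, 10.7 -- inside the "real extent" shown above
```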

If you don’t pass no-transform, t-rex will automatically transform your data to Web Mercator. Alternatively you can define a user grid with your data’s reference system, but then you have to use this grid in your map viewer (e.g. OpenLayers) as well.

To sum this post up, there are many possibilities to serve your data in a tile grid and sometimes it makes sense to serve your data without reprojection in a Web Mercator grid.

Pirmin Kalberer (@implgeo)

July 26, 2018 03:29 PM

This is the second progress report of the GDAL SRS barn effort.

Since the previous report, a number of topics have been addressed:
- extension of the class hierarchy to implement BoundCRS (the generalization of the WKT1 TOWGS84 concept. This concept only exists in WKT 2 and has not been modeled in ISO-19111, so I went on my own modelling), TimeCRS, DerivedGeodeticCRS
- implementation of the exportToPROJ() method for CRS and related objects, and CoordinateOperation
- addition of all documentation needed at class and method level so that Doxygen passes without warnings
- implementation of CoordinateOperation::createOperation() method that can instantiate a transformation between two CRSs. For now, it does not yet use the operations of the EPSG database, but it can already take into account the hints of BoundCRS.
- implementation of a number of Transformations: geocentric translation, position vector transformation, coordinate frame rotation, NTv2, GravityRelatedHeightToGeographic3D, VERTCON.
- start of mapping all GDAL currently supported projection methods. For now: UTM, Transverse Mercator, Transverse Mercator South Oriented, Two Point Equidistant, Tunisia Mapping Grid, Albers Conic Equal Area, Lambert Conic Conformal 1SP, New Zealand Map Grid. Several tens are obviously still missing.
- addition of a isEquivalentTo() method to compare the various objects.
- and of course, extensive unit testing of all the above work.

The result of this continued work can be followed in this pull request.

As a related effort, I've coordinated with the OGC CRS Working Group to provide my comments on the upcoming ISO 19162 / WKT v2 2018 revision.

by Even Rouault (noreply@blogger.com) at July 26, 2018 02:56 PM

Based on the work by:

Geoff Boeing: Comparing City Street Orientations

Rixx: Street Orientations

The graphs show the percentage of streets that run in a certain orientation. So for a grid-based city like Chicago, there will be a heavy bias towards north/south and east/west streets. Bear in mind that north and south will be the same (unless there are one-way streets, which only count in the direction they run in).

But for older cities that formed naturally, without modern city planning, the streets should be more varied.
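Under the hood, this is just a compass bearing per street segment, binned into sectors. A minimal pure-Python sketch with made-up toy segments (the real scripts read OSM street geometries, and the planar bearing here is an approximation):

```python
import math

def bearing(p1, p2):
    """Compass bearing in degrees (0 = north, 90 = east), planar approximation."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def orientation_histogram(segments, bins=36):
    """Count segment bearings into equal compass sectors (10 degrees each by default)."""
    counts = [0] * bins
    sector = 360.0 / bins
    for p1, p2 in segments:
        counts[int(bearing(p1, p2) // sector) % bins] += 1
    return counts

# Two toy one-way segments: one running due east, one due north
segs = [((0.0, 0.0), (1.0, 0.0)), ((0.0, 0.0), (0.0, 1.0))]
hist = orientation_histogram(segs)
print(hist[0], hist[9])  # 1 1 -- one segment in the north bin, one in the east (90 degree) bin
```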


The largest populated places by population, based on the Ordnance Survey Ireland urban areas. As it is OSI data, Northern Ireland is not included.

Dublin Postcodes:

Some areas are clearly impacted by large motorways running through them.

And for non-Dubliners, a map of the postal district boundaries:

I updated the script by Rixx so that it takes a shapefile as input, with a few caveats: it must be WGS84, and it must have an attribute holding the area name, which must be called settl_name.

Check out the script at: GitHub

by Heikki Vesanto at July 26, 2018 08:00 AM

July 25, 2018

We are happy to announce the release of GeoServer 2.13.2. Downloads are available (zip, war, and exe) along with docs and extensions.

This is a stable release recommended for production use. This release is made in conjunction with GeoTools 19.2 and GeoWebCache 1.13.2.

Highlights of this release are featured below; for more information please see the release notes (2.13.2 | 2.13.1 | 2.13-beta).

Improvements and Fixes

  • style editor map legend always includes legend
  • performance improvement for multi-band coverage time series
  • WMS 1.3.0 performance improvement for north/east axis order
  • Fix support of external graphics over http

Security updates

Please update your production instances of GeoServer to receive the latest security updates and fixes.

If you encounter a security vulnerability in GeoServer, or any other open source software, please take care to report the issue in a responsible fashion.

About GeoServer 2.13 Series

Additional information on the 2.13 series:


by jgarnett at July 25, 2018 01:23 PM


The gvSIG applied to Environmental Management course has now reached 1100 participants. Many thanks to everyone who has trusted gvSIG to learn or reinforce their GIS knowledge.

We remind you that this course is free and fully online. You can also apply for the official gvSIG User certification.

To obtain this certification, all the exercises of each module must be completed. The exercises validate the knowledge acquired during the course and are assessed by a tutor.

The platform, with open enrolment, can be found at www.geoalternativa.com/gvsig-training. You need to select the gvSIG applied to Environment course, then choose the “Register as user” option and, finally, enrol in the course.

Apart from submitting and passing the exercises, the certification carries a minimal cost of €30, needed to cover the expenses of assessment and certification. Payment can be made through PayPal at the following link: http://www.gvsig.com/es/curso-gvsig-aplicado-medio-ambiente

The certification will be issued by the gvSIG Association and consists of two certificates:

– A certificate of course completion, which includes all the information about the training contents acquired.

– The official gvSIG User certificate, awarded for having completed the 90 credits required for it through the courses offered by the gvSIG Association.

The estimated workload of the course is 90 hours.

by Alonso Morilla at July 25, 2018 10:05 AM

Reading the post on the IDEE blog about the changes to the Google Maps API, I thought it was worth spreading the word on the gvSIG blog as well about this clear example of technological dependence. The difference between owning the technology and, on the contrary, being totally dependent on the owners of the technology is perfectly illustrated by cases like this.

Our public administrations, which manage our money, should have this clear. Investment versus expense. Reuse versus waste. Technological sovereignty versus dependence.

And although the trend towards free software solutions is clear, it is still far too common to find decision-makers who attach no importance to this issue. The mistakes made may end up mortgaging the technological (and economic!) future of their organizations. It is no small matter.

Back to the case of the Google Maps API, one more to add to the list. Since 11 June, the terms of use of the Google Maps API (which, remember, you do not get to decide) have changed, with a substantial price increase. As the IDEE points out, the service has become 1400% more expensive, and the free request limit has dropped from 25,000 per day to 28,000 per month (around 933 per day on average). The risk for developers is very high: they can go broke very easily if their application gets a large spike in requests, for instance during an attack.

This may be a good moment for the public bodies that are using the Google Maps API to consider other options such as gvSIG Online, free software deployed in all kinds of organizations. With all the rights, all the freedoms, and all the technological capability you need.

by Alvaro at July 25, 2018 07:51 AM

July 24, 2018

On 23 August 2018, the 10th gvSIG Latin America and Caribbean Conference will be held in Santa Maria (Rio Grande do Sul, Brazil) under the theme “Platforms and users for problem solving”, a new gathering for exchanging experiences on the use and development of gvSIG, held jointly with the UFSM Geomatics Week.

The call for presentation proposals is now open. Proposals can be sent to artigojornadagvsig2018@gmail.com. All information about the submission guidelines is available in the Communications section of the website. Abstracts will be accepted until 10 August.

Registration for the conference opens on 25 July. Registration will be free of charge.

by Alvaro at July 24, 2018 08:58 AM

In the previous post, we learned how to create the base files for our QGIS plugin. In this post, we will continue it and save the shapefile that was created (in the previous code, it was only a temporary layer).

In addition, we will extract, from the created point, which part of a given shapefile the point falls in (for example, which city zoning district contains the point).

How to save a shapefile with PyQGIS

To save the created shapefile, we will add a field in Qt Designer where the user can choose where the shapefile will be saved, and then change the Python code so that it stores this value correctly.

In Qt Designer, we will add:

  • A “Label” item with text telling the user that the field next to it should hold the path where the shapefile will be saved;
  • A “Line Edit” item, which will display the chosen path; and
  • A “Push Button” item, which the user clicks to open a window for browsing Windows and selecting the path to save the shapefile.

After adding these items, rename (in the Object Inspector) the Line Edit to “caminho”. The result is shown in the figure below.

Button for storing the shapefile path.

Now that the graphical interface is in place, let's change the Python code to make our system work.

Before the “def run(self)” function, we will create a function to be executed when the user clicks the “…” button next to our Line Edit. The code to be used is shown below:

## Function to get the path where the shapefile will be saved
def selecionar_saida(self):
  arquivoCaminho = QFileDialog.getSaveFileName(self.dlg, "Salvar o arquivo em: ", "", "*.shp")
  self.dlg.caminho.setText(arquivoCaminho) ## Restored line: writes the chosen path into the "caminho" Line Edit

Now we need to indicate, in Python, that this function should be called when we press the “…” button.

The initGui function is a good place to add connections between buttons and the actions executed when they are pressed (Germán Carrillo on GIS StackExchange).

To do this, go to the top of the Python code, inside the initGui function, and insert the following lines of code, where the first line clears the field (in case it was already filled in) and the second indicates that the “selecionar_saida” function should be called when the button is clicked.

## This code should be inserted inside def initGui(self).
self.dlg.caminho.clear() ## Clears the field in case it was already filled in
self.dlg.pushButton.clicked.connect(self.selecionar_saida) ## Calls selecionar_saida on click ("pushButton" is Qt Designer's default object name for the "..." button; adjust it to match yours)

Note that we still need to modify the code presented in the previous post so that it receives as a variable the path where we save the shapefile.

Below we present the code from the previous post plus the code added in this post (the added lines start with two hashtags ##). Remember that this code was inserted inside the def run(self) function (and inside if result:).

# Variables receiving the coordinates provided by the user (and the projection)
longX = self.dlg.longTextIn.text()
latY = self.dlg.latTextIn.text()
projecao = self.dlg.projEPSG.text()

## Variable with the chosen output path
localSalvo = self.dlg.caminho.text()

# Creates a point shapefile from the provided coordinates
# Defining the shapefile geometry
camada = QgsVectorLayer('Point?crs=epsg:'+projecao, 'point' , 'memory')

# Sets the data provider for the provided points
prov = camada.dataProvider()
prov.addAttributes([QgsField("Nome", QVariant.String)]) ## Gives attributes to our point
ponto = QgsPoint(float(longX),float(latY))

# Adds a new feature for the geometry
feat = QgsFeature()
feat.setGeometry(QgsGeometry.fromPoint(ponto)) ## Restored line: the feature needs the point geometry
feat.setAttributes(["Ponto B2E"]) ## Line added to give the point an attribute
prov.addFeatures([feat]) ## Restored line: writes the feature into the layer

# Updates the layer
camada.updateFields() ## Refreshes the added fields

## Saves the layer to the path stored in localSalvo
QgsVectorFileWriter.writeAsVectorFormat(camada, localSalvo, "utf_8_encode", camada.crs(), "ESRI Shapefile")
pnt_layer = QgsVectorLayer(localSalvo, "Ponto B2E", "ogr")
# Adds the layer to QGIS
QgsMapLayerRegistry.instance().addMapLayer(pnt_layer) ## Restored line: loads the saved shapefile into QGIS

This way, we can create a shapefile and save it on our computer. Now let's create a routine to define an area of interest and check whether our point falls inside it or not.

Remember to add, at the top of the Python code, together with the other from … import … lines, the line “from PyQt4.QtCore import *”.

Intersecting points and polygons in PyQGIS

Now we will create a field in our plugin where the user marks which area of interest (i.e. which shapefile) should be evaluated. In other words, we will answer the following question: which zoning district/watershed/region contains the created point?

For this tutorial, we will use two shapefiles, one containing the watershed boundaries of the municipality of Rio de Janeiro and the other with the planning regions of the same municipality.

Keep in mind that, to run the intersection, the shapefiles involved must be in the same coordinate reference system.

The two shapefiles are in an old coordinate system (SAD69); for this tutorial, we reprojected them to SIRGAS 2000 UTM Zone 23S (EPSG: 31983).
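Conceptually, what the intersects() test will do for us in the code below can be illustrated with a minimal pure-Python point-in-polygon check (even-odd ray casting; the square and the points are made-up sample coordinates, not the Rio de Janeiro layers):

```python
def point_in_polygon(x, y, ring):
    """Even-odd ray casting: True if (x, y) is inside the polygon ring."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        # Does a horizontal ray from (x, y) cross the edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(point_in_polygon(5.0, 5.0, square))   # True
print(point_in_polygon(15.0, 5.0, square))  # False
```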

After downloading the shapefiles, in Qt Designer we will add two Check Boxes, which let the user indicate which layer will be evaluated, depending on whether each box is checked or not.

One checkbox is for the watershed boundaries and the other for the planning region boundaries. In the Object Inspector they are named checkBoxBH and checkBoxRP, respectively.

Check Boxes added in Qt Designer and their respective names in the Object Inspector.

Now that we have the boxes for selecting the area of interest, let's add the code that performs the intersection between the point supplied by the user and the selected area.

The following code should be inserted below the code presented earlier; in case of doubt, it is commented to clarify the functions used.

## [....] Continuation of the previous code.
## Saves the layer to the path stored in localSalvo
QgsVectorFileWriter.writeAsVectorFormat(camada, localSalvo, "utf-8", camada.crs(), "ESRI Shapefile")
pnt_layer = QgsVectorLayer(localSalvo, "Ponto B2E", "ogr")
## Variables for the intersections
pnt_selection = []
bh_selection = []
rp_selection = []
## Conditions for the check boxes created (watershed and planning region evaluation)
## First condition: check whether the point falls inside a watershed
if self.dlg.checkBoxBH.isChecked():
  bh_rioPath = "C:/Users/ferna/Desktop/municipiosRJ/bacia_hidroRJ_SIRGAS.shp" # Remember to fix this path on your computer
  bh_rioLayer = QgsVectorLayer(bh_rioPath, "BH RJ", "ogr")
  for w in pnt_layer.getFeatures():
    for s in bh_rioLayer.getFeatures():
      if s.geometry().intersects(w.geometry()):
        pnt_selection.append(w.attributes()[0])
        ## Index two is used because the watershed name is in the third column (Python counts from zero)
        bh_selection.append(s.attributes()[2])
        break
  print pnt_selection[0] + " is inside " + bh_selection[0]
elif not self.dlg.checkBoxBH.isChecked():
  print u"The Watersheds item was not selected."
## Second condition: check whether the point falls inside a planning region
if self.dlg.checkBoxRP.isChecked():
  rp_rioPath = "C:/Users/ferna/Desktop/municipiosRJ/limite_RP_SIRGAS.shp" # Remember to fix this path on your computer
  rp_rioLayer = QgsVectorLayer(rp_rioPath, "RP RJ", "ogr")
  for w in pnt_layer.getFeatures():
    for s in rp_rioLayer.getFeatures():
      if s.geometry().intersects(w.geometry()):
        pnt_selection.append(w.attributes()[0])
        ## Index three is used because the region name is in the fourth column (Python counts from zero)
        rp_selection.append(s.attributes()[3])
        break
  print pnt_selection[0] + u" is inside the region of " + rp_selection[0]
elif not self.dlg.checkBoxRP.isChecked():
  print u"The Planning Region item was not selected."

Note that the two intersection routines (loops) are similar and end with break, so that each loop stops at the first match.
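For readers curious about what the intersects test does under the hood, here is a minimal pure-Python sketch of a point-in-polygon check (ray casting) wrapped in the same loop-and-break pattern. It runs outside QGIS; the polygon coordinates and "basin" names are made up for illustration only:

```python
def point_in_polygon(x, y, poly):
    """Ray casting: count how many polygon edges a horizontal ray from (x, y) crosses."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # the edge spans the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / float(y2 - y1)
            if x < x_cross:
                inside = not inside  # each crossing toggles inside/outside
    return inside

# Two made-up rectangular "watersheds" keyed by name
basins = {
    "Basin A": [(0, 0), (4, 0), (4, 4), (0, 4)],
    "Basin B": [(5, 0), (9, 0), (9, 4), (5, 4)],
}
point = (6.0, 2.0)
selection = []
for name, poly in sorted(basins.items()):
    if point_in_polygon(point[0], point[1], poly):
        selection.append(name)
        break  # stop at the first match, as in the plugin code
print(selection[0])  # -> Basin B
```

Real GIS engines use more robust predicates (handling edge and vertex cases, holes, multipolygons), but the toggle-on-crossing idea is the same.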

Python's print command displays the messages we inserted in those lines; when the plugin runs in QGIS, the messages appear in its Python console.

You can open the console by clicking Plugins > Python Console, or with the shortcut Ctrl + Alt + P.

The developed plugin running, with the messages in the Python console.

The code presented works correctly as long as the user supplies points inside the areas of interest; if a point outside them is given, an error is raised.

So how can we avoid this error and instead just show a message warning the user that the point is not within the boundaries?

Handling Errors in Python

In this situation we will use a "try: …. except: …." block: the code that may fail goes after try, and what the program should do if an error occurs goes after except.

So, in our code, replace the single line `print pnt_selection[0] + " is inside " + bh_selection[0]` with the code below; the error raised when the point falls outside the area of interest is an IndexError.

try:
  print pnt_selection[0] + " is inside " + bh_selection[0]
except IndexError:
  print u"The given point is outside the area of interest!!"
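The same pattern can be tried outside QGIS: when no intersection is found the selection list stays empty, indexing it raises IndexError, and the except branch turns that into a friendly message (the list and strings here are illustrative):

```python
selection = []  # empty: the point intersected no polygon
try:
    message = selection[0] + " is inside the area"
except IndexError:
    message = "The given point is outside the area of interest!!"
print(message)  # -> The given point is outside the area of interest!!
```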

And we have reached the end of the second part of our tutorial on how to create a QGIS plugin using Qt Designer and Python. You can see the complete code for this tutorial by clicking here >> ponto_exatoB2E (Note: open the text file in NotePad++ so the indentation displays correctly).

We will post the third part soon. If you have any questions, leave them in the comments and we will answer as soon as possible.

References consulted:

Ujaval Gandhi - QGIS Tutorials and Tips: https://www.qgistutorials.com/en/docs/building_a_python_plugin.html

Python: How to List Polygon Intersections in QGIS: https://gifguide2code.com/2017/04/16/python-how-to-code-a-list-of-polygon-intersections-in-qgis/

by Fernando BS at July 24, 2018 06:01 AM