Welcome to Planet OSGeo

October 18, 2019

INSPIRE 2019

Dear All, we are proud to announce that GeoSolutions will be present at the INSPIRE Helsinki 2019 event, which will be held from the 22nd to the 24th of October 2019.

GeoSolutions will be present with its INSPIRE expert Nuno Oliveira and we will be happy to talk to you about our open source products, like GeoServer, MapStore, GeoNode and GeoNetwork, as well as about our Enterprise Support Services and GeoServer Deployment Warranty offerings.

Our INSPIRE & GeoServer expert Nuno Oliveira is going to hold a workshop on GeoServer for INSPIRE, info below:

as well as another workshop on work performed with BRGM as follows:

If you are interested in learning about how we can help you achieve your goals with our Open Source products and professional services, make sure to visit us and talk to Nuno; he is a nice guy :).

See you in Helsinki!

The GeoSolutions team,

by simone giannecchini at October 18, 2019 04:27 PM

October 17, 2019

Since porting the Maestro API over to target .net standard and making the library truly cross-platform, the SDK story of the Maestro API was left in a somewhat undefined state. If you wanted to use the Maestro API for your own applications, where would you start?

For many milestones since that migration, I have intentionally side-stepped answering that question, until now. For the next milestone of MapGuide Maestro we'll be re-activating the Maestro SDK story in the form of a separate repository that contains:

  • Various code samples using the Maestro API
  • Source code template for the "Local Native" connection provider that wraps the official MapGuide .net API with instructions on how to roll your own build of this provider
So as a result, I've been re-vamping some of the code samples, especially those covering usage in your own web applications. The last time I touched this code, it was all ASP.net Web Forms, not exactly a shining example of how best to use the Maestro API in a web application in the current year.

Obviously, this means we need code samples that demonstrate usage of the Maestro API in ASP.net core. In the process of building this modern code sample, I ran into some interesting challenges.

Firstly, an ASP.net core application runs on its own web server (Kestrel or IIS Express). Our mapagent and built-in viewers are hosted on Apache or your local IIS. Even though in development both would be on localhost, the web servers listen on different ports, meaning the same-origin policy comes into play, complicating integration.

After some deliberation, I've decided that rather than try to embed the existing AJAX viewer in an iframe (which is what the old webforms sample did), I'll eat my own dogfood and just use the mapguide-react-layout as the viewer for the new asp.net core sample and set it up to talk to the existing mapagent of your MapGuide installation. This approach requires enabling CORS to make the mapagent accessible from a different origin. I figured out how this is actually done, which I'll show you in another blog post.

So with this main challenge addressed, I launched the work-in-progress asp.net core sample with the mapguide-react-layout viewer, and I get this problem.

For some context, this code sample does the following:
  1. Create a new session id
  2. Create a new web layout on-the-fly that sets up the task pane to point to our custom landing page of code samples
  3. Save this web layout into a session repository and then render the MVC view instructing the embedded mapguide-react-layout viewer to load from this web layout and session id
It turns out that mapguide-react-layout misinterprets this initial state and tries to load the viewer as though the user had manually refreshed the browser (the URL state recovery feature introduced in 0.11), assuming that a runtime map already exists.

This was clearly a bug and since I needed this viewer to behave properly in the context of a new code sample for a re-activated SDK story for the next milestone release of MapGuide Maestro, this has been the longest block of text to segue into ...

... Announcing a new bugfix release of mapguide-react-layout :)

This release also fixes the font icons so they aren't a scrambled mess and fixes some minor warnings around some of the libraries we're using.

by Jackie Ng (noreply@blogger.com) at October 17, 2019 02:08 PM

October 16, 2019

October 15, 2019

Dear readers,

I would like to invite you to take part in the GeoServer Online Course that I will be teaching through GEOCURSOS. The goal of the course is for you to learn how to publish, share and edit geographic data on the internet with GeoServer.

The course will cover topics such as data configuration, style creation with SLD, OGC standards, the administrative (web) interface, cartographic visualization with OpenLayers, the REST API and security, among others.

The course will take place between December 3rd and 12th (Tuesdays, Wednesdays and Thursdays), from 8:00 PM to 10:00 PM (Brasília time).

I am grateful to anyone who can share this with their contacts. If you would like more information about the course, you can find it at the following links:

by Fernando Quadro at October 15, 2019 02:36 PM


Global maps based on OpenStreetMap in the WGS84 map projection are available on MapTiler Cloud via the maps API. This complements the already existing maps in the Web Mercator projection, the de facto industry standard.

Why are there so many map projections?

Earth is not flat. Therefore, we need to mathematically solve the problem of projecting a globe into 2D space. This can be done in many ways, but all the solutions, which are called map projections, have pros and cons.

Over time, Web Mercator became the de facto industry standard. However, like any other map projection, it has downsides: the most visible one is the distortion of sizes. Areas near the poles are displayed much bigger, while the equatorial zone appears much smaller than in reality.

Various local/national coordinate systems were created because many maps historically showed only a limited territory. Therefore, they use a projection that best fits a specific territory. Local coordinate systems are still used by governments and required for official maps.


Some of the map projections. Image ©Tobias Jung CC BY-SA 4.0

Vector and raster OSM tiles in EPSG:4326

To fulfill specific needs, MapTiler Cloud is now adding base maps derived from OpenStreetMap in alternative coordinate systems. Base maps are available in WGS84.

https://api.maptiler.com/tiles/v3-4326/{z}/{x}/{y}.pbf?key={YOUR-OWN-KEY}

There is an OpenLayers 6 vector viewer code snippet, raster tiles rendered on demand, a WMTS service for use in desktop GIS software, and a static maps API.
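
As a quick smoke test you can fetch a single vector tile with curl; the z/x/y values below are just an arbitrary low-zoom tile, and YOUR-OWN-KEY stands in for your own MapTiler Cloud key:

# Fetch one low-zoom vector tile from the EPSG:4326 tileset
curl -o tile.pbf "https://api.maptiler.com/tiles/v3-4326/1/0/0.pbf?key=YOUR-OWN-KEY"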

Maps in Lambert, Rijksdriehoekstelsel and other national coordinate systems

Among the local coordinate systems, French Lambert and Dutch Rijksdriehoekstelsel are available, and many others will come soon!

MapTiler Cloud is also able to host maps in any coordinate system with EPSG code. Just upload your map in GeoPackage format or create one with the new MapTiler Desktop 10.2.


OpenStreetMap in French Lambert and Dutch Rijksdriehoekstelsel map projections

Free maps API with local coordinate systems

Start using maps in WGS84, Lambert or Rijksdriehoekstelsel for free via MapTiler Cloud.

by Dalibor Janak (info@maptiler.com) at October 15, 2019 11:00 AM

October 14, 2019

The program of the 15th International gvSIG Conference is now available. It includes a great number of presentations and several free workshops for both users and developers.

The conference will take place from November 6th to 8th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain), and registration must be done using the form available on the event website.

Registration for the workshops is independent and can be done from October 15th. All the workshop information will be published on the gvSIG Blog soon.

by Mario at October 14, 2019 04:27 PM

The program of the 15th International gvSIG Conference is now available, with a wide variety of presentations and several free workshops for both users and developers.

The conference will take place from November 6th to 8th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain). To attend, you must register in advance using the form available on the event website. We recommend not waiting until the last minute, as the rooms have limited capacity.

A separate registration is required for the free workshops, and it will open on October 15th. All the information about them can be found on the gvSIG blog.

by Mario at October 14, 2019 04:21 PM

October 13, 2019

October 10, 2019

Blue Rivers Ordered

Over the past few years, I’ve played around with developing ordered river networks for different projects. I am not an expert in hydrology, but I can get close for cartographic purposes. I am, however, an expert in asking for help from those who know best, and I rely on a lot of very smart people to guide me on my journey.

Recently, I decided to put together a visualization of ordered rivers for New Zealand. I came across a very nice data set offered through the Ministry for the Environment via the Koordinates website and thought I’d put it to use.

The rivers vis project made me wonder if I could build this base dataset myself using some of the recently released elevation data sets on the LINZ Data Service. The short answer to my question is “sorta”. Doing it open source is not an issue, but building an accurate ordered river centerline network is another story. This is a task I cannot take on as a solo project right now, but I could do a little experimentation. Below, I’ll offer some of the methods and things I learned along the way.

Tools and Data

The method I tested used TauDEM and a 1m DEM raster accessed from the LINZ Data Service. I downsampled the DEM to 2m and 5m resolutions and used small areas for testing. Finding an open source tool was easy. I sorted through a few available methods and finally landed on “Terrain Analysis Using Digital Elevation Models” (TauDEM). There are additional methods available through GRASS and SAGA GIS; I chose TauDEM because I had never used it before.

Method Tested

To my knowledge, there is no open source tool where a person can put in a DEM and get a networked rivers centerline vector out the other side. It requires a number of steps to achieve your goal.

The basic rundown for processing the DEM is to:

  1. Fill sinks
  2. Determine flow directions
  3. Determine watersheds
  4. Determine flow accumulation
  5. Stream classification
  6. Export to vector

TauDEM does require a few extra steps to complete the process, but these steps are explained in the documentation of the tool. It was more about keeping all my variables in the right places and using them at the right time. I recommend using the variable names TauDEM provides.

The full BASH script is below:

#!/bin/bash

#Rough sketch for building river centerlines. Rasters have been clipped prior

BASEPATH=/dir/path/to/base

raster_list=$( find $BASEPATH -name "*.tif" )

taudem_outputs=/dir/path/to/outputs

reso=resolution_number

for i in $raster_list
do


	INPUT_RASTER=$i

	file_name=$( basename $i )

	strip_input_extension=$( echo $file_name | sed 's/\.tif$//' )

	reso_name=$taudem_outputs/${strip_input_extension}_${reso}res

	gdal_translate -tr $reso $reso -of GTiff $i $reso_name.tif

	fel=${reso_name}_fel.tif
	p=${reso_name}_p.tif
	sd8=${reso_name}_sd8.tif
	ad8=${reso_name}_ad8.tif
	ang=${reso_name}_ang.tif
	slp=${reso_name}_slp.tif
	sca=${reso_name}_sca.tif
	sa=${reso_name}_sa.tif
	ssa=${reso_name}_ssa.tif
	src=${reso_name}_src.tif

	ord=${reso_name}_strahlerorder.tif 
	tree=${reso_name}_tree.dat
	coord=${reso_name}_coord.dat
	net=${reso_name}_network.shp
	w=${reso_name}_watershed.tif 

	processed_input_file=$reso_name.tif

	#TauDEM Commands
	mpiexec -n 8 pitremove -z $processed_input_file -fel $fel

	mpiexec -n 8 d8flowdir -fel $fel -p $p -sd8 $sd8 

	mpiexec -n 8 aread8 -p $p -ad8 $ad8 -nc

	mpiexec -n 8 dinfflowdir -fel $fel -ang $ang -slp $slp

	mpiexec -n 8 areadinf -ang $ang -sca $sca -nc

	mpiexec -n 8 slopearea -slp $slp -sca $sca -sa $sa

	mpiexec -n 8 d8flowpathextremeup -p $p -sa $sa -ssa $ssa -nc

	mpiexec -n 8 threshold -ssa $ssa -src $src

	mpiexec -n 8 streamnet -fel $fel -p $p -ad8 $ad8 -src $src -ord $ord -tree $tree -coord $coord -net $net -w $w

done

The script is a rough sketch, but does get results.

Challenges in the Process

One major challenge for this project was the size of the input DEM versus my computer’s available RAM. I work primarily off a laptop. It’s a good machine, but no match for a proper server setup with some spacious RAM. My laptop struggled with the large high-resolution DEMs, so I needed to down-sample the images and choose a smaller test area to get it to work.

Clip the TIFF with gdal_translate -projwin and down-sample with -tr:

gdal_translate -tr xres yres -projwin ulx uly lrx lry input.tif output.tif
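
For instance, clipping a small test square and down-sampling to 5m might look like the following; the coordinates are made-up projected values, shown only to illustrate the argument order (-projwin takes upper-left x/y, then lower-right x/y):

gdal_translate -tr 5 5 -projwin 1745000 5920000 1750000 5915000 input.tif output.tif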

The second challenge came up because I used a bounding box to clip my test regions. I recommend not doing this; instead, clip your regions using a watershed boundary. Square test regions will give you inaccurate and unhelpful results, because major channels in your DEM will be cut at the edges of the raster.

Clipping a raster using a shapefile, like a watershed boundary, can be achieved using gdalwarp.

gdalwarp -cutline input.shp input.tif output.tif
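
If you also want the output extent trimmed to the boundary rather than keeping the full input extent, gdalwarp’s -crop_to_cutline flag can be added; a sketch with a hypothetical watershed shapefile:

gdalwarp -cutline watershed.shp -crop_to_cutline input.tif output.tif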

Results

I ran my process and QCed the results against aerial imagery and a hillshade I developed from the DEM. The first run gave me good enough results to know I have a lot of work to do, but I did manage to develop a process I was happy with. The tool did a great job; the accuracy of the DEM was a little more challenging. It’s a start. I captured a good number of river channels despite my incorrect use of a square DEM, learned a lot about how DEM resolution affects outputs, and gained knowledge of how to spot troublesome artifacts.

Img 1: River capture in a well-defined channel.

From this experiment, there are a few ideas I’d like to explore further:

1. Accuracy of the DEM. The particular DEM I worked with had a number of ‘dams’ in the flows: notably bridges, culverts, vegetation artifacts, and other general errors that caused water to flow in interesting directions. When working with a data set like this, I am curious how to manage these artifacts.

Img 2: River diversion at road.

Img 3: River diversion at culvert or bridge.

2. How to go beyond borders. This analysis can be broken down by watershed, but it will be necessary to link the outflows of those watersheds to the next for accurate results.

Img 4: Flow not captured at edge.

3. As DEMs are released with better resolution, there is a need for scaled-up computing power. The process needs a large amount of RAM. What is the best computational setup for capturing the largest area?

4. Did I do this correctly? I perform this task about once every two years, usually on weekends when the surf is flat and the garden is weeded, so I am not an expert. There is a lot more research to be done to determine whether I am using the tools to the best of their abilities.

by xycarto at October 10, 2019 09:47 PM

October 09, 2019

October 08, 2019

September 30, 2019

QGIS-versioning is a QGIS and PostGIS plugin dedicated to data versioning and history management. It supports:

  • Keeping full table history with all modifications
  • Transparent access to current data
  • Versioning tables with branches
  • Working offline
  • Working on a data subset
  • Conflict management with a GUI

QGIS versioning conflict management

In a previous blog article we detailed how QGIS versioning can manage data history, branches, and work offline with PostGIS-stored data and QGIS. We recently added foreign key support to QGIS versioning so you can now historize any complex database schema.

This QGIS plugin is available in the official QGIS plugin repository, and you can fork it on GitHub too!

Foreign key support

TL;DR

When a user decides to historize their PostgreSQL database with QGIS-versioning, the plugin alters the existing database schema and adds new fields in order to track the different versions of a single table row. Every access to these versioned tables is subsequently made through updatable views in order to automatically fill in the new versioning fields.

Up to now, it was not possible to deal with primary keys and foreign keys: the original tables had to be constraint-free. This limitation has been lifted thanks to this contribution.

To make it simple, the solution is to remove all constraints from the original database and transform them into a set of SQL check triggers installed on the working-copy databases (SQLite or PostgreSQL). As the verifications are made on the client side, it’s impossible to propagate invalid modifications to your database server when you “commit” updates.

Behind the curtains

When you choose to historize an existing database, a few fields are added to the existing table. Among these fields, versioning_id identifies one specific version of a row. For one existing row, there are several versions of that row, each with a different versioning_id but with the same original primary key field. As a consequence, that field can no longer satisfy the unique constraint, so it cannot be a key, and therefore cannot be the target of a foreign key either.

We therefore have to drop the primary key and foreign key constraints when historizing the table. Before removing them, constraints definitions are stored in a dedicated table so that these constraints can be checked later.

When the user checks out a specific table on a specific branch, QGIS-versioning uses that constraint table to build constraint checking triggers in the working copy. The way constraints are built depends on the checkout type (you can checkout in a SQLite file, in the master PostgreSQL database or in another PostgreSQL database).

What do we check?

That’s where the fun begins! The first thing we have to check is key uniqueness, or that a foreign key references an existing key, on insert or update. Remember that there are no primary keys and foreign keys anymore; we dropped them when activating historization. We keep the terms for better understanding.

You also have to deal with deleting or updating a referenced row and the different ways of propagating the modification: cascade, set default, set null, or simply failure, as explained in the PostgreSQL foreign keys documentation.

Never mind all that: this problem has been solved for you, and everything is done automatically in QGIS-versioning. Before you ask, yes, foreign keys spanning multiple fields are also supported.

What’s new in QGIS ?

You will get a message you probably already know about when you try to commit an invalid modification to the master database.

Error when foreign key constraint is violated

Partial checkout

One existing QGIS-versioning feature is partial checkout. It allows a user to select a subset of data to check out into their working copy, avoiding downloading gigabytes of data you do not care about. You can, for instance, check out only the features within a given spatial extent.

So far, so good. But if you have only part of your data, you cannot ensure that modifying a primary key field will preserve uniqueness. In this particular case, QGIS-versioning will trigger errors on commit, pointing out the invalid rows you have to modify so the unique constraint remains valid.

Error when committing a non-unique key after a partial checkout

Tests

There is a lot to check when you intend to replace the existing constraint system with your own trigger-based one. In order to ensure QGIS-versioning’s stability and reliability, we put some special effort into building a test set that covers all use cases and possible exceptions.

What’s next

There are now no known limitations on using QGIS-versioning on any of your databases. If you think of a missing feature or just want to know more about QGIS and QGIS-versioning, feel free to contact us at infos+data@oslandia.com. And please have a look at our support offering for QGIS.

Many thanks to eHealth Africa who helped us develop these new features. eHealth Africa is a non-governmental organization based in Nigeria. Their mission is to build stronger health systems through the design and implementation of data-driven solutions.

by Julien Cabieces at September 30, 2019 07:52 AM

September 28, 2019

FOSS4GUK (https://uk.osgeo.org/foss4guk2019/) came and went a week or so ago in Edinburgh, and to my mind it was a game-changer for our UK events. This is not going to be a detailed post about how great it was (yes, it was great) or how good the venue was (also great), but a reflection on how it was different. For one, there were 250 attendees, which is a step up from previous events.

September 28, 2019 10:00 AM

September 27, 2019

Dear readers,

Yesterday, at around 10 PM, Tim Schaub announced on the OpenLayers GitHub that the long-awaited (at least by me) version 6.0 is officially available. There have been more than 1,780 commits and 540 pull requests since version 5.3.

Among the new features, an important one in this version is the ability to compose layers with different renderer types. Previously, the map used a single rendering strategy, and all the layers in your map had to implement that strategy.

It is now possible to have a map with layers that use different rendering technologies. This makes it possible, for example, to compose a Canvas (2D) layer together with a WebGL-based layer on the same map. It is also possible to create layers with custom renderers, so you can have a map that uses another library (such as d3) to render one layer and OpenLayers to render the others.

In addition, version 6.0 includes several improvements to vector tile rendering and should consume less memory overall. The release also includes several experimental features that are not yet part of the stable API, such as a new WebGL-based renderer and the experimental useGeographic() function.

This release includes several changes that are incompatible with previous versions, so it is important to read the release notes to check what has changed since version 5.3.

Source: OpenLayers GitHub

by Fernando Quadro at September 27, 2019 12:05 PM

Tips about the Semi-Automatic Classification Plugin for QGIS

Processing satellite images requires a large amount of disk space for temporary files (required during processing but deleted afterward). If your system disk has little space available, you can change the SCP temporary directory to a different location with plenty of disk space.



For any comment or question, join the Facebook group about the Semi-Automatic Classification Plugin.

by Luca Congedo (noreply@blogger.com) at September 27, 2019 08:00 AM

September 26, 2019

Did you know that MapGuide uses CS-Map, a coordinate system transformation library with support for many thousands of coordinate systems out of the box? No matter how esoteric the map projection your data is in, MapGuide can re-project geospatial data in and out of it thanks to this library.

So it is quite a shame that, for the longest time, MapGuide's powerful coordinate system transformation capabilities have had zero representation in any part of the mapagent (the HTTP API provided by MapGuide).

Using MgCoordinateSystem and friends (the classes that wrap the CS-Map library) requires building your own MapGuide application with the MapGuide Web API in one of our supported languages (.net, Java or PHP) and having your client application call into that application. There is nothing out of the box in the mapagent for a client map viewer application to make a basic HTTP request to transform coordinates, or to request feature data transformed to a certain coordinate system.

For MapGuide 4.0, we've exposed the coordinate system transformation capabilities in APIs where it makes sense, such as our SELECTFEATURES APIs for returning feature data. In my previous showcase post, I showed how appending VERSION=4.0.0 and CLEAN=1 now gives you an intuitive GeoJSON result for this API.



The coordinates are based on the coordinate system of the data source they come from. In the above screenshot, these are latitude/longitude values (code: LL84, epsg: 4326).

Suppose you want this data in web mercator (code: WGS84.PseudoMercator, epsg: 3857) so you can easily plonk the GeoJSON onto a slippy map with OpenStreetMap or your own XYZ tiles, which would also be in web mercator. With MapGuide 4.0, you can now also include a TRANSFORMTO=WGS84.PseudoMercator parameter in your HTTP request, and that GeoJSON data will be transformed to web mercator.
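
As an illustrative sketch (the mapagent URL, resource ID and class name below assume a default install with the stock Sheboygan sample data set; adjust them for your own setup):

# Return parcel features as GeoJSON, reprojected to web mercator
curl "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=SELECTFEATURES&VERSION=4.0.0&CLEAN=1&FORMAT=application/json&USERNAME=Anonymous&RESOURCEID=Library://Samples/Sheboygan/Data/Parcels.FeatureSource&CLASSNAME=SHP_Schema:Parcels&TRANSFORMTO=WGS84.PseudoMercator"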


The other major capability gap is that the mapagent offers no API for transforming a series of coordinates from one coordinate system to another. For MapGuide 4.0, there's now a dedicated API for doing this: the CS.TRANSFORMCOORDINATES operation.

With this operation, simply feed it:

  • A comma-separated list of space-separated coordinate pairs (COORDINATES)
  • The CS-Map code of the source coordinate system the above coordinates are in (SOURCE)
  • The CS-Map code of the target coordinate system to transform to (TARGET)
  • Your desired output format of JSON or XML (FORMAT)
  • If you want JSON, whether you want the pretty JSON version (CLEAN)
Invoking the operation will give you back a collection of transformed coordinates.
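
A hedged sketch of such a request (the coordinate values are arbitrary, the space within each pair is URL-encoded as %20, and the mime-type style FORMAT value is an assumption on my part):

# Transform two LL84 points to web mercator
curl "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=CS.TRANSFORMCOORDINATES&VERSION=4.0.0&USERNAME=Anonymous&SOURCE=LL84&TARGET=WGS84.PseudoMercator&COORDINATES=-87.7%2043.7,-87.69%2043.75&FORMAT=application/json&CLEAN=1"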



A place where this new API is sorely needed is MapGuide Maestro, which previously required kludgy workarounds for transforming bounding boxes in Map Definitions (when adding layers in different coordinate systems, or re-computing extents). The next milestone of MapGuide Maestro will take advantage of this new API for transformation when connected to a MapGuide 4.0 or newer server.

by Jackie Ng (noreply@blogger.com) at September 26, 2019 09:28 AM

September 25, 2019

PyWPS-4.2.2 Released

A new bugfix release of PyWPS has been produced, with the following fixes:

  • Fixed scheduler extension (#480).
  • Fixed ValuesReference implementation (#471, #484).
  • Fixed AllowedValue range (#467, #464).
  • Add metalink support to facilitate outputs with multiple files (#466).
  • Rename async to async_ for Python 3.7 compatibility (#462).
  • Improve queue race conditions (#455).
  • Numerous bug-fixes, additional tests and documentation improvements.

September 25, 2019 12:00 AM

September 24, 2019

The geographic component is increasingly recognized as a fundamental attribute of information. Reality is expressed in the territory, and tools that let us manage the territory will let us manage reality better.

“A geographic information system (GIS) is a set of tools that integrates and relates various components (users, hardware, software, processes), enabling the organization, storage, manipulation, analysis and modelling of large quantities of real-world data linked to a spatial reference, and facilitating the incorporation of social-cultural, economic and environmental aspects that lead to more effective decision-making.

In the strictest sense, it is any information system capable of integrating, storing, editing, analyzing, sharing and displaying geographically referenced information. In a more generic sense, GIS are tools that allow users to create interactive queries, analyze spatial information, edit data and maps, and present the results of all these operations.” (Wikipedia)

Not all engineers use GIS, but it is an increasingly useful tool for more and more types of projects. Take one of these free online courses and improve your skills. Start training with gvSIG today. Let's go!

by Alvaro at September 24, 2019 09:17 AM

We are pleased to announce the release of GeoServer 2.15.3 with downloads (zip|war), documentation (html) and extensions.

This is a maintenance release recommended for production. This release is made in conjunction with GeoTools 21.3 and GeoWebCache 1.15.3. Thanks to everyone who contributed to this release.

For more information see the GeoServer 2.15.3 release notes.

Improvements and Fixes

This release includes a number of improvements, including:

  • Enhance mongodb schema generation
  • Allow setting Entity Expansion limit on WFS XML Readers
  • Promote authkey to extension (GSIP-174)
  • Make MongoDB App-Schema schema rebuilder endpoints only rebuild schemas present in mappings.
  • Promote status-monitoring module from Community to Extension
  • Upgrade Jetty to 9.4.18.v20190429

A number of fixes are also present:

  • WMS GetFeatureInfo formats text/html, text/plain, text/xml and application/vnd.ogc.gml (GML2) don’t handle time correctly
  • Wrong URL scheme in layer preview (when using HTTPS)
  • GetCapabilities on a single layer fails if a style is duplicated
  • Renaming a layer doesn’t update Data Security rules
  • SLD file renamed with REST PUT request when not needed
  • GeoTIFF sources configured with GeoServer 2.14.x might not work in 2.15.x
  • Style editor extension point not working
  • NullPointerException on WFS ComplexGeoJsonWriter Link check
  • Switching from System Status to Modules tab gives an error.
  • Nodata is not made transparent after channelSelect+contrastEnhancement on multibands dataset
  • WFS GeoJSONBuilder limits max nested level to 20

About GeoServer 2.15 Series

Additional information on the 2.15 series:

Java 11 compatibility is the result of a successful code sprint. Thanks to the participating organizations (Boundless, GeoSolutions, GeoCat, Astun Technology, CCRi) and the sprint sponsors (Gaia3D, atol, osgeo:uk, Astun Technology).

by Andrea Aime at September 24, 2019 08:01 AM

September 23, 2019

From October 1st to 3rd, Feria Valencia hosts a new edition of Ecofira and Efiaqua, two reference trade fairs on sustainable management and the environment that this year focus on urban services and technological efficiency. Both fairs have put together a full agenda of conferences and activities: Ecofira will address topics such as the circular economy, climate change and technological advances for smart cities, while Efiaqua has scheduled an ambitious agenda around water directives and governance, sustainable water management in municipalities and digitalization in the sector. Alongside both events, Iberflora, the leading professional event for the green sector in Europe and a reference among gardening fairs, will also take place.

During the event, in the showcasing area, the gvSIG Association will be present at the AVAESEN stand, where you can ask any questions about gvSIG solutions and see demonstrations of the use of Geographic Information Systems in municipal management and their importance for so-called Smart Cities. Open source geomatics at everyone's service.

by Alvaro at September 23, 2019 02:16 PM

FOSS4G 2019

Dear Reader,

we are putting together in this post all the presentations that were given by our staff during this year's FOSS4G in Bucharest, Romania.

Here is the complete list. Enjoy!

  • State of GeoServer 2019
  • GeoServer feature frenzy
  • GeoServer WFS3: introduction to the RESTful, schemaless, JSON first download service
  • Creating Stunning Maps in GeoServer, with SLD, CSS, YSLD and MBStyles
  • Standing up a OSM clone with GeoServer and CSS
  • Crunching Data In GeoServer: Mastering Rendering Transformations, WPS Processes And SQL Views
  • Mapping the world beyond Web Mercator
  • Using the OGC Web Processing Service (WPS) to move business logic to the server
  • State of JAI
  • Introduction to MapStore, mashups made simple
  • One Geonode, many Geonodes
  • State of GeoNode

If you are interested in learning about how we can help you achieve your goals with open source products like GeoServer, MapStore, GeoNode and GeoNetwork through our Enterprise Support Services offer, feel free to contact us!

The GeoSolutions team,

by simone giannecchini at September 23, 2019 01:12 PM


September 20, 2019

Cologne city shown as a colorized 3D point cloud (data source: openNRW Germany)

The latest PDAL release (Point Data Abstraction Library, http://www.pdal.io/, version 2.0.1) has now been packaged for Fedora Linux.
I have cleaned up the dependencies (also the annoying former installation bug with PDAL-devel has been resolved).

The installation is as simple as this (the repository is located at Fedora’s COPR):

# enable and install PDAL
sudo dnf copr enable neteler/pdal
sudo dnf install PDAL PDAL-libs PDAL-vdatums

# if you want to compile other software like GRASS GIS with PDAL support, then you also need
sudo dnf install PDAL-devel
# Now, run PDAL:
pdal-config --version
pdal --help
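
To verify that the install can actually read point cloud data, a quick check (input.laz stands in for any LAS/LAZ file you have at hand):

# Print a summary of a point cloud file
pdal info --summary input.laz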

Enjoy!

The post PDAL 2.0.1 packaged for Fedora including vertical datums and grids appeared first on GFOSS Blog | GRASS GIS and OSGeo News.

by neteler at September 20, 2019 08:24 PM

If you've ever worked with the HTTP-based mapagent API, you should be aware that most API responses from the mapagent come back in two forms:

  • XML
  • JSON
However, if you ever look at the JSON responses, you'll be used to dealing with ugly monstrosities like this (e.g. a JSON version of a Layer Definition):



Or heaven forbid, you want to explore the structure of a Feature Source and so you need the JSON version of a FDO feature schema:



Good luck trying to parse and comprehend that!

Why is the JSON so horrible? The answer is that XML is the canonical response format for all non-image/binary mapagent operations (MapGuide Open Source's inception was in 2004 after all, when XML was king and JSON probably didn't exist yet), so the JSON version is a literal translation of whatever its XML form would be. While there is nothing wrong with that, the way this JSON translation was done takes the lowest-common-denominator approach:
  • Treat every XML element as a JSON array.
  • Treat every XML element text body as a JSON string.
As a result, as evidenced by the above screenshots, all JSON responses are nothing but a series of JSON arrays (of possibly more JSON arrays) of strings.

But there is a reason for such laziness. The JSON translation knows nothing about the content model of whatever XML it's trying to translate, so it really has no choice in the matter. Such content models are defined in the various XML schemas that are shipped with MapGuide. But working with XML schemas in C++ sounds like a nightmare in and of itself, so there is another way to determine the content model: Just manually hard-code the list of all possible XML xpaths that are:
  • Repeating (i.e. should be converted to actual JSON arrays)
  • Not a string
Using the existing XML schemas as the reference, building this list only required a one-off painstaking translation. The end result isn't exactly pretty, but it gets the job done. With this hard-coded list now in our JSON converter, we have the means to output JSON that is much easier to comprehend and far more intuitive!

For MGOS 4.0, for any mapagent operation that returns JSON, if you specify VERSION=4.0.0 and CLEAN=1 in the request parameters you will now get a JSON format that actually makes sense!
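
As a minimal sketch (assuming a default local install, the stock Sheboygan sample resources, and the usual FORMAT parameter to ask for JSON in the first place):

# Request the clean JSON translation of a Layer Definition
curl "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=GETRESOURCECONTENT&VERSION=4.0.0&CLEAN=1&FORMAT=application/json&USERNAME=Anonymous&RESOURCEID=Library://Samples/Sheboygan/Layers/Parcels.LayerDefinition"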



Now for some mapagent operations, even having a "clean" JSON version still isn't enough, because the clean JSON response remains mostly clunky and unusable. Consider the FDO feature schema example: the original JSON version is an unworkable monstrosity, and the "clean" version would've just been lipstick on a pig, because at the end of the day it is still a literal JSON translation of an XML schema!



So as a result, for this set of APIs, a new, simpler XML response format is introduced for schema and class definitions by including SIMPLE=1 in your VERSION=4.0.0 request:
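
For example (again a sketch against the assumed default install and Sheboygan sample data, with parameter names taken from this post):

# Request the simplified schema/class response as clean JSON
curl "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=DESCRIBEFEATURESCHEMA&VERSION=4.0.0&SIMPLE=1&CLEAN=1&FORMAT=application/json&USERNAME=Anonymous&RESOURCEID=Library://Samples/Sheboygan/Data/Parcels.FeatureSource"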



From which its JSON equivalent (I hope you'll agree) is much easier to comprehend!



There is another class of operations that has received similar treatment: operations that return geometric/feature data. Consider the XML form of selecting features (the ugly schema section collapsed for brevity):



This is what the "ugly" JSON version would've looked like (schema parts again collapsed for brevity). 



Even if we prettied this up with content-model-aware conversion, the JSON would still be mostly unremarkable. If we're serving JSON feature data out to clients, then there's really only one format we should care about supporting:


GeoJSON is the de facto format for serving out JSON-based feature data, and is universally supported by any mapping/GIS product/tool/library/viewer worth a damn. So if we have the opportunity for MapGuide to return clean JSON data for geometries/features, forget about the literal conversion-from-XML approach: just go ahead and make GeoJSON output a first-class citizen.

So for MGOS 4.0, if you run any mapagent operation that returns geometry/feature data and you ask for VERSION=4.0.0 and CLEAN=1 you will now get GeoJSON!



Which (due to GeoJSON's ubiquity) can be plugged directly into your web mapping library of choice for display if you so choose.

If you are building your own server-side MapGuide applications with the MapGuide Web API, there's a new MgGeoJsonWriter class available for you to convert MgFeatureReaders to GeoJSON data.

With MapGuide Open Source 4.0, JSON support is no longer an afterthought, it's actually usable!

by Jackie Ng (noreply@blogger.com) at September 20, 2019 06:11 PM

Tips about the Semi-Automatic Classification Plugin for QGIS

Activating the Classification preview pointer, you can display a preview with a left click on the image. With a right click, you can display the Algorithm raster, which represents the minimum spectral distance (calculated by the algorithm) of a pixel signature to any training spectral signature. Dark pixels are more distant from any spectral signature, and possibly a new ROI should be collected for these pixels.




For any comment or question, join the Facebook group about the Semi-Automatic Classification Plugin.

by Luca Congedo (noreply@blogger.com) at September 20, 2019 08:00 AM

September 19, 2019

by Fernando Quadro at September 19, 2019 10:30 AM

September 18, 2019

We are happy to announce the release of GeoServer 2.16.0. Downloads are available (zip and war) along with docs and extensions.

This is a stable GeoServer release made in conjunction with GeoTools 22.0.

Faster map rendering of complex styles

If you have very complex styles, with lots of rules and complex filtering conditions, you'll be pleased to hear that GeoServer 2.16.x can locate the right symbolizer much more quickly than previous versions. This is useful, for example, in the GeoServer home page demo map, rendered from OSM data using an OSM Bright clone built with the CSS module.

The GeoSolutions offices in Massarosa (Viareggio), Italy, in the geoserver.org demo map

Dynamic densification on reprojection

GeoServer has always reprojected data “point by point”; this typically caused long lines represented by just two points to turn into straight lines, instead of the curves they were supposed to be.

In GeoServer there is a new “advanced projection handling” option in WMS enabling on-the-fly densification of data: the rendering engine computes how much deformation the projection applies in the area being rendered, and densifies the long lines before reprojection, resulting in eye-pleasing curves in the output. See a “before and after” comparison here:

Reprojection, original point by point versus densified mode in 2.16.x

EPSG database updated to v 9.6

Thanks to the sponsorship of GeoScience Australia the EPSG database has been updated to version 9.6, including roughly a thousand more codes than the previous version available in GeoServer. The code has also been updated to ensure the NTv2 grid shift files between GDA94 and GDA2020 work properly.

Complex GeoJSON output changes

GeoServer WFS can already output GeoJSON from complex features data sources (app-schema). However, the output could be less than pleasing at times, so the following improvements have been made:

  • The property/element alternation typical of GML used to be preserved, causing deeply nested and ugly-looking structures. Not everyone loves writing a “container.x.x” access to reach the x value; with 2.16.x the output skips one of the containers and exposes a direct “container.x” structure
  • XML attributes are now turned into plain JSON properties, and prefixed with a “@”
  • Feature and data types are not lost anymore in translations, preserved in “@featureType” and “@dataType” attributes
  • Full nested features are encoded as GeoJSON again, keeping their identifiers

Thanks to the sponsorship of the French geological survey – BRGM and the French environmental information systems research center – INSIDE, the 2.16.0 output now looks as follows:

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "id": "0001000001",
      "geometry": {
        "type": "Point",
        "coordinates": [51.0684, 1.4298]
      },
      "properties": {
        "@featureType": "Borehole",
        "identifier": {
          "value": "BSS000AAAA",
          "@codeSpace": "http://www.ietf.org/rfc/rfc2616"
        },
        "bholeHeadworks": [
          {
            "type": "Feature",
            "geometry": {
              "type": "Point",
              "coordinates": [51.0684, 1.4298]
            },
            "properties": {
              "@featureType": "BoreCollar",
              "collarElevation": {
                "value": -32,
                "@srsName": "http://www.opengis.net/def/crs/EPSG/0/5720",
                "@srsDimension": "1",
                "@uomLabels": "m"
              }
            }
          }
        ],

Status Monitoring module promoted to Core

The Status Monitoring module has been promoted to core, and is now included in the GeoServer status page by default!

This module adds a new tab to the Server Status page with system statistics, so that you can monitor the system on which GeoServer is running from the Web GUI.


Authentication key module graduated to extension

The “Authkey” module has been graduated to extension, allowing security-unaware applications to access GeoServer. As a reminder, in order to keep the system secure, the keys should be managed as temporary session tokens by an external application (e.g. MapStore can do this).

PostGIS data store improvements

The PostGIS data store sees a few improvements, including:

  • TWKB encoding for geometries for all WMS/WMTS requests, reducing the amount of data travelling from the database to GeoServer
  • The JDBC driver used to transfer all data as ASCII; the code was modified to allow full binary transfers when prepared statements are enabled (a driver limitation: binary can only be enabled in that case)
  • SSL encryption control: the driver defaults to having it on, with a significant overhead; if the communication happens in a trusted network, encryption can be disabled with a benefit to performance
  • Improved encoding of “or-ed” filters, which now use the “in” operator where possible, increasing the likelihood that an index on that column will be used
  • Native KNN nearest search when using the “nearest” filter function

OGC/GDAL stores updated to GDAL 2.x

The OGR datastore as well as the GDAL image readers have been updated and now work against GDAL 2.x official binaries, without requiring custom builds any longer.

The OGR datastore can open any vector data source and, in particular, it can use the native FileGDB library when running on Windows. It's also interesting to note that it can open Spatialite files, quite important now that the direct Spatialite store is gone.

Azure GWC blobstore

Tiles can now be stored in Azure blob containers, increasing GWC compatibility with cloud environments, after the already existing S3 support.

A warning though: unlike S3, Azure does not provide a mass blob delete API, so on truncate GWC will have to remove tiles by making a DELETE request for each one (using parallel requests, of course).

SLDService community module graduated to extension

The SLDService community module allowed generating classified maps of vector data based on criteria such as equal intervals, quantiles and unique values.

The same module has now graduated to extension, also providing data filtering based on standard deviation and equal-area classification, and offering all the same services on raster data as well (with automatic sub-sampling when the source image is too large).

For example, creating a five-class quantile classification of the states layer's PERSONS attribute over a custom color ramp can be achieved using the following:

curl -v -u admin:geoserver -XGET \
  "http://localhost:8080/geoserver/rest/sldservice/states/classify.xml?attribute=PERSONS&method=quantile&intervals=5&ramp=CUSTOM&startColor=0xf7fcb9&endColor=0x31a354&fullSLD=true"

New Community Modules

  • WMTS styling module, which adds the ability to get/put a style on a per layer basis using restful resources exposed as ResourceURL
  • OGC API module, including implementations of the new OGC Web APIs for Features, Tiles and Styles (more to come in the upcoming months). Mind, these are cool, but they are also prototypes based on specifications still in draft form; we have warned you, the API will likely go through a few more rounds of changes before it stabilizes.
  • MapML community module. See this video for step by step installation instructions.

Other assorted improvements

There are many improvements to look at in the 2.16.0 release notes, as well as in the 2.16-RC release notes. Cherry-picking a few, we have:

  • Integrated GWC fails to seed layers if any data security is configured
  • Default Datastore Parameters panel does not allow https:// protocol values
  • Parameter Extractor plugin cannot mangle URL correctly if Monitor plugin is installed
  • Permit extensibility of Common Formats from Layer Preview page
  • Update name to id in OGC API Collection
  • Add support for configuring ACL in gwc-s3 community module
  • Enhance mongodb schema generation

About GeoServer 2.16

GeoServer 2.16 was first released in September 2019.

by Andrea Aime at September 18, 2019 04:47 PM