Working with movement data analysis, I’ve banged my head against performance issues every once in a while. For example, PostgreSQL – and therefore PostGIS – run queries in a single thread of execution. This is now changing, with more and more functionality being parallelized. PostgreSQL version 9.6 (released on 2016-09-29) included important steps towards parallelization, including parallel execution of sequential scans, joins and aggregates. Still, there is no parallel processing in PostGIS so far (but it is under development as described by Paul Ramsey in his posts “Parallel PostGIS II” and “PostGIS Scaling” from late 2017).
At FOSS4G 2016 in Bonn, I had the pleasure of chatting with Shoaib Burq, who ran the “An intro to Apache PySpark for Big Data GeoAnalysis” workshop. Back home, I downloaded the workshop material and gave it a try, but since I wanted a scalable system for storing, analyzing, and visualizing spatial data, it didn’t really seem to fit the bill.
Around one year ago, my search grew more serious since we needed a solution that would support our research group’s new projects where we expected to work with billions of location records (timestamped points and associated attributes). I was happy to find that the fine folks at LocationTech have some very promising open source projects focusing on big spatial data, most notably GeoMesa and GeoWave. Both tools take care of storing and querying big spatio-temporal datasets and integrate into GeoServer for publication and visualization. (A good – if already slightly outdated – comparison of the two has been published by Azavea.)
My understanding at the time was that GeoMesa had a stronger vector data focus while GeoWave was more focused on raster data. This led me to try out GeoMesa. I published my first steps in “Getting started with GeoMesa using Geodocker”, but things only really started to take off once I joined the developer chats and was pointed towards CCRI’s cloud-local, “a collection of bash scripts to set up a single-node cloud on your desktop, laptop, or NUC”. This enabled me to skip most of the setup pains and go straight to testing GeoMesa’s functionality.
The learning curve is rather significant: numerous big data stack components (including HDFS, Accumulo, and GeoMesa), a most likely new language (Scala), as well as the Spark computing system require some getting used to. One thing that softened the blow is the fact that writing queries in SparkSQL + GeoMesa is pretty close to writing PostGIS queries. It’s also rather impressive to browse hundreds of millions of points by connecting QGIS TimeManager to a GeoServer WMS-T with GeoMesa backend.
Spatial big data stack with GeoMesa
One of the first big datasets I’ve tested is taxi floating car data (FCD). At one million records per day, the three years in the following example amount to a total of around one billion timestamped points. A query for travel times between arbitrary start and destination locations took a couple of seconds:
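For illustration, such a travel-time query in SparkSQL with GeoMesa’s geospatial UDFs might look roughly like this (the table and column names are made up; st_makeBBOX and st_contains are among the spatial functions GeoMesa registers in SparkSQL):

```sql
SELECT avg(travel_time_s) AS avg_travel_time
FROM taxi_fcd
WHERE st_contains(st_makeBBOX(16.36, 48.20, 16.38, 48.22), start_point)
  AND st_contains(st_makeBBOX(16.40, 48.24, 16.42, 48.26), end_point)
```

As mentioned above, this reads almost exactly like the equivalent PostGIS query, which considerably softens the learning curve.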
Travel time statistics with GeoMesa (left) compared to Google Maps predictions (right)
Besides travel time predictions, I’m also looking into the potential for predicting future movement. After all, it seems not unreasonable to assume that an object would move in a similar fashion as other similar objects did in the past.
Early results of a proof of concept for GeoMesa based movement prediction
Big spatial data – both vector and raster – are an exciting challenge bringing new tools and approaches to our ever expanding spatial toolset. Development of components in open source big data stacks is rapid – not unlike the development speed of QGIS. This can make it challenging to keep up but it also holds promises for continuous improvements and quick turn-around times.
If you are using GeoMesa to work with spatio-temporal data, I’d love to hear about your experiences.
This post is to inform you about an error preventing Landsat previews from being displayed in the Download tab. After a search, clicking any preview causes the error "Unable to connect".
The error is caused by changes to the Landsat Collection 1 preview URLs on the site earthexplorer.usgs.gov. The search queries to Earthdata (earthdata.nasa.gov) still return the old URLs, causing this issue. I hope that they will fix this soon, but at the moment the Landsat previews aren't available in SCP; therefore, the only way to download Landsat images is to disable the option "only if preview in Layers". This way you should be able to download all the images listed in the table.
You can use OGR to move data into and out of CARTO. And you can use QGIS to view and edit layers supported by OGR. So it would stand to reason that you should be able to use QGIS to view and edit CARTO data directly: but how?
Here’s one quick and dirty way to connect QGIS to your CARTO layers.
First, you need to make sure QGIS can access your layers using a CARTO master API key. The OGR driver reads system tables, so it requires the master key to operate.
Open the QGIS Preferences menu.
Navigate to the System panel.
Scroll to the Environment area.
Add a new environment variable, CARTO_API_KEY.
Put your API key in the “Value” field.
Press the OK button.
Shut down QGIS and re-open it to bring the new environment variable into effect.
Now we need to record the connection information OGR will need to access CARTO, and put that information into a “VRT” file.
A VRT file defines connection information and layer names so that an OGR client (like QGIS) can easily connect to a source without reading a lot of metadata. Here’s an example minimal VRT file with two layers defined:
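A minimal VRT along those lines, reconstructed here following the GDAL VRT driver format, might look like this (replace username with your own CARTO user name):

```xml
<OGRVRTDataSource>
    <OGRVRTLayer name="subway_stations">
        <SrcDataSource>Carto:username</SrcDataSource>
        <SrcLayer>nyc_subway_stations</SrcLayer>
        <LayerSRS>EPSG:4326</LayerSRS>
        <GeometryType>wkbPoint</GeometryType>
    </OGRVRTLayer>
    <OGRVRTLayer name="streets">
        <SrcDataSource>Carto:username</SrcDataSource>
        <SrcLayer>nyc_streets</SrcLayer>
        <LayerSRS>EPSG:4326</LayerSRS>
        <GeometryType>wkbLineString</GeometryType>
    </OGRVRTLayer>
</OGRVRTDataSource>
```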
This file reads CARTO tables nyc_subway_stations and nyc_streets and exposes them to QGIS using the names “subway_stations” and “streets”.
The <LayerSRS> should always be EPSG:4326, as that is the system CARTO always uses.
The <SrcDataSource> is of the form “Carto:username”, where “Carto” tells OGR what driver to use and the “username” is your CARTO user name. For multi-user accounts, “username” must be the user name and not the organization name.
The <GeometryType> is optional, but ensures that OGR knows whether the input layer is a point, line or polygon.
You can test your VRT file using the ogrinfo utility. You should be able to run ogrinfo and get a listing of layers back, for example:
# ogrinfo carto.vrt
INFO: Open of `carto.vrt'
using driver `OGR_VRT' successful.
Once you have a working VRT file, you can add the file as a layer in QGIS!
Adding the Layer
After you’ve set up your API key and authored your VRT file, go to Layer > Add Layer > Add Vector Layer… in QGIS, and select your VRT as the source vector dataset, using a “File” source type.
If your VRT includes multiple layer definitions, you’ll be asked to select which layers (or all of them) that you want to add. Then you should be able to see the data draw on your QGIS map!
Working with the Layers
Once you have loaded the layers, they work just like any other QGIS layer:
You can style them any way you like.
You can include them in printed output.
You can reproject the map and see them in other projections.
You can edit them (yes, really!)
You can include them in QGIS analyses.
Since you are editing the live data in CARTO, it’s possible to apply edits in QGIS and see your published CARTO maps update in real time!
With the aim of improving gvSIG Online, we ask you to spend a few minutes completing a short survey. All answers will be treated confidentially and will not be used for any purpose other than the research we are conducting on gvSIG Online. The survey consists of 20 questions and will take about 5 minutes to complete.
We have prepared an exercise that you have to complete and send to us for evaluation in order to get the certificate. The cost of the certification has been reduced considerably, with the objective of allowing anyone to get it.
All the information about the exercise, the email address to send it to, and the payment is available in the PDF file at this link.
OpenMapTiles has always encouraged the development of map services by giving an option to build a self-hosted map with open-source tools and by providing free non-commercial hosting. For those who support us with their paid plan, we introduce another premium service: a new Street map style.
Map with a clear roads’ hierarchy
Each map style should highlight only the information relevant to its primary objective. When we were designing the new map style, we focused on transportation. Hence the name: Streets.
The Streets style is a base map highlighting different means of transportation. Therefore we include roads, the railway network, ferry routes, and airports. Moreover, it also shows relevant POIs like bus stops, petrol stations, and subway stations.
The Streets style is a fresh alternative to the default Google Maps style.
To highlight the road hierarchy, each type of road is represented by a different color. While the most saturated colors represent major roads like highways, paler colors combined with thin lines are used for minor roads. At first glance, you can easily judge the overall network and make quick decisions.
Different types of transportation in Hong Kong and bilingual labels
Only relevant information for quick decision
When looking at a map, you want to capture all necessary information as fast as possible. On the other hand, placing too many details decreases readability and it takes you longer to get oriented on the map. Therefore we carefully picked only the relevant information for each zoom level: POIs, labels, minor roads, and others. As you zoom in, more details appear; they disappear again as you zoom out to give you a clear overview. Since this is our first version, there could still be some missing or redundant items, especially POIs. Any comments on this topic are very welcome.
A good map should also help the person using it recognize features of the real world on the map. On lower zoom levels, we display land use information that can be seen on a global satellite image. On higher zoom levels, we show transparent 3D buildings to help with orientation in the urban jungle. The buildings can be turned off.
The map is fully customizable, as our MapTiler TileHosting supports creating your own styles or derivatives of existing ones with a built-in WYSIWYG design tool.
Like each of our maps, the Streets style includes the possibility to change the language. Currently, more than 50 languages are supported.
Land use information as it can be seen on a global satellite image
All the map styles including Streets are visible as a browsable map on the MapTiler TileHosting main page or in the administration.
We have prepared an exercise that you must complete and send to us for evaluation in order to obtain the certificate. We have reduced the cost of the certification as much as possible, with the objective that anyone interested in obtaining it can afford it.
In order to simplify the installation of the latest PDAL release (Point Data Abstraction Library, https://pdal.io/, version 1.7.2) on Fedora, I have created an updated set of RPM packages, again including the vertical datums and grids available from OSGeo (i.e., .gtx files from here).
The installation is as simple as this (the repository is located at Fedora’s COPR):
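For reference, enabling a COPR repository and installing from it generally looks like the following (the repository name below is a placeholder; check the linked COPR page for the actual one):

```shell
# enable the COPR repository (placeholder name; see the linked COPR page)
sudo dnf copr enable someuser/pdal
# install PDAL including the packaged vertical datum grids
sudo dnf install PDAL
```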
I presented my “PostGIS for Managers” talk for the last time (at least in this form) today at FOSS4G North America. The reason it’s for the last time is that the central conceit it’s built around, that a core decision is between a proprietary and an open source database, isn’t really operative anymore. The real decisions are now being driven by other considerations, like cloud platforms, and the services available there. So, it’s not really PostgreSQL versus Oracle anymore.
I also presented my “SQL Festival” talk, for the first time! New material is always a little challenging: will it work, is it the right level for the audience? It seemed to be well received, a mix of generic SQL tidbits, and some specific interesting queries you can do with PostGIS.
This release brings a ton of fixes and compatibility enhancements for newer Android versions.
And then it brings cloud profiles. And I think this one is huge! :-)
In the last months, a new company entered the geopaparazzi community and has been making nice contributions. The guys from Geoanalytic have been working with us on the implementation of a more structured version of cloud project synchronization: Cloud Profiles!
We have been working on this already at the Bonn code sprint together with Cesar from the company Scolab. And now we are finally at a first release that supports this concept.
Cloud Profiles are a great way of easing Geopaparazzi’s data handling tasks. When a web server is configured to serve Cloud Profiles, Geopaparazzi can automatically download Projects, Basemaps, Spatialite Overlays, forms for Notes, and other files. When a user activates a downloaded Profile, Basemaps are made available, Overlays are attached to the Map View and layers are set to display.
The Geoanalytic guys have also been so nice to write a reference geopaparazzi profile server that can be used as a starting point. You can read more about it and download it from here.
The real power behind Cloud Profiles is that you can use your own server for this, but you can also keep it much simpler by using a generic cloud file server. Just put your files on a service like Dropbox or Google Drive, together with a Cloud Profiles list like the one above, and you have your own Cloud Profile server.
Technically, it was necessary to create a notification icon for geopaparazzi, so you will now see it whenever geopaparazzi is active.
The positive side is that:
you will always know when geopaparazzi is active
you can always see information about your position directly in the notification area
Other features and fixes
Values in settings. The settings screen now shows the actual values:
Buttons size. Small buttons are hard to see and press while on the trail. While it can be difficult to show a lot of information on a small screen, where there is room, you can now change the button and text size. This applies to the notes view for now:
Dynamic hints. Dashboard button hints are dynamic where possible. You can now see from there how many notes and logs you have in store. Just long-tap on the buttons:
Notes settings. The notes settings view is now accessible from the notes list (it was hidden in the gps data list menu). Access it from the palette icon:
PDF export. The PDF export now allows exporting a subset of notes instead of everything contained in the project. Users can select the notes they want to export, and only those will be converted to PDF.
Linked resources. It is now possible to view not only images stored in a Spatialite database and related to (geospatial) features, but also, for example, PDFs.
Note that while a user can take pictures in the field and link them to a feature, a PDF resource has to be linked beforehand, i.e. it can only be viewed from geopaparazzi, not created there.
Remove all. In the basemaps view it is now possible to remove all maps in one tap. This is really helpful for those who are used to loading maps through the load-folder option and would otherwise need a lot of time to remove them again to keep things readable.
Mapurl service. One sad note is that the Tanto Mapurl service, which was used to download automagically configured mapurls based on WMS services, is no longer maintained and has therefore been removed also from geopaparazzi.
One known issue: images taken within Geopaparazzi are not geotagged.
At CARTO we challenge ourselves to use our platform as our users do. For us, this serves several purposes:
Because we care a lot about our users' experience, this way we understand better the pains and gains of using our platform.
We keep learning a lot: from SQL to React, through WebGL, projections, spatial algorithms, and mapping in general.
Since use cases at CARTO go from very simple visualizations to complex geospatial solutions, one of the challenges I set for myself was to solve the graph coloring problem with CARTO.
Graph coloring is a technique to assign colors to the vertices of a graph such that no two adjacent vertices share the same color.
But what does graph coloring have to do with maps?
Map coloring is an application of graph coloring in which any two adjacent polygons (countries, provinces, etc.) are assigned different colors.
Map coloring helps to better understand maps and to solve other kinds of problems, like mobile radio frequency assignment or other scheduling problems.
Having said that, graph coloring is a very interesting topic covered by several theorems and algorithms, like the 4-color theorem, which basically states that any map can be colored using 4 colors.
Coloring the world map in 4 colors
Let’s put ourselves in the boots of a CARTO user who wants to draw the world map in 4 colors. We don’t have many programming skills, but we know a bit of SQL and some spatial concepts.
For this case we’ll work with the world_borders layer that can be imported into any CARTO account from our Data Library.
Modelling a graph in PostGIS
The graph coloring problem needs two mathematical artifacts to be solved: a model and an algorithm.
So we have to model a PostGIS table as a graph. This may sound like complex stuff, but it can actually be solved in less than 10 lines of SQL by creating an adjacency list:
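One possible formulation of that query is sketched below (assuming the standard CARTO cartodb_id and the_geom columns; ST_Intersects finds the neighbouring polygons):

```sql
SELECT DISTINCT
  a.cartodb_id,
  array_agg(b.cartodb_id) OVER (PARTITION BY a.cartodb_id) AS adjacent_list,
  count(*) OVER (PARTITION BY a.cartodb_id) AS valence
FROM world_borders a
JOIN world_borders b
  ON a.cartodb_id <> b.cartodb_id
 AND ST_Intersects(a.the_geom, b.the_geom)
ORDER BY valence DESC;
```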
With this query we obtain the world map adjacency list, which gives us, for each country, the list of adjacent countries and its valence (the number of adjacent countries).
Learning point: Note the use of PostgreSQL Window functions to aggregate and count the adjacent countries in a single column. Window functions are a really handy resource to have in your SQL tool box.
We can generalize this query by wrapping it as a PostgreSQL function so that it can be executed for any of our datasets and store the results in a table:
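A sketch of such a wrapper function (the names are illustrative; note the use of format with %I to safely quote identifiers):

```sql
CREATE OR REPLACE FUNCTION adjacency_list(table_name text)
RETURNS void AS $$
BEGIN
  EXECUTE format('
    CREATE TABLE %I AS
    SELECT DISTINCT
      a.cartodb_id,
      array_agg(b.cartodb_id) OVER (PARTITION BY a.cartodb_id) AS adjacent_list,
      count(*) OVER (PARTITION BY a.cartodb_id) AS valence
    FROM %I a
    JOIN %I b
      ON a.cartodb_id <> b.cartodb_id
     AND ST_Intersects(a.the_geom, b.the_geom)',
    table_name || '_adjacency_list', table_name, table_name);
END;
$$ LANGUAGE plpgsql;
```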
Learning point: By wrapping a query into a function we are able to re-use it and even provide of a geospatial framework to the users in our CARTO organization.
Note as well the use of the EXECUTE and format functions to avoid misuse of the function and SQL injection issues.
The Welsh-Powell algorithm: a greedy coloring approach
Let’s start by implementing the simplest algorithm for graph coloring, the Welsh-Powell one. The algorithm is as follows:
Find the adjacency list and valence for each vertex (in this case for each country)
List the vertices in order of descending valence
Color the first vertex in the list with color 1
Go down the list and color every vertex not connected to a previously colored vertex with the same color. Then cross out all colored vertices in the list.
Repeat on the uncolored vertices with a new color, always working in descending order of valence until all the vertices have been colored.
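The steps above can be sketched in a few lines of Python (a pure illustration of the algorithm on a toy graph; the post implements it as a PostGIS function instead):

```python
def welsh_powell(adjacency):
    """Greedy Welsh-Powell coloring.
    adjacency: dict mapping vertex -> set of adjacent vertices."""
    # steps 1+2: list the vertices in order of descending valence
    order = sorted(adjacency, key=lambda v: len(adjacency[v]), reverse=True)
    colors = {}
    color = 0
    while len(colors) < len(order):
        color += 1
        # steps 3+4: go down the list and color every uncolored vertex
        # not adjacent to a vertex already given the current color
        for v in order:
            if v not in colors and all(colors.get(n) != color for n in adjacency[v]):
                colors[v] = color
    return colors

# a small toy map: four mutually touching regions around a center
graph = {
    'center': {'n', 'e', 's', 'w'},
    'n': {'center', 'e', 'w'},
    'e': {'center', 'n', 's'},
    's': {'center', 'e', 'w'},
    'w': {'center', 'n', 's'},
}
coloring = welsh_powell(graph)
# no two adjacent vertices share a color
assert all(coloring[v] != coloring[n] for v in graph for n in graph[v])
```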
Let’s see how we can implement the Welsh-Powell algorithm as a PostGIS function.
A deeper look at the Welsh-Powell algorithm implementation
In this case our function has two different sections: first we DECLARE the temporary variables needed to store results, and second we have the actual algorithm implementation between the BEGIN and END clauses.
Since we need to model our dataset as an adjacency list we start by calling our adjacency_list function:
Then we need to know the number of rows in the dataset and start an iterative algorithm:
Let’s color the first vertex in the list with color 1
Go down the list and color every vertex not connected to the colored vertices above the same color
Finally, repeat on the uncolored vertices with a new color, always working in descending order of valence until all the vertices have been colored.
Now we have two functions that can be stored in our CARTO account by running them in the SQL console in our BUILDER dashboard, but how do we run this map coloring algorithm?
The best option here is our Batch SQL API, which allows us to safely run any SQL that could take several seconds or minutes. In this case we just have to do a SELECT on our greedy function, passing the table_name and our user_name. We can do this directly from a terminal:
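A sketch of such a call (the function, account, and key names are placeholders; the Batch SQL API accepts a POST to the /api/v2/sql/job endpoint):

```shell
curl -X POST -H "Content-Type: application/json" \
  -d "{\"query\": \"SELECT greedy_coloring('world_borders', 'my_username')\"}" \
  "https://my_username.carto.com/api/v2/sql/job?api_key=my_api_key"
```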
The resulting table world_borders_adjacency_list contains a color assigned to each cartodb_id. Now we can just join this table with the original world_borders table and apply a category-based thematic style to visualize the result (plus a bit of CartoCSS magic):
In this case we have colored every pair of adjacent countries with different colors, using a total of 5 colors. But can we do better?
Kempe’s graph coloring algorithm
In 1879, Alfred B. Kempe tried to prove the 4-color theorem, and while it was demonstrated years later that his proof didn’t cover all cases, the algorithm he designed can still be used to color the world map using just 4 colors.
Kempe’s graph coloring algorithm is as follows:
Convert the map to a graph (in this case an adjacency list)
Choose a vertex (polygon) with fewer than five neighbors and remove it from the graph. This may cause some vertices that previously had five or more neighbors to now have fewer than five.
Choose another vertex from the updated graph with fewer than five neighbors and remove it.
Continue until you’ve removed all the vertices from the graph.
Add the nodes back to the graph in reverse order from which you removed them.
Color the added node with a color that is not used by any of its current neighbors.
Continue until you’ve colored in the entire graph.
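The same reduce-and-recolor idea in Python (an illustrative sketch, not the PL/pgSQL version; without Kempe-chain recoloring this only guarantees six colors on planar graphs, though in practice it usually does much better):

```python
def kempe_coloring(adjacency):
    """Kempe-style coloring sketch: repeatedly remove a lowest-degree
    vertex (a planar graph always has one of degree <= 5), then re-add
    the vertices in reverse removal order, giving each the lowest color
    not used by its already-colored neighbors."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    stack = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))  # lowest remaining degree
        stack.append(v)
        for n in adj[v]:
            adj[n].discard(v)
        del adj[v]
    colors = {}
    for v in reversed(stack):  # add the nodes back in reverse order
        used = {colors[n] for n in adjacency[v] if n in colors}
        colors[v] = min(c for c in range(1, len(adjacency) + 2) if c not in used)
    return colors

# the same toy map as before: four regions around a center
graph = {
    'center': {'n', 'e', 's', 'w'},
    'n': {'center', 'e', 'w'},
    'e': {'center', 'n', 's'},
    's': {'center', 'e', 'w'},
    'w': {'center', 'n', 's'},
}
coloring = kempe_coloring(graph)
assert all(coloring[v] != coloring[n] for v in graph for n in graph[v])
```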
This algorithm is a little more complex, but it can still be implemented as a pure PostgreSQL function:
Again we can create the function from the SQL console, execute it for the world_borders dataset using the batch SQL API and then map it with BUILDER. Let’s see the result:
In this case we have colored the world map in 4 colors, challenge accomplished!
Learning point: We have not only learned how to solve the graph coloring problem with CARTO but we have ended up creating the basis for a geospatial framework by creating PostgreSQL functions into our CARTO account.
More map coloring
So, let’s finish by applying these map coloring functions, now part of our own geospatial framework inside CARTO, to a few more of our datasets.
The 4 color theorem applied to the US states dataset
A greedy approach to map color the US counties dataset
Another greedy example with the Spain municipalities
Note that the map coloring algorithm implementations presented in this blog post are totally naive and are not intended to be exact or to be used in a production environment; they are just meant to showcase a user workflow for solving a geospatial problem with CARTO.
For reference, all these functions are available here. Feel free to add any comment or improve them.
If you like the kind of stuff we are involved in you may want to join us :)
IOSACal is an open source program for calibration of radiocarbon dates.
A few days ago I released version 0.4, which can be installed from PyPI or from source. The documentation and website are at http://c14.iosa.it/ as usual. You will need to have Python 3 already installed.
The main highlights of this release are the new classes for summed probability distributions (SPD) and paleodemography, contributed by Mario Gutiérrez-Roig as part of his work for the PALEODEM project at IPHES.
A bug affecting calibrated date ranges extending to the present was corrected.
On the technical side the most notable changes are the following:
requires NumPy 1.14, SciPy 1.1 and Matplotlib 2.2
removed dependencies on obsolete functions
improved the command line interface
You can cite IOSACal in your work with the DOI https://doi.org/10.5281/zenodo.630455. This helps the author and contributors to get some recognition for creating and maintaining this software free for everyone.
As you know, a raster image can be represented using a ‘colour table’. These colour tables are generated from colour ramps. gvSIG Online includes a series of colour ramps, but in customized cases it may be interesting to create a new one. No problem: gvSIG Online allows us to do that.
We show you how to do it through a video-tutorial. In this video we follow these steps:
We have a geoportal with a raster image to which we can apply a concrete symbology (colour table). Let’s imagine that none of the existing colour ramps is what we want to apply.
We access the dashboard and see that our gvSIG Online instance contains several symbol libraries with a series of colour ramps.
We add a new symbol library that contains two new colour ramps. For each ramp, it indicates the colours that make it up.
Finally, we apply a colour table to our raster image, using the new available colour ramps.
As you can see, gvSIG Online is growing in functionalities day by day and it is becoming a reference when implementing Spatial Data Infrastructures.
We finish the theoretical-practical part of the free course “gvSIG applied to the Environment” by learning to generate printable maps.
In this module we will work on the “layout”, that is, the printed map resulting from all the analyses and geoprocesses we have carried out during the course and will carry out in our daily work.
While it is true that map printing has been declining since the appearance of webmapping platforms and platforms for managing Spatial Data Infrastructures (such as gvSIG Online), it doesn’t hurt to learn how to use this tool for technical reports, publications, etc.
We will learn to create a map template, to set up the elements necessary for the map to be understandable, and to insert the views of our projects in order to print our work.
You can access the new module at the following link:
Here's a new milestone release of MapGuide Maestro. Here's what's new in this release.
One of the design goals of the mapguide-react-layout viewer was to be highly compatible with existing Web and Flexible Layout documents, ensuring the same authoring experience as the existing AJAX and Fusion viewer offerings.
However, the authoring experience in Maestro knows nothing about mapguide-react-layout, so it still assumes the use of the AJAX or Fusion viewers.
With this release, we have a new preference for specifying the base URL of a mapguide-react-layout installation.
Once this is set, the Web and Flexible Layout editors light up with additional viewer URLs allowing you to load the Web/Flexible Layout with a mapguide-react-layout template of your choice.
Since the UI for this has changed from a read-only text box to a combo box, one cannot easily select the URL to copy/paste. To work around this, a convenience "Copy to Clipboard" button is included to easily copy the current viewer URL for pasting elsewhere.
The other authoring experience change is that a new Flexible Layout will no longer include Fusion widgets that are not supported by mapguide-react-layout. These widgets are really esoteric ones, so most authors probably won't notice any differences.
The new MgTileSeeder tool has been improved with the following changes:
A new --failed-requests parameter for specifying a log file to log failed requests to
A new --max-parallelism parameter for controlling the max degree of parallelism when sending tile requests
New xyz_replay and mapguide_replay commands for re-requesting failed requests from a log file previously logged via the new --failed-requests parameter
Improved OGR Feature Source support
The OGR provider has been the primary recipient of my continued developer attention since the release of MapGuide Open Source 3.1.1, as there are many things in the provider with room for improvement. Given this, it was time to also give the OGR provider equivalent treatment in Maestro, so with this release, feature sources using the OGR provider now have their own specialized editor.
Most of the UI here should be self-explanatory, but the Other Properties section deserves some explanation. This is a future-proof data grid for editing connection properties that may be introduced in future builds/releases of the OGR provider. Another case where the OGR provider gets better treatment is the support in the SHP feature source editor to convert it across to use the OGR provider.
This feature was added to improve the SHP story for MapGuide on 64-bit Linux, where the current dedicated SHP provider has unresolved 64-bit portability problems making it unusable. The OGR provider has no such issues, making it a viable alternative FDO provider for SHP files on 64-bit Linux, and this conversion feature helps facilitate the transition in an easy manner. However, there are still some glaring problems with the converted OGR feature source, such as the FDO provider using a hard-coded schema name of "OGRSchema". Nothing some Python scripting elbow grease can't fix afterwards. Or you can watch this space :)
Fix: Rename resource with "update references" checked will disregard overwrite flag
Fix: NullReferenceException when ticking a new geometry type and adding a rule to the grid for the first time in Layer Definition editor
Fix: MgInvalidRepositoryTypeException when validating layers. Previous workaround was to disable validation on save. With this release you can safely re-enable this if you so choose.
Fix: Editing default path in line usage context in Symbol Definition editor does nothing
Fix: Cannot browse symbol definition parameters in any field of the Path editor dialog
In Movement data in GIS #2: visualization I mentioned that it should be possible to label trajectory segments without having to break the original trajectory feature. While it’s not a straightforward process, it is indeed possible to create timestamp labels at desired intervals:
The main point here is that we cannot use regular labels because there would be only one label for the whole trajectory feature. Instead, we are using a marker line with a font marker:
By default, font markers only display one character from a given font but by using expressions we can make it display longer text, including datetime strings:
If you want to have a label at every node of the trajectory, the expression looks like this:
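A reconstruction of that expression (it assumes the m value holds seconds since the Unix epoch; the 'HH:mm:ss' format string is one choice among many):

```
format_time(
  to_datetime('1970-01-01T00:00:00Z') + to_interval(
    m( start_point( geometry_n(
      segments_to_lines( $geometry ),
      @geometry_part_num
    ))) || ' seconds'
  ),
  'HH:mm:ss'
)
```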
You probably remember those parts of the expression that extract the m value from previous posts. Note that – compared to 2016 – it is now necessary to add the segments_to_lines() function.
The m value (which stores time as seconds since Unix epoch) is then converted to datetime and finally formatted to only show time. Of course you can edit the datetime format string to also include the date.
If we only want a label every 30 seconds, we can add a CASE statement around that:
CASE
WHEN m( start_point( geometry_n(
  segments_to_lines( $geometry ),
  @geometry_part_num
))) % 30 = 0
THEN format_time(
  to_datetime('1970-01-01T00:00:00Z') + to_interval(
    m( start_point( geometry_n(
      segments_to_lines( $geometry ),
      @geometry_part_num
    ))) || ' seconds'
  ),
  'HH:mm:ss'
)
END
This works well if the trajectory sampling interval is fairly regular. This is not always the case and that means that the above case statement wouldn’t find many nodes with a timestamp that ends in :30 or :00. In such a case, we could resort to labeling nodes based on their order in the linestring:
@geometry_part_num % 30 = 0
Thanks a lot to @JuergenEFischer for providing a solution for converting seconds since Unix epoch to datetime without a custom function!
Open-source stack, usable on multiple SDKs
Whenever you work on a map, you need to define what each geographical feature will look like. This definition is described using a styling language; it says, for example, that rivers should be rendered starting at zoom level 10, in blue, as an 8px bold line.
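In GL JSON, for instance, such a river rule might be expressed as a style layer like this (the source and layer names are illustrative):

```json
{
  "id": "river",
  "type": "line",
  "source": "openmaptiles",
  "source-layer": "waterway",
  "minzoom": 10,
  "paint": {
    "line-color": "#4488ff",
    "line-width": 8
  }
}
```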
The styling languages differ in the syntax used for defining the final look and in the implementation by various software tools (mobile SDKs, raster servers, web APIs).
These are the most popular styling languages today:
GL JSON, the main styling language of OpenMapTiles, is based on the JSON file format originally defined by Mapbox and adopted by ESRI and others. It is supported by the Mapbox mobile SDKs, ArcGIS Pro, and the OpenMapTiles Server or TileServer GL raster server.
Tangram YAML uses the markup style popular in Python programming language and can be used with Tangram SDK and Tangram Paparazzi.
CartoCSS is similar to the CSS used to style websites. It is implemented by tools built on the Mapnik map renderer (the primary OpenStreetMap.org toolkit for rendering raster map tiles) and is also usable with the Carto mobile SDK. There is no native vector tile viewer for the web that implements CartoCSS directly.
OGC SLD is a standard based on XML. This language is hard for humans to write directly, so it is typically created by converting from another styling language or with a visual editor, such as desktop GIS tools like QGIS.
All of these styling languages are compatible with OpenMapTiles.
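As an illustration of the GL JSON flavor, here is a minimal style built as a Python dictionary, matching the river example from above (the source URL and layer id are placeholders, not taken from a real deployment; the source-layer name follows the OpenMapTiles schema):

```python
import json

# Minimal GL JSON style sketch: rivers rendered from zoom 10, blue, 8 px.
style = {
    "version": 8,
    "sources": {
        "openmaptiles": {"type": "vector", "url": "https://example.com/data/v3.json"},
    },
    "layers": [{
        "id": "rivers",
        "type": "line",
        "source": "openmaptiles",
        "source-layer": "waterway",   # layer from the OpenMapTiles schema
        "minzoom": 10,                # start rendering at zoom level 10
        "paint": {"line-color": "#0000ff", "line-width": 8},
    }],
}

# Serializes cleanly into the style.json a GL client would consume.
print(json.dumps(style)[:16])
```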
OpenLayers supports vector as well as raster tiles and is most advanced in handling coordinate systems and map projections.
Mapbox GL JS is another web mapping library. It supports both raster tiles and the vector format used by OpenMapTiles.
Tangram is another rendering client for web applications which supports OpenMapTiles vector tile schema.
Multiple mobile map SDKs
To bring your map to a mobile device you need an SDK, a kit that allows the development of applications for the mobile operating system. With OpenMapTiles, you have a choice of several SDKs for different platforms.
GL JSON is closely tied to the Mapbox mobile SDKs, which support both Android and iOS and, in addition, the Unity game engine.
Tangram YAML is connected to Tangram SDK, which allows you to create mobile apps for iOS and Android.
CartoCSS is best used with the Carto mobile SDK, which has built-in support for the Android, iOS and Windows Phone platforms.
Raster tiles on the server side
The freedom of choice is also kept on the server side with a variety of servers rendering raster as well as vector tiles.
With the release of three new map styles, CARTO shows its commitment to the OpenMapTiles project. The styles are available for web and mobile, both raster and vector. They are based on the OpenMapTiles data schema, use the project’s vector tiles, and are implemented in three different styling languages while keeping the same look and feel across the different software tools.
Voyager - colored map with clear road hierarchy
Positron - light gray map for further displaying of data
Dark Matter - dark gray map for displaying data of bigger size
The code is fully open-source and can be found on GitHub.
You can see our implementation of these styles, alongside several other beautiful styles such as Streets and Topo, in the MapTiler Cloud hosting.
While we are waiting for this year’s grant proposals to come in, it is time to look back at last year’s winning proposals and their results. These are the reports on the work that has been done within the individual projects:
QGIS 3D – Martin Dobias
Results are included in the QGIS 3.0 release. As proposed in the grant, a new 3D map view has been added, together with a GUI for easy configuration of 3D rendering. The 3D view displays terrain (either from a DEM raster layer or a simple flat area) with the 2D map rendered on top of it. In addition, vector layers can be rendered as true 3D entities: points may be visualized as simple geometric shapes or as 3D models (loaded from a file), while polygons and linestrings are tessellated into 3D geometries. 2D polygons can be turned into 3D objects using extrusion, possibly with data-defined height – an easy way to display buildings, for example. For data with 3D coordinates, the Z values of the geometries are respected. Although the 3D view is still in its early stages, it is already usable for many use cases. Hopefully this functionality will help to attract even more users to QGIS!
Transaction groups now make it easier to work with stored procedure calls. It is now possible to use ‘QgsTransaction.ExecuteSQL’, dirty the edit buffer so that the user is able to save changes, and give a name to that action so that the UNDO/REDO actions are more explicit. See the pull requests for more details:
We’ve unified all the various opacity, rotation and scale controls to use the same terminology and numeric scales. We’ve also updated ALL methods for setting opacity, rotation and scale within the PyQGIS API to use consistent naming and arguments, making the API more predictable and easier to use. Lastly, we’ve added a new reusable opacity widget (QgsOpacityWidget) to the GUI library so that future code (and 3rd-party scripts and plugins) can follow the new UI conventions for opacity handling.
Extend unit test coverage for geometry classes – Nyall Dawson
We’ve extended the unit testing coverage for all the underlying geometry primitive classes (points, lines, polygons, curves, collections, etc) so that all these classes have as close to 100% unit test coverage as possible. In the process, we identified and fixed dozens of bugs in the geometry library, and naturally added additional unit tests to avoid regressions in future releases. As a result QGIS’ core geometry engine is much more stable. Furthermore, we utilised the additional test coverage to allow us to safely refactor some of the slower geometry operations, meaning that many geometry heavy operations will perform much faster in QGIS 3.0.
Processing algorithm documentation – Matteo Ghetta & Alexander Bruy
The new Help system has landed and is already available: when opening a Processing algorithm and clicking the Help button, the algorithm's guide is shown in the default browser.
Many of the QGIS Processing algorithm guides have been enhanced with pictures and new or improved descriptions. A considerable number of pull requests have already been merged and many others are in review. Just a few descriptions still need to be enhanced.
Currently, all the QGIS algorithms have been described and all the PRs in the doc repository have been merged (kudos to Harrissou for all the reviews!).
Right now the Help button of each Processing dialog will open the related page of the algorithm, BUT:
if the name of the algorithm consists of only ONE word (e.g. clip, intersection…), the Help button also opens the browser at the correct section (that is, the user directly sees the description of the related algorithm)
if the name of the algorithm has more than one word (e.g. split polygon with lines, lines to polygon, etc.), the Help button opens the correct page (i.e. the algorithm GROUP) but is not able to jump to the correct algorithm anchor. This is because Sphinx converts “split with lines” into “split-with-lines”, while the QGIS system always casts the words into “splitwithlines”. Not a big deal, but IMHO a pity.
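The anchor mismatch can be sketched in a few lines of Python (a simplified model of the two slug rules, not the actual Sphinx or QGIS code):

```python
def sphinx_anchor(name: str) -> str:
    # Sphinx turns "split with lines" into "split-with-lines"
    return name.lower().replace(" ", "-")

def qgis_anchor(name: str) -> str:
    # while QGIS collapses it into "splitwithlines"
    return name.lower().replace(" ", "")

# Single-word names agree, so their Help links resolve fully...
assert sphinx_anchor("clip") == qgis_anchor("clip")
# ...but multi-word names do not, so the jump to the anchor fails:
print(sphinx_anchor("split with lines"), "!=", qgis_anchor("split with lines"))
```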
We are really close to the solution.
So Processing Help system right now consists of:
QGIS algs -> documented
GDAL algs -> documented
GRASS -> documented (own docs)
Orfeo -> documented (own docs)
SAGA -> nothing documented
Thanks a lot to the QGIS grant programme for providing this chance to bring a big improvement to the Processing framework, even if not in a coding way!
Last but not least, we had another project that was not part of the grant programme but was also funded by QGIS.ORG in 2017:
Python API documentation – Denis Rouzaud
The QGIS Python API documentation is created using Sphinx, and this work is available on GitHub. The repo started as a fork of the QGIS one and has been merged in the meantime. The docs are available at qgis.org/pyqgis and use a new theme (sphinx_rtd_theme, aka the ReadTheDocs theme). Some of the improvements brought in (not exhaustive):
QGIS theming with colors and icon
Summary of methods and attributes for classes
Module index (not available before)
Correct display of overloaded methods
Full Python signature in Docstring
In former SIP versions, it was not possible to use the auto-generated signature if a Docstring already existed. This means no documented method could have a signature created. Unfortunately for this project, the vast majority of methods in the QGIS API are documented!
The source code of SIP was modified and these changes got merged upstream. See rev 1788 to 1793 in the SIP changelog; they will be released in the upcoming 4.19.7 version. The QGIS source code was modified accordingly to prepend auto-generated Python signatures to existing Docstrings. Using a CMake configuration file for each module (core.sip.in, gui.sip.in, etc.) was required to avoid syntax errors when using former versions of SIP (since bumping the minimum version is not realistic).
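In practice, “prepending auto-generated Python signatures to existing Docstrings” means a method's documentation now starts with its signature line. A hedged sketch of the idea (the method name and text are invented, not taken from the QGIS API):

```python
existing_doc = "Returns the opacity of the layer."
generated_signature = "opacity(self) -> float"

# The generated binding's docstring starts with the signature line,
# followed by a blank line and the hand-written documentation:
combined = generated_signature + "\n\n" + existing_doc

print(combined.splitlines()[0])  # opacity(self) -> float
```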
Many things were fixed in the sipify script:
Creation of links to classes, methods
Handling/fixing of Doxygen annotations \see, \note, \param
Handling of code snippets: C++ vs Python – only the Python ones are shown.
Thank you to everyone who participated and made this round of grants a great success, and thank you to all our sponsors and donors who make this initiative possible!
The gvSIG Desktop plugin for easily creating forms for field data gathering with gvSIG Mobile is now available.
Through this plugin we can create custom forms for censuses, surveys, inventories, inspections…, with the different types of fields that we want (drop-down, multi-selection, date, true-false…).
The main advantage of working with these forms is that we can create different sections, which cover different topics, and each of them with their customized forms.
We can also send the file containing these forms to the different teams that do the field work so that everyone works with the same types of data.
In order to use this new extension it must first be installed in gvSIG Desktop, and if we want to import the field data into gvSIG Desktop afterwards, we must also install the extension that allows us to do so.
In the following video we can see how to install both extensions, how to create the forms, how to take field data with them, and how to load this data into gvSIG Desktop later to analyze it:
More information about how to work with gvSIG Mobile in this post.
gvSIG Crime is the solution offered by the gvSIG Association to organize, analyze and maintain information related to security and crime. It is a platform adapted to the needs of each organization and territory, and it includes several components: an important web part that establishes the computing infrastructure needed to organize, share and access spatial information, with all types of spatio-temporal analysis tools, support for field tasks, etc., as well as components more oriented to advanced analysis, such as the desktop Geographic Information System (GIS).
It is precisely on this last part that this crime analysis course focuses. It is intended as an introduction to the use of gvSIG Desktop, an open source GIS, as fundamental software to optimize the analysis of criminal information.
The course does not intend to make an exhaustive tour of all the gvSIG Desktop tools; instead, through practical exercises, it invites students to explore the potential of its use in criminology.
The objectives of the course are:
Gain a better understanding of geospatial technology applied to crime mapping.
Perform queries based on attribute and location information to get accurate information.
Use crime databases to produce customized datasets and density maps, or for hot-spot analysis and other geoprocesses.
Learn basic programming concepts (scripting) to develop new analysis tools.
Gain a better understanding of Geostatistics applications for crime analysis.
The course starts with basic modules, in which the student learns to visualize spatial information, and increases in complexity, with the last modules showing the most advanced possibilities of gvSIG Desktop through the development of scripts in Python and R.
Here we present the links to the different modules of the course:
Course on GIS applied to municipal management (it contains a good number of video tutorials covering a large part of the tools available in gvSIG Desktop, so you can consult whichever one interests you specifically).
Users mailing list, where you can ask the gvSIG Community about the use of gvSIG Desktop.
Facebook group. We have created a new Facebook group for students of the course and anyone interested in the use of GIS in criminology.
Which symbology should I use in a land use map? We have the definitive solution.
To settle this question and make life easier for researchers all over Brazil, IBGE has published the third edition of the Manual Técnico de Uso da Terra (Technical Manual on Land Use). The manual offers a perspective attuned to contemporary issues: this new edition places land use studies in the evolutionary context of geographical thought, reflects on the most current concepts surrounding the topic – in particular its position in the context of economic globalization, environmental problems and the question of equity – and presents the Land Use Classification System for exploratory-level mapping.
The manual provides a table with the colors, in the PANTONE, CMYK and RGB systems, that should be used for mapping land cover and land use throughout Brazil, following the Land Use Classification System (SCUT). To make this symbology easier to use, I took the liberty of creating an .xml file with the color style data taken from the table of land cover and land use classes, Levels I and II.
This file, which can easily be imported into QGIS, carries the information for all the classes and their respective symbologies, adding the symbols to the software's style library.
Download the symbologies for maps of land cover and land use classes.
Installing a new symbology in QGIS is quite simple; just follow these steps:
Once the file is downloaded, open QGIS and go to Settings > Style Manager > find the Share button > Import.
Navigate to the folder containing the downloaded file “sistema_de_classificacao_de_uso_da_terra_ibge.xml”.
Once that is done, you will see all the new symbologies that will be added to the QGIS library. Click Select All and then Import.
Done! You now have all the land cover and land use classes and their respective symbologies in your QGIS; now just start mapping. While you're at it, also install the symbologies for pedological (soil) mapping.
As many of you know, in the gvSIG Association we have an open source software solution for the analysis and management of information related to crime and citizen security: gvSIG Crime. One of the technologies used in gvSIG Crime is gvSIG Online, which, among many other things, allows us to generate geoportals as simple or complex as we need.
Precisely with gvSIG Online we have created a simple map viewer that allows us to visualize the 50 most dangerous cities in the world based on the homicide rate in the last year (2017).
Without a doubt, it is striking that 47 of these 50 cities are located in the Americas, and 42 in Latin America.
The information presented is drawn from the latest report of the Citizen Council for Public Safety and Criminal Justice (CCSPJP – Consejo Ciudadano para la Seguridad Pública y la Justicia Penal), a Mexican civil organization that produces the list of the 50 most dangerous cities in the world every year. For this report, an international reference each year, the CCSPJP uses a simple methodology: comparing the number of homicides per 100,000 inhabitants. They include only cities that exceed 300,000 inhabitants and count only intentional homicides or deaths due to aggression. Areas where there is a war conflict are excluded.
The data is extracted by combining several sources: journalistic information, official lists of governments and local authorities, reports from international organizations and NGOs.
Through this geoportal you can visualize the cities, request information about each of them (number of homicides and annual homicide rate), consult the attribute table, and run searches.
Here you have the video about the map and how the geoportal works:
Finally, we take the opportunity to announce that we will soon publish a free course on gvSIG Desktop applied to criminology. Stay tuned if you are interested in this topic…
As you all know, a raster image can be represented using a ‘color table’. These color tables are generated from color ramps. gvSIG Online comes with a series of color ramps, but in certain cases we may want to create a new one. No problem: gvSIG Online lets us do it.
We show you how in a video tutorial. The video follows these steps:
We have a geoportal with a raster layer to which we want to apply a particular symbology (color table). Suppose none of the pre-existing color ramps matches what we want to apply.
We enter the administration area and see that our gvSIG Online instance has several symbol libraries containing a series of color ramps.
We add a new symbol library containing two new color ramps, specifying the colors that make up each ramp.
Finally, we apply a color table to our raster layer using the newly available color ramps.
As you can see, gvSIG Online grows in functionality day by day and is becoming a reference for implementing Spatial Data Infrastructures.
Module 6 of the free course gvSIG Applied to the Environment is now available. We continue working with raster data and climb one more step in the complexity and applications of gvSIG in environmental work.
We will learn to create a Digital Elevation Model from contour lines and run several very important and valuable geoprocesses on this DEM, such as visibility analysis, hydrological analysis, slope, aspect, etc.
We will also continue learning to handle satellite imagery by creating normalized vegetation indices, such as the NDVI, which is very useful in our work.
The recording of the webinar “Suite gvSIG: Herramientas al servicio del mundo” (gvSIG Suite: Tools at the service of the world), organized by UNIGIS, is now available for everyone who could not watch it live. A short registration is all that is needed to access the video.
The webinar presented the open source solutions of the gvSIG Suite as well as an introduction to the potential of geomatics in today's market. At the end of the webinar, all the questions and opinions raised by the attendees were answered.
Anyone who has had the chance to install GeoNode knows that setting it up means dealing with a whole stack of software components. PostgreSQL/PostGIS, Geoserver, Celery, RabbitMQ and ElasticSearch are the key elements of a complete GeoNode setup that makes it possible to exploit the full set of functionalities offered by the platform.
GeoNode provides a setup procedure (based on Paver) that simplifies the process for testing and development purposes, but a production-ready deployment requires more steps to configure and integrate the individual components to offer a reliable and robust setup.
Docker, Rancher and GeoNode
The past years have seen the rise of DevOps technologies that help a lot in managing deployment operations, including CI/CD pipelines, and in configuring services for scalability, security and availability. Linux containers, and Docker in particular, have won the trust of developers, DevOps engineers and sysadmins, providing widely adopted approaches for these tasks. A fast-growing ecosystem of tools and platforms has been built on top of it, from Docker Compose to Rancher, offering streamlined operations and services for building cloud infrastructures and distributed services.
GeoNode did not miss the DevOps train: a set of Docker images has been defined, along with a Compose configuration, to make it possible to deploy the whole GeoNode stack as a group of interlinked containers. The design of the GeoNode "dockerization" is still evolving thanks to a joint effort from various parties within the community, in which GeoSolutions is actively participating to offer its contribution and experience.
To meet the needs of our current scenarios, we have created a custom Docker and Rancher setup, specifically targeted at making multiple deployments of GeoNode as easy and quick as possible. The goal was to have a vanilla GeoNode setup one or two clicks away, leveraging single- or multi-host infrastructures while at the same time providing the ability to customize its look and feel without touching a line of code.
[caption id="attachment_3995" align="aligncenter" width="960"] GeoNode instance with Rancher and Docker[/caption]
The result was the "GeoNode Generic" project template for Docker Compose template v.2 and Rancher 1.6. It can be loaded directly from the Rancher UI by defining a custom catalog pointing to the project's GitHub repository.
This is a Django project partly based on the GeoNode project template, with dedicated Docker and Rancher template files from which new instances can be deployed and run with minimal effort. Define the host environment, answer a few (mostly optional) questions et voilà: here you have a fresh new GeoNode instance, with backup services and monitoring already enabled and running.
The stack comprises five containers running the PostgreSQL/PostGIS DB, Geoserver, the GeoNode Django app, the Nginx HTTP Server exposing the public endpoints of GeoNode, and one container dedicated to persistent data storage (uploaded data, GeoNode's static files, etc.).
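A Compose sketch of such a five-container stack might look like the following (service names, image names and paths are placeholders for illustration, not GeoNode's actual Compose files):

```yaml
version: "2"
services:
  db:
    image: example/postgis          # PostgreSQL/PostGIS database
  geoserver:
    image: example/geoserver
    depends_on: [db]
  django:
    image: example/geonode          # GeoNode Django app; also runs uWSGI crons
    depends_on: [db, geoserver]
  nginx:
    image: nginx                    # public HTTP endpoints of GeoNode
    ports: ["80:80"]
    depends_on: [django]
  data:
    image: busybox                  # persistent storage (uploads, static files)
    volumes:
      - geonode-data:/mnt/volumes
volumes:
  geonode-data: {}
```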
Cron schedules for backup and monitoring tasks (metrics collection) are run from the GeoNode Django app container, where uWSGI cron facilities are used for this purpose.
In this example only one container instance per image is running. The number of GeoNode and Nginx instances can be increased and put behind Rancher's load balancing services to provide basic HA. In the future, Kubernetes orchestration, instead of Cattle, will be considered to take advantage of its powerful tools for rolling release management and advanced HA and failover services.
Until a few days ago, applying some branding to GeoNode could only be achieved by customizing its CSS and Django template files. This remains true if deep customization is needed, but now you can define basic frontend styling thanks to the brand new (no pun intended) "themes" feature, developed by GeoSolutions and already available in the development version of GeoNode.
The "themes" configuration makes it possible to define multiple themes offering:
custom header background image
custom header title and message
custom color for header and background
contacts block in footer
copyright message in footer
A list of "partners" can also be defined, with their own logos. If partners are defined, a custom block is included before the footer, listing the logos, which can link to the partner websites.
[caption id="attachment_3997" align="aligncenter" width="1024"] Theme admin excerpt[/caption]
Combining GeoNode deployment through Rancher templates with theme customization, we obtained a fast workflow for our customers, allowing us to create new instances for their infrastructures at the speed of a click.
As you know, QGIS 3 has recently been published. This version introduced big changes in the code structure that, in addition to the new functionalities already exposed, makes our code base more modern and easier to expand and improve on in the future.
As a normal by-product of such a huge overhaul, these changes also triggered a series of new issues, that you, our users are helping to discover and document. Our objective is to eliminate the most important of these issues in time for what will be our next Long Term Release (LTR) – version 3.4. This release is scheduled for October 2018. The resources available from QGIS.ORG funds are limited, and we have already invested in QGIS 3.0 far more than we have done for any previous version.
Now is a great time for users, and particularly for power users, larger institutions and enterprises, to invest in QGIS bugfixing. You have a number of different options: donating your developers’ time or hiring a developer directly to resolve the bugs that annoy you most, sponsoring our foundation, or donating to QGIS.ORG.
Our targets are:
20k€ within 2018-05-18 (for 3.2)
40k€ within 2018-09-14 (for 3.4)
If you would like to help, feel free to contact us (preferably through the qgis-users or qgis-developers mailing list, or directly to firstname.lastname@example.org) for further details!
An ETL (Extract, Transform and Load) plugin has been developed for gvSIG Online that makes it possible to define a transformation process to import data such as CSV files and Excel spreadsheets as layers available in a Spatial Data Infrastructure. In this way, gvSIG Online provides tools that allow organizations running it to reformat and clean source data and load them into the spatial database of the SDI.
When certain information is routinely produced in formats such as those indicated (csv, xlsx), this functionality greatly simplifies data loading, without requiring users to turn to other applications to perform the transformation.
The following video demonstrates how it works. The steps are:
We upload a Microsoft Excel spreadsheet (xlsx) to gvSIG Online.
We create an empty layer with a series of fields.
We define the transformation to be applied; that is, we indicate how to fill that empty layer with the data contained in the spreadsheet.
We check that the transformation was successful in a geoportal.
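The idea behind the transformation step can be sketched in plain Python (column and field names are invented; gvSIG Online's actual ETL is configured through its web interface, not written as code):

```python
import csv
import io

# A made-up spreadsheet export with three columns, semicolon-separated.
source = io.StringIO("name;lon;lat\nWell A;-0.37;39.47\nWell B;-0.42;39.51\n")

# How columns of the source map onto fields of the empty target layer.
field_mapping = {"label": "name", "x": "lon", "y": "lat"}

features = []
for row in csv.DictReader(source, delimiter=";"):
    features.append({
        field: float(row[col]) if field in ("x", "y") else row[col]
        for field, col in field_mapping.items()
    })

print(features[0])  # {'label': 'Well A', 'x': -0.37, 'y': 39.47}
```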
The new release of the open-source project OpenMapTiles 3.8, which offers world maps based on OpenStreetMap, brings evolution rather than revolution. Still, a large number of small-scale changes in the transport layer, water, and other features noticeably improves the look of the map.
Navigating through a multi-level stacked interchange can be challenging, and a map that represents all the levels is very helpful. Therefore, we imported the so-called z-index into OpenMapTiles 3.8 and made a map style that visually guides you through all roads, even in the complicated crossings known as spaghetti junctions. The downside of this solution is size, which grows about fourfold compared to the same style without multilevel crossings. It increases the computing requirements for saving into a database, downloading, parsing, and rendering, and is therefore not switched on in the base styles. What stays turned on by default is another new attribute: bridge polygons.
Turn-by-turn navigation is easier with 83,000 imported one-way roads, and the flexibility to use different types of transportation has been extended by adding tube entrances and other stations.
The Gravelly Hill Interchange in Birmingham, England: the intersection first nicknamed Spaghetti Junction.
The water layer was enriched with layered features as well. With the new OpenMapTiles release, water bridges and functioning aqueducts are present in the data as well as in the styles. The same is true for underground watercourses: streams flowing through culverts or caves are now rendered as dashed lines, a common representation for these kinds of features.
Previous versions were missing labels for some larger water features, namely seas. Since version 3.8, there are labels for the Caribbean Sea, Mediterranean Sea, Caspian Sea, Sea of Japan, Celtic Sea, and Chukchi Sea. Larger bays such as the Bay of Biscay and the Gulf of California are now displayed as well.
There is a culvert under the famous Nieuwmarkt square in Amsterdam, the Netherlands.
One of the significant features that made it into OpenMapTiles 3.8 is sand and beach in the landcover layer. It is also now possible to distinguish religions in places of worship. Some people were confused by borders around leased territories, so these are now switched off by default in the styles. This is the case for the Guantanamo Bay Naval Base and the Baikonur Cosmodrome; the latter is leased to the Russian government until 2050 but is located in the heart of neighboring Kazakhstan. The leased area has an oval shape measuring 85 by 90 kilometers.
From Wikipedia: The spaceport is currently leased by the Kazakh Government to Russia until 2050, and is managed jointly by the Roscosmos State Corporation and the Russian Aerospace Forces.
Last Thursday (26 April) the first NLExtract Hacking Day took place, more than six years after the current GitHub repo was created and almost seven years after BAGExtract+ was put online by Matthijs van der Deijl. Turnout was very good: 14 men and one woman, including the full teams of both Webmapper and Geogap. A varied group, but with an emphasis on developers. We were hospitably hosted by Webmapper at the Social Impact Factory in Utrecht. Lunch and drinks were financed from the prize money NLExtract won with the OGT Award in 2016.
Improvements and wishes
Since this was the first time such a large group had gathered for NLExtract, it seemed like a good idea to take stock of the most important wishes and improvements for NLExtract. These turned out to be, in no particular order:
User interface: requested mainly by non-developers;
Documentation: always neglected;
Unit tests / continuous integration: needed to produce stable and reproducible deliverables;
GeoPackage output: simpler to use than PostGIS dumps;
Vector tiles: convert the base registries (basisregistraties) directly to this format;
Community building: how do you keep a larger group of people involved in NLExtract;
Docker: easier to use than installing all the dependencies yourself.
The vector tiles team
Various people picked up different topics, such as Docker, vector tiles, improving the website, quality analysis of name spellings, etc. A few people also dug into how to use NLExtract itself. It can be hard to get running, especially on Windows. This is largely due to the various dependencies (PostgreSQL, PostGIS, GDAL, Python, several Python libraries), whose installation procedures keep changing and which partly depend on each other.
At the end of the day, the following progress was reported:
Quality analysis: several scripts were written for analyzing BGT names and BAG addresses.
Using NLExtract: those who wanted to learn more about using NLExtract came away wiser.
GeoPackage: part of the BGT of Schiermonnikoog and surroundings was loaded into GeoPackage. Not all information turned out to be loaded yet (address designations for buildings, crest lines).
Docker: the NLExtract repo was cleaned up in preparation for Dockerization. Items unrelated to the base registries were moved out.
Website: the website was updated and better organized.
Vector tiles: several tools (Tegola, Tileserver GL, T-Rex) were analyzed and tested for their suitability to convert base registry data to vector tiles. This resulted in a demo with cadastral information.
Hard at work on NLExtract
We look back on a very successful day during which NLExtract was worked on with great energy and passion. We plan to organize more days like this in the future.
The recording of the webinar “Mobile GIS applications with Geopaparazzi/gvSIG Mobile” is now available, organized by the gvSIG Association and GeoAlternativa within the framework of OSGeo's Geo for All initiative, Ibero-America group.
If you couldn't follow it live, you can now watch this presentation, which shows what the Geopaparazzi/gvSIG Mobile tool is and how to use it. One of the new features shown is the functionality to generate custom forms for field data collection, one of the ways gvSIG Mobile integrates with gvSIG Desktop.
The program of the Jornadas de SIG Libre in Girona has been available for a few days. Attendees will have the opportunity to learn about all the new features of the gvSIG Suite thanks to the three talks taking place on June 7:
We are going to show you more gvSIG Online tools, the platform for Spatial Data Infrastructures that keeps on growing. In this case we are going to talk about the multiple legends tool, a really useful functionality when we want to show the same layer with different legends in a geoportal. You have probably run into this problem more than once; gvSIG Online solves it easily.
The steps to be followed for this are shown in the video:
From the dashboard, several legends are created for the same layer: one with a single symbol, another with unique values, and another with point clustering.
In a geoportal created with gvSIG Online, the table of contents lets us select the different legends available for the layer, and the chosen one is applied automatically.
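Since gvSIG Online publishes layers through standard OGC services, switching legends from a client boils down to changing the STYLES parameter of a WMS GetMap request. The endpoint, layer name, and style names below are made up for illustration; they are not gvSIG Online's actual identifiers.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, style):
    """Build a WMS 1.1.1 GetMap URL requesting a layer with a given style."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": style,
        "SRS": "EPSG:4326", "BBOX": "-1,38,1,40",
        "WIDTH": "512", "HEIGHT": "512", "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and style names matching the three legends
# from the demo: single symbol, unique values, and point clustering.
base = "https://example.org/ows/wms"
for style in ("single_symbol", "unique_values", "point_cluster"):
    print(getmap_url(base, "sdi:shops", style))
```

The geoportal's table of contents is, in effect, issuing requests like these when you pick a different legend.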
Improved bounding box reporting in WMS GetCapabilities, allowing more entries when “Output bounding box for every supported CRS” is selected. Bounding boxes are now returned for layer groups as well.
NetCDF/GRIB has been improved with a new setting to copy global attributes over when generating NetCDF output.
WFS 2.0 fix for the interaction between startIndex and the total feature count.
CQL filters can now be used with the WMS vector tile output format
GeoPackage WPS output corrected to generate y coordinates from the bottom left.
The user interface for editing workspace details now checks for conflicts with the name or namespace URI.
GML 3.2 output can now limit the number of decimals used for coordinate output
REST management of styles now supports defining a style using POST for CSS, YSLD and MapBox styles (previously this only worked for SLD)
WPS output error handling does a better job of reporting when async output format parameters are incorrect.
WPS improvements have also been made to the cleanup of temporary folders and to the “raw data encoder” output (often used for image generation).
The demo request page does a better job of reporting authorization failures and correctly sends credentials when testing service security.
GetLegendGraphic fixes to correct line thickness and ensure polygons and points are not cut off.
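One of the changes above is REST style creation via POST for CSS, YSLD and MapBox styles. A minimal sketch of what such a request looks like, using only the standard library: the MIME type for CSS styles is an assumption here (check the GeoServer REST API docs for the exact value), and the host and credentials are the defaults used purely for illustration.

```python
import base64
import urllib.request

# A tiny CSS style body; real styles would be read from a file.
css_body = b"* { stroke: #ff0000; }"

def build_style_request(base_url, name, body, content_type, user, password):
    """Prepare (but do not send) a POST that creates a new style
    under /rest/styles, authenticated with HTTP Basic auth."""
    req = urllib.request.Request(
        f"{base_url}/rest/styles?name={name}",
        data=body,
        method="POST",
    )
    req.add_header("Content-Type", content_type)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

req = build_style_request(
    "http://localhost:8080/geoserver", "red_lines", css_body,
    "application/vnd.geoserver.geocss+css",  # assumed MIME type for CSS
    "admin", "geoserver",
)
# urllib.request.urlopen(req) would send it to a live server
print(req.get_method(), req.full_url)
```

Previously only SLD bodies could be POSTed this way; the new release accepts the other style languages through the same endpoint.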
For developers building from source our community modules remain a great place to collaborate on new functionality and improvements. This month Nuno Oliveira has added a new community module for the GeoTools MongoDB datastore.
Please update your production instances of GeoServer to receive the latest security updates and fixes.
If you encounter a security vulnerability in GeoServer, or any other open source software, please take care to report the issue in a responsible fashion.