Welcome to Planet OSGeo

March 26, 2017


PGConfUS 2017 Getting Stuff done in PostGIS

A reminder: the PGConfUS 2017 conference is just days away, and we'll be giving a training on March 28th, 2017 in Jersey City, NJ at 1 PM. If you are coming, keep an eye on the page PGConf 2017 US Getting Stuff done with PostGIS materials.

If you haven't signed up already, there are still spots available; make sure to buy your tickets at http://pgconf.us/conferences/2017#registration.

Continue reading "PGConfUS 2017 Getting Stuff done in PostGIS"

by Regina Obe (nospam@example.com) at March 26, 2017 04:42 AM

OSGeo News

The 2nd gvSIG Festival is underway!

by jsanz at March 26, 2017 12:15 AM

OSGeo News

Registration for STDM Code Sprint 2017 opened

by jsanz at March 26, 2017 12:09 AM

March 24, 2017


Esri isn’t evil

Yup, you read it here: an open source advocate said that the leading proprietary GI vendor is not evil. Let me go one step further and say that I think Esri are pretty damn good. That is not to say there are no aspects of their business model and practices that I would criticise, but more of that in another post.

Come to the Dark Side.

So what has prompted this outburst from an open source advocate? A mail from Suchith Anand to the OSGeo Discuss mailing list entitled “Is it possible for properitery GIS vendor to market thier properitery product as Open ?” (sic).

Hi all,

I have a query. If a properitery GIS vendor starts marketing thier properitery products as Open platform and software then what rights do the organisations and customers have who are mislead buying the  properitery software thinking it is open have ?  The definision of Proprietary software is very clearly defined, so  how can it be possible for any properitery GIS vendor to market their  software knowingly as open platform if it is properitery?

This also greatly affects the business and revenues of true open source software companies .  Who is responsible for any misleading marketing that results in losses to both customers who are mislead to buy the properitery software thinking it is open  and also to other companies who do true open source business who lose out on the business opportunities? Is it right business ethics to do this?

Best wishes,

This has prompted several thoughts which I want to share in a wider format than the OSGeo mailing list (I will post this blog back to the list), partly because the list tends to be a bubble of OSGeo geeks and activists, and I want to share these thoughts with less committed users or those who have not yet chosen open source.

I am going to try to deconstruct the propositions of the initial mail (others, particularly Jody Garnett, have also expressed contrary views in the mail thread).

“a properitery GIS vendor”: I am pretty sure that this unnamed vendor is Esri Inc and that the mail was prompted by Esri’s Open Vision, which would understandably cause a few ruffled feathers and raised eyebrows amongst open source advocates.

The argument seems to be that proprietary software cannot be open; in my opinion that is patently incorrect. We use the word ‘open’ with regard to software and services in numerous different ways, e.g. Open Source, Open Standards, Open Data, Open Access, and some people even confuse or conflate open with free. The open source community does not have a copyright or any other ownership of the word ‘open’ and cannot claim to be the arbiter of who can use the adjective or in what context. As a community we cannot even agree amongst ourselves on the differences between ‘open’, ‘free’ and ‘libre’ (watch this very good presentation on Free, Open & Libre by María Arias de Reyna).

Furthermore, the use of ‘open’ in Open Standards is specifically intended to encourage all (proprietary or open source) to support standards that enable data sharing, as the OGC’s welcome says:

The OGC (Open Geospatial Consortium) is an international not for profit organization committed to making quality open standards for the global geospatial community. These standards are made through a consensus process and are freely available for anyone to use to improve sharing of the world’s geospatial data.

Our members come from government, commercial organizations, NGOs, academic and research organizations.

Open Standards facilitate interoperability and limit ‘vendor lock-in’, which would hardly work without major proprietary vendors supporting the OGC and its standards. ‘Open’ is not the exclusive terminology of the open source community. Whilst we may not like it, proprietary software can be described as ‘open’ even if the source code is not freely available.

And let’s be honest about this: apart from a very small number of users, who really wants access to source code or has the ability to fix or enhance the code? For many users of open source software the cost of ownership is a major factor, and in that case the ‘free’ bit is probably more important than the ‘open’ bit (although the cost of ownership is much more than the license cost).

“If a properitery GIS vendor starts marketing thier properitery products as Open platform and software then what rights do the organisations and customers have who are mislead buying the properitery software thinking it is open have?” Most businesses marketing software (regardless of whether it is open source or proprietary) consider that their potential clients have made a mistake if they eventually succumb to the sales patter of a competitor; we all think we have the ‘best’ offer (specification, quality, value). I doubt that many organisations that purchase Esri software consider themselves to have been misled by its marketing materials about openness.

Sometimes the open source community (and I include myself in this mea culpa) can position itself on the side of the angels in a battle between good and evil. That is bullshit! We deliver solutions to customers’ needs that hopefully enable them to harness the power of geography to manage assets, make better decisions and inform stakeholders. The choice between proprietary and open source software is made by informed buyers on the basis of what is best for their organisation in terms of functionality, service, lifetime cost and a whole range of other criteria. Sometimes proprietary solutions are the best answer, and sometimes (increasingly so, in my opinion) open source represents the most scalable and cost effective solution. The geo market is shifting rapidly from enterprise software implementations to services in which the underlying technology is less visible and less important to the buyer.

Rather than trying to wrestle over the usage of ‘open’ by Esri, open source businesses could learn a fair bit from the effectiveness of Esri’s marketing in building such a successful business that dominates our market. Along the way, whilst making profits (not a dirty word), Esri and in particular Jack Dangermond have done an enormous amount to promote the use of GI, provided employment for thousands of talented people and created the GIS category, which has enabled many other businesses to prosper. They also give OSGeo a nice big target to aim at!



by steven at March 24, 2017 04:42 PM

gvSIG Team

The 2nd gvSIG Festival is underway!

After the success of last year's first gvSIG Festival, the virtual gvSIG conference, a second edition will be held this year on May 16th and 17th. The novelty this year is that users can send proposals about projects done with gvSIG, which makes this initiative an even more global and open event.

The event is free of charge and completely online, through the webinar service of the gvSIG Association, which makes it possible to have speakers from different countries, with talks that users and developers from any part of the world can follow.

If you have done a project with gvSIG and you want to present it at the gvSIG Festival, you can send a summary explaining the project to the following e-mail address: conference-contact@gvsig.com. The summary must be no more than 300 words, written in Spanish or English, and must indicate the title and the language of the presentation.

Once the program is put together, it will be published on the event website and registration will be opened.

We expect your proposals!

Filed under: community, english, events, training Tagged: gvSIG Festival

by Mario at March 24, 2017 09:48 AM

gvSIG Team

The 2nd gvSIG Festival is underway!

After the success of last year's first gvSIG Festival, the virtual gvSIG conference, a second edition will be held this year on May 16th and 17th. As a novelty, this year users will be able to send proposals for projects carried out with the application, which makes this initiative an even more global and open event.

The event is free of charge and completely online, through the webinar service of the gvSIG Association, which makes it possible to have speakers from different countries, with talks that can be followed by users from any part of the world.

If you have carried out a project with gvSIG and want to present it at the gvSIG Festival, you can send a summary explaining it to conference-contact@gvsig.com. The summary must be in Spanish or English, at most 300 words, and must indicate the title of the talk and the language in which it would be given.

Once the program is put together, it will be published on the event website and registration will open for each presentation.

We look forward to your proposals!

Filed under: community, events, spanish, training Tagged: gvSIG Festival

by Mario at March 24, 2017 09:44 AM

March 23, 2017

Tom Kralidis

pygeometa: new release, hello YAML

Metadata should be automagic and low barrier. pygeometa is a handy little metadata generator tool which is flexible, extensible, and composable. From the command line, or via the API, users can work with config files, or pass plain old Python dicts, ConfigParser objects, etc. We’ve just released 0.2.0, which supports WMO Core Metadata Profile output, as well as […]

by tomkralidis at March 23, 2017 08:13 PM

gvSIG Team

Learn to work with Digital Terrain Models and raster geoprocessing with this video tutorial

Complementing the post in which we showed you the secrets of vector geoprocessing, today we will see how some of the algorithms available in gvSIG Desktop can be applied to raster layers in general, and to Digital Terrain Models in particular.

Of the more than 350 geoprocesses available in gvSIG, a good number of them can be applied to raster layers, allowing us to perform all kinds of calculations that are especially useful in scientific disciplines such as hydrology.

Through a series of practical exercises that you can replicate at home, this video tutorial will show you in a few minutes how to perform raster geoprocessing. Keep reading…

Filed under: gvSIG Desktop, spanish Tagged: geoprocesamiento, hidrología, MDT, Modelo Digital del Terreno, raster

by Alvaro at March 23, 2017 09:19 AM

March 22, 2017

Jackie Ng

React-ing to the need for a modern MapGuide viewer (Part 15): Play with it on docker

Today, I found a very interesting website through the tech grapevine:


What is this site? It is Play with Docker (PWD), an interactive Docker playground. If you've ever used sites like JSFiddle to try out snippets of JS/HTML/CSS, this is basically the equivalent for trying out Docker environments.

With PWD, I now have a dead simple way for anyone who wants to try out this viewer: spin up a demo MapGuide instance on PWD and check it out for themselves.

Once you've proved to the site that you are indeed a human and not a robot, you will enter the PWD console. From here, click + ADD NEW INSTANCE to start a new shell.

Then run the following commands to build the demo docker image and spin up the container:

git clone https://github.com/jumpinjackie/mapguide-react-layout
cd mapguide-react-layout

After a few minutes, you should see a port number appear beside the IP address.

This is a link to the default Apache httpd page, confirming that the demo container is serving web content to the outside world.

Now simply append /mapguide/index.php to that URL to access the demo landing page for this viewer. Pick any template on the list to load the viewer using that template.

You now have a live demo MapGuide Server with mapguide-react-layout (and the Sheboygan dataset) preloaded for you to play with to your heart's content for the next 4 hours, after which PWD will terminate your session and all the docker images/containers/etc that you created with it.

This was just one use case that I thought up in 5 minutes after discovering this awesome site! I'm sure there are plenty of other creative uses for a site like this.

Many thanks to brucepc for his MGOS 3.1 docker image, on which the demo image is based.

by Jackie Ng (noreply@blogger.com) at March 22, 2017 04:22 PM

Jackie Ng

gRPC is very interesting

MapGuide in its current form is a whole bucket of assorted libraries and technologies:
  • We use FDO for spatial data access
  • We use ACE (Adaptive Communication Environment) for:
    • Basic multi-threading primitives like mutexes, threads, etc
    • TCP/IP communication between the Web Tier and the Server Tier
    • Implementing a custom RPC layer on top of TCP/IP sockets. All of the service layer methods you use in the MapGuide API? They're all basically RPC calls sent over TCP/IP for the MapGuide Server to invoke its server-side equivalent. Most of the other classes that you pass into these service methods are essentially messages that are serialized/deserialized through the TCP/IP sockets. When you think about it, the MapGuide Web API is merely an RPC client for the MapGuide Server, which itself is an RPC server that does the actual work
  • We use Berkeley DBXML for the storage of all our XML-based resources
    • We have an object-oriented subset of these resource types (Feature Sources, Layer Definitions, Map Definitions, Symbol Definitions) in the MdfModel library with XML serialization/parsing code in the MdfParser library
    • Our Rendering and Stylization Engine work off of these MdfModel classes to render the maps that you see on your viewer
  • We use Xerces for reading/writing XML in and out of DBXML
  • We use a custom modified (and somewhat ancient) version of SWIG to generate wrappers for our RPC client so that you can talk to the MapGuide Server in:
    • .net
    • Java
    • PHP
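To make the custom RPC layer described above concrete, here is a stdlib-only Python sketch of the general pattern: a client serializes a method call over a TCP socket, and a server deserializes it, dispatches it to an implementation, and sends the result back. JSON and the "Add" method are purely illustrative stand-ins, not MapGuide's actual wire format or API.

```python
# Conceptual sketch of an RPC layer over TCP/IP sockets, in the spirit of
# what the MapGuide Web Tier does when it forwards service calls to the
# Server Tier. JSON stands in for the real binary serialization; the
# method name "Add" is hypothetical.
import json
import socket
import threading

HOST = "127.0.0.1"

def serve():
    """Start a toy 'Server Tier' on a free port; return the port number."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, 0))  # port 0: let the OS pick a free port
    srv.listen(5)
    handlers = {"Add": lambda params: sum(params)}  # the "service layer"

    def loop():
        while True:
            conn, _ = srv.accept()
            with conn:
                request = json.loads(conn.recv(4096).decode())
                result = handlers[request["method"]](request["params"])
                conn.sendall(json.dumps({"result": result}).encode())

    threading.Thread(target=loop, daemon=True).start()
    return srv.getsockname()[1]

def call(port, method, params):
    """The 'Web API' proxy: every service call is really a message over TCP."""
    with socket.create_connection((HOST, port)) as conn:
        conn.sendall(json.dumps({"method": method, "params": params}).encode())
        return json.loads(conn.recv(4096).decode())["result"]

port = serve()
print(call(port, "Add", [1, 2, 3]))  # prints 6
```

The point of the sketch is that the "API" the client sees is just a thin proxy over serialized messages, which is exactly the role ACE plays for MapGuide today and the role gRPC could take over.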
So why do I mention all of this?

I mention this, because I've recently been checking out gRPC, a cross-platform, cross-language RPC framework from Google.

And from what I've seen so far, gRPC could easily replace and simplify most of the technology stack we're currently using for MapGuide:
  • ACE? gRPC is the RPC framework! The only reason we'd keep ACE around would be for its multi-threading facilities, but the C++ standard library at this point would be adequate to replace that as well
  • DBXML/MdfModel/xerces? gRPC is driven by Google Protocol Buffers.
    • Protobuf messages are strongly typed classes that serialize/deserialize into compact binary streams and are more efficient and faster than slinging around XML. Ever bemoan the fact that you currently have to work with XML to manipulate maps/layers/etc? In .net you are reprieved if you use the Maestro API (where we provide strongly-typed classes for all the resource XML types), but for the other languages you have to figure out how to use the XML APIs/services provided by Java/PHP to work with the XML blobs that the MapGuide API gives and expects. With protobuf, you have none of these problems.
    • Protobuf messages can evolve in a backward-compatible manner
    • Because protobuf messages are already strongly-typed classes, it makes MdfModel/MdfParser redundant if you get the Rendering/Stylization engine to work against protobuf messages for maps/layers/symbols/styles/etc
    • If we ever wanted to add support for Mapbox Vector Tiles (which seems to be the de-facto vector tile format), well the spec is protobuf-based so ...
    • Protobuf would mean we no longer deal in XML, so we don't need Xerces for reading/writing XML, and DBXML as the storage database (and all its cryptic error messages that can bubble up from the Resource Service APIs) can be replaced with something simpler. We may not even need a database at this point. Dumping protobuf messages to a structured file system could probably be a simpler solution
  • SWIG? gRPC and protobuf can already generate service stubs and protobuf message classes in the languages we currently target:
    • .net
    • Java
    • PHP
    • And if we wanted, we can also instantly generate a gRPC-based MapGuide API for:
      • node.js
      • Ruby
      • Python
      • C++
      • Android Java
      • Objective-C
      • Go
    • The best thing about this? All of this generated code is portable in their respective platforms and doesn't involve native code interop through "flattened" interfaces of C code wrapping the original C++ code, which is what SWIG ultimately does for any language we want to generate wrapper bindings out of. If it does involve native code interop, it's a concern that is taken care of by the respective gRPC/protobuf implementation for that language.
  • Combine a gRPC-based MapGuide Server with grpc-gateway and we'd have an instant REST API to easily build a client-side map viewer out of
  • gRPC works at a scale that is way beyond what we can currently achieve with MapGuide. After all, this is what Google uses themselves for building their various services
If what I said above doesn't make much sense, consider a practical example.

Say we had our Feature Service (which, as a user of the MapGuide API, you should be familiar with) as a gRPC service definition:

// Message definitions for the request/response types below are omitted for brevity, but basically every request and
// response type mentioned below will have equivalent protobuf message classes automatically generated along with
// the service

// Provides an abstraction layer for the storage and retrieval of feature data in a technology-independent way.
// The API lets you determine what storage technologies are available and what capabilities they have. Access
// to the storage technology is modeled as a connection. For example, you can connect to a file and do simple
// insertions or connect to a relational database and do transaction-based operations.
service FeatureService {
  // Creates or updates a feature schema within the specified feature source.
  // For this method to actually delete any schema elements, the matching elements
  // in the input schema must be marked for deletion
  rpc ApplySchema (ApplySchemaRequest) returns (BasicResponse);
  rpc BeginTransaction (BeginTransactionRequest) returns (BeginTransactionResponse);
  // Creates a feature source in the repository identified by the specified resource
  // identifier, using the given feature source parameters.
  rpc CreateFeatureSource (CreateFeatureSourceRequest) returns (BasicResponse);
  rpc DeleteFeatures (DeleteFeaturesRequest) returns (DeleteFeaturesResponse);
  // Gets the definitions of one or more schemas contained in the feature source for particular classes.
  // If the specified schema name or a class name does not exist, this method will throw an exception.
  rpc DescribeSchema (DescribeSchemaRequest) returns (DescribeSchemaResponse);
  // This method enumerates all the providers and if they are FDO enabled for the specified provider and partial connection string.
  rpc EnumerateDataStores (EnumerateDataStoresRequest) returns (EnumerateDataStoresResponse);
  // Executes SQL statements NOT including SELECT statements.
  rpc ExecuteSqlNonQuery (ExecuteSqlNonQueryRequest) returns (ExecuteSqlNonQueryResponse);
  // Executes the SQL SELECT statement on the specified feature source.
  rpc ExecuteSqlQuery (ExecuteSqlQueryRequest) returns (stream DataRecord);
  // Gets the capabilities of an FDO Provider
  rpc GetCapabilities (GetCapabilitiesRequest) returns (GetCapabilitiesResponse);
  // Gets the class definition for the specified class
  rpc GetClassDefinition (GetClassDefinitionRequest) returns (GetClassDefinitionResponse);
  // Gets a list of the names of all classes available within a specified schema
  rpc GetClasses (GetClassesRequest) returns (GetClassesResponse);
  // Gets a set of connection values that are used to make connections to an FDO provider that permits multiple connections.
  rpc GetConnectionPropertyValues (GetConnectionPropertyValuesRequest) returns (GetConnectionPropertyValuesResponse);
  // Gets a list of the available FDO providers together with other information such as the names of the connection properties for each provider
  rpc GetFeatureProviders (GetFeatureProvidersRequest) returns (GetFeatureProvidersResponse);
  // Gets the locked features.
  rpc GetLockedFeatures (GetLockedFeaturesRequest) returns (stream FeatureRecord);
  // Gets all available long transactions for the provider
  rpc GetLongTransactions (GetLongTransactionsRequest) returns (GetLongTransactionsResponse);
  // This method returns all of the logical to physical schema mappings for the specified provider and partial connection string
  rpc GetSchemaMapping (GetSchemaMappingRequest) returns (GetSchemaMappingResponse);
  // Gets a list of the names of all of the schemas available in the feature source
  rpc GetSchemas (GetSchemasRequest) returns (GetSchemasResponse);
  // Gets all of the spatial contexts available in the feature source
  rpc GetSpatialContexts (GetSpatialContextsRequest) returns (GetSpatialContextsResponse);
  // Inserts a new feature into the specified feature class of the specified Feature Source
  rpc InsertFeatures (InsertFeaturesRequest) returns (stream FeatureRecord);
  // Selects groups of features from a feature source and applies filters to each of the groups according to the criteria set in the aggregate query option supplied
  rpc SelectAggregate (SelectAggregateRequest) returns (stream DataRecord);
  // Selects features from a feature source according to the criteria set in the query options provided
  rpc SelectFeatures (SelectFeaturesRequest) returns (stream FeatureRecord);
  // Set the active long transaction name for a feature source
  rpc SetLongTransaction (SetLongTransactionRequest) returns (BasicResponse);
  // Connects to the Feature Provider specified in the connection string
  rpc TestConnection (TestConnectionRequest) returns (TestConnectionResponse);
  // Executes commands contained in the given command set
  rpc UpdateFeatures (UpdateFeaturesRequest) returns (UpdateFeaturesResponse);
  // Updates all features that match the given filter with the specified property values
  rpc UpdateMatchingFeatures (UpdateMatchingFeaturesRequest) returns (UpdateMatchingFeaturesResponse);
}

Running this service definition through the protoc compiler with the gRPC plugin gives us:
  • Auto-generated (and strongly-typed) protobuf classes for all the messages. ie: The request and response types for this service
  • An auto-generated FeatureService gRPC client ready to use in the language of our choice
  • An auto-generated gRPC server stub for FeatureService in the language of our choice, ready for us to "fill in the blanks". For practical purposes, we'd generate this part in C++ and fill in the blanks by mapping the various service operations to their respective FDO APIs and mapping their return values to our gRPC responses.
And at this point, we'd just need a simple C++ console program that bootstraps gRPC/FDO, registers our gRPC service implementation and starts the gRPC server on a particular port, and we'd have a functional Feature Service implementation in gRPC. Our auto-generated Feature Service client can connect to this host and port and immediately start talking to it.

The only real work is the "filling in the blanks" on the server part. Everything else is taken care of for us.

Extrapolate this to the rest of our services (Resource, Rendering, etc) and we basically have a gRPC-based MapGuide Server.

Filling in the blanks is a conceptually simple exercise as well:
  • Feature Service - Pass down to the APIs in FDO.
  • Rendering Service - Set up FDO queries based on the map/layers visible and pass query results to the Rendering/Stylization engine.
  • Resource Service - Read/write protobuf resources to some kind of persistent storage. It doesn't have to be something complex like DBXML, it can be as simple as a file system (that's what mg-desktop does for its resource service implementation btw)
  • Tile Service - It's just like the rendering service, but you're asking the Rendering/Stylization engine to render tile-sized content.
  • KML Service - Just like rendering service, but you're asking the Rendering/Stylization engine to render KML documents instead of images.
  • Drawing Service - Do we still care about DWF support? Well if we have to support this, it's just passing down to the APIs in DWF Toolkit.
  • Mapping Service - It's a mish-mash of tapping into the Rendering/Stylization engine and/or the DWF Toolkit.
  • Profiling Service - Just tap into whatever tracing/instrumentation APIs provided by gRPC.
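The Resource Service point above ("it can be as simple as a file system") can be sketched in a few lines. This is a hypothetical illustration: JSON stands in for serialized protobuf messages to keep the sketch stdlib-only, and the FileResourceStore class and its id-to-path mapping are made up for the example, not MapGuide's actual scheme.

```python
# Sketch of a resource service backed by a plain file system: resources
# keyed by an identifier are serialized to files under a root directory,
# with no database in between. The Library:// id style mimics MapGuide's
# resource identifiers, but the layout here is illustrative only.
import json
import tempfile
from pathlib import Path

class FileResourceStore:
    def __init__(self, root):
        self.root = Path(root)

    def _path(self, resource_id):
        # "Library://Maps/Demo.MapDefinition" -> <root>/Maps/Demo.MapDefinition
        return self.root / resource_id.replace("Library://", "", 1)

    def set_resource(self, resource_id, message):
        path = self._path(resource_id)
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(json.dumps(message))

    def get_resource(self, resource_id):
        return json.loads(self._path(resource_id).read_text())

store = FileResourceStore(tempfile.mkdtemp())
store.set_resource("Library://Maps/Demo.MapDefinition",
                   {"name": "Demo", "layers": ["Parcels", "Roads"]})
print(store.get_resource("Library://Maps/Demo.MapDefinition")["name"])  # prints Demo
```

This is essentially what mg-desktop already does for its resource service implementation, just with XML blobs instead of binary messages.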
Now, because gRPC is cross-language, nothing says we have to use C++ for all the service implementations; it's just that most of the libraries and APIs we'd be mapping into are already in C++, so in practical terms we'd stick to the same language as well.

Front this with grpc-gateway, and we basically have our RESTful mapagent to build a map viewer against.

There's still a few unknowns:
  • How do we model file uploads/downloads?
  • Can server-side service implementations call other services?
Google's vast set of gRPC definitions for their various web services can answer the first part. The other part will need more playing around.
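On the first unknown, the usual gRPC pattern is to model uploads and downloads as client- or server-streaming RPCs that carry byte chunks (e.g. a hypothetical "rpc GetResourceData(...) returns (stream DataChunk)"). A stdlib-only Python sketch of the chunking idea, with an illustrative message shape:

```python
# Sketch: modeling a file download as a stream of fixed-size chunk
# "messages", the way a gRPC server-streaming RPC would deliver it.
# The chunk size and dict shape are assumptions for illustration.
CHUNK_SIZE = 64 * 1024

def stream_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Server side: yield the payload as successive chunk messages."""
    for offset in range(0, len(data), chunk_size):
        yield {"offset": offset, "data": data[offset:offset + chunk_size]}

def receive(chunks):
    """Client side: reassemble the streamed chunks into the full payload."""
    return b"".join(chunk["data"] for chunk in chunks)

payload = b"x" * 200_000
chunks = list(stream_chunks(payload))
print(len(chunks))  # prints 4 (200,000 bytes in 64 KiB chunks)
assert receive(chunks) == payload
```

Uploads are the mirror image: a client-streaming RPC where the server reassembles the chunks it receives.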

The thought of a gRPC-based MapGuide Server is very exciting!

by Jackie Ng (noreply@blogger.com) at March 22, 2017 04:19 PM

March 21, 2017

gvSIG Team

gvSIG nominated in 3 categories of the “Open Awards”. Public voting now open.

The gvSIG project has once again seen its commitment to developing free geomatics solutions recognised, with nominations in 3 categories of the “Open Awards”.

As stated on the awards website, their aim is to publicly recognise companies, administrations, public figures and communities that create, support and promote great solutions with Open Source technologies and Free Software.

The Open Awards recognise and reward the open source projects and initiatives that have stood out most over the last year, boost the communication and public visibility of the participating companies, projects and administrations, and value the work done by all of them.

The 3 categories in which gvSIG takes part reflect, to some extent, the broad scope of the project. It is nominated for “Best service/solution provider”, where the gvSIG Association is showing that new economic models of software production can be set up from collaborative perspectives; for “Most innovative platform/project”, a category focused on the technical-scientific side of the project, where the gvSIG Suite stands as the comprehensive platform to meet the “geo” needs of any organisation, with desktop, mobile and web applications following the philosophy of Spatial Data Infrastructures; and finally the category we are most excited to take part in, which is really a recognition of all of you who, day by day, from your corner, support and help gvSIG: the nomination for “Best technology community”.

The awards have a first phase of public voting, from which a top five will emerge. From there, a jury will determine the winners in each category.

What steps do you need to follow to vote?

  1. Vote in the categories you see fit (from today until April 30th).
  2. Confirm your vote by email.

As an additional incentive, voting gives you exclusive access to the eBook “OpenExpo Tendencias Open Source y Software Libre 2017”.

The direct links to the categories in which gvSIG is nominated are:

We thank you all in advance for your support, and rest assured that we will also vote for you as “Best technology community”. 🙂

Filed under: premios, press office, software libre, spanish Tagged: Open Awards, OpenExpo

by Alvaro at March 21, 2017 03:45 PM


GeoServer Code Sprint needs you


Dear Reader,

everything is ready at GeoSolutions for next week's GeoServer code sprint, which will take place in our offices during the week of March 27th.


The main focus will be on refactoring GeoServer's REST API towards a more modern approach (see this page for some insights). A number of GeoServer developers from various organizations will gather from all over the world for this work, and your support in funding this initiative would help us with the expenses; we are therefore asking all our readers for help. Sponsorship opportunities to contribute are available on the OSGeo wiki.

Happy sprinting to everybody! The GeoSolutions team,

by simone giannecchini at March 21, 2017 12:22 PM

gvSIG Team

Learn the secrets of vector geoprocessing in gvSIG with this video tutorial

In gvSIG Desktop we have more than 350 geoprocesses, and that is without counting plugins such as the recently announced JGrass one. A good percentage of these geoprocesses apply to vector layers, from the most common ones, such as buffer, clip and merge, to more specific and lesser-known ones.
Today we present a video tutorial in which, in a few minutes, you will learn how gvSIG's geoprocesses work, through a series of practical exercises that show how easily the algorithms available in the application can be used.
In the final part of the video tutorial you can learn how to use the geoprocessing modeler, a very useful tool that is not well known among gvSIG users.
Keep reading…

Filed under: gvSIG Desktop, spanish Tagged: geoprocesamiento, geoprocesos, modelador

by Alvaro at March 21, 2017 09:01 AM

March 20, 2017

OSGeo News

GeoCat Diamond OSGeo Sponsorship

by jive at March 20, 2017 03:37 PM


New release of MapStore 2 with theming support


Dear Reader,

we are pleased to announce a new release of MapStore 2, our flagship Open Source WebGIS product, which we have called 2017.02.00. The full list of changes for this release can be found here, but let us concentrate on the most interesting recent additions.

Advanced Theming

The main feature of this release is the possibility to have different themes, as shown in the gallery below.

MapStore 2 was conceived with the goal to be highly customizable, therefore we have worked hard on the look and feel from the beginning to create a product that would easily adapt to predefined graphical guidelines as well as a framework which could be easy to integrate with 3rd party applications.

With this release that goal has been achieved. We have refactored the original theme, greatly simplifying the steps to create new themes and switch between them. You can try switching live from the home page; there is a combo box with some predefined styles.

On the technical side, we have refactored MapStore 2 theme support using Less, so creating your own theme to match your company's visual design guidelines is now very simple. We are developing an example that allows you to customize your theme directly from the web page; you can see it below or test it here.

Balloon Tutorial

The balloon tutorial is now ready, with HTML support. You can try it live by clicking on "Tutorial" in the map's burger menu, for instance in this map.

Notes for Developers

This release has a number of changes that are crucial to know for the developers since they break compatibility with older versions. Here, you can find the details of what we updated and how to migrate your application.

We strongly believe that these changes will speed up the development and improve the quality and the readability of the code (particularly redux-observable). If you will find yourself struggling with these changes, reach out for us on the developer mailing list and we will help you out.

We also looked around for a tool to produce developers' docs that would satisfy our needs in the longer term. We found in docma a great tool as it allows us to provide both generic guides as well as to document our components, plugins and JavaScript API inline using jsDoc + Markdown. You can find the current version of the developer documentation here.

Twitter Account

MapStore 2 now has its own Twitter account, which it uses to let us know how it feels as well as to share useful information and insights.

What we are working on for the next release

The main focus for the next release is the implementation of a JavaScript API to allow you to include MapStore 2 in your application or web-site and interact with it in more advanced ways than a simple IFRAME. We are also going to focus on the following items:

  • Improve developer's documentation
  • Improve the management of Maps, in order to allow users to manage them also from the map itself
  • Better interaction with WFS

In the longer term, we have a number of features and functionalities in our plans like editing, advanced templating, styling, OAUTH 2.0, and more…

So, Stay tuned and happy webmapping!

If you are interested in learning about how we can help you achieve your goals with open source products like GeoServer, MapStore, GeoNode and GeoNetwork through our Enterprise Support Services and GeoServer Deployment Warranty offerings, feel free to contact us!

The GeoSolutions team,

by Lorenzo Natali at March 20, 2017 11:03 AM

March 15, 2017

gvSIG Team

Video tutorial available for learning geostatistics with gvSIG

The R statistical package is one of the most flexible, powerful and professional tools currently available for statistical work of every kind, from the most elementary to the most advanced. And, most importantly, it is free software.

Since its latest versions, gvSIG Desktop has included plugins to integrate R, opening up the possibility of performing all kinds of geostatistical analyses.

Geologists, biologists, ecologists, agronomists, engineers, meteorologists, sociologists… to name just a few professions, require the statistical study of georeferenced information.

From the gvSIG Association we present a video tutorial that will introduce you to how the gvSIG–R pair works.

If we have caught your interest, keep reading…

Filed under: gvSIG Desktop, spanish Tagged: Geoestadística, r

by Alvaro at March 15, 2017 09:06 AM

March 14, 2017

GeoServer Team

GeoServer 2.11-RC1 Released

We are happy to announce the release of GeoServer 2.11-RC1. Downloads are available (zip, war, dmg and exe) along with docs and extensions.

This is a release candidate of GeoServer not intended for production use. This release is made in conjunction with GeoTools 16-RC1 and GeoWebCache 1.11-RC1.

Thanks to everyone who provided feedback, bug reports and fixes. Here are some of the changes included in 2.11-RC1:

  • Incompatibilities with GeoFence and Control-flow have been resolved
  • Empty WFS Transactions (which perform no changes) no longer indicate that everything has changed
  • Improvements to WFS GetFeature support for 3D BBOX requests
  • We have one known regression with the windows service installer (memory setting is incorrect)
  • For additional details see the release notes (2.11-RC1, 2.11-beta)

Release Candidate Testing

The 2.11 release is expected in March, this release candidate is a “dry run” where we confirm new functionality is working as expected and double check the packaging and release process.

Please note that GeoServer 2.9 has reached its end-of-life. If your organization has not yet upgraded, please give us a hand by evaluating 2.11-RC1 and providing feedback and your experiences to the development team. This is a win/win situation where your participation can both assist the GeoServer team and reduce your risk when upgrading.

Corrected default AnchorPoint for LabelPlacement

An issue with SLD 1.0 rendering has been fixed – when a LabelPlacement did not include an AnchorPoint we were using the wrong default!

  • BEFORE: the default anchor point was X=0.5 and Y=0.5 – at the middle height and middle length of the label text.
  • AFTER: the default anchor point is X=0.0 and Y=0.5 – at the middle height of the lefthand side of the label text.

This is a long-standing issue that was only noticed in February. If you need to "restore" the incorrect behaviour, please start up with the -Dorg.geotools.renderer.style.legacyAnchorPoint=true system property.
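To illustrate what this default change means, here is a small sketch (a hypothetical helper, not GeoTools code) computing where a label's lower-left corner lands relative to the anchored point under the old and new defaults:

```python
def label_origin(point, label_w, label_h, anchor_x, anchor_y):
    """Lower-left corner of the label box, placed so that the fraction
    (anchor_x, anchor_y) of the box sits exactly on the anchored point."""
    px, py = point
    return (px - anchor_x * label_w, py - anchor_y * label_h)

# A label box 40 units wide and 10 tall, anchored at point (100, 100):
old_default = label_origin((100, 100), 40, 10, 0.5, 0.5)  # label centred on the point
new_default = label_origin((100, 100), 40, 10, 0.0, 0.5)  # left edge at the point, mid height
```

With the old default the point sits under the middle of the text; with the corrected default the text starts at the point, shifted only to centre it vertically.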

Startup Performance

With extensive improvements to startup performance and OGC requests for large installations we are looking forward to feedback from your experience testing.

About GeoServer 2.11

GeoServer 2.11 is scheduled for March 2017 release. This puts GeoServer back on our six month “time boxed” release schedule.

  • OAuth2 for GeoServer (GeoSolutions)
  • YSLD has graduated and is now available for download as a supported extension
  • Vector tiles have graduated and are now available for download as an extension
  • The rendering engine continues to improve with underlying labels now available as a vendor option
  • A new “opaque container” layer group mode can be used to publish a basemap while completely restricting access to the individual layers.
  • Layer group security restrictions are now available
  • Latest in performance optimizations in GeoServer (GeoSolutions)
  • Improved lookup of EPSG codes allows GeoServer to automatically match EPSG codes making shapefiles easier to import into a database (or publish individually).

by jgarnett at March 14, 2017 07:23 AM

March 11, 2017

Even Rouault

Dealing with huge vector GeoPackage databases in GDAL/OGR

Recently, I've fixed a bug in the OGR OpenFileGDB driver – the driver built by reverse engineering the ESRI FileGeoDatabase format – so as to be able to read tables whose section that enumerates and describes fields is located beyond the first 4 GB of the file. The table that triggered the bug, from the 2016 TIGER database, contains all linear edges of the USA: it is 15 GB large (feature and spatial indexes included), with 85 million features.

Some time before, I had to deal with a smaller database – 1.7 GB as GeoPackage – with 5.4 million polygons (bounding boxes) from the cadastre of an Italian province. One issue I noticed is that getting the summary of the layer, with ogrinfo -al -so the.gpkg, was very slow. The reason is that this summary includes the feature count, and there's no way to get this metadata quickly, apart from running the "SELECT COUNT(*) FROM the_table" request, which causes a full scan of the table. For small databases, this runs fast, but when going into the gigabyte realm, it can take several dozen seconds. Getting the spatial extent of the layer, however, which is the other information displayed by the summary mode of ogrinfo, is fast, since the gpkg_contents "system" table of a GeoPackage database includes the bounding box of each table.

So my idea was to extend the definition of the gpkg_contents table with a new column, ogr_feature_count, to store the feature count. I implemented that, and it worked fine. The synchronization of the value of ogr_feature_count after edits can be done with two SQLite triggers, on row insertion and deletion, and that works with implementations that are not aware of the existence of this new column, like older OGR versions. Unfortunately it appears that at least one other implementation completely rejected such databases. The GeoPackage specification is somewhat inconsistent about whether additional columns are accepted in system tables: from the /base/core/contents/data/table_def test case, "Column order, check constraint and trigger definitions, and other column definitions in the returned sql are irrelevant.", it would seem that additional columns should still be considered a valid GeoPackage. Anyway, that's only the theory, and we don't want to break interoperability for just a nice-to-have feature...
So I went to change the design a bit and created a new table, gpkg_ogr_contents, with a table_name and feature_count columns. I'm aware that I should not borrow the gpkg_ prefix, but I felt it was safer to do so since other implementations will probably ignore any unknown gpkg_ prefixed table. And the addition of the ogr_ prefix makes collisions with future extension of the GeoPackage specification unlikely. The content of this table is also maintained in synchronization with the data table thanks to two triggers, and this makes the other software that rejected my first attempt happy. Problem solved.
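The trigger-based bookkeeping described above can be sketched with plain SQLite (here via Python's sqlite3). Table and column names follow the post; the exact DDL and trigger names that OGR emits may differ:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE the_table (fid INTEGER PRIMARY KEY, geom BLOB);
-- side table caching the feature count, as in OGR's gpkg_ogr_contents
CREATE TABLE gpkg_ogr_contents (table_name TEXT PRIMARY KEY, feature_count INTEGER);
INSERT INTO gpkg_ogr_contents VALUES ('the_table', 0);

-- keep the cached count in sync on row insertion and deletion
CREATE TRIGGER the_table_count_ins AFTER INSERT ON the_table
BEGIN
  UPDATE gpkg_ogr_contents SET feature_count = feature_count + 1
  WHERE table_name = 'the_table';
END;
CREATE TRIGGER the_table_count_del AFTER DELETE ON the_table
BEGIN
  UPDATE gpkg_ogr_contents SET feature_count = feature_count - 1
  WHERE table_name = 'the_table';
END;
""")
con.executemany("INSERT INTO the_table(geom) VALUES (?)", [(None,)] * 5)
con.execute("DELETE FROM the_table WHERE fid = 1")
# O(1) lookup instead of a full-table SELECT COUNT(*)
cached_count = con.execute(
    "SELECT feature_count FROM gpkg_ogr_contents WHERE table_name='the_table'"
).fetchone()[0]
```

Implementations unaware of the side table simply ignore it, while the triggers keep it correct even when such implementations edit the feature table.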

Let's come back to our 13 GB FileGeoDatabase. My first attempt to convert it to GeoPackage with ogr2ogr converted the features in about half an hour, but once this 100% stage was reached, the finalization, which includes building the spatial index, took ages – so long that after a whole night it still wasn't finished, and it was seriously making the computer unresponsive due to massive I/O activity. In the GeoPackage driver, the spatial index is indeed created after feature insertion, so that the feature table and spatial index tables are well separated in the file; from previous experiments with the Spatialite driver, this proved to be the right strategy. Populating the SQLite R-Tree is done with a simple statement: INSERT INTO my_rtree SELECT fid, ST_MinX(geom), ST_MaxX(geom), ST_MinY(geom), ST_MaxY(geom) FROM the_table. Analyzing what happens in the SQLite code is not easy when you are not familiar with that code base, but my intuition is that there was constant back and forth between the geometry data area and the R-Tree area in the file, making the SQLite page cache inefficient. So I decided to experiment with a more progressive approach: iterate over the feature table and collect the fid, minx, maxx, miny, maxy values by chunks of 100 000 rows, insert those 100 000 bounding boxes into the R-Tree, and loop until the feature table has been completely read. With such a strategy, the spatial index can now be built in 4h30. The resulting GeoPackage file weighs 31.6 GB, twice as large as the FileGeoDatabase. One of the reasons for the difference must be that geometries in FileGeoDatabase are compressed (quantization for coordinate precision, delta encoding and use of variable-length integers) whereas GeoPackage uses an uncompressed SQLite BLOB based on OGC WKB.
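The chunked index-population loop can be sketched as follows. For portability this sketch inserts into an ordinary table standing in for the R-Tree (the real code targets an SQLite R-Tree virtual table, created with CREATE VIRTUAL TABLE ... USING rtree), and uses a toy chunk size instead of 100 000:

```python
import sqlite3

CHUNK = 3  # the post uses chunks of 100 000 rows; a toy value here

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE the_table (fid INTEGER PRIMARY KEY, "
            "minx REAL, maxx REAL, miny REAL, maxy REAL)")
con.executemany("INSERT INTO the_table VALUES (?,?,?,?,?)",
                [(i, float(i), i + 1.0, float(i), i + 1.0) for i in range(10)])
# stand-in for: CREATE VIRTUAL TABLE my_rtree USING rtree(id, minx, maxx, miny, maxy)
con.execute("CREATE TABLE my_rtree (id INTEGER PRIMARY KEY, "
            "minx REAL, maxx REAL, miny REAL, maxy REAL)")

cur = con.execute("SELECT fid, minx, maxx, miny, maxy FROM the_table")
rows_indexed = 0
while True:
    chunk = cur.fetchmany(CHUNK)      # read a bounded batch of bounding boxes
    if not chunk:
        break
    con.executemany("INSERT INTO my_rtree VALUES (?,?,?,?,?)", chunk)
    rows_indexed += len(chunk)
con.commit()
```

The point of the batching is locality: each batch of bounding boxes is read sequentially from the feature table and written together into the index, instead of interleaving reads and writes across the whole file.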
My first attempt at opening it in QGIS resulted in the UI being frozen, probably for hours. The reason is that QGIS always issues a spatial filter, even when the requested area of interest is at least as large as the extent of the layer, where there is no performance gain to expect from using it. So the first optimization was to detect that situation in the OGR GeoPackage driver and not translate the OGR spatial filter into an SQLite R-Tree filter. QGIS could now open the database and progressively display the features. Unfortunately, when zooming in, the UI became frozen again. When applying a spatial filter, the GeoPackage driver created a SQL request like the following one:

SELECT * FROM the_table WHERE fid IN
       (SELECT id FROM the_rtree WHERE
        xmin <= bbox_xmax AND xmax >= bbox_xmin AND
        ymin <= bbox_ymax AND ymax >= bbox_ymin)

It turns out that the sub-select (the one that fetches the feature IDs from the spatial index) is apparently entirely run before the outer select (the one that returns geometry and attributes) starts being evaluated. This way of expressing the spatial filter came from the Spatialite driver (since GeoPackage and Spatialite use the exact same mechanisms for spatial indexing), itself based on examples from an old Spatialite tutorial. For not-too-big databases, this runs well. After some experiments, it turns out that doing a JOIN between the feature table and the R-Tree virtual table makes it possible to have a non-blocking request:

SELECT * FROM the_table t JOIN the_rtree r ON t.fid = r.id
WHERE r.xmin <= bbox_xmax AND r.xmax >= bbox_xmin AND
      r.ymin <= bbox_ymax AND r.ymax >= bbox_ymin

Now QGIS is completely responsive, although I find that even at high zoom levels the performance is somewhat disappointing, i.e. features appear rather slowly. There seems to be some threshold effect related to the size of the database, since performance is rather good on the Italian province cadastral use case.

Another experiment showed that increasing the SQLite page size from 1024 bytes (the default in SQLite 3.11 or earlier) to 4096 bytes (the default since SQLite 3.12) decreases the database size to 28.8 GB. This new page size of 4096 bytes is now used by default by the OGR SQLite and GPKG drivers (unless OGR_SQLITE_PRAGMA=page_size=xxxx is specified as a configuration option).

I also discovered that increasing the SQLite page cache from its 2 MB default to 2 GB (with --config OGR_SQLITE_CACHE 2000) significantly improved the time to build the spatial index, decreasing the total conversion time from 4h30 to 2h10. 2GB is just a value selected at random. It might be too large or perhaps a larger value would help further.
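In raw SQLite terms, the two knobs discussed above correspond to the page_size and cache_size pragmas (sketched here via Python's sqlite3; OGR sets them for you through the defaults and configuration options mentioned in the post):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# page_size must be set before the first table is created (or before a VACUUM)
con.execute("PRAGMA page_size = 4096")
# a negative cache_size is a budget in KiB; -2000000 is roughly the 2 GB used above
con.execute("PRAGMA cache_size = -2000000")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")

page_size = con.execute("PRAGMA page_size").fetchone()[0]
cache_kib = -con.execute("PRAGMA cache_size").fetchone()[0]
```

With OGR, the equivalents are the OGR_SQLITE_PRAGMA=page_size=4096 and OGR_SQLITE_CACHE configuration options named in the text.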

All improvements mentioned above (faster spatial index creation, better use of the spatial index, and the change of default page size) are now in GDAL trunk, and will be available in the upcoming 2.2.0 release.

by Even Rouault (noreply@blogger.com) at March 11, 2017 10:25 PM

March 09, 2017

gvSIG Team

3rd Catedra gvSIG Contest

The aim of the Cátedra gvSIG is to create a meeting point for users interested in free geospatial technologies. To foster an environment of shared knowledge and to participate in the dissemination of free geomatics, the chair organizes this international contest to encourage all users of gvSIG and of free Geographic Information Systems to share and give visibility to their work.

Students and graduates in high school, professional training and university, as well as university professors and researchers from all countries, can participate in this contest. To enter the competition you must meet the following requirements: works must be done with free Geographic Information Systems; the subject may address any area of knowledge; works must have been produced in 2016 or later; they may be presented collectively or individually; and they may be submitted in Spanish, Valencian or English.

If the work is based on a new development built with free and open source GIS geospatial technologies, it must be released under the GNU GPL v3 license. Among the selected works, a prize of 500 euros will be awarded in each of the following categories:

  • Work produced by students of high school or professional training.

  • Final University’s Project (Bachelor, Degree or Master).

  • Doctoral thesis or research paper.

Submissions should be sent to gvsig@umh.es and press@gvsig.com no later than November 1, 2017. Selected documents will be published in the repository of the Cátedra gvSIG UMH. The jury will evaluate the methodology, clarity and innovative nature of the work, as well as the relevance and applicability of the research.

Winners will be announced in the next International gvSIG Conference.

Filed under: english, events, press office Tagged: awards, contest, open source

by Alvaro at March 09, 2017 06:34 PM

GeoServer Team

REST API Code Sprint Prep

In our previous blog post we highlighted the GeoServer Code Sprint 2017 taking place at the end of this month. We are all looking forward to GeoSolutions hosting us in beautiful Tuscany, and we have lots of work to do.

One of the secrets (and this comes as no surprise) to having a successful code sprint is being prepared. With this year’s REST API migration from restlet to spring model-view-controller we want to have all technical decisions made, and examples for the developers to work from, prior to any boots hitting the ground in Italy.

But before we get into the details …

Code Sprint Sponsors

We would like to thank our sprint sponsors – we are honoured that so many organizations worldwide have stepped up to fund this activity.

Gaia3D is a professional software company in the field of geospatial information and Earth science technology. We would like to thank Gaia3D for their gold sponsorship.


Insurance Australia Group (IAG) is our second gold sponsor. This is a great example of open source being used, and supported, by an engineering team. Thanks to Hugh Saalmans and the Location Engineering team at IAG for your support.


Boundless is once again sponsoring the GeoServer team. Boundless provides a commercially supported open source GIS platform for desktop, server, mobile and cloud. Thanks to Quinn Scripter and the Boundless suite team for their gold sponsorship.



How 2 Map is pleased to support this year’s event with a bronze sponsorship.


I am overjoyed FOSSGIS (German local OSGeo chapter) is supporting us with a bronze sponsorship. This sponsorship means a lot to us as the local chapter program focuses on users and developers; taking the time to support our project directly is a kind gesture.



Sponsorship Still Needed

While we have a couple of verbal commitments to sponsor – we are still $1500 USD off the pace. If your organization has some capacity to financially support this activity we would dearly love your support.

This is an official OSGeo activity; any excess money is returned to the foundation to help the next open source sprint. OSGeo sponsorship is cumulative; check their website for details on how your help to the GeoServer team can be further recognized.

For sponsorship details visit the wiki page (or contact Jody Garnett for assistance).

Update: Since this post was published we are happy to announce new sponsor(s).

Thanks to Caroline Chanlon and the team at Atol Conseils et Développements for bronze sponsorship.


Update: Thanks to David Ghedini (acugis.com) and others donating smaller amounts via the OSGeo paypal button.

Getting Ready for REST

In this week’s GeoServer meeting we had a chance to sit down and plan out the steps needed to get ready.

The majority of prep will go into performing the restlet to spring mvc migration for a sample REST API end point to produce a “code example” for developers to follow. We have selected the rest/styles endpoint as one of the easier examples:

  1. Preflight check: Before we start we want to have a good baseline of the current REST API responses. We would like to double check that each endpoint has a JUnit test case that checks the response against a reference file. Most of our tests just count the number of elements, or drill into the content to look for a specific value. The goal is to use these reference files as a quick “regression test” when performing the migration.
  2. Migrate rest/styles from StyleResource (restlet) to StyleController (spring): This should be a bit of fun, part of why spring model-view-controller was selected. Our goal is to have one Controller per end-point, and configure the controller using annotations directly in the Java file. This ends up being quite readable with variable names being taken directly out of the URL path. It is also easier to follow since you do not have to keep switching between XML and Java files to figure out what is going on.  It is important that the example is “picture perfect” as it will be used as a template by the developers over the course of the sprint, and will be an example of the level of quality we expect during the activity.
  3. Create StyleInfo bindings (using XStream for XML and JSON generation): The above method returns a StyleInfo data structure; our current restlet solution publishes each “resource” using the XStream library. We think we can adapt our XStream work for use in spring model-view-controller by configuring a binding for StyleInfo and implementing it using XStream. This approach is the key reason we are confident in this migration being a success: existing clients that depend on exactly the same output from GeoServer should get exactly the same output.
  4. StyleController path management: There is some work to configure each controller; while we have the option of doing some custom logic inside each controller, we would like to keep this to a minimum. This step is the small bit of applicationContext.xml configuration work we need to do for each controller; we expect it to be less work than restlet, given the use of annotations.
  5. Reference Documentation Generation: We are looking into a tool called swagger for documentation generation. Our current reference documentation only lists each end-point (and does not provide information on the request and response expected – leaving users to read the examples or try out the api in an ad-hoc fashion). See screen snap below, our initial experience is positive, but the amount of work required is intimidating.
  6. Updated examples for cURL and Python: We would like to rewrite our examples in a more orderly fashion to make sure both XML and JSON sample requests and responses are provided. Ideally we will inline the “reference files” from the JUnit regression test in step 1 to ensure that the documentation is both accurate and up to date.
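The “reference file” regression check from step 1 can be sketched as follows. The real GeoServer tests are JUnit, and the file names here are toy stand-ins; this only shows the idea of comparing a REST response against a checked-in reference, ignoring formatting:

```python
import json

def matches_reference(actual_text: str, reference_text: str) -> bool:
    """Compare a REST response body against a stored reference file,
    ignoring whitespace and key-order differences by comparing parsed JSON."""
    return json.loads(actual_text) == json.loads(reference_text)

# toy stand-ins for a rest/styles response and its checked-in reference file
actual = '{"styles": {"style": [{"name": "point", "href": "..."}]}}'
reference = '{"styles":{"style":[{"name":"point","href":"..."}]}}'
ok = matches_reference(actual, reference)
```

Comparing whole reference documents rather than counting elements is what makes such tests useful as a quick regression check during a migration: any accidental change in the response shape fails the comparison.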

You can see a pretty even split in our priorities between performing the migration, and updating the documentation. We believe both of these goals need to be met for success.

Next stop Tuscany

Although this blog post focuses on the sponsorship/planning/logistics side of setting up a code sprint there is one group without whom this event could not happen – our sprint participants and in-kind sponsors (providing a venue & staff).

Thanks to GeoSolutions for hosting us, and to Astun, Boundless and GeoSolutions for the hands-on participation that makes this sprint possible.

For more information:

by jgarnett at March 09, 2017 03:34 PM

gvSIG Team

Geoprocessing from scripting in gvSIG. Video tutorial available.

Following the publication of the webinar “Learn to program in gvSIG in half an hour”, we present a perfect complement: “Geoprocessing from scripting in gvSIG”.

By geoprocessing we mean the operations for processing or manipulating spatial data carried out in a Geographic Information System. With more than 350 geoprocesses, gvSIG has enormous potential as geoprocessing software – potential that can be extended further through scripting.

In this new webinar you will learn how to access gvSIG's geoprocesses (and those of other libraries) from scripting through a library called gvPy, run geoprocesses with a single line of code, convert geoprocess models into scripts… and all of this – after a brief theoretical introduction – through practical exercises.

If we have caught your interest, keep reading…

Filed under: gvSIG Desktop, spanish Tagged: geoprocesamiento, geoprocesos, gvPy, python, scripting, webinar

by Alvaro at March 09, 2017 10:01 AM

March 08, 2017


FOSSGIS 2017 in Passau

The annual German-speaking FOSSGIS conference on open source GIS and OpenStreetMap starts in Passau in two weeks. From 22-25 March 2017 the FOSSGIS conference will take place in the three-rivers city of Passau, with the support of the University of Passau. The FOSSGIS conference is the leading conference in the D-A-CH region for free and open source software for geographic information systems, as well as for OpenStreetMap and open data. Over four days, talks for beginners and experts, hands-on workshops and user meetings will provide insights into the latest developments and applications of software projects.

Photo: Tobias Hobmeier (CC-BY-SA)

Sourcepole will again be present with a booth and invites you to interesting workshops and talks:

  • Wednesday 17:00 - Workshop: Developing QGIS plugins
  • Thursday 14:30 (HS 9) - QGIS Server project status
  • Thursday 14:30 (HS 11) - From WMS to WMTS to vector tiles
  • Thursday 15:00 (HS 9) - QGIS Web Client 2

The full programme is also available as a handy Android app, and online registration is open until 19 March.

We are looking forward to an interesting conference!

March 08, 2017 08:19 AM

gvSIG Team

Learning GIS with Game of Thrones (X): Legends

Today we are going to learn about how to change the symbology of a layer, reviewing different types of legends that are available in gvSIG Desktop.

The symbology is one of the most important properties of a layer. gvSIG includes a great variety of options to represent layers with symbols, graphs and colours. Except for the unique symbol option, legends assign a symbol to each element depending on its attribute values and the properties of the selected legend type.

By default, when a layer is added to a View, it is represented with a unique symbol in a random colour; that is, all elements of the layer are drawn with the same symbol. To modify the symbology of a layer we have to open its “Properties” window and select the “Symbology” tab. We are going to open our “Game of Thrones” project and start to explore this part of gvSIG Desktop.

If we want to change a symbol, the easiest way is double-clicking on it in the ToC (Table of Contents, the list of layers). A new window will open to select the new symbol. For example, we are going to double-click on the symbol of the “Rivers” layer.

In the new window we can change the colour and the width of the line, or pick from any of the installed symbol libraries (“gvSIG Basic” by default, although we can install many more libraries from the Add-ons Manager). In this case we are going to change the width to 3 and select a dark blue colour. We press “Accept” to apply the changes.

Now we are going to review the available legend types, and we will create a legend based on the different types of locations, the attribute we used in previous posts. There are many possibilities for symbology; you can check the additional documentation.

First we have to open the “Properties” window of the layer. With the layer activated, we will find this option in the “Layer/Properties” menu, or directly by using the secondary mouse button on the layer.

Now we access the “Symbology” tab, and a window shows the symbology currently applied. On the left side we find all the types of legends that we can use. Warning: depending on the type of layer (point, line or polygon), different legends are available.

In this case we are going to select a “Categories/Unique values” legend. This type of legend assigns a symbol to each unique value found in the attribute table of the layer. Each element is drawn according to the value of the attribute that identifies its category. In our case we select “Type” as the classification field, press “Add all”, and the legend created by default is shown:
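Conceptually, a unique-values legend is just a mapping from each distinct value of the classification field to a symbol. A sketch of the idea outside gvSIG (hypothetical data; gvSIG builds this mapping for you from the “Type” field):

```python
# toy features with a "Type" attribute, like the Game of Thrones locations layer
features = [
    {"name": "Winterfell", "Type": "Castle"},
    {"name": "King's Landing", "Type": "City"},
    {"name": "Oldtown", "Type": "City"},
    {"name": "Storm's End", "Type": "Castle"},
    {"name": "The Wall", "Type": "Other"},
]

palette = ["#1f77b4", "#ff7f0e", "#2ca02c", "#d62728"]

# one symbol (here just a colour) per unique value of the classification field
unique_values = sorted({f["Type"] for f in features})
legend = {value: palette[i % len(palette)] for i, value in enumerate(unique_values)}

# each feature is drawn with the symbol of its category
symbol_for = {f["name"]: legend[f["Type"]] for f in features}
```

Two features sharing the same “Type” get the same symbol, which is exactly what the “Add all” button produces in the legend dialog.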

The labels (on the right side) can be modified; you can change the texts here.

Now, double-clicking on each symbol opens a new window where we can modify it or select a new symbol from our symbol libraries with the “Select symbol” option. Once the symbols are selected, we press “Apply” and see the results in our View.

The best way to learn the different legend types is by testing them… We also recommend installing and trying the different symbol libraries that are available in gvSIG (hundreds of symbols of all types!)

See you in the next post…

Filed under: english, gvSIG Desktop, training Tagged: Game of Thrones, legends, symbology

by Mario at March 08, 2017 06:20 AM

gvSIG Team

Learning GIS with Game of Thrones (XV and final): Installing add-ons

We will devote this final post to the “Add-ons Manager”, a tool that every gvSIG Desktop user should know.

The Add-ons Manager is a feature that allows you to customize gvSIG by installing new extensions, whether functional or of another kind (such as symbol libraries). It is run from the “Tools/Add-ons manager” menu, although it can also be accessed during the installation process.

Thanks to the Add-ons Manager you can access, besides the plugins not installed by default, all the new tools that get published.

In the window that appears, the first thing to select is the installation source of the add-ons:

Add-ons can come from three sources:

  • The installation binary itself. The installation file we downloaded contains a large number of add-ons or plugins, some of which are not installed by default but are available for installation. This makes it possible to customize gvSIG without an Internet connection.

  • Installation from a file. We can have a file with a set of extensions ready to be installed in gvSIG.

  • From a URL. Through an Internet connection we can access all the add-ons available on the gvSIG server and install the ones we need. This is the recommended option if you want to browse all the available plugins.

Once the installation source is selected, press the “Next” button, which will show the list of available add-ons.

The Add-ons Manager interface is divided into 4 parts:

  1. List of available add-ons. It shows the name of the add-on, its version and its type. The checkboxes distinguish between add-ons already installed (green) and available ones (white). It may be worth reviewing the meaning of each of the icons.

  2. Information area for the add-on selected in “1”.

  3. Area showing the “Categories” and “Types” into which the add-ons are classified. Pressing the “Categories” and “Types” buttons updates the information in this column. Selecting a category or type from the list applies a filter that shows in “1” only the add-ons related to that category or type.

  4. Quick filter. It filters the list by a text string entered by the user.

In our case we are going to install a new symbol library. To do so we press the “Symbols” category, which filters the plugins that are “symbol libraries”:

Next we check the “G-Maps” library:

We press the “Next” button and, once the installation has finished, the “Finish” button. A message will tell us that a restart is needed (this is the case when installing functional plugins, but it is not necessary for symbol libraries).

If we now go to change the symbology of one of our layers, for example “Locations”, we will see that the new symbols are already available:

You can take a look at the available symbol libraries in the documentation.

And with this last post we finish this atypical introductory GIS course. We hope you have learned something, and that it has been as much fun for you as it was for us.

From here on you are ready to go deeper into the application and discover all its power. One last piece of advice… use the user mailing lists to ask any question or report any problem you run into:


And remember… gvSIG is coming!


Filed under: gvSIG Desktop, spanish Tagged: administrador de complementos, bibliotecas de símbolos, extensiones, Juego de tronos, plugins

by Alvaro at March 08, 2017 05:58 AM

March 07, 2017

Jackie Ng

React-ing to the need for a modern MapGuide viewer (Part 14): The customization story so far.

I've been getting an increasing number of questions lately about "How do you do X?" with mapguide-react-layout. So the purpose of this post is to lay out the customization story so far, so you have a good idea of whether the thing you want to do with this viewer is possible or not.

Before I start, it's best to divide this customization story into two main categories:

  1. Customizations that reside "inside" the viewer
  2. Customizations that reside "outside" the viewer
What is the distinction? Read on.

Customizations "inside" the viewer

I define customizations "inside" the viewer as customizations:
  • That require no modifications to the entry point HTML file that initializes and starts up the viewer. To use our other viewer offerings as an analogy, your customizations work with the AJAX/Fusion viewers as-is without embedding the viewer or modifying any of the template HTML.
  • That are represented as commands that reside in either a toolbar or menu/sub-menu and registered/referenced in your Web Layout or Application Definition
  • Whose main UI reside in the Task Pane or a floating or popup window and uses client-side APIs provided by the viewer for interacting with the map.
These customizations are enabled in our existing viewer offerings through:
  • InvokeURL commands/widgets
  • InvokeScript commands/widgets
  • Client-side viewer APIs that InvokeURL and InvokeScript commands can use 
  • Custom widgets

    From the perspective of mapguide-react-layout, here is what's supported

    InvokeURL commands

    InvokeURL commands are fully supported and do what you expect from our existing viewer offerings:
    • Load a URL (that normally renders some custom UI for displaying data or interacting with the map) into the Task Pane or a floating/popup window.
    • It is selection state aware if you choose to set the flag in the command definition.
    • It will include whatever parameters you have specified in the command definition into the URL that is invoked.
    If most/all of your customizations are delivered through InvokeURL commands, then mapguide-react-layout already has you covered.
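For reference, an InvokeURL command is declared in the Web Layout roughly as follows. This is a sketch from memory of the Web Layout XML schema: the command name, label and URL are made-up examples, and element names and ordering should be checked against the schema version you target:

```xml
<Command xsi:type="InvokeURLCommandType">
  <!-- Hypothetical example command -->
  <Name>QueryParcels</Name>
  <Label>Query Parcels</Label>
  <Tooltip>Query parcels using the current selection</Tooltip>
  <TargetViewer>All</TargetViewer>
  <!-- Page loaded into the Task Pane (or a popup window) -->
  <URL>../query/parcels.php</URL>
  <!-- Extra parameters appended to the invoked URL -->
  <AdditionalParameter>
    <Key>MODE</Key>
    <Value>SUMMARY</Value>
  </AdditionalParameter>
  <!-- Grey the command out when nothing is selected -->
  <DisableIfSelectionEmpty>true</DisableIfSelectionEmpty>
  <Target>TaskPane</Target>
</Command>
```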

    InvokeScript commands

    InvokeScript commands are not supported and I have no real plans to bring such support across. I have an alternate replacement in place, which will require you to roll your own viewer. 

    Client-side viewer APIs

If you use AJAX viewer APIs in your Task Pane content for interacting with the map, they are supported here as well. Most of the viewer APIs are implemented, short of a few esoteric ones.

    If your client-side code is primarily interacting with APIs provided by Fusion, you're out of luck at the moment as none of the Fusion client-side APIs have been ported across. I have no plans to port these APIs across 1:1, though I do intend to bring across some kind of pub/sub event system so your client-side code has the ability to respond to events like selection changed, etc.

    Custom Widgets

In Fusion, if InvokeURL/InvokeScript widgets are insufficient for your customization needs, this is where you would create a custom widget. As with the replacement for InvokeScript commands, I intend to enable a similar system once again through custom builds of the mapguide-react-layout viewer.

    My personal barometer for how well mapguide-react-layout supports "inside" customizations is the MapGuide PHP Developer's Guide samples. 

    If you load the Web Layout for this sample in the mapguide-react-layout viewer, you will see all of the examples (and the viewer APIs they demonstrate) all work as before. If your customizations are similar in nature to what is demonstrated in the MapGuide PHP Developer's Guide samples, then things should be smooth sailing.

    Customizations "outside" the viewer

I define customizations "outside" the viewer as primarily being one of two things:
    • Embedding the viewer in a frame/iframe or a DOM element that is not full width/height and providing sufficient APIs so that code in the embedding content document can interact with the viewer or for code in the embedding content document to be able to listen on certain viewer events.
    • Being able to init the viewer with all the required configuration (ie. You do not intend to pass a Web Layout or Application Definition to init this viewer)
    On this front, mapguide-react-layout doesn't offer much beyond a well-defined entry point to init and mount the viewer component.

    Watch this space for how I hope to tackle this problem.

    Rolling your own viewer

    The majority of the work done since the last release is to enable the scenario of being able to roll your own viewer. By being able to roll your own viewer, you will have full control over viewer customization for things the default viewer bundle does not support, such as:
    • Creating your own layout templates
    • Creating your own script commands
    • Creating your own components
If you do decide to go down this path, there are some things you should become familiar with:
• The node.js ecosystem; in particular, how to use npm/yarn
• webpack
• TypeScript, along with some experience with React and Redux
Basically, if you go down this road you should have a basic idea of how frontend web development is done in 2017, because it is no longer a matter of manually editing HTML files, script tags and sprinkles of jQuery.

What I intend to do to enable this scenario is to publish the viewer as an npm module. To roll your own viewer, you would npm/yarn install the mapguide-react-layout module, write your custom layouts/commands/components in TypeScript, and then set up a webpack configuration to pull it all together into your own custom viewer bundle.

    I hope to have an example project available (probably in a different GitHub repository) when this is ready that demonstrates how to do this.

    In Closing

    When you ask the question of "How can I do X?" in mapguide-react-layout, you should reframe the question in terms of whether the thing you are trying to do is "inside" or "outside" the viewer. If it is "inside" the viewer and you were able to do this in the past with the AJAX/Fusion viewers through the extension points and APIs offered, chances are very high that similar equivalent functionality has already been ported across.

If you are trying to do this "outside" the viewer, you'll have to wait for me to add whatever APIs and extension points are required.

    Failing that, you will have the ability to consume the viewer as an npm module and roll your own viewer with your specific customizations.

    Failing that? 

    You could always fork the GitHub repo and make whatever modifications you need. But you should not have to go that far.

    by Jackie Ng (noreply@blogger.com) at March 07, 2017 02:40 PM

    gvSIG Team

    Aprende a programar en gvSIG en media hora

We often run scripting workshops from the gvSIG Association at the various gvSIG conferences held around the world. Attending these workshops as an “observer” is interesting, because you can watch students come in with no knowledge of programming in gvSIG Desktop and leave with the grounding they need to start developing their own scripts in the application.

That is one of the main goals of scripting: to give all kinds of users (not necessarily programmers) a mechanism to develop applications or tools on top of gvSIG Desktop in a very simple way.

So simple that you can learn scripting in half an hour?

That is the challenge that our colleague at the gvSIG Association, Óscar Martínez, set himself with the webinar we ran at the Miguel Hernández University, the video of which is now available.

Set aside half an hour of your time and keep reading...

    Filed under: gvSIG Desktop, spanish Tagged: desarrollo, jython, python, quick start, scripting, tutorial

    by Alvaro at March 07, 2017 10:16 AM

    Stefano Costa

    Numbering boxes of archaeological items, barcodes and storage management

Last week a tweet from the always brilliant Jolene Smith inspired me to write down my thoughts and ideas about numbering boxes of archaeological finds. For me, this includes also thinking about the physical labelling, and barcodes.

The question Jolene asks is: should I use sequential or random numbering? To which many answered: use sequential numbering, because it bears significance and can help detecting problems like missing items, duplicates, etc. Furthermore, if the number of items you need to number is small (say, a few thousands), sequential numbering is much more readable than a random sequence. Like many other archaeologists faced with managing boxes of items, I have chosen to use sequential numbering in the past. With 200 boxes and counting, labels were easily generated and each box had an associated web page listing the content, with a QR code providing a handy link from the physical label to the digital record.

This numbering system was put in place during 3 years of fieldwork in Gortyna and I can say that I learned a few things in the process. The most important thing is that it’s very rare to start from scratch with the correct approach: boxes were labeled with a description of their content for 10 years before I adopted the numbering system pictured here. This sometimes resulted in absurdly long labels, easily at risk of being damaged, and difficult to search since no digital recording was made. I decided a numbering system was needed because it was difficult to look for specific items, after I had digitised all labels with their position in the storage building (this often implied the need to number shelves, corridors, etc.). The next logical thing was therefore to decouple the labels from the content listing ‒ any digital tool was good here, even a spreadsheet.

Decoupling the box number from the description of its content made it possible to manage the not-so-rare case of items moved from one box to another (after conservation, or because a single stratigraphic context was excavated in multiple steps, or because a fragile item needs more space …), and the other frequent case of data that is augmented progressively (at first, you put finds from stratigraphic unit 324 in it, then you add 4.5 kg of Byzantine amphorae, 78 sherds of cooking jars, etc.). Since we already had a wiki as our knowledge base, it made sense to use that, creating a page for each box and linking from the page of the stratigraphic unit or that of the single item to the box page (this is done with Semantic MediaWiki, but it doesn’t matter). Having a URL for each box, I could put a QR code on labels: the updated information about the box content was in one place (the wiki) and could be reached either via QR code or by manually looking up the box number. I don’t remember the details of my reasoning at the time, but I’m happy I didn’t choose to store the description directly inside the QR code ‒ so that scanning the barcode would immediately show a textual description instead of redirecting to the wiki ‒ because that would require changing the QR code on each update (highly impractical), and still leave the information unsearchable. All this is properly documented and nothing is left implicit. Sometimes you will need to use larger boxes, or smaller ones, or have some items so big that they can’t be stored inside any container: you can still treat all of these cases as conceptual boxes, number and label them, give them URLs.

    QR codes used for boxes of archaeological items in Gortyna

    There are limitations in the numbering/labelling system described above. The worst limitation is that in the same building (sometimes on the same shelf) there are boxes from other excavation projects that don’t follow this system at all, and either have a separate numbering sequence or no numbering at all, hence the “namespacing” of labels with the GQB prefix, so that the box is effectively called GQB 138 and not 138. I think an efficient numbering system would be one that is applied at least to the scale of one storage building, but why stop there?

    Turning back to the initial question, what kind of numbering should we use? When I started working at the Soprintendenza in Liguria, I was faced with the result of no less than 70 years of work, first in Ventimiglia and then in Genoa. In Ventimiglia, each excavation area got its “namespace” (like T for the Roman theater) and then a sequential numbering of finds (leading to items identified as T56789) but a single continuous sequential sequence for the numbering of boxes in the main storage building. A second, newer building was unfortunately assigned a separate sequence starting again from 1 (and insufficient namespacing). In Genoa, I found almost no numbering at all, despite (or perhaps, because of) the huge number of unrelated excavations that contributed to a massive amount of boxes. Across the region, there are some 50 other buildings, large and small, with boxes that should be recorded and accounted for by the Soprintendenza (especially since most archaeological finds are State property in Italy). Some buildings have a numbering sequence, most have paper registries and nothing else. A sequential numbering sequence seems transparent (and allows some neat tricks like the German tanks problem), since you could potentially have an ordered list and look up each number manually, which you can’t do easily with a random number. You also get the impression of being able to track gaps in a sequence (yes, I do look for gaps in numeric sequences all the time), thus spotting any missing item. Unfortunately, I have been bitten too many times by sequential numbers that turned out to have horrible bis suffixes, or that were only applied to “standard” boxes leaving out oversized items.
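As an aside, the "neat trick" of the German tanks problem mentioned above is simple enough to sketch: for a sequence numbered from 1, the classic minimum-variance unbiased estimate of the total count is m + m/k - 1, where m is the largest number seen among k sampled boxes. A minimal sketch in Python (the sample values are invented):

```python
def estimate_total(observed):
    """Frequentist MVUE for the German tanks problem: estimate the
    size of a sequential numbering scheme from a random sample."""
    k = len(observed)   # how many boxes we sampled
    m = max(observed)   # highest serial number seen
    return m + m / k - 1

# Four boxes sampled from the shelves, highest label seen is 60:
print(estimate_total([12, 33, 47, 60]))  # → 74.0
```

With random (non-transparent) numbering this estimate is impossible, which is exactly the trade-off discussed below.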

    On the other hand, the advantages of random numbering seem to increase linearly with the number of separate facilities ‒ I could replace random with non-transparent to better explain the concept. A good way to look at the problem is perhaps to ask whether numbering boxes is done as part of a bookkeeping activity that has its roots in paper registries, or it is functional to the logistics of managing cultural heritage items in a modern and efficient way.

    Logistics. Do FedEx, UPS, Amazon employees care what number sequence they use to track items? Does the cashier at the supermarket care whether the EAN barcode on your shopping items is sequential? I don’t know, but I do know that they have a very efficient system in place, in which human operators are never required to actually read numerical IDs (but humans are still capable of checking whether the number on the screen is the same as the one printed on the label). There are many types of barcode used to track items, both 1D and 2D, all with their pros and cons. I also know of some successful experiments with RFID for archaeological storage boxes (in the beautiful depots at Ostia, for example), that can record numbers up to 38 digits.

    Based on all the reflections of the past years, my idea for a region- or state-wide numbering+labeling system is as follows (in RFC-style wording):

    1. it MUST use a barcode as the primary means of reading the numerical ID from the box label
    2. the label MUST contain both the barcode and the barcode content as human-readable text
    3. it SHOULD use a random numeric sequence
    4. it MUST use a fixed-length string of numbers
    5. it MUST avoid the use of any suffixes like a, b, bis

    In practice, I would like to use UUID4 together with a barcode.

    A UUID4 looks like this: 1b08bcde-830f-4afd-bdef-18ba918a1b32. It is the UUID version of a random number, it can be generated rather easily, works well with barcodes and has a collision probability that is compatible with the scale I’m concerned with ‒ incidentally I think it’s lower than the probability of human error in assigning a number or writing it down with a pencil or a keyboard. The label will contain the UUID string as text, and the barcode. There will be no explicit URL in the barcode, and any direct link to a data management system will be handled by the same application used to read the barcode (that is, a mobile app with an embedded barcode reader). The data management system will use UUID as part of the URL associated with each box. You can prepare labels beforehand and apply them to boxes afterwards, recording all the UUIDs as you attach the labels to the boxes. It doesn’t sound straightforward, but in practice it is.

    And since we’re deep down the rabbit hole, why stop at the boxes? Let’s recall some of the issues that I described non-linearly above:

    1. the content of boxes is not immutable: one day item X is in box Y, the next day it gets moved to box Z
    2. the location of boxes is not immutable: one day box Y is in room A of building B, the next day it gets moved to room C of building D
    3. both #1 and #2 can and will occur in bulk, not only as discrete events

    The same UUIDs can be applied in both directions in order to describe the location of each item in a large bottom-up tree structure (add as many levels as you see fit, such as shelf rows and columns):

    item X → box Y → shelf Z → room A → building B


    b68e3e61-e0e7-45eb-882d-d98b4c28ff31 → 3ef5237e-f837-4266-9d85-e08d0a9f4751
    3ef5237e-f837-4266-9d85-e08d0a9f4751 → 77372e8c-936f-42cf-ac95-beafb84de0a4
    77372e8c-936f-42cf-ac95-beafb84de0a4 → e895f660-3ddf-49dd-90ca-e390e5e8d41c
    e895f660-3ddf-49dd-90ca-e390e5e8d41c → 9507dc46-8569-43f0-b194-42601eb0b323

Now imagine adding a second item W to the same box: since the data for box Y was already complete, one just needs to fill in one container relationship:

    b67a3427-b5ef-4f79-b837-34adf389834f → 3ef5237e-f837-4266-9d85-e08d0a9f4751

and since we would have already built our hypothetical data management system, this data is entered just by scanning two barcodes on a mobile device that syncs as soon as a connection is available. Moving one box to another shelf is again a single operation, despite many items actually being moved, because the leaves and branches of the data tree are naïve and only know about their parents and children, but know nothing about grandparents and siblings.
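The containment tree above reduces to a single parent pointer per node. A minimal sketch, with short made-up identifiers standing in for the UUIDs (a hypothetical data structure, not any existing system):

```python
# Each node (item, box, shelf, room, building) knows only its parent.
parent = {
    "item-X": "box-Y",
    "box-Y": "shelf-Z",
    "shelf-Z": "room-A",
    "room-A": "building-B",
}

def location_chain(node):
    """Walk parent pointers up to the root: the full location of a node."""
    chain = [node]
    while chain[-1] in parent:
        chain.append(parent[chain[-1]])
    return chain

# Moving a whole box is one operation: re-point box-Y's parent.
# Every item inside it implicitly moves too.
parent["box-Y"] = "shelf-W"
print(" → ".join(location_chain("item-X")))  # prints item-X → box-Y → shelf-W
```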

    There are a few more technical details about data structures needed to have a decent proof of concept, but I already wrote down too many words that are tangential to the initial question of how to number boxes.

    by Stefano Costa at March 07, 2017 08:23 AM

    March 06, 2017

    gvSIG Team

    Aprendiendo SIG con Juego de Tronos (XIV): Mapas

In this penultimate post of the course for learning the basics of Geographic Information Systems through practical exercises with Game of Thrones data, we are going to work with the “Map” document.

A Map document is a set of map or plan layout elements, arranged on a virtual page, whose purpose is graphical output (printing or exporting to PDF). What you see in the layout is what you get when printing or exporting the map at the defined page size. Two types of elements can be inserted in a Map: cartographic elements and layout elements.

In our case we are going to create a map with the route followed by the Greyjoy brothers, drawn in the post on “graphical editing”.

Once we have our project open in gvSIG, the first thing to do is go to the “Project manager” window. A quick way to get there is through the “Show/Project manager” menu. Select the “Map” document type and click the new button. A new window opens where we define the properties of the Map page.

In our case we select an “A4” “Page size” with “Landscape” “Orientation”, and tell it to use the View where our layers are loaded instead of “Create new View”. If you have more than one View in your project, a list of all of them will appear.

You will see that a new map is created, with the selected View inserted into it, occupying the whole surface of the page:

Clicking on the “black squares” at the corners and midpoints of the rectangle that defines the extent of the View lets us resize it. This is how we shape the layout of our map. Clicking on the inserted View element and dragging moves it. In our case we resize the inserted View and move it, and then go on to add other cartographic elements.

Most cartographic elements are closely tied to a View document, so changes made in the View can be reflected in the map (zoom changes, panning, legend modifications, layer ordering, etc.). These tools are available from the “Map/Insert” menu and from the corresponding toolbar.

Let’s start by inserting the legend. This tool is available from the “Map/Insert/Legend” menu or with its button:

The legend is always associated with a View inserted in the Map and represents the symbology of the different layers of that View. Once the tool is selected, click on the desired place in the Map area to set the first corner of the rectangle that defines the space the legend will occupy, then drag and release at the opposite corner. A dialog will be shown where you can define the graphical properties of the inserted legend:

In this window we can choose which layers (that is, their symbology) we want to appear in the legend.

Next we insert a north arrow. This tool is available from the “Map/Insert/North” menu and its corresponding button:

Once the tool is selected, click on the desired place in the Map area to set the first corner of the rectangle that defines the space the north arrow will occupy, then drag and release at the opposite corner. A dialog will be shown where you can define the graphical properties of the inserted north arrow:

And our Map will look like this:

To finish, we insert a title with the “Insert text” tool (in the Map/Insert/Text menu or its corresponding button). It works like the other elements; in this case we enter the text we want to appear: “Greyjoy Brothers”.

From here on, so as not to make this post too long, we encourage you to review the documentation on the Map document and to try inserting graphic scales, frames, etc., as well as the drawing-aid tools… with practice you can produce really well-designed maps.

Once your map is finished you can export it to PDF with the button:

Now you can send your PDF file to all your contacts.

As they say, practice makes perfect… so now you know.

One post left to close the course… don’t miss it.

    Filed under: gvSIG Desktop, spanish Tagged: Exportar a PDF, Juego de tronos, mapa

    by Alvaro at March 06, 2017 09:28 PM

    Paul Ramsey

    Christy Clark's $1M Faux Conference Photo-op

On February 25, 2013, Christy Clark mounted the stage at the “International LNG Conference” in Vancouver to trumpet her government’s plans to have “at least one LNG pipeline and terminal in operation in Kitimat by 2015 and three in operation by 2020”.

    Christy Clark's $1M Faux Conference Photo-op

Notwithstanding the Premier’s desire to frame economic development as a triumph of will, and notwithstanding the generous firehosing of subsidies and tax breaks on the still nascent sector, the number of LNG pipelines and terminals in operation in Kitimat remains stubbornly zero. The markets are unlikely to relent in time to make the 2020 deadline.

    And about that “conference”?

    Like the faux “Bollywood Awards” that the government paid $10M to stage just weeks before the 2013 election, the “LNG in BC” conference was a government organized “event” put on primarily to advance the pre-election public relations agenda of the BC Liberal party.

    In case anyone had any doubts about the purpose of the “event”, at the 2014 edition an exhibitor helpfully handed out a brochure to attendees, featuring an election night picture of the Premier and her son, under the title “We Won”.

    We Won

    The “LNG in BC” conference continued to be organized by the government for two more years, providing a stage each year for the Premier and multiple Ministers to broadcast their message.

The government is no longer organizing an annual LNG confab, despite their protestations that the industry remains a key priority. At this point, it would generate more public embarrassment than public plaudits.

    Instead, we have a new faux “conference”, slated to run March 14-15, just four weeks before the 2017 election begins: the #BCTech Summit.

    Like “LNG in BC”, the “BCTech Summit” is a government-organized and government-funded faux industry event, put on primarily to provide an expensive backdrop for BC Liberal politicking.

    BC Innovation Council

    The BC Innovation Council (BCIC) that is co-hosting the event is itself a government-funded advisory council run by BC Liberal appointees, many of whom are also party donors. To fund the inaugural 2016 version of the event, the Ministry of Citizens Services wrote a direct award $1,000,000 contract to the BCIC.

    The pre-election timing is not coincidental, it is part of a plan that dates all the way back to early 2015, when Deputy Minister Athana Mentzelopoulos directed staff to begin planning a “Tech Summit” for spring of the following year.

    “We will not be coupling the tech summit with the LNG conference. Instead, the desire is to plan for the tech summit annually over the next couple of years – first in January 2016 and then in January 2017.” – email from A. Mentzelopoulos, April 8, 2015

    The intent of creating a “conference” to sell a new-and-improved government “jobs plan”, and the source of that plan, was made clear by the government manager tasked with delivering the event.

    “The push for this as an annual conference has come from the Premier’s Office and they want to (i) show alignment with the Jobs Plan (including the LNG conference) and (ii) show this has multi-ministry buy-in and participation.” – S. Butterworth, April 24, 2015

    The event was not something industry wanted. It was not even something the BCIC wanted. It was something the Premier’s Office wanted.

    And so they got it: everyone pulled together, the conference was put on, and it made a $1,000,000 loss which was dutifully covered by the Province via the BC Innovation Council, laying the groundwork for 2017’s much more politically potent version.

    This year’s event will be held weeks before the next election. It too will be subsidized heavily by the government. And as with the LNG conference, exhibitors and sponsors will plunk down some money to show their loyalty to the party of power.

    LNG BC Sponsors

    The platinum sponsors of LNG in BC 2015 were almost all major LNG project proponents: LNG Canada, Pacific Northwest LNG, and Kitimat LNG. Were they, like sponsors at a normal trade conference, seeking to raise their profile among attendees? Or were they demonstrating their loyalty to the government that organized the event and then approached them for sponsorship dollars?

    It is hard to avoid the conclusion that these events are just another conduit for cash in our “wild west” political culture, a culture that shows many of the signs of “systematic corruption” described by economist John Wallis in 2004.

    “In polities plagued with systematic corruption, a group of politicians deliberately create rents by limiting entry into valuable economic activities, through grants of monopoly, restrictive corporate charters, tariffs, quotas, regulations, and the like. These rents bind the interests of the recipients to the politicians who create them.”

    Systematically corrupt governments aren’t interested in personally enriching their members, they are interested in retaining and reinforcing their power, through a virtuous cycle of favours: economic favours are handed to compliant economic actors who in turn do what they can to protect and promote their government patrons.

    Circle of Graft

    The 2017 #BCTech conference already has a title sponsor: Microsoft. In unrelated news, Microsoft is currently negotiating to bring their Office 365 product into the BC public sector. If the #BCTech conference was an ordinary trade show, these two unrelated facts wouldn’t be cause for concern. But because the event is an artificially created artifact of the Premier’s Office, a shadow is cast over the whole enterprise.

    Who is helping who here, and why?

    A recent article in Macleans included a telling quote from an anonymous BC lobbyist:

    If your client doesn’t donate, it puts you at a competitive disadvantage, he adds. It’s a small province, after all; the Liberals know exactly who is funding them, the lobbyist notes, magnifying the role donors play and the access they receive in return.

    As long as BC remains effectively a one-party state, the cycle of favors and reciprocation will continue. Any business subject to the regulatory or purchasing decisions of government would be foolish not to hedge their bets with a few well-placed dollars in the pocket of BC’s natural governing party.

    The cycle is systematic and self-reinforcing, and the only way to break the cycle, is to break the cycle.

    March 06, 2017 08:00 PM


The DCAT-AP_IT profile for CKAN is finally available


    Dear Reader,

We apologize in advance, but this post is aimed at our Italian readers: it announces that we have finalized the implementation of the DCAT-AP_IT metadata profile on top of the CKAN open data product.

We are pleased to announce the first release of the CKAN extension supporting the DCAT-AP_IT application profile in Italian open data portals. The development was funded, in a joint effort, by the Province of Bolzano/South Tyrol and the Province of Trento, and it is freely available under the AGPL v3.0 license.

The profile for documenting public administration data (DCAT-AP_IT), published by the Agency for Digital Italy (AgID), was created with the goal of harmonizing the metadata used to describe public datasets, in order to improve their quality and encourage the reuse of information. The ckanext-dcatapit extension, developed by GeoSolutions, is available in a dedicated repository under our GitHub account, with thorough documentation covering its features, requirements, and the installation and configuration steps (the repository will soon be moved under the umbrella of AgID). Italian public administrations can therefore use the extension to make their catalogues compliant with the Italian DCAT-AP_IT profile and to foster sharing and standardization practices with the other public administrations across Italy.

[caption id="attachment_3307" align="alignnone" width="515"]Dataset view page[/caption]

Thanks to the considerable experience of the GeoSolutions development team in building numerous CKAN extensions, as well as in installing and configuring many catalogues based on this platform, the ckanext-dcatapit extension provides a solid and varied set of features, not only for the guided creation of datasets but also for the integration of metadata from external sources (CSW, RDF, JSON-LD) in compliance with the application profile.

    [Figure: Dataset editing form]

    The ckanext-dcatapit extension was developed with careful attention not only to its core functionality and stability, but also to ensuring the highest possible compatibility with the other extensions commonly deployed on CKAN installations. It also integrates well with custom extensions that need to define additional dataset fields. Multilingualism and interface localisation have likewise been addressed, to guarantee maximum usability for organisations that need them, such as the Provinces of Bolzano/South Tyrol and Trento: the extension ships its own localisation files, which help streamline any customisation in this regard, while the ckanext-multilang extension provides multilingual support for the catalogue's contents (datasets, organisations, groups and more).

    [Figure: Multilingual support at work]

    Below is a list of the administrations already using ckanext-dcatapit:
    • The OpenData portal of the Province of Bolzano/South Tyrol.
    • The OpenData portal of Trentino.
    • The federated OpenDataNetwork infrastructure (still in testing for now), led by the Metropolitan City of Florence, which collects and distributes data from several Tuscan bodies, including: the Metropolitan City of Florence, the Province of Prato, the Province of Pistoia and the Arno River Basin Authority.
    We invite everyone interested in contributing to the development of this extension, or in using it, to follow our blog or subscribe to our newsletter; we also recommend taking a look at our GeoSolutions Enterprise Support Services packages for attentive, qualified support when putting this extension into production. Likewise, have a look at the information on our other Open Source products: GeoServer, MapStore, GeoNode and GeoNetwork.
    The GeoSolutions team,

    by simone giannecchini at March 06, 2017 01:39 PM

    gvSIG Team

    gvSIG is nominated at the maximum category of the “Sharing & Reuse Awards” given by the European Commission

    As the title of this post says, the track record of the gvSIG Project has been recognised by the European Commission with a nomination in the top category of the "Sharing & Reuse Awards".

    The General Manager for Communication and Information Technologies, Vicente Aguiló, has announced that "the gvSIG project, born at the Generalitat, has been selected by the European Commission as a finalist for the Best open source software solution in the cross-border category" at the first edition of the Sharing & Reuse Awards.

    The project will compete in the category with the broadest reach, the international one, together with three other finalist proposals. In the end the Commission nominated a total of 17 projects, after evaluating 118 proposals from all over the European Union, across the cross-border, national, regional and local categories.

    The Community's executive branch created the Sharing & Reuse Awards in recognition of the modernisation of public administrations in Europe, through the development of e-Government solutions that, thanks to open source software, can be reused by other organisations.

    The final results will be announced on March 29th 2017 in Lisbon, at the Sharing & Reuse Conference 2017, whose slogan is "Solving the European IT puzzle together". Whatever the outcome, being among the 4 nominated projects is already a recognition of the gvSIG project and everything that has been built around it.

    The event, in its first edition, will bring together the international community of experts in open source software and public administration, as well as representatives of the European institutions, to debate the advantages of sharing and reusing IT solutions in the public sector.

    The Communication and Information Technologies General Management (DGTIC) will present the gvSIG project at the Lisbon conference, together with the other finalist proposals, which come from Germany, Austria, Belgium, Spain, Finland, France, Greece, the Netherlands and the Czech Republic.

    The Communication and Information Technologies General Manager of the Generalitat has highlighted that "this nomination is a recognition of the track record of a project that has been developed both inside and outside the Administration, and that has turned the Valencian Community into an international reference for geolocation through the use of open source software".

    Vicente Aguiló recalled that "the project was born at the Generalitat to create a geographic information system based on open source code; once up and running, it was released to an international community of developers who now form the gvSIG Association, of which we are part".

    You can consult all the information about these awards and nominated projects in each category in:


    See you in Lisbon!

    Filed under: english, gvSIG Desktop, premios, press office, software libre Tagged: European Comission

    by Mario at March 06, 2017 01:23 PM

    Fernando Quadro

    10 years of GeoServer in Brazil

    Since mid-2016, at the invitation of Jody Garnett, I have been writing on the GeoServer blog (in English). For my second post, I decided to recount a little of the 10 years that the GeoServer-BR community is celebrating in 2017.

    I am immensely proud to know that it all began in 2007, with the course I taught at the III ENUM (National Meeting of MapServer Users) in Brasília/DF.

    Over the years, it has been very gratifying to see how widespread GeoServer is in Brazil and how widely it has been adopted by companies of every sector and size, as well as by the government agencies that made it the official map server of the INDE (National Spatial Data Infrastructure).

    I would like to thank Boundless, the company that maintains GeoServer, as well as everyone who has contributed in some way over these 10 years to the growth and dissemination of GeoServer in Brazil.

    by Fernando Quadro at March 06, 2017 10:30 AM

    March 05, 2017

    Paulo van Breugel

    Exporting rasters to Mbtiles using GDAL

    Web maps are generally made up of many small, square images called tiles, which are placed side by side to create the illusion of one very large seamless image [for a good explanation, see here]. Tile-based maps can be made up of a great many tiles, and loading all of them at once would be inefficient and … Continue reading Exporting rasters to Mbtiles using GDAL
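    For the record, the core GDAL recipe is short: GDAL (2.1 and later) ships an MBTILES driver, so the conversion is essentially a gdal_translate call plus overviews for the lower zoom levels. The flag choices below are illustrative, not the post's exact commands:

    ```shell
    # Reproject to web mercator first (MBTiles viewers expect EPSG:3857)
    gdalwarp -t_srs EPSG:3857 jotunheimen.tif jotunheimen_3857.tif

    # Convert the raster to an MBTiles file
    gdal_translate -of MBTILES jotunheimen_3857.tif jotunheimen.mbtiles

    # Build overview levels so that lower zooms also have tiles
    gdaladdo -r average jotunheimen.mbtiles 2 4 8 16
    ```

    Without the gdaladdo step the MBTiles file only contains tiles at the maximum zoom level.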

    by pvanb at March 05, 2017 11:24 PM

    gvSIG Team

    gvSIG nominated in the top category of the "Sharing & Reuse Awards" given by the European Commission

    As the title of this post says, the track record of the gvSIG project has been recognised by the European Commission with a nomination in the top category of the awards called the "Sharing & Reuse Awards".

    The General Manager for Information and Communication Technologies, Vicente Aguiló, announced today that "the Generalitat's gvSIG project has been selected by the European Commission as a finalist to compete for the Best Cross-border Open Source Solution award", in the first edition of the Sharing & Reuse Awards.

    The gvSIG project will compete in the category with the broadest reach, the international one, together with three other finalist proposals. The Commission ultimately nominated a total of 17 projects, after evaluating 118 proposals from all corners of the European Union, across the international (cross-border), national, regional and local categories.

    The Community's executive created the Sharing & Reuse Awards in recognition of the modernisation of public administrations in Europe, through the development of e-Government solutions that, thanks to the use of open source software, can be reused by other organisations.

    The final result will be announced on March 29th in Lisbon, within the framework of the Sharing & Reuse Conference 2017, whose slogan is "Solving the European IT puzzle together". Whatever the outcome, being among the 4 nominated projects is already a recognition of the gvSIG project and everything that has been built around it.

    The event, in its first edition, will bring together the international community of experts in open source software and public administration, as well as representatives of the European institutions, to debate the advantages of sharing and reusing IT solutions in the public sector.

    The Directorate General for Information and Communication Technologies (DGTIC) will present the gvSIG project at the Lisbon conference, together with the other finalist proposals, which come from Germany, Austria, Belgium, Spain, Finland, France, Greece, the Netherlands and the Czech Republic.

    The Generalitat's head of Information and Communication Technologies highlighted that "this nomination is a recognition of the track record of a project that has been developed both inside and outside the Administration, and that has turned the Valencian Community into an international reference for geolocation through the use of open source software".

    Vicente Aguiló recalled that "the project was conceived at the Generalitat with the idea of creating a geographic information system based on open source code and, once up and running, it was released into the hands of an international community of developers who now form the gvSIG Association, of which we are part".

    You can consult all the information about these awards and the nominated projects in each category at:


    See you in Lisbon!

    Filed under: gvSIG Desktop, premios, press office, software libre, spanish Tagged: comisión europea

    by Alvaro at March 05, 2017 07:50 PM

    From GIS to Remote Sensing

    Update: Semi-Automatic Classification Plugin for QGIS 3

    QGIS is being updated, and version 3 should arrive in the second half of this year. This update will improve many aspects of QGIS, in particular through the upgrade to Python 3 and Qt 5.
    Consequently, all plugins must migrate to Python 3 and Qt 5, and adapt to the API breaks, in order to work in QGIS 3.

    I have updated the Semi-Automatic Classification Plugin (SCP) to version 5.99, which runs in QGIS 3 (but not in QGIS 2). The tools in this version are the same as in SCP version 5, but all functions run with Python 3 and Qt 5, and are adapted to the new QGIS APIs.
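    To give a flavour of what such a migration touches, here are a few typical Python 2 to 3 changes of the kind plugin authors run into. This is a hypothetical illustration, not actual SCP code:

    ```python
    # Illustrative Python 2 -> 3 changes of the kind a QGIS plugin
    # migration involves (hypothetical snippet, not actual SCP code).

    bands = {"red": 1, "nir": 4}

    # Python 2: bands.iteritems() -- removed in Python 3, use items()
    pairs = sorted(bands.items())

    # Python 2: 7 / 2 == 3 -- in Python 3 "/" is true division, "//" is floor
    half = 7 // 2

    # Python 2: unicode required u"" literals -- in Python 3 str is unicode
    label = "riflettanza"

    print(pairs, half, label)
    ```

    On top of these language changes come the Qt 4 to Qt 5 import moves (e.g. widgets leaving QtGui) and the QGIS API breaks themselves.
    
    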

    SCP 5.99 running in QGIS 2.99

    by Luca Congedo (noreply@blogger.com) at March 05, 2017 10:51 AM

    March 04, 2017

    Bjorn Sandvik

    Creating a TIN from a raster DEM

    NB! This blog post will constantly change until I find a good open source solution to create a Triangulated Irregular Network (TIN) from a Digital Elevation Model (DEM). Would you like to help? Please add a comment below!

    NEW! Read the first test of the TIN capabilities of SAGA GIS.

    People have already helped on Twitter, and I'll include some of these suggestions in this post.

    My example DEM of Jotunheimen in Norway can be downloaded here (144 MB GeoTIFF). This is the same dataset I've used previously for my terrain mapping experiments with three.js and Cesium.

    The goal now is to turn this raster DEM into a nice triangulated irregular network (TIN) optimised for 3D rendering. 

    The dream solution would be a command line tool (part of GDAL?) that can turn a raster DEM into an optimised TIN.
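    While that dream tool doesn't exist yet, a baseline is easy to sketch: subsample the DEM grid and run a plain Delaunay triangulation over the points with SciPy. This is a naive, hypothetical helper for illustration, not the optimised, error-driven decimation the post is after:

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    def dem_to_tin(dem, cellsize=1.0, step=10):
        """Build a naive TIN from a 2-D DEM array by sampling every
        `step`-th cell and Delaunay-triangulating the points in the XY plane."""
        rows, cols = dem.shape
        ys, xs = np.mgrid[0:rows:step, 0:cols:step]
        pts_xy = np.column_stack([xs.ravel() * cellsize, ys.ravel() * cellsize])
        z = dem[ys.ravel(), xs.ravel()]
        tri = Delaunay(pts_xy)                   # triangulate in 2-D
        vertices = np.column_stack([pts_xy, z])  # attach elevations
        return vertices, tri.simplices           # (N, 3) verts, (M, 3) triangles

    # Example on a small synthetic DEM
    dem = np.fromfunction(lambda r, c: np.sin(r / 20) * np.cos(c / 20), (100, 100))
    verts, faces = dem_to_tin(dem, cellsize=10.0, step=10)
    print(verts.shape, faces.shape)
    ```

    An optimised TIN would instead place more vertices where the terrain is rough and fewer on flat areas; uniform subsampling like this keeps the triangle count down but ignores terrain error entirely.
    
    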

    Open source candidates: 

    GIS StackExchange

    Commercial tools: 

    by Bjørn Sandvik (noreply@blogger.com) at March 04, 2017 11:59 AM

    March 03, 2017

    GeoSpatial Camptocamp

    Conference GeoPython 2017

    After a successful GeoPython conference in 2016, the 2017 edition of the GeoPython Conference will take place from May 8 to 10 in Basel/Muttenz, Switzerland.

    The article Conference GeoPython 2017 appeared first on Camptocamp.

    by camptocamp at March 03, 2017 09:55 AM

    gvSIG Team

    On the road to gvSIG 2.4: a new set of plugins, JGrasstools, Epanet, Geopaparazzi…

    From HydroloGIS comes the first version, ready for testing, of a good number of new features whose final version will be available in gvSIG 2.4.

    In practice, gvSIG Desktop users will be able to enjoy dozens of new tools, organised into the following blocks:

    JGrasstools Spatial Toolbox

    JGrasstools Spatial Toolbox shows a new toolbox with geoprocesses of all kinds, adding to the more than 350 geoprocesses already available in gvSIG.


    Epanet

    A plugin to connect to the software called "Epanet", which supports the analysis of drinking water distribution systems. The program is in the public domain and is developed by the United States Environmental Protection Agency (better known by the acronym EPA).


    Geopaparazzi

    As you all know, Geopaparazzi is a free GIS for Android devices, oriented towards field data collection. Through this plugin, users can work across both applications, gvSIG Desktop and Geopaparazzi.

    New tools

    A plugin containing a varied set of tools and utilities that will make life easier for gvSIG users. The new tools available are: Raster Styler, Position Info Tool, WKT Geometry Tool, Projection tool and Feature browser.

    The plugins you need to activate to start testing these tools are contained in a single file with the gvspks extension. This file type allows several gvSIG plugins to be packaged together. You can install it through the "Add-ons manager", using the "Installation from file…" option. Once installed, you will see a set of new plugins ready to be activated.

    You can find all the information about these new features in the following document:


    And if you are ready to try out all these new features, download the package containing the plugins at:


    Filed under: Geopaparazzi, gvSIG Desktop, spanish Tagged: epanet, GRASS, JGRASS, JGrassTools

    by Alvaro at March 03, 2017 06:31 AM

    March 02, 2017

    gvSIG Team

    On the road to gvSIG 2.4: JGrasstools and Geopaparazzi plugins for gvSIG Desktop

    Good news! First release of the JGrasstools and Geopaparazzi plugins for gvSIG 👏👏👏

    JGRASS Tools in gvSIG Desktop:


    Geopaparazzi in “Add layer” tool:


    More info in: http://jgrasstechtips.blogspot.com.es/2017/03/first-release-of-jgrasstools-and.html

    Filed under: english, Geopaparazzi, gvSIG Desktop Tagged: Geoprocessing, GRASS, JGRASS

    by Alvaro at March 02, 2017 04:30 PM

    Andrea Antonello

    First release of the JGrasstools and Geopaparazzi plugins for gvSIG

    They are finally here. We were able to finalize, test and package the JGrasstools and Geopaparazzi plugins for gvSIG and make a first official release of them.

    It is a first shy 0.1.0, but it contains tons of functionalities ready to be tested by the community.

    To download and install the plugins, please have a quick look at this document.

    Not much more to say, everything is in the linked quickstart guide... enjoy!

    by andrea antonello (noreply@blogger.com) at March 02, 2017 03:51 PM