Welcome to Planet OSGeo

February 20, 2017

GeoTools Team

GeoTools 16.2 Released

The GeoTools team is pleased to announce the release of GeoTools 16.2.
This release is also available from our maven repository.

This release is made in conjunction with GeoWebCache 1.10.2 and GeoServer 2.10.2.

GeoTools 16.2 is the latest stable release of the 16.x series and is recommended for all new projects.

Features and Improvements

  • Graduate YSLD module to supported status
  • Implement Cylindrical Equal Area Projection
  • Relax visibility of StyledShapePainter to allow override of vector fill in subclasses 

Bug Fixes

  • Improve label positioning when using follow line vendor option
  • Fix CRS.getCoordinateOperationFactory scalability bottleneck
  • Make GridCoverageRenderer turn nodata/out-of-ROI pixels transparent before rendering onto Graphics2D
  • Various ImageMosaic optimizations and bugfixes
And more! For more information please see the release notes (16.2 | 16.1 | 16.0 | 16-RC1 | M0 | beta).

About GeoTools 16

  • The wfs-ng module is now a drop-in replacement for gt-wfs and will replace it
  • The NetCDF module now uses NetCDF-Java 4.6.6

by Torben Barsballe (noreply@blogger.com) at February 20, 2017 10:00 PM

Fernando Quadro

eBook: Open Source no Brasil

In this report, its author, Andy Oram, explores the various trends in business, education, and public policy that have contributed to the current state of open source activity in Brazil. You will discover the country's open source community, its free software movements, the involvement of business and the workforce, and issues related to education.

Despite its problems (government corruption, public health issues, and high crime rates), Brazil is still one of the most vibrant economies in Latin America. With its strong extraction, manufacturing, and service industries, IT in Brazil is booming as companies seek to digitize their operations. Technology startups are also emerging, and free and open source software is everywhere.

You can download the eBook for free from the O'Reilly website and learn a bit more about how foreigners see our country. Worth the read!

by Fernando Quadro at February 20, 2017 06:59 PM

gvSIG Team

Learning GIS with Game of Thrones (VI): Hyperlink and other information tools

Today we are going to see information tools, focusing on learning to use the “Hyperlink” tool.

There are 4 main information tools: information by point, consulting area, consulting distance, and hyperlink. We could add others, such as “Google Street View”, which allows us to view images from that Google service… although there aren't any Google cars driving around the Game of Thrones landscape yet.

These 4 tools are available from the toolbar:


The first three tools are very intuitive, and you can test them with just a brief explanation of how they work.

Information by point: it gives us information about the element we click on, as long as its layer is active. It shows a window with that element's values from the attribute table. For example, if the “Locations” layer is selected and we click on the point that represents “King’s Landing”, the following window opens:


The consulting area and consulting distance tools work in a similar way. Once the tool is selected, we click on the View and see information about area and perimeter in one case, and about partial and total distance in the other. This information is shown at the bottom of the screen, in the status bar (where we can also see other information such as scale, coordinates or units).


The hyperlink tool is more complex, because its settings have to be configured beforehand in the layer “Properties”. Let's look at a practical example:

Following on from the previous post, “Editing Tables”, we are going to add a series of links to websites about the houses of Game of Thrones. They will be added to the “Web” field of the attribute table of the “Political” layer:

The results will look similar to these:


Now we are going to tell the layer that the “Web” field contains links to websites.

To open the Layer properties window, we right-click on the layer name in the Table of Contents, or, with the layer active, use the “Layer/Properties” menu.


In the new window we go to the “Hyperlink” tab, which is the one we are interested in now.

We press “Enable hyperlink”, and we select the “Web” field and the “Link to text and HTML files” action.


Now we can close this window by clicking the “Accept” button and start using the hyperlink button on the “Political” layer.

What happens when we click on an element? A browser opens (which, by the way, will be improved in the next version) showing the web page indicated in the attribute table. In this case we get the information about each house. For example, when we click on “The North” kingdom it links to the information about House Stark:


Now we are going to create another type of hyperlink, one that opens an image stored on our computer. In our case we will see the shield of each house, which you can download from this zip file.

To do that, we first start editing mode on the “Political” layer and add the path to the images on your computer in the “Shield” field. For example:

  • /home/alvaro/Escritorio/Shields/Arryn.PNG
  • /home/alvaro/Escritorio/Shields/Baratheon.PNG
  • /home/alvaro/Escritorio/Shields/Greyjoy.PNG
  • /home/alvaro/Escritorio/Shields/Martell.PNG
  • /home/alvaro/Escritorio/Shields/NightsWatch.PNG
  • /home/alvaro/Escritorio/Shields/Stark.PNG
  • /home/alvaro/Escritorio/Shields/Tully.PNG
  • /home/alvaro/Escritorio/Shields/Lannister.PNG
  • /home/alvaro/Escritorio/Shields/Targaryen.PNG
  • /home/alvaro/Escritorio/Shields/Tyrell.PNG

The table will look like this:


Just as we did before, we define the hyperlink settings, indicating that the field is “Shield” and the action is “Link to image files”:


If we select the “Hyperlink” tool, each time we click on an element of the “Political” layer, a new window appears with the shield of the corresponding House. That way, if we click on “The Westerlands” we see the Lannister shield:


And as we always pay our debts, we invite you to read the next post of this peculiar GIS course.

Filed under: english, gvSIG Desktop, training Tagged: area measure, distance measure, Game of Thrones, hyperlink, Information

by Mario at February 20, 2017 03:41 PM

From GIS to Remote Sensing

Brief Introduction to Remote Sensing

This post is about basic definitions of GIS and Remote Sensing, which are included in the user manual of the Semi-Automatic Classification Plugin.
In particular, the following topics are discussed:
  • Basic Definitions
  • GIS definition
  • Remote Sensing definition
  • Sensors
  • Radiance and Reflectance
  • Spectral Signature
  • Landsat Satellite
  • Sentinel-2 Satellite
  • ASTER Satellite
  • MODIS Products
  • Color Composite
  • Principal Component Analysis
  • Pan-sharpening
  • Spectral Indices
  • Supervised Classification Definitions
  • Land Cover
  • Supervised Classification
  • Training Areas
  • Classes and Macroclasses
  • Classification Algorithms
  • Spectral Distance
  • Classification Result
  • Accuracy Assessment
  • Image conversion to reflectance
  • Radiance at the Sensor’s Aperture
  • Top Of Atmosphere (TOA) Reflectance
  • Surface Reflectance
  • DOS1 Correction
  • Conversion to Temperature
  • Conversion to At-Satellite Brightness Temperature
  • Estimation of Land Surface Temperature

by Luca Congedo (noreply@blogger.com) at February 20, 2017 09:00 AM

gvSIG Team

The Atlas of Urban Expansion to be presented at the Ateneo de Valencia

Next Wednesday, March 8, at the Ateneo Mercantil de Valencia, our colleague Manuel Madrid will present “El Atlas de Expansión Urbana” (The Atlas of Urban Expansion) as part of the activities organized by the “Amigos del Mapa” group. The gvSIG Association took part in this project together with UN-Habitat and New York University.

If you have the chance to attend, don't miss it. The conclusions of this work are enlightening about how our cities are expanding and the problems arising from that expansion.


Filed under: events, Projects, spanish Tagged: Análisis, expansión urbana, urbanismo

by Alvaro at February 20, 2017 08:53 AM

gvSIG Team

Learning GIS with Game of Thrones (IX): Exporting a View to an image

gvSIG has tools for designing more or less complex layouts, but in many cases we just need a quick image of the current extent of a gvSIG View and nothing more, for example to use that image in a document we are writing.

Today we will look at a very simple but very useful tool for when we want an immediate image of our View.

To run it, simply go to the “View/Export/Export View to image” menu. A new window will appear where we just indicate where to save the image file and in which format (jpg, png, bmp or tiff).

A simple and useful tool, often unknown to gvSIG users.

Filed under: gvSIG Desktop, spanish, training Tagged: Captura pantalla, Exportar, Imagen, Juego de tronos

by Alvaro at February 20, 2017 08:40 AM

Geomatic Blog

Aggregating points: JSON on SQL and loops on infowindows

NOTE: I’ll use CARTO but you can apply all this to any webmapping technology backed by a modern database.

Get all the data

So we start with the typical use case where we have a one to many relationship like this:

    select e.cartodb_id,
           l.cartodb_id as location_id
      from locations l
inner join employees e
        on e.location = l.location
  order by l.location

Easy peasy, we have a map with many stacked points. From here you can jump to this excellent post by James Milner about dense point maps. My example is not about having thousands of scattered points that overlap at certain zoom levels. Mine is a small set of locations with many points “stacking” on them. In this case you can do two things: aggregate or not. When you aggregate you pay a price in readability: reducing all your data to those locations, maybe using visual variables to show counts, averages, or other aggregated values, and finally trying to use the interactivity of your map to complete the picture.

So at this point we have something like this map, no aggregation yet, but using transparency we can see where CARTO has many employees. We could also use a composite operation instead of transparency to modify the color of the stacked points.

Stacking points using transparency

Aggregate and count

OK, let’s do a GROUP BY on the geometry and an aggregation like counting. At least now we know how many people are at each location, but that’s all; we lose the rest of the details.

    select l.the_geom_webmercator,
           min(e.cartodb_id) as cartodb_id,
           count(1) as counts
      from locations l
inner join employees e
        on e.location = l.location
  group by l.the_geom_webmercator

Grouping by location and counting

Aggregate one field

But in my case, with CARTO, we have PostgreSQL at hand, so we can do way more than that. PostgreSQL has many cool features, and handling JSON types is one of them. Mix that with the fact that almost all template systems for front-end applications allow you to iterate over JavaScript objects, and you have a winner here.

So we can combine the json_agg function with MustacheJS iteration over objects to allow rendering the names of our employees.

    select l.the_geom_webmercator,
           min(e.cartodb_id) as cartodb_id,
           json_agg(e.firstname) as names, -- JSON aggregation
           count(1) as counts
      from locations l
inner join employees e
        on e.location = l.location
  group by l.the_geom_webmercator,l.location

And this bit of HTML and Mustache template to create a list of employees we can add to the infowindow template:

<ul style="margin:1em;list-style-type: disc;max-height:10em;">
{{#names}}<li class="CDB-infowindow-title">{{.}}</li>{{/names}}
</ul>

List of employees on the infowindow

We could do this without JSON types, composing all the markup in the SQL statement, but that would generate quite a lot of content to move to the front-end and, of course, make the whole thing much harder to maintain.

Aggregate several fields

At this point we could repeat the same function for the rest of the fields, but we would need to iterate over them separately. It would be better to create JSON objects with all the content we want to keep in a single output field that we can iterate over in our infowindow. With PostgreSQL we can do this with the row_to_json function, nesting an inner SELECT to give the properties names. We could use row_to_json(row(field1,field2,..)) directly, but then our output fields would have generic names.

    select l.the_geom_webmercator,
           min(e.cartodb_id) as cartodb_id,
           count(1) as counts,
           json_agg(row_to_json((
             SELECT r
               FROM (
                 SELECT photourl as photo,
                        coalesce(preferredname,firstname,'') as name
             ) r
           ),true)) as data
      from solutions.bamboo_locations l
inner join solutions.bamboo_employees e
        on e.location = l.location
  group by l.the_geom_webmercator,l.location
  order by counts asc

With this query we now have a data field containing an array of objects with the display name and picture URL for each employee. It is now easy to compose a simple infowindow where you can see the faces and names of my colleagues.

<div style="column-count:3;">
{{#data}}
<span style="display:inline-block;margin-bottom:5px;">
  <img style="height:35px;" src="{{photo}}"/>
  <span style="font-size:0.55em;">{{name}}</span>
</span>
{{/data}}
</div>


Adding pictures and names
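The shape of the data field this query produces can also be mimicked client-side; here is a minimal Python sketch of the same one-to-many aggregation (the rows and their values are made up for illustration):

```python
from itertools import groupby
from operator import itemgetter

# Made-up flat rows, as they would come back from the employees/locations join.
rows = [
    {"location": "A", "photo": "ana.png", "name": "Ana"},
    {"location": "A", "photo": "bea.png", "name": "Bea"},
    {"location": "B", "photo": "carl.png", "name": "Carl"},
]

# Equivalent of GROUP BY location with count(1) and json_agg(row_to_json(...)):
rows.sort(key=itemgetter("location"))
aggregated = []
for location, items in groupby(rows, key=itemgetter("location")):
    group = list(items)
    aggregated.append({
        "location": location,
        "counts": len(group),
        "data": [{"photo": r["photo"], "name": r["name"]} for r in group],
    })
```

Each entry of `aggregated` carries one feature per location plus a list of per-employee objects, which is exactly what the infowindow template iterates over.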

That’s it. You can do even more if you retrieve all the data directly from your database and render on the frontend, for example if you use D3 you probably can do fancy symbolizations and interactions.

One final note: if you use UTF grids (as in these maps with CARTO), you need to be conservative with the amount of content you put in your interactivity, because with medium and big datasets it can make your maps slow and too heavy for the front-end. In those cases you may want to switch to interactivity that works like the WMS GetFeatureInfo workflow, where you retrieve the information from the backend when the user clicks on the map, instead of retrieving everything when loading your tiles.

Check the map below and how the interactions show the aggregated contents. What do you think of this technique? Any other procedure to display aggregated data that you think is more effective?

Filed under: CARTO, cartography, GIS, SQL, webmapping

by Jorge at February 20, 2017 07:00 AM

February 19, 2017

Ivan Minčík

Unique jobs at Land Information New Zealand

I have been working at Land Information New Zealand (LINZ) for almost a year now, and my finding is that New Zealand is a truly unique country, unlike any other. Over time I have slowly realized that LINZ, and especially the part where I work, Location Information, is also very positively unique in its culture and people.
There is a giant area of land and sea we care about, stretching from New Zealand across the South-West Pacific to Antarctica. We run a unique free data publishing service. We use and contribute to a lot of Open Source software such as QGIS, PostGIS, GDAL, Python and Linux. We have unique managers doing Debian packaging. We sing Maori songs every Friday. We have numerous running clubs. We face unique natural challenges, and Kiwis are still the most optimistic people around the globe.

The great news is that if you want to know what I am talking about, there is a unique opportunity. We are hiring for two very interesting positions: DevOps Database Developer and Spatial IT Solutions Developer.

Have a look and send us your CV.

by Ivan Minčík (noreply@blogger.com) at February 19, 2017 11:58 PM


QGIS Grants #2: Call for Grant Proposals 2017

Dear QGIS Community

Last year we held our first ever call for Grant Proposals and it was a great success. If you are an early adopter using QGIS 3.0 preview builds, you can already try out some of the new capabilities that have arrived in QGIS thanks to these grants.

We are very pleased to announce the second round of Grants is now available to QGIS Contributors. The deadline for this round is Sunday, 19 March 2017. All the details for the Grant are described in the application form, and for more context we encourage you to also read these articles:

We look forward to seeing all your great ideas about how to improve QGIS!

Tim Sutton

QGIS Project Chair

by Tim Sutton at February 19, 2017 07:03 PM

Paul Ramsey

Super Expensive Cerner Crack-up at Island Health

Kansas City, we have a problem.

Super Expensive Cerner Crack-up at Island Health

A year after roll-out, the Island Health electronic health record (EHR) project being piloted at Nanaimo Regional General Hospital (NRGH) is abandoning electronic processes and returning to pen and paper. An alert reader forwarded me this note from the Island Health CEO, sent out Friday afternoon:

The Nanaimo Medical Staff Association Executive has requested that the CPOE tools be suspended while improvements are made. An Island Health Board meeting was held yesterday to discuss the path forward. The Board and Executive take the concerns raised by the Medical Staff Association seriously, and recognize the need to have the commitment and confidence of the physician community in using advanced EHR tools such as CPE. We will engage the NRGH physicians and staff on a plan to start taking steps to cease use of the CPOE tools and associated processes. Any plan will be implemented in a safe and thoughtful way with patient care and safety as a focus.
Dr. Brendan Carr to all Staff/Physicians at Nanaimo Regional General Hospital

This extremely expensive back-tracking comes after a year of struggles between the Health Authority and the staff and physicians at NRGH.

After two years of development and testing, the system was rolled out on March 19, 2016. Within a couple months, staff had moved beyond internal griping to griping to the media and attempting to force changes through bad publicity.

Doctors at Nanaimo Regional Hospital say a new paperless health record system isn’t getting any easier to use.

They say the system is cumbersome, prone to inputting errors, and has led to problems with medication orders.

“There continue to be reports daily of problems that are identified,” said Dr. David Forrest, president of the Medical Staff Association at the hospital.
– CBC News, July 7, 2016

Some of the early problems were undoubtedly of the “critical fault between chair and keyboard” variety – any new information interface quickly exposes how much we use our mental muscle memory to navigate both computer interfaces and paper forms.

IHealth Terminal & Trainer

So naturally, the Health Authority stuck to their guns, hoping to wait out the learning process. Unfortunately for them, the system appears to have been so poorly put together that no amount of user acclimatization can save it in the current form.

An independent review of the system in November 2016 has turned up not just user learning issues, but critical functional deficiencies:

  • High doses of medication can be ordered and could be administered. Using processes available to any user, a prescriber can inadvertently write an order for an unsafe dose of a medication.
  • Multiple orders for high-risk medications remain active on the medication administration record resulting in the possibility of unintended overdosing.
  • The IHealth system makes extensive use of small font sizes, long lists of items in drop-down menus and lacks filtering for some lists. The information display is dense making it hard to read and navigate.
  • End users report that challenges commonly occur with: system responsiveness, log-in when changing computers, unexplained screen freezes and bar code reader connectivity
  • PharmaNet integration is not effective and adds to the burden of medication reconciliation.

The Health Authority committed to address the concerns of the report, but evidently the hospital staff felt they could no longer risk patient health while waiting for the improvements to land. Hence a very expensive back-track to paper processes, and then another expensive roll-out process in the future.

This set-back will undoubtedly cost millions. The EHR roll-out was supposed to proceed smoothly from NRGH to the rest of the facilities in Island Health before the end of 2016.

This new functionality will first be implemented at the NRGH core campus, Dufferin Place and Oceanside Urgent Care on March 19, 2016. The remaining community sites and programs in Geography 2 and all of Geography 1 will follow approximately 6 months later. The rest of Island Health (Geographies 3 and 4) will go-live roughly 3 to 6 months after that.

Clearly that schedule is no longer operative.

The failure of this particular system is deeply worrying because it is a failure on the part of a vendor, Cerner, that is now the primary provider of EHR technology to the BC health system.


When the IBM-led EHR project at PHSA and Coastal Health was “reset” (after spending $72M) by Minister Terry Lake in 2015, the government fired IBM and turned to a vendor they hoped would be more reliable: EHR software maker Cerner.

Cerner was already leading the Island Health project, which at that point (mid-2015) was apparently heading to a successful on-time roll-out in Nanaimo. They seemed like a safe bet. They had more direct experience with the EHR software, since they wrote it. They were a health specialist firm, not a consulting generalist firm.

For all my concerns about failures in enterprise IT, I would have bet on Cerner turning out a successful if very, very, very costly system. There’s a lot of strength in having relevant domain experience: it provides focus and a deep store of best practices to fall back on. And as a specialist in EHR, a failed EHR project will injure Cerner’s reputation in ways a single failed project will barely dent IBM’s clout.

There will be a lot of finger-pointing and blame shifting going on at Island Health and the Ministry over the next few months. The government should not be afraid to point fingers at Cerner and force them to cough up some dollars for this failure. If Cerner doesn’t want to wear this failure, if they want to be seen as a true “partner” in this project, they need to buck up.

Cerner will want to blame the end users. But when data entry takes twice as long as paper processes, that’s not end users’ fault. When screens are built with piles of non-relevant fields, and poor layouts, that’s not end users’ fault. When systems are slow or unreliable, that’s not end users’ fault.

Congratulations British Columbia, on your latest non-working enterprise IT project. The only solace I can provide is that eventually it will probably work, albeit some years later and at several times the price you might have considered “reasonable”.

February 19, 2017 04:00 PM

Jackie Ng

React-ing to the need for a modern MapGuide viewer (Part 12): A positive cascading effect

The move to Jest for our testing/coverage needs has opened up some opportunities that were previously roadblocks.

Mainly, we can finally upgrade to Webpack 2. Previously we were blocked because the karma runner just wouldn't work with Webpack 2 configurations. Unlike earlier attempts with the Webpack 2 beta releases, this upgrade was less painful, and more importantly, the bundle size remained the same.

OpenLayers also recently released 4.0.0, which includes experimental ES2015 modules. The ES2015 modules enable a "pay for only what you use" model, which is great for us since we don't want the kitchen sink, only the parts of the library we actually use. It turns out, based on their webpack example, that this requires Webpack 2, as Webpack 1 will include the modules verbatim, causing most browsers to blow up on the various ES2015 language constructs (like imports).

Well, how convenient that we just upgraded to Webpack 2! After switching over to the new ol package and its ES2015 modules, making the required fixes in our codebase, and checking the final production bundle size, the results show promise.

That is 150kb smaller than our current production bundle! Once other libraries we're using adopt ES2015 modules, we can expect even more weight loss.

by Jackie Ng (noreply@blogger.com) at February 19, 2017 01:31 PM

February 18, 2017

OSGeo News

Orfeo ToolBox 5.10 is released!

by jsanz at February 18, 2017 06:50 PM

OSGeo News


by jsanz at February 18, 2017 06:36 PM

OSGeo News

ACM SIGSPATIAL 2017 - Call for Participation

by jsanz at February 18, 2017 06:31 PM


Postgres Information Functions

Postgres contains a wealth of functions that provide information about a database and the objects within it. The System Information Functions page of the official documentation provides a full list. There is a huge number of functions covering a whole host of information, from the current database session to privileges and function properties.


Find an object's oid

A lot of the info functions accept the Object Identifier Type for objects in the database. This can be obtained by casting to regclass (also described in the oid docs) and then to oid:

select 'schema_name.relation_name'::regclass::oid;

Where relation_name is a table, view, index, etc.

View definition

select pg_get_viewdef('schema_name.view_name'::regclass::oid);

Or in psql you can use one of the built in commands:

\d+ schema_name.view_name

Function definition

Returns the function definition for a given function. Many built-in functions don't reveal much, as they are not written in SQL, but for those that are you'll get the complete create function statement. For example, to view the definition of the PostGIS st_colormap function:

select pg_get_functiondef('st_colormap(raster, integer, text, text)'::regprocedure);


Privileges

A whole host of functions exists to determine privileges for schemas, tables, functions, etc. Some examples:

Determine if the current user can select from a table:

select has_table_privilege('schema_name.relation_name', 'select');

Note: the docs state that "multiple privilege types can be listed separated by commas, in which case the result will be true if any of the listed privileges is held". This means that in order to test a number of privileges it is normally better to test each privilege individually, as select has_table_privilege('schema_name.relation_name', 'select,update'); would return t even if only select is granted.
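The pitfall can be illustrated outside the database; here is a minimal Python sketch, where the privilege map is hypothetical and stands in for what has_table_privilege would report:

```python
# Hypothetical privileges held by the current user on some table.
held = {"select": True, "update": False}

# has_table_privilege('t', 'select,update') is true if ANY listed
# privilege is held -- the combined form behaves like any():
combined = any(held[p] for p in ("select", "update"))

# Testing each privilege individually gives the real picture:
individual = {p: held[p] for p in ("select", "update")}

print(combined)    # True: looks like both privileges are fine
print(individual)  # {'select': True, 'update': False}: update is not granted
```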

Determine if a user can use a schema:

select has_schema_privilege('schema_name', 'usage');

by walkermatt at February 18, 2017 07:55 AM

February 17, 2017

Jackie Ng

React-ing to the need for a modern MapGuide viewer (Part 11): I don't say this in jest

... but seriously, Jest completes my holy trinity of web front-end development nirvana.

  • React, because of its high performance and revolutionary component-based way of building frontend UIs. I could never go back to jQuery, data-binding, string templating and those other primitive ways of building frontends.
  • TypeScript, because it is, in my opinion, the only sane programming language for frontend development. Just like jQuery was the glue that held together inconsistent browser APIs for many years, TypeScript is the glue that lets us play with future JS technologies in the present. TypeScript is JavaScript with C#-quality static typing. I love that a whole class of errors is eliminated through a simple compile step. I can't fathom having to maintain large code bases in a dynamically-typed language. TypeScript brings order and sanity in that regard. And with TypeScript 2.0, I don't have to deal with billion dollar mistakes.
  • And finally, Jest which I believe to be the total package for JavaScript unit testing that is sooooooo easy to set up! Code coverage is also included.
Before I tried Jest, the unit test suite for mapguide-react-layout was a convoluted stack of karma, mocha, and chai.

I also tried to get code coverage working, but this required a tool called istanbul, and because my code was TypeScript it needed a TypeScript source map plugin for istanbul to recognise it. The result was a rube-goldberg-esque contraption that formed the foundation of my unit test suite and didn't even get the coverage right! So I scrapped the code coverage part and just coasted along with the karma/mocha/chai combo until now.

With the introduction of Jest, it does the job of karma/mocha/chai/istanbul in a single, easy-to-install package. Only some small changes to my unit test suite were required (porting chai assertions to their Jest equivalents) and the whole test suite was passing. With the simple addition of a --coverage flag to jest, it then automagically generates code coverage results and reports.

Since I am now getting code coverage reports, the obvious next step was to upload them to the coveralls.io service. It turns out TravisCI already supports automatic upload of code coverage reports to coveralls; it just needed node-coveralls installed and the jest coverage output piped to it.

And with that, I get another shiny badge to brandish on the project home page

This is almost too easy. Results like this easily incentivize you to write good quality code.

One last thing before I close out this post. The 63% coverage figure is a bit of a misnomer. It turns out it is actually the percentage of code covered in the modules the unit tests currently touch, which makes sense. The moment I start bringing other components and classes under test, I expect this percentage to plummet, which is merely incentive to write more tests to bump it back up.

by Jackie Ng (noreply@blogger.com) at February 17, 2017 01:01 PM

gvSIG Team

Third edition of the international Cátedra gvSIG contest for academic work using free geomatics


As several media outlets have already reported, the Universidad Miguel Hernández (UMH) has launched the call for the third edition of the Cátedra gvSIG contest for work carried out with free Geographic Information Systems.

This third edition of the contest is launched with the aim of promoting the use of free geomatics in university and pre-university education, encouraging users of gvSIG and of free Geographic Information Systems in general to share their work and give it visibility.

The prizes are open to secondary school and vocational training students or graduates, university students or graduates, and university professors and researchers from all countries. Entrants may participate collectively or individually, submitting their work in English, Spanish, or Valencian.

Among the selected works, a prize of 500 euros will be awarded in each of the following categories:

  • Work by secondary school (Bachillerato) or vocational training students.
  • University final degree project (Licentiate, Bachelor's, Master's).
  • Doctoral thesis or research work.

More and more university work uses gvSIG as a fundamental part of its research. If you are part of this community, go ahead and submit your proposal to the Cátedra gvSIG contest.

More information here.

And for those who are curious, here is the video from the Radio UMH news program that covered the contest:

Filed under: Geopaparazzi, gvSIG Desktop, gvSIG Mobile, gvSIG Online, premios, press office, Projects, software libre, spanish Tagged: Cátedra, Concurso, geomática, Tesis, trabajo fin de grado, Universidad

by Alvaro at February 17, 2017 10:25 AM

February 15, 2017

Jackie Ng

Announcing: mapguide-react-layout 0.8

Here's a new release of mapguide-react-layout.

Here's what's new in this release.

Multiple Map Support

If you load an Application Definition with multiple map groups, the viewer now supports them properly.

Thanks to the use of redux (as my previous blog adventure post explained), the state of each map is nicely isolated, which makes it easy for components and commands to be aware of multiple maps, as is the case with the measure component (notice how recorded measurements switch along with the active map).

Also, for Task Pane content, we added some smarts so that you know whether the current Task Pane content is applicable to the current active map.

Other Changes

  • Update Blueprint to 1.9.0
  • Update React to 15.4.2
  • Improved performance of redux aware components to avoid unnecessary re-rendering
  • Sidebar Template: Fix a small sliver of the Task Pane content visible when collapsed
  • Legend: Fix infinite loop on maps with multiple (>2) levels of group nesting
  • Hover styles no longer render for disabled toolbar items
  • Clicking an expanded panel in an accordion no longer collapses it (an expanded panel should only be collapsed by clicking another collapsed panel). This affects viewer templates that use accordions (e.g. Slate)
  • Added support for InvokeURL command parameters
  • Fix default positioning of modal dialogs


by Jackie Ng (noreply@blogger.com) at February 15, 2017 01:32 PM

Tom Kralidis

OSGeo Daytona Beach Code Sprint 2017 redux

I attended the 2017 OSGeo Code Sprint last week in Daytona Beach. Having put forth a personal sprint workplan for the week, I thought it would be useful to report back on progress. pycsw: there was lots of discussion on refactoring pycsw's filter support to enable NoSQL backends. While we are still in discussion, this […]

by tomkralidis at February 15, 2017 12:32 PM

gvSIG Team

New CartoCiudad geocoder developed by the gvSIG Association


The new CartoCiudad geocoder was announced today: a considerably improved service that delivers better results than its predecessor. The gvSIG Association is pleased to echo this news, having taken part in its development together with Scolab, one of the Association's partner companies.

CartoCiudad is a collaborative project for producing and publishing nationwide spatial data through web services. It contains information on the continuous road network (streets with house numbers and roads with kilometre markers), urban cartography and toponymy, postal codes, and census districts and sections.

The CartoCiudad project is led and coordinated by the Instituto Geográfico Nacional (IGN). It is generated from official data of the IGN, the Dirección General del Catastro, Grupo Correos and the Instituto Nacional de Estadística, with contributions from the autonomous communities of the Basque Country, Navarre, the Valencian Community, La Rioja, the Balearic Islands and Andalusia.

The new application developed by the gvSIG Association supports both direct and inverse geocoding. To obtain coordinates from an address, the new service can geolocate both an urban address and a kilometre point on a road. The service can also look up an address using the names of entities smaller than the municipality, thanks to the IGN's reference data on populated places.
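Direct and inverse geocoding are mirror operations: address → coordinates, and coordinates → nearest address. A toy sketch of the two directions (the gazetteer data below is made up for illustration and has nothing to do with the real CartoCiudad service or its API):

```python
import math

# Hypothetical gazetteer: address -> (longitude, latitude)
gazetteer = {
    "Calle Mayor 1, Madrid": (-3.7074, 40.4169),
    "Plaza del Ayuntamiento 1, Valencia": (-0.3763, 39.4699),
}

def geocode(address):
    """Direct geocoding: look an address up and return its coordinates."""
    return gazetteer.get(address)

def reverse_geocode(lon, lat):
    """Inverse geocoding: return the address closest to the given point."""
    return min(gazetteer,
               key=lambda a: math.dist(gazetteer[a], (lon, lat)))

print(geocode("Calle Mayor 1, Madrid"))  # (-3.7074, 40.4169)
print(reverse_geocode(-0.38, 39.47))     # the Valencia entry
```

A real service does the same two lookups against a national gazetteer, with fuzzy matching instead of exact keys.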

As a new feature, the service can geolocate cadastral references, obtaining parcel coordinates through the SOAP street-directory service and the non-protected cadastral data of the Dirección General del Catastro.

The CartoCiudad project viewer already uses this new application in its search and routing window.

Details on how to use the new service will be published shortly in the web services technical guide.

More information on the IDEE blog.

Filed under: geoportal, gvSIG Association, IDE, press office, Projects, software libre, spanish Tagged: cartociudad, cálculo de rutas, directa, geocodificación, geolocalización, inversa, referencias catastrales

by Alvaro at February 15, 2017 11:09 AM

February 14, 2017

gvSIG Team

Learning GIS with Game of Thrones (VIII): Field calculator

The “field calculator” is one of the most used tools among GIS users when it comes to editing a layer's attributes, thanks to its versatility and the time it saves when editing several records at once.

It allows different kinds of calculations on the fields of a table. The tool can run on all the records of a table or only on the selected ones.

Let's see how it works with some simple exercises on our Game of Thrones data. But before starting, a look at its interface.


  1. Information. Provides information about the selected “Field” or “Command”.
  2. Field. List of the table's fields. Double-clicking a field adds it to the expression to be applied.
  3. Type. The list of available “Commands” is updated according to the selected type.
  4. Commands. List of commands available for the selected “Type”. Double-clicking a command adds it to the expression to be applied.
  5. Expression. The operation that will be applied to the selected field. The expression can also be written directly.

Theory done, let's move on to our practical exercise.

First we open the attribute table of the “Locations” layer which, if you have been following all the exercises, now has 7 columns. One of the existing fields is “type”, which contains the location types (city, castle, ruin, town, other).

Imagine we want to add a new column holding the location type in Spanish. We could do it manually, as we saw in the “Editing tables” post, but thanks to the “Field calculator” we can do this exercise much faster.

Following the steps we learned in the “Editing tables” post, we put the table into editing mode and add a string (“String”) column, leaving the default number of characters (50). We will call this new column “Tipo”. We could leave the “Default value” empty, but to save filling-in time we enter Otro (without quotes), so that every record is automatically filled with that value. Now we only have to update the remaining values.

At this point the table looks like this:

Now we will use the “Select by attributes” tool to select the rows with each value of the “type” field in turn, and the field calculator to fill in the selected rows with the corresponding value automatically.

At this point, if you don't know how to use the “Select by attributes” tool, review the post where we explained how it works.

Let's start by selecting all rows whose “type” is “Castle”:

Once they are selected, we click the header of the “Tipo” field (it is shown in dark grey).

We run the “Field calculator” tool, available from the “Table/Field calculator” menu and its corresponding button.

A new window opens, in which we can write the expression “Castillo” to fill in the cells. It is important to note that text values must go between double quotes.

Clicking “OK” fills in the “Tipo” cells of the selected rows:

We repeat the same operation for the remaining values of the “type” field: first select the rows, then fill in the data with the field calculator:

  • Type “City” = Tipo “Ciudad”
  • Type “Ruin” = Tipo “Ruina”
  • Type “Town” = Tipo “Pueblo”
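Conceptually, the select-then-fill workflow above is just a value mapping applied to the matching rows. A rough Python illustration (the row data is hypothetical and this is not gvSIG's API):

```python
# Hypothetical attribute table: each row is a dict, as in the "Locations" layer.
# The "Tipo" column starts out with the default value "Otro".
rows = [
    {"name": "Winterfell",     "type": "Castle", "Tipo": "Otro"},
    {"name": "King's Landing", "type": "City",   "Tipo": "Otro"},
    {"name": "Mole's Town",    "type": "Town",   "Tipo": "Otro"},
    {"name": "Moat Cailin",    "type": "Ruin",   "Tipo": "Otro"},
    {"name": "The Eyrie",      "type": "Castle", "Tipo": "Otro"},
]

# "Select by attributes" + "Field calculator", rolled into one mapping.
# Types not listed keep the default "Otro".
translations = {"Castle": "Castillo", "City": "Ciudad",
                "Ruin": "Ruina", "Town": "Pueblo"}

for row in rows:
    row["Tipo"] = translations.get(row["type"], row["Tipo"])

print([row["Tipo"] for row in rows])
# ['Castillo', 'Ciudad', 'Pueblo', 'Ruina', 'Castillo']
```

The field calculator does the same thing one selection at a time; the mapping just collapses the repeated select-and-fill passes into a single loop.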

Once we finish, we stop editing and save the changes. Our table will look like this:

The “Field calculator” is very powerful and supports complex expressions. We recommend experimenting with it to learn everything it can do. Until the next post…

Filed under: gvSIG Desktop, spanish, training Tagged: Calculadora de campos, Editar tablas, Juego de tronos, selección por atributos

by Alvaro at February 14, 2017 11:00 PM

Geomatic Blog

How a daily digest of geospatial links is distributed

TL;DR If you are interested in getting a daily digest of geospatial links, subscribe to this mailing list or this Atom feed. Take «daily» with a grain of salt.

Over the last six years Raf Roset, one of my favourite geonerds out there, has been sending all the cool stuff he finds about our geospatial world to the Barcelona mailing list on the OSGeo mailman server. He started circa 2011 sending one link per mail, but on 2013-04-03 he started making a daily digest. A gun burst in Spanish is called a ráfaga, so the joke was right at hand when someone proposed calling those digests that way.

Time passed; in September 2014 I asked Raf to send them to the Valencia mailing list as well, since most people there understand Catalan and the content was too good to be enjoyed only by our beloved neighbours. Finally, in January 2015, I decided to start translating them into Spanish and sending them to the Spanish and Seville mailing lists too.

Then in May I joined CARTO, and @jatorre thought it would be a good idea to send them to the whole company mailing list, so after some weeks I stopped translating them into Spanish. Since that day I only do it in English, trying to follow Raf's lead every day, translating his mails and forwarding them to the CARTO internal mailing list and the rest of the OSGeo ones.

Also, in June I decided to put those mails on a simple website, so the Ráfagas would also be accessible on GitHub as a static Jekyll website, and anyone could use the Atom feed to reach them.

Final chapter: in July I decided to create a dedicated mailing list just for people who are only interested in receiving those digest mails, obviously thinking of a broader audience, not just my fellow friends from Spain. I think at some point I will stop sending them to the Spanish lists, because the Ráfagas normally don't spark any discussion and I'm sending the same message to three lists. To be fair, they sometimes do provoke discussions on the CARTO mailing list. By the way, I'm almost certain the whole team has a filter to move them to their archives and they think I'm just an annoying spammer (a couple of times I've changed the subject just to troll them xDDD).

To conclude, here is my daily Ráfagas routine:

  • Raf is an early bird and sends the digest in the morning. I copy the contents into a shared Google Doc, where a group of collaborators helps me translate the content. It may not seem like a lot of effort, but doing this every single day takes a team. Really.
  • I open my favourite text editor, put the translated content into a new file, and start a local server to check that the website renders properly.
  • If everything is OK, I copy the rendered content and send it to the CARTO and OSGeo mailing lists.
  • I commit and push to the GitHub repo so the website is updated along with the feed.
  • I archive Raf's mail from my inbox.

Creating a Ráfaga

That’s it. Raf you are a formidable example of perseverance and I hope you’ll have the energy to keep giving us all those contents for many years. Thanks mate!

Filed under: CARTO, cartography, GIS

by Jorge at February 14, 2017 09:40 PM

gvSIG Team

gvSIG Online in the Mapping journal special issue on the Jornadas Ibéricas de Infraestructuras de Datos Espaciales


Mapping, one of the most renowned technical-scientific journals in Geomatics and Earth Sciences, devotes its issue 180 to the last Jornadas Ibéricas de Infraestructuras de Datos Espaciales (JIIDE), including among its selected papers an article on gvSIG Online, the open-source platform for SDIs and a fundamental part of the gvSIG suite of solutions.

You can read it at the following link:


More and more organizations are adopting gvSIG Online. If you are interested too, you can contact us at info@gvsig.com. Freedom and professionalism to set up your Spatial Data Infrastructure and corporate GIS.

Filed under: events, geoportal, gvSIG Online, IDE, software libre, spanish Tagged: gvSIG Suite, INSPIRE, LISIGE

by Alvaro at February 14, 2017 05:14 PM

gvSIG Team

Geopaparazzi Code Sprint and…first image of gvSIG Mobile 2.0

If you are interested in Mobile GIS, the future of Geopaparazzi and the first version of the all-new, all-different gvSIG Mobile, you must read this post….

Filed under: development, english, Geopaparazzi, gvSIG Mobile Tagged: gvSIG Suite

by Alvaro at February 14, 2017 04:43 PM

Volker Mische

An R-tree implementation for RocksDB

It's long been my plan to implement an R-tree on top of RocksDB. Now there is a first version of it.

Getting started

Check out the source code from my RocksDB rtree-table fork on GitHub, then build RocksDB and the R-tree example.

git clone https://github.com/vmx/rocksdb.git
cd rocksdb
make static_lib
cd examples
make rtree_example

If you run the example it should output augsburg:

$ ./rtree_example
augsburg

For more information about how to use the R-tree, see the Readme file of the project.


The nice thing about LSM-trees is that the index data structures can be bulk loaded. For now my R-tree just does a simple bottom-up build with a fixed node size (4KiB by default). The data is pre-sorted by the low value of the first dimension. This means the data has a total order, and hence results are also sorted by the first dimension. The idea is based on the paper On Support of Ordering in Multidimensional Data Structures by Filip Křižka, Michal Krátký and Radim Bača.
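The bottom-up build can be sketched in a few lines. This is a simplified Python illustration of the idea, not the actual C++ implementation; a fixed fan-out stands in for "as many entries as fit in a 4KiB node":

```python
# Bottom-up bulk loading of an R-tree from rectangles pre-sorted by the
# low value of the first dimension. Simplified sketch with a fixed fan-out.

FANOUT = 4  # stand-in for the number of entries that fit in a 4KiB node

def mbr(rects):
    """Minimum bounding rectangle of (xmin, ymin, xmax, ymax) tuples."""
    return (min(r[0] for r in rects), min(r[1] for r in rects),
            max(r[2] for r in rects), max(r[3] for r in rects))

def bulk_load(rects):
    """Build the tree level by level, bottom up."""
    rects = sorted(rects, key=lambda r: r[0])  # total order on first dimension
    level = [{"box": r, "children": None} for r in rects]  # leaf entries
    while len(level) > 1:
        parents = []
        for i in range(0, len(level), FANOUT):
            group = level[i:i + FANOUT]
            parents.append({"box": mbr([n["box"] for n in group]),
                            "children": group})
        level = parents
    return level[0]

root = bulk_load([(2, 0, 3, 1), (0, 0, 1, 1), (5, 5, 6, 6),
                  (8, 1, 9, 2), (4, 4, 5, 5)])
print(root["box"])  # the root MBR covers all input rectangles
```

Because the leaves are packed in sorted order, a scan of the leaf level returns results ordered by the first dimension, which is the property the paper exploits.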

The tree is far from optimal, but it is a good starting point. Currently only doubles are supported. In the future I'd like to support integers, fixed size decimals and also strings.

If you have a look at the source code and cringe at the coding style, feel free to submit pull requests (my current C++ skills are surely limited).

Next steps

Currently it's a fork of RocksDB, which surely isn't ideal. As I already mentioned in last year's FOSS4G talk about the R-tree in RocksDB (warning: autoplay), there are several possibilities:

  • Best (unlikely): Upstream merge
  • Good: Add-on without additional patches
  • Still OK: Be an easy to maintain fork
  • Worst case: Stay a fork

I hope to work together with the RocksDB folks to find a way to make such extensions easily possible with no (or minimal) code changes, perhaps by having stable interfaces or classes that can easily be overridden.

by Volker Mische at February 14, 2017 02:54 PM

gvSIG Team

Learning GIS with Game of Thrones (V): Editing tables

We are going to continue with the introduction to GIS course using Game of Thrones. In this post we review the alphanumeric editing tools. Using the “Political” layer, which contains the kingdoms of the continent called “Westeros”, we will complete the original alphanumeric information with the motto of each reigning house and two fields that will let us see (in a later post) how the “Hyperlink” tool works.

Are you ready?

Once we have opened our project, we activate the “Political” layer and open its attribute table, as we saw in the “Tables” post. This table contains 3 fields: id, name (the name of the kingdom) and ClaimedBy. We will now start editing and add three more fields.

To start editing we go to the “Table/Start editing” menu or press the corresponding button:

If you have the View visible, you will see that the name of the layer (“Political”) is now shown in red, indicating that the layer is in editing mode.


We are going to add the three columns one by one. There are several ways to do it; we will use the easiest one, the tool in the “Table/Add column” menu or its corresponding button:

When we press the button, a new window appears where we can define: field name, type, length (maximum number of characters), precision (only for numeric fields) and default value (optional; cells will be empty if we leave it blank).


The values for the three new fields are:

  • Name: Words, Type: String, Length: 50
  • Name: Shield, Type: String, Length: 100
  • Name: Web, Type: String, Length: 100

Once the three fields are added, our table will look like this one:


Now we can start filling in the cells with the data for each kingdom. For that, we only have to double-click the corresponding cell and start typing.

For the “Words” field we will add the following motto for each of the reigning houses:

  • Tully: “Family, Duty, Honor”

  • Stark: “Winter is Coming”

  • Greyjoy: “What Is Dead May Never Die”

  • Martell: “Unbowed, Unbent, Unbroken”

  • Baratheon: “Ours is the Fury”

  • Arryn: “As High as Honor”

  • Lannister: “A Lannister Always Pays His Debts”

  • Targaryen: “Fire and Blood”

  • Tyrell: “Growing Strong”

The result will look similar to this:


As mentioned, we will work with the other two fields in a later post about hyperlinks. So we leave the table's editing mode from the “Table/Stop editing” menu or its corresponding button:

Before finishing, it's worth mentioning a tool that allows us to edit the alphanumeric values of a layer directly from the View. It can sometimes save time in our data-updating tasks.

To try it, from our View and with the “Political” layer activated, we press the “Attribute editor” button:

To use it, we click on the element to edit and a new window opens with its alphanumeric attributes, which we can modify.

Try it and see how it works. To finish, press the “Finish editing” button in that window.

See you in the next posts to continue learning…

Filed under: gvSIG Desktop

by Mario at February 14, 2017 12:22 PM

Andrea Antonello

Wrap up of the geopaparazzi code sprint in Valencia

This will be a bit long and a bit developer-oriented. But it might be a good read for anyone interested in the future of geopaparazzi.

Last week we met up with the guys from Scolab to investigate, develop and plan future geopaparazzi activities.

We had way too big an agenda, but we were positive we could do at least part of it:
  • investigate a map renderer upgrade (mapsforge 0.7.0 or Nasa World Wind Android)
  • make geopaparazzi pluggable to allow easier branding and customization
  • investigate the possibility to use forms also for spatialite layers/features

Investigation of a map renderer upgrade

This is not exactly strategic at the moment, but it would be good to have. Geopaparazzi now has support for some basic feature editing, and the current workarounds to make this work are not all that nice.

So we gave Nasa World Wind a good test. NWW is easy to understand, nicely coded, and allows for a clean integration of mapping tools. It has built-in support for WMS and it really looks as if it has all we need. There are a few problems though. It only seems to support the WGS84 geographic projection. We tried to implement a mapsforge offline maps provider, which worked out well, but it was impossible to finalize due to the missing Mercator projection.

All the created code is available on my NWW clone. You can find the simple create-lines tool here and the mapsforge integration here.

Another problem with the NWW project is the low response rate on the forum. It is an open source project, so we can't blame it, but it surely has an impact on the choice. I tried to post two questions on the Android support forum, but there has been no reaction at all. It really is a pity, because we would love to use that project as the next geopaparazzi renderer.

We also gave mapsforge 0.7.0 (the current version) a go. There are a ton of examples available in the demo app. One problem is that the app crashes constantly while switching between examples. The other is that the API has changed a lot from the version we are using in geopaparazzi, which means we would have to start from scratch.

We had to stop here due to time constraints. We now have more insight, but no clear ideas at all. This is major work that needs to be done at some point, but it can't be done without the proper resources. Should we have them at some point, this investigation will surely help us get started.

Geopaparazzi plugins system

We then started to investigate ways to make plugins installable from the Play Store. This proved to be quite difficult and resource-demanding when trying to keep plugins thin and generic.

So we decided to take a first intermediate step. We created a plugin system based on intent services and libraries. This means it is possible to package a version of geopaparazzi that is branded and offers functionality that the official version does not.

Branding isn't actually a plugin, but it falls into the same pot of customization, so I will quickly explain it.


To brand geopaparazzi with your own name and style, it is now possible to create a simple, minimalistic Android application.

Previously, the geopaparazzi application was completely contained inside the Android module named geopaparazzi.app, while only the reusable code pieces were in their own modules:
  • geopaparazzilibrary
  • geopaparazzimapsforge
  • geopaparazzimarkerslib
  • geopaparazzispatialitelibrary
Now the logic of the app has been moved to the geopaparazzi_core module, while a minimalistic app wrapper is contained in the geopaparazzi.app module.

Looking into the wrapper module shows that there is only one class, containing:

public class GeopaparazziActivity extends GeopaparazziCoreActivity {
}


This means we just extend the main class, and that is it.

In the same module we can define the app name and a custom style, as well as a custom icon for the app.

This minimalistic module makes it possible to maintain your own branded app in a very simple way.

Sure, plugins are necessary to make it really yours. :-)

Since the refactoring was in progress, we also decided to make the core module less dependent on the company that gave birth to the project, HydroloGIS. This module was the last one containing the eu.hydrologis.geopaparazzi namespace, which was changed to eu.geopaparazzi to better stress the importance of openness. So if you were depending on this code, you will need to change your imports, removing the reference to hydrologis.


As written before, the plugin system is based on intent services. During this code sprint we created the first 2 extension points, which allow the import and export menus and actions to be customized.

There is now a folder named plugins, which contains the available import and export plugins. If you do not include them in your app, the import and export views will be empty.

Have a look at the simple plugins created in the plugins folder; it is quite easy to create one. If you need help, please write to the geopaparazzi mailing list.

gvSIG Mobile

The result of this first implementation of the plugins and branding is the first version of gvSIG Mobile.

While geopaparazzi will always exist and be developed, gvSIG Mobile is the app maintained by the gvSIG Association as the mobile solution of their stack:

As you can see from the screenshots, it is quite simple to brand the app with a custom style and name.

Right now geopaparazzi and gvSIG Mobile are very similar, but this will change with the use of the plugin system. There is already one big difference: gvSIG Mobile has the possibility to synchronize spatialite databases with gvSIG Online, which makes it possible to centralize data surveys.

Soon Scolab will also add the possibility to synchronize geopaparazzi projects to gvSIG Online to create online projects. This will make surveying even more fun and simple.

With time and resources (based on the jobs we do around this) we will slowly add extension points to provide dashboard actions, context menu entries and even map tools.

Forms for spatialite layers

This should have been an investigation of the effort needed to allow the use of geopaparazzi's complex forms for spatialite layers as well. It should also be possible to create a tool for simple form creation in gvSIG Online.

Sadly we didn't even have time to talk about this during the code sprint; time was too short.

Wrap up

It has been good to sit down with other developers and work together on common goals for a tiny project like geopaparazzi. I see the project growing slowly but constantly, which fills my heart with joy. The creation of the twin gvSIG Mobile is an important step and makes geopaparazzi a first choice in a GIS stack that is used all over the world.

Well, we'll see what the future brings. :-)

A quick image of the first visualization of gvSIG Mobile on an Android phone (one day this image will be important :-D ). With Jose^2 and Alvaro. Cesar is hidden somewhere :-)

by andrea antonello (noreply@blogger.com) at February 14, 2017 08:42 AM

February 13, 2017

gvSIG Team

New support services for gvSIG Desktop

A Geographic Information System is a fundamental tool for any organization that manages geographic information. Adopting open source solutions, an upward trend in all kinds of entities, usually requires training and support services.

gvSIG Desktop is an open source GIS used in more than 160 countries and with a high level of adoption at the corporate level. Beyond the interest in using gvSIG Desktop to analyse and manage geographic information, users need a professional support service that guarantees the resolution of incidents or doubts related to its use, and keeps the technology up to date with the improvements that are published continuously.

Beyond the gvSIG Desktop software itself, it's important to emphasise that the gvSIG Association is responsible for its maintenance and evolution, providing support, development, training and consulting services for this solution, as well as for the gvSIG Suite, the catalogue of open-source geomatics solutions maintained by the gvSIG Association.

Now, in 2017, we launch new packaged support services, at the user level as well as the developer one. With this service, gvSIG Desktop users can keep getting value from the product, always counting on the most advanced version and on the support of the gvSIG Association, the organization responsible for the maintenance and evolution of the application.


What does the gvSIG Desktop annual support include?

Level I, for users:

  • Resolution of user doubts: we offer a support team to solve the doubts and problems found when using gvSIG Desktop. With this technical support, reported incidents get customized monitoring, in communication with users until they are solved. Remote communication or a support portal will be set up, with on-site visits when needed.
  • Service management by a Technical Manager: a technical lead with detailed knowledge of the installations and the client's needs. This specific knowledge of the installation, business and needs of the final user makes it possible to identify, in a precise and optimized way, the support needs and the corrective, preventive and/or evolutionary actions that may be necessary at a given moment, now or in a planned future.
  • Information about new versions, updates and improvements of the software that the gvSIG Association releases periodically. Support for rolling out updates and installing new plugins.
  • Training for users.
  • Discounts on other services offered by the gvSIG Association.

Level II, with development service support, includes:

  • Development packages, sold by hours, oriented to improvements to gvSIG Desktop and the correction of bugs reported by the client. Any request will be evaluated, and the gvSIG Association will tell the client how many hours it would take; once the client confirms, the work will be carried out. The hour packages are: 160, 320, 480 or 1,000 hours.

Contact us: info@gvsig.com

Filed under: Business, development, english, gvSIG Association, gvSIG Desktop, training Tagged: Support

by Mario at February 13, 2017 06:54 PM


FAKE MAPS, very dishonest!!!

An open letter to President Trump

Dear Mr President
I read in this morning's (failing) New York Times that you were pretty keen on maps in your briefing papers.

And while Mr. Obama liked policy option papers that were three to six single-spaced pages, council staff members are now being told to keep papers to a single page, with lots of graphics and maps.

“The president likes maps,” one official said.

Now I recognise that this may be a deliberate attempt by the dishonest press to mislead people and that you may not like maps. But, in case you do like maps, I wanted to give you a bit of insight into some of the sneaky things that those very dishonest cartographers (they even have a long foreign name to confuse people) do to make FAKE MAPS.

So the first thing FAKE MAP makers can do is to change projection from a wholesome projection like this Web Mercator projection which emphasises the size of your hands the US compared to the southern hemisphere

Web Mercator – A few major misconceptions based on this map: Alaska is nearly as large as the continental U.S. Greenland is roughly the same size as Africa. Europe (excluding Russia) is only a bit larger than South America. Antarctica dwarfs all the continents. In reality: Alaska can fit inside the continental U.S. about three times. Greenland can fit inside Africa about 14 times. South America nearly doubles Europe's land mass. Antarctica looks like the second-smallest continent. (source: Business Insider UK)
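The distortion behind these misconceptions is easy to quantify: Mercator stretches both dimensions by sec(latitude), so apparent area grows by sec²(latitude). A quick back-of-the-envelope check of the Greenland-vs-Africa illusion (the rounded areas and the 72° mean latitude for Greenland are my own assumptions):

```python
import math

# Approximate true areas, in millions of km^2
greenland_area = 2.17
africa_area = 30.37

# Mercator inflates area by sec^2(latitude); Africa straddles the
# equator, so its distortion is small and ignored here.
mean_lat_greenland = 72  # degrees, a rough assumption
inflation = 1 / math.cos(math.radians(mean_lat_greenland)) ** 2

apparent_greenland = greenland_area * inflation
print(round(inflation, 1))           # ~10.5x area inflation
print(round(apparent_greenland, 1))  # in the same ballpark as Africa's 30.37
```

So an island 14 times smaller than Africa is drawn at a comparable size, which is exactly the misconception the caption describes.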

    Compare that with this sneaky Gall-Peters projection which is an “equal-area cylindric or cylindrical equal-area projection. It achieved considerable notoriety in the late 20th century as the centerpiece of a controversy about the political implications of map design.” (Source Wikipedia)

    The Peters Projection (via Wikipedia)

    You might prefer a nice US centric projection like this US Centric Map

    Thx to Jason Davies

    Or you could suggest that your map makers read Michael Corey’s guide to map projections for the US

    FAKE MAPS? Which projection ‘accurately’ portrays the United States? Thx to Michael Corey

    So enough about projections, they are pretty technical and can be confusing even for experts.

    Once you have chosen your projection (I wonder how long it will be before a loyal map maker comes up with a Trump projection?) then maps are a good way of presenting a lot of information and enabling you to get a clear view of the subject matter on which you are being briefed. Except that sometimes they aren’t! Before you make any major decisions (think immigration bans, voter registration changes, healthcare, starting a war etc) you might want to read How to Lie with Maps by Mark Monmomier – yes someone has written a whole book about FAKE MAPS, very very dishonest cartographers.

    How to Lie with Maps by Mark Monmonier

    If you are feeling bored on one of those lonely nights in the White House you could also try playing the Redistricting Game which will give you a clue to how someone could win an election without winning the majority of the votes (the answer is more to do with Governor Elbridge Gerry than illegal immigrants). I loved this quote on the home page (even if it did come from a failed, loser, Democrat)

    “The polarization and poisonous atmosphere that have infected the House of Representatives for the past two decades or more can be traced — in large part — to the manner in which district lines are drawn in most states.”

    I would also be remiss not to advise you to normalise your choropleths (no, that is not some obscene abuse; it’s advice from my friend Ken Field that you can read here, here, here and here).
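    In case “normalise your choropleths” sounds mysterious: it means mapping rates rather than raw counts, so that populous districts don’t dominate the picture. A tiny sketch with made-up numbers:

```python
# Hypothetical counts: which district has the bigger problem?
districts = {
    "A": {"cases": 900, "population": 100_000},
    "B": {"cases": 120, "population": 8_000},
}

# Normalise: divide each count by its population to get a rate.
rates = {name: d["cases"] * 1_000 / d["population"]  # per 1,000 people
         for name, d in districts.items()}

# By raw count district A looks worse (900 vs 120);
# by rate district B does (15 vs 9 per 1,000).
print(rates)
```

    Colour the map by `rates` rather than `cases` and the story it tells changes completely.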

    So by now you may be wondering whether getting your daily briefings in the form of a map is such a great idea? Well, on the plus side, lots of people have been making maps that you might not have seen, so I thought I would share a few of the best with you.

    The World According to Donald Trump Thx to Huffington Post and Aaron Nemo

    Thx to Hispanic Market Works

    Thx to Andrea Mann, David Schneider and David Beresford at Huffington Post UK

    Thx to Yanko Tsvetkov via The Independent

    There are a lot of FAKE MAPS out there from those failing very very dishonest cartographers, keep them coming.

    by steven at February 13, 2017 04:38 PM

    From GIS to Remote Sensing

    Webinar by NASA ARSET on Land Cover Classification with Satellite Imagery: Materials Available

    The NASA ARSET (Applied Remote Sensing Training) held a webinar on Land Cover Classification with Satellite Imagery using the Semi-Automatic Classification Plugin for QGIS.

    First, I want to warmly thank NASA ARSET for organizing this webinar, and especially the instructor Cindy Schmidt, who had very kind words for my plugin and the plugin manual.
    It gives me great satisfaction that such an important institution considered using the Semi-Automatic Classification Plugin in a webinar about remote sensing. This inspires me to do even more for the plugin development.

    The demand for the webinar was high. For those who couldn't attend the webinar, the materials and recordings are now freely available.

    Image Credit: NASA/USGS, NASA Earth Observatory

    by Luca Congedo (noreply@blogger.com) at February 13, 2017 11:49 AM

    gvSIG Team

    Learning GIS with Game of Thrones (VII): Adding coordinates to a table

    Today we are going to look at a tool that is as simple as it is useful: it automatically adds the X and Y (or latitude/longitude) coordinates of a point layer. In our case, with fictional cartography in the EPSG:4326 coordinate system (the one used by GPS), it will give us coordinates representing latitude and longitude.

    The point layer we have is “Locations”, on which we are going to try out the tool called “Add X and Y”.

    First we activate the “Locations” layer and open its attribute table (as we saw in the “Tables” post).

    Next we run the tool, either from the “Table/Add measurement/Add X and Y” menu or from the corresponding button:

    We will see how it automatically adds two new columns to the attribute table, containing the coordinate data.
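    Conceptually, the “Add X and Y” tool just copies each point’s geometry into two new attribute columns. A minimal Python sketch of that idea (the rows and coordinates below are made up for illustration; this is not gvSIG’s actual API):

```python
# A point layer as a list of attribute rows, each with a (lon, lat) geometry.
locations = [
    {"name": "King's Landing", "geometry": (61.2, 35.7)},  # made-up coords
    {"name": "Winterfell",     "geometry": (58.9, 55.0)},
]

# "Add X and Y": append the coordinates as two new attribute columns.
for row in locations:
    row["X"], row["Y"] = row["geometry"]  # in EPSG:4326, X=lon, Y=lat

print(locations[0]["name"], locations[0]["X"], locations[0]["Y"])
```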

    Now we can send our dragons to the exact coordinates.

    The next post is coming…

    Filed under: gvSIG Desktop Tagged: adding coordinates, Game of Thrones

    by Alvaro at February 13, 2017 09:43 AM

    February 11, 2017

    Paul Ramsey

    On "Transformation" and Government IT

    “You may find yourself,
    In a beautiful house,
    With a beautiful wife.
    You may ask yourself,
    well, how did I get here?”
    – David Byrne, Once In a Lifetime

    I’ve spilled a lot of electrons over the last 5 years talking about IT failures in the BC government (ICM, BCeSIS, NRPP, CloudBC), and a recurring theme in the comments is “how did this happen?” and “are we special or does everybody do this?”

    The answer is nothing more sophisticated than “big projects tend to fail”, usually because more people on an IT project just add to organizational churn: more reporting, more planning, more reviewing of same, all undertaken by the most expensive managerial resources.

    That being so, why do we keep approving and attempting big IT projects? BC isn’t unique in doing so, though we have our own organizational tale to tell.


    Both the social services Integrated Case Management project (2009) and the Natural Resources Permitting Project (2013) were born out of “transformation” initiatives, attempts to restructure the business of a Ministry around new principles of authority and information flow.

    Even for businesses as structured and regimented as a Department of Motor Vehicles, these projects can be risky. For a Ministry like Children & Families, where the stakes are children’s lives, and the evaluations of the facts of cases are necessarily subjective, the risk levels are even higher.


    Nonetheless, in the mid 2000’s, the Province gave the Ministry of Management Services a new mission, to “champion the transformation of government service delivery to respond to the everyday needs of citizens, businesses and the public sector.” In particular, the IT folks in the Chief Information Officer’s department took this mission in hand. This may be a clue as to why IT projects became the central pivot for “transformation” initiatives.

    Around the same time, the term “citizen centred service delivery” began to show up in Ministry Service Plans. Ministries were encouraged to pursue this new goal, with the assistance of Management Services (later renamed Citizens’ Services). This activity reached a climax in 2010, with the release of Citizens @ The Centre: B.C. Government 2.0 by then Deputy to the Premier Alan Seckel.

    The “government 2.0” bit is a direct echo of hype from south of the border, where in 2009 meme-machine Tim O’Reilly kicked off a series of conferences and articles on “Government 2.0” as a counterpart to his “web 2.0” meme.

    Enthusiasm for technology-driven “government as a platform” resulted in some positive side effects, such as the “open data” movement, and the creation of alternative in-sourced delivery organizations, like the UK Government Digital Service and the American 18F organization in the General Service Administration. In BC the technology mania also wafted over the top levels of civil service, resulting in the Citizens @ The Centre plan.

    As a result, the IT inmates took over the asylum. Otherwise sane Ministries were directed to produce “Transformation and Technology Plans” to demonstrate their alignment with the goals of “Citizens @ the Centre”. Education, Transportation, Natural Resources and presumably all the rest produced these plans during what turned out to be the final years of Premier Gordon Campbell’s rule.

    The 2011 change in leadership from Gordon Campbell’s technocratic approach to the “politics über alles” style of Christy Clark has not substantially reduced the momentum of IT-driven transformation projects.

    Part of this may be a matter of senior leadership personalities: the current Deputy to the Premier, Kim Henderson, was leading Citizens’ Services when it produced Citizens @ The Centre; the former CIO Dave Nikolejsin, an architect of the catastrophic ICM project, remains involved in the ongoing NRPP transformation project, despite his new perch in the Ministry of Natural Gas Development.


    The IT-led do-goodism of “transformation” explains to some extent how systems became the organizing principle for these disruptive projects, but it doesn’t fully explain their awesome size. Within my own professional memory, only 15 years ago, $10M was a surprisingly large IT opportunity in BC. Now I can name half a dozen projects that have exceeded $100M.

    The social services Integrated Case Management project provides an interesting study in how a budget can blow up.

    “Integrated Case Management” as a desirable concept was suggested by the 1996 Gove Report (yes, 1996), leading to an “Integrated Case Management Policy” in 1998, and by 1999 a Ministry working group was examining commercial “off-the-shelf” (COTS) technology solutions. The COTS review did not turn up a suitable solution, and the Ministry buried itself in a long “requirements gathering” process, culminating in a working prototype by 2002. In 2003, an RFP was issued to expand the prototype into a pilot for a few hundred users.

    The 2003 contract to build out the pilot was awarded to GDS & Associates Systems in the amount of $142,800. No, I didn’t drop any zeroes. Six years before a $180,000,000 ICM contract was awarded to Deloitte, the Ministry thought a working pilot could be developed for 1/1000 of the cost.

    What happened after that is a bit of a mystery. Presumably the pilot process failed in some way (would love to know! leave a comment! send an email!) because in 2007 the government was back with a Request to procure a “commercial off-the-shelf” integrated case management solution.

    This was the big kahuna. Instead of piloting a small solution and incrementally rolling out, the government already had plans to roll the solution out to over 5000 government workers and potentially 12000 contractors. And the process was going to be incredibly easy:

    • Phase 1: Procure Software
    • Phase 2: Planning and Systems Integration
    • Phase 3: Blueprint and Configure
    • Phase 4: Implementation
    • Phase 5: Future Implementation

    Here’s where things get confusing. Despite using “off-the-shelf” software to avoid software development risks, and having a simple plan to just “configure” the software and roll it out, the new project was tied to a capital plan for $180M, 1000 times the budget of the custom pilot software from a few years earlier.


    Whereas the original pilot aimed to provide a new case management solution, full stop, the new project seems to have been designed to “boil the ocean”, touching and replacing hundreds of systems throughout the Ministry.


    At this point the psychology of capital financing and government approvals starts to come into play. Capital financing can be hard to get. Large plans must be written and costed, and business cases built to show “return on investment” to Treasury Board.

    One way to gain easy “return” is to take all the legacy systems in the organization, fluff up their annual operating costs as much as possible, bundle them together and say “we’re going to replace all this, for an annual savings of $X, which in conjunction with efficiencies $Y from our new technology and centralized maintenance gives us a positive ROI!”

    I’m pretty sure this psychology applies, since almost exactly the same arguments backstop the “business case” for the ongoing Natural Resource Permitting Project.

    A natural effect of bundling multiple system integrations is to blow up the budget size. This is actually a Good Thing (tm) from the point of view of technology managers, since it provides some excellent resumé points: “procured and managed 9-digit public IT project.”

    A manager who successfully delivers a superb $4M IT project gets a celebratory dinner at the pub; a manager who brings even a terrible $140M IT project to “completion” can write her ticket in IT consulting.

    The only downside of huge IT projects is that they fail to provide value to end-users a majority of the time, and of course they soak the taxpayers (or shareholders in the case of private sector IT failures, which happen all the time) for far more money than they should.

    Can We Stop? Should We?

    We really should stop. The sources of the problems are pretty clear: overly large projects, and heavily outsourced IT.

    Even the folks at the very top can see the problem and describe it, if they have the guts.

    David Freud, former Conservative minister at the UK Department for Work and Pensions, was in charge of “Universal Credit”, a large social services transformation project which included a large, poorly-built IT component – a project similar in scope to our ICM. He had this to say in a debrief about what he learned in the process:

    The implementation was harder than I had expected. Maybe that was my own naivety. What I didn’t know, and I don’t think anyone knew, was how bad a mistake it had been for all of government to have sent out their IT.

    It happened in the 1990s and early 2000s. You went to these big firms to build your IT. I think that was a most fundamental mistake, right across government and probably across government in the western world…

    We talk about IT as something separate but it isn’t. It is part of your operating system. It’s a tool within a much better system. If you get rid of it, and lose control of it, you don’t know how to build these systems.

    So we had an IT department but it was actually an IT commissioning department. It didn’t know how to do the IT.

    What we actually discovered through the (UC) process was that you had to bring the IT back on board. The department has been rebuilding itself in order to do that. That is a massive job.

    The solution will be difficult, because it will involve re-building internal IT skills in the public service, a work environment that is drastically less flexible and rewarding than the one on offer from the private sector. However, what the public service has going for it is a mission. Public service is about, well, “public service”. And IT workers are the same as anyone else in wanting their work to have value, to help people, and to do good.

    The best minds of my generation are thinking about how to make people click ads.
    – Jeffrey Hammerbacher

    When the Obamacare healthcare.gov site, built by Canadian enterprise IT consultant CGI, cratered shortly after launch, it was customer-focused IT experts from Silicon Valley and elsewhere who sprang into action to rescue it. And when they were done, many of them stayed on, founding the US Digital Service to bring modern technology practices into the government. They’re paid less, and their offices aren’t as swank, but they have a mission, beyond driving profit to shareholders, and that’s a motivating thing.

    Government can build up a new IT workforce, and start building smaller projects, faster, and stop boiling the ocean, but they have to want to do it first. That’ll take some leadership, at the political level as well as in the civil service. IT revitalization is not a partisan thing, but neither is it an easy thing, or a sexy thing, so it’ll take a politician with some guts to make it a priority.

    February 11, 2017 04:00 PM

    gvSIG Team

    International Day of Women and Girls in Science: just another day?

    Today is the International Day of Women and Girls in Science, a day that reminds us of something that happens silently every other day of the year.

    If we look back, it is easy to see that women have contributed to science since its beginnings, even though in many cases they have not been recognized for it. Returning to the present, women today still face barriers that prevent them from participating fully in the scientific and technological world, which is undoubtedly one more factor holding back progress towards full gender equality and the empowerment of women.

    In our own field, attending any open-source geomatics conference is enough to see that the situation in our “guild” is no different. In fact, I even think we are not aware of this reality; I remember that on more than one occasion we have proposed holding a debate on the subject at gvSIG conferences, and it has always been perceived as “not necessary”.

    However, this was discussed for the first time at the last gvSIG International Conference, as part of a talk announcing the gvSIG Chair awards. Here is the recording from the moment this topic comes up; it is worth taking a look:

    And I take this opportunity to thank all the women who participate in the gvSIG Community and are a necessary part of building this common project.

    Filed under: opinion, free software Tagged: science, inequality, empowerment, gender equality, women, technology

    by Alvaro at February 11, 2017 10:57 AM

    February 10, 2017

    Tim Waters

    Mapwarper Tutorial & Spatial Humanities Workshop by Lincoln Mullen

    Lincoln Mullen has written a great series of workshops on the Spatial / Digital Humanities over five days. Day 3 is focused around Georectification with Mapwarper.net and is a very good tutorial.
    The full workshop contents are here, and I recommend you to check them out:
    Day 1: Introduction and Setup | Map Literacy | Narrative Maps
    Day 2: Data Maps | QGIS
    Day 3: Georectification | Working with Spatial Data
    Day 4: Deep Maps | From Manuscripts to Maps
    Day 5: Programmatic Maps

    Lincoln is an assistant professor in the Department of History and Art History at George Mason University, working on the history of American religions as a digital historian.

    The link to the Georectification Workshop is here: http://lincolnmullen.com/projects/spatial-workshop/georectification.html

    The screenshot of the tutorial is below the fold.
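    For the curious, the heart of georectification is estimating a transform from scanned-map pixel coordinates to world coordinates using ground control points. Below is a minimal sketch of the idea, fitting an exact affine transform through three made-up control points (real georectifiers such as Mapwarper use more points, least-squares fitting and higher-order transforms):

```python
def affine_from_gcps(src, dst):
    """Fit the affine transform mapping three pixel (col, row) control
    points exactly onto three world (x, y) points, via Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = src

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    A = [[x1, y1, 1], [x2, y2, 1], [x3, y3, 1]]
    dA = det3(A)

    def solve(rhs):  # coefficients for one output axis
        out = []
        for col in range(3):
            M = [row[:] for row in A]
            for r in range(3):
                M[r][col] = rhs[r]
            out.append(det3(M) / dA)
        return out

    a, b, c = solve([p[0] for p in dst])  # world x = a*px + b*py + c
    d, e, f = solve([p[1] for p in dst])  # world y = d*px + e*py + f
    return lambda px, py: (a * px + b * py + c, d * px + e * py + f)

# Three corners of a scanned map pinned to made-up world coordinates:
warp = affine_from_gcps([(0, 0), (100, 0), (0, 100)],
                        [(500.0, 4000.0), (600.0, 4000.0), (500.0, 3900.0)])
print(warp(50, 50))  # centre of the scan lands at (550.0, 3950.0)
```

    Once the transform is known, every pixel of the scan can be resampled into world coordinates, which is exactly what “warping” a map means.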



    by tim at February 10, 2017 07:09 PM

    gvSIG Team

    New support services for gvSIG Desktop

    For any organization that manages geographic information, Geographic Information Systems (GIS) become a fundamental tool. The adoption of open-source solutions, a growing trend in all kinds of organizations, usually requires support and training services.

    gvSIG Desktop is an open-source GIS used in more than 160 countries, with a high degree of corporate adoption. Alongside the growing interest in using gvSIG Desktop to analyze and manage geographic information, many users need a professional support service that guarantees the resolution of incidents or questions related to its use, and that keeps the technology up to date with the improvements that are continuously published.

    Beyond the gvSIG Desktop software itself, it is worth noting that the gvSIG Association is responsible for its maintenance and evolution, providing support, development, training and consulting services around it, as well as around gvSIG Suite, the catalogue of open-source geomatics solutions maintained by the Association.

    In 2017 we are launching new packaged support services, at both the user and development level. With this service, gvSIG Desktop users can maintain the value of the product, always having the most advanced version and the support of the gvSIG Association, the organization responsible for maintaining and evolving the application.


    What does the annual gvSIG Desktop support include?

    Level I, for users:

    • Resolution of user questions: we provide a support team to resolve all questions and problems encountered when using gvSIG Desktop. This technical support includes personalized follow-up of reported incidents, with the communications needed for their resolution. Remote communication mechanisms and a support portal will be set up and, where appropriate, on-site visits will be made.
    • Service management by a Technical Coordinator: a technical lead with detailed knowledge of the customer’s installations and needs. This specific knowledge of the installation, business and needs of the end user makes it possible to identify much more precisely the support needs and the corrective, preventive and/or evolutionary actions that may be required at any given moment, both now and in a planned future.
    • Information about new versions, updates and improvements to the software that the gvSIG Association periodically releases. Support for carrying out updates and installing new plugins.
    • Training for users.
    • Discounts on other services offered by the gvSIG Association.

    Level II, support including development services:

      • Packages of development hours aimed at providing improvements to gvSIG Desktop and fixing bugs required by the customer. Each request will be assessed and the number of hours required communicated to the customer; once the customer confirms, the work will be carried out. Hour packages can be: 160, 320, 480 or 1,000 hours.

    Contact us at info@gvsig.com

    Filed under: Business, gvSIG Association, gvSIG Desktop, spanish Tagged: support

    by Alvaro at February 10, 2017 11:51 AM

    OSGeo News

    "Geospatial Professional Society of the Year" Awarded to OSGeo Foundation

    by jsanz at February 10, 2017 11:46 AM


    AutoForm Plugin for QGIS

    The AutoForm plugin for QGIS automatically sets the edit widget type for the fields of a selected layer based on their data types and foreign keys, saving the user the time they would otherwise spend manually configuring these widgets. For this to work correctly, the layer information must be stored in a PostgreSQL database, and foreign keys must have a constraint rule. The plugin does NOT allow you to set relations; it merely checks for any and takes advantage of them accordingly.


    When you have a database with a lot of tables, there should be a way to enter data quickly without having to configure the input form for each table first. In Switzerland this is a common use case when you import Interlis data with ili2pg based on an official data model. ili2pg automatically adds reference constraints based on the Interlis model, which is a prerequisite for detecting relations in a generic way.

    There is already a plugin with a very similar goal: DataDrivenInputMask, written by Bernhard Ströbl. Maybe it was bad luck, but we always had trouble using it. Once we eventually found out that the PostgreSQL driver for the QtSql library was missing, and another time we failed to use the plugin because we had a socket-based connection to PostgreSQL. In both cases we were stuck in the login dialog without a hint as to why the connection had failed. Another problem is that in some cases the DataDrivenInputMask plugin writes additional metadata into the original database.

    So we decided to make our own proof of concept, hopefully avoiding these problems and keeping the Swiss “ili2pg” use case in mind. A major conceptual difference is that AutoForm uses the built-in input form functionality of QGIS instead of creating custom dialogs as DataDrivenInputMask does.

    Running the plugin

    After you have installed the AutoForm plugin (flagged “experimental”) via the QGIS Plugin repository you should see a new option in the Plugins menu called AutoForm. Clicking on it will give you the option Generate Form. Running this with the layer selected will generate your form. If everything is correctly configured, this next step should be easy.

    • Select a PostGIS layer from the Layers Panel
    • Click on Generate Form

    And that’s it! Now if you toggle editing you should see the changes that were made. In my case, this was the result:



    If your layer had a foreign key reference to one or more other tables, they too are added automatically to the Layers Panel and their values become selectable in the corresponding fields.

    Behind the Scenes

    In order to perhaps help a bit with understanding how this Plugin works, I am adding this section for anyone who is curious. The Plugin follows this process:

    • Check if a layer is selected
    • If a layer is selected, search for any foreign key relationships to other tables
    • If a foreign key relationship to another table is found, load that table into the project and set the corresponding fields to a Value Relation widget type
    • Proceed normally and set the remaining widgets where applicable

    The script first checks whether the field has already been changed, so as to prevent user-made edits or previously set Value Relations from being overwritten. Then it looks at the typeName() of the field to determine which widget type to assign. For example, if the field has the Date type, it uses the calendar widget. It does this for each field in the layer. Should the user wish to make additional changes (or correct any mistakes which might have been made), they can do so in the ‘Fields’ tab of the layer properties.
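    The decision logic described above can be sketched in a few lines of plain Python (the type-to-widget table and the function below are illustrative assumptions, not the plugin’s actual code):

```python
# Hypothetical mapping from PostgreSQL type names to QGIS edit widgets.
WIDGET_FOR_TYPE = {
    "date":    "CalendarWidget",  # date fields get a calendar picker
    "bool":    "CheckBox",
    "int4":    "Range",
    "varchar": "TextEdit",
}

def choose_widget(type_name, already_configured=False, is_foreign_key=False):
    """Mirror the plugin's process: never overwrite user-configured
    fields, prefer Value Relation for foreign keys, otherwise pick a
    widget from the field's data type."""
    if already_configured:
        return None               # leave user-made edits alone
    if is_foreign_key:
        return "ValueRelation"    # referenced table feeds a combo box
    return WIDGET_FOR_TYPE.get(type_name.lower(), "TextEdit")

print(choose_widget("Date"))                          # CalendarWidget
print(choose_widget("varchar", is_foreign_key=True))  # ValueRelation
```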


    This plugin is in an early stage and serves as a proof of concept. There is much left to be done, especially in supporting embedded forms for related tables. We also don’t analyze additional metadata written by ili2pg, which would allow to recognize whether related tables are only lookup tables without the need to embed them as entry forms.

    The source code is available on Github. If you are interested in collaborating or maybe sponsoring improvements to this plugin, please contact us!

    February 10, 2017 07:01 AM

    February 09, 2017

    Fernando Quadro

    Smart Cities – a geographer’s view

    Is this simply the technology world’s equivalent in new packaging? Are we dressing old concepts in new terminology? Or can we really begin to understand how our cities are evolving and support the planners and developers of new cities? The realization that 75% of people globally will be living in cities by 2050 (IBM) demands smart solutions, but the key is that the data feeding them be accurate and actionable. Our responsibility, therefore, is to ensure that the value of location is represented and engaged as early as possible in this process.

    Being smart, from my point of view, means taking the fantastic geospatial data sources from several different repositories, integrated into one system, allowing data to flow and solutions to be developed as natural connections are made between the datasets. This may simply mean dynamically tracking the growing volume of satellite imagery, both open (Sentinel and Landsat) and commercially available, or taking a more detailed view of geospatial data to build a comprehensive understanding of cities’ growth and development. Understanding the view from above is already helping cities consider their past, present and future needs.

    Recent work supporting the World Bank in Tanzania showed exactly that: taking Landsat and Sentinel data to show not only where urban expansion had occurred, but also which areas are growing fastest; matching a dynamic view in an urban environment – in this example, helping direct surveyors to generate new and accurate maps of a growing city.

    Figure 1: Urban growth in Dar es Salaam 2014–2015

    We are able to interpret geographic change within the municipality. The key interest is where change occurs: Figure 1 shows that the central urban sector is not changing significantly, while the rapid growth is in the unplanned development to the north and south, demonstrated by the darker orange sectors.

    The value lies in taking this a step further, showing the dynamically changing environment as new satellite imagery becomes available. Figure 2 illustrates how the urban landscape continued to change rapidly between 2015 and 2016.

    Figure 2: Urban growth in Dar es Salaam 2015–2016

    However, when looking at the same kind of data, in this case satellite imagery in the United Kingdom, our specific interest is in change in the Western environment, and it revolves around change in green space, this time in Milton Keynes.

    Figure 3: Green space change in Milton Keynes

    Smart suggests intelligence, and in these simple examples it is the intelligence of the data, converted into analysis, that presents the results relevant to the questions “where is my city growing?” as opposed to “where is green space being lost in my city?”. Priority and focus dictate what is of greatest concern, but intriguingly the data source is the same.

    For geographers, cartography is the trusted ally in this endeavour. Its power to combine multiple datasets and extract complex information into an elegant presentation makes the map the geographer’s purest visualization tool. In this way, we can and must continue to embrace data that grows and changes every day, for it has never been more important to create clear and easily recognizable maps.

    Smart applications allow us, as a company, to take these steps forward, enabling us to deliver not only a map but also the associated business-intelligence data. In the urban environment, we can give context to a city planner and developer so they can see and assess potential macro-scale activities, which can quickly inform and engage a wider audience through a clear representation of the living cities of the 21st century.

    The faster we turn the complex into the simple, the faster we support the decisions needed in planning and developing new or existing smart cities.

    This is a free translation of the article “Smart Cities – A Geographers view” written by Phil Cooper, CIO and Director of Sterling GEO.

    Source: LinkedIn Pulse

    by Fernando Quadro at February 09, 2017 11:54 AM

    February 08, 2017

    gvSIG Team

    Aprendiendo SIG con Juego de Tronos (VI): Hiperenlace y otras herramientas de información

    Hoy veremos las herramientas de información, centrándonos en aprender a manejar la herramienta de “Hiperenlace”.

    Son 4 las principales herramientas de información: información por punto, consultar área, consultar distancia e hiperenlace. A esas podríamos añadir otras como “Google Street View” que nos permite consultar las imágenes de este servicio de Google…aunque en este caso todavía no hay coches de Google paseando por los paisajes de Juego de Tronos.

    Esas 4 herramientas están accesibles en la barra de botones:


    Las 3 primeras son muy intuitivas y basta comentar su funcionamiento para que comencéis a probarlo.

    Información por punto: nos da información del elemento en el que hagamos clic, estando su capa activa. Nos mostrará una ventana con los valores de ese elemento en su tabla de atributos. Por ejemplo, si teniendo activada la capa “Locations” pulsamos sobre el punto que representa “King’s Landing” (Desembarco del Rey) se nos abrirá la siguiente ventana:


    Las herramientas de consultar área y distancia tienen un funcionamiento similar. Una vez seleccionada la herramienta vamos haciendo clics en la Vista y nos va mostrando en un caso información de perímetro y área, y en otro distancia parcial y total. Esta información aparece en la parte inferior de la pantalla, en la denominada barra de estado (también aparece otra información como escala, unidades o coordenadas).


    La herramienta de hiperenlace es más compleja, ya que requiere previamente definir en las “Propiedades” de la capa la configuración de los hiperenlaces. Vamos a ver su funcionamiento con un ejemplo práctico.

    Following the previous post on "Table Editing", we will add to the "Web" field of the attribute table of the "Political" layer a series of links to web pages with information about the Houses of Game of Thrones:

    The result will look similar to this:


    Now we will tell the layer that it has a field ("Web") that is a hyperlink to a web page.

    To open a layer's Properties window, right-click on the layer name in the Table of Contents, or activate the layer and go to the "Layer/Properties" menu.


    In the window that opens, we go to the "Hyperlink" tab, which is the one we are interested in this time.

    We click "Enable hyperlink", select the "Web" field, and choose the action "Link to text and HTML files".


    Now we can close this window by clicking "OK" and start using the hyperlink button on the "Political" layer.

    What happens each time we click on an element? A browser opens (which, by the way, will be improved in the next version of gvSIG) showing the web page listed in the attribute table, which in this case gives us all the information about each of the Houses. For example, clicking on the kingdom of the North ("The North") links us to the information about House Stark:


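    In essence the tool reads the clicked feature's "Web" value and opens it in a browser. A small Python sketch of that behaviour (the mapping and URLs below are placeholders, not the real attribute values):

```python
import webbrowser

# Hypothetical "Political" features and their "Web" attribute values;
# the real URLs live in the layer's attribute table.
web_field = {
    "The North": "http://example.org/house-stark",
    "The Westerlands": "http://example.org/house-lannister",
}

def follow_hyperlink(feature_name, open_browser=False):
    """Emulate the hyperlink action: look up the clicked feature's
    "Web" value and (optionally) open it in the default browser."""
    url = web_field[feature_name]
    if open_browser:
        webbrowser.open(url)  # what gvSIG does via its embedded browser
    return url
```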
    Now let's create another type of hyperlink: one that opens an image stored on our computer. In our case, the coats of arms of each of the Houses, which you can download as a compressed file from here.

    To do this, we first put the attribute table of the "Political" layer into editing mode and add the path where you keep the images to the "Shield" field. In my case:

    • /home/alvaro/Escritorio/Shields/Arryn.PNG
    • /home/alvaro/Escritorio/Shields/Baratheon.PNG
    • /home/alvaro/Escritorio/Shields/Greyjoy.PNG
    • /home/alvaro/Escritorio/Shields/Martell.PNG
    • /home/alvaro/Escritorio/Shields/NightsWatch.PNG
    • /home/alvaro/Escritorio/Shields/Stark.PNG
    • /home/alvaro/Escritorio/Shields/Tully.PNG
    • /home/alvaro/Escritorio/Shields/Lannister.PNG
    • /home/alvaro/Escritorio/Shields/Targaryen.PNG
    • /home/alvaro/Escritorio/Shields/Tyrell.PNG

    The table will look like this:


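    A typo in any of those paths means the hyperlink silently fails to find its image, so it can be handy to sanity-check them before linking. A small Python sketch using the example paths above (the check itself is an illustration, not a gvSIG feature):

```python
import os.path

# Values entered in the "Shield" field of the "Political" attribute table
shield_field = [
    "/home/alvaro/Escritorio/Shields/Arryn.PNG",
    "/home/alvaro/Escritorio/Shields/Baratheon.PNG",
    "/home/alvaro/Escritorio/Shields/Greyjoy.PNG",
    "/home/alvaro/Escritorio/Shields/Martell.PNG",
    "/home/alvaro/Escritorio/Shields/NightsWatch.PNG",
    "/home/alvaro/Escritorio/Shields/Stark.PNG",
    "/home/alvaro/Escritorio/Shields/Tully.PNG",
    "/home/alvaro/Escritorio/Shields/Lannister.PNG",
    "/home/alvaro/Escritorio/Shields/Targaryen.PNG",
    "/home/alvaro/Escritorio/Shields/Tyrell.PNG",
]

def house_for(path):
    """House name encoded in the filename, e.g. ".../Stark.PNG" -> "Stark"."""
    return os.path.splitext(os.path.basename(path))[0]

shields = {house_for(p): p for p in shield_field}
missing = [p for p in shield_field if not os.path.isfile(p)]  # paths to fix
```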
    Just as we did before, we redefine the hyperlink, indicating that the field is "Shield" and the action is "Link to image files":


    If we try the "Hyperlink" tool now, each time we click on an element of the "Political" layer it will open an image with the coat of arms of the corresponding House. So if we click on "The Westerlands", the Lannister coat of arms appears:


    And since we too always pay our debts, we invite you to the next post in this rather peculiar GIS course.

    Filed under: gvSIG Desktop, spanish, training Tagged: hiperenlace, informacion, Juego de tronos, medir área, medir distancia

    by Alvaro at February 08, 2017 06:07 AM

    February 06, 2017

    Equipo Geotux

    GeoTux turns 10! Here's a recap...

    On February 6, 2017, the GeoTux website turns 10 years old. In this post we look back at what those 10 years have been.



    How did GeoTux start?

    GeoTux is the child of GeoLinUD (Geoinformatics on Linux at Universidad Distrital), a website created by Samuel Mesa during his last semesters of Cadastral Engineering and Geodesy in Bogotá. GeoLinUD's goal was to serve as a source of information and help in the field of Geoinformatics with free software.

    After finishing university and entering the workforce, Samuel wanted to strengthen GeoLinUD and give it more professional backing. He invited several classmates to join the project. Germán Carrillo liked the idea and decided to take a direct part in what the two would later call GeoTux. As 2006 turned into 2007, after several meetings, GeoTux was ready to be published (thanks to the free French hosting service TuxFamily, arranged by Samuel) and promoted among the academic and professional contacts of the two administrators.

    GeoTux was born on Tuesday, February 6, 2007, with the main goal of promoting Geoinformatics with free software. The collaborative website had been ready weeks earlier, but because of the proximity to a date that was special for Samuel, the launch was postponed to the 6th. Promotional material was produced, and very soon the first two posts were published, on Geoinformatics in Python and on building a Shapefile viewer with MapWinGIS, which helped start spreading GeoTux among Spanish-speaking readers.

    In these 10 years...

    2007 was a very active year for GeoTux. In July, Samuel and Germán traveled to Mocoa, Putumayo, to implement the GIS of a NASA indigenous council in Putumayo, an unforgettable experience.

    A little later, between September and October of…


    February 06, 2017 03:33 PM