Welcome to Planet OSGeo

December 19, 2014

Boundless Blog

OGR FDW FTW!

Merry Christmas, nerds! Have you ever wanted to connect your PostGIS database directly to an existing GIS file or database and read from it without importing the data? I have good news, repeat after me: OGR FDW FTW!

(How did this happen? Boundless provides its engineers with “innovation time” to pursue personal technical projects, and this year I chose to blow all my time in one go on this project. Like the idea of innovation time? Join us!)

OGR is the vector subsystem of GDAL, the open source format library. The OGR API lets applications read and write many different formats (Shapefile, Oracle, SDE, MapInfo, PostGIS, FGDB, GML, etc.) without having to import them first.

FDW is the PostgreSQL API for implementing “foreign data wrappers”: virtual tables in the database that are actually connections to remote data files, repositories and servers. There are FDW implementations for connecting to MySQL, Oracle, SQLite, and even flat text files!

FTW is “for the win”! The OGR API is a simple table-oriented read-write library that is practically begging to be hooked up to the PostgreSQL FDW system, exposing OGR data sources as tables in PostgreSQL.

Here’s how it works.

First, go to the source code repository, build and install the ogr_fdw extension.

Next, create the postgis and ogr_fdw extensions:

CREATE EXTENSION postgis;
CREATE EXTENSION ogr_fdw;

Now create a “server”. For a database FDW, the server would be an actual database server somewhere. For the OGR FDW, a server is an OGR connection string: it could be a database server, a directory full of files, or (as in this case) a web service:

CREATE SERVER opengeo
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource 'WFS:http://demo.opengeo.org/geoserver/wfs',
    format 'WFS' );

Now create a “foreign table”. To the database this will look just like a table, but reading it will actually query the remote server.

CREATE FOREIGN TABLE topp_states (
  fid integer,
  geom geometry,
  gml_id varchar,
  state_name varchar,
  state_fips varchar,
  sub_region varchar,
  state_abbr varchar,
  land_km real,
  water_km real )
  SERVER opengeo
  OPTIONS ( layer 'topp:states' );

Now, treat the table like you would any other PostGIS table, and ask it a question in SQL:

SELECT st_area(geom::geography) 
FROM topp_states 
WHERE state_name = 'Missouri';

And the answer comes back: 180863 sq km.

How does it work? The PostgreSQL query fires off an OGR query to the server (in this case, the OpenGeo Suite demo server), which pulls the whole table down; the filtering and the area calculation then happen locally in PostgreSQL.

Could it be better? Sure!

It could push SQL restriction clauses down to the OGR driver, reducing the quantity of data returned from the remote source. For big tables, this will be very important.

It could restrict the number of columns it returns to just the ones needed for the query. This will make things a little faster.

It could allow read/write access to the table, so that INSERT, UPDATE and DELETE queries can also be run. This opens up a whole world of interoperability possibilities: imagine your database being able to directly edit a File Geodatabase on the file system, or an ArcSDE server in a workgroup.
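The first improvement above, pushing restrictions down to the driver, can be sketched in a few lines of Python. Everything here (the rows, the function names) is purely illustrative, not the actual ogr_fdw code:

```python
# Sketch of how a naive FDW evaluates "WHERE state_name = 'Missouri'":
# the remote layer is fetched in full, then filtered locally.
# Rows and values below are made up for illustration.

remote_layer = [
    {"state_name": "Missouri", "land_km": 178414.0},
    {"state_name": "Kansas",   "land_km": 211900.0},
]

def scan_without_pushdown(layer, predicate):
    # Every row crosses the wire; PostgreSQL filters afterwards.
    fetched = list(layer)          # full remote read
    return [row for row in fetched if predicate(row)], len(fetched)

def scan_with_pushdown(layer, predicate):
    # The restriction is evaluated remotely; only matches are returned.
    matches = [row for row in layer if predicate(row)]
    return matches, len(matches)   # fewer rows transferred

pred = lambda r: r["state_name"] == "Missouri"
rows, transferred = scan_without_pushdown(remote_layer, pred)
print(len(rows), transferred)  # 1 row kept, 2 transferred
```

The row counts make the point: without pushdown every remote row crosses the wire; with pushdown only the matches do.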

The biggest limitation of the PostgreSQL FDW system is that it requires a table definition before it can work, so you need a priori knowledge of the table structure to set up your tables. Because this just creates busywork, I’ve also bundled a utility program with the ogr_fdw extension: ogr_fdw_info. The utility reads an OGR data source and layer and returns the SQL you need to create an FDW server and table for reading that layer.

Enjoy wrapping your foreign tables, and enjoy the holidays!

The post OGR FDW FTW! appeared first on Boundless.

by Paul Ramsey at December 19, 2014 05:11 PM

Slashgeo (FOSS Articles)

10th International gvSIG Conference: reports, posters and articles

We would like to inform you of the availability of the presentations, posters and articles presented during the 10th International gvSIG Conference [1], which was held from December 3rd to 5th in Valencia (Spain).

Videos of the presentation sessions and workshops are also available to watch online. All the videos are available with English and Spanish audio, except the three workshops given on Wednesday and Thursday, which are only in Spanish.

With this publication, we aim to bring the conference closer to interested people who couldn’t attend the event, giving them access to the recordings of the different sessions.

[1] http://www.gvsig.org/plone/community/events/jornadas-gvsig/10as/reports

The post 10th International gvSIG Conference: reports, posters and articles appeared first on Slashgeo.org.

by Mario Carrera at December 19, 2014 05:01 PM

GIS for Thought

X Percent of the Population of Scotland Lives Within Y Miles of Edinburgh

Follow up from the Glasgow post by request.

This is a pretty easy question to answer, using the 2011 Scottish Census population results and the Census Output Area Population Weighted Centroids. Then we get the extents of Edinburgh City Council from OS Boundary Line.

The results are:

Area         Pop. count      %
Scotland        5295403    100
Edinburgh        476626      9
25 km           1276757   24.1
50 km           2500093   47.2
50 miles        3919910     74
100 km          4310869   81.4
100 miles       4812421   90.9
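The percentage column is simply each count divided by the Scottish total; a quick Python check using the counts from the table above reproduces the published figures:

```python
# Recompute the percentage column from the census counts in the table.
total = 5295403  # Scotland, 2011 census

counts = {
    "Edinburgh": 476626,
    "25 km": 1276757,
    "50 km": 2500093,
    "50 miles": 3919910,
    "100 km": 4310869,
    "100 miles": 4812421,
}

percentages = {k: round(v / total * 100, 1) for k, v in counts.items()}
print(percentages["25 km"])      # 24.1
print(percentages["100 miles"])  # 90.9
```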

So we see that more people live close to Glasgow, but at 50 miles and beyond more of the population is closer to the capital.

To see how these boundaries look on a map:

Population buffers around Edinburgh

A few caveats:
We are using the population weighted centroids, which introduce some minor inaccuracies but are a very good generalisation.
We are also using euclidean buffers on the British National Grid plane, so these are not geodesic buffers. The difference is likely to be small at these distances.
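To get a rough feel for how small the planar-versus-geodesic gap is at these scales, here is a stdlib-only Python sketch comparing a haversine (spherical) distance with a Euclidean distance on a simple local projection, a crude proxy for buffering on a projected grid. The city-centre coordinates are approximate and purely illustrative:

```python
from math import radians, sin, cos, asin, sqrt

R = 6371000.0  # mean Earth radius in metres

def haversine(lat1, lon1, lat2, lon2):
    # Spherical (geodesic-style) distance between two lat/lon points.
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = p2 - p1, radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

def planar(lat1, lon1, lat2, lon2):
    # Euclidean distance on a simple local equirectangular projection,
    # analogous to measuring on a projected plane.
    lat0 = radians((lat1 + lat2) / 2)
    dx = R * radians(lon2 - lon1) * cos(lat0)
    dy = R * radians(lat2 - lat1)
    return sqrt(dx * dx + dy * dy)

# Approximate Edinburgh and Glasgow city-centre coordinates (illustrative).
edi, gla = (55.9533, -3.1883), (55.8642, -4.2518)
d_geo = haversine(*edi, *gla)
d_pla = planar(*edi, *gla)
print(round(d_geo / 1000, 1), round(d_pla / 1000, 1))  # nearly identical, ~67 km
```

At roughly 67 km of separation the two figures differ by well under one percent, which supports the caveat above.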

by Heikki Vesanto at December 19, 2014 08:00 AM

Tyler Mitchell

Twitter cards made easy in WordPress

Just a note to self and others using WordPress – the Yoast WordPress SEO plugin did the trick with only a couple of clicks. If you are tempted to hack your own HTML headers, you are going in the wrong direction – follow this tutorial instead: http://www.wpbeginner.com/

Thanks WPBeginner and Yoast!

The post Twitter cards made easy in WordPress appeared first on spatialguru.com.

by Tyler Mitchell at December 19, 2014 03:28 AM

PostGIS Development

Foreign Data Wrappers for PostGIS

The last couple weeks have seen two interesting updates in the world of PostgreSQL “foreign data wrappers” (FDW). Foreign data wrappers allow you to access remote data inside your database, exactly like other tables. PostgreSQL ships with two example implementations, one for accessing text files, and the other for accessing remote PostgreSQL servers.

The two updates of interest to PostGIS users are:

  • The Oracle FDW implementation was recently enhanced to support spatial columns, so an Oracle table with SDO_GEOMETRY columns can be exposed in PostgreSQL as a table with PostGIS geometry columns.
  • A new OGR FDW implementation was released that supports exposing any OGR data source as a table with PostGIS geometry columns.

Now you can access your PostGIS data without even going to the trouble of importing it first!

by Paul Ramsey at December 19, 2014 12:00 AM

December 18, 2014

gisky

FOSDEM geospatial devroom: schedule announced!

I'm excited to announce that the schedule for the first edition of the geospatial devroom at FOSDEM is ready! I was very surprised by the number of people who submitted proposals for presentations and the quality of the proposals. All speakers have now confirmed, so I'm really happy I can share the schedule with all of you.


Head over to:
https://fosdem.org/2015/schedule/track/geospatial/
to have a look.

For those who don't know it, FOSDEM is one of Europe's largest open source developer gatherings. Access is free and no registration is required. It takes place on Saturday 31 January and Sunday 1 February (when we will have the geospatial devroom).


If you cannot come to Brussels (or if the room is full): we will stream and record everything, so you can still enjoy the great talks.




Hope to see many of you in Brussels!


by Johan Van de Wauw (noreply@blogger.com) at December 18, 2014 10:12 PM

Tyler Mitchell

Social Graph – Year in Review (Preparation)

Graph developed from Tyler Mitchell's LinkedIn connections.

I pulled this visualization of my LinkedIn social graph together in just a few minutes while working through a tutorial.  What about other social networks?  Give me your input…

I’m picking away at an article reviewing the various ways to access your own, personal, social graph data from the various sources out there.

Wow, what a mixture of results is available through the different APIs that are exposed.  Some good and some bad – some really bad!  I’m especially interested in assessing how many “walled gardens” got taller or shorter walls.

Your Social Graph Projects?

I’d like to add your input to my research.  If you have:

  • built an app that leverages social network graph data to identify relationships
  • run social network or graph analytics on such data, or
  • just generally worked with APIs from Twitter, Google+, LinkedIn, Facebook (or others)

… I’d like to hear from you.  What information could you not get access to?  What information was easiest to come by?  Were some barriers insurmountable?

I’ll be completing my article, with both high level and technical “how to” reviews, later this year.

Please comment here or drop me a note.

The post Social Graph – Year in Review (Preparation) appeared first on spatialguru.com.

by Tyler Mitchell at December 18, 2014 06:30 PM

Boundless Blog

OL3-Cesium brings 3D to OpenGeo Suite web apps

With the news that the Google Earth API has been deprecated, what is the best way to add a 3D globe to your mapping application? Thanks to advances in browser technology such as WebGL, which allows web applications to use the device's graphics processor (GPU) for hardware-accelerated rendering, OpenLayers 3 and Cesium can dynamically visualize data on 2D maps and 3D globes without the need for any plugins.

Wouldn't it be nice to be able to just switch to a 3D globe view from an OpenLayers 3 map, much like how GeoExplorer can switch between OpenLayers 2 and Google Earth? Alongside Klokan Technologies and camptocamp, we helped create OL3-Cesium and recently included it in OpenGeo Suite 4.5 to achieve just this.

Visualizing GPS tracks in OL3-Cesium

In this post, I will show how to add a 3D globe view to a mapping application using the Boundless SDK and OL3-Cesium. As an example, I created an app that allows me to drag and drop GPS tracks on a map, then switch to 3D and explore. I enjoy mountain biking in my free time, because it challenges completely different regions of the brain than writing software, and in August I succeeded in riding my steepest descent so far, the famous nose trail north of Vienna.  See how this looks with the GPS track of my nose trail adventure:

GPS Track Viewer in Action from Andreas Hocevar on Vimeo.

As you can see, OpenLayers reads 3D coordinates from the GPX file I dragged on the map, but the third dimension is only visible as I switch to the globe view and tilt the map. Obviously the elevation reported by my GPS does not match the elevation model that I use for the globe view, so my track is quite a bit above the surface. Anyway, you get the picture.

Now let's take a look at what I did so that my app can provide a globe view. Initially, I created my application with

$ suite-sdk create ol3-cesium-demo ol3view

I made a few tweaks to the layout, removed the States layer and the layer control, and added an ol.interaction.DragAndDrop interaction. For the globe view itself, the first thing to do is to create an olcs.OLCesium instance, connect it with the map, and add an elevation model (known as terrain in Cesium). I did this in src/app.js, right after the code that creates the ol.Map instance:

var ol3d = new olcs.OLCesium(map); // map is the ol.Map instance
var scene = ol3d.getCesiumScene();
var terrainProvider = new Cesium.CesiumTerrainProvider({
  url: '//cesiumjs.org/stk-terrain/tilesets/world/tiles'
});
scene.terrainProvider = terrainProvider;

To add a "Toggle globe view" menu item, I added a list item to the dropdown-menu list:

<li><a href="#" data-toggle="collapse" data-target=".navbar-collapse.in" id="toggle-globe"><i class="fa fa-globe"></i>&nbsp;&nbsp;Toggle globe view</a></li>

Finally, at the bottom of my src/app.js file, I added a handler for the "Toggle globe view" menu item:

$('#toggle-globe').click(function() {
  ol3d.setEnabled(!ol3d.getEnabled());
});

That's it!

Drag in your GPS tracks!

You can try the demo live at http://ahocevar.github.io/ol3-cesium-demo/ and view the source code or fork it on GitHub.

The post OL3-Cesium brings 3D to OpenGeo Suite web apps appeared first on Boundless.

by Andreas Hocevar at December 18, 2014 02:30 PM

GeoSpatial Camptocamp

GeoNetwork 3: a new generation of metadata catalogue

The Camptocamp Geospatial team is pleased to announce the release of GeoNetwork 3.0 in March 2015. The fruit of a long history that began at the FAO in the early 2000s, this new version will make a lasting impression with its completely redesigned user interfaces.

Building on the possibilities offered by AngularJS and Bootstrap, the interfaces have been designed to make users' work easier. With GeoNetwork 3, administrators get a streamlined console giving access to their catalogue's dashboards and to the management and configuration tools (figure 1).

Figure 1: administration console

The example in figure 2, taken from the tools for creating a new record in the catalogue, illustrates the effort put into making administration tasks as comfortable as possible.

Figure 2: interface for creating a new record in the catalogue

Federating partners around a metadata catalogue also means giving them practical tools. Editing a metadata record remains the most common and most complex activity for a catalogue's contributors. GeoNetwork 3 provides by default a clean and ergonomic editing form (figure 3). This editor is also easily configurable, making it possible to offer simplified forms.

Figure 3: editing form for the MedSea project

It is further enriched with input assistance tools: suggestions (figure 4), thumbnail creation from the view service (figure 5), validation against the recommendations of the INSPIRE directive and ISO rules (figure 6), quick navigation within the editor (figure 7), and data publication from the catalogue (figure 8).

Figure 4: input assistance (suggestions)

Figure 5: preview creation from the view service

Figure 6: record validation tool

Figure 7: quick navigation within the editor

Figure 8: data publication from the catalogue

The GeoNetwork 3 map viewer uses the latest version of OpenLayers (OL3, figure 9). GeoNetwork 3 is thus built on the most modern libraries.

Figure 9: map viewer

Version 2.11 of GéoSource, already deployed in France at GrandLyon, at Parcs Nationaux de France, at the Conseil Général du Vaucluse, and soon for BRGM's mongeosource platform, lets its users benefit from GeoNetwork 3's new administration and editing interfaces.

The upcoming versions of IFREMER's Sextant catalogue (figure 10) and Swisstopo's Geocat (figure 11), due in early 2015, will be the first GeoNetwork 3 instances deployed in production.

Figure 10: Sextant search interface (upcoming version)

Figure 11: Geocat search interface (upcoming version)

GeoNetwork 3 will be presented at the next GeoNetwork Community Meeting, organised by Swisstopo at its offices in Bern on 18, 19 and 20 March 2015. The meeting will let users discover what is new in GeoNetwork and what is coming next (e.g. improved thesaurus handling, implementation of the new version of ISO 19115-1). It will also be a chance to talk directly with the developers and the members of the community steering committee. We hope to see many of you there! The previous edition, held in Amersfoort in the Netherlands, was the occasion for a fruitful exchange in which user feedback validated the directions taken for GeoNetwork 3.

Camptocamp would like to thank all the contributors, developers and funders of this very promising new version of GeoNetwork. We remain at your disposal for any further information about GeoNetwork 3.

The post GeoNetwork 3: a new generation of metadata catalogue appeared first on Camptocamp.

by Emmanuel Belo at December 18, 2014 02:25 PM

Nathan Woodrow

Shortcuts to hide/show panels in QGIS

Just a quick one to show how to assign shortcuts to hide and show panels in QGIS. Add this to your .qgis2\python\startup.py

from functools import partial
from qgis.utils import iface

from PyQt4.QtCore import *
from PyQt4.QtGui import *

# Panel objectName -> key sequence that toggles it
mappings = {"Layers": Qt.ALT + Qt.Key_1,
            "Browser": Qt.ALT + Qt.Key_2,
            "PythonConsole": Qt.ALT + Qt.Key_3}
shortcuts = []

def activated(dock):
    # Find the panel by its objectName and flip its visibility.
    dock = iface.mainWindow().findChild(QDockWidget, dock)
    visible = dock.isVisible()
    dock.setVisible(not visible)

def bind():
    for dock, keys in mappings.iteritems():
        short = QShortcut(QKeySequence(keys), iface.mainWindow())
        short.setContext(Qt.ApplicationShortcut)
        short.activated.connect(partial(activated, dock))
        # Keep a reference so the shortcut is not garbage-collected.
        shortcuts.append(short)

bind()

And now you can hide and show panels using Alt + number.

by Nathan Woodrow at December 18, 2014 02:00 PM

Stefano Costa

Flickr selling prints of Creative Commons pictures: a challenge, not a problem

A few weeks ago Flickr, the most popular photo-sharing website, started offering prints of Creative Commons-licensed works in their online shop, among other photographs that were uploaded under traditional licensing terms by their authors.

In short, authors get no compensation when one of their photographs is printed and sold, but they do get a small attribution notice. It has been pointed out that this is totally allowed by the license terms, and some big names seem totally fine with the idea of getting zero pennies when their work circulates in print, with Flickr keeping any profit for themselves.

Some people seemed actually pissed off and saw this as an opportunity to jump off the Flickr wagon (perhaps towards free media sharing services like Mediagoblin, or Wikimedia Commons for truly interesting photographs). Some of us, those who have been involved in the Creative Commons movement for years now, had a sense of unease: after all, the “some rights reserved” were meant to foster creativity, reuse and remixes, not as a revenue stream for Yahoo!, a huge corporation with no known mission of promoting free culture. I’m in the latter group.

But it’s OK, and it’s not really a big deal, for at least two reasons. There are just 385 pictures on display in the Creative Commons category on the Flickr Marketplace, but you’ve got one hundred million images that are actually available for commercial use. Many are beautiful, artistic works. Some are just digital images, that happen to have been favorited (or viewed) many times. But there’s one thing in common to all items under the Creative Commons label: they were uploaded to Flickr. Flickr is not going out there on the Web, picking out the best photographs that are under a Creative Commons license, or even in the public domain, I guess they are not legally comfortable with doing that, even if the license totally allows it. In fact, the terms and conditions all Flickr users agreed to state that:

 

[…] you give to Yahoo the following licence(s):

  • For photos, graphics, audio or video you submit or make available on publicly accessible areas of the Yahoo Services, you give to Yahoo the worldwide, royalty-free and non-exclusive licence to use, distribute, reproduce, adapt, publish, translate, create derivative works from, publicly perform and publicly display the User Content on the Yahoo Services

That’s not much different from a Creative Commons Attribution license, albeit much shorter and EULA-like.

In my opinion, until the day we see Flickr selling prints of works that were not uploaded to their service, this is not bad news for creators. Some users feel screwed, but I wouldn’t be outraged, not before seeing how many homes and offices get their walls covered in CC art.

The second reason why I’m a bit worried about the reaction to what is happening is that, uhm, anyone could have been doing this for years, taking CC-licensed stuff from Flickr, and arguably at lower prices (17.40 $ for a 8″ x 10″ canvas print?). Again, nobody did, at least not on a large scale. Probably this is because few people feel comfortable commercially appropriating legally available content ‒ those who don’t care do this stuff illegally anyway, Creative Commons or not. In the end, I think we’re looking at a big challenge: can we make the Commons work well for both creators and users, without creators feeling betrayed?

Featured image is Combustion [Explored!] by Emilio Kuffer.

by Stefano Costa at December 18, 2014 11:08 AM

Even Rouault

JPEG-in-TIFF with mixed qualities

Some time ago, I wrote about advanced JPEG-in-TIFF uses in GDAL, i.e. TIFF files compressed with the JPEG codec. Recently, I came back to that topic when I realized that recent libtiff versions produced files a bit bigger than necessary, by repeating the quantization tables in each tile or strip instead of omitting them in favor of the global quantization tables stored in a shared location, the JpegTables TIFF tag. This issue is now solved.

A quantization table is a set of 64 integer coefficients used to divide values coming from the Discrete Cosine Transform before they are passed to the Huffman compression stage and finally reach the JPEG binary stream. The higher the coefficients, the lower the quality. The decoder must have the quantization table used by the encoder to properly recompute the values needed for the Inverse Discrete Cosine Transform. The TIFF specification supplement 2 allows the quantization table to be stored in the JpegTables TIFF tag for all tiles or strips. Up to now, I thought this was a requirement but, reading the specification more closely, I found it was only an option. So there are several legit possibilities:
  1. centralized quantization tables stored in the JpegTables TIFF tag, and none in each tile/strip. This is what GDAL (with its internal version of libtiff, or with older libtiff such as 3.8.2, or with the to-be-released libtiff 4.0.4) will now produce.
  2. centralized quantization tables stored in JpegTables TIFF tag, and redefined in some tile/strip if needed.
  3. no JpegTables TIFF tag, and quantization tables defined in each tile/strip.
Options 2 and 3 make it possible to have TIFF files with strips/tiles of different qualities. Actually, even option 1 can be used to get up to 2 different qualities. Why 2? The reason is that the JPEG specification allows a maximum of 4 different quantization tables (technically it could reference up to 16 tables, but the limit of 4 is actually enforced by libjpeg, so better not to play with that!). For RGB imagery compressed in the YCbCr color space, libjpeg uses two quantization tables: one for the Y component and another for the Cb and Cr components. Hence the capability of storing sets of tables for 2 quality settings.
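The quantization step itself is easy to sketch: each DCT coefficient is divided by the matching table entry and rounded, and the decoder multiplies back. A minimal Python sketch with made-up coefficients and tables (not libjpeg's real defaults, and only 4 coefficients instead of 64):

```python
def quantize(dct_coeffs, qtable):
    # Divide each DCT coefficient by its table entry and round:
    # larger table entries -> coarser values -> lower quality.
    return [round(c / q) for c, q in zip(dct_coeffs, qtable)]

def dequantize(quantized, qtable):
    # The decoder multiplies back, which is why it must know
    # the exact table the encoder used.
    return [v * q for v, q in zip(quantized, qtable)]

coeffs = [240, -120, 31, 8]   # illustrative DCT output
q_high = [4, 4, 8, 8]         # high quality: small divisors
q_low  = [16, 16, 32, 32]     # low quality: big divisors

print(quantize(coeffs, q_high))  # [60, -30, 4, 1]
print(quantize(coeffs, q_low))   # [15, -8, 1, 0]
```

Note how the low-quality table pushes small coefficients toward zero; runs of zeros are exactly what the Huffman stage compresses best, and the round trip (quantize, then dequantize) no longer recovers the original values exactly.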

So, as mentioned in my previous article about GeoPackage, it would actually be possible to generate JPEG-in-TIFF with different qualities depending on area(s) of interest, for example medium quality (and high compression) over most areas, and high quality on specific spots. I would expect such variants to have good compatibility with existing readers of JPEG-in-TIFF.

If you are interested in exploring that, you can contact me.

by Even Rouault (noreply@blogger.com) at December 18, 2014 09:21 AM

PostGIS Development

PostGIS 2.1.5 Released

The 2.1.5 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1, the 2.1.5 release. As befits a patch release, the focus is on bugs, breakages, and performance issues.

http://download.osgeo.org/postgis/source/postgis-2.1.5.tar.gz


by Paul Ramsey at December 18, 2014 12:00 AM

December 17, 2014

Boundless Blog

New in OpenGeo Suite 4.5: Style maps more easily with YSLD

Some years ago, I gave a talk at FOSS4G entitled "Styled Layer Descriptor, or How I Learned To Stop Worrying and Love XML." Designed to appease the skeptical audience, the talk described some of the niftier features of the SLD syntax that GeoServer uses to style its maps and layers. You could have been excused for not sharing my enthusiasm, and Dr. Strangelove fans may notice that I was equating XML with a nuclear bomb, so the point was never lost on me either.

For example:

<Rule>
  <PointSymbolizer>
    <Graphic>
      <Mark>
        <WellKnownName>circle</WellKnownName>
        <Fill>
          <CssParameter name="fill">#ff0000</CssParameter>
        </Fill>
        <Stroke>
          <CssParameter name="stroke">#000000</CssParameter>
          <CssParameter name="stroke-width">2</CssParameter>
        </Stroke>
      </Mark>
      <Size>8</Size>
    </Graphic>
  </PointSymbolizer>
</Rule>

That’s a lot of markup just to create red points with a black outline. Can't we do better? Indeed we can.

Maps are not web pages, alas

There have been a few efforts over the years to make improvements to how users style layers in GeoServer. One such notable attempt was to adapt the usage of CSS in web page design to the task of map making. At first blush, this seems like a perfect fit: web-based maps can use web-based design! And in truth, the CSS styling extension for GeoServer is a powerful tool for converting CSS-like markup to SLD.

There are issues, though. CSS uses a fundamentally different painting model from SLD, so it is not possible to convert freely back and forth between CSS and SLD. Generated SLDs typically have many more rules than the CSS rules, so a reverse converter would need to identify the redundancies and eliminate them. Because the underlying rendering engine was built on the SLD model, this was problematic, and rewriting the engine wasn't feasible.

So then the question became: is there a way to simplify the syntax, while still remaining true to the underlying rendering model?

Y not?

Frustrated with SLD, one of my former colleagues came up with an idea: why not adapt an existing markup language? He was familiar with YAML, which we had used internally in our work on map printing. It was pleasantly terse and seemed suited to the task. The idea percolated for a while and, months later, emerged as a central component of OpenGeo Suite Composer: YSLD.

Remember that SLD example above? Here it is as YSLD:

point:
  size: 8
  symbols:
  - mark:
      shape: circle
      fill-color: '#FF0000'
      stroke-color: '#000000'
      stroke-width: 2

Much better, isn't it? Easier to read and more compact — and with many fewer brackets, that's for sure.

There are other advantages to YSLD as well. Unlike XML, YSLD is schema-less, so ordering of components is not important. And for the first time, you can now define markup that is repeated throughout the document with a variable, so you can define it once and reuse it all over.
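As a sketch of the variable feature: YSLD builds its variables on standard YAML anchors and merge keys, so a block defined once under a `define` entry can be reused elsewhere. The exact layout below is an assumption extrapolated from the earlier point example, not copied from the Composer docs:

```yaml
# Define a reusable block once...
define: &black_outline
  stroke-color: '#000000'
  stroke-width: 2

point:
  size: 8
  symbols:
  - mark:
      shape: circle
      fill-color: '#FF0000'
      # ...and merge it wherever it is needed.
      <<: *black_outline
```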

And, since YSLD uses the same underlying model as SLD, you can translate back and forth as you wish, making it completely interchangeable and compliant with OGC standards.

Get started with YSLD!

YSLD and OpenGeo Suite Composer are available exclusively to our enterprise customers. If you needed another reason to get OpenGeo Suite Enterprise, aside from proven on-demand support and maintenance, assistance with upgrades, and advice direct from the software developers, how about we add one more: a better way to style maps.

YSLD example

Have you checked out YSLD and Composer yet? Contact us to learn more or evaluate the tools.

The post New in OpenGeo Suite 4.5: Style maps more easily with YSLD appeared first on Boundless.

by Mike Pumphrey at December 17, 2014 04:43 PM

geOps

Visualizing a train network with GeoServer rendering transformations

In the context of the Trafimage geoportal of the Swiss railway company SBB, geOps was given the task of visualizing events and activities along the train tracks. These include planned, ongoing and finished construction work, maintenance tasks, and information on equipment along the tracks.

by Nico Mandery at December 17, 2014 02:07 PM

Paulo van Breugel

Access R from GRASS GIS on Windows

Since I switched from Windows to Linux many years ago, things have started to look a lot brighter for those wanting to use GRASS on Windows. I won’t switch back to Windows any time soon, but I recently had to install WinGRASS for somebody else. And it was a whole lot easier than I […]

by pvanb at December 17, 2014 01:08 PM

December 16, 2014

gvSIG Team

Como descargar y compilar un plugin de gvSIG 2.1.0

Hola de nuevo,
En el articulo anterior, Como descargar y compilar gvSIG 2.1.0 en Linux y Windows me dejé en el tintero algunas cosillas. Entre ellas me falto comentar como hacer para poder descargar y compilar un plugin de gvSIG.

Voy a asumir, que ya hemos sido capaces de instalar y configurar las herramientas de desarrollo que comenté en ese articulo. Partiendo de ahí, compilar un plugin es relativamente sencillo, lo mas complicado puede ser conseguir los fuentes del plugin a compilar.

En el proyecto gvSIG, utilizamos como plataforma para la gestión de proyectos de desarrollo la herramienta “redmine”. La podemos encontrar en:

En el “redmine” tenemos dados de alta varios proyecto, entre ellos el correspondiente al núcleo de gvSIG, “Application: gvSIG desktop”. Podemos obtener un listado de los proyectos en la url:

En casi todos ellos encontraremos la url desde la que podemos descargar los fuentes.
Tenemos que tener en cuenta que un proyecto de redmine puede almacenar más de un proyecto de desarrollo y este a su vez puede contener varios plugins de gvSIG.

Vamos a ver como haríamos para descargar y compilar el proveedor de datos de PostgreSQL para gvSIG.

Mirando la lista de proyectos encontraremos una de nombre “gvSIG data provider for PostgreSQL”.
Pincharemos en él y nos mostrara la página del proyecto en el redmine. Allí mismo tendremos un enlace que dice algo como:

SVN: http://devel.gvsig.org/svn/gvsig-postgresql

Navegaremos a él, y encontraremos que en la dirección:

http://devel.gvsig.org/svn/gvsig-postgresql/trunk/org.gvsig.postgresql/

Tendremos un fichero “pom.xml”. Bien pues este seria el raíz del proyecto de nuestro plugin. Tenemos que acostumbrarnos a que los proyectos de gvSIG son proyectos maven multimodulo.

Compiling and running it is quite simple.
First we need a gvSIG 2.1.0 installation. We can use a normal gvSIG installation or the gvSIG binaries generated by following the instructions in the previous article.
Once we have that, we should go to our user HOME and check whether a “.gvsig-devel.properties” file exists there. This file contains the path of the gvSIG installation into which the binaries of the plugin we are about to compile will be deployed. If we already compiled the gvSIG core, that process will have created it with the proper values. It will contain something like:

~ $ cd
~ $ cat .gvsig-devel.properties
#Fri Dec 12 00:13:47 CET 2014
gvsig.product.folder.path=C\:/devel/org.gvsig.desktop/target/product
~ $

If the file does not exist yet, simply create it, setting the “gvsig.product.folder.path” entry to point to the gvSIG installation onto which we want to deploy the plugin.

Once this is configured, simply open our Console2 (with Busybox already configured) and run:

~ $ cd c:/devel
~ $ svn checkout http://devel.gvsig.org/svn/gvsig-postgresql/trunk/org.gvsig.postgresql/
...svn command output...
~ $ cd org.gvsig.postgresql
~ $ mvn install
... mvn install output...

On Linux, we only change the “cd” command to point to the folder where we want to leave the sources.

Well, this process downloads the sources, compiles them, and deploys the binaries of our plugin into the folder indicated in the “.gvsig-devel.properties” file. It will leave:

  • In gvSIG/extensiones, the plugin already installed and ready to load on the next gvSIG start.
  • In install, the “.gvspkg” package with our plugin, in case we want to hand it to a user so they can install it with the add-ons manager.

Now we simply start gvSIG and that’s it: it will start with our plugin.
Keep in mind that if we deploy onto gvSIG binaries generated from the sources, we must start gvSIG by running the gvSIG.sh file (even on a Windows system), as discussed in the previous article, whereas if we deploy onto a standard installation we start it the way that distribution is normally started.

Well, that is all.
As always, I hope you find it useful.
In the next article I may cover how to work with an IDE and debug gvSIG.

Regards,
Joaquin


Filed under: opinion

by Joaquin del Cerro at December 16, 2014 07:17 PM

Antonio Santiago

The Book of OpenLayers 3, completed !!!

It was a long road, but it has finally come true: The Book of OpenLayers 3 is finished.

The chapter Controls and Interactions concludes the exploration of the main concepts related to OpenLayers 3. This chapter focuses on showing how to work with the two main tools necessary to interact with maps and their contents.

New samples have been created. Remember the source code can be found at https://github.com/acanimal/thebookofopenlayers3 and a running demo is available at http://acanimal.github.io/thebookofopenlayers3/.

I must admit “finished” is not the best word to apply: OpenLayers 3 is big, complex and awesome enough to fill tons of chapters, but I must put down a final dot and leave the typewriter… for a while :)

Some of you have contacted me to report spelling errors; I gave priority to releasing the final chapter.

Please don’t hesitate to contact me to report more errors (or anything else); my next release will be a maintenance version fixing all those errors.

Many of you have suggested features to write about, so I think I will come back some day with a “There is more” truly final chapter.

Thanks for your confidence.

by asantiago at December 16, 2014 06:50 PM

Boundless Blog

New in OpenGeo Suite 4.5: Build maps with Composer

With the release of OpenGeo Suite 4.5, we’re proud to introduce OpenGeo Suite Composer, a tool for creating, styling, and publishing maps that is currently available exclusively to our enterprise customers. By focusing on the user experience, Composer makes authoring and publishing maps to GeoServer vastly easier than ever before with a simpler styling syntax, real-time feedback, and convenience features such as code-completion and sample code. Getting started is quick and straightforward.

Map styling is easy with YSLD

YSLD example

A typical workflow in OpenGeo Suite Composer starts with creating a new map, adding layers to it, tweaking the sample code provided for the layers, and saving the map. The new YSLD styling syntax is shorter and easier to write and, while still remaining compliant with OGC standards, represents a significant departure from the SLD syntax that was the main method for styling data layers in OpenGeo Suite. Thanks to its terse notation, a YSLD code block for styling a data layer might be 12 lines whereas its exact SLD counterpart could easily be 40 or more lines.
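For illustration, a minimal YSLD style might look like the sketch below. This is written from the general YSLD syntax and is not Composer’s actual sample code; the layer name, the open-ended zoom tuple and the colors are invented:

```yaml
# Illustrative YSLD sketch (values invented): fill and outline polygons
# at zoom level 7 and higher only.
name: natural-style
rules:
- zoom: (7, )
  symbolizers:
  - polygon:
      fill-color: '#A8D878'
      stroke-color: '#88B858'
      stroke-width: 0.5
```

The same rule in SLD would need nested XML elements for the rule, scale bounds, symbolizer, fill and stroke, which is where the 40-plus-line counterpart comes from.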

See the map as you edit

This new syntax combined with the ability to view changes in real-time enables cartographers to improve the quality of their maps, thus speeding up the styling process. Taking a look at the Composer interface, shown in the video above, we can see that the map takes up a large portion of the interface and is zoomable (a), that the data layer list is accessible via a re-sizable column in the center (b), and that the code for the selected layer is editable on the right-hand side (c).

intro1.png

Productivity boosters

To further enhance productivity, we’ve introduced a number of convenience features. For example, zoom levels can be set directly in the code for styles that change by zoom. In the example above, for the layer “natural,” zooms 7 and higher will display with the specified fill color, stroke color (i.e., outline color), and stroke width; other zoom levels will not display that data layer. Previously, it was necessary to provide minimum and maximum zoom levels by scale denominator, and while that is still an option in YSLD, it can be difficult and verbose. Other features include:

  • Colors can be chosen via a color wheel interface or specified by color name (e.g., “green,” “blue”)
  • Any number of rules can be provided for a data layer
  • The order in which the code specifies things like filters and symbolizers is flexible
  • Code autocomplete is provided via keyboard shortcuts
  • Attributes for each data layer are displayed so they can be used in styling rules (e.g., features of type=park are green while type=hospital are pink)
  • The map can be panned and zoomed to determine which data layers should appear at which zoom levels

Less hassle means more designing

OpenGeo Suite Composer is not just an improved alternative to SLD, it is a significant interface overhaul that enables cartographers to make maps in a way that provides instant visual feedback and a much gentler learning curve. With Composer, a cartographer’s emphasis is primarily oriented toward the design of sophisticated cartographic compositions—a welcome sight.

A Composer-built map example

This example map, best viewed at zooms 4-10, was designed in OpenGeo Suite Composer, employing many Natural Earth datasets as well as a high-resolution land boundary built from OpenStreetMap polygons. Additionally, a high-resolution OpenStreetMap natural area dataset is visible in Canada at the higher zoom levels.

Try OpenGeo Suite Composer!

OpenGeo Suite Composer is available exclusively to our enterprise customers. Contact us to learn more or evaluate the tool.

The post New in OpenGeo Suite 4.5: Build maps with Composer appeared first on Boundless.

by Gretchen Peterson at December 16, 2014 02:44 PM

OSGeo News

Announcing the FOSS4G 2016 Call for Hosting

by aghisla at December 16, 2014 12:08 PM

December 15, 2014

Just van den Broecke

Into the Weather – Part 3 – Publishing Data to the Cloud – 1

In my last post, Into the Weather – Part 2, I outlined the global architecture of a Davis Vantage Pro2 weather station connected to a Raspberry Pi (RPi) running the weewx weather software to capture raw weather data. Here I will try to show how to bring this weather data “from the fluffy clouds into the digital cloud”. Finally, at the end, there is also some geospatial content. The image below shows the weather station sensors on the Geonovum building rooftop (it was quite hazardous replacing a faulty temperature sensor there!) and the Davis console connected to the Raspberry Pi (transparent enclosure). All documentation and code can be found via: sospilot.readthedocs.org.

davis-pws-geonovum-pics

To recap: the Davis weather station continuously captures raw weather data through its sensors: temperature (outdoor/indoor), pressure, wind (speed, direction), rainfall and even UV radiation. This data is initially gathered in the local console display. That is fine for personal/local usage, but for capturing history, deriving trends and, in particular, for external sharing it is quite limited. The real fun starts with getting access to the raw data and going from there.

Weather Project Setup

This is where the Raspberry Pi with weewx and later Stetl, PostGIS, GeoServer and the 52North SOS come in, but I’ll go step-by-step. Let’s first see how we can publish weather data with just weewx.

My first post in this series, Into the Weather – Part 1, introduced weewx, a Python framework for capturing, storing and publishing weather data. The Davis weather station is connected via USB to the RPi. The RPi runs weewx to gather and store weather data (in a SQLite DB) from the weather station. But weewx can do more than this: it can also publish weather data to a variety of services. Like any well-designed framework, weewx is basically a kernel, the weewx engine, with configurable plugins, all specified and parameterized from a single configuration file, weewx.conf, as in this example. The weewx daemon process runs forever in a main loop, continuously calling on all plugins.
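The kernel-plus-plugins pattern just described can be sketched in a few lines of plain Python. This is purely illustrative: the class and method names below are invented for the sketch and are not weewx’s actual API.

```python
# Illustrative sketch of a plugin-driven engine loop (NOT weewx's real code).
class Plugin:
    def handle(self, packet):
        raise NotImplementedError

class ArchivePlugin(Plugin):
    """Stores each raw packet (stand-in for an archiving driver)."""
    def __init__(self):
        self.records = []
    def handle(self, packet):
        self.records.append(packet)

class ReportPlugin(Plugin):
    """Renders a tiny 'report' from the latest packet (stand-in for the HTML skin)."""
    def __init__(self):
        self.last_report = None
    def handle(self, packet):
        self.last_report = "temp=%.1f pressure=%.1f" % (packet["temp"], packet["pressure"])

class Engine:
    """Kernel: feeds every incoming packet to every configured plugin."""
    def __init__(self, plugins):
        self.plugins = plugins
    def run(self, packets):
        for packet in packets:          # in a real daemon this loop runs forever
            for plugin in self.plugins:
                plugin.handle(packet)

archive, report = ArchivePlugin(), ReportPlugin()
Engine([archive, report]).run([{"temp": 4.2, "pressure": 1013.0}])
print(len(archive.records), report.last_report)  # prints: 1 temp=4.2 pressure=1013.0
```

The real weewx engine adds configuration parsing, scheduling and error handling around the same basic shape: one loop, many plugins.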

First, there are weewx station drivers that continuously capture raw data from the most common weather stations. Although there are many brands of weather stations, many share common hardware and protocols. The second class of plugins are archiving drivers, which determine where and how to store raw weather data. Two standard archiving drivers are available: SQLite and MySQL. My choice: SQLite. For publication from archived data, a standard reporting driver generates a plain HTML website using an extensible skin containing (HTML) templates. By configuring an FTP or rsync destination, the generated HTML can be published to a remote webserver. This is the first connection to the digital cloud. Of course the skin and templates are highly configurable, as in this example. Many examples can be found on the web; I found the nice byteweather template by Chris Davies-Barnard. Below is the result, as can be found at: sensors.geonovum.nl/weewx.

Weewx Standard Report


In addition, I’ve added even a more dynamic weather display like the Steelseries Gauges, as seen below and via the link sensors.geonovum.nl/weewx/gauges.

Just like other crowd-sourced projects such as OpenStreetMap and Wikipedia, there are various weather communities you can join to publish your weather data via RESTful APIs. weewx provides drivers for the most common communities, like Weather Underground and PWSWeather. For example, I registered the Geonovum weather station as Geonovum IUTRECHT96, as shown below.

Weather Underground also provides various apps and a map, the WunderMap. Here you can view your station, together with all others that jointly provide weather data. As can be seen there is already quite some coverage within The Netherlands.

 

All in all, there is a fascinating world to explore once you get into the weather domain and its many communities.

So why am I doing all of this? Apart from having the opportunity to develop this as part of the SOSPilot Project at Geonovum, I think that “geospatial” is moving from 2D to “N-dimensional”: not only is more and more “3D” hitting the shores (just see the recent 2014 blogs at planet.osgeo.org), but location-based sensor data (like air quality and weather data) and the Internet of Things also drive a need to deal with time-series data: management, storage, services and visualization. Within the open source geospatial world I am happy to see that many frameworks and tools are being extended to deal with 3D, like OpenLayers/Cesium (one of my next posts) and PostGIS/PDAL, and with time, as in GeoServer dimension support. The OGC Sensor Web Enablement and its lighter-weight version, OGC SensorThings, are also gaining more attention.

Not yet done with the weather. In the next post I will dive into further unlocking weather data via OGC services like WMS and SOS. That would be “Publishing Data to Cloud 9” ;-).

 

 

by Just van den Broecke at December 15, 2014 11:19 PM

gvSIG Team

How to download and compile gvSIG 2.1.0 on Linux and Windows

Note: I will only cover how to compile the gvSIG Desktop “core”. This is only a part of what the gvSIG distribution ships; the rest of the add-ons are compiled separately.

Hello everyone.

I normally develop on Linux, Ubuntu or Kubuntu. Downloading and compiling the gvSIG core on Ubuntu is usually fairly simple.

Running a few commands is enough; the tools are normally easy to install since they are in the system’s package repository and can be installed with the “apt-get” command. We need to have installed:

  • JDK 1.7; I usually use “openjdk-7-jdk”
  • A Subversion client, normally “subversion”
  • Maven, for which we need to install the “maven” package

To install these, just run from the console:

sudo apt-get install openjdk-7-jdk subversion maven

Once this is installed, downloading, compiling and running gvSIG is just a matter of:

~ $ cd
~ $ mkdir -p devel/gvsig-desktop
~ $ cd devel/gvsig-desktop
~ $ svn checkout https://devel.gvsig.org/svn/gvsig-desktop/trunk/org.gvsig.desktop/
...svn command output...
~ $ cd org.gvsig.desktop
~ $ mvn install
... mvn install output...
~ $ cd target/product
~ $ ./gvSIG.sh

The thing is that if we want to develop from MS Windows, things get a bit more complicated, since the system does not provide the tools needed for these operations.

I will go over very quickly what we would have to install in order to do on MS Windows the same things I described for Linux.

Obviously we will need the JDK, Subversion and Maven, just as on Linux, but in addition I would recommend a couple more utilities. The first thing I miss on Windows is a console with a bit more functionality than the system one; mainly I would like it to be resizable and to have a configurable scrollback history. After looking around the web I ended up settling on Console2. The project is hosted on SourceForge, and we can download the binaries from:

The next thing we need is an environment to run Linux-style shell (bash) scripts. There are several for MS Windows, but I chose Busybox, mainly because it offers everything needed to compile and run gvSIG in a single executable, so it does not fill the system with lots of things. I downloaded the binaries from:

It is just a single executable; it requires no installation. For convenience I left it in the Windows folder, on my system:

C:\WINDOWS

Once these two utilities were downloaded, I installed Console2. In general the default values are fine; I only made a couple of changes:

  • Increased the screen buffer size: in the “Edit->Settings…” menu, “Console” section, we can change it. I usually set it to 2000.
  • Configured a new “tab”: in the same “Settings” window, “Tabs” section, I added a new tab with the values:
    • Title “Terminal (busybox)”
    • Icon Use default
    • Shell “busybox bash -l”
    • Startup dir “C:\”

And I left it as the first “Tab” in the list, so that Console2 starts with it by default.

With this in place, simply starting Console2 from the icon it left on the desktop gives me something that lets me run Linux-style shell scripts.

Now we have to install the following:

Once all this is installed, only a little configuration remains. In the user HOME we have to create a “.profile” file, in my case at:

C:/Documents and Settings/devel/.profile

A quick way to see where our HOME is could be to start Console2 and, in the shell we just configured, run:

~ $ cd
~ $ pwd
C:/Documents and Settings/devel
~ $ 

And it will tell us where our HOME is.

There we create the “.profile” file with the following content:

#---------------------------
# .profile 
export JAVA_HOME="C:/Archivos de programa/Java/jdk1.7.0_71"
PATH="$PATH;C:/Archivos de programa/Java/jdk1.7.0_71/bin"
PATH="$PATH;C:/Archivos de programa/Subversion/bin"
PATH="$PATH;C:/Archivos de programa/apache-maven-3.0.5/bin"
alias l="ls -l "
alias ll="ls -l "
# 
#---------------------------  

Basically, we create the JAVA_HOME variable and add the JDK, Subversion and Maven binary folders to the PATH. Your paths may differ depending on where you installed the tools.

Once this file is created, close the console and open it again so that it reads the file, and we are ready to download, compile and run gvSIG.

To do so, run the same commands I described for a Linux system from inside our Console2. Well, with one small variation… we will download the sources into the “C:\devel” folder. Doing it there is an attempt to avoid problems with file name lengths: many Windows programs, among them “Explorer” and “cmd”, have trouble accessing files whose names exceed 255 characters, and the gvSIG source tree ends up containing quite long file names.

~ $ cd c:/
~ $ mkdir devel
~ $ cd devel
~ $ svn checkout https://devel.gvsig.org/svn/gvsig-desktop/trunk/org.gvsig.desktop/
...svn command output...
~ $ cd org.gvsig.desktop
~ $ mvn install
... mvn install output...
~ $ cd target/product
~ $ ./gvSIG.sh

To make sure these steps can be reproduced (in case the links change), I have left at:

http://devel.gvsig.org/download/tools/gvsig-desktop/2.1.0

a copy of the download files of:

That way you can have the same versions of the programs I used.

Things I still have left unsaid…

I hope to cover them, even if briefly, in future articles.

Regards,

Joaquin


Filed under: development, gvSIG Desktop, gvSIG development, spanish

by Joaquin del Cerro at December 15, 2014 04:56 PM

Even Rouault

GDAL GeoPackage raster support

TLDR

One of the recent additions to the GDAL development version (2.0dev) is support for raster tiles in the GeoPackage format.

A bit of history

The GeoPackage format has been adopted this year as an OGC standard and covers storage of both raster and vector data within the same SQLite3 container. This is not a completely revolutionary approach, as there were precedents (I may have forgotten some!):
- storage of vector data in a SQLite3 database was done at least by FDO (with storage of geometries as WKT, WKB or Autodesk’s own geometry binary format "FGF") and Spatialite (in its own Spatialite geometry binary format, somewhat derived from WKB but incompatible, and with compressed variants as well). Spatialite introduced the use of SQLite virtual RTree tables to implement the spatial index. Both formats have long been supported by the OGR SQLite driver. GeoPackage vector adds yet another geometry binary format, GPB (GeoPackageBinary), which consists of a header followed by actual WKB, and borrowed the idea of an RTree spatial index from Spatialite (which was a candidate implementation in early draft versions)
- storage of rasters in a SQLite3 database was done at least by MBTiles, with raster tiles stored as PNG or JPEG BLOB records and a multi-zoom-level tile indexing scheme. GeoPackage raster support clearly derives from that design choice, and uses the same column naming in the tile tables, but with various improvements (which proponents of MBTiles’ simplicity will perceive as defects), such as support for arbitrary spatial reference systems (MBTiles is bound to the Google Mercator projection) and custom tiling schemes... and also a subtle semantic difference that we will detail afterwards. GeoPackage can also hold several raster tables within the same container. One current limitation of GeoPackage is that only images with 8-bit depth per channel, limited to the R,G,B,A color space, are supported, which prevents storing DEMs or multi-spectral imagery. So there is still room for other solutions, such as Rasterlite 2, the raster side of Spatialite, which offers a variety of storage formats, supported bit depths, multiple regions with different resolutions within the same coverage, etc...
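The MBTiles-style storage just described is easy to picture with Python’s built-in sqlite3 module. This is a minimal sketch of the tile-table layout only (the real MBTiles and GeoPackage schemas carry additional tables and constraints), and the PNG bytes are a fake placeholder:

```python
import sqlite3

# Minimal sketch of an MBTiles/GeoPackage-style tile store:
# one row per tile, addressed by zoom level and column/row, image bytes as a BLOB.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tiles (
    zoom_level INTEGER, tile_column INTEGER, tile_row INTEGER, tile_data BLOB)""")

fake_png = b"\x89PNG..."  # placeholder; real stores keep encoded PNG/JPEG bytes
db.execute("INSERT INTO tiles VALUES (?, ?, ?, ?)", (0, 0, 0, fake_png))

blob, = db.execute(
    "SELECT tile_data FROM tiles WHERE zoom_level=0 AND tile_column=0 AND tile_row=0"
).fetchone()
print(blob[:4])  # prints: b'\x89PNG'
```

The key design point is that the image blob is opaque to SQLite: the container only indexes tiles by (zoom, column, row), and the client decodes the PNG/JPEG itself.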

In addition to the above, GeoPackage introduces metadata (with a potential hierarchical organization), ways of expressing layer schema and constraints (beyond SQL capabilities provided by SQLite 3), and a formalism to define extensions to the specification, both for a few standardized official extensions and more "proprietary" extensions.

GDAL 1.11 already had support for most of GeoPackage vector specification. More recent developments have added support for spatial index, curve geometries, and aspatial tables.

Raster support was still missing. It is now available in the latest development version.

GDAL GeoPackage raster support

GeoPackage is now one of the very few GDAL drivers to support both raster and vector data with the same "Dataset" object, which is possible since the unification of the GDAL and OGR driver models. This means that you only have to open a file once to explore its content, rather than once with the GDAL API and again with the OGR API. I should note that the way we handle multiple raster "layers" in the GDAL API, through the subdataset mechanism that requires multiple Open() calls, is probably not yet optimal compared to the vector layer API. Food for thought...

The GeoPackage raster driver has the following capabilities:
- reading, creation... and update of raster layers. That means that you can use GeoPackage as the direct output format of any GDAL utility or API, gdal_translate, gdalwarp, etc...
- on-the-fly conversion both ways between the R,G,B,A colorspace exposed by GDAL and the storage tile format:
    * grey level PNG or JPEG tiles,
    * grey level with alpha band PNG,
    * RGB PNG, JPEG or WebP tiles (storage as WebP tiles is one of the official extensions to the baseline specification),
    * RGBA PNG or WebP tiles
    * 8-bit quantized (256 color palette) PNG tiles
- on creation/update, it is possible to use a strategy where JPEG tiles are used when the tile content is fully opaque, and PNG when there is transparency involved. Note that fully transparent tiles are not stored at all, as allowed by the specification, to allow efficient sparse storage.
- use, creation and update of multiple zoom levels (known as overviews in GDAL, or pyramids) for fast zoom-in/zoom-out operations. Including support for overview levels whose resolution does not necessarily differ by power-of-two factors ( "gpkg_other_zoom" extension in GeoPackage terminology )
- reading, creation and update of metadata (in a "flat" way, i.e. ignoring the potential hierarchy of metadata records)
- reading and creation of several tiling schemes. By default, the driver will create rasters with a tiling scheme that perfectly fits the resolution and spatial registration of the input dataset being converted to GeoPackage, so that no loss of image quality (if using PNG storage) or georeferencing occurs. But for some uses, adopting more "universal" tiling schemes with world coverage might be desirable, to ease overlaying several raster coverages or extending the spatial extent of an existing raster. A few such tiling schemes are available, such as the popular GoogleMapsCompatible one (reused from the WMTS specification). Note: although GDAL should have no problem with it, some tests have shown that another available tiling scheme, GoogleCRS84Quad, might be difficult to handle for other implementations of the specification, so it is better not to use it until its relevance for GeoPackage has been clarified.
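The mixed JPEG/PNG creation strategy mentioned in the list above boils down to a small per-tile decision. The helper below is an illustrative sketch, not the driver’s actual code; in reality the driver inspects the alpha channel of the rendered tile:

```python
def choose_tile_format(alpha_values):
    """Pick a storage format from a tile's alpha samples (illustrative sketch).

    Fully transparent tiles are not stored at all (sparse storage); fully
    opaque tiles can use lossy JPEG; anything in between needs PNG to
    preserve transparency."""
    if all(a == 0 for a in alpha_values):
        return None          # tile omitted entirely
    if all(a == 255 for a in alpha_values):
        return "JPEG"
    return "PNG"

print(choose_tile_format([0, 0, 0]))      # prints: None
print(choose_tile_format([255] * 4))      # prints: JPEG
print(choose_tile_format([255, 128, 0]))  # prints: PNG
```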

Full documentation is available on the driver page for those who want to explore its capabilities. All in all, pretty much everything in the raster part of the specification has been implemented.

Tiles and blocks

One of the toughest parts of the implementation was updating tiles when the origin of the GDAL area of interest does not match the corner of a tile. This is the general case when using a global tiling scheme, where the origin of a raster can be anywhere within a tile. GDAL internally uses its own tiling system, with raster "blocks" (in red in the drawing below). In the case of GeoPackage, we of course choose block dimensions that exactly match GeoPackage tiles. But due to that possible shift of origins, a GDAL block can potentially overlap 4 GeoPackage tiles (in blue). And to add more fun, the dimensions of GDAL rasters (black rectangle) are not necessarily a multiple of the block size.


Filling GDAL blocks from GeoPackage tiles is relatively easy: figure out which 4 tiles are needed, read them, and composite the interesting pixels into the GDAL block. For better performance we cache the last read tiles, so that when reading the file from left to right (the typical way GDAL algorithms process a raster) we only need to load 2 new tiles, instead of 4, for each GDAL block.
Regarding writing tiles from GDAL blocks, a naive implementation would reload, update and recompress each of those 4 tiles every time a block is updated; but besides being time consuming, this would introduce repeated image quality loss when using lossy compression formats, such as JPEG or WebP (or 8-bit PNG), as the storage format. Instead, we use a temporary database of uncompressed, partially updated tiles, and wait for a tile to be completely updated (considering all 4 of its R,G,B,A components) before compressing it to its final storage.
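Under the assumption that blocks and tiles share the same pixel size and that the raster origin is shifted by (off_x, off_y) pixels into a tile, the 4-tile overlap can be sketched with a little integer arithmetic (illustrative, not the driver’s actual code):

```python
def overlapping_tiles(block_col, block_row, off_x, off_y, tile_size=256):
    """Return the set of (tile_col, tile_row) a raster block touches.

    Illustrative sketch: the block at (block_col, block_row) covers pixels
    [x0, x0+tile_size) x [y0, y0+tile_size) in tiling-scheme coordinates,
    where the raster origin is shifted (off_x, off_y) pixels into a tile."""
    x0 = block_col * tile_size + off_x
    y0 = block_row * tile_size + off_y
    tiles = set()
    for x in (x0, x0 + tile_size - 1):   # left/right edges of the block
        for y in (y0, y0 + tile_size - 1):  # top/bottom edges
            tiles.add((x // tile_size, y // tile_size))
    return tiles

print(sorted(overlapping_tiles(0, 0, 0, 0)))     # aligned origin: 1 tile
print(sorted(overlapping_tiles(0, 0, 100, 40)))  # shifted origin: 4 tiles
```

With a zero shift each block maps to exactly one tile; any non-zero shift makes every interior block straddle up to four tiles, which is why the left-to-right tile cache pays off.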

What's next?

Potential future enhancements, on the raster as well as the vector side, could include:
- studying how de-duplication of tiles (e.g. avoiding storing a fully blue tile multiple times when mapping oceanic areas) could be done. This is likely possible through updatable views, and is a feature of MBTiles.
- as tiles are completely independent from each other, it might be possible to have different JPEG/WebP compression quality settings per area(s) of interest.
- adding reading and writing of metadata at the vector layer level. This should now be possible with the GDAL-OGR unification. That would require an update in ogrinfo to display such metadata, and in ogr2ogr to define and propagate them.
- use of schema/column constraints in vector layers.
- more technical: creating the triggers that correspond to the gpkg_geometry_type_trigger and gpkg_srs_id_trigger extensions, that can be used to ensure consistency of geometries.
- and likely keeping up with future versions of the specification.

 

The fun part! Two formats in one

Ah, and for those who remembered and reached this part of the article: I mentioned that there was a subtle difference between GeoPackage rasters and MBTiles. I can hear you: "which one??" Well, by using the GoogleMapsCompatible tiling scheme (mandatory for MBTiles) in a GeoPackage, could we not manage to have a single container that is at the same time both a valid GeoPackage and a valid MBTiles? The answer is yes, but it requires some trickery. A key difference is that MBTiles was based on the OSGeo TMS (Tile Map Service) specification, which decided to use tile row number 0 for the bottom-most tile of the tiling, whereas the later OGC WMTS specification, on which GeoPackage is based, decided that it would be the top-most tile! Fortunately, with some SQL, we can do the renumbering. Demo:

$ gdal_translate test.tif test.gpkg -of GPKG -co TILE_FORMAT=PNG -co TILING_SCHEME=GoogleMapsCompatible

$ gdaladdo -r cubic test.gpkg 2 4
$ ogrinfo test.gpkg -sql "CREATE VIEW tiles AS SELECT test.zoom_level, tile_column, tm.matrix_height-1-tile_row AS tile_row, tile_data FROM test JOIN gpkg_tile_matrix tm ON test.zoom_level = tm.zoom_level AND tm.table_name = 'test'"
$ ogrinfo test.gpkg -sql "CREATE TABLE metadata(name TEXT, value TEXT)"
$ ogrinfo test.gpkg -sql "INSERT INTO metadata VALUES('name', 'my_tileset')"
$ ogrinfo test.gpkg -sql "INSERT INTO metadata VALUES('type', 'overlay')"  # or 'baselayer'
$ ogrinfo test.gpkg -sql "INSERT INTO metadata VALUES('version', '1.1')"
$ ogrinfo test.gpkg -sql "INSERT INTO metadata VALUES('description', 'description')"
$ ogrinfo test.gpkg -sql "INSERT INTO metadata VALUES('format', 'PNG')"

And the result is also now a valid MBTiles dataset (while still being a valid GeoPackage), that can for example be opened by the GDAL MBTiles driver (after renaming to .mbtiles, or creating a symbolic link, since the MBTiles driver will not try to open a .gpkg file).
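The renumbering performed by the SQL view above is just a row flip between the two conventions; as a sketch:

```python
def wmts_to_tms_row(wmts_row, matrix_height):
    """Convert a top-origin (WMTS/GeoPackage) tile row to a bottom-origin
    (TMS/MBTiles) row, mirroring the SQL view's tm.matrix_height-1-tile_row."""
    return matrix_height - 1 - wmts_row

# At zoom z the GoogleMapsCompatible tile matrix is 2**z tiles high.
z = 2
print([wmts_to_tms_row(r, 2 ** z) for r in range(2 ** z)])  # prints: [3, 2, 1, 0]
```

Applying the same flip twice is the identity, which is why a single SQL view is enough to present the WMTS-numbered tiles to TMS-minded readers.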


Finally, I would like to thank Safe Software for having financially supported this work on GeoPackage raster support.

by Even Rouault (noreply@blogger.com) at December 15, 2014 09:26 AM

Petr Pridal

WebGL Earth 2: the Leaflet compatible JavaScript 3D globe powered by Cesium

http://www.webglearth.com/

To embed a 3D globe in a website with an open-source project is now really easy. If you have a simple map application made with the popular Leaflet library, you can, with almost no effort, turn it into a 3D interactive globe with the new WebGL Earth 2 project: just replace "L." with "WE." ;-)

See the examples of use of the API:
http://examples.webglearth.org/

The globe is made only with JavaScript, using the WebGL HTML5 technology. No browser plugin is required, which means it runs automatically on all modern browsers on computers with recent graphics cards: Chrome, Firefox, Safari, IE 11+, the latest Android mobile devices, and soon also all Apple devices with iOS 8+.

This is a complete reimplementation of our original WebGL Earth project.

Our own JavaScript rendering core has been replaced with the great open-source Cesium project, to gain as much as possible from the open-source principle of sharing development effort. We look forward to contributing to the Cesium project in the future, instead of developing our own separate core.

Our target is still the same: an easy-to-use open-source project with a public API, allowing easy embedding of a modern 3D globe in websites, with out-of-the-box user-friendly interaction and support for mobile devices whenever possible.

In version 2.0 we have decided to emulate the popular Leaflet JavaScript API, enriched with 3D functions for altitude, tilting and heading of the view, and flying animations on the globe. The core functions are implemented, and we hope to improve the compatibility of the APIs with the help of the community in the future (GitHub pull requests are very welcome ;-).

The project also preserves the original WebGL Earth JavaScript API 1.0 whenever possible.

OpenStreetMap, Bing, MapBox and other tile layers can be easily used with the globe.

Custom geodata (GeoTIFF, ECW, MrSID, ...) can be easily preprocessed with MapTiler (http://www.maptiler.com) to create attractive globes, which can be hosted on any webserver without additional software, on a LAMP hosting with TileServer-php or even on Amazon S3 and other cloud storage services. See: http://tileserver.maptiler.com/#cassini-terrestrial/webglearth

by Petr Pridal (noreply@blogger.com) at December 15, 2014 08:47 AM

Petr Pridal

WebGL Earth: Open Source 3D Globe for Web Browser



WebGL Earth is open source software for exploring, zooming and “playing” with a 3D globe directly in a web browser, on any platform including mobile devices - without a plugin. The project lives on the support and cooperation of the developer community.

Live demo:
http://www.webglearth.com/.
Project page with source code:
http://www.webglearth.org/.

It is written in JavaScript using the HTML5 Canvas tag with the WebGL extension. The code uses the Closure Library and Compiler - the same toolset behind Gmail, Docs, Maps and other Google products.

The project displays detailed street-level data (via OpenStreetMap), detailed aerial imagery (via Bing Maps) and any other custom maps which are available in popular Mercator tiles (prepared by MapTiler, GDAL2Tiles, TileCache, GeoWebCache, etc).

A browser supporting WebGL, such as Mozilla Firefox 4 Beta, Google Chrome Beta or a WebKit nightly build, is necessary to try this project out.

If you don’t have one, you can at least check the video:


WebGL Earth is available to embed in your own websites via a simple JavaScript API, which we hope to extend soon based on requests from the community. Check the first set of examples demonstrating the API:




Developers can also modify the code completely and use the rendering core as a component for other projects. The best place to start with development is the Quickstart document.

Users are welcome to submit feedback. Developers can join the mailing list and our community, create amazing new applications from the source code, contribute to WebGL Earth, or help with reporting and fixing bugs.

We look forward to hearing from you. The future of this open-source project is in your hands!

by Petr Pridal (noreply@blogger.com) at December 15, 2014 08:47 AM

December 14, 2014

Tamas Szekeres

The new GISInternals site (providing Windows binary packages) is about to be released

GISInternals is an online system providing daily-built binary packages for the GDAL and MapServer projects. The ancestor of this system was set up in 2007 to provide the Windows buildslaves for the OSGeo buildbot. The build system in its current form (providing downloadable packages) has been operating since 2009. Since then, the system has been continuously improved by adding more and more packages to make life easier for the users and developers of these open source projects. During this time the number of visitors and downloads has continued to grow, and the site has been visited from more than 160 countries around the world.
In December 2014, I'm about to release a new version with several changes in the design and the architecture. The functionality has been divided into three parts: the GUI frontend, the file servers (PHP and ASP.NET backends) and the build server. This approach will increase the reliability and availability of the site by allowing multiple build servers (upload agents) to be used in the future.

If you would like to preview the upcoming version visit download.gisinternals.com.

Several further changes are planned for release in December, including support for newer compilers and additions to the packages (like MapCache and the new GDAL drivers).


by Tamas Szekeres (noreply@blogger.com) at December 14, 2014 11:20 PM

Free and Open Source GIS Ramblings

2nd edition of Learning QGIS

It’s my pleasure to announce that the updated and extended 2nd edition of Learning QGIS is available now.

I also want to take this opportunity to thank everyone who made the 1st edition such a great success!

This second edition has been updated to QGIS 2.6 and it features a completely new 6th chapter on Expanding QGIS with Python. It introduces the QGIS Python Console, shows how to create custom Processing tools, and provides a starting point for developing plugins.

Overall, the book has grown by 40 pages and the price of the print version has dropped by 3€ :-)

Happy QGISing!



by underdark at December 14, 2014 06:27 PM

Markus Neteler

GRASS GIS 7: Vector data reprojection with automated vertex densification

GRASS GIS 7 just got better: when reprojecting vector data, automated vertex densification is now applied. This reduces the reprojection error for long lines (or polygon boundaries). This improvement was kindly added to v.proj by Markus Metz.
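The actual implementation lives in v.proj's C code, but the densification idea itself is easy to sketch. The following is a minimal illustration (the function name and parameters are mine, not the GRASS API): before reprojecting, each long segment is split into pieces no longer than a maximum length, so the reprojected line can follow the curvature of the target projection instead of cutting straight between the original vertices.

```python
import math

def densify(coords, max_seg_len):
    """Insert intermediate vertices so that no segment is longer
    than max_seg_len (in the units of the input coordinates)."""
    out = [coords[0]]
    for (x0, y0), (x1, y1) in zip(coords, coords[1:]):
        # number of pieces this segment must be split into
        n = max(1, math.ceil(math.hypot(x1 - x0, y1 - y0) / max_seg_len))
        for i in range(1, n + 1):
            t = i / n
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

# The northern edge of the example box below: 30 degrees long.
edge = [(0.0, 60.0), (30.0, 60.0)]
dense = densify(edge, max_seg_len=1.0)  # 31 vertices instead of 2
```

Reprojecting `dense` point by point then yields a curved edge in the target CRS, while reprojecting only the two endpoints gives exactly the straight trapezoid edge the comparison below warns about.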

1. Example

As an (extreme?) example, we generate a box in LatLong/WGS84 (EPSG: 4326) which is of 10 degree side length (see below for screenshot and at bottom for SHAPE file download of this “box” map):

[neteler@oboe ~]$ grass70 ~/grassdata/latlong/grass7/
# for the ease of generating the box, set computational region:
g.region n=60 s=40 w=0 e=30 res=10 -p
projection: 3 (Latitude-Longitude)
zone:       0
datum:      wgs84
ellipsoid:  wgs84
north:      60N
south:      40N
west:       0
east:       30E
nsres:      10
ewres:      10
rows:       2
cols:       3
cells:      6
# generate the box according to current computational region:
v.in.region box_latlong_10deg
exit

Next we start GRASS GIS in a metric projection, here the EU LAEA:

# EPSG 3035, metric EU LAEA:
grass70 ~/grassdata/europe_laea/user1/
GRASS 7.0.0svn (europe_laea): >

Now we first reproject the map the “traditional way” (no vertex densification as in most GIS, here enforced by smax=0):

v.proj box_latlong_10deg out=box_latlong_10deg_no_densification \
  location=latlong mapset=grass7 smax=0

Then we do a second reprojection with new automated vertex densification (here we use the default values for smax which is a 10km vertex distance in the reprojected map by default):

v.proj box_latlong_10deg out=box_latlong_10deg_yes_densification \
  location=latlong mapset=grass7

Eventually we can compare both reprojected maps:

g.region vect=box_latlong_10deg_no_densification

# compare:
d.mon wx0
d.vect box_latlong_10deg_no_densification color=red
d.vect box_latlong_10deg_yes_densification color=green fill_color=none

Comparison of the reprojection of a 10 degree large LatLong box to the metric EU LAEA (EPSG 3035): before in red and new in green. The grid is based on WGS84 at 5 degree spacing.

The result shows how nicely the projection is now performed in GRASS GIS 7: the red line output is old, the green color line is the new output (its area filling is not shown).

Consider benchmarking this against other GIS packages… the reprojected map should not become a simple trapezoid.

2. Sample dataset download

Download box_latlong_10deg.shp for your own tests (1 kB).

The post GRASS GIS 7: Vector data reprojection with automated vertex densification appeared first on GFOSS Blog | GRASS GIS Courses.

by neteler at December 14, 2014 12:11 AM

December 13, 2014

Paul Ramsey

What to do about Uber (BC)

Update: New York and Chicago are exploring exactly this approach, as reported in the New York Times: "Regulators in Chicago have approved a plan to create one or more applications that would allow users to hail taxis from any operators in the city, using a smartphone. In New York, a City Council member proposed a similar app on Monday that would let residents “e-hail” any of the 20,000 cabs that circulate in the city on a daily basis."


Nick Denton has a nice little article on Kinja about Uber and how they are slowly taking over the local transportation market in cities where they have been allowed to operate.

it's increasingly clear that the fast-growing ride-hailing service is what economists would call a natural monopoly, with commensurate profitability... It's inevitable that one ride-sharing service will dominate in each major metropolitan area. Neither passengers nor drivers want to maintain accounts with multiple services. The latest numbers on [Uber], show a business likely to bring in nearly $1bn a month by this time next year, far ahead of any competitor

BC has thus far resisted the encroachment of Uber, but that cannot last forever, and it shouldn't: users of taxis in Vancouver aren't getting great service, and that's why there's room in the market for Uber to muscle in.

Like Denton, I see Uber as a mixed bag: on the one hand, they've offered a streamlined experience which is qualitatively better than the old taxi service; on the other, in setting up an unregulated and exploitative market for drivers, they've sowed the seeds of chaos. The thing is, many of the positive aspects of Uber are easily duplicable by existing transportation providers: app-based dispatching and payment aren't rocket science by any stretch.

As an American, Denton naturally reaches for the American solution to the natural monopoly: regulated private enterprise. In the USA, monopolists (electric utilities, for example) are allowed to extract profits, but only at a regulated rate. As Canadians, we have an additional option: the Crown corporation. Many of our natural monopolies, like electricity, are run by government-owned corporations.

Since most taxis are independently owned and operated anyways, all that a Crown taxi corporation would need to do is provide a central dispatching service, with enough ease-of-use to compete with Uber and its like. The experience of users would improve: one number to call, one app to use, no payment hassles, optimized routing, maybe even ride sharing. And the Crown corporation could use supply management to prevent a race to the bottom that would impoverish drivers and reduce safety on the roads.

There's nothing magical about what Uber is doing: they are arbitraging a currently inefficient system. But the system can save itself, and all its positive aspects, by recognizing that and reforming now. Bring on our next Crown corporation, "BC Dispatching".

by Paul Ramsey (noreply@blogger.com) at December 13, 2014 11:59 PM

Paulo van Breugel

Recode your raster file in GRASS GIS using a csv file

The two easiest ways to reclassify a raster layer in GRASS GIS are using the r.reclass or r.recode functions. Although both are easy enough to use, sometimes it would be nice if you could just provide the input layer and a simple table with re-class values to create new raster maps. A fairly trivial task, […]
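The post's own solution is cut off in this excerpt, but the general idea is simple enough to sketch. r.recode takes rules of the form old_low:old_high:new, so a small script can turn a two-column old,new CSV into a rules string and feed it to the module. A minimal sketch, assuming a two-column CSV layout (the helper name and the CSV format are my assumptions, not the post's code):

```python
import csv
import io

def csv_to_recode_rules(csv_text):
    """Turn 'old,new' CSV rows into r.recode rules, one 'old:old:new' per line."""
    rows = csv.reader(io.StringIO(csv_text))
    return "\n".join(f"{old}:{old}:{new}" for old, new in rows)

rules = csv_to_recode_rules("1,10\n2,20\n3,20")
print(rules)

# Inside a GRASS session the rules could then be piped to the module, e.g.:
#   grass.script.write_command('r.recode', input='landuse',
#                              output='landuse_new', rules='-', stdin=rules)
```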

by pvanb at December 13, 2014 01:44 PM

Free and Open Source GIS Ramblings

QA for Turn Restrictions in OSM

Correct turn restriction information is essential for the vehicle routing quality of any street network dataset – open or commercial. One of the challenges of this kind of information is that these restrictions are typically not directly visible on the map.

This post is inspired by a share on G+ which resurfaced in my notifications. In a post on the Mapbox blog, John Firebaugh presents the OSM iD editor, which should make editing turn restrictions straightforward: clicking on the source link makes the associated turn information visible. By clicking on the turn arrows, the user can easily toggle between allowed and forbidden.

iD, the web editor for OpenStreetMap, makes it even simpler to add turn restrictions to OpenStreetMap.

Editing turn restrictions in iD, the web editor for OpenStreetMap. Source: “Simple Editing for Turn Restrictions in OpenStreetMap” by John Firebaugh, June 6, 2014

But the issue of identifying wrong turn restrictions remains. One approach to solving this issue is to compare restriction information in OSM with the information in a reference data set.

This is possible by comparing routes computed on OSM and on the reference data, using a method I presented at FOSS4G (video): a turn restriction is basically a forbidden combination of links. If we compute the route from the start link of the forbidden combination to the end link, we can check whether the resulting route geometry violates the restriction or uses an appropriate detour:


Illustrative slide from my LBS2014 presentation on OSM vehicle routing quality – read more about this method and the results for Vienna in our TGIS paper or the open pre-print version
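In its simplest form, the check described above boils down to looking for the forbidden link pair in the computed route. A toy sketch of that test, with routes represented as plain link-ID sequences (this representation is my simplification; the method in the paper works on route geometries):

```python
def violates_no_turn(route, from_link, to_link):
    """True if the route traverses from_link directly followed by to_link,
    i.e. it ignores the turn restriction between the two links."""
    return any(a == from_link and b == to_link
               for a, b in zip(route, route[1:]))

# A router respecting the restriction should return a detour:
assert violates_no_turn([101, 102, 103], from_link=102, to_link=103) is True
assert violates_no_turn([101, 102, 205, 103], from_link=102, to_link=103) is False
```

If a route computed on one data set makes a turn the other data set forbids, that link pair is a candidate for manual review.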

It would be great to have an automated system comparing OSM and open government street network data to detect these differences. The quality of both data sets could benefit enormously from bundling QA efforts. Unfortunately, the open government street network data sets I’m aware of don’t contain turn information.


by underdark at December 13, 2014 11:36 AM

Tyler Mitchell

Data Sharing Saved My Life – or How an Insurer Reduced My Healthcare Claim Costs

It’s not every day that you receive snail mail with life-changing information in it, but when it does come, it can come from the unlikeliest sources.

My initial test results, showing problems with the liver

A year ago, when doing a simple change of health insurance vendors, I had to give the requisite blood sample.  I knew the drill… nurse comes to the house, takes blood, a month later I get new insurance documents in the mail.

But this time the package included something new: the results of my tests.

The report was a list of 13 metrics and their values, including a brief description about what they meant and what my scores should be.  One in particular was out of the norm.  My ALT score, which helps measure liver malfunction, was about 50% higher than the expected range.

Simple Data Can Be Valuable

Here is the key point: I then followed up with my family doctor, with data in hand.  I did not have to wait to see symptoms of a systemic issue and get him to figure it out. We had a number, right there, in black and white. Something was wrong.

Naturally, I had a follow-up test to see if it was just a blip.  However, my second test showed even worse results, twice as high in fact!  This led to an ultrasound and more follow-up tests.

In the end, I had (non-alcoholic) Fatty Liver Disease.  The condition is most commonly seen in alcoholics, so it was a surprise, as I don’t drink.  It was due solely to my diet and the weight I had put on over several years.

It was a breaking point for my system and the data was a big red flag calling me to change before it was too late.

A chart showing some of my earlier tests, loaded into WellnessFX.com for visualisation.

Not impressed with my weight or my other scores, I made simple but dramatic changes to improve my health.*  The changes were so dramatic that my healthcare provider was very curious about my methods.

By only making changes to my diet I was able to get my numbers to a healthy level in just a few months.  In the process I lost 46 pounds in 8 months and recovered from various other symptoms.  The pending train wreck is over.

Long Term Value in Sharing Healthcare Data

It’s been one year this week, so I’m celebrating, and it is all thanks to Manulife (or whoever does their lab tests) taking the initiative to send me my results.

It doesn’t take long to see the business value in doing so, does it?   I took action on the information and now I’m healthier than I have been in almost 20 years.  I have fewer health issues, will use their systems less, will cost them less money, etc.

Ideally it also benefits the group plan I’m in, since I’m now a lower-cost user of the system.  I hope both insurers and employers take this to heart and follow suit, giving their people the data they need to make life-changing and cost-reducing decisions like this.

One final thought: how many people are taking these tests right now?  Just imagine what you could do with a bit of data analysis of their results.  Taking these types of test results, companies could be making health predictions for their customers and health professionals to review.  That’s why I’m jumping onto “biohacking” sites like WellnessFX.com to track all my scores these days, and to get expert advice on next steps or access to additional services.

I’m happy with any data sharing, but why give me just the raw data when I still have to interpret it?  I took some initiative to act on the results, but what if I had needed more incentive?  If I had been told “Lower your ALT or your premiums will be 5% higher”, I would have appreciated that.

What’s your price?  If your doctor or insurer said “do this and save $100” – would you do it?  What if they laid the data out before you and showed you where your quality of life was headed, would it make a difference to you?

I’m glad I had this opportunity to improve my health, but at this point I just say thanks for the data … and pass the salad please!

Tyler


* I transitioned to a Whole Food – Plant Based diet (read Eat to Live and The China Study).  You can read more about the massive amounts of nutrition science coming out every year at NutritionFacts.org or read research papers yourself.

The post Data Sharing Saved My Life – or How an Insurer Reduced My Healthcare Claim Costs appeared first on spatialguru.com.

by Tyler Mitchell at December 13, 2014 07:53 AM

December 12, 2014

BostonGIS

AvidGeo LocationTech Tour Boston 2014, Synopsis

This past Monday we gave an introductory workshop on PostGIS. Our slides are here: PostGIS tutorial. Overall the event was very educational. I did have butterflies throughout, since we were the last to speak.

Talk slides

The talks were good, but most of the topics were not new to me and therefore harder to appreciate. The best talk, I thought, was given by Steve Gifford on Geospatial on Mobile. Two things I liked about it: it covered a topic I didn't know much about (developing native 3D globe map apps for iOS and Android), and Steve was a very funny speaker, with the kind of funniness that looks unplanned and natural. I wish it had a bit more Android content, though. The full list of talks and associated slides is below.

Workshops

The workshops, I think, were the best part of the event. I particularly enjoyed Andrew Hill's CartoDB talk (PostGIS without the pain but with all the fun), mostly because I hadn't used CartoDB before, so this was the first time I'd seen it in action. Being able to drag and drop a zip file from your local computer, or point at a URL of data on some site, and have a table ready to query was pretty cool. You could also schedule it to check for updates if it was a URL to a shapefile zip or something. Of course, being able to write raw PostGIS spatial queries and build a map app in 5 minutes was pretty slick.

Ours came after, and unfortunately I think it was pretty dry and too technical for most GIS folks. Plus, we hadn't told people to download the files beforehand, so it was next to impossible to follow along. We should have called it PostGIS with the pain and on a shoestring budget.

by Regina Obe (nospam@example.com) at December 12, 2014 10:11 PM

Geomatic Blog

The Null Island Algorithm

We geomaticians like to gather around a mythical place called Null Island. This island has everything: airports, train stations, hotels, postcodes, all kinds of shops, a huge number of geocoded addresses, and whatever geographical features end up at the (0,0) coordinates thanks to null coordinates from some buggy geoprocessing pipeline.

But earlier this year, some geonerds such as @mizmay and @schuyler realized that there is no single Null Island, but one Null Island per datum / coordinate system (depending on who you ask). And @smathermather had the spare time to find out what the “Null Archipelago” looks like:

(Null Archipelago image by @smathermather, containing map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under CC BY-SA)

Fast forward a few months. I received an e-mail from one of my university peers, asking for help with a puzzle:

A friend of mine received a puzzle with some coordinates. He has to find a place on earth represented by 861126.41, 941711.64.

It’s supposed to be a populated place. Any ideas?

Well, off the top of my head, those looked a bit like UTM coordinates – two digits after the decimal point, suggesting centimeter precision… but the easting is way off the valid range for UTM coordinates.

And I realized this is the Null Archipelago problem all over again; but instead of plotting (0,0) on a map, let’s plot every point having the coordinates (861126.41, 941711.64) in any reference system.

Cue PostGIS. We can create a point in every CRS like so:

select srid, ST_GeomFromText('POINT(861126.41 941711.64)',srid) as geom
 from spatial_ref_sys;

Note the complete absence of PL/SQL in there.

But it will be much easier to work with the data if all the points are in our beloved EPSG:4326 latitude-longitude coordinate system. And while we’re at it, let’s materialize that data into a table:

create table archipielago as
 select srid, ST_Transform(ST_GeomFromText('POINT(861126.41 941711.64)',srid),4326) as geom
 from spatial_ref_sys;

But there is a problem with this – the PostGIS query will crash due to some CRSs having an empty Proj4 string. This took me a while to trace and fix:

create table archipielago as
 select srid, ST_Transform(ST_GeomFromText('POINT(861126.41 941711.64)',srid),4326) as geom
 from spatial_ref_sys where proj4text!='';

And now we can take this data out into a file… but once again, there’s a catch: some of the coordinates are out of bounds and represented by an (∞, ∞) coordinate pair. Even though file formats can handle ∞/-∞ values (good thing we know how the IEEE floating point format works, right folks?), some mapping software cannot accommodate these values. And I’m looking at you, CartoDB upload page.

In this particular case, the only out-of-bounds points are at (∞, ∞), so the data can be cleaned up in just one pass:

delete from archipielago where ST_X(geom)>180;

Then just add a tiny bit of CartoDB magic, and publish a map:


https://ivansanchez.cartodb.com/viz/1ac4a786-805a-11e4-bc48-0e853d047bba/public_map

I still don’t know if the original puzzle has anything to do with any obscure used-in-the-real-world CRS, but at least it’s worth a try.


Filed under: GIS

by RealIvanSanchez at December 12, 2014 12:15 PM

December 11, 2014

Boundless Blog

OpenGeo Suite 4.5 Released!

Boundless is proud to announce the release of OpenGeo Suite 4.5! Each new version of OpenGeo Suite includes numerous fixes and component upgrades, as well as many new features and improvements to the platform.


Try it!

Download OpenGeo Suite 4.5 and try our census map tutorial or heat map tutorial to learn more. Details about this release are included in the release notes and, as always, we strongly advise you to read the updating instructions and back up your data before installing.

Want to try out new features like Composer? Interested in support and maintenance from the experts at Boundless? Contact us to learn more about our OpenGeo Suite Enterprise offerings!

Use promotional code suite45 for a discount of 45% on training for the next week.

The post OpenGeo Suite 4.5 Released! appeared first on Boundless.

by Rolando Peñate at December 11, 2014 03:00 PM

OSGeo News

ZOO-Project 1.4.0 release

by aghisla at December 11, 2014 12:13 PM

gvSIG Team

gvSIG 2.1: from Excel to gvSIG

During the last Google Summer of Code, a new plugin for gvSIG 2.1 was developed that allows loading data saved in Microsoft Excel format.
This plug-in will be included by default in the next build of gvSIG, but you can already test it now if you wish.
Normally we install plug-ins through the Add-ons manager, either by selecting “Standard Installation”, which gives access to the basic plug-ins included in the gvSIG distribution, or through “Install from URL”, which gives access to further plug-ins available on the gvSIG remote repository.
We can also install any plug-in with the “Install from file” option; this can be very useful for testing extensions that are neither in the standard distribution nor on the remote repository.
Let’s first have a look at a video showing how to install the Excel plug-in; the file can be downloaded from here.

Once it is installed, restart gvSIG and you will see that the Excel format is now supported when adding a new table.
Through this plug-in we can:

  • Load Excel spreadsheets as tables
  • Load Excel spreadsheets as layers

From gvSIG we can define the following properties of the Excel file to be loaded. The main ones are:

  • File: the file path.
  • Locale: drop-down list to select the locale that determines the characters used as thousands and decimal separators.
  • Sheet to load: drop-down list to select which sheet of the Excel file will be loaded as a table.
  • Use first row as header: if this option is activated, the first row will be used for the field names.
  • CRS: if the Excel worksheet contains coordinates, this parameter allows you to specify their coordinate reference system.
  • Point (X, Y, Z): names of the fields containing the coordinates. If the Excel sheet contains coordinates, at least the X and Y fields have to be indicated.

We can also define other properties (in the “Advanced” tab), for example forcing the field types when loading the table. More detailed information can be found in the plug-in manual.
As mentioned, in gvSIG 2.1 an Excel spreadsheet can be loaded as a table or, if it contains coordinates, directly as a layer.
Let’s see an example where we load, as a table, an Excel spreadsheet containing the average age of the population of Africa. In this example we have indicated that the first row contains the column names.

In another example we load, directly as a layer, an Excel spreadsheet that contains fields with coordinates. In this case we define the CRS and the names of the fields containing the x and y coordinates, called “X” and “Y”.


Filed under: english, gvSIG Desktop Tagged: Excel, gvSIG 2.1, Table

by Giuliano Ramat at December 11, 2014 09:34 AM

gvSIG Team

gvSIG 2.1: from Excel to gvSIG

During the last Google Summer of Code, a new add-on for gvSIG 2.1 was developed that adds support for data in Microsoft Excel format.

This plugin will be included by default in the next gvSIG build, but those who want to try it beforehand can already do so.

Normally we install add-ons either by selecting “Standard installation”, that is, accessing the add-ons included by default in the gvSIG distribution, or through “Installation from URL”, that is, accessing the add-ons available on the gvSIG remote repository.

We can also install any add-on with the “Installation from file” option; this can be very useful for testing extensions that are not yet in the standard distribution or in a remote repository.

Let’s watch a first video showing how to install the Excel add-on, whose file can be downloaded from here.

Once it is installed, we restart gvSIG and will see that the Excel format is now supported when adding a new table.

With this add-on we can:

  • Load Excel sheets as tables
  • Load Excel sheets as layers

From gvSIG we can define the following properties of the Excel file we want to add. The main properties are:

  • File: the file path.
  • Locale: drop-down list to select the configuration that determines the characters used as thousands and decimal separators.
  • Sheet to load: drop-down list to indicate which sheet of the Excel file we want to add as a table.
  • Use first row as header: if we activate this option, the first row will determine the field names.
  • CRS: if the Excel sheet contains coordinates, this parameter indicates their coordinate reference system.
  • Point (X, Y, Z): names of the fields containing the coordinates. If the Excel sheet contains coordinates, at least the X and Y fields must be indicated.

We can also define other properties (in the “Advanced” tab), such as the option to force the field types when loading the table. Detailed information can be found in this add-on’s manual.

As we have mentioned, in gvSIG 2.1 an Excel sheet can be added as a table or, if it contains coordinates, directly as a layer.

Let’s look at a first example in which we load an Excel sheet as a table, representing the average age of the population of Africa. In this example we have indicated that the first row contains the column names.

Now let’s look at an example in which we load an Excel sheet containing coordinate fields directly as a layer. In this case we must indicate the CRS and the names of the fields containing the x and y coordinates, which in our case are called “X” and “Y”.


Filed under: gvSIG Desktop, opinion, spanish Tagged: Excel, gvSIG 2.1, tabla

by Alvaro at December 11, 2014 08:23 AM

December 10, 2014

gvSIG Team

gvSIG 2.1: Alphanumeric editor at the view

One of the new features included in the latest gvSIG 2.1 builds, thanks to the Brazilian company GAUSS geotecnologia e engenharia, is an easy and useful tool: an alphanumeric editor that allows editing the attributes of any element of a layer without having to open its table.

It works similarly to the “Info by point” tool, but in this case we can edit the attributes of the selected element. In this way the editing tasks of gvSIG users are sped up.

Here we can see a video about this tool:


Filed under: english, gvSIG Desktop Tagged: editing, editor, gvSIG 2.1

by Mario at December 10, 2014 04:20 PM

gvSIG Team

gvSIG 2.1: Derived Geometries

The function called Derived Geometries is now available in gvSIG 2.1. We can install it, as usual, through the Add-ons Manager.

And how does this new function work?

It creates layers of polygons or lines from a point layer, or polygons from polylines.

Let’s see how the Derived Geometries tool works through a practical example. The steps detailed in the video are:

  • Load a point layer obtained from a GPS survey conducted on a set of buildings.
  • Add the point layer as an event layer, indicating which fields contain the X and Y coordinates.
  • Export the event layer as a point shapefile, then label it to better show the points belonging to each building.
  • Run the Derived Geometries tool to generate a polygon layer representing the different buildings.


Filed under: english, gvSIG Desktop Tagged: derived geometries, gvSIG 2.1, vector edition

by Alvaro at December 10, 2014 09:37 AM