Welcome to Planet OSGeo

March 28, 2015

Tyler Mitchell

IoT Day 2: Cloud Services for Energy Monitoring

Energy monitoring isn’t only about knowing what’s happening right now; it’s also about understanding what happened historically.  Often that means knowing not just what was happening, but also when and why.  Enter cloud services for energy monitoring.  They range from simple charts and graphs to predicting your usage over time – essentially storing, analysing and enriching your raw data.
In this post I review two cloud services that I’ve tried so far and show you what you get from them.

This is part 2 of a series of posts about the Internet of Things applied to Home Energy Monitoring.

See my post from Day 1 – getting started.

Cloud Service Options

The Rainforest Automation Eagle unit I have comes pre-configured with support for a few different cloud services: Wattvision, BuildingOS, Bidgely, BlueDot.  (Ignore the Powereye entry for now, that’s just me mucking around.)

Eagle monitor cloud service configuration options

To enable a service:

  1. Create an account on the website for one of the services (links above)
  2. Provide them with your Cloud ID as shown in the config screens on the Eagle
  3. Select the cloud provider in the Eagle settings and choose Set Cloud
  4. Wait – Give it a minute to get started.  It will restart some stuff on the Eagle device
  5. Visit the service provider website to see what’s going on
BlueDot service config screen
Bidgely energy monitoring config settings

Behind the Scenes

What exactly happens when you choose a cloud provider in the Eagle settings?

When one is chosen, the Eagle device transmits your energy data to the provider.  With every reading it generates a data packet, including your device identifier, for the provider to read and use in its system.  The system detects that the data is yours and makes it available to your account.

The bonus is that you essentially end up with a copy of the data available elsewhere, on another server.  What that server does with it afterwards is its own business.  Each provider has its own website tools and/or mobile apps that access your data and the results of their services.

So don’t expect anything on the Eagle screens to change when these are configured – it’s only sending a copy of the data, and you have to go to those other sites/apps to get the added value.

As an aside, you could even set up your own server and stream your data to it!  More on that in the future.
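For the curious, a DIY endpoint could be as small as the Python sketch below. The Eagle's actual upload format isn't covered here, so the JSON body, field names and port in this sketch are purely assumptions for illustration:

```python
# Minimal sketch of a DIY endpoint for the Eagle's custom cloud option.
# The upload format is an assumption: this pretends each push is a small
# JSON body carrying a device id, a timestamp and a cumulative kWh reading.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

readings = []  # in-memory store; a real server would write to a database

def record_reading(raw_body):
    """Parse one hypothetical upload and append it to the store."""
    data = json.loads(raw_body)
    reading = {
        "device": data["device_id"],
        "timestamp": data["timestamp"],
        "kwh": float(data["meter_kwh"]),
    }
    readings.append(reading)
    return reading

class EagleHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        record_reading(self.rfile.read(length).decode("utf-8"))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Uncomment to actually listen for pushes on port 8080:
    # HTTPServer(("", 8080), EagleHandler).serve_forever()
    pass
```

Once the data is landing on your own box, everything else (charts, alerts, exports) is up to you.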


BlueDot

BlueDot was the first service I tried.  It provides a very basic set of features, pretty much all of which are shown in this chart summary screen.

BlueDot Energy Monitor Dashboard

The charts seem to show the data in 15-minute increments.

I haven’t run it long enough to really get full value from the service.  At the bottom of the screen there is a My Recommendations section.  As it analyses more data it will add value in this spot.  They explain on their website:

Our cloud-based servers constantly analyze your home energy usage and use this information to create personalized money-saving recommendations and special offers. These recommendations are sent to the “Just For You” page in your BlueDot web browser or smartphone application.

Sounds good to me!  I hope to give it a more thorough test in the future.

If all you want is a simple chart right out of the box, this was very easy to get started with.


Bidgely

Most of the past day I’ve been running the Bidgely service, so I’ve now got 24 hours’ worth of data to show.

Bidgely Energy Monitor Dashboard

The dashboard is rich with information.  There are also two chart views, one showing usage (“demand”) and one showing an energy billing estimate.

During the configuration stage you tell it which utility you get your energy from, and it will configure the rate.

The most interesting part of Bidgely is the application of data analytics.  To start with, they have a tool that helps you train their system and gives you an estimate of how much it costs to run an appliance.  You can use the mobile app or the website to tell it, for example, that you are about to turn on your air conditioner.  The service will watch for a few seconds or a minute and identify what kind of spike that device creates.

Bidgely Appliance Detective Mode

It is actually fun to go through their list and activate all the appliances you own!
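Under the hood, this kind of "detective mode" presumably boils down to comparing demand just before and just after you flip the switch. A rough Python sketch of that idea – the threshold and window here are made up, and this is not Bidgely's actual algorithm:

```python
# Rough sketch of appliance-detective-style training: compare average
# demand before and after the appliance is switched on, and treat the
# difference as that appliance's load. Threshold is an illustration only.
def estimate_appliance_load(before_kw, after_kw, min_step_kw=0.1):
    """before_kw / after_kw: lists of demand samples (kW).

    Returns the estimated step in kW, or None if no clear spike was seen.
    """
    baseline = sum(before_kw) / len(before_kw)
    loaded = sum(after_kw) / len(after_kw)
    step = loaded - baseline
    return step if step >= min_step_kw else None
```

For example, a baseline hovering around 0.45 kW that jumps to about 1.95 kW when the kettle goes on would be reported as a ~1.5 kW appliance.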

The list is not comprehensive yet, but I expect it will grow longer in the future.  Based on comments in their community forums, users also want to be able to track other appliances, or even add custom ones.

Bidgely’s main focus is not just on showing the data, but on identifying which appliances use what amounts of energy.  You can see this in the dashboard screen in the centre-left.  I called my system Whole House.  They automagically created a sub-group called Always On – that’s the baseline usage that never seems to go away. I’ll be attacking those energy vampires in the future too!

As they collect more data, my appliances will start to show up here as well.   That will be really cool.  I’m not sure how long it will take, but it sounds like it might be a month or two before that happens.

At the end of the day I’ll be able to see which appliances use the most power and then adjust accordingly.


So far I haven’t made any huge discoveries or conclusive decisions about how to save on energy.  However, I did realise that the water heater is a secret sucker – it is so far out of sight and out of mind that I didn’t even think about it until I saw a strange plateau spike at 11pm, when I had already shut down all my appliances and after I had showered before bed.

My space heaters are the largest culprits so far and next is the oven.  I figured this out just by looking at the times and watching what was going on around me.

At this pace I’ll soon be able to have my own, personal, conclusive evidence of which is better for heating tea – microwave or stove top.  Then everything will be right in the world!

Stay tuned for a review of more cloud services in my next post of this series.  Find me on Twitter to share more ideas or leave a comment to tell me what other areas you would like to learn about on this topic.

The post IoT Day 2: Cloud Services for Energy Monitoring appeared first on spatialguru.com.

by Tyler Mitchell at March 28, 2015 01:00 PM

March 27, 2015

GeoTools Team

CodeHaus Migration Schedule

As per an earlier blog post, CodeHaus is shutting down and the GeoTools project is taking steps to migrate our issue tracker and wiki to a new home.

First up, I need to thank the Open Source Geospatial Foundation for responding quickly in a productive fashion. The board and Alex Mandel were in a position to respond quickly and hire a contractor to work with the system admin committee to capture this content while it is still available.

I should also thank Boundless for providing me time to coordinate the CodeHaus migration, and Andrea for arranging cloud hosting.

Confluence Migration

It is scheduled for ... now! I have taken a copy of the CodeHaus wiki and will be migrating proposals and project history. An HTML dump of the wiki is published at old.geotools.org so we have a record.

The new wiki is available here: https://github.com/geotools/geotools/wiki
GitHub Wiki

Jira Migration

Jira migration is scheduled for 00:00 UTC Saturday March 28th.

On Saturday all issues will be migrated to their new home (and CodeHaus issue creation will be disabled). If you wish to lend a hand testing please drop by the #osgeo IRC channel on Saturday. Harrison Grundy will be coordinating the proceedings.

We have set up a new JIRA available at osgeo-org.atlassian.net for the migration. If you need access to the new issue tracker please contact Andrea or Jody.

OSGeo Jira
As shown above, a few friendly CodeHaus refugees have also been sheltered from the storm (uDig and GeoAPI).

by Jody Garnett (noreply@blogger.com) at March 27, 2015 11:19 PM

GeoTools Team

FOSS4GNA Code Sprint Replacing Vecmath

As Torben indicated on the GeoServer blog, we got together for a one-day sprint after the foss4gna conference. Torben stole my picture for that post, so I will have to stick to content ...

Our topic of choice ... replacing vecmath. The use of the vecmath library has been a long-standing piece of "technical debt" for GeoTools. How long-standing? The issue tracker number is GEOT-22.

So what is the problem with vecmath?

The vecmath.jar is used by gt-referencing for coordinate system transformations. We only use a couple of classes from the library, primarily to implement GeneralMatrix for use in MathTransforms.
GeneralMatrix extending GMatrix
There is one small problem with this idea - vecmath is not open source! Technically vecmath is distributed as part of Java 3D (an extension to the Java Virtual Machine). As an extension to Java it was distributed under the same licenses (Sun Binary License and a Research License) as Java.

With the GeoGig project going through LocationTech incubation, we have a couple of ways to use jars:
  • prerequisite: open source jars required to run
  • works with: optional jars that extend the functionality if present. These may be proprietary, like an Oracle database driver.
Although the vecmath license is fine for distribution, it does not meet the strictly open source policies required for the GeoGig project. In this case we want GeoTools to include matrix math, so we needed to shop around for a replacement.

The use of vecmath (as a non-open-source dependency) also causes trouble for Rich Fecher's proposal to publish GeoTools on Maven Central.

Enter EJML

With a technical debt page capturing research, some email discussion, and a great lunchtime conversation at foss4gna with Eric Engle (the overlap with EclipseCon was good for something), we settled on the recommended Efficient Java Matrix Library (EJML).

GeneralMatrix delegating to DenseMatrix64F
The strategy here is to delegate to the DenseMatrix64F implementation provided by EJML, and implement the methods we expect from XMatrix in terms of this new implementation.

The EJML library has a similar arrangement, with SimpleMatrix wrapping a DenseMatrix64F in an API friendly to casual developers. We were able to use SimpleMatrix as a guide, saving a lot of time.
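The delegation pattern itself is easy to sketch. Here is a Python analogue (the names are illustrative only, not the GeoTools or EJML API), with a bare 2-D container standing in for DenseMatrix64F and a wrapper playing the role of GeneralMatrix:

```python
# Illustrative Python analogue of the delegation strategy: the wrapper
# keeps the interface callers expect, while all actual math is done by
# the (replaceable) backend type. Names are not real GeoTools/EJML API.
class DenseBackend:
    """Stand-in for EJML's DenseMatrix64F: a bare 2-D float container."""
    def __init__(self, rows):
        self.data = [list(map(float, r)) for r in rows]

    def mult(self, other):
        a, b = self.data, other.data
        return DenseBackend(
            [[sum(a[i][k] * b[k][j] for k in range(len(b)))
              for j in range(len(b[0]))] for i in range(len(a))])

class GeneralMatrix:
    """Wrapper exposing an XMatrix-style interface; math is delegated."""
    def __init__(self, rows):
        self._impl = DenseBackend(rows)

    def multiply(self, other):
        product = GeneralMatrix.__new__(GeneralMatrix)
        product._impl = self._impl.mult(other._impl)
        return product

    def get(self, i, j):
        return self._impl.data[i][j]
```

The payoff of this arrangement is that swapping the backend (vecmath for EJML, here `DenseBackend` for anything else) never touches code that only talks to the wrapper.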

How to help

While the code sprint was a success in proving the approach, there is a bit of work to go:
  • Tyler (from GeoGig) is working on removing the dependency on vecmath (there are a few other Exceptions and data structures in our API that need to be removed).
  • Jim (from GeoMesa) wants to write up more test cases to check for regressions between vecmath and EJML
  • Although Rich Fecher (from GeoWave) was unavailable to take part in the code sprint - his inspiration to work on this now means we will be hitting him up for a review when the work is complete. Thanks Rich!
  • And there is always the question of performance ... will it be faster?
To take part, or to review our work, see this branch on GitHub:
Thanks to the Boundless Victoria staff, Jim and Andrea for really getting behind this work. This kind of upkeep keeps the community ticking along and helps the library be used in more places.

I would also like to thank the new crop of projects using GeoTools for taking part and contributing upstream. It is important to keep our community diverse and your participation is both welcomed and appreciated.

by Jody Garnett (noreply@blogger.com) at March 27, 2015 10:06 PM

Slashgeo (FOSS Articles)

GeoServer 2.7 Released

The popular open source geospatial data server GeoServer version 2.7 was released earlier this week.

The new features highlighted in the announcement:

  • Color composition and color blending
    • These are two new extensions to the rendering engine that allow for greater control over how overlapping layers in a map are merged together
  • Relative time support in WMS/WCS
  • WPS:
    • WPS clustering
    • WPS security
    • WPS limits
    • WPS dismiss
  • CSS extension refresh
  • Cascade WFS Stored Queries

The post GeoServer 2.7 Released appeared first on Slashgeo.org.

by Alex at March 27, 2015 07:44 PM

Slashgeo (FOSS Articles)

OpenLayers 3.4.0 Released

The popular open source web mapping library OpenLayers version 3.4.0 has been released today.

Some new features according to the announcement: “Dateline wrapping has been added for tile sources, you can most clearly see this in the new wms-tiled-wrap-180 example. The draw interaction can now draw circles as well, check out the updated draw features example and select Geometry type Circle. […] Another interesting change is the ability to allow GeoJSON to be serialized according the right-hand rule, this refers to the orientation of the rings. This is very important for interoperability of GeoJSON, with systems such as Elasticsearch. […] In version 3.3.0 support for ArcGIS REST services was released […]”

Here’s the full changelog covering all versions from 3.1.0 to 3.4.0, including versions that were not mentioned on the official OpenLayers blog.

The post OpenLayers 3.4.0 Released appeared first on Slashgeo.org.

by Alex at March 27, 2015 07:05 PM

Tyler Mitchell

IoT Day 1: Home Energy Monitoring

Eagle energy home monitoring IoT device


In my next series of blog posts, we explore an Internet of Things topic – home energy monitoring – from a first-person perspective.  Join me as I install, use and hack a monitor (and related cloud services) in my new home.

Bringing IoT Home

I recently moved into a new home that uses electric heat exclusively.  Having come from a natural gas forced air furnace, I wasn’t sure what to expect in terms of cost.  I used to just keep the thermostat at 20C and forget about it.  After hearing some horror stories about $800 electricity bills for heating this place, monitoring it became a little more important to me.  So, I ordered an energy monitor.

BC Hydro helps provide a discount to get an Eagle system from Rainforest Automation, pre-programmed for your meter.  It arrived in a box today.  Here’s what I was able to do with it out-of-the-box and how I’ll be using it in the future.

Eagle energy monitor IoT device from Rainforest Automation

In my next post I’ll talk about how I connect it to some cloud services and see what they provide.  Ultimately, it is an open platform boasting a REST API and developer documents that I’ll be digging into.

Smart Meter 101

Smart meters communicate wirelessly using a Zigbee-compatible protocol.  The Eagle monitoring unit captures data from that broadcast (the current meter reading counter) and stores it on the device with a timestamp.  This leads to some other questions as well, especially for the DIY device hacker.

Are there other ways of capturing that data without Zigbee – e.g. optically?  I think there are, but I haven’t investigated much, as this option was pre-configured and not horrendously expensive (~$60) for an embedded system with a web server, Zigbee and more (can’t wait to hack this sucker!).

Does the meter actually store any historical data?  I’m hoping to find out, but I believe all it spits out is the current meter counter – so it doesn’t matter if a bunch of transmissions are missed; it will correct itself later on.
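Because the counter is cumulative, usage for any interval is just the difference between two readings – a missed broadcast merges two intervals instead of losing energy. A quick Python sketch of that idea:

```python
# Turn cumulative meter readings into per-interval consumption.
# A missed broadcast only merges two intervals; no energy is lost,
# because the counter is cumulative. Readings are (time, kwh) pairs.
def interval_usage(readings):
    """readings: list of (timestamp, cumulative_kwh), sorted by time."""
    usage = []
    for (t0, c0), (t1, c1) in zip(readings, readings[1:]):
        usage.append((t0, t1, round(c1 - c0, 3)))
    return usage
```

For instance, if the reading at hour 2 is missed, the hour 1 to hour 3 interval simply reports the combined consumption of both hours.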

Can I read my neighbours’ meters?  It looks like you need some special ID numbers generated from a hydro service account to access a meter, so I doubt I’ll be trying it on others.

In-Device Application

After plugging it in and connecting it to my router with an Ethernet cable, all the green lights came on except the Cloud light (as I hadn’t configured it yet).  The unit had a label on the underside with the name of the device as it would appear on my network, eagle-xxxxxx.local – I just checked my router and used the IP directly.

First they tell you to go to their website and enter a few codes from the label.  I didn’t get too far with that until I actually powered the device off once and restarted it.  Then I could hit the host/IP and start using it right away.  I did send a support request to them mentioning some of these issues and they actually updated their docs, so be sure to communicate early if you are working through a similar scenario yourself.

The Eagle unit runs a few services, e.g. the RESTful interface and a web server that you can hit directly when plugged into your local network.  The main part of the UI I’m interested in is a usage History graph, showing hourly, daily or weekly usage.  It also has a download feature so you can get a CSV text file showing the same data.
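Once you have the CSV export, summarizing it takes only a few lines. A Python sketch – the column names here are assumptions, so adjust them to whatever the actual export contains:

```python
# Sketch of crunching the Eagle's CSV export into daily totals.
# The "Date" and "Usage (kWh)" column names are assumptions about the
# export format; swap in the real header names from your file.
import csv
import io
from collections import defaultdict

def daily_totals(csv_text, date_col="Date", kwh_col="Usage (kWh)"):
    """Sum per-hour usage rows into daily totals, keyed by date string."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        day = row[date_col].split(" ")[0]  # keep the date, drop the hour
        totals[day] += float(row[kwh_col])
    return dict(totals)
```

From there it is a short hop to plotting day-over-day usage or spotting which days the heaters ran hardest.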

I’m not sure yet how much data it will retain – I guess we’ll find out tomorrow.

Graph showing electricity usage from an Eagle home energy monitor


Feeding External Services

Eagle energy monitor UI for configuring cloud connected services

An important part of the Settings in the unit is for configuring cloud services.  When configured, the Eagle unit can push data to an external service.  There are a few pre-defined ones – BlueDot, Bidgely, etc. – but also the option to add your own custom service URL.  (Which tells you that I think I already know what Part 3 of this series will be :) )

The only bummer I’ve noticed with the cloud services so far is that, at least with the one I’ve tested, they only work while the web connection is online.  Although the monitor will continue to store data and you can view it in the device’s history chart, it does not re-send records to an external service after a network failure.
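If you roll your own relay (via the custom service URL option), a store-and-forward queue works around this: buffer every reading locally and flush the backlog once the link comes back. A hedged Python sketch, where `send` stands in for whatever upload call you use:

```python
# Store-and-forward sketch for a DIY relay: queue readings locally and
# replay the backlog when the upload succeeds again. Purely illustrative;
# `send` is any callable that returns True on a successful upload.
from collections import deque

class StoreAndForward:
    def __init__(self, send):
        self.send = send
        self.backlog = deque()

    def push(self, reading):
        self.backlog.append(reading)
        self.flush()

    def flush(self):
        # Drain oldest-first; stop at the first failure so ordering and
        # the remaining backlog are preserved until the link returns.
        while self.backlog:
            if not self.send(self.backlog[0]):
                break
            self.backlog.popleft()
```

With something like this in front of the upload, an overnight outage just means a burst of catch-up uploads in the morning instead of a hole in the data.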

So, if you are like me and want to even shut down your Internet at night to save energy, just don’t expect to gain all the benefits.  More on that later.  Until then, I’ll be leaving it all on so we can see how things go.

Next Steps

My next planned steps are to get a baseline for my energy usage during the day.  It’s actually pretty easy for me to see how it’s going throughout the day as I work from home and several of us are in the house all day long.  It’s pretty cool so far to see a spike go up when heaters turn on or the hot water is getting low.

First World Problems

Although I’m going to try to get my energy bill down, I’m still amazed at how cheap our power is, especially relative to those in the world who don’t have access to any.  So if I look like I’m out of touch trying to save a few cents a day keep in mind, I’d pay 10x my current bill before I gave up my clean, hydro-generated, renewable power in place of a dark hut with a dung fire!  I’m very thankful I can even run these experiments.

More to come in Day 2 – as I enable a cloud service to help me make sense of my data.

The post IoT Day 1: Home Energy Monitoring appeared first on spatialguru.com.

by Tyler Mitchell at March 27, 2015 03:58 PM

OpenLayers Team

OpenLayers 3.4.0 released

The OpenLayers development team is proud to announce the release of OpenLayers 3.4.0. We have been on a monthly release schedule since the beginning of this year, so you can expect a fresh OpenLayers 3 release every month.

The 3.4.0 release brings some interesting new features as well as several bug fixes. Dateline wrapping has been added for tile sources; you can most clearly see this in the new wms-tiled-wrap-180 example.

The draw interaction can now draw circles as well, check out the updated draw features example and select Geometry type Circle. You will need to click once for the center and then drag to give the circle its radius, and finally click again to finalise the circle geometry.

Draw circle geometry

Another interesting change is the ability to have GeoJSON serialized according to the right-hand rule, which refers to the orientation of the rings. This is very important for the interoperability of GeoJSON with systems such as Elasticsearch. For all the details please refer to the pull request.
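The right-hand rule says exterior rings wind counter-clockwise (with holes clockwise). A quick way to check a ring is its signed shoelace area – positive means counter-clockwise. A small sketch (in Python, for illustration) over closed rings of (x, y) pairs:

```python
# Check ring orientation via the signed shoelace area: a positive area
# means the ring winds counter-clockwise, which is what the right-hand
# rule requires for exterior rings. Rings must be closed (first == last).
def signed_area(ring):
    area = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        area += x0 * y1 - x1 * y0
    return area / 2.0

def follows_right_hand_rule(exterior_ring):
    return signed_area(exterior_ring) > 0
```

A serializer that enforces the rule would simply reverse any exterior ring whose signed area comes out negative (and vice versa for holes).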

A new example was added to show how to use existing functionality to draw arrows in line strings.

In version 3.3.0, support for ArcGIS REST services was released – the result of a great contribution from outside the normal developer pool, of which we have had many lately, which is very exciting. Right after the 3.3.0 release we found out through user testing (thanks neogeo) that there was an issue with this source type on Retina screens. It was fixed a few hours after the report, and version 3.4.0 incorporates the fix.

Also we would like to draw some more attention to the fact that we are now exclusively using StackOverflow with the openlayers-3 tag for user questions. The ol3-dev Google group is now only to be used for developer communication.

We’re excited to be meeting up next week in Schladming, Austria for a developer sprint.


by Bart van den Eijnden at March 27, 2015 02:28 PM

March 26, 2015

Peter Batty

Review of FOSS4G NA 2015

Warning: this is a LONG post! Note: updated 3/25/2015 with links to presentation videos where available. TL;DR: FOSS4G NA was awesome. There were way more female presenters than at previous events. Mapbox was everywhere; Boundless wasn’t. Lots of new developments on vector tiles, including several efforts from companies other than Mapbox – even some promised from Esri! Progress towards Leaflet 1.0.

by Peter Batty (noreply@blogger.com) at March 26, 2015 03:06 AM

March 25, 2015

Free and Open Source GIS Ramblings

Time Manager 1.6 – now with feature interpolation

Over the last couple of weeks, Karolina has been very busy improving and expanding Time Manager. This post announces the 1.6 release of Time Manager, which brings you many fixes and exciting new features.

Screenshot 2015-03-25 17.58.38

What’s this feature interpolation you’re talking about?

Interpolation is really helpful if you have multiple observations of the same (moving) real-world object at different points in time and you want to visualize the movement between the observations. This can be used to visualize animal paths, vehicle tracks, or any other movement in space.

The following example shows a simple layer which contains 12 point features (3 for each id value).

Screenshot 2015-03-25 17.50.55

Using Time Manager interpolation, it is easy to create animations with interpolated positions between observations:


How is it done?

When you open the Time Manager 1.6 Settings | Add layer dialog, you will find a new option for interpolation settings. This first version supports linear interpolation of point features but more options might be added in the future. Note how the id attribute is specified to let Time Manager know which features belong to the same real-world object.

Screenshot 2015-03-25 17.43.08

For the interpolation, Time Manager creates a new layer which contains the interpolated features. You can see this layer in the layer list.

Screenshot 2015-03-25 17.46.13
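The idea behind the interpolation is straightforward to sketch (this is an illustration in Python, not Time Manager's actual code): group observations by the id attribute, then for any animation time, linearly interpolate the position between the bracketing observations of each object:

```python
# Linear interpolation of a moving point between timed observations,
# as done per-object (grouped by the id attribute) for the animation.
# Illustrative sketch only, not Time Manager's implementation.
def interpolate_position(observations, t):
    """observations: list of (time, x, y) for one object, sorted by time.

    Returns the (x, y) position at time t, clamped to the first/last
    observation outside the observed time range.
    """
    if t <= observations[0][0]:
        return observations[0][1:]
    if t >= observations[-1][0]:
        return observations[-1][1:]
    for (t0, x0, y0), (t1, x1, y1) in zip(observations, observations[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
```

So an object observed at (0, 0) and then at (10, 0) ten time units later is drawn halfway along, at (5, 0), when the animation reaches time 5.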

I’m really looking forward to seeing all the great animations this feature will enable. Thanks Karolina for making this possible!

by underdark at March 25, 2015 05:12 PM

Boundless Blog

My First FOSS4G

A Whole New World

Being a new developer at Boundless, fresh out of university with my degree in Computer Science, I haven’t had much practical experience yet. While I have a decent understanding of programming and awareness of popular languages and frameworks, I am still building my knowledge of geospatial software while working at Boundless on GeoServer. Open source communities, too, were something I knew about but had never been a part of.

So, with FOSS4G NA 2015, I was really diving into a whole new world. I didn’t know what to expect out of an open source conference, or if it would prove friendly to newbies.

Here is what I found.


Something that impressed me at the conference was the diversity. I found a wide range of people from varying backgrounds and levels of experience. Some were highly academic and studying sciences, while others were masterful cartographers. Then there were quite a few developers like myself who really weren’t veterans of geospatial. Notably, a sizable portion of the attendees were women, many of whom were presenters.

The topics were diverse too. The first day had some beginner-friendly sessions, including introductions to GeoServer and QGIS. Then there were “theme days”,  a concept I thought was awesome: Tuesday was PostgreSQL day, Wednesday was Big Data day, and Thursday was Web Mapping day. Other talks were going on, but I found the theme talks were especially popular. Plenty of beginner material was presented along with more advanced topics, so there was something for everyone.

Further, EclipseCon was hosted jointly with FOSS4G NA this year, providing even more diversity of people and backgrounds. Everyone who registered for one conference could attend sessions at the other. This allowed an interesting overlap of developers and scientists to mix and talk to each other. Seeing how people from different fields work differently can provide interesting insights and expand our knowledge.

Learning GIS and Software Development

Above all, my goal going into FOSS4G was to learn. A few sessions stood out in particular as great resources to me.

On the geospatial side, the list is too long to put everything here, but I’ll highlight the ones I found the most helpful and useful. For those interested in scripting with Python, the Intro to Spatial Data Analysis in Python by Jenny Palomino provided a lot of background on the many available libraries and frameworks for working with spatial data. Paul Ramsey’s PostGIS Feature Frenzy was great for introducing the power of PostGIS. There was also a whole educational “training academy” which was presented by Philip Davis in Building a Sustainable Open Source Training Academy Using QGIS. Finally, the Birds of a Feather session for GeoServer had the Boundless team as well as Andrea Aime from GeoSolutions there to answer questions and help people with GeoServer.

From the Eclipse side of things, I thought Shay Shmeltzer’s presentation on iOS Development with Eclipse and Java was particularly interesting, especially because a lot of people assumed that wasn’t possible! Another fascinating presentation was Think Before You Code by Lizzi Slivinski which promoted a good discussion about user experience (UX) and design in general. Finally, Katy DeCorah provided guidelines and considerations for writing in her presentation, Writing For Everyone.

Code Sprints and Hackathons

For the developers, there were plenty of opportunities to write some code and receive help from experienced members of the community. Tuesday night had a room dedicated for a hackathon, where I was able to meet with Andrea Aime to do some much needed bug fixing for GeoServer. Also, Boundless hosted an additional code sprint on Friday. Torben Barsballe, another new developer at Boundless working on GeoServer, and I got some help from Andrea to get started with CITE tests. The time went by really fast, but we got quite a bit done for only having a few hours. Thanks to the organizers for providing us a hackathon space, and to Boundless for the space for an additional code sprint.


FOSS4G really broadened my perspective. There are a lot of exciting things going on, especially with advances in web mapping and a greater desire to move to the cloud and process big data. It aligns well with what we’ve been working on for GeoServer and making sure it’s ready to scale up and out to meet client needs.

I think the best part about FOSS4G was that it felt welcoming. People were very open and friendly. Experts were happy to talk and share what they know, even with newbies. All the knowledge felt available for anyone who wished to pursue it. Regardless of who you are or where you stand, FOSS4G is a great experience.

Thank you to everyone who organized this year’s FOSS4G NA, and a big thanks to Boundless for sponsoring the event and giving me the opportunity to attend. Looking forward to next year.


The post My First FOSS4G appeared first on Boundless.

by Travis Brundage at March 25, 2015 01:16 PM

March 24, 2015

Boundless Blog

Embarking on a Journey with FOSS4G-NA 2015

I’m relatively new to Boundless as I build my career as a Java Developer – so it was timely that during the week of March 9 I was able to attend FOSS4G-NA in San Francisco. As someone new to software conferences like this, I’d like to offer some reflections to hopefully share how these events can be as positive an experience for the New Guy as they are for the veteran.

Here’s what I found – the conference schedule was well paced, with a good variety of presenters and presentations covering numerous topics in the FOSS4G space. There were also a number of events outside of the presentation schedule where I got to interact with other people who were here for the conference – in many ways, this was the most eye-opening part of the experience for me. Real work gets done at conferences outside of the sessions, not just sitting in the auditoriums.

As noted, I am a Java Developer, so I also felt lucky the event was co-located with EclipseCon. It gave me the opportunity to address multiple interests within one event, and I can only hope other conferences offer me this breadth of information.

Some of the highlights of my experiences at FOSS4G were:


PlanetLabs

PlanetLabs presented a number of talks about their project to image the entire planet at 3-5 m resolution using a fleet of satellites. These satellites are built in-house using an “agile aerospace” approach, which entails lots of small, cheap satellites with fast turnover. This is a novel change from the conventional monolithic aerospace development strategy, and allows PlanetLabs to deploy and test improvements and changes quickly and cheaply. Since each satellite is a relatively low investment it also means that individual failures are not catastrophic. PlanetLabs also hosted a social event at their office/lab, and showed us where they build and program their satellites and mapping software.

Mapping Drones

One of the themes of Wednesday was drones. I saw presentations on how to build your own drone, and on OpenDroneMap, a piece of software for rendering drone-collected imagery in three dimensions. I also attended a presentation about kites as an alternative to drones: they are stronger, cheaper, and can stay up longer, and people don’t feel threatened by kites like they do by drones. This is especially relevant with all the discussion about drones and privacy these days, and provides an interesting look into human psychology and how we are more accepting of what is familiar than of what is new.

Java 9 and Beyond

As part of EclipseCon, I attended a keynote on upcoming Java features. This included a discussion of a major feature of Java 9, the modular JVM. Even more interesting were some of the plans for future Java releases. These include a “value” class, which is essentially a class that behaves like a primitive, as well as primitive support for Java Generics. These future additions have been a long time coming. Primitive support for generics will be especially nice as it will eliminate the need to null-check every simple list of numbers, and greatly enhance memory efficiency as well.


Cesium

As most GIS people probably already know, Cesium is a JavaScript globe and mapping library. The Cesium team made a strong showing at FOSS4G with a demonstration of 3D temporal data visualization of GPS traces using Cesium. They also presented a number of cool demo applications (that you should totally check out), which are available online:

Overall, I found FOSS4G-NA to be a valuable experience, and I would be interested in attending future conferences if given the opportunity. For this FOSS4G, I tried to go to talks on a wide variety of topics to explore what sort of stuff was out there. While this was definitely valuable for me as a beginner, some things certainly went over my head. If I were to go to similar events in the future, I feel like I could focus more strongly on topics that would broaden and develop my skill-set as a GIS Java Developer.

The post Embarking on a Journey with FOSS4G-NA 2015 appeared first on Boundless.

by Torben Barsballe at March 24, 2015 07:19 PM

GeoSpatial Camptocamp

ngeo: a library combining AngularJS and OpenLayers 3

With ngeo, Camptocamp’s goal is to make developing applications based on AngularJS and OpenLayers 3 easier and more efficient. We are building a rich, flexible library suited to as many use cases as possible.

ngeo provides a set of AngularJS “directives” and “services”. Our goal, initially, is not to provide high-level, “turnkey” components, but to provide a collection of elementary components that can be combined in different ways according to the specific needs of the application.

The ngeo source code is available on GitHub. Usage examples are also available on Camptocamp’s github.io space.

Other open-source libraries combining AngularJS and OpenLayers 3 are available, the most current and substantial certainly being “angular-openlayers-directive”.

The approach taken by ngeo is very different from that of angular-openlayers-directive. The latter offers a declarative API for creating OpenLayers 3 maps and interacting with them; in particular, with angular-openlayers-directive the OpenLayers 3 map is created not by the application code but by the “openlayers” directive (provided by angular-openlayers-directive), based on the attribute values specified in the application’s HTML. angular-openlayers-directive therefore provides a declarative layer on top of OpenLayers 3.

Unlike angular-openlayers-directive, ngeo does not aim to provide a layer on top of OpenLayers 3. With ngeo, the application is responsible for creating the OpenLayers 3 map; it is created imperatively, in an application “controller”. The goal of ngeo is not to define another way of using OpenLayers 3; it is to provide the building blocks needed to develop rich mapping applications, as well as a way to use OpenLayers 3 and AngularJS together in an application.

Like OpenLayers 3, the ngeo code uses the Closure Library and is compiled with the Closure Compiler. ngeo is developed so that applications themselves can benefit from the Closure Compiler and Closure Library. The Closure Compiler checks JavaScript code statically and compresses it with very high compression rates compared to competing tools. Our experience shows it to be a valuable tool for developing large applications with strong performance constraints.

With ngeo, Camptocamp is therefore aiming for a very flexible, functionally rich, high-quality library. We hope this article has made you want to learn more about it. Feel free to post your questions or comments on the ngeo-discuss mailing list.

The post ngeo: a library combining AngularJS and OpenLayers 3 appeared first on Camptocamp.

by Eric Lemoine at March 24, 2015 02:49 PM

gvSIG Team

New extension: Create legends by scale

We carry on moving towards gvSIG 2.2 with a new add-on that allows working with legends by scale (also with labels, but we’ll see that in another post).

The plugin is called Complex Legend extension and, as usual, we can install it via the Add-ons Manager, selecting the testing repository URL from the drop-down list (Testing gvSIG repository – http://downloads.gvsig.org/download/gvsig-desktop-testing/).

Note: To have this extension in English on gvSIG 2.1 you will also need to install a translations update (it will be included in the next version). From the same testing gvSIG repository in the add-ons manager, install the Translations package (version 1.0.0-25).

Let’s see how it works…

We follow the usual procedure for changing the symbology of a layer: open the “Properties” window of the layer the legend will apply to (right-click on the layer name in the TOC of the view) and select the “Symbols” tab.


Among the different legend options available, we will see a new one: “Complex symbology”. We select it.


Then a form is presented in which we start to define the scale ranges and the type of legend to be used in each section.

We can add new ranges with the ‘add’ button (green cross icon) and delete the selected one with the ‘delete’ button (red cross).


Depending on the input data, the combo box will be populated and the bottom panel will show the form needed to fill in the specific parameters of the legend.


After this, we only need to repeat the operation for as many ranges as we want to define for the layer.


We click Accept and see the result. If everything has gone OK, the layer’s legend will change dynamically depending on the view scale.

Another interesting feature is that no information is shown in the scale ranges not covered by the legend, so we can make the layer “invisible” at undefined scales.

In another post, we will see how to create labels by scale.

Filed under: english, gvSIG Desktop, testing Tagged: gvSIG 2.2, legends, scale, symbology

by mjlobato76 at March 24, 2015 12:55 PM

March 23, 2015

GeoServer Team


Thanks to some last minute planning, and Boundless renting space, we were able to set aside some time for a code sprint after the FOSS4G-NA conference. One of the goals of this sprint was to onboard new developers to build and run CITE tests. We also had Jim Hughes from GeoMesa joining us as a new GeoServer community member.

Clockwise from the bottom left corner: Torben Barsballe (Boundless), Travis Brundage (Boundless), Kevin Smith (Boundless), Andrea Aime (GeoSolutions)

Offscreen: Jim Hughes (GeoMesa), Jody Garnett (Boundless), Tyler Battle (Boundless)


Andrea introduced us to the OGC Compliance Testing Program (CITE), which provides resources for ensuring conformance with the OGC Standards.

One of the core tools within CITE is the Test, Evaluation, And Measurement (TEAM) Engine, which executes test suites written using the OGC Compliance Test Language (CTL). This is the official test framework of the CITE program, and all CITE tests published by the OGC are written for the TEAM engine, using CTL.

Currently, we execute a number of nightly GeoServer builds that run CITE tests:

  • cite-wcs-1.0
  • cite-wcs-1.1
  • cite-wfs-1.0
  • cite-wfs-1.1
  • cite-wms-1.1
  • cite-wms-1.3

These do not encompass the full set of CITE tests published by the OGC. In the interests of adding better test coverage, it is important to familiarize new developers with how the CITE tests are built and run.

During the code sprint, we attempted to set up and execute tests on a local instance of the TEAM Engine.

To run the WFS 1.0 CITE tests, we:

  • Ran geoserver using citewfs-1.0 data directory
  • Ran the TEAM Engine on a local tomcat server
  • Configured the WFS test suite in the TEAM engine, and ran it.

There were a few hiccups during this process, but closely following the existing documentation was sufficient to get the CITE tests running properly. While I did not run into this issue, Andrea mentioned that bugs in the CITE tests themselves are about as common as bugs in the OGC-compliant GeoServer services. This makes running CITE tests a bit of a treasure hunt to determine which test failures are coming from the tests, and which are coming from the service being tested.

Successful WFS 1.0 CITE Tests!


After I succeeded in getting the WFS 1.0 test suite running, I tried building the WCS 1.0 test suite (which was a bit more complicated). I was able to get the tests running, but encountered a number of test failures. Among other things, the WCS CITE tests fail if the endpoint they are querying also publishes WCS 1.1 resources. This means that in order to properly run the tests, you have to build GeoServer without WCS 1.1. There were also other test failures that I was unable to debug during the sprint.


We would like to thank the GeoServer community for welcoming newcomers, Boundless for renting the space and congratulate Jim Hughes on his new community modules.

by tbarsballe at March 23, 2015 09:16 PM

Markus Neteler

Inofficial QGIS 2.8 RPMs for EPEL 7: Fedora 20, Fedora 21, Centos 7, Scientific Linux 7

Thanks to the work of Devrim Gündüz, Volker Fröhlich, Dave Johansen, Rex Dieter and other Fedora/EPEL packagers, it was easy for me to prepare RPM packages of QGIS 2.8 Wien for Fedora 20 and 21, CentOS 7, and Scientific Linux 7.

I copied the base SRPM package from Fedora’s koji server, modified the SPEC file to remove the now-outdated PyQwt bindings (see bugzilla), and compiled QGIS 2.8 via the great COPR platform.

Repo: https://copr.fedoraproject.org/coprs/neteler/QGIS-2.8-Wien/

The following packages can now be installed and tested on epel-7-x86_64 (CentOS 7, Scientific Linux 7, etc.), Fedora-20-x86_64, and Fedora-21-x86_64:

  • qgis 2.8.1
  • qgis-debuginfo 2.8.1
  • qgis-devel 2.8.1
  • qgis-grass 2.8.1
  • qgis-python 2.8.1
  • qgis-server 2.8.1

Installation instructions (run as “root” user or use “sudo”):

# EPEL7:
wget -O /etc/yum.repos.d/qgis-epel-7.repo https://copr.fedoraproject.org/coprs/neteler/QGIS-2.8-Wien/repo/epel-7/neteler-QGIS-2.8-Wien-epel-7.repo
yum update
yum install qgis qgis-grass qgis-python qgis-server

# Fedora 20:
wget -O /etc/yum.repos.d/qgis-fedora-20.repo https://copr.fedoraproject.org/coprs/neteler/QGIS-2.8-Wien/repo/fedora-20/neteler-QGIS-2.8-Wien-fedora-20.repo
yum update
yum install qgis qgis-grass qgis-python qgis-server

# Fedora 21:
wget -O /etc/yum.repos.d/qgis-fedora-21.repo https://copr.fedoraproject.org/coprs/neteler/QGIS-2.8-Wien/repo/fedora-21/neteler-QGIS-2.8-Wien-fedora-21.repo
yum update
yum install qgis qgis-grass qgis-python qgis-server

The other packages are optional (well, also qgis-grass, qgis-python, and qgis-server…).


PS: Of course I hope that QGIS 2.8 officially hits EPEL7 anytime soon! My COPR repo is just a temporary bridge towards that goal.

The post Inofficial QGIS 2.8 RPMs for EPEL 7: Fedora 20, Fedora 21, Centos 7, Scientific Linux 7 appeared first on GFOSS Blog | GRASS GIS Courses.

by neteler at March 23, 2015 08:57 PM

OSGeo News

GeoTools 13.0 Released

by jsanz at March 23, 2015 06:16 PM

GeoTools Team

GeoTools 13.0 Released


The GeoTools community is pleased to announce the availability of GeoTools 13.0, the first stable release of the GeoTools 13 series.

Download GeoTools 13:
This release is also available from our maven repository, and is made in conjunction with GeoWebCache 1.7.0 and GeoServer 2.7.0.

New Features and Improvements

The following new features and improvements are available in this release:

Data Stores
  • GeoPackage support is now compatible with QGIS and OGR.
  • A new gt-solr data store is available for working with Apache Solr. Thanks to Andrea Aime and Justin Deoliveira for this work.
  • CSVDataStore has been improved with write access and the ability to work with different kinds of CSV files (including lat/lon and WKT). Thanks to Travis Brundage for this work.
  • AbstractDataStore is now deprecated. Please make plans to migrate to ContentDataStore. There is an extensive ContentDataStore tutorial to help with your migration.
  • PropertyDataStore has now been ported to use ContentDataStore, and a wide range of issues (readers, events, transactions) have been resolved for all DataStore implementations. Thanks to Torben Barsballe for this extensive QA work.
CSS Module

  • The gt-css module is a brand-new implementation of CSS-style support, now written in Java. Thanks to Andrea Aime for bringing this exciting approach to styling within easy reach of Java developers. Here is a quick code example:
// parse the CSS text and translate it to a GeoTools Style
Stylesheet ss = CssParser.parse(css);
CssTranslator translator = new CssTranslator();
Style style = translator.translate(ss);

Rendering Engine
  • Color blending has been introduced as both a FeatureTypeStyle and Symbolizer vendor option, allowing for a range of special effects. Thanks to Cleveland Metroparks for this improvement.
  • A new FeatureTypeStyle vendor option, "ruleEvaluation", makes rendering stop at the first rule that matches a feature. This can in many situations help you avoid more complicated logic for style rules.
    <sld:VendorOption name="ruleEvaluation">first</sld:VendorOption>

  • Anchor points are now supported. This is done in SLD with the AnchorPoint tag:
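As an illustrative sketch (not taken from the release notes, and the exact placement of the element depends on which symbolizer you are styling), an anchor point that centres a mark on its point geometry could look like this:

```xml
<!-- illustrative only: anchor the mark at its centre (0.5, 0.5) -->
<sld:Graphic>
  <sld:Mark>
    <sld:WellKnownName>triangle</sld:WellKnownName>
  </sld:Mark>
  <sld:Size>12</sld:Size>
  <sld:AnchorPoint>
    <sld:AnchorPointX>0.5</sld:AnchorPointX>
    <sld:AnchorPointY>0.5</sld:AnchorPointY>
  </sld:AnchorPoint>
</sld:Graphic>
```

The anchor values range from 0 (left/bottom) to 1 (right/top), so 0.5/0.5 places the graphic centred on the point.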
WFS Client

The gt-wfs client has been improved:
  • Better compatibility with MapServer.
  • Extensive work has been done to support WFS 2.0 transactions. Thanks to Niels Charlier for this work.
  • The gt-wfs client now supports WFS 2.0 stored queries. Thanks to Sampo Savolainen for this work.
And more
Documentation continues to improve, now with a complete function list. For a full list of improvements and changes in GeoTools 13, please see the full release notes.

Thanks to Jody Garnett and Travis Brundage (from Boundless) for making this release happen.

by Travis Brundage (noreply@blogger.com) at March 23, 2015 04:53 PM


A data-range-independent color ramp in my #Geoserver #heatmap

GeoServer provides a simple way to generate heatmaps, but the meaning of the available parameters and the results obtained seem to indicate that the colors used depend on the highest value found within the limits of the requested map, so in all cases the color for the maximum value will be visible:
This is not good if we intend to generate various heatmaps (from different areas or from different data) and still preserve some kind of consistency in the meaning of the colors. A way to prevent this issue is to add a record to our source table in which the relevant field has a very high value, so that point will always get the highest-value color:
We'll put that element in an empty part of the map (for example, in the sea) so we can crop it out afterwards:
By using POIProxy, we can gather timestamped, georeferenced data about social-network activity in Valencia (Spain). We can see that the color for the maximum value (red) only appears at specific moments, and the heatmap fades out almost completely every morning at around 6 am:
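As a sketch of the workaround described above (the table name, column names and coordinates are hypothetical; the point should fall in an empty area such as the sea):

```sql
-- hypothetical table of social-activity points; "weight" feeds the heatmap
INSERT INTO activity_points (geom, weight)
VALUES (ST_SetSRID(ST_MakePoint(-0.15, 39.40), 4326), 100000);
```

Every heatmap rendered from this table will then scale its color ramp against that fixed maximum rather than against the local maximum of each request.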
[12 MB]

by jldominguez at March 23, 2015 02:39 PM

Paulo van Breugel

Importing GLCF MODIS woody plant cover

The Global Land Cover Facility offers, amongst many other data sets, the MODIS Vegetation Continuous Fields data set for download. These layers contain proportional estimates for vegetative cover types (woody vegetation, herbaceous vegetation, and bare ground). As such, they are very suitable for depicting areas of heterogeneous land cover. Their MODIS … Continue reading Importing GLCF MODIS woody plant cover

by pvanb at March 23, 2015 02:00 PM

GeoServer Team

GeoServer 2.7 released

The GeoServer team is pleased to announce the latest major release of GeoServer: version 2.7.

Quick links:

This release includes a variety of improvements and fixes provided by and for the community over the past six months since GeoServer 2.6 was released. (See our release schedule.) While many of these high-level features have been highlighted in previous posts, we’d like to list them in brief here, with links to documentation so you can learn more.

Color composition and color blending

These are two new extensions to the rendering engine that allow for greater control over how overlapping layers in a map are merged together. Instead of just placing layers on top of one another (with or without transparency), there is now a range of filters and effects, such as “multiply”, “darken”, and “hard light”.

Please see the documentation for an example on how to create inner line effects such as the image below:

Thanks to Cleveland Metroparks for sponsoring this improvement.

Relative time support in WMS/WCS

GeoServer has long had the ability to specify dates/times in requests to subset data. Up until now these dates/times needed to be absolute. Support has now been added for specifying relative time, for example:

  • Last 36 hours from the present (PT36H/PRESENT)
  • Day after December 25 2010: (2010-12-25T00:00:00.0Z/P1D)
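These periods go into the standard TIME parameter of a request. A hedged sketch of a GetMap URL (the host, workspace and layer names are placeholders):

```
http://example.com/geoserver/wms?service=WMS&version=1.1.1&request=GetMap
    &layers=topp:temperature&styles=&bbox=-180,-90,180,90
    &width=768&height=384&srs=EPSG:4326&format=image/png
    &time=PT36H/PRESENT
```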

Thanks to Jonathan Meyer for this improvement.

WPS clustering

There are quite a few improvements to the Web Processing Service module, courtesy of Andrea Aime and GeoSolutions. (Please note that WPS is still an extension.)

GeoServer has a new WPS extension point allowing GeoServer nodes in the same cluster to share the status of current WPS requests. This is particularly important for asynchronous requests, as the client polling for the progress/results might not be hitting the same node that’s currently running the request.

This initial implementation leverages the Hazelcast library to share the information about the current process status using a replicated map.

WPS security

GeoServer now has the ability to connect WPS processes to the standard role-based security system. This means that administrators can now determine what users and groups can access or execute, making WPS usage safer and more secure.

WPS limits

In addition to limiting the users and groups that can access WPS processes, GeoServer now also has the ability to set WPS input execution limits (such as timeout values), ensuring that a runaway process can’t cause a system to fail due to utilizing too many resources. Limits can be set globally and on a per-process basis.

WPS dismiss

A client that connects to the WPS now not only has the ability to execute processes, but also the ability to dismiss/kill processes. Also new is the ability for the administrator to see the current processes that are being executed on the system.

CSS extension refresh

The popular CSS extension, originally written by David Winslow of Boundless, allows users to style layers using a CSS-like syntax instead of SLD. This extension has now been entirely rewritten in native Java. The functionality remains the same, though with improvements in speed and stability.

Thanks to Andrea Aime from GeoSolutions for this improvement.

New CSS workshop

There is also now a full workshop-sized tutorial devoted to using CSS in GeoServer. This expands upon the basic tutorial, and goes into greater detail, providing a powerful learning resource for anyone who wants to learn how to style maps with CSS.

Thanks to Jody Garnett from Boundless for donating the workshop to the community.

Cascade WFS Stored Queries

Thanks to Sampo for adding support for cascaded WFS stored queries.

Try out the new version

See the full list of changes linked from the release page, and please read these previous posts for more information on these new features. While no software is ever bug-free, we fully stand behind this release, and believe it will provide you with a better, more stable, and feature-filled GeoServer. Thanks!

Download GeoServer 2.7

About GeoServer 2.7

Articles and resources for GeoServer 2.7 series:

by Mike Pumphrey at March 23, 2015 12:00 PM

Boundless Blog

Using WMS time to explore your data

A feature of GeoServer that’s not very well known is that it can publish layers that contain a time component. Clients can request data for a specific date/time, a list of dates/times, or even a range. This is built into the WMS protocol; no third-party requirements are necessary here.

This can of course be used to create animations (see the MapStory project for a high-profile example of this), but even simpler, it can be used to just highlight another dimension of a dataset in a useful way.

With the release of GeoServer 2.7, the ability to request relative intervals has been added as well. Previously each request had to include absolute intervals, but now you can request data with parameters that include, for example, “<some date> and 30 days before it” or, more interestingly “10 years before <today>” where <today> is the server time.

So I thought it would be interesting to highlight this feature in GeoServer to give you some ideas and inspiration on how you can adapt this to your data.

The states and the union

First, let’s take a look at a dataset. One that I happen to have handy is the order in which the states of the USA joined the union. (Taken from MapStory.)

I’ve stripped down the style here to make comprehension easier. For each feature, there is an attribute called Date_of_St which is a timestamp accurate to the day.

For example, the value of the Date_of_St attribute for the feature representing Hawaii is listed as 1959-01-03T08:00:000Z. This jibes with the official date as listed on Wikipedia, though I suspect the hour may be one level of precision too ambitious.

We can set this attribute as a “special” time dimension in GeoServer. Once the layer is loaded, in the Edit Layer configuration area, the Dimensions tab contains the ability to link an attribute to the time dimension.


For our purposes, we really only care about the year. Luckily—and this is a big advantage of using WMS dimensions for this—requests need only be as precise as you want. So if you want to request a date range of 1900-1950, you don’t need to specify it as:


Instead, you can just write:


(Imagine trying to make this flexibility of input happen without this feature. Think of generating a text string search to match up the dates exactly. No fun at all.)
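Concretely, both of these TIME values select the same 1900–1950 range (the millisecond precision in the long form is illustrative):

```
TIME=1900-01-01T00:00:00.000Z/1950-12-31T23:59:59.999Z
TIME=1900/1950
```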

We’re going to make requests to GeoServer to find out which states joined at which times. The full WMS request, utilizing OpenLayers, is a mouthful:


But for tutorial purposes, I always prefer using the WMS Reflector, which makes reasonable assumptions about the request for the sake of brevity. That same request above can be shrunk to this:


Much better, right? One little pitfall of enabling WMS time is that when no time is specified, the map will only render features with the latest time, which leaves us with sad little Hawaii (the most recent state added):


But with this setup, it’s quite easy to make maps of states at certain times.

The states of the states

(The following code snippets need to be appended to the above request.)

The thirteen original colonies (1790):


Thirteen original colonies

The states as of the onset of the Civil War (1861):


Lots of independent territory still out there

The states that joined up during the Civil War. The US Civil War was fought from 1861-1865, but if we were to just use the string 1861/1865, we’d include Kansas, which just predates the Civil War (as seen above).

So we’ll need to get more precise, and add in the month: April 1861 to April 1865.


I did not know about Nevada joining during the Civil War, so I learned something here.

(Again, notice how easy this is with WMS dimensions; all of the work of interpreting the time is done for us.)

Finally, the states that were created in the last 120 years:


I believe the US stays at 50 states because it's such a nice round number...

This takes advantage of the new relative time support. This also means that the output of this request could itself change over time.

(The above image was reprojected to make Alaska look less huge. This is way easier using the WMS Reflector, as all you need to add is the srs parameter.)

More interactivity?

Now, it’s easy to envision a little app that takes as input Start and End dates and refreshes the map accordingly. And if people want to see that done (or anything else along that line), please leave a comment below.

And if you want to see this dataset animated, check it out over on MapStory.

No matter how long you’ve been working with GeoServer, there’s always more to be learned. Check out our Education Center to learn more!

Have you used the WMS time feature? Let us know how in the comments below!

The post Using WMS time to explore your data appeared first on Boundless.

by Mike Pumphrey at March 23, 2015 10:00 AM

March 22, 2015

From GIS to Remote Sensing

Major Update: Semi-Automatic Classification Plugin v. 4.2.0

This post is about a major update for the Semi-Automatic Classification Plugin for QGIS, version 4.2.0.

Following the changelog:

-added tab for download of Landsat images

by Luca Congedo (noreply@blogger.com) at March 22, 2015 10:24 PM

Juergen Weichand

Complex feature models with the QGIS WFS 2.0 Client Plugin

To improve support for complex feature models (e.g. INSPIRE), version 0.9.2 of the QGIS WFS 2.0 Client Plugin has been adapted to the existing capabilities of the OGR GML driver. The support can be enabled in the Config menu. The new plugin version requires QGIS >= 2.4 and OGR/GDAL >= 1.11.0.


GML reader configuration

With the Resolve elements (xlink:href) option, references to linked objects are resolved and their attributes are listed as additional fields in QGIS.

In GML application schemas, supplementary content is additionally encoded as XML attributes. With the Convert attributes to fields option, this content is also made available as fields in QGIS.

WFS 2.0 configuration

The Resolvedepth parameter controls the depth to which referenced objects are returned in the GetFeature response. When using the Resolve elements option, Resolvedepth >= 1 should be used, so that no additional WFS requests are needed to resolve the references.
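At the protocol level this corresponds to the standard resolve/resolveDepth parameters of a WFS 2.0 GetFeature request; a hedged sketch (the endpoint is a placeholder):

```
http://example.com/inspire/wfs?service=WFS&version=2.0.0&request=GetFeature
    &typeNames=ad:Address&count=10
    &resolve=local&resolveDepth=1
```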


Example: INSPIRE addresses

In the following example, an address is retrieved from the deegree INSPIRE demo server.


The FeatureType Address has three components, ThoroughfareName (the name of the thoroughfare), AddressAreaName (the name of the address area) and PostalDescriptor (the postal descriptor), which are realized as references (xlink:href). With the option Resolvedepth=*, the referenced components are returned in the additionalObjects block of the WFS FeatureCollection.

The GML response is then loaded via OGR’s GML driver with the Resolve elements option. The attributes of the referenced components are integrated into the Address object. For example, the PostCode attribute from the referenced PostalDescriptor object is now also available.

Address without resolved references

Address with resolved references



With the new version 0.9.2 of the WFS plugin, complex feature models can be used more extensively than before. The nested object structure is, however, still converted (“flattened”) into a simple-feature structure.

As the GML registry in the OGR GML driver is expanded, specific GML application schemas will be supported even better in the future.


by Jürgen Weichand at March 22, 2015 08:11 PM

Bjorn Sandvik

Nordryggen on skis for 25 days - creating a route map

I’m currently doing my last preparations for a 25-day skiing trip across Nordryggen in Norway. It will of course depend on weather, snow conditions and blisters, but hopefully the conditions will be bearable. Norway has a great network of 500 cabins maintained by the Norwegian Trekking Association. The longest connected cross-country skiing track I’ve found is around 500 km. How do we map it?

The map we're going to create with QGIS and CartoDB. Data from the Norwegian Mapping Authority.

Nordryggen (“the north ridge”) is a fairly new name for the 1,700 km mountain range that runs through the Scandinavian Peninsula. My plan is to ski around 500 km in the southern part of Norway, most of it above the tree line.

Cross-country skiing in Jotunheimen. Photo: Bjørn Sandvik

You can study the waymarked ski routes on UT.no, or download the data from the Norwegian Mapping Authority if you want to map it yourself. The dataset is available as a PostGIS dump or in SOSI, a common vector data format used to exchange geographical information in Norway that is not very well supported by various mapping applications. Luckily we have Sosicon, a great open source converter by Espen Andersen. I ran this command on the SOSI file:

./sosicon -2shp Tur\ og\ friluftsruter.sos 

In return I got 4 shapefiles for ski routes, hiking trails, bike trails and POIs.

If you open the ski routes shapefile in QGIS it won’t tell you much without a basemap. Let's use the map tile service from the Norwegian Mapping Authority. In QGIS, click on the “Add WMS/WMTS Layer” button in the left toolbar. Click the “New” button to create a connection. Add a name and copy this URL:


Click “OK” and “Connect”. You will now get a long list of layers in different map projections. I recommend using “norges_grunnkart” or “topo2” (detailed) in EPSG:32633 (UTM 33).

Click on “Add” and you should see a topographic map of Norway. Now you can add the shapefile, and the ski tracks will show on top.

Waymarked ski routes shown in QGIS with my route selected between Sota Sæter and Ljosland.

I selected the track segments I plan to follow, and saved the selection in a new route shapefile. I want to have the route as a continuous line with the coordinates in order, as this will allow me to use it for navigation with my GPS unit (see my next blog post). I'm using the "Join multiple lines" plugin to get the desired result.

Installing plugins is very easy with the QGIS Plugin Manager. After installation, you'll find the "Join multiple lines" plugin in the Vector menu. 

You can use the Field calculator to calculate the length of the continuous track:
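The expression was along these lines (QGIS expression syntax; `$length` returns the feature’s length in the layer’s CRS units, metres in this case, so dividing by 1000 gives kilometres):

```
$length / 1000
```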

This outputs a length of 505 km.

Let's move on to CartoDB to create the route map. First, I'm uploading the zipped route shapefile to CartoDB. I've also created a table with the cabin name and positions along the route.

Create a new visualization and add the route and cabin tables. You can add the same basemap as we used in QGIS with this URL:


The cabins are styled differently depending on the zoom level. This is the final map:

All data is available on GitHub. Map data from the Norwegian Mapping Authority.

by Bjørn Sandvik (noreply@blogger.com) at March 22, 2015 04:23 PM

Bjorn Sandvik

Transferring a route from QGIS to your GPS

In my last blog post, we created a 500 km continuous line representing a ski route across Nordryggen in Norway. I need to transfer this route to my Garmin GPS so I can use it for navigation while skiing. How can it be done?

Cross-country skiing in Skarvheimen, Norway. Photo: Bjørn Sandvik

Open the route in QGIS, right click the layer and select "Save As...". Select "GPS eXchange Format [GPX]" as the format, and "WGS 84" as the coordinate reference system (CRS). I'm also skipping attribute creation as my line only contains coordinates.

QGIS saves the line as a GPX route. You can import this route in Garmin BaseCamp (File -> Import). Give the track a meaningful name.

My 500 km route consists of 3867 points, but most Garmin GPS units are only capable of showing 250 points per route. You can get around this limitation by converting the route into a track. Right-click the route in Basecamp and select "Create Track from Route".

Transfer the track to your GPS unit:

You should now find the track in the "Track Manager" on your GPS. Select "View Map" to see it, and then "Go" if you want to navigate along it. Depending on the maps you have on your device, you can also display an elevation plot. Since the original track doesn't contain elevation data, the GPS will try to fetch it from your map.

My GPS is now loaded and I'm ready to go!

All data are available on GitHub.

by Bjørn Sandvik (noreply@blogger.com) at March 22, 2015 04:23 PM

March 21, 2015

Paul Ramsey

Magical PostGIS

I did a new PostGIS talk for FOSS4G North America 2015, an exploration of some of the tidbits I've learned over the past six months about using PostgreSQL and PostGIS together to make "magic" (any sufficiently advanced technology...)


by Paul Ramsey (noreply@blogger.com) at March 21, 2015 04:16 PM

March 20, 2015

Paul Ramsey

Making Lines from Points

Somehow I've gotten through 10 years of SQL without ever learning this construction, which I found while proof-reading a colleague's blog post: it looked so unlikely that I had to test it before I believed it actually worked. Just goes to show, there's always something new to learn.

Suppose you have a GPS location table:

  • gps_id: integer
  • geom: geometry
  • gps_time: timestamp
  • gps_track_id: integer

You can get a correct set of lines from this collection of points with just this SQL:

SELECT gps_track_id,
       ST_MakeLine(geom ORDER BY gps_time ASC) AS geom
FROM gps_points
GROUP BY gps_track_id

Those of you who already knew about placing ORDER BY within an aggregate function are going "duh", and the rest of you are, like me, going "whaaaaaa?"

Prior to this, I would solve this problem by ordering all the groups in a CTE or sub-query first, and only then pass them to the aggregate make-line function. This, is, so, much, nicer.
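As a sketch of that older approach (using the same hypothetical table and columns as above), the CTE version would look something like this:

```sql
-- Old approach: order the points in a CTE first, then aggregate
-- the already-sorted rows into one line per track.
WITH ordered_points AS (
  SELECT gps_track_id, geom
  FROM gps_points
  ORDER BY gps_track_id, gps_time
)
SELECT gps_track_id,
       ST_MakeLine(geom) AS geom
FROM ordered_points
GROUP BY gps_track_id;
```

Note that relying on a subquery's ordering surviving the aggregate is not actually guaranteed by the SQL standard, which is one more reason to prefer the ORDER BY-in-aggregate form.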

by Paul Ramsey (noreply@blogger.com) at March 20, 2015 08:07 PM

Blog italiano di gvSIG

gvSIG 2.1: from Excel to gvSIG

During the last Google Summer of Code a new plugin was developed for gvSIG 2.1. This plugin makes it possible to load data saved in Microsoft Excel format.

This plugin will be included by default in the next generation of gvSIG, but it can already be tested today.

The plugin can be installed through the Add-ons Manager, either by selecting the "Standard installation" option, which gives access to the base plugins already included in the gvSIG distribution, or through "Installation from URL", which also gives access to the plugins available in the remote gvSIG repository.
Some plugins can also be installed via the "Installation from file" option; this can be very useful for testing extensions that are in neither the standard distribution nor the remote repository.
The Excel plugin is installed as usual from the Add-ons Manager, as shown in the two figures below.



Once installed, restart gvSIG and verify that adding a new table in Excel format is supported.

With this plugin it is possible to:

  • Load Excel spreadsheets as tables
  • Load Excel spreadsheets as layers

In gvSIG we can define the following properties of the Excel file to be added.

The main properties are:

  • File: path to the file
  • Locale: drop-down list to choose the configuration that defines the characters used as thousands and decimal separators.
  • Sheet to add: drop-down list to select the sheet of the Excel file to load as a table.
  • Use first row as header: if this option is enabled, the first row will be used to define the field names.
  • CRS: if the Excel worksheet contains coordinate fields, this parameter specifies their reference system.
  • Points (X, Y, Z): the fields containing the coordinates. If the Excel sheet contains coordinates, at least the X and Y fields must be specified.

We can also define other properties (in the "Advanced" tab), such as forcing the field type when loading the table. More detailed information can be found in the plugin manual.
As mentioned, in gvSIG 2.1 it is possible to add an Excel sheet and, if it contains coordinates, to add it as a layer.

Let's look at an example in which an Excel spreadsheet containing the population of the Italian regions over recent years is added as a table. In this example we have specified that the first row contains the field names.

Within the gvSIG Table document we add a file as usual and select our Excel file, then click on Properties and define which sheet of the Excel file to import into gvSIG, the locale used in the file, and whether the first row of the sheet should be used as the field header.


The table is then displayed correctly in gvSIG.


If the Excel sheet contains coordinate fields, it can be converted into a layer.


Once the sheet has been added as a table, select View/Add event layer from the View menu.

Then select, among the tables in the project, the one to be turned into a point layer.


Then define which fields contain the geographic coordinates and in which reference system they are expressed.


Finally, select the View in which to load the point layer.


Once loaded as a temporary layer in the View, it can be exported as a shapefile and renamed via Layer/Export, following the normal export procedure, which also allows export to KML format for viewing in Google Earth.


Keep in mind that field names are limited to 10 characters, so longer names will be truncated at the tenth character.

Article based on the post "gvSIG 2.1: de Excel a gvSIG" published on 11/12/2014 by Alvaro Anguix

by Giuliano Ramat at March 20, 2015 02:10 PM

Nathan Woodrow

In memory of Tim York

The other weekend the world lost a great guy, someone so nice that you struggle to find anything bad to say about him at all. Tim was only 27 but died tragically on the weekend, to the complete shock of everyone that knew him. I don’t think anyone really knew what to feel at that point, and even now it is still really hard to take it.

Tim was very well known in the GIS space here in Australia, and extremely well respected by everyone that knew him. I attended Locate15, where I was going to catch up for a beer and chat with him, and almost everyone I talked to knew who he was and how great he really was. I have no doubt at all that he could have become a CEO or a leader at a large company.

It was mentioned at the funeral that we should try to think of how Tim made our lives better. For that I can fully thank Tim for my job at DMS. It was at an SSSI event, after talking with my now manager about whether they would like me at DMS, that I caught up with Tim, and he encouraged me to move into the private sector. For that I thank him a lot.

I have started to read “How To Win Friends & Influence People” by Dale Carnegie, because one can always do with more skills in life, and some important points from the book:

  • Become genuinely interested in other people.
  • Smile.
  • Remember a person’s name.
  • Be a good listener. Encourage others to talk about themselves.
  • Talk in terms of the other person’s interests

This. All of this is what Tim did. This is the reason it always felt great to talk to him and be around him, even when I only saw him a few times.

I don’t know if Tim’s life was better for knowing me, given his large group of friends and great partner I suspect not, but I know sure as shit that mine was better for having known him.


Filed under: personal

by Nathan at March 20, 2015 10:59 AM

PostGIS Development

PostGIS 2.1.6 Released

The 2.1.6 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1, the 2.1.6 release. As befits a patch release, the focus is on bugs, breakages, and performance issues. Users with large tables of points will want to prioritize this patch for its substantial (~50%) disk space savings.


Continue Reading by clicking title hyperlink ..

by Paul Ramsey at March 20, 2015 12:00 AM

March 19, 2015

Paul Ramsey

Deloitte's Second Act

Hot off their success transforming the BC social services sector with "integrated case management", Deloitte is now heavily staffing the upcoming transformation of the IT systems that underpin our natural resource management ministries.

Interlude: I should briefly note here that Deloitte's work in social services involved building a $180,000,000 case management system that the people who use it generally do not like, using software that nobody else uses for social services, that went offline for several consecutive days last year, and based on software that basically entered end-of-life almost five years ago. I'm sure that's not Deloitte's fault, they are only the international experts hired to advise on the best ways to build the system and then actually build it.

So many shiny arrows!
Smells like management consultants...


The brain trust has now decided that the thing we need on the land base is "integrated decision making", presumably because everything tastes better "integrated". A UVic MPA student has done a complete write-up of the scheme—and I challenge you to find the hard centre inside this chewy mess of an idea—but here's a representative sample:

The IDM initiative is an example of horizontal management because it is an initiative among non‐hierarchical ministries focused on gaining efficiencies by harmonizing regulations, IT systems and business processes for the betterment of the NRS as a whole. Horizontal management is premised on joint or consensual decision making rather than a more traditional vertical hierarchy.  Horizontal collaborations create links and share information, goodwill, resources, and power or capabilities by organizations in two or more sectors to achieve jointly what they cannot achieve individually.  

Sounds great, right!?! Just the sort of thing I'd choose to manage billions of dollars in natural resources! (I jest.)

Of course, the brain trust really isn't all that interested in "horizontal management", what has them hot and bothered about "integrated decision making" is that it's an opportunity to spend money on "IT systems and business processes". Yay!

To that end, they carefully prepared a business case for Treasury Board, asking for well north of $100M to rewrite every land management system in government. Forests, lands, oil and gas, heritage, the whole kit and caboodle. The business case says:

IDM will improve the ability of the six ministries and many agencies in the NRS to work together to provide seamless, high‐quality service to proponents and the public, to provide effective resource stewardship across the province, to effectively consult with First Nations in natural resource decisions, and to contribute to cross‐government priorities.

Sounds ambitious! I wonder how they're going to accomplish this feat of re-engineering? Well, I'm going to keep on wondering, because they redacted everything in the business case except the glowing hyperbole.

However, even though we don't know how, or really why, they are embarking on this grand adventure, we can rest assured that they are now spending money at a rate of about $10M / year making it happen, much of it on our good friends Deloitte.

  • There are currently 80 consultants billing on what has been christened the "Natural Resource Sector Transformation Secretariat".

  • Not that Secretariat...

  • Of those consultants 34 are (so far) from Deloitte.
  • Coincidentally, 34 is also the number of government staff working at the Secretariat.
  • So, 114 staff, of which 34 are government employees and the rest are contractors. How many government employees does it take to change a lightbulb? Let me take that to procurement and I'll get back to you.

The FOI system charged me $120 (and only after I bargained down my request to a much less informative one) to find the above out, because they felt that the information did not meet the test of being "of public interest". If you feel it actually is in the public interest to learn where our $100M on IT services for natural resources are being spent, and you live in BC, please leave me a comment on this post.

Interlude: The test for whether fees should be waived is double barrelled, but is (hilariously) decided by the public body itself (soooo unbiased). Here are the tests I think I pass (but they don't):

  1. Do the records show how the public body is allocating financial or other resources?
  2. Is your primary purpose to disseminate information in a way that could reasonably be expected to benefit the public, or to serve a private interest?

I'm still digging for more information (like, how is it that Deloitte can bill out 34 staff on this project when there hasn't been a major RFP for it yet?) so stay tuned and send me any hints if you have them.

by Paul Ramsey (noreply@blogger.com) at March 19, 2015 06:43 PM

Boundless Blog

Support story: Getting more out of SQL Views

Most GeoServer users know how to publish tables from a database such as PostgreSQL or Oracle as vector layers. More seasoned GeoServer users, however, also take advantage of SQL Views to publish those same tables with even greater control.

So what is a SQL View? A SQL View is a type of layer that is based on a database query that you write inside GeoServer itself. To the end user it looks like a regular layer, but behind the curtain you have all the power of spatial SQL at your fingertips to enrich the data that your users receive.

Uses of SQL Views

There are a number of different reasons to incorporate an SQL View into your application. For example, it’s possible to:

… only expose certain attributes to users:

SELECT geom, id, name FROM banks

… run spatial queries to do spatial computation or analysis:

SELECT *, ST_Area(geom) AS area FROM countries

… join two tables together:

SELECT airport.geom, airport.name, city.population
FROM airports AS airport, cities AS city
WHERE airport.city = city.id

… convert data types that GeoServer doesn’t support:

SELECT id, name, geom, iwxxm::text FROM weather_stations

IWXXM data is stored as XML, which PostgreSQL can store and validate natively (as it can JSON, arrays and other types) but which GeoServer cannot expose directly. By adding ::text, we convert it to text and expose it as a regular attribute in our layers.

Using view parameters

We can take these static SQL Views one step further by adding parameters to our SQL queries to make dynamic OGC requests based on user input. Boundless has some examples of using parameterized SQL Views in our various tutorials, including GeoNames Heat Map, Building a Census Map, and Building a Routing Application.

The trick is to add parameters to the SQL View that can be specified during a WMS or a WFS request:

SELECT * FROM buildings WHERE type = '%type%'

When we make a WMS GetMap request, for example, we can add VIEWPARAMS=type:hospital to the URL.


The result will be the execution of the following query on the database:

SELECT * FROM buildings WHERE type = 'hospital'

If you go to our Geonames Word Map demo and type “canyon” as the search term, you can see the following WMS request being sent to our server:


Buried in there is VIEWPARAMS=word:canyon, and if you open this URL in your browser, you’ll see the results of the query are taken by GeoServer to generate a heatmap.

Our routing application uses multiple parameters (source, target and cost) to generate the path between two points.

Once you’ve added them to your GeoServer toolkit, you’ll wonder how you ever did without SQL Views!


Allowing a GeoServer client to influence the SQL that will be executed on your database opens the door to an SQL injection attack. To prevent the client from running arbitrary SQL code, GeoServer comes with parameter validation using regular expressions. We know that regular expressions can have a steep learning curve, but let’s go over some easy examples.

The first step in crafting a validation expression is to consider what values we want to allow for each parameter.

Take the following SQL query:

WHERE city = '%city%' AND population > %rank% AND type = '%type%'

In this example, we will accept any city name with alphabetic characters (A-z) and spaces (\s) giving a regular expression of ^[A-z\s]+$. Population is always a number (\d) so we can use ^\d+$. Finally, for the road type, we only want to accept one of the following: highway, primary, secondary and residential. This gives the following expression: ^(highway|primary|secondary|residential)$.

By default, GeoServer uses a slightly more permissive (but usually safe!) ^[\w\d\s]+$, which allows letters, numbers and spaces for all parameters.

With these controls, we have prevented a malicious client from crafting a SQL injection that could potentially destroy our data. If a client does attempt to use a parameter that fails the regular expression check (for example: VIEWPARAMS=city:London,rank:1,type:tertiary), GeoServer will return an error:

Invalid value for parameter name

More importantly, the SQL will not be executed!


So what’s the catch? There are a few things we have to take into consideration when using SQL Views. The main points of attention are:

First, be aware of potential performance implications. If you write complex SQL queries, remember that each time you make a request, the query will be executed. If it’s a slow query, your users will be waiting that much longer for a response.

Second, layers built from SQL Views are read-only. This means that you can’t use WFS-T to write back to the layer, unlike regular layers that have been published from regular database tables.

More reading

The Boundless Workshops page is a great place to read about the practical use of SQL Views in applications.

For a thorough discussion, see the OpenGeo Suite User manual’s section on SQL Views.

The post Support story: Getting more out of SQL Views appeared first on Boundless.

by Benjamin Trigona-Harany at March 19, 2015 01:12 PM

Nathan Woodrow

New addition

Stace and I would like to welcome Matilda Anne Woodrow, born 5/1/15 at 10:36am.


She is a nice way to start 2015.

After the pretty crappy year that was 2014 it was finally nice to see Matilda in the flesh. The relief of seeing her alive and well is hard to put in words after losing Ellie in 2013.

People always ask about the lack of sleep, but it feels like we have more energy now than we did before. The emotional and physical toll that the pregnancy took on our family had pretty much run us into the ground, and it felt like it was never ending.

The toll of the pregnancy had led me to a massive lack of motivation towards QGIS and pretty much everything else in life, which isn't healthy when you have a family to look after. Pro Tip: Get help if you find yourself in this spot; it can be hard to recover if you go over the edge.

Anyway. Here is to a new year. A better year.

Filed under: Open Source

by Nathan at March 19, 2015 12:51 PM

Paul Ramsey

Breaking a Linestring into Segments

Like doing a sudoku, solving a "simple yet tricky" problem in spatial SQL can grab one's mind and hold it for a while. Someone on the PostGIS IRC channel was trying to "convert a linestring into a set of two-point segments" using an external C++ program, and I thought: "hm, I'm sure that's doable in SQL".

And sure enough, it is, though the syntax for referencing out the parts of the dump objects makes it look a little ugly.

by Paul Ramsey (noreply@blogger.com) at March 19, 2015 12:38 PM

March 18, 2015

Tyler Mitchell

Web console for Kafka messaging system

Running Kafka for a streaming collection service can feel somewhat opaque at times, which is why I was thrilled to find the Kafka Web Console project on Github yesterday.  This Scala application can be downloaded and installed in a couple of steps.  An included web server can then be launched to serve it up quickly.  Here's how to do all that.

For a quick intro to what the web console does, see my video first.  Instructions for getting started follow below.

Kafka Web Console Project – Download

The main repository for this project is available on Github (claudemamo/kafka-web-console).  However, I wanted the ability to add and remove Kafka topics, so I use a forked repository that has a specific branch with those capabilities.  (These are planned to be added to the main project but have not been yet.)

Download the ZIP archive file and unzip.

Before doing anything further, we need another application to build and launch the console.

Download Play Framework

A program called Play with Activator is used to build and launch the web console.  It’s a Java app for launching Scala apps.

Download it here, unzip it and add it to the system path so you can execute the activator command that it provides.


Now back to the Kafka web console code.  Enter the top level directory and execute the Play Activator start command (with one special option):

cd kafka-web-console-topic-add-remove
activator start -DapplyEvolutions.default=true

 [info] Loading project definition from /home/demo/src/tmp/kafka-web-console-topic-add-remove/project
 [info] Set current project to kafka-web-console (in build file:/home/demo/src/tmp/kafka-web-console-topic-add-remove/)

 (Starting server. Type Ctrl+D to exit logs, the server will remain in background)

 Play server process ID is 8528
 [info] play - database [default] connected at jdbc:h2:file:play
 [info] play - Starting application default Akka system.
 [info] play - Application started (Prod)
 [info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000

The first run will take some time to build the source before launching the web server.

In the last line above, you can see that it listens by default on port 9000.

Configuring the Console

Step 1 – Register your Zookeeper instance

Now you can launch the web console and start using the application.  The Step 1 figure shows the basic form for your Zookeeper configuration details.  This is the only setup step required to get access to your Kafka brokers and topics.

The remaining steps/figures just show the different screens.  See the video to watch it in action.

Step 2 – The Brokers tab shows the list of Kafka brokers
Step 3 – The Topics tab lists the Kafka topics. Note that this forked version provides the red "delete" button for each topic, as well as the additional "create topic" tab above.
Step 4 – Selecting a topic lets you see an active topic stream.


The post Web console for Kafka messaging system appeared first on spatialguru.com.

by Tyler Mitchell at March 18, 2015 06:28 PM

NaturalGIS Blog

3rd Portugal QGIS user meeting (Covilhã) and QGIS‬ Conference 2015 (University of Copenhagen)

A couple of QGIS events are coming up in the next weeks/months, and NaturalGIS will be present at both with presentations and workshops.

On May 18th and 19th we will be at the QGIS Conference 2015, organized by the University of Copenhagen's Forestry College, which will take place in Nødebo, Denmark. This event is directly linked to the 13th QGIS Developer Meeting, which will take place from May 20 to 22 at the same location.

At the QGIS Conference we will give a presentation about how to get help/support for QGIS, and a workshop about QGIS Server and how to publish OGC services (WMS/WFS/WCS) and webgis applications.

QGIS Conference 2015

On June 5th we will be in Covilhã, Serra da Estrela, Portugal, to join the 3rd Portugal QGIS user meeting, where we will also give a presentation (the same as in Nødebo) and a workshop about using the QGIS Processing toolbox.

3rd Portugal QGIS user meeting

March 18, 2015 03:07 PM

Paulo van Breugel

Importing data in GRASS GIS – an example

Introduction

ISRIC, Earth Institute, Columbia University, World Agroforestry Centre (ICRAF) and the International Center for Tropical Agriculture (CIAT) have recently released a new data set of raster layers with various predicted soil properties. This data set is referred to as the "AfSoilGrids250m" data set. It supersedes the SoilGrids1km data set and comes at a resolution … Continue reading Importing data in GRASS GIS – an example

by pvanb at March 18, 2015 11:26 AM


Meet GeoSolutions at the INSPIRE Conference 2015 in Lisbon!


Dear All, we are proud to announce that GeoSolutions is exhibiting at the INSPIRE Conference 2015, which will be held in Lisbon from the 25th to the 29th of May 2015 (you can get more information at this link). GeoSolutions will be present with its own booth, so we'll be happy to talk to you about our open source products, like GeoServer and MapStore, as well as about our Enterprise Support Services.

Moreover, looking at the program, we would like to remind you that we are also giving a workshop about GeoServer:

Web mapping with OGC services and GeoServer: an introduction, Thursday, 28 May 2015, 1100 – 1230

The workshop will provide a hands-on introduction to the basic GeoServer concepts, as well as usage and configuration, with particular attention to the setup of INSPIRE-compliant view services with a demonstration set of data in various formats, both raster and vector.

We will also give a presentation where solutions developed by GeoSolutions and its partners will be discussed:

Opendata and INSPIRE with GeoServer, GeoNetwork and MapStore: Lessons Learned from Real-World Use Cases, Thursday, the 27th at 02:40 P.M.

The goal of this presentation is to introduce and briefly describe GeoSolutions' open source products, demonstrating how they can be used to create real-world Spatial Data Infrastructures built (mostly or entirely) on open source components and leveraging, where applicable, public and widely accepted standards, both de facto (e.g. GeoJSON) and internationally recognized (e.g. OGC standards like WMS and WFS).

If you are interested in learning how we can help you achieve your goals with our open source products and professional services, make sure to visit us at our booth.

The GeoSolutions team

by simone giannecchini at March 18, 2015 08:48 AM

March 17, 2015

OSGeo News

Orfeo ToolBox now has an official Project Steering Committee

by jsanz at March 17, 2015 10:54 AM

March 16, 2015

Geomatic Blog

Mapping Party in Poio, Pontevedra

On Saturday, April 11th, the party of parties arrives at the Centro Xaime Illa in Raxó (Poio): the Mapping Party Poio'15, with the goal of having a good time while adding to the freely licensed geographic data of the Poio coastline.


This workshop is organized by the free software association Xeopesca, with the collaboration of the Concello de Poio and the associations SCD Raxó and ACDeM Armadiña.


  • 10:00-11:00 Introduction to OSM and organization of teams for the area to be mapped.
  • 11:00-14:00 Field work, with explanations of how to use osmAndroid.
  • 14:00-16:00 Lunch.
  • 16:00-20:00 Work at the computers to upload the data to OSM.
  • 20:00-20:30 Course closing.


The number of attendees is limited to 25. Candidates will be selected in order of registration. Attendees are encouraged to bring any of the following devices: a GPS, a phone with GPS, or a digital camera.

Registration form

To register for the Mapping Party Poio'15, fill in the form (see here) (poio.xeopesca.gal).

Materials

Each attendee will receive a notepad, a pen and a pencil.

Social networks

We have set the hashtag #mappingpartypoio for following the event on social networks. You can also follow the Mapping Party Poio'15 on Twitter via the hashtag #mappingpartypoio or on the XeoPesca Facebook page.


Filed under: GIS

by Micho Garcia at March 16, 2015 09:11 PM