Welcome to Planet OSGeo

August 01, 2015

From GIS to Remote Sensing

Major Update: Semi-Automatic Classification Plugin v. 4.6.0


This post is about a major update for the Semi-Automatic Classification Plugin for QGIS, version 4.6.0.



Here is the changelog:
- accuracy assessment can use a reference raster or a shapefile (selecting a field as the class code)
- Landsat pansharpening fix for 32-bit systems
- improved code for several post-processing functions
- bug fixes
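The accuracy assessment mentioned above boils down to a per-pixel comparison between the classification and the reference. As a rough illustration of the idea (plain Python, not the plugin's code; real rasters would be read with GDAL, and nodata here marks pixels outside the reference):

```python
def overall_accuracy(classified, reference, nodata=0):
    """Overall accuracy: fraction of valid pixels whose class matches the reference.

    Illustrative only; not the Semi-Automatic Classification Plugin's code.
    """
    pairs = [(c, r) for c, r in zip(classified, reference) if r != nodata]
    return sum(c == r for c, r in pairs) / len(pairs)

classified = [1, 1, 2, 2, 3, 3]
reference  = [1, 2, 2, 2, 3, 0]   # 0 = pixel outside the reference data
acc = overall_accuracy(classified, reference)  # 4 matches out of 5 valid pixels
```

A full assessment would also build the confusion matrix per class, but the principle is the same comparison.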

by Luca Congedo (noreply@blogger.com) at August 01, 2015 12:54 PM

July 31, 2015

GeoTools Team

GeoTools 13.2 Released

We are pleased to announce the availability of the GeoTools 13.2 release for immediate download. This release is made in conjunction with GeoServer 2.7.2 and is also available from our Maven repository.

Thanks to Andrea (from GeoSolutions) for packaging this release, which was delayed due to an unfortunate SourceForge outage.

GeoTools 13.2 is the current stable release recommended for production:
  • Update to NetCDF version 4.5.5
  • Improvements to image mosaic harvest operation and transparent color support
  • Stability improvement for Jenks Natural Breaks classifier
  • Internationalization fix for SOLR datastore
  • WorldVanDerGrintenI projection handler
For more information please check the release notes.

About GeoTools 13

GeoTools 13 features and improvements:

by Jody Garnett (noreply@blogger.com) at July 31, 2015 10:39 PM

GeoTools Team

GeoTools 14-M1 Milestone Release

GeoTools is pleased to announce the GeoTools 14-M1 milestone release, which is also available from our Maven repository.

This release is made in conjunction with the GeoMesa project and has passed the IP review required for LocationTech projects. Please consider this release a preview prior to our scheduled September release.

Highlights from our issue-tracker release notes:
  • Clarify use of RobustDeterminant and GridCoverage2DRIA
  • Use of Oracle metadata table for initial bounding box calculation
  • Transaction is auto closable for use with java try-with-resource syntax
  • A lot of work on stable date/time literal conversion (when running GeoTools in different timezones)
  • Update NetCDF to 4.5.5
  • Support for WFS 2.0.0

About GeoTools 14

  • Change from vecmath to EJML for matrix calculations
  • Adopt JAI-EXT operations for improved no-data and footprint support

by Jody Garnett (noreply@blogger.com) at July 31, 2015 10:06 PM

Peter Batty

My mapwheel story

Yesterday I backed a Kickstarter project called Mapwheel; I think it’s a really cool idea. They let you design a custom “toposcope” or map wheel showing the direction and distance of places of interest from the location where you live (or any other location you choose). You can choose various materials (wood or metal) and customize the design in various ways. Working on the design has been a

by Peter Batty (noreply@blogger.com) at July 31, 2015 08:51 PM

Markus Neteler

GRASS GIS 7.0.1 released – 32 years of GRASS GIS

What’s new in a nutshell

This release addresses some minor issues found in the first GRASS GIS 7.0.0 release published earlier this year. The new release provides a series of stability fixes in the core system and the graphical user interface, PyGRASS improvements, some manual enhancements, and a few language translations.

This release is the 32nd birthday release of GRASS GIS.

New in GRASS GIS 7: Its new graphical user interface supports the user in making complex GIS operations as simple as possible. A new Python interface to the C library permits users to create new GRASS GIS-Python modules in a simple way while yet obtaining powerful and fast modules. Furthermore, the libraries were significantly improved for speed and efficiency, along with support for huge files. A lot of effort has been invested to standardize parameter and flag names. Finally, GRASS GIS 7 comes with a series of new modules to analyse raster and vector data, along with a full temporal framework. For a detailed overview, see the list of new features. As a stable release 7.0 enjoys long-term support.

Source code download:

Binaries download:

More details:

See also our detailed announcement:

  http://trac.osgeo.org/grass/wiki/Grass7/NewFeatures (overview of new stable release series)

First time users may explore the first steps tutorial after installation.

About GRASS GIS

The Geographic Resources Analysis Support System (http://grass.osgeo.org/), commonly referred to as GRASS GIS, is an Open Source Geographic Information System providing powerful raster, vector and geospatial processing capabilities in a single integrated software suite. GRASS GIS includes tools for spatial modeling, visualization of raster and vector data, management and analysis of geospatial data, and the processing of satellite and aerial imagery. It also provides the capability to produce sophisticated presentation graphics and hardcopy maps. GRASS GIS has been translated into about twenty languages and supports a huge array of data formats. It can be used either as a stand-alone application or as backend for other software packages such as QGIS and R geostatistics. It is distributed freely under the terms of the GNU General Public License (GPL). GRASS GIS is a founding member of the Open Source Geospatial Foundation (OSGeo).

The GRASS Development Team, July 2015

The post GRASS GIS 7.0.1 released – 32 years of GRASS GIS appeared first on GFOSS Blog | GRASS GIS Courses.

by neteler at July 31, 2015 06:20 PM

Jackie Ng

How can we take advantage of this?


Here are the facts:
  • MapGuide has long had an FDO provider that can read SQLite databases
  • SQLite has an in-memory mode
  • :memory: is a completely valid file name to pass to the connection parameters of a SQLite FDO connection
  • In terms of I/O, memory is the fastest backing store you can have for direct data access
Now how can we make MapGuide blazing fast (where it is currently not), given the above facts?

That question has been brewing in my mind for some time now.
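The in-memory mode is easy to try outside of MapGuide; a minimal sketch with Python's built-in sqlite3 module (the same engine, though not the FDO provider) shows that :memory: behaves like any other database file name:

```python
import sqlite3

# ":memory:" is a perfectly valid "file name": SQLite creates the
# database in RAM instead of on disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO features (name) VALUES (?)", [("road",), ("river",)])
count = conn.execute("SELECT COUNT(*) FROM features").fetchone()[0]  # 2
conn.close()
```

An FDO connection pointed at :memory: would similarly keep the whole store in RAM, trading persistence for I/O speed.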


by Jackie Ng (noreply@blogger.com) at July 31, 2015 03:12 PM

Darren Cope

Create World Files For a Directory of KMZ/KML Raster Files

A while back, I worked on a project that required the conversion of a number of KML/KMZ (Google Earth) raster files into vector format (don’t ask!). Because there were a lot of files, it was painstaking to manually geo-reference them after unzipping the KMZ to extract the raster files. I dug around on the web and was able to find two tools that did the job. The first, WorldFileTool, works great, but must be run individually for each file (i.e. you can’t run it in a batch over multiple files in a directory). I use this tool if I’m only converting a single file, or less than a handful at a time.

 

The other option I found was a shell script created by Nicolas Moyroud, who had made it available at this link. However, the link now appears to be broken, and I can’t find another reference to the file. As it’s tagged with a “GNU/GPL v3 – Free use, distribution and modification” license, I’m posting a copy here for others who may find it of use. Note that all credit for this file goes to Nicolas Moyroud; I have no claim to this work!
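Since the script itself is linked rather than shown, here is a minimal Python sketch of the same idea for a single file, assuming a north-up GroundOverlay with a LatLonBox and known pixel dimensions; the world_file_from_kml helper and the sample bounds are illustrative, not Moyroud's code:

```python
import xml.etree.ElementTree as ET

def world_file_from_kml(kml_text, width_px, height_px):
    """Build the six world-file values from a KML GroundOverlay LatLonBox.

    Assumes a north-up image (no rotation) with bounds in decimal degrees.
    """
    ns = {"kml": "http://www.opengis.net/kml/2.2"}
    box = ET.fromstring(kml_text).find(".//kml:LatLonBox", ns)
    north = float(box.find("kml:north", ns).text)
    south = float(box.find("kml:south", ns).text)
    east = float(box.find("kml:east", ns).text)
    west = float(box.find("kml:west", ns).text)
    px = (east - west) / width_px    # pixel size in x
    py = (north - south) / height_px  # pixel size in y
    # world file order: x-size, rotation, rotation, negative y-size,
    # then x/y of the *center* of the upper-left pixel
    return [px, 0.0, 0.0, -py, west + px / 2, north - py / 2]

kml = """<kml xmlns="http://www.opengis.net/kml/2.2"><GroundOverlay>
<LatLonBox><north>46.0</north><south>45.0</south>
<east>-79.0</east><west>-80.0</west></LatLonBox>
</GroundOverlay></kml>"""
lines = world_file_from_kml(kml, 1000, 1000)
```

Each value is written on its own line to a file named after the image (e.g. .jgw for a .jpg); note the last two values reference the center, not the corner, of the upper-left pixel.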

Click here to download the file!

 

Edit: I also found a working download link and Nicolas’ descriptive page here (en français). He must have changed his URL structure since my original StackExchange post!

by darrencope at July 31, 2015 12:42 PM

July 29, 2015

GeoMonday

GeoMonday 2015.3 – A new dimension in Geo

Intro

One of the most discussed topics of recent years is the rise of UAVs, MAVs, RPVs, or simply drones. In their latest evolutions, these flying devices are equipped not only with full-HD cameras but also with GPS receivers and even small computers. This equipment is the basis for next-generation aerial geo-data and services. In the 3rd edition of GeoMonday we will cover the whole lifecycle, from creation and processing up to integration for location intelligence and services. It’s a special pleasure for us to hold our session for the first time in the beautiful city center of Potsdam, thanks to our partner Zukunftsagentur Brandenburg.

Agenda:

When: Monday September 14th, 2015, starting 6pm sharp
Where: arcona Hotel am Havelufer, Zeppelinstraße 136, 14471 Potsdam

You are welcome to join our event to become part of the GeoMonday community. Get your free tickets here:
https://geomonday2015-3.eventbrite.de

We will announce our speakers in the next days and weeks, so stay tuned…


by j.n. mobile at July 29, 2015 09:16 PM

Markus Neteler

QGIS 2.10 RPMs for Fedora 21, Centos 7, Scientific Linux 7

Thanks to the work of Volker Fröhlich and other Fedora/EPEL packagers, I was able to create RPM packages of QGIS 2.10 Pisa for Fedora 21, CentOS 7, and Scientific Linux 7 using the great COPR platform.

Repo: https://copr.fedoraproject.org/coprs/neteler/QGIS-2.10-Pisa/

The following packages can now be installed and tested on epel-7-x86_64 (CentOS 7, Scientific Linux 7, etc.) and Fedora-21-x86_64:

  • qgis 2.10.1
  • qgis-debuginfo 2.10.1
  • qgis-devel 2.10.1
  • qgis-grass 2.10.1
  • qgis-python 2.10.1
  • qgis-server 2.10.1

Installation instructions (run as “root” user or use “sudo”):

su

# EPEL7:
yum install epel-release
yum update
wget -O /etc/yum.repos.d/qgis-2-10-epel-7.repo https://copr.fedoraproject.org/coprs/neteler/QGIS-2.10-Pisa/repo/epel-7/neteler-QGIS-2.10-Pisa-epel-7.repo
yum update
yum install qgis qgis-grass qgis-python

# Fedora 21:
dnf copr enable neteler/QGIS-2.10-Pisa
dnf update
dnf install qgis qgis-grass qgis-python

Enjoy!

The post QGIS 2.10 RPMs for Fedora 21, Centos 7, Scientific Linux 7 appeared first on GFOSS Blog | GRASS GIS Courses.

by neteler at July 29, 2015 06:55 PM

Jackie Ng

A (Better) map viewer template

Based on blog and GitHub stats, the Bootstrap map viewer templates that I talked about in a previous post seem to be my most popular repo and my second most popular post of all time, respectively. I'm glad you like them :)

But to tell the truth, I have since found something significantly better, and in a humble acknowledgement that someone has made something better than what I have, I suggest you give it a try.

I am talking about Tobias Bienek's sidebar-v2 responsive map viewer template.

Just look at this screenshot!


This template is basically my dream responsive map viewer layout fully realized, one I've been trying to replicate ever since I first heard of Bootstrap.

A cursory glance at the repo shows that it's written in the most unobtrusive HTML, CSS, and JavaScript, which means actual integration of Bootstrap-styled content should be a simple affair, and its examples already cover all the possible map viewer solutions you would ever use.

So if what I've made is less than desirable, give this template a go.

by Jackie Ng (noreply@blogger.com) at July 29, 2015 01:36 PM

gvSIG Team

11th International gvSIG Conference: “It’s possible. It’s real”

Once again this year, the reference conference of the gvSIG Community, and one of the most relevant international events in free geomatics, will take place. The 11th International gvSIG Conference will be held from December 2nd to 4th, 2015, under the slogan “It’s possible. It’s real”.

Call for papers

The conference program has been excellent in recent years, and we’re sure this year will be no different. We look forward to your proposals about big and small projects, case studies, research and university work, developments on gvSIG and gvNIX, use of standards, and Spatial Data Infrastructures built on free geomatics…

If you are interested, the call for papers is now open. As of today, communication proposals can be sent to conference-contact@gvsig.com; they will be evaluated by the scientific committee for inclusion in the conference program.

There are two types of communication: paper or poster. Information regarding the rules for presenting communications can be found in the Communications section of the website. Abstracts will be accepted until September 25th.

Sponsors

Organizations interested in collaborating on the event can find information in the “How to collaborate” section of the conference website. We especially call on the institutions and companies that use gvSIG technology to collaborate on this event.

 


Filed under: community, english, events, gvSIG Association Tagged: 11th International gvSIG Conference

by Mario at July 29, 2015 11:40 AM

gvSIG Team

11as Jornadas Internacionales gvSIG: “It’s possible. It’s real”

Once again this year we will hold the reference conference of the gvSIG community and one of the most relevant free-geomatics events at the international level. The 11as Jornadas Internacionales gvSIG will take place from December 2nd to 4th, 2015, under the slogan “It’s possible. It’s real”.

Call for papers

For several years now the conference program could only be described as excellent, and we are sure this year will be no different. We look forward to your proposals: big and small projects, use cases, research and university work, developments with gvSIG and gvNIX, use of standards, and Spatial Data Infrastructures built with free geomatics…

If you are interested in participating, the period for submitting communication proposals is now open. Proposals can be sent to conference-contact@gvsig.com and will be evaluated by the scientific committee for inclusion in the conference program. There are two types of communication: talk and poster. All the information about the rules for presenting communications is available in the “Comunicaciones” section of the website. The deadline for receiving abstracts is September 25th.

Sponsors

Organizations interested in collaborating on the event can find information in the “¿Cómo colaborar?” section of the conference website. We especially call on the companies and institutions that use gvSIG technology to collaborate in making this event happen.


Filed under: events, spanish Tagged: 11as Jornadas Internacionales

by Alvaro at July 29, 2015 06:53 AM

July 28, 2015

OSGeo News

11th International gvSIG Conference: "It's possible. It's real"

by aghisla at July 28, 2015 01:51 PM

gvSIG Team

Workshops at the 7as Jornadas gvSIG LAC: Scripting with gvSIG 2.2

Hello everyone,
The “7as Jornadas gvSIG de Latinoamérica y Caribe” are almost here; this year they are being held at the Faculty of Geography of the Universidad Autónoma del Estado de México. Among the various workshops and presentations there will be a “Scripting with gvSIG 2.2” workshop.

Who is the “Scripting with gvSIG 2.2” workshop aimed at?

Users with knowledge of gvSIG and Python/Jython, and developers who want to get started writing scripts on top of gvSIG.

What should attendees know and bring?

You should know:

  • The gvSIG desktop application.
  • Programming basics.
  • The Python programming language; this is highly recommended.

If you have no programming or Python knowledge, you can still attend the workshop and follow it as a talk, even though you won't be able to complete all the exercises we will work through.

What you will need if you want to follow the workshop on your own laptop:

A gvSIG 2.2 installation that works correctly and has the scripting add-on installed.

As for the operating system, it should be possible to follow along on both Linux and Windows. In my case I will teach the workshop on Linux (64-bit Kubuntu).

What will we cover in the workshop?

It is a three-to-four-hour workshop, so there is time to cover some interesting material.
The idea is to divide it into three blocks:

  • A basic-level introduction, intended for users with little knowledge of programming and Python. We will see how to manipulate spatial data and generate new layers from existing ones.
  • An intermediate-level example of manipulating maps (the map document). We will see some tricks for customizing our maps with a bit of scripting…
    • Adjusting the map extent to display
    • Customizing its labels
    • Loading custom images

    We will also see some tricks for discovering which operations are available on the various components and objects we have access to, as well as where to look up information about some of them.

  • An advanced-level example of creating user interfaces from scripting. We will see how to create forms with an external Java tool that lets us easily build user interfaces tightly integrated with the rest of gvSIG, and how to use them from a script…
    • Creating forms in a simple way
    • Loading them from a script and accessing their components
    • Attaching operations to the buttons of our forms

    We will build on what we created in the previous block, giving it a graphical interface and ending up with a tool that lets us customize our maps.

Depending on what interests the attendees, we will give more weight to some parts than others, adapting the workshop to them.

Before the conference I hope to publish another short article pointing to some documentation on what we will cover during the workshop.

Remember that the workshops are free, as are all the conference activities, and that to attend you need to register for the conference via the following link:

http://www.gvsig.com/es/eventos/jornadas-lac/2015/inscripcion

Previous posts about workshops at the 7as Jornadas gvSIG LAC:

Greetings to all!


Filed under: development, events, gvSIG Desktop, spanish Tagged: 7as LAC, jython, python, scripting, scripts, taller

by Joaquin del Cerro at July 28, 2015 10:15 AM

July 27, 2015

GeoTools Team

FOSS4G 2015 Europe talk from Ian Turton

I would like to pass on an excellent talk by Ian Turton.

This presentation is a great introduction to using and enjoying open source software, entertainingly illustrated with examples from the GeoTools project.

A big thanks to Ian for this presentation and his long-standing outreach efforts. Champions like this make GeoTools a friendly place to work for everyone involved.

We hope everyone enjoyed FOSS4G 2015 Europe and look forward to seeing you in Seoul!

Earning Your Support Instead of Buying It

by Jody Garnett (noreply@blogger.com) at July 27, 2015 10:38 PM

From GIS to Remote Sensing

Minor Update: Semi-Automatic Classification Plugin v. 4.5.1


This post is about a minor update for the Semi-Automatic Classification Plugin for QGIS, version 4.5.1.



Here is the changelog:
- changed function for ROI display
- bug fixes

by Luca Congedo (noreply@blogger.com) at July 27, 2015 09:23 PM

From GIS to Remote Sensing

Major Update: Semi-Automatic Classification Plugin v. 4.5.0


This post is about a major update for the Semi-Automatic Classification Plugin for QGIS, version 4.5.0.



Here is the changelog:
- added reflectance conversion for Landsat 1, 2, and 3 MSS
- improved the Band calc calculation, which now allows calculations between rasters with different sizes and resolutions
- experimental version
- bug fixes

by Luca Congedo (noreply@blogger.com) at July 27, 2015 09:23 PM

GeoSolutions

Developer’s Corner: Improved NetCDF/GRIB support on GeoServer

GeoServer

Dear Readers,

Today we want to share some improvements we have made to GeoServer's support for NetCDF and GRIB datasets.

As you know, NetCDF and GRIB are commonly used formats in the Meteorological and Oceanographic (MetOc) context for observational data and numerical modeling: platform-independent formats for representing multidimensional, array-oriented scientific data. For instance, air temperature, water current, or wind speed computed by mathematical models across multiple dimensions (such as time and depth/elevation), or physical quantities measured by sensors, may be served as NetCDF datasets.

In recent years we have improved the GeoTools library in this area by providing a NetCDF plugin based on the Unidata NetCDF Java library. It is worth pointing out that the same library also provides access to GRIB datasets. As a result, you can configure a NetCDF/GRIB coverage store in GeoServer for a NetCDF/GRIB file and set up different coverages/layers, one for each NetCDF variable/GRIB parameter available in the input file, together with its underlying dimensions (time, elevation, ...).

 

Whilst the standalone NetCDF/GRIB store provides access to a single NetCDF/GRIB file, exposing it as a self-contained coverage store, multiple datasets can be served as a single ImageMosaic coverage store. This is especially useful when you have to deal with a collection of files representing different runs and forecasts of a meteorological model. Think of a meteorological agency running a model each day, producing N forecasts per day at a 1-hour step. In that case, an ImageMosaic can be configured on top of the folder containing the related NetCDF/GRIB datasets. Moreover, scripts running periodically can automatically add new files to it, updating the available data with the latest forecasts.

With this approach you can configure a coverage store based on an ImageMosaic, so that you can send WMS GetMap requests to depict, for instance, wind currents at specific heights above ground, or send WCS GetCoverage requests to retrieve raw precipitation data at different times.

The GeoSolutions SpatioTemporal data training contains some WMS/WCS request examples involving the DLR’s Polyphemus Air Quality modeling System datasets.

Creating meaningful maps for the user requires proper styling to be applied to raw data. SLD lets you customize the rendering of your NetCDF/GRIB datasets.

For instance:
  • you can show wave height as a dynamic color map (more details about this capability, and the SLD to use, can be found here)
  • you can combine color-map styles and contouring to show air temperature (here you can find the SLD for contouring a DEM; adapting it to the air temperature case only requires changing the levels parameter values from line 22)
  • you can combine wind components and represent them through wind-barb symbology

More details on NetCDF/GRIB styles and other rendering transformations can be found in the Rendering Transformations section of the SpatioTemporal training. In this context, you can also take a look at this blog post about the wind barbs depicted in the previous example. The full SLD for the wind-barbs example is available here.

Whilst WMS lets you create maps/portrayals with custom styling for a specific slice of a NetCDF/GRIB variable, WCS lets you retrieve raw data for a “hypercube” spanning multiple values across different dimensions.

In this context, WCS 2.0 defines:

  • Trimming subsetting to specify a range across a dimension
  • Slicing subsetting to specify a single value for a dimension

Standard output formats such as GeoTIFF and ArcGrid cannot encode the multiple “2D slices” of the same coverage that correspond to the different time/elevation ranges involved in the request.

Therefore, a NetCDF output format has been developed to store all the requested portions of a coverage in a single multidimensional file.

For instance, a request like this: http://localhost:8080/geoserver/wcs?request=GetCoverage&service=WCS&version=2.0.1&coverageId=geosolutions__NO2&Format=application/x-netcdf&subset=http://www.opengis.net/def/axis/OGC/0/Long(5,20)&subset=http://www.opengis.net/def/axis/OGC/0/Lat(40,50)&subset=http://www.opengis.net/def/axis/OGC/0/elevation(300,1250)&subset=http://www.opengis.net/def/axis/OGC/0/time("2013-03-01T10:00:00.000Z","2013-03-01T22:00:00.000Z") will create a NetCDF file containing all the available data for the NO2 (nitrogen dioxide) coverage by:
  • Trimming on latitude [40 -> 50]
  • Trimming on longitude [5 -> 20]
  • Trimming on elevation [300 -> 1250]
  • Trimming on time [2013-03-01T10:00:00.000Z -> 2013-03-01T22:00:00.000Z]

That request retrieves all NO2 data within the elevation range [300, 1250] for the time period from 2013-03-01 at 10 AM to 2013-03-01 at 10 PM, in the bounding box with corners 5°E 40°N and 20°E 50°N.
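Building such a request in code is straightforward; a small sketch using only the Python standard library (against the same example localhost endpoint) shows the repeated subset parameters, one per trimmed dimension:

```python
from urllib.parse import urlencode

# A WCS 2.0 GetCoverage request with one trimming subset per dimension,
# in the form axis(low,high); the localhost endpoint is the example's own.
base = "http://localhost:8080/geoserver/wcs"
params = [
    ("request", "GetCoverage"),
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("coverageId", "geosolutions__NO2"),
    ("Format", "application/x-netcdf"),
    ("subset", "http://www.opengis.net/def/axis/OGC/0/Long(5,20)"),
    ("subset", "http://www.opengis.net/def/axis/OGC/0/Lat(40,50)"),
    ("subset", "http://www.opengis.net/def/axis/OGC/0/elevation(300,1250)"),
    ("subset", 'http://www.opengis.net/def/axis/OGC/0/'
               'time("2013-03-01T10:00:00.000Z","2013-03-01T22:00:00.000Z")'),
]
url = base + "?" + urlencode(params)  # values are percent-encoded automatically
```

A slicing subset would instead pass a single value, e.g. elevation(450), collapsing that dimension in the output.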

On Panoply, the output will look like this. You can notice multiple values available across dimensions: 13 time values and 6 elevation values, which can be combined to get 6 × 13 = 78 different 2D slices of the requested coverage.

An important improvement we recently made in handling NetCDF and GRIB files is support for different projections. In the beginning, the GeoTools/GeoServer NetCDF plugins only supported WGS84-based datasets, due to missing logic to parse projection-related information. Lately (GeoServer 2.8.x), the NetCDF input format has been improved to support different coordinate reference systems expressed through a GridMapping, as per the NetCDF-CF conventions, so we can support Lambert Conformal, Stereographic, Transverse Mercator, Albers Equal Area, Azimuthal Equal Area, and Orthographic projections. The NetCDF-CF GridMapping approach requires associating a NetCDF variable containing the projection information with the multidimensional variable containing the data defined in that projection. For instance, your dataset may contain an icing_probability variable declaring a grid_mapping = "LambertConformal_Projection" attribute, as well as a "LambertConformal_Projection" variable containing this definition:

  int LambertConformal_Projection;
    :grid_mapping_name = "lambert_conformal_conic";
    :latitude_of_projection_origin = 25.0; // double
    :longitude_of_central_meridian = -90.0; // double
    :standard_parallel = 25.0; // double
    :earth_radius = 6371229.0; // double

This information will be internally parsed to set up a coordinate reference system. Moreover, a custom EPSG definition matching that CRS should be added to GeoServer’s user_projection definitions, in order to have a valid code identifying that custom projection. More information on this topic is available as part of the GeoServer NetCDF community module documentation.

Finally, the NetCDF output format has been improved too, in order to:

  • support the NetCDF4-Classic format. Long story short, NetCDF4-Classic adds support for NetCDF4’s HDF5 compression capability to the NetCDF3 data model. This may help reduce the size of the requested data when a GetCoverage request involves a wide domain (in terms of bbox, time, elevation, …).
    • NetCDF4 output will be created automatically when the GetCoverage request specifies application/x-netcdf4 instead of application/x-netcdf.
    • NetCDF4 support requires the NetCDF native libraries to be available. More details are available on the related GeoServer page.
  • support CF (Climate and Forecast) name mapping. Input coverages can be renamed on write, in order to use standard names from the CF convention. Moreover, if a unit-of-measure change is involved in this remapping, the data values will be converted to the canonical unit of measure. Thus, you could configure the NetCDF output for a custom “Celsius_Degrees_temp” layer to produce an “air_temperature” output expressed in K (kelvin), respecting the CF naming convention.
  • support DataPacking on write. This feature allows storing, for instance, a 64-bit floating-point coverage as a 16-bit integer data type through a scale-and-offset mechanism.
  • support user-defined global attributes to be added to the output coverage.
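The scale-and-offset mechanism behind DataPacking can be sketched in a few lines of plain Python; this illustrates the general technique, not GeoServer's actual implementation:

```python
def pack(values, nbits=16):
    """Scale-and-offset packing of floats into nbits-bit signed integers.

    Follows the common NetCDF convention: packed = round((v - offset) / scale).
    Illustrative only.
    """
    lo, hi = min(values), max(values)
    nsteps = 2 ** nbits - 2              # leave one code point free for a fill value
    scale = (hi - lo) / nsteps or 1.0    # guard against constant data
    offset = lo + (nsteps // 2) * scale  # center the packed range on zero
    packed = [round((v - offset) / scale) for v in values]
    return packed, scale, offset

def unpack(packed, scale, offset):
    # lossy round trip: each value is recovered to within scale/2
    return [p * scale + offset for p in packed]

data = [273.15, 280.0, 301.5]            # e.g. air temperature in kelvin
packed, scale, offset = pack(data)
restored = unpack(packed, scale, offset)
```

The quantization error is bounded by half the scale factor, which is why packing works well for smoothly varying physical fields.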
All these new options can be customized through an additional panel available when configuring a layer. More information on the NetCDF output features is available as part of the GeoServer NetCDF Output community module documentation.

We hope you enjoy the NetCDF/GRIB support in GeoServer.

The GeoSolutions Team

by Andrea Filosa at July 27, 2015 03:48 PM

gvSIG Team

7as Jornadas de Latinoamérica y Caribe: Program

The program for the 7as Jornadas de Latinoamérica y Caribe de gvSIG is now available.

As you may already know, this year the LAC conference will be held in Toluca, Mexico, from August 26th to 28th.

On the conference website you can consult the program of activities, with the talks that will be presented: a very complete program that shows the variety of uses of gvSIG technology and exemplifies the growing adoption of gvSIG across the most diverse fields and geographies. The conference will also be an opportunity to get to know the gvSIG Association's new products, such as gvNIX and gvCity.

To complement the program, for anyone who wants more information about the LAC conference we have been publishing posts about some of the workshops that will take place there (and we will keep publishing more over the coming weeks).

We remind you that registration is still open and, as with every gvSIG event, free of charge. Capacity is limited (there are already more than two hundred registrants), so we recommend you not wait until the last days to register.

We would also like to mention that several members of the gvSIG team will be at the LAC conference, actively participating with workshops and talks. And, of course, we hope it will also be a good occasion to talk, establish collaborations, and add new contributors to the community…

We look forward to seeing you!


Filed under: community, events, spanish Tagged: 7as LAC

by Alvaro at July 27, 2015 11:22 AM

July 26, 2015

Antonio Santiago

Generate and host your own raster tiles customized with Mapbox Studio

If you have ever seen maps hosted on the Mapbox platform, you will probably agree on the quality of their design. The business of Mapbox is to host and serve geospatial data, so all the great tools Mapbox provides are oriented toward helping users prepare and work with their data.

One of those tools is Mapbox Studio (MbS), a desktop application for creating CartoCSS themes that are later used to generate raster tiles. Briefly explained, what MbS does is download OpenStreetMap data in vector format and render it on the fly, applying the specified CartoCSS style.

The result of working with MbS is not a set of tiles but a style: a set of rules expressing which colour must be used to render roads, at which zoom levels labels must appear and at what size, which colour must be used for land, and so on. This style can later be uploaded to the Mapbox platform so that raster tiles are generated in the cloud, and we can consume those tiles by paying for the service. (I hope one day I can contract their services; they deserve it for their great work.)

The question we might ask ourselves is: how can we generate the raster tiles locally from a given MbS style?

Well, this article is about that. Continue reading.

Working with Mapbox Studio to create your custom style

Let’s start from the beginning: download the Mapbox Studio application and install it on your system. Once installed, run it and you will be asked to connect to the Mapbox platform.

There are two main reasons why Mapbox requires you to register as a user. First, the power of the platform is in the cloud, and the goal is for you to upload all your data to their servers. That includes the styles you create.

Second, MbS retrieves data in vector format from Mapbox servers. When you register as a user you get an API token that identifies your requests: each time MbS makes a request to fetch data, it carries your token. This way Mapbox can detect whether a user is misusing the platform.
Screen Shot 2015-07-25 at 23.43.26

Once logged in you will be allowed to create new map styles. The easiest way is to start from one of the starter styles created by the great Mapbox designers:

Screen Shot 2015-07-25 at 23.47.38

Here we have chosen the Mapbox Outdoors style. In the image you can see the style code (CartoCSS, which is inspired by CSS) and the resulting tiles obtained by painting the vector information with the given style rules:

CartoCSS is a Mapnik stylesheet pre-processor developed by MapBox and inspired by Cascadenik. It is like a CSS language specially developed to style maps.
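As a rough illustration (these selectors, variables and values are invented for the example, not taken from the Outdoors style), a few CartoCSS rules look like this:

```cartocss
/* hypothetical example rules */
@land: #f8f4f0;

#water {
  polygon-fill: #a0c8f0;
}

#road[class='motorway'][zoom>=6] {
  line-color: #ffc345;
  line-width: 1.5;
}
```

Each rule selects a vector layer (optionally filtered by attribute and zoom) and assigns symbolizer properties to it, much like CSS selectors assign styles to HTML elements.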

Screen Shot 2015-07-25 at 23.54.20

Store the style under a new name somewhere on your computer, for example customstyle. If you look at your disk you will see a customstyle.tm2 folder has been created, containing a bunch of files that define the style rules (take a look, they are not dangerous).

Finally, modify some properties, for example the @land or @crop colours, and save to see the result:

Screen Shot 2015-07-25 at 23.54.47

Great!!! You have just created your first custom style.

Generating raster tiles from MbS style

Looking for a solution I discovered the tessera and tl tools. Tessera is a Node-based command line application. It is built on some modules from Mapbox (specifically tilelive) plus others implemented by the author (Seth Fitzsimmons). The result is that we can run tessera passing an MbS-defined style, open a browser pointing to a local address, and see a map with the raster tiles generated from our MbS style.

Similarly, tl is a Node-based command line tool we can run, passing a set of options, to generate an MBTiles file or a pyramid of tiles following the well-known z/x/y.png layout.

I learned about both tools from the article Converting Mapbox Studio Vector Tiles to Rasters on Azavea Labs.

How to install the tools?

NOTE: You need to have NodeJS installed on your system, along with the npm package manager command line tools.

I don’t like installing global node packages (or at least no more than necessary), so I’m going to install the previous tools in a custom folder:

> mkdir tiletools
> cd tiletools

Inside the directory execute the next command, which installs the tessera and tl packages among others:

> npm install tessera tl mbtiles mapnik tilelive tilelive-file tilelive-http tilelive-mapbox tilelive-mapnik tilelive-s3 tilelive-tmsource tilelive-tmstyle tilelive-utfgrid tilelive-vector tilejson

You will see a directory named node_modules has been created, which contains a subdirectory for each of the previous packages.

Running tessera

Let’s try to run tessera for the first time. Because it is installed as a local node module, execute:

> ./node_modules/tessera/bin/tessera.js 

Usage: node tessera.js [uri] [options]

uri     tilelive URI to serve

Options:
   -C SIZE, --cache-size SIZE          Set the cache size (in MB)  [10]
   -c CONFIG, --config CONFIG          Provide a configuration file
   -p PORT, --port PORT                Set the HTTP Port  [8080]
   -r MODULE, --require MODULE         Require a specific tilelive module
   -S SIZE, --source-cache-size SIZE   Set the source cache size (in # of sources)  [10]
   -v, --version                       Show version info

A tilelive URI or configuration file is required.

Tessera requires you to pass a URI so it can serve its content. It accepts URIs for Mapbox-hosted files, Mapnik, TileMill, Mapbox Studio, …

Run it again, this time passing the path to our previously created style with the tmstyle:// protocol:

> ./node_modules/tessera/bin/tessera.js tmstyle://./customstyle.tm2
Listening at http://0.0.0.0:8080/

/Users/antonio/Downloads/tiletools/node_modules/tessera/server.js:43
        throw err;
              ^
Error: A Mapbox access accessToken is required. `export MAPBOX_ACCESS_TOKEN=...` to set.
...

At first tessera seems to be working on port 8080, but then we get an error about MAPBOX_ACCESS_TOKEN. If you remember from the first section, Mapbox requires all requests to be signed with the user token. So you need to get the access token from your account and set it as an environment variable before executing tessera:

> export MAPBOX_ACCESS_TOKEN=your_token_here
> ./node_modules/tessera/bin/tessera.js tmstyle://./customstyle.tm2
Listening at http://0.0.0.0:8080/

/Users/antonio/Downloads/tiletools/node_modules/tessera/server.js:43
        throw err;
              ^
Error: Failed to find font face 'Open Sans Bold' in FontSet 'fontset-0' in FontSet

We are close to making it work. The problem now is that our MbS style uses a font we have not installed on our system. One easy, but brute-force, solution is to install all Google Web Fonts on your system. For this purpose you can use the Web Font Load installation script. In my case I have installed them in the user’s fonts folder ~/Library/Fonts.

Once the fonts are installed, try executing tessera again:

> ./node_modules/tessera/bin/tessera.js tmstyle://./customstyle.tm2
Listening at http://0.0.0.0:8080/

/Users/antonio/Downloads/tiletools/node_modules/tessera/server.js:43
        throw err;
              ^
Error: Failed to find font face 'Open Sans Bold' in FontSet 'fontset-0' in FontSet

That’s a bit strange: we have just installed the fonts but they are not found. What is happening? Well, tessera uses Mapnik to create the raster tiles, and Mapnik looks for fonts in the folders specified by the environment variable MAPNIK_FONT_PATH, so let’s define the variable:

> export MAPNIK_FONT_PATH=~/Library/Fonts/

and execute the script again:

> ./node_modules/tessera/bin/tessera.js tmstyle://./customstyle.tm2
Listening at http://0.0.0.0:8080/

/Users/antonio/Downloads/tiletools/node_modules/tessera/server.js:43
        throw err;
              ^
Error: Failed to find font face 'Arial Unicode MS Regular' in FontSet 'fontset-0' in FontSet

OMG!!! This seems a never-ending story. Now we need to install the Arial Unicode font. Look for it, install it on your system and execute tessera again:

> ./node_modules/tessera/bin/tessera.js tmstyle://./customstyle.tm2
Listening at http://0.0.0.0:8080/

Great!!! It seems tessera is working fine. Let’s open our browser at http://localhost:8080 and see the result:

A map implemented with the Leaflet web mapping library is shown, rendering raster tiles that are created on the fly. Look at the console to see the tessera output:

We can see how the tiles at the current zoom level, 8, have been generated.

At this point we have tessera working, but what about generating a local pyramid of tiles for given zoom levels and a given bounding box?

Generating a custom pyramid of tiles with tl command line tool

Before continuing, we need to know which bounding box we want to generate: the whole world or only a piece? In my case I want three zoom levels (7, 8 and 9) wrapping Catalonia.

There are some online tools you can use to get the bbox of a region; one I like is the Bounding Box Tool from Klokan Technologies.

The tl tool can run three main commands, but we are only interested in the copy one, which copies data between two providers. In our case the MbS style is one provider and the file system is the other. Run the tl command to see the available options:

> ./node_modules/tl/bin/tl.js copy -help
'-p' expects a value


Usage: node tl.js copy <source> <sink> [options]

source     source URI
sink       sink URI

Options:
   -v, --version                 Show version info
   -b BBOX, --bounds BBOX        WGS84 bounding box  [-180,-85.0511,180,85.0511]
   -z ZOOM, --min-zoom ZOOM      Min zoom (inclusive)  [0]
   -Z ZOOM, --max-zoom ZOOM      Max zoom (inclusive)  [22]
   -r MODULE, --require MODULE   Require a specific tilelive module
   -s SCHEME, --scheme SCHEME    Copy scheme  [scanline]
   -i FILE, --info FILE          TileJSON

copy data between tilelive providers

So let’s execute the command to copy data from our MbS style to the local tiles folder. We want to generate tiles from zoom level 7 to 9, indicating a bounding box wrapping Catalonia.

Remember the -b option must be indicated as [minLon minLat maxLon maxLat].

> ./node_modules/tl/bin/tl.js copy -z 7 -Z 9 -b "0.023293972 40.4104003077 3.6146087646 42.9542303723" tmstyle://./customstyle.tm2/ file://./tiles
Segmentation fault: 11

Ouch!!! That hurts: a segmentation fault. After looking around for a while I realised it seems to be a bug. To solve it, go to tl/node_modules/abaculus/node_modules and remove the mapnik folder. It is redundant because there is another one installed in the parent folder.

Execute the command again and see the output:

The tl tool has created a local tiles directory and generated all the raster tiles for the given zoom levels and bounding box. The output also shows the time required to generate each tile.
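Before (or after) running the copy, it can be handy to estimate how many tiles a bounding box and zoom range will produce. This is a small sketch using the standard slippy-map tile math — it is not part of tl itself, and the function name is mine:

```python
import math

def tile_range(bbox, z):
    """Tile indices covering a WGS84 bbox at zoom z (slippy-map scheme)."""
    minlon, minlat, maxlon, maxlat = bbox
    n = 2 ** z

    def xtile(lon):
        return int((lon + 180.0) / 360.0 * n)

    def ytile(lat):
        lat_rad = math.radians(lat)
        return int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)

    # Tile y indices grow southward, so the max latitude gives the smallest y.
    return (xtile(minlon), xtile(maxlon)), (ytile(maxlat), ytile(minlat))

bbox = (0.023293972, 40.4104003077, 3.6146087646, 42.9542303723)  # Catalonia
total = 0
for z in (7, 8, 9):
    (x0, x1), (y0, y1) = tile_range(bbox, z)
    count = (x1 - x0 + 1) * (y1 - y0 + 1)
    print("zoom %d: %d tiles" % (z, count))
    total += count
print("total: %d tiles" % total)  # 49 tiles for this bbox and zoom range
```

Counts grow roughly fourfold per zoom level, which is why generating 200+ levels locally quickly becomes impractical.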

That’s all. Now we only need to host the tiles on our own servers!!!
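Once the z/x/y.png pyramid exists it is plain static content, so any web server will do. A minimal sketch with Python’s built-in http.server (the port and the tiles directory name are just examples matching the tl output above):

```python
import functools
import http.server

# Serve the ./tiles directory created by tl as static z/x/y.png tiles.
Handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory="tiles")
server = http.server.ThreadingHTTPServer(("", 8000), Handler)
print("Serving tiles at http://localhost:8000/{z}/{x}/{y}.png")
# server.serve_forever()  # uncomment to actually serve; blocks until Ctrl-C
```

A Leaflet or OpenLayers client can then point its tile layer URL template at that address.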

 


by asantiago at July 26, 2015 02:47 PM

July 25, 2015

OSGeo News

GRASS GIS 7.0.1 RC2 released

by jsanz at July 25, 2015 11:29 AM

Bjorn Sandvik

Master maps

I’m going freelance over the summer, after 5 great years at the Norwegian Broadcasting Corporation (NRK). It was not an easy decision, but I have to try. I’ll tell more about my plans later. Please sign up to get notified about my services.



Some of the projects I've been working on at NRK:

The flexible mapping stack of NRK.no, allowing journalists and digital storytellers to create advanced maps in minutes. 

"Kartoteket" - our in-house mapping tool built on top of our mapping stack.

Digital storytelling using NRKs mapping stack and Mapbox. 

Digital storytelling using NRKs mapping stack and Mapbox. 

Flood maps using NRKs mapping stack and CartoDB.

Radon affected areas in Norway using NRKs mapping stack.

Our popular photo maps

Video map of the long running TV show Norge Rundt.
Tracking of "Sommerbåten" along the coast of Norway.

Other work.

by Bjørn Sandvik (noreply@blogger.com) at July 25, 2015 07:53 AM

July 24, 2015

GeoServer Team

GeoServer 2.7.2 released

The GeoServer team is happy to announce the release of GeoServer 2.7.2. Download bundles are provided (zip, war, dmg and exe) along with documentation and extensions.

GeoServer 2.7.2 is a stable release of GeoServer recommended for production deployment. Thanks to everyone taking part, submitting fixes and new functionality including:

  • Importer raster improvements: added support for GDAL-based file optimization when importing rasters; also, it is now possible to add granules to a mosaic (and optimize them with GDAL in the process)
  • Importer vector improvements: now one can import data into non-JDBC data stores too
  • Some improvements in the documentation on using GDAL based data sources in Windows
  • More tweaks on the XXE vulnerability fixes (we left it open just enough not to break OGC compliance)
  • Properly rendering GeoTiff files with flipped Y axis
  • Making sure WPS really stops answering requests when not enabled
  • Improvements in NetCDF handling of reprojected requests
  • For a full list, see the release notes.

Also, as a heads up for Oracle users, the Oracle store no longer ships with the JDBC driver (due to redistribution limitations imposed by Oracle). For details see the updated Oracle installation instructions here.

Thanks to Andrea (GeoSolutions) and Kevin (Boundless) for this release.

by Andrea Aime at July 24, 2015 07:32 AM

July 23, 2015

Markus Neteler

Sol Katz Award – Call for Nominations

The Open Source Geospatial Foundation would like to open nominations for the 2015 Sol Katz Award for Geospatial Free and Open Source Software.

The Sol Katz Award for Geospatial Free and Open Source Software (GFOSS) will be given to individuals who have demonstrated leadership in the GFOSS community. Recipients of the award will have contributed significantly through their activities to advance open source ideals in the geospatial realm.

Sol Katz was an early pioneer of GFOSS and left behind a large body of work in the form of applications, format specifications, and utilities while at the U.S. Bureau of Land Management. This early GFOSS archive provided both source code and applications freely available to the community. Sol was also a frequent contributor to many geospatial list servers, providing much guidance to the geospatial community at large.

Sol unfortunately passed away in 1999 from Non-Hodgkin’s Lymphoma, but his legacy lives on in the open source world. Those interested in making a donation to the American Cancer Society, as per Sol’s family’s request, can do so at https://donate.cancer.org/index.

Nominations for the Sol Katz Award should be sent to SolKatzAward@osgeo.org with a description of the reasons for this nomination. Nominations will be accepted until 23:59 UTC on August 21st (http://www.timeanddate.com/worldclock/fixedtime.html?month=8&day=21&year=2015&hour=23&min=59&sec=59).
A recipient will be decided from the nomination list by the OSGeo selection committee.

The winner of the Sol Katz Award for Geospatial Free and Open Source Software will be announced at the FOSS4G-Seoul event in September. The hope is that the award will both acknowledge the work of community members, and pay tribute to one of its founders, for years to come.

It should be noted that past awardees and selection committee members are not eligible.

More info at the Sol Katz Award wiki page
http://wiki.osgeo.org/wiki/Sol_Katz_Award

Past Awardees:

2014: Gary Sherman
2013: Arnulf Christl
2012: Venkatesh Raghavan
2011: Martin Davis
2010: Helena Mitasova
2009: Daniel Morissette
2008: Paul Ramsey
2007: Steve Lime
2006: Markus Neteler
2005: Frank Warmerdam

Selection Committee 2015:

Jeff McKenna (chair)
Frank Warmerdam
Markus Neteler
Steve Lime
Paul Ramsey
Sophia Parafina
Daniel Morissette
Helena Mitasova
Martin Davis
Venkatesh Raghavan
Arnulf Christl
Gary Sherman

The post Sol Katz Award – Call for Nominations appeared first on GFOSS Blog | GRASS GIS Courses.

by neteler at July 23, 2015 08:54 PM

Free and Open Source GIS Ramblings

Open source IDF parser for QGIS

IDF is the data format used by Austrian authorities to publish the official open government street graph. It’s basically a text file describing network nodes, links, and permissions for different modes of transport.

Since, to my knowledge, there hasn’t been any open source IDF parser available so far, I’ve started to write my own using PyQGIS. You can find the script which is meant to be run in the QGIS Python console in my Github QGIS-resources repo.
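The parsing idea can be sketched in a few lines. This is a hedged illustration, not the author’s PyQGIS script: the layout assumed here (semicolon-separated lines, tables introduced by "tbl;", attribute names on "atr;" lines, records on "rec;" lines) is my reading of the IDF export format, and the sample data is invented — check the official IDF description before relying on it.

```python
# Minimal IDF-style parser sketch (hypothetical sample data).
sample = """\
tbl;Node
atr;NODE_ID;LON;LAT
rec;1;14.305;46.624
rec;2;14.312;46.630
tbl;Link
atr;LINK_ID;FROM_NODE;TO_NODE
rec;10;1;2
"""

def parse_idf(text):
    """Group rec; rows into dicts keyed by the preceding atr; field names."""
    tables = {}
    name, fields = None, None
    for line in text.splitlines():
        parts = line.split(";")
        if parts[0] == "tbl":          # a new table starts
            name = parts[1]
            tables[name] = []
        elif parts[0] == "atr":        # column names for the current table
            fields = parts[1:]
        elif parts[0] == "rec":        # one record of the current table
            tables[name].append(dict(zip(fields, parts[1:])))
    return tables

tables = parse_idf(sample)
print(tables["Node"][0])
```

In a QGIS console version, each parsed record would then be turned into a QgsFeature on a memory layer.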

I haven’t implemented all details yet but it successfully parses nodes and links from the two example IDF files that have been published so far as can be seen in the following screenshot which shows the Klagenfurt example data:

Screenshot 2015-07-23 16.23.25

If you are interested in advancing this project, just get in touch here or on Github.


by underdark at July 23, 2015 01:54 PM

OSGeo News

Sol Katz Award - Call for Nominations

by jsanz at July 23, 2015 01:18 PM

July 22, 2015

Stefano Costa

William Gibson, archaeologist

Earlier this year, in cold January morning commutes, I finally read William Gibson’s masterpiece trilogy. If you know me personally, this may sound ironic, because I dig geek culture quite a bit. Still, I’m a slow reader and I never had a chance to read the three books before. Which was good, actually, because I could enjoy them deeply, without the kind of teenage infatuation that is quickly gone ‒ and most importantly because I could read the original books, instead of a translation: I don’t think 15-year old myself could read English prose, not Gibson’s prose at least, that easily.

I couldn’t help several moments of excitement for the frequent glimpses of archaeology along the chapters. This could be a very naive observation, and maybe there are countless critical studies that I don’t know of, dealing with the role of archaeology in the Sprawl trilogy and Gibson’s work in general. Perhaps it’s touching for me because I deal with Late Antiquity, that is the closest thing to a dystopian future that ever happened in the ancient world, at least as we see it with abundance of useless objects and places from the past centuries of grandeur. Living among ruins of once beautiful buildings, living at the edge of society in abandoned places, reusing what was discarded in piles, black markets, spirituality: it’s all so late antique. Of course the plot of the Sprawl trilogy is a contemporary canon, and the characters are post-contemporary projections of a (very correctly) imagined future, but the setting is, to me, evoking of a world narrative that I could embrace easily if I had to write fiction about the periods I study.

Count Zero is filled with archaeology, of course especially the Marly chapters. Towards the end it gets more explicit, but it’s there in almost all chapters and it has something to do with the abundance of adjectives, the care for details in little objects. Mona Lisa overdrive is totally transparent about it, since the first pages of Angie Mitchell on the beach:

The house crouched, like its neighbors, on fragments of ruined foundations, and her walks along the beach sometimes involved attempts at archaeological fantasy. She tried to imagine a past for the place, other houses, other voices.

– William Gibson. Mona Lisa Overdrive, p. 35.

But really, you just have to follow Molly along the maze of the Straylight Villa in Neuromancer to realize it’s a powerful theme of all the Sprawl trilogy.

The Japanese concept of gomi, that pervades Kumiko’s view of Britain and the art of Rubin in the Winter Market, is another powerful tool for material culture studies, at least if we have to find a pop dimension where our studies survive beyond the inevitable end of academia.

by Stefano Costa at July 22, 2015 09:17 PM

gvSIG Team

2as Jornadas de gvSIG Perú. “Ciencia, tecnología y desarrollo”

gvSIG_Peru

On 25 and 26 September 2015 the 2nd gvSIG Peru Conference will take place at the Auditorium of the Municipality of Huancayo, under the motto “Ciencia, tecnología y desarrollo” (“Science, technology and development”).

For the second consecutive year, the conference brings together the gvSIG Peru community and everyone interested in free geomatics. This year the gvSIG Association will also be represented, with both Joaquín del Cerro (Head of gvSIG Architecture and Development) and Alvaro Anguix (General Director) taking part through talks and workshops.

The call for papers is now open: proposals can be sent to jornadas.peru@gvsig.org and will be evaluated by the Scientific Committee for inclusion in the conference program. All information on the submission guidelines is available in the Communications section of the website. The abstract submission period closes on 14 August.

Registration for the conference is also open. It is free of charge (seating is limited) and must be done through the form on the website.

 


Filed under: community, events, press office, spanish Tagged: jornadas, Perú

by Alvaro at July 22, 2015 03:01 PM

Paulo van Breugel

QGIS 2.10 Pisa is out!

The new QGIS 2.10 (Pisa) has been released, with many great new features, tweaks and enhancements. Check out the changelog for the highlights (you’ll need some time, it is again an impressive list of improvements and new features). The source code and binaries for Windows, Debian and Ubuntu are already available via the large download […]

by pvanb at July 22, 2015 12:44 PM

GIS for Thought

Multi Ring Buffer – Buffer the Buffer or Incrementally Increasing Distance?

Does it matter, and who cares?

Multi-ring buffers can be useful for simple distance calculations as seen in:
X Percent of the Population of Scotland Lives Within Y Miles of Glasgow
And:
X Percent of the Population of Scotland Lives Within Y Miles of Edinburgh

For these I simply created multiple buffers using the QGIS buffer tool. This works for small samples, but was quite frustrating. I had initially hoped to do the whole analysis in SQLite, which worked pretty well initially, but struggled on the larger buffers. It took too long to run the queries, and did not allow for visualisation. I think using PostGIS would however be pretty feasible.

But creating a multi-ring buffer plugin for QGIS also seemed like a good learning experience. Which got me thinking: does it matter if you create increasingly large buffers around the original feature, or if you buffer the resulting buffer sequentially? My hypothesis was that there would be pretty significant differences due to the rounding of corners.

I asked on StackExchange but the conversation did not really take off:
http://gis.stackexchange.com/questions/140413/multi-ring-buffer-methodology

My question is not about the overlapping-ness of the buffers, since I think multi-ring buffers should be “doughnuts” anyway. But rather if smoothing will occur. The only answer was to try it myself.

Buffer styles:
Buffer the resulting buffer sequentially: Sequential
Buffer the original feature with increasing buffer distance: Central
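The two styles can be compared outside QGIS as well. Here is a small sketch with Shapely (assuming it is installed; the 50 m increment and ten rings are arbitrary choices, and buffer arcs use Shapely's default quadrant segments) that buffers a point centrally versus sequentially:

```python
from shapely.geometry import Point

centre = Point(0, 0)

# Central: buffer the original feature at the full distance in one call.
central = centre.buffer(500)

# Sequential: buffer the previous result by the 50 m increment each time.
sequential = centre.buffer(50)
for _ in range(9):
    sequential = sequential.buffer(50)

# The areas are nearly identical, but the sequential ring accumulates
# vertices because every pass adds join arcs at the previous vertices.
print(len(central.exterior.coords), len(sequential.exterior.coords))
```

This reproduces the direction of the vertex counts reported below: the sequential outer ring carries far more vertices than the central one for essentially the same shape.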

Speed – In seconds
Features Rings Central Sequential
1 5 0.59 0.56
55 5 8.06 6.38
1 200 60.83 31.76
3 200 62.89 40.89
55 200 628.38 586.67
1 2000 203.84 67.00

No matter how you do it the sequential style is quicker, but that may be down to my code.

Rendering

Interestingly, although understandably, the sequential style results in a lot more vertices in the outer rings. For comparison, for a 500-ring buffer the outermost ring had the following vertex counts:

Style Vertices
Central 488
Sequential 30918

We can see this with editing turned on.
Central:
Central_editing
Sequential:
Sequential_editing

We can also see a smoother profile in the sequential buffer. However the difference is not major, and hard to discern with the naked eye.

So we have at most around a 10 m discrepancy with 500 50 m rings, that is, around 25,000 m of distance from the original feature.
Screenshot[34]
This impacts rendering time dramatically; an example with our 500 rings:

Central:

Sequential:

So quicker to create but slower to draw. So which one is better, quicker calculation, or quicker rendering? Or should we not do 200+ ring buffers?

Hard to say. In version 0.2 of the Multi Ring Buffer Plugin there is an option for either in the advanced tab.

Plugin: https://plugins.qgis.org/plugins/Multi_Ring_Buffer/
Please report any issues through GitHub: https://github.com/HeikkiVesanto/QGIS_Multi_Ring_Buffer/issues

by Heikki Vesanto at July 22, 2015 11:00 AM

gvSIG Team

gvSIG Batoví workshop for teachers at Liceo Nº54 in Prado (Montevideo)

Originally posted on gvSIG Batovi:

On Tuesday 14 July a workshop on gvSIG Batoví was held at Liceo 54 (Agraciada 3636, Montevideo). It was aimed at secondary school teachers (mostly of Geography, but also of other subjects) and was attended by 17 teachers from 7 schools in the area (Nos. 6, 16, 18, 56, 71 and 75, in addition to Liceo Nº54 itself). The outcome was very positive.

phoca_thumb_l_2014-09-16-liceo 54 nuevo-rrpp 01 Liceo Nº54 in Prado

IMG_5196 Computer room at Liceo Nº54

The workshop was designed for participants facing a desktop GIS for the first time, but it could also be useful for those already familiar with the technology. The activity consisted of a short introduction to the gvSIG Batoví project, a brief description of the workshop goals, showing how to download and install the program, how to download and…



Filed under: opinion

by Mario at July 22, 2015 07:13 AM

July 21, 2015

Stefano Costa

Being a journal editor is hard

I’ve been serving as co-editor of the Journal of Open Archaeology Data (JOAD) for more than one year now, when I joined Victoria Yorke-Edwards in the role. It has been my first time in an editorial role for a journal. I am learning a lot, and the first thing I learned is that being a journal editor is hard and takes time, effort, self-esteem. I’ve been thinking about writing down a few thoughts for months now, and today’s post by Melissa Terras about “un-scholarly peer review practices […] and predatory open access publishing mechanisms” was an unavoidable inspiration (go and read her post).

Some things are peculiar of JOAD, such as the need to ensure data quality at a technical level: often, though, improvements on the technical side will reflect substantially on the general quality of the data paper. Things that may seem easily understood, like using CSV for tabular data instead of PDF, or describing the physical units of each column / variable. Often, archaeology datasets related to PhD research are not forged in highly standardised database systems, so there may be small inconsistencies in how the same record is referenced in various tables. In my experience so far, reviewers will look at data quality even more than at the paper itself, which is a good sign of assessing the “fitness for reuse” of a dataset.

The data paper: you have to try authoring one before you get a good understanding of how a good data paper is written and structured. Authors seem to prefer terse and minimal descriptions of the methods used to create their dataset, taking many passages for granted. The JOAD data paper template is a good guide to structuring a data paper and to the minimum metadata that is required, but we have seen authors relying almost exclusively on the default sub-headings. I often point reviewers and authors to some published JOAD papers that I find particularly good, but the advice isn’t always heeded. It’s true, the data paper is a rather new and still unstable concept of the digital publishing era: Internet Archaeology has been publishing some beautiful data papers, and I like to think there is mutual inspiration in this regard. Data papers should be a temporary step towards open archaeology data as default, and continuous open peer review as the norm for improving the global quality of our knowledge, wiki-like. However, data papers without open data are pointless: choose a good license for your data and stick with that.

Peer review is the most crucial and exhausting activity: as editors, we have to give a first evaluation of the paper based on the journal scope and then proceed to find at least two reviewers. This requires having a broad knowledge of ongoing research in archaeology and related disciplines, including very specific sub-fields of study ‒ our list of available reviewers is quite long now but there’s always some unknown territory to explore, so asking other colleagues for help and suggestions is vital. Still, there is a sense of inadequacy, a variation on the theme of impostor syndrome, when you have a hard time finding a good reviewer, someone who will provide the authors with positive and constructive criticism, becoming truly part of the editorial process. I am sorry for the fact that our current publication system doesn’t allow for the inclusion of both the reviewers’ names and their commentary ‒ that’s the best way to provide readers with an immediate overview of the potential of what they are about to read, and a very effective rewarding system for reviewers themselves (I keep a list of all peer reviews I’m doing but that doesn’t seem as satisfying). Peer review at JOAD is not double blind, and I think often it would be ineffective and useless to anonymise a dataset and a paper, in a discipline so territorial that everyone knows who is working where. It is incredibly difficult to get reviews in a timely manner, and while some of our reviewers are perfect machines, others keep us (editors and authors) waiting for weeks after the agreed deadline is over. I understand this, of course, being too often on the other side of the fence. I’m always a little hesitant to send e-mail reminders in such cases, partly because I don’t like receiving them, but being an annoyance is kind of necessary in this case.
The reviews are generally remarkable in their quality (at least compared to previous editorial experience I had), quite long and honest: if something isn’t quite right, it has to be pointed out very clearly. As an editor, I have to read the paper, look at the dataset, find reviewers, wait for reviews, solicit reviews, read reviews and sometimes have a conversation with reviewers, checking that their comments are clear and their phrasing/language is acceptable (an adversarial, harsh review must never be accepted, even when formally correct). All this is very time consuming, and since the journal (co)editor is an unpaid role at JOAD and other overlay journals at Ubiquity Press (perhaps obvious, perhaps not!), usually this means procrastinating: summing the impostor syndrome dose from criticising the review provided by a more experienced colleague with the impostor syndrome dose from being always late on editorial deadlines yields frustration. Lots. Of. Frustration. When you see me tweet about a new data paper published at JOAD, it’s not an act of deluded self-promotion, but rather a liberatory moment of achievement. All this may sound naive to experienced practitioners of peer review, especially to those into academic careers. I know, and I still would like to see a more transparent discussion of how peer review should work (not on StackExchange, preferably).

JOAD is Open Access. It’s the true Open Access, not to differentiate between gold and green (a dead debate, it seems) but between two radically different outputs. JOAD is openly licensed under the Creative Commons Attribution license and we require that all datasets are released under open licenses so readers know that they can download, reuse, and incorporate published data in their new research. There is no “freely available only in PDF”: each article is primarily presented as native HTML and can be obtained in other formats (including PDF, EPUB). We could do better, sure ‒ for example, provide the ability to interact directly with the dataset instead of just providing a link to the repository ‒ but I think we will be giving more freedom to authors in the future. Publication costs are covered by Article Processing Charges (£100), which will be paid by the authors’ institutions; in case this is not possible, the fee will be waived. Ubiquity Press is involved in some of the most important current Open Access initiatives, such as the Open Library of Humanities and most importantly does a wide range of good things to ensure research integrity from article submission to … many years in the future.

You may have received an e-mail from me with an invite to contribute to JOAD, either by submitting an article or giving your availability as a reviewer ‒ or you may receive it in the next few weeks. Here, you had a chance to learn what goes on behind the scenes at JOAD.

by Stefano Costa at July 21, 2015 09:18 PM

OSGeo News

GeoForAll - Global Educator of the Year Award 2015

by jsanz at July 21, 2015 08:55 PM

OSGeo News

EarthServer Project goes into the second round

by jsanz at July 21, 2015 08:51 PM

OSGeo News

GeoMoose 2.8.0 Released

by jsanz at July 21, 2015 08:43 PM

Edmar Moretti

How to use KML in i3Geo and the plugin strategy



In i3Geo, KML files can be used in two ways: as a data source, just like shapefiles, or as data processed directly by the JavaScript APIs that control the composition of the interactive map.

In both cases, mapfiles are used to configure the layer that will be added to the map, but in different ways.

In the first case, as a regular data source, the mapfile relies on OGR as the connector that reads the data; the rest of the configuration follows the normal MapServer structure, as with the symbology definition (see the example mapfile below). In the mapfile editor of the i3Geo administration system, the connection-type form offers the MS_OGR option, which should be chosen when using KML.

As possibilidades de uso do KML nesse esquema depende das características do OGR, veja:

No segundo caso, utiliza-se uma implementação própria do i3Geo que funciona como um plugin que amplia o leque de fontes de dados, indo além das opções oferecidas pelo Mapserver. Ressalte-se que o Mapserver sempre faz o processamento dos dados no servidor web, renderizando imagens ou obtendo dados que são transferidas ao navegador do usuário.

Os plugins, por outro lado, objetivam aproveitar as funcionalidades das APIs javascript utilizadas pelo i3Geo, principalmente o OpenLayers e o GoogleMaps. Os arquivos mapfile são utilizados também, mas apenas para guardar os parâmetros que são necessários.

No exemplo abaixo, temos um mapfile configurado pelo plugin KML. Note que há um item nos METADATA chamado PLUGINI3GEO. Esse metadata tem como valor uma string no padrão JSON que guarda os parâmetros que o i3Geo utilizará quando a camada for adicionada ao mapa. O plugin então passará o controle da renderização da camada para o OpenLayers ou para o GoogleMaps, conforme o mapa que o usuário estiver usando.

Utilizado dessa forma o mapfile pode ser adicionado ao catálogo de camadas e também no inicializador parametrizado. Os mapas podem ser salvos também, pois o LAYER que contém o plugin sempre será interpretado pelo i3Geo quando o mapa for iniciado.


Mapfile with a KML file as an i3Geo plugin

MAP
  FONTSET "../symbols/fontes.txt"
  SYMBOLSET "../symbols/simbolosv6.sym"
  LAYER
    CONNECTION ""
    DATA ""
    METADATA
      "CLASSE" "SIM"
      "PLUGINI3GEO" '{"plugin":"layerkml","parametros":{"url":"http://www.openlayers.org/en/v3.7.0/examples/data/kml/2012_Earthquakes_Mag5.kml"}}'
      "TEMA" "Terremotos kml"
    END # METADATA
    NAME "_terremotokml"
    STATUS DEFAULT
    TEMPLATE "none.htm"
    CLASS
      NAME ""
      STYLE
        COLOR 0 0 0
        SIZE 12
      END # STYLE
    END # CLASS
  END # LAYER
END # MAP
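The PLUGINI3GEO value in the mapfile above is a plain JSON string. As an illustration of the idea only (this is not i3Geo's actual code, which is PHP/JavaScript), a minimal Python sketch of how such a metadata value can be parsed to find out which plugin should render the layer:

```python
import json

# Value of the PLUGINI3GEO metadata entry, copied from the mapfile above.
metadata = ('{"plugin":"layerkml","parametros":{"url":'
            '"http://www.openlayers.org/en/v3.7.0/examples/data/kml/'
            '2012_Earthquakes_Mag5.kml"}}')

config = json.loads(metadata)
plugin_name = config["plugin"]          # "layerkml"
kml_url = config["parametros"]["url"]   # URL handed to OpenLayers / Google Maps

print(plugin_name, kml_url)
```

The key point is that the mapfile only stores parameters; once the layer is added to the map, the client-side API identified by `plugin` takes over the rendering.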


Mapfile with a KML file as a data source
MAP
  FONTSET "../symbols/fontes.txt"
  SYMBOLSET "../symbols/simbolos.sym"
  LAYER
    CONNECTION "/var/www/i3geo/aplicmap/dados/teste.kml"
    CONNECTIONTYPE OGR
    DATA "Trovit"
    METADATA
      "CLASSE" "SIM"
      "TEMA" "teste"
    END # METADATA
    NAME "_lkml"
    STATUS DEFAULT
    TEMPLATE "none.htm"
    TYPE POINT
    CLASS
      NAME ""
      STYLE
        COLOR 200 50 0
        SIZE 6
        SYMBOL "ponto"
      END # STYLE
    END # CLASS
  END # LAYER
END # MAP
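In this mode OGR exposes the KML Placemarks as ordinary vector features. For readers curious about what that data looks like, here is a minimal Python sketch using only the standard library (not OGR, and not how i3Geo itself reads the file) that extracts Placemark names and point coordinates from a KML document; the embedded sample KML is invented for illustration:

```python
import xml.etree.ElementTree as ET

KML_NS = "{http://www.opengis.net/kml/2.2}"

# Invented sample; a real file such as teste.kml would be read from disk.
sample = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document><Placemark>
    <name>Ponto 1</name>
    <Point><coordinates>-47.88,-15.79,0</coordinates></Point>
  </Placemark></Document>
</kml>"""

root = ET.fromstring(sample)
for placemark in root.iter(f"{KML_NS}Placemark"):
    name = placemark.findtext(f"{KML_NS}name")
    coords = placemark.findtext(f".//{KML_NS}coordinates").strip()
    lon, lat = map(float, coords.split(",")[:2])  # KML order is lon,lat,alt
    print(name, lon, lat)
```

Each Placemark becomes one point feature, which is exactly what the TYPE POINT layer above renders.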
The KML plugin, like the others already available, is configured in the mapfile editor. The figure below shows this. Note that the options tree has a node named "plugin"; activating one of its items opens a form for filling in the parameters.


The plugin is also used in the data-connection tool, available directly from the map interface to any user, even those not logged in.


Advantages and disadvantages

Using KML as a data source via OGR lets the i3Geo features that access the attribute table work normally, such as the legend editor, data viewer, and chart generator. Symbology also becomes less dependent on the data source, since it can be defined in the classes of the layer in the mapfile.

Using the plugin has the advantage of being simpler to configure, since it reuses the styles already defined in the KML. Another advantage is less reliance on server-side data processing, which can pay off when users have a better-quality internet connection.

In conclusion, the plugin scheme gives i3Geo good flexibility to keep up with the innovations that frequently appear in the web environment. JavaScript libraries offering innovative cartographic representations can thus be quickly incorporated, as with the existing point-clustering plugins (heatmap and markercluster).


by Moretti Edmar (noreply@blogger.com) at July 21, 2015 01:47 AM

July 20, 2015

Edmar Moretti

i3Geo and INDE: how to register metadata and services in the viewer

The geographic data layers configured in i3Geo can be made available as services following OGC standards. Other software can then connect to these services and use the layers independently of i3Geo's web interfaces.

i3Geo does not include a metadata cataloging system, but it allows each layer to carry a link to its source. That source can be a record in GeoNetwork or any other web page. Conversely, the services provided by i3Geo can be registered in GeoNetwork.

To explain this in more detail, we have added a book to the i3Geo administration course:
http://moodle.gvsig-training.com/mod/book/view.php?id=5846

The text was written by Murilo Caixêta and is used in the data dissemination processes of the Ministry of Health.



by Moretti Edmar (noreply@blogger.com) at July 20, 2015 06:36 PM

Boundless Blog

Building an OpenLayers 3 Web App Without Writing Code – Part II

In my previous blog post, published on June 23rd, I walked through the steps necessary to go from project and data to a completed web app using Boundless' new Web App Builder. I encourage you to take a minute to review that post, as it provides some important context. The sample flood-data application used a sampling of the most commonly anticipated controls and options, but there's a lot of functionality I didn't explore. In this post we will explore more of it, including the following features that someone desiring greater control might want to leverage:

-> using different themes
-> augmenting the HTML of an info popup
-> using the bookmarks map control

It's worth noting that the extensibility of QGIS (for more on this, my colleague Anthony's recent post is a great place to start) means these are only the start; there is plenty of room for additional functionality to be added.

THEMES

To refresh your memory… When we start the Web App Builder, the first tab presented to us is the Description tab. Here we enter an application name, add a logo image, and choose a theme. The Web App Builder currently includes three themes: Basic, Fullscreen, and Tabbed.
Miller2_1

Below is the same application created using the three different themes.

Basic
Miller2_2

Fullscreen
Miller2_3

Tabbed
MIller2_4
As you can see from the images, the primary difference between the themes is the location of the controls. In the Basic theme the tools are placed in the upper right, and clicking a tool brings up a new panel. In the Tabbed version each tool is the title of a tab, and the information (about, chart, etc.) is contained inside its respective tab. To maximize real estate for the map, the Fullscreen version places the tools in a pull-down at the top. The three themes provide flexibility in the look and feel of your application; deciding on the best fit is up to you.

AUGMENTING THE HTML

Just as important as finding the right theme for your application is formatting the attribute data for features. The configuration dialogue for info popups is accessible from the QGIS Layers tab and is specific to each layer. The info popup for a layer uses HTML, which we can use to format the attributes and other popup content.

Let’s start by looking at how we add feature attributes to the popup. While the editor dialogue is initially empty, clicking ‘Add all attributes’ in the Info Popup Editor dialogue will add every attribute to the popup along with the field name, colon, and line break.

Miller2_5

Info popup after clicking Add all attributes
Miller2_6

Resulting Info Popup

However, HTML tags can be used for more than just formatting. They can also link to other documents or reports based on information in the attributes. In our sample Flood Data Viewer application I've added a reports sub-folder that contains a PDF for each parcel, using the Parcel ID as the file name. Using an HTML anchor tag we can create a link to the PDF report via the parcel ID.

<b>Owner Name</b>: [OWNER_NAME]<br>
<b>Parcel ID</b>: [PARCELID]<br>
<a href="./reports/[PARCELID].pdf">Parcel Report</a>

MIller2_7

From the popup a user is presented with a clickable link that takes them to the corresponding PDF report for the parcel.
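The bracketed field names behave like simple placeholders that are filled from each feature's attributes. As a hedged illustration of the concept only (not the Web App Builder's actual implementation), the substitution can be sketched in Python with invented attribute values:

```python
import re

# Popup template using the Web App Builder's [FIELD] placeholder style.
template = ("<b>Owner Name</b>: [OWNER_NAME]<br>"
            "<b>Parcel ID</b>: [PARCELID]<br>"
            '<a href="./reports/[PARCELID].pdf">Parcel Report</a>')

# Attribute values for one feature (illustrative data).
feature = {"OWNER_NAME": "Jane Doe", "PARCELID": "12345"}

def render_popup(template, attributes):
    """Replace each [FIELD] with the matching attribute value,
    leaving unknown placeholders untouched."""
    return re.sub(r"\[(\w+)\]",
                  lambda m: str(attributes.get(m.group(1), m.group(0))),
                  template)

print(render_popup(template, feature))
```

Because the template is plain HTML, anything that can be built from the attributes (links, images, styling) can be generated the same way.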

BOOKMARKS

Located under the controls tab, the Bookmarks control allows us to import bookmarks or a bookmarks layer from QGIS, and with a little configuration can turn our map into a story panel.
MIller2_8
Right-clicking on the control opens the configuration dialogue.
MIller2_9

There are two tabs for configuring the bookmarks: the first is used to specify which bookmarks to use, the second to define how the bookmarks are presented.

The first step is adding bookmarks; they can come from QGIS or from a separate layer. In our case the bookmarks are saved with the QGIS project. Once added, we can change the order of the bookmarks by dragging and dropping them in the list, and can add a description to each bookmark.

If we stop here and preview the application we see the bookmarks are presented in a drop down.
Miller2_10
Choosing a bookmark will pan and zoom the map to that location.

Selecting "Show as story panel" on the second tab of the bookmarks configuration dialogue changes the bookmark display from a drop-down to an inset, where the intro title and description are shown. Clicking the arrow in the inset advances the map to the next bookmark, while checking "Move automatically with each X seconds" will cycle through the bookmarks automatically.

Miller2_11

MIller2_12

Miller2_13

Overview map

Miller2_14

1st bookmark

The screenshots above illustrate the story panel concept, which is useful for presentations or for guiding users through key points of interest. However, for our limited text the story panel is much too large. Let's take a quick look at how to adjust it: on the Description tab, click Configure Theme. Here we see the settings that can easily be changed for the components of the web app:

MIller2_15

We want to change the pixel values for the height and width inside ".story-panel":
Miller2_16
Here is the story panel after the change.
Miller2_17

As we have seen in this post, the Web App Builder's configuration options provide a lot of flexibility for your application. If you need to go beyond the configuration options, many of the theme components and controls can be customized using CSS or HTML, which again means web developers can support the application without being GIS experts.

Currently the Web App Builder is available by contacting Boundless at sales@boundlessgeo.com. In the near future it will be installable from the Boundless plugin server, simplifying the install and update process.


The post Building an OpenLayers 3 Web App Without Writing Code – Part II appeared first on Boundless.

by Aaron Miller at July 20, 2015 04:34 PM

Stefano Costa

Political leaders are not human beings

The problem with satire and much political commentary is that politicians, and leaders in particular, are treated like human beings, with their own spoken language, experience and ideas, whereas a less naive view would acknowledge that they are more a condensation of economic and lobbying agendas, pursued on a mid-term scale with fixed objectives. At least that seems to explain the trajectory of successful leaders and successful parties, like Berlusconi and lately Renzi. One may simplistically call them puppets for large groups of less visible people who are not directly involved in politics, from rich entrepreneurs to CFOs in the financial sector and high-ranking civil servants. It's certainly more nuanced and way deeper than that, though.

by Stefano Costa at July 20, 2015 03:52 PM

gvSIG Team

gvSIG Association has received the NASA World Wind Europa Challenge award

ww2

Last Friday, during FOSS4G Europe in Como, Italy, the gvSIG Association received one of the "Europa Challenge" awards, given by NASA and the European Commission.

The "Europa Challenge" award is international in scope: besides the Spanish representatives from the gvSIG Association, it drew projects from the United States, Italy, the United Kingdom, Hungary and India.

These awards are given to development projects that build applications on the NASA World Wind software, an open-source virtual globe similar to Google Earth, and that follow the standards for sharing and accessing geographic information defined by the European INSPIRE Directive.

The award was presented by Patrick Hogan, the NASA World Wind project manager. The meeting also served to lay the groundwork for a collaboration agreement between NASA and the gvSIG Association.

The project presented by the gvSIG Association integrates the virtual globe developed by NASA into gvSIG, an open-source geographic information system that, now past its 10th anniversary, has become an international reference technology for analyzing information from a territorial point of view. Through gvSIG, thousands of users around the world manage their spatial information without any license restrictions.

This award is a recognition of what is probably the most successful open-source project born in the European Union. It recognizes the work of the gvSIG Association and its community, which has successfully built a development model based on collaboration, solidarity and shared knowledge.

Thank you very much to everybody who makes gvSIG bigger day by day!

Here you have the presentation and videos:

 


Filed under: english, events, gvSIG Desktop, press office Tagged: 3D, Europa Challenge, foss4g, NASA, World Wind

by Mario at July 20, 2015 01:07 PM