Welcome to Planet OSGeo

September 21, 2022

One of the cool things about blogging is that there aren't a lot of rules. I had some thoughts about racing at Superior that I wasn't able to weave into my previous post, so I'm just making a new post about technical aspects of the race. What I wore, what I carried, and what I ate.

I brought rain gear and extra clothing to Minnesota, and used it on Friday. But when the forecasts converged on dry and mild for Saturday, I left that gear out of my pack and drop bags. I sent a pair of shoes (Evo Speedgoats) and socks to the Cramer Road aid station at 27 miles and a spare headlamp to the Sawbill aid station at 40 miles. No other non-consumable items. I didn't use the spare Speedgoats. My feet were still feeling okay at 27 miles. In hindsight, it would have been better to have sent completely different shoes to Sawbill. A pair with a different fit, like my Nikes, would have been lovely to change into at 40 miles.

I wore the same clothing all day. Smartwool socks. A very light tech top from Patagonia. Venerable capilene boxer briefs from REI and slightly ratty, but lucky, Vuori Banks shorts. I trust these to be non-chafing at home and they came through for me again. My Gnar Runners Boco tech trucker was on my head until sundown, after which I intermittently wore a capilene beanie. I wore my nylon windbreaker (Patagonia Houdini) for the last seven miles. With clear skies, the air cooled quickly, but never got very cold. I left my wool gloves in my pack. I wore almost everything I packed and didn't need anything more. I'm going to have to retire some of these items soon. Plastic clothing lasts a long time, but not forever.

I carried everything in my trusty Ultimate Directions Mountain Vest (4.0). It weighs almost nothing, is breathable, and fits well whether lightly or heavily loaded. I brought a small first aid kit, but didn't use it. Nor did I use my sunglasses or small stick of sunscreen. For hydration, I carried three 16-ounce soft bottles. One was mostly a spare. It was never very warm, and I never carried more than two full bottles at a time between aid stations.

I brought light hiking poles to Minnesota, the ones I used at the Never Summer 100k last year. They're very helpful on muddy trails and on technical descents in the dark. I left them in my room at the lodge on race day and didn't regret it. I would have used them in the last hour of the race if I'd been carrying them, but they wouldn't have improved my time or saved my feet.

Shoes... I do think that my Speedgoats weren't adequately broken in. The heel and toe areas were a bit stiff. I did appreciate the famous cushion and traction. They're a good choice for the Superior Hiking Trail. It's a super rugged trail, so you need some combination of cushion, armor, or dancer's feet. I ran for miles with a guy who was wearing the Speedgoat 5, a shoe I've worn for up to 20 miles but no further, and he loved them. I might see about switching over.

Fuel is the last technical detail to cover. I drank a lot of Tailwind solution: 400 calories' worth at six of the seven aid stations. I stashed pre-measured amounts in my drop bags to pick up along the way. I sucked down four packets of GU and ate three Stinger waffles and one bag of Stinger chews. I drank 8-10 ounces of Coke, one of my go-to calorie sources, at every aid station. I was able to eat plenty of solid food all day long, from pancakes and sausage in the morning to mashed potatoes and chicken noodle soup in the evening. Mid-day I got tired of aid station food, and that's when I went for the chews and gels. I did a much better job fueling than I did at last year's Never Summer, where I bonked badly in the last 10 miles.

What about crewing? I didn't have a planned crew. My friend David Bitner, a devoted Superior runner and pacer, met me at mile 27 to pump me up and help me get set for the second half. Outside of that, I found my own groups as I went. 50 miles, in my experience, is not so long that I need help. I would love to have some crew help on a 100 mile race and a pacer to help me flow quickly through aid stations.

That's all the technical notes for the race. I went into it with a B fitness level, extra pounds, a little uncertainty about my heart, but with experience at this race distance to lean on. I flunked my shoe choice, but did everything else right, enjoyed tons of support, and had a great time.

by Sean Gillies at September 21, 2022 01:59 AM

September 20, 2022

In the past year, the build system behind QField has been ported to vcpkg, a modern C++ dependency management system. This has been a great success for QField; it has considerably helped to streamline efforts, improve the development experience, and guarantee outstanding stability of the application. In this blog post we will look at the history of building QGIS-based applications for mobile systems and how it has become what it is today.

When Marco Bernasocchi (CEO of OPENGIS.ch and chair of QGIS.org) started working on QGIS for Android in Google Summer of Code a decade ago, the main job was also to build all QGIS dependencies for Android. This included well-known libraries like proj and gdal and lesser-known ones like libxml2 or iconv. Each of them has its particularities and specific build flags. Working on this felt like an endless, iterative trial-and-error journey: each day you hope that you will eventually see the QGIS splash screen on your Android phone, while all you actually see are endless lines of code and compiler errors.

As we now know, QGIS for Android eventually saw the light of day, and its achievements are still the foundation of QGIS-based mobile apps like QField.

Some time later we decided to modernize the build infrastructure into OSGeo4A, a set of scripts where each dependency was built with a “recipe”. Modularized this way, it was easier to maintain, and general build code common to all libraries could be isolated. It was good enough to help drive QField for a couple of years, and a copy of it is still in use as the base for today's QGIS builds for macOS.

When we decided to make QField available on other platforms like iOS, Windows and macOS as well, we quickly realized that duplicating build chains scales really badly and that maintaining them would be an immense effort we wanted to avoid. There are a couple of existing C++ dependency management systems, but none of them ultimately convinced us. Luckily for us, a mail on the QGIS mailing list mentioned a new one called vcpkg, which looked very promising.

A couple of days later we had a build for Windows, and later the same year one for macOS, with many dependencies already available in modern versions. Cheers.

What's left to do other than enable it for Android, and all our problems are suddenly solved? Alas, it's not so easy. Cross-compiling is always a bit trickier. And so we started another journey to improve the situation. After a while, we had a working build chain based on vcpkg for Android in our R&D labs. Interestingly, this added a couple of features simply because the community around vcpkg had already added them. For example, using COG-based raster data via HTTP was suddenly working (for the record: thanks to the availability of curl, which we had never taken care of adding ourselves in OSGeo4A).

Soon after, we also wanted to try building for iOS with vcpkg, which after a few attempts was also successful, and we even managed to resolve some weird crashes and other issues we had experienced with the old build chain.

The main benefit was that we could upgrade the QGIS base libraries in one single place for every platform, in an isolated branch without playing the Jenga game on each upgrade.

The only unfinished business we still had was that support for iOS and Android was still available only in our own vcpkg fork.

So over the last few weeks and months we have been working closely with upstream to bring building for Android and iOS up to the same level as the desktop platforms. The relevant parts are now in a clean state.

Advantages of this approach:

  • Mutualized efforts on all the base libraries, also with programmers outside the geoverse
  • A vibrant community that ensures a noticeably fast upgrade of libraries
  • A clean dependency management system
  • A consistent set of dependency versions (gdal, geos, libpq, …) across all platforms
  • A clean caching system that will only recompile reverse dependencies on updates
  • We can upgrade a dependency in an isolated branch and only release it when it works on all platforms
  • We can optimize the code for a given set of dependency versions, and if a bug is fixed in a certain dependency version, we are sure we can ship this fix on all platforms promptly
  • We maintain the QField source code as well as dependency versions in a single repository, which makes development more streamlined

Big thanks go to Alexander Neumann and Kai Pastor who both stand out for doing things the right and future-proof way.

As always, things come at a price: there was a steep learning curve involved, and some edge cases require attention. However, we are thrilled by the simplification this has brought us.

If you are maintaining a customized fork of QField, it is now a good time to start upgrading to vcpkg, since OSGeo4A has been archived and will no longer be maintained. The developer documentation of QField has been updated with relevant instructions.

If you have time to test the new build system, we will be happy to read about your experiences with it.

by Matthias Kuhn at September 20, 2022 05:48 AM

September 16, 2022

A few days ago Javier Marías passed away. He is considered one of the best writers that the literature of the late 20th and early 21st centuries has given us and, without a doubt, an exemplary monarch. This very summer, while reading about fictional kingdoms, it occurred to me that in some spare moment (of which the gvSIG project leaves few) I should create the geoportal of the Kingdom of Redonda.

A kingdom of barely three square kilometers, a small island which, contrary to what our imagination might suggest, has neither beaches nor palm trees. And yet this seemingly dull rock in the Caribbean Sea holds a wonderful story for lovers of literature.

It was Matthew Dowdy Shiell who, back in 1865 when his first son was born, bought the island and set out to turn it into a fictional kingdom that would be nothing more, and nothing less, than that islet. To do so, it seems, he petitioned Queen Victoria for the title of kingdom, which was granted on a single condition: that it never pose a threat to British interests.

Matthew Dowdy Shiell passed the crown to his son, the writer Matthew Phipps Shiel (yes, with one "l" fewer than his father), known in the literary world as M.P. Shiel and in the Kingdom of Redonda as Felipe I. Besides being a rather prolific writer of fantasy and science fiction, he reigned from 1880 to 1947, along the way appointing the kingdom's first dukes, an honor that fell to figures such as H.G. Wells, Dylan Thomas and Henry Miller.

Felipe I abdicated in favor of the poet Terence Ian Fytton Armstrong, known in the literary world as John Gawsworth, who took the name Juan I for his role as monarch. Living the typical bohemian life of a cursed writer, soaked in alcohol and money troubles, he went so far as to sell noble titles for a few pounds. He died at 58 from an ulcer. His reign lasted from 1947 to 1970.

His successor was the publisher and writer John Wynne-Tyson, who for his duties as king chose the name Juan II. His reign lasted until 1997, the year he decided that the most suitable person to occupy the throne was a Spanish writer, Javier Marías.

Javier Marías, a self-declared republican, reigned under the name King Xavier until his recent death on September 11, 2022. Its last king has been, without a doubt, the one who brought Redonda to its highest levels of popularity: he appointed quite a few artists and intellectuals as members of the court, created a fantasy-literature imprint bearing the kingdom's name, spread word of its existence in some of his writings, and created the literary prize of the Kingdom of Redonda.

As in any self-respecting kingdom, there have been disputes over the throne, featuring Arthur John Roberts, William Leonard Gates, Bob Williamson and Michael Howorth. And, in case you are wondering, unlike in the case of the English monarchy… King Xavier's successor has not yet been proclaimed.

With this brief chronology of the Kingdom of Redonda done, here are the steps I followed to create the geoportal with gvSIG Online, you know, the free software solution for deploying Spatial Data Infrastructures. The whole process took me little more than an hour, using a gvSIG Online instance that we keep at the gvSIG Association for "playing with our own things"; it isn't even the next-to-last version of the software, but you know what they say about the shoemaker's children going barefoot.

  • The first thing I did last night was download cartography of the area from the OpenStreetMap project and, from there, select the geometries corresponding to Redonda Island to generate a new layer.
  • I replicated that layer to generate layers showing the domains of each of the reigns, which have basically always been the same. A kingdom that has known how to defend its borders and, at the same time, has never needed to expand them.
  • Using gvSIG Desktop's georeferencing tool and then its assign-projection tool, I placed one of the few hand-drawn maps of the Kingdom of Redonda, available on The Redondan Foundation website. Being a drawn map, it is not easy to make it match its real coordinates, but I think it turned out reasonably well.
  • Also with gvSIG Desktop, I generated a new point layer to draw the island's main points of interest.
  • Finally, in gvSIG Online I published all those layers and generated a geoportal to contain them.

Of course, it could be embellished and completed much further, but as a small tribute I think it is enough. I don't know whether this makes me the official cartographer of the Kingdom of Redonda, but I hope it has at least sparked your curiosity.

The link to the Redonda Spatial Data Infrastructure is here:

https://online.gvsig.com/gvsigonline/core/load_public_project/redonda/

by Alvaro at September 16, 2022 09:14 AM

September 15, 2022

Dear OTB community, we are happy to announce that OTB version 8.1.0 has been released! Ready-to-use binary packages are available on the package page of the website: OTB-8.1.0-Darwin64.run (Mac OS), OTB-8.1.0-Linux64.run (Linux), and OTB-8.1.0-Win64.zip (Windows 64 bits). You can also use the official docker image. It is also possible to check out the branch with git: git clone https://gitlab.orfeo-toolbox.org/orfeotoolbox/otb.git […]

by Thibaut Romain at September 15, 2022 09:16 AM

We are happy to announce GeoServer 2.20.6 release is available with downloads (bin, war, windows), along with docs and extensions.

This is the last planned maintenance release for the 2.20.x series; please consider upgrading to the 2.21.x series. This release was made in conjunction with GeoTools 26.6 and GeoWebCache 1.20.4.

Improvements and Fixes

  • It’s now possible to use the REST API to reset caches for a single store or a single layer.
  • GeoFence role filtering was improved.
  • OSHI was updated to allow gathering OS statistics on Apple M2 as well.
  • Assorted small improvements to the status page.
  • An infinite recursion in GWC transparent integration with WMS was spotted and fixed (affected vector tiles and layers with a meta-tiling factor of 1, when the “TILED” parameter was turned off).
  • The GWC module was split so that its REST API can be excluded (for completely headless installs).

For the full list of fixes and improvements, see 2.20.6 release notes.

About GeoServer 2.20

Additional information on GeoServer 2.20 series:

Release notes: (2.20.5 | 2.20.4 | 2.20.3 | 2.20.2 | 2.20.1 | 2.20.0 | 2.20-RC )

by Andrea Aime at September 15, 2022 12:00 AM

September 14, 2022

The GeoTools team is pleased to share the availability of GeoTools 26.6: geotools-26.6-bin.zip, geotools-26.6-doc.zip, geotools-26.6-userguide.zip, geotools-26.6-project.zip. This release is also available from the OSGeo Maven Repository and is made in conjunction with GeoServer 2.20.6. Fixes and improvements include GEOT-7186 (SRID of geometries in columns without an SRID constraint not detected properly) and GEOT-7182 […]

by Andrea Aime (noreply@blogger.com) at September 14, 2022 01:48 PM

September 13, 2022

Last weekend I traveled to Minnesota's North Shore to run the Superior 50: 52 miles through the North Shore Highlands and Sawtooth Mountains on the Superior Hiking Trail. On the Thursday before the race I flew into Duluth and drove a rental car 90 miles up Highway 61 to a lodge at the Lutsen Mountains Resort, the finish line for the event. Friday I met David Bitner and his partner Marin at Tettegouche State Park for lunch and some easy hiking on the cliffs next to Lake Superior. Marin would be running the Moose Marathon (the last 26.2 miles of the 50 and 100 mile course) on Saturday and Bitner, a 100 mile finisher in 2019, would be pacing a 100 mile runner overnight. After a few hours with them, I drove north again to Grand Marais to wander around and find a pre-race dinner, and then back to the lodge to sleep before Saturday's early start.

https://live.staticflickr.com/65535/52356764075_0cdeeed3a5_b.jpg

Finland, MN, 5:00 a.m.

The 50 mile race began at 5:15 a.m. in Finland, Minnesota. A school bus took me from the finish to the start, leaving the resort at 4:15. Friday's clouds and rain were gone and the 50 mile race kicked off in calm and cool conditions under a full moon.

https://live.staticflickr.com/65535/52356656369_222fc64224_c.jpg

Somewhere on the Superior Hiking Trail.

The Superior Hiking Trail, or SHT, is wild, rugged, and challenging. I estimate that a quarter of the 50 mile course was fairly runnable. The rest was steep, or overgrown, or root-bound, or all of the above. You have to watch your feet, closely. On the other hand, I didn't need sunscreen, because we were traveling under the canopy of the boreal forest, with only small breaks at river and road crossings and rocky summits.

My plan was to take it easy in the first third of the course and, if I felt good, pick up the pace in the second third and try to sustain it through the final 17 miles. I settled into the tail end of the pack and stayed there until just before the Crosby-Manitou aid station (mile 12). During this part of the race I was briefly in a small train with Courtney Dauwalter (two-time UTMB winner and winner of this year's Hardrock 100) and her mom, which was fun. They say never meet your heroes, but the Dauwalters were friendly and gracious. At the aid station I got a small cup of coffee, a big pancake and a sausage patty, and really began to enjoy the race.

https://live.staticflickr.com/65535/52356836485_13de1d6a14_c.jpg

Feeling good at the Cramer Road aid station, 27 miles in. Photo by David Bitner.

I felt great at 17 miles. My heart was behaving properly. My legs felt good. The trail conditions (considering Friday's rain) and the weather were much better than I'd expected. I found some good company, and on longer stretches of runnable trail at this point I did my fastest running of the day, all the way into the Sugarloaf aid station at mile 21. I kept going steadily, passing two packs of runners on the way to the Cramer Road aid station.

My favorite place on the trail was the Temperance River Gorge. The river's name is, according to Bitner, a pun: there is no sand bar where it reaches Lake Superior. I took a bunch of photos of swimming holes just past the race's Temperance aid station. Not much further downstream the gorge narrows and deepens dramatically. I didn't take any photos there, but you can easily find them online. It's a real wonder of nature.

https://live.staticflickr.com/65535/52356764135_089cea41f8_c.jpg

Temperance River Gorge, looking downstream. 34 miles.

https://live.staticflickr.com/65535/52356656414_deb9eb24b3_c.jpg

Temperance River Gorge, looking upstream.

In hindsight, I might have neglected to eat enough after Cramer Road. I was feeling sluggish on the 1,200 foot climb up from the Temperance River to Carlton Peak. My ambitions of finishing before the sun went down were starting to look unrealistic. The trail near the summit was steep and rocky, not unlike the approaches to Arthur's Rock or Horsetooth. I saw Columbines and raspberries (flowers and fruit long gone), which also reminded me of home.

https://live.staticflickr.com/65535/52356764165_2f775c9f9c_c.jpg

Carlton Peak, 38 miles.

Near the Sawbill-Britton aid station I began to catch up to slower 100 mile runners. Some of them were suffering. Some were perfectly executing their plans to beat all the aid station cut-off times and finish in just under 38 hours.

Around mile 40, my feet began to literally fall apart. I'm not sure whether my shoes (Speedgoat 4s with ~25 miles of wear) were inadequately broken in or I hadn't toughened up my feet enough in training. I only ran 20 miles once this year before Superior, whereas I ran 20, 25, and 30 miles before Quad Rock last year. My heels and toes blistered, and in trying to spare them on downhills, I absorbed more force in my quads than I otherwise would. The last 7 miles were a slow, painful slog, much of it well after dark. All the people I passed between miles 21 and 40 caught up to me and left me behind. Going uphill felt better than going downhill. But I finished! Good company helped. I ran the last 3 miles with a woman from Littleton, Colorado. We'd run some of the first few pre-dawn miles together and had some laughs about the rest of the day.

My official time: 15:36:21. This was an extremely well-run event and I can see why people keep coming back for more. Thank you, Superior volunteers and friendly runners. Thank you, Bitner and Marin, for lunch and crewing at the 27 mile mark. My biggest thanks are to my family, for putting up with my obsession and letting me off the hook for chores so I can sleep in during blocks of high training volume. I couldn't have finished without your support.

I just now found a link to the race director's recap in my inbox. It is here.

by Sean Gillies at September 13, 2022 03:38 PM

September 09, 2022

In the previous installment, we had gotten a test MapGuide installation with bundled PHP 8.1 up and running, and we were able to successfully produce a PHP error when hitting the Site Administrator.

The PHP error referenced previously was a milestone because it meant that our PHP 8.1 setup (FastCGI on Apache via mod_fcgid) was working and our PHP code was actually running; the remaining errors were simply the result of vast swaths of our current PHP web tier applications needing to be migrated to work against this new PHP 8.1 binding for the MapGuide API.

And so for the next few months I did just that, not just for PHP, but also for Java and .net. You can consider this post to be a preview of what you'll need to do yourself if you want to migrate your PHP/Java/.net MapGuide applications to MGOS 4.0.

PHP Migration Overview

The PHP binding was the one I expected to be the most work because, besides finally supporting PHP 8.1, the major feature of this new PHP binding is that constants.php is no longer required! This is because with vanilla SWIG we can now bake all the constants of the MapGuide API into the PHP binding extension itself! So I expected a lot of "include 'constants.php'" references to need removing.

Once all the constants.php references were removed, we found the prime issue with this new PHP binding: PHP didn't like that some of our C++ classes had overloaded methods whose signatures did not exist in the parent class. This manifested in the form of fatal PHP errors like the following:

PHP Fatal error:  Declaration of MgCoordinateSystemMeasure::GetDistance(MgCoordinate|float|null $arg1, MgCoordinate|float|null $arg2, float $arg3, float $arg4): float must be compatible with MgMeasure::GetDistance(?MgCoordinate $arg1, ?MgCoordinate $arg2): float in Unknown on line 0

In this case, our MgMeasure class has a GetDistance method of the following signature:

double GetDistance(MgCoordinate* coord1, MgCoordinate* coord2)

In the derived MgCoordinateSystemMeasure, it has a new overload of GetDistance that has this signature:

double GetDistance(double x1, double y1, double x2, double y2)

However, when this is converted to PHP proxy classes by SWIG, PHP doesn't like this class setup: under its inheritance model, it expects the GetDistance overload with 4 double parameters to also exist in the base MgMeasure class. This is not the case, and thus PHP throws the above fatal error.

To work around this problem, we had to use the SWIG %rename directive to rename the conflicting overload in MgCoordinateSystemMeasure for PHP to the following:

double GetDistanceSimple(double x1, double y1, double x2, double y2)

With this rename, there is no longer a signature conflict in the generated proxy class and PHP no longer throws a fatal error. Fortunately, only 2 classes in the MapGuide API have this problem, so the amount of method renaming is minimal.

Once this issue was addressed, I tried firing up the PHP implementation of the AJAX viewer and I got an interactive map! Everything seemed to be working until I tried to generate a map plot, and found my second problem. I was getting PHP fatal errors like this:

PHP Fatal error:  Uncaught TypeError: No matching function for overloaded 'MgRenderingService_RenderMap' in quickplotgeneratepicture.php:115

Fortunately, this one was easier to explain and fix. In PHP 8.1 (maybe even earlier in the PHP 7.x series), the type checking became stricter, which meant int parameters must take integers and double parameters must take doubles; you could no longer pass ints as doubles or vice versa. For a method like RenderMap of MgRenderingService, there are lots of overloads that take many different combinations of int and double parameters.

Our map plotting code was passing in strings where int/double parameters were expected. In PHP 5.6 this was allowed because the type checking was evidently more lax. Now such cases cause the above PHP fatal error. This was easy enough to fix: we just use the intval and doubleval functions to make sure ints are passed into int parameters and doubles are passed into double parameters.

And with that, the rest of the changes involved fixing up our exception handling code, due to a major change in how MapGuide applications should handle exceptions from the MapGuide API. As part of this SWIG binding work, we've flattened the MapGuide exception hierarchy into a single MgException class and introduced a new exception code property to allow handling MapGuide exceptions on a case-by-case basis.

So if you were handling specific MapGuide exceptions like this:

try
{
    //Some code that could throw
}
catch (MgUnauthorizedAccessException $e)
{
    ...
}

You would now rewrite them like this:

try
{
    //Some code that could throw
}
catch (MgException $e)
{
    if ($e->GetExceptionCode() == MgExceptionCodes::MgUnauthorizedAccessException) {
        ...
    }
}

The reason for flattening the exception hierarchy was:

  • To make wrapping exceptions simpler (we now only need to wrap the sole MgException class in SWIG) and not have to handle exception class inheritance chains in a consistent manner across all 3 language bindings.
  • Most of the example MapGuide API code pretty much only caught MgException anyway and rarely caught any of its derived exception classes (and I imagine that this is the case in your MapGuide applications as well). Any code that cared to handle specific exception cases can just consult the relevant sub-classification, which is now a property of MgException itself, as the above code example shows.

Once this final change was made, the AJAX viewer was fully functional. Porting Fusion to PHP 8 was a similar process.

Java Migration Overview

I expected this migration to be a cakewalk because, although this binding is now also generated by vanilla SWIG, it is 99% identical to the existing MapGuideJavaApiEx.jar that we have been generating and shipping for many releases of MapGuide.

So all I expected to do was:
  • Fix up references to MapGuideJavaApiEx and rename them to MapGuideJavaApi
  • Update exception handling blocks to take action based on the captured exception code in the caught MgException, instead of catching specific subclasses of MgException (see the Java sketch below)
  • Replace MapGuideApiEx.jar with MapGuideApi.jar (we're using the old jar name for the new binding) in our MG installation (and effectively, going back full circle to the way things were for Java in MapGuide)
And ... it went exactly as I said above! This was by far the easiest migration effort of the lot.
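
For illustration, here is a minimal Java sketch of that exception handling change. It is an assumption-laden sketch, not code from the new binding: it assumes the generated proxy classes live in the org.osgeo.mapguide package and keep the PascalCase GetExceptionCode() accessor and MgExceptionCodes string constants shown in the PHP example above; check the generated MapGuideApi.jar for the authoritative identifiers.

import org.osgeo.mapguide.MgException;
import org.osgeo.mapguide.MgExceptionCodes;

public class ExceptionHandlingSketch {

    // Placeholder for code that calls the MapGuide API and could throw MgException.
    static void doSomeMapGuideWork() throws MgException {
    }

    public static void main(String[] args) {
        // Old binding: catch (MgUnauthorizedAccessException e) { ... }
        // New binding: catch the flattened MgException and branch on its exception code.
        try {
            doSomeMapGuideWork();
        } catch (MgException e) {
            // Assumption: GetExceptionCode() returns a string matching one of the
            // MgExceptionCodes constants, mirroring the PHP example above.
            if (MgExceptionCodes.MgUnauthorizedAccessException.equals(e.GetExceptionCode())) {
                System.err.println("Unauthorized access");
            } else {
                System.err.println("MapGuide error: " + e.GetExceptionCode());
            }
        }
    }
}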

.net Migration Overview

This migration was the one I had been dreading the most. Not because I feared this binding was going to be fragile; we already had an exhaustive test suite, which this binding passed with flying colors.

But rather, I had been dreading this one because all of our .net code that is going to use this binding (AJAX viewer, code samples, etc) are all legacy pre-historic aspx webforms and I wasn't sure if such code would accept the brand new .net development story I had planned for it.

Consider the current .net development story.

  1. You would reference the 5 OSGeo.MapGuide.* assemblies from mapviewernet/bin in your MapGuide application.
  2. You would then have to manually copy the remaining dlls from mapviewernet/bin to your MapGuide application's output directory so that the .net MapGuide API binding doesn't fail due to missing native dll dependencies.

The alternative to this is to use the NuGet package, which makes this story more seamless, but the process to build this NuGet package was a bespoke affair, with hand-crafted PowerShell scripts that trigger on NuGet package installation to set up the necessary project build events to copy the native dlls needed by the OSGeo.MapGuide.* assemblies to the right location. Such functionality is tightly coupled to Visual Studio, so if you were installing this NuGet package and building your MapGuide application outside of Visual Studio, none of the required post-build events would fire and the result is a broken MapGuide .net application, because the native dll dependencies were not copied to your application's output directory.

For this new .net binding, we build each OSGeo.MapGuide.* assembly as a separate SDK-style csproj project. This new csproj file format has several benefits:
  • You don't have to reference individual C# source files that need to be compiled. Any C# source file in the same directory as the csproj file is implied to be compiled as part of the project. This is great because it means we can run SWIG to generate the .cs files straight into the project and build that project straight away afterwards.
  • This csproj file format supports NuGet packages as first-class project output
  • The NuGet packages produced have first-class support for native dependencies. This is the real killer feature because it means in terms of packaging, we just have to include these native dlls in a well known location and they will be bundled up automatically as part of NuGet package creation. Such a package when consumed will have its native dependencies automatically copied to the right place and loaded from the right spot without any custom post-build events to set this stuff up!
  • And finally, it means instead of targeting .net Framework, we can target netstandard2.0

What does netstandard2.0 support imply? It implies your MapGuide application using these packages can work on any platform that supports netstandard2.0. Now practically speaking, despite being netstandard2.0 packages, these packages will only work on platforms where the underlying OS is Windows and (maybe) Linux, as those are the platforms where we can actually compile the underlying supporting native libraries needed by these NuGet packages. So no Mac OSX, no Xamarin, etc.

In practical terms, it means you are no longer shackled to legacy .net framework for building MapGuide .net applications. You can now use .net core from its earliest netstandard2.0-supported iterations all the way to the latest .net 6.0 (as of this post). 

That's great and all, but going back to the original issue: can the current suite of aspx webforms code accept this new way of consuming the .net MapGuide API and come along for the ride? I hope so! Because the alternative is to rewrite all of this code with more modern .net web technologies (razor pages maybe?), and while such a rewrite has merit and is probably warranted, it is not warranted right now, because it would add many more months of dev work to my already time-poor schedule. We have bigger fish to fry! We just hope this current codebase will cooperate with our new .net packaging paradigm with minimal effort.

So let's start with the basic facts.

  • MapGuide's .net integration requires IIS and .net framework to already be installed
  • We can assume that, for the purpose of being able to use this netstandard2.0 library, the installed .net framework version must be .net framework 4.8. Building your own MapGuide applications for .net core and .net 5.0+ is something you can opt in to, but it is not something demanded by our existing .net web tier code.

With these facts established, we have our first hurdle, and it is one of setup/deployment.

The AJAX viewer and sample code have no Visual Studio solution/project files! How do these things ever get built?

The AJAX viewer for .net is a series of raw .aspx files that are "compiled" to .net assemblies on the first request to IIS. As part of this compilation, IIS checks the respective "bin" directory for any references. That's why mapviewernet/bin has the OSGeo.MapGuide.* assemblies in it: that is what is referenced when the .aspx files get compiled. The .net sample code also follows the same pattern.

So we have a bunch of folders of .aspx files and we need to get the correct set of dlls from inside our brand new shiny nuget packages into these folders. How would we go about this without needing to disruptively set up solution/project files for them?

Here's my approach. We set up a stub SDK-style project that targets net4.8 and references the 5 OSGeo.MapGuide.* NuGet packages produced from our new .net binding project setup.

As part of the main build, we perform a framework-dependent publish of this project. Because of the first-class native dependency support, the publish output of this project is the project's assembly, the 5 OSGeo.MapGuide.* assemblies and (importantly) all of their native dll dependencies in one single output directory. Once we have done the framework-dependent publish, we can then copy all the dll files in this publish output folder (except the stub project's own assembly) into the bin directory of our AJAX viewer and sample directories.

It turns out this approach does result in a functional .net AJAX viewer and code samples. What was needed, in addition to using a stub project to set up the required dll file list, was a web.config file for the .net AJAX viewer and code samples that references the netstandard assembly, because our OSGeo.MapGuide.* assemblies now target netstandard2.0.

Is this a complete and utter hack? Totally!

But this approach gives us a functional .net AJAX viewer and code samples. Considering the alternative solutions and my current timelines, this is a workable approach and sometimes ...


So that was the ugly setup/deployment aspect, but what about the code itself? Well, that was relatively simple. Like PHP and Java before it, we only needed to fix up the exception handling code to match our new pattern of checking for specific exception codes in the caught MgException to handle certain error cases.

Getting this to work on Linux

With the bindings now all being generated by vanilla SWIG and all working on Windows, it was a case of getting the build system fully working again on Linux with these new bindings and updated web tier components.

Fortunately, on the binding front, most of the CMake configurations added in the initial phases of this work only needed minor adjustments, so the bulk of the work was actually building PHP 8.1, integrating it into the Apache httpd server (which we also build from source), and updating our various httpd/php config file templates to work with this new version of PHP.

Where we stand now

We now finally have MapGuide API bindings generated with vanilla, unmodified SWIG that work on both Windows and Linux. This has been a long and arduous journey and I can finally see the light at the end of this tunnel!

A new RFC has been posted for PSC discussion/voting. I hope there isn't strong opposition/dissent on these changes, because I truly believe that this work is absolutely necessary for MapGuide going forward. Newer versions of .net/Java/PHP will always get released and inevitably we will need our MapGuide API bindings to work on these newer versions. Our current infrastructure to keep up with newer .net/Java/PHP versions is just not maintainable or tenable.

If/when this RFC is adopted, the long overdue Preview 4 release should drop not too long after!

by Jackie Ng (noreply@blogger.com) at September 09, 2022 05:56 PM

The course lasts 2 days (9:00 – 17:00) and costs 990 CHF per person (lunch and certificate included). One instructor for up to 6 participants and 2 instructors for 7 to 12 participants.

Description

By the end of this course, participants will know the main functions of QGIS Desktop (open source GIS software) and will be able to import and analyze data, create a map with a professional layout, and enter objects with attributes and vector geometries.

Day 1 program

  • Introduction
  • About the QGIS project
  • Overview of the user interface
  • Data sources, data formats and web services
  • Selection and filtering
  • Symbology
  • Attribute table, field calculator, temporary fields

Day 2 program

  • Annotations, map tips and actions
  • Editing geometry and attributes
  • Form configuration, joins and relations
  • Expressions
  • Basic vector and raster analysis
  • Layouts with the print composer
  • Outlook: database management

Prerequisites

Basic knowledge of GIS (e.g. the term "layer") and databases (e.g. the term "data type" with integer / number / date / string / boolean).

Certification

This course is organized by a QGIS-recognized organization. Participant certification is included in the course price.

Software

Installation of QGIS for Windows, macOS or Linux: https://download.qgis.org

  • We use the latest available LTR version.
  • No plugins need to be installed beforehand.

by Anna Randegger at September 09, 2022 10:11 AM

The course runs over 2 days (9:00 – 17:00) and costs 990 CHF per person (lunch and certificate included). One instructor for up to 6 people and 2 instructors for 7 to 12 people.

Description

By the end of this course, participants will know the main functions of QGIS Desktop, the open source GIS software, and will be able to import and analyze data, create a map with a professional layout, and enter objects with attributes and vector geometries.

Day 1 program

  • Introduction
  • About the QGIS project
  • Overview of the user interface
  • Data sources, data formats and web services
  • Selection and filtering
  • Symbology
  • Attribute table, field calculator, temporary fields

Day 2 program

  • Annotations, map tips and actions
  • Editing vector data geometry and attributes
  • Form configuration, joins and relations
  • Expressions
  • Basic vector and raster analysis
  • Layouts with the print composer
  • Outlook: database management

Prerequisites

Basic knowledge of GIS (e.g. the term "layer") and databases (e.g. the term "data type" with integer/number/date/string/boolean).

Certification

This course is organized by a QGIS-recognized organization. Participant certification is included in the course price.

Software

Installation of QGIS for Windows, macOS or Linux: https://download.qgis.org

  • We use the latest available LTR version.
  • No plugins need to be installed beforehand.

by Anna Randegger at September 09, 2022 10:10 AM

We have published our new dates for this autumn's QGIS online courses.

The courses last two days (9:00 a.m. – 5:00 p.m.) and cost 990 CHF per person. As in our in-person courses, we limit our instructor-to-participant ratio to a maximum of 6 participants for one instructor and two instructors for 7 to 12 participants.

by Anna Randegger at September 09, 2022 10:08 AM

September 08, 2022

The international community of QGIS contributors got together in person from 18 to 22 August, in parallel with the OpenStreetMap State of the Map event and right before FOSS4G. So there was a lot of open source geo power concentrated in the beautiful city of Florence in those days. It was my first participation, and all I knew was that it was supposed to be an unconference. This means there is no strict schedule, but space and opportunity for everyone to present their work or team up to discuss and hack on specific tasks to bring the QGIS project to the next level.

Introduction and first discussions

We were a group of six OPENGIS.ch members arriving mostly on Thursday, spending the day shopping and moving into our city apartment. In the evening we went to a Bisteccheria to eat the famous Fiorentina steak. It was big and delicious, as was the food in general (though I have been eating vegetarian since, to compensate). On Friday we went to the campus to meet the other contributors. After a warm welcome by the organizer, Rossella, and by our CEO and QGIS chair Marco Bernasocchi, we did an introduction round where everyone mentioned the first QGIS version they ever used. At this point I became aware of the knowledge and experience I was sharing the room with. I also noticed that another company was attending with several members, namely Tim Sutton's Kartoza, which also contributes a lot to QGIS. The first discussion was about the QGIS funding model, vision, and communication, and about the new website being planned. This discussion then moved into smaller groups including most of the long-term contributors. I looked around, physically and virtually, and tried to process all the new input and better understand the whole QGIS world. Meanwhile, I noticed my colleague Ivan having problems compiling QGIS after upgrading to Ubuntu 22.04, which motivated my other colleague Clemens to implement a Docker container to do the compilation. Nevertheless, I postponed my own Ubuntu upgrade. That evening we all went out together to have a beer or two and play some pool and table football. Finally, the OPENGIS.ch crew navigated back home by pairing a high-precision GNSS sensor with a mobile device running OpenStreetMap in QField. We arrived back home safely and super precisely.

First tasks and coffee breaks

There was catering in the main hall covering breakfast, lunch and coffee breaks. It never took long after grabbing a cup of coffee to find yourself in a conversation with either fellow contributors or OpenStreetMap folks. I chatted with a mapper from Japan about mobile apps, an engineer from Colombia about travelling and a freelancer from the Netherlands about GDAL, to name 3 coffees out of many.

QGIS plugins website

After some coffee, Matthias Kuhn, our CTO and a high-ranking QGIS contributor, asked me whether I could improve some ugly parts of the QGIS plugins website. So I had my first task, which I started working on immediately. The task was to make the site more useful on mobile devices, which would be achieved by collapsing some unimportant information and even removing other parts. I noticed some quirks in the development workflow, so I also added some pre-commit hooks to the dev setup. Dimas Ciputra from Kartoza helped me finalize the improvements and merge them into the master branch on GitHub.

QGIS website downloads section

Regis Haubourg asked for help simplifying the QGIS Downloads for Windows section on the main QGIS website. We played around in the browser dev tools until we thought the section looked about right. I then checked out the GitHub repo and started implementing the changes. I have to say the tech stack is currently not easy to develop with, but a complete rework is being planned. Anyway, following the pull request on GitHub, a lively discussion started, and it is still ongoing at the time of writing. And this is a good thing and shows how much thought goes into this project.

Presentations

There were many interesting and sometimes spontaneous presentations, which always involved lively discussions. Amy Burness from Kartoza presented new styling capabilities for QGIS, Tobias Schmetzer from the Bavarian Center for Applied Energy Research presented their geodata processing and pointed out issues he encountered using QGIS for it, and Etienne Trimaille from 3liz talked about qgis-plugins-ci, just to name a few.

Amazing community

On Saturday evening a bus showed up at the campus and took us on a trip up into the hills. After quite a long ride we arrived at a restaurant high up, with a mind-blowing view of the city. I have forgotten how many rounds of Tuscan food were served, but it was delicious throughout. An amazing evening with fruitful conversations and many laughs.

The weather was nice and hot, the beers cold, the Tuscan food delicious and the contributors were not only popular Github avatars but really nice people. Thank you QGIS.

by Fabian Binder at September 08, 2022 05:00 AM

September 05, 2022

I did less running last week, but it was high-quality running. Plenty of technical, hilly single track and strides. Sunday I went for an hour-long bike ride with Ruth instead of running.

  • 4 hours, 9 minutes (running)

  • 19.0 miles

  • 3,097 ft D+

Friday I stopped by the top of Arthur's Rock for a view over Horsetooth Reservoir and the plains. This Saturday I'll be getting views of Lake Superior.

https://live.staticflickr.com/65535/52336755125_2e345d4188_b.jpg

Arthur's Rock has a rock

by Sean Gillies at September 05, 2022 04:41 PM

A little update on Geomatys’ expansion into geospatial health applications

European Space Agency’s ASPIRE with ESA program

Geomatys has been named a laureate of an ASPIRE with ESA business development grant to help put together all the elements we have been working towards over the past two years to make the wealth of tele-epidemiology and infectious disease ecology knowledge more widely available for managing health risks. With the summer we've all just experienced, it's no secret that our changing climate means we can no longer rely on the patterns of the past or on gradual changes that can be mitigated "petit-à-petit". Preparedness requires a more nimble approach combined with deeper knowledge of the underlying processes that drive risk.

Our project is gaining momentum!

We have hired experts in infectious disease epidemiology and modeling and will soon welcome a marketing and business development specialist. We are on our way towards producing a platform to make information curated from academic results and geospatial data technologies accessible to decision makers in need of timely evidence-based assessments of the current and future risks posed by a growing list of infectious diseases. 

Our team has grown!

In June, we had the pleasure of welcoming Dr. Sarah Kada, an infectious disease modeler, who just completed 2.5 years of postdoctoral work in the Dengue Unit of the US CDC, where she also helped with the COVID-19 response. A master already of * many * things, Sarah is learning the ins-and-outs of industry and the world of geospatial data infrastructures.

For more information, please do not hesitate to get in touch!

by geoadmin at September 05, 2022 01:26 PM

August 31, 2022

The QGIS plugin repository currently lists 1694 plugins and the list keeps on growing, even during the holiday season. It can be challenging to stay up to date.

Our new monthly plugin update is meant to provide you with a quick overview of the newest plugins. If any of the names or short descriptions pique your interest, you can find the plugins listed below.

  • STL Generator: This plugin lets you generate an STL from a DEM and allows the exclusion of nodata regions.
  • Maxent Model: Maxent mapping adapter for QGIS.
  • SRApp: Synchronization with the Metryka application database.
  • QGIS Redistricting: Tool for drawing districting plans from geographic units.
  • XPlan-Reader: Import XPlan-GML.
  • GEO_search: Layer Geo Search.
  • Check, Define & Convert CRS: Check, define and convert CRS.
  • Dynamic Provider Filter Plugin: QGIS plugin to dynamically set provider filters using QGIS variable replacement.
  • TopoTijdreis: This plugin loads all historic maps from 1815-2020 from topotijdreis.nl into QGIS.
  • Tanaka Contours: Generates Tanaka-contours from a DEM.

by underdark at August 31, 2022 06:46 PM

August 29, 2022

I got some running-is-a-part-time-job kind of numbers this week.

  • 14 hours, 32 minutes

  • 60.4 miles

  • 10,656 ft D+

I ran every day except Monday and hills every day but Thursday, when I did an easy run by the river.

https://live.staticflickr.com/65535/52317552787_7b61d5eccf_c.jpg

Osprey at the Cache La Poudre River

Two weeks ago I wrote:

I want to get ~110 miles and 15,000 ft of climbing in over the next two weeks.

I pretty much did it. 103 miles and almost 18,000 ft of climbing. My training is finally coming together and none too soon, with my one race of the year less than two weeks away.

Today I went back to RMNP to beat the heat and had a great long run on the loop I did in June, but in the other direction and without snow. I weathered some afternoon rain and hail, for the first time on a long run, and at the end of the day dealt with a herd of elk at the trailhead where I had parked. I'm comfortable being close to mule deer, but adult elk are big and can mess you up! The smart thing to do is wait for them to move along, or go around, giving them a wide berth. In about two weeks, male elk will begin competing for mates and it will become more dangerous to be out in the meadows with them.

https://live.staticflickr.com/65535/52317552827_5f262fa13b_b.jpg

Elk at Rocky Mountain National Park

by Sean Gillies at August 29, 2022 03:38 AM

August 25, 2022

The JTS Topology Suite recently gained the ability to compute concave hulls.  The Concave Hull algorithm computes a polygon enclosing a set of points using a parameter to determine the "tightness".  However, for polygonal inputs the computed concave hull is built only using the polygon vertices, and so does not always respect the polygon boundaries.  This means the concave hull may not contain the input polygon.

It would be useful to be able to compute the "outer hull" of a polygon.  This is a valid polygon formed by a subset of the vertices of the input polygon which fully contains the input polygon.  Vertices can be eliminated as long as the resulting boundary does not self-intersect, and does not cross into the original polygon.

An outer hull of a polygon representing Switzerland

As with point-set concave hulls, the vertex reduction is controlled by a numeric parameter. This creates a sequence of hulls of increasingly larger area with smaller vertex counts.  At an extreme value of the parameter, the outer hull is the same as the convex hull of the input.

A sequence of outer hulls of New Zealand's North Island

The outer hull concept extends to handle holes and MultiPolygons.  In all cases the hull boundaries are constructed so that they do not cross each other, thus ensuring the validity of the result.

An outer hull of a MultiPolygon for the coast of Denmark.  The hull polygons do not cross.

It's also possible to construct inner hulls of polygons, where the constructed hull is fully within the original polygon.

An inner hull of Switzerland

Inner hulls also support holes and MultiPolygons.  At an extreme value of the control parameter, holes become convex hulls, and a polygon shell reduces to a triangle (unless blocked by the presence of holes).

An inner hull of a lake with islands.  The island holes become convex hulls, and prevent the outer shell from reducing fully to a triangle

A hull can provide a significant reduction in the vertex size of a polygon for a minimal change in area. This could allow faster evaluation of spatial predicates, by pre-filtering with smaller hulls of polygons.

An outer hull of Brazil provides a 10x reduction in vertex size, with only ~1% change in area.

This has been on the JTS To-Do list for a while (I first proposed it back in 2009).  At that time it was presented as a way of simplifying polygonal geometry. Of course JTS has had the TopologyPreservingSimplifier for many years.  But it doesn't compute a strictly outer hull.  Also, it's based on Douglas-Peucker simplification, which isn't ideal for polygons.  

It seems there's quite a need for this functionality, as shown in these GIS-StackExchange posts (1, 2, 3, 4).  There are even existing implementations on GitHub: rdp-expansion-only and simplipy (both in Python), but both of these sound like they have some significant issues.

Recent JTS R&D (on concave hulls and polygon triangulation) has provided the basis for an effective, performant polygonal concave hull algorithm.  This is now released as the PolygonHullSimplifier class in JTS.

The PolygonHullSimplifier API

Polygon hulls have the following characteristics:

  • Hulls can be constructed for Polygons and MultiPolygons, including holes.
  • Hull geometries have the same structure as the input.  There is a one-to-one correspondence for  elements, shells and holes.
  • Hulls are valid polygonal geometries.
  • The hull vertices are a subset of the input vertices.

The PolygonHullSimplifier algorithm supports computing both outer and inner hulls. 

  • Outer hulls contain the input geometry.  Vertices forming concave corners (convex for holes) are removed.  The maximum outer hull is the convex hull(s) of the input polygon(s), with holes reduced to triangles.
  • Inner hulls are contained within the input geometry.  Vertices forming convex corners (concave for holes) are removed.   The minimum inner hull is a triangle contained in (each) polygon, with holes expanded to their convex hulls.  

The number of vertices removed is controlled by a numeric parameter.  Two different parameters are provided:

  • the Vertex Number Fraction specifies the desired result vertex count as a fraction of the number of input vertices.  The value 1 produces the original geometry.  Smaller values produce simpler hulls.  The value 0 produces the maximum outer or minimum inner hull.
  • the Area Delta Ratio specifies the desired maximum change in the ratio of the result area to the input area.  The value 0 produces the original geometry.  Larger values produce simpler hulls. 

Defining the parameters as ratios means they are independent of the size of the input geometry, and thus easier to specify for a range of inputs.  Both parameters are targets rather than absolutes; the validity constraint means the result hull may not attain the specified value in some cases.
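
To make the parameters concrete, here is a hedged Java usage sketch. The static hull and hullByAreaDelta methods and the org.locationtech.jts.simplify.PolygonHullSimplifier location reflect my reading of the released API, so treat the exact signatures as assumptions and consult the JTS javadoc for the authoritative ones.

import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.io.WKTReader;
import org.locationtech.jts.simplify.PolygonHullSimplifier;

public class PolygonHullExample {

    public static void main(String[] args) throws Exception {
        // A small polygon with a concave notch, for demonstration.
        Geometry poly = new WKTReader().read(
            "POLYGON ((0 0, 10 0, 10 10, 6 10, 6 4, 4 4, 4 10, 0 10, 0 0))");

        // Outer hull targeting roughly half of the input vertex count.
        // Assumed signature: hull(geometry, isOuter, vertexNumFraction).
        Geometry outer = PolygonHullSimplifier.hull(poly, true, 0.5);

        // Inner hull allowing roughly a 20% change in area.
        // Assumed signature: hullByAreaDelta(geometry, isOuter, areaDeltaRatio).
        Geometry inner = PolygonHullSimplifier.hullByAreaDelta(poly, false, 0.2);

        System.out.println("outer hull: " + outer);
        System.out.println("inner hull: " + inner);
    }
}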

Algorithm Description

The algorithm removes vertices via "corner clipping".  Corners are triangles formed by three consecutive vertices in a (current) boundary ring of a polygon.  Corners are removed when they meet certain criteria.  For an outer hull, a corner can be removed if it is concave (for shell rings) or convex (for hole rings).  For an inner hull the removable corner orientations are reversed.  

In both variants, corners are removed only if the triangle they form does not contain other vertices of the (current) boundary rings.  This condition prevents self-intersections from occurring within or between rings. This ensures the resulting hull geometry is topologically valid.  Detecting triangle-vertex intersections is made performant by maintaining a spatial index on the vertices in the rings.  This is supported by an index structure called a VertexSequencePackedRtree.  This is a semi-static R-tree built on the list of vertices of each polygon boundary ring.  Vertex lists typically have a high degree of spatial coherency, so the constructed R-tree generally provides good space utilization.  It provides fast bounding-box search, and supports item removal (allowing the index to stay consistent as ring vertices are removed).

Corners that are candidates for removal are kept in a priority queue ordered by area.  Corners are removed in order of smallest area first.  This minimizes the amount of change for a given vertex count, and produces a better quality result.  Removing a corner may create new corners, which are inserted in the priority queue for processing.  Corners in the queue may be invalidated if one of the corner side vertices has previously been removed; invalid corners are discarded. 
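
The following is an illustrative sketch of that loop, not the actual JTS code: a simplified outer-hull pass over a single shell ring, using a linear scan in place of the VertexSequencePackedRtree and omitting the target-parameter logic.

import java.util.ArrayList;
import java.util.List;
import java.util.PriorityQueue;
import org.locationtech.jts.algorithm.Orientation;
import org.locationtech.jts.geom.Coordinate;

class CornerClipSketch {

  static final class Corner implements Comparable<Corner> {
    final int prev, idx, next;   // ring positions of the three corner vertices
    final double area;           // area of the corner triangle
    Corner(int prev, int idx, int next, double area) {
      this.prev = prev; this.idx = idx; this.next = next; this.area = area;
    }
    public int compareTo(Corner o) { return Double.compare(area, o.area); }
  }

  // Clips concave corners of a CCW shell ring (outer hull) until the target
  // vertex count (>= 3) is reached.  ring is an open list of ring coordinates.
  static List<Coordinate> outerHullRing(Coordinate[] ring, int targetCount) {
    int n = ring.length;
    boolean[] removed = new boolean[n];
    int[] prev = new int[n], next = new int[n];
    for (int i = 0; i < n; i++) { prev[i] = (i + n - 1) % n; next[i] = (i + 1) % n; }

    PriorityQueue<Corner> queue = new PriorityQueue<>();
    for (int i = 0; i < n; i++) addIfRemovable(ring, prev[i], i, next[i], queue);

    int liveCount = n;
    while (liveCount > targetCount && !queue.isEmpty()) {
      Corner c = queue.poll();
      // discard corners invalidated by an earlier removal of one of their vertices
      if (removed[c.prev] || removed[c.idx] || removed[c.next]) continue;
      // the corner triangle must not contain another live vertex, or clipping it
      // could create a self-intersection (JTS checks this via the spatial index)
      if (!isClear(ring, removed, c)) continue;

      removed[c.idx] = true;
      liveCount--;
      next[c.prev] = c.next;
      prev[c.next] = c.prev;
      // removing a vertex creates two new candidate corners
      addIfRemovable(ring, prev[c.prev], c.prev, c.next, queue);
      addIfRemovable(ring, c.prev, c.next, next[c.next], queue);
    }

    List<Coordinate> result = new ArrayList<>();
    for (int i = 0; i < n; i++) if (!removed[i]) result.add(ring[i]);
    return result;
  }

  // For an outer hull, only concave corners (clockwise turns on a CCW ring) may go
  static void addIfRemovable(Coordinate[] ring, int ia, int ib, int ic,
                             PriorityQueue<Corner> queue) {
    Coordinate a = ring[ia], b = ring[ib], c = ring[ic];
    if (Orientation.index(a, b, c) != Orientation.CLOCKWISE) return;
    double area = Math.abs((b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y)) / 2;
    queue.add(new Corner(ia, ib, ic, area));
  }

  static boolean isClear(Coordinate[] ring, boolean[] removed, Corner c) {
    Coordinate a = ring[c.prev], b = ring[c.idx], d = ring[c.next];
    for (int i = 0; i < ring.length; i++) {
      if (removed[i] || i == c.prev || i == c.idx || i == c.next) continue;
      if (inTriangle(a, b, d, ring[i])) return false;
    }
    return true;
  }

  // point-in-triangle test: p is inside (or on) the triangle if it lies on the
  // same side of (or on) all three edges
  static boolean inTriangle(Coordinate a, Coordinate b, Coordinate c, Coordinate p) {
    int o1 = Orientation.index(a, b, p);
    int o2 = Orientation.index(b, c, p);
    int o3 = Orientation.index(c, a, p);
    return (o1 >= 0 && o2 >= 0 && o3 >= 0) || (o1 <= 0 && o2 <= 0 && o3 <= 0);
  }
}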

This algorithm uses techniques originally developed for the Ear-Clipping approach used in the JTS PolygonTriangulator implementation.  It also has a similarity to the Visvalingam-Whyatt simplification algorithm.  But as far as I know, using this approach for computing outer and inner hulls is novel.  (After the fact I found a recent paper about a similar construction called a Shortcut Hull [Bonerath et al 2020], but it uses a different approach.)

Further Work

It should be straightforward to use this same approach to implement a variant of the Topology-Preserving Simplifier using corner-area removal (as in Visvalingam-Whyatt simplification).  The result would be a simplified, topologically-valid polygonal geometry.  The simplification parameter would limit the number of result vertices, or the net change in area.  The resulting shape would be a good approximation of the input, but would not necessarily be either wholly inside or outside it.

by Dr JTS (noreply@blogger.com) at August 25, 2022 06:03 PM

As of PostGIS 3.1, the PostGIS sfcgal support library is no longer part of the postgis core library; it has been spun off into a new library, postgis_sfcgal-3.

This change is not an issue for people doing regular soft upgrades from a PostGIS < 3.1 compiled with SFCGAL to a PostGIS >= 3.1 with SFCGAL, using ALTER EXTENSION postgis_sfcgal UPDATE; or SELECT postgis_extensions_upgrade();. However, if you are using pg_upgrade, you might get errors like postgis-3 does not contain function postgis_sfcgal_version() (which is part of the postgis_sfcgal extension).

The three main reasons for this break were:

  • We wanted the postgis-3 library to have the same exposed functions regardless of whether you compile with SFCGAL or not. This change was planned for PostGIS 3.0, but only the backend-switching plumbing was removed, not the complete detachment.

  • It makes it possible for packagers to offer postgis_sfcgal (perhaps as a separate package), without requiring users who just want postgis to have to install Boost and CGAL.

  • In the past the postgis_sfcgal and postgis extensions were joined at the hip in the same underlying library, because there were a few functions overlapping in name, such as ST_3DIntersects and ST_Intersects. Trying to explain how this all worked, and how to switch the backend to sfcgal for extended 3D functionality, not to mention the added annoyance of the backend GUC notices during upgrade, was more of a pain than it was worth. So moving forward we will not be reusing function names between the two extensions; they will have only non-overlapping function names.

by Regina Obe at August 25, 2022 12:00 AM

August 24, 2022

The previous post discussed polygonal coverages and outlined the plan to support them in the JTS Topology Suite.  This post presents the first step of the plan: algorithms to validate polygonal coverages.  This capability is essential, since coverage algorithms rely on valid input to provide correct results.  And as will be seen below, coverage validity is usually not obvious, and cannot be taken for granted.  

As described previously, a polygonal coverage is a set of polygons which fulfils a specific set of geometric conditions.  Specifically, a set of polygons is coverage-valid if and only if it satisfies the following conditions:

  • Non-Overlapping - polygon interiors do not intersect
  • Edge-Matched (also called Vector-Clean and Fully-Noded) - the shared boundary of adjacent polygons has the same set of vertices in both polygons

The Non-Overlapping condition ensures that no point is covered by more than one polygon.  The Edge-Matched condition ensures that coverage topology is stable under transformations such as reprojection, simplification and precision reduction (since even if a vertex is coincident with a line segment in the original dataset, this is very unlikely to remain the case when the data is transformed).

An invalid coverage which violates both (L) Non-Overlapping and (R) Edge-Matched conditions (note the different vertices in the shared boundary of the right-hand pair)

Note that these rules allow a polygonal coverage to cover disjoint areas.  They also allow internal gaps to occur between polygons.  Gaps may be intentional holes, or unwanted narrow gaps caused by mismatched boundaries of otherwise adjacent polygons.  The difference is purely one of size.  In the same way, unwanted narrow "gores" may occur in valid coverages.  Detecting undesirable gaps and gores will be discussed further in a subsequent post. 
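
As a small illustration of the Edge-Matched condition (example data chosen here, not taken from the post), consider two adjacent squares where the second carries an extra vertex on the shared edge.  The interiors do not overlap, yet the pair is not a valid coverage:

import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.io.WKTReader;

class EdgeMatchExample {
  public static void main(String[] args) throws Exception {
    WKTReader reader = new WKTReader();
    Geometry a = reader.read("POLYGON ((0 0, 10 0, 10 10, 0 10, 0 0))");
    // B has an extra vertex (10 5) on the shared edge that A lacks
    Geometry b = reader.read("POLYGON ((10 0, 20 0, 20 10, 10 10, 10 5, 10 0))");
    // The shared boundary is one segment (10 0)-(10 10) in A, but two segments
    // (10 10)-(10 5) and (10 5)-(10 0) in B: same line, different vertex sets,
    // so the pair violates Edge-Matched even though it is Non-Overlapping.
    System.out.println(a.overlaps(b));   // false - the interiors do not overlap
  }
}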

Computing Coverage Validation

Coverage validity is a global property of a set of polygons, but it can be evaluated in a local and piecewise fashion.  To confirm a coverage is valid, it is sufficient to check every polygon against each adjacent (intersecting) polygon to determine if any of the following invalid situations occur:
  • Interiors Overlap:
    • the polygon linework crosses the boundary of the adjacent polygon
    • a polygon vertex lies within the adjacent polygon
    • the polygon is a duplicate of the adjacent polygon
  • Edges do not Match:
    • two segments in the boundaries of the polygons intersect and are collinear, but are not equal 

If neither of these situations is present, then the target polygon is coverage-valid with respect to the adjacent polygon.  If all polygons are coverage-valid against every adjacent polygon, then the coverage as a whole is valid.

For a given polygon it is more efficient to check all adjacent polygons together, since this allows faster checking of valid polygon boundary segments.  When validation is used on datasets which are already clean, or mostly so, this improves the overall performance of the algorithm.  

Evaluating coverage validity in a piecewise way allows the validation process to be parallelized easily, and executed incrementally if required.

JTS Coverage Validation

Validation of a single coverage polygon is provided by the JTS CoveragePolygonValidator class.  If a polygon is coverage-invalid due to one or more of the above situations, the class computes the portion(s) of the polygon boundary which cause the failure(s).  This allows the locations and number of invalidities to be determined and visualized.

The class CoverageValidator computes coverage-validity for an entire set of polygons.  It reports the invalid locations for all polygons which are not coverage-valid (if any). 
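
As a minimal usage sketch (assuming the static validate method of CoverageValidator in the JTS coverage package; check the Javadoc of your JTS version for the exact signature and return convention), validating the mismatched pair from the earlier example looks roughly like this:

import org.locationtech.jts.coverage.CoverageValidator;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.io.WKTReader;

class CoverageValidationExample {
  public static void main(String[] args) throws Exception {
    WKTReader reader = new WKTReader();
    // The mismatched pair from the earlier example: no overlap, but not edge-matched
    Geometry[] coverage = {
      reader.read("POLYGON ((0 0, 10 0, 10 10, 0 10, 0 0))"),
      reader.read("POLYGON ((10 0, 20 0, 20 10, 10 10, 10 5, 10 0))")
    };
    // One result entry per input polygon: empty or null where the polygon is
    // coverage-valid, otherwise the linework marking the invalid boundary section
    Geometry[] invalid = CoverageValidator.validate(coverage);
    for (Geometry g : invalid) {
      System.out.println(g);
    }
  }
}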

Using spatial indexing makes checking coverage validity quite performant.  For example, a coverage containing 91,384 polygons with 10,474,336 vertices took only 6.4 seconds to validate.  In this case the coverage is nearly valid, since only one invalid polygon was found. The invalid boundary linework returned by CoverageValidator allows easily visualizing the location of the issue.

A polygonal dataset of 91,384 polygons, containing a single coverage-invalid polygon

The invalid polygon is a tiny sliver, with a single vertex lying a very small distance inside an adjacent polygon.  The discrepancy is only visible using the JTS TestBuilder Reveal Topology mode.

The size of the discrepancy is very small.  The vertex causing the overlap is only 0.0000000001 units away from being valid:

[921]  POLYGON(632)
[922:4]  POLYGON(5)
Ring-CW  Vert[921:0 514]  POINT ( 960703.3910000008 884733.1892000008 )
Ring-CW  Vert[922:4:0 3]  POINT ( 960703.3910000008 884733.1893000007 )

This illustrates the importance of having fast, robust automated validity checking for polygonal coverages, and providing information about the exact location of errors.

Real-world testing

With coverage validation now available in JTS, it's interesting to apply it to publicly available datasets which (should) have coverage topology.  It is surprising how many contain validity errors.  Here are a few examples:

Source: City of Vancouver
Dataset: Zoning Districts


This dataset contains 1,498 polygons with 57,632 vertices. There are 379 errors identified, which mainly consist of very small discrepancies between vertices of adjacent polygons.
Example of a discrepant vertex in a polygon


Source: British Ordnance Survey OpenData
Dataset: Boundary-Line
File: unitary_electoral_division_region.shp



This dataset contains 1,178 polygons with 2,174,787 vertices. There are 51 errors identified, which mainly consist of slight discrepancies between vertices of adjacent polygons. (Note that this does not include gaps, which are not detected by CoverageValidator.  There are about 100 gaps in the dataset as well.)
An example of overlapping polygons in the Electoral Division dataset


Source: Hamburg Open Data Platform
Dataset: VerwaltungsEinheit (Administrative Units)


The dataset (slightly reduced) contains 7 polygons with 18,254 vertices.  Coverage validation produces 64 error locations.  The errors are generally small vertex discrepancies producing overlaps.  Gaps exist as well, but are not detected by the default CoverageValidator usage.
An example of overlapping polygons (and a gap) in the VerwaltungsEinheit dataset


As always, this code will be ported to GEOS.  A further goal is to provide this capability in PostGIS, since there are likely many datasets which could benefit from this checking.  The piecewise implementation of the algorithm should mesh well with the nature of SQL query execution.

And of course the next logical step is to provide the ability to fix errors detected by coverage validation.   This is a research project for the near term.

UPDATE: my colleague Paul Ramsey pointed out that he has already ported this code to GEOS.  Now for some performance testing!

by Dr JTS (noreply@blogger.com) at August 24, 2022 11:42 PM

August 23, 2022

A security fix is now available for MapGuide Open Source.

This fix mitigates several XSS vulnerabilities reported in the MapGuide Site Administrator tool.

Download the fix here

To apply, simply extract the zip contents to the www/mapadmin folder of your MapGuide installation and overwrite all existing files.

This fix can be applied to the following versions of MapGuide Open Source:

  • 2.6.1
  • 3.0.0
  • 3.1.0
  • 3.1.1
  • 3.1.2
  • Any preview release of 4.0.0

Special thanks to Eitan Shav of mend.io, who found and reported these vulnerabilities.

by Jackie Ng (noreply@blogger.com) at August 23, 2022 02:06 PM

August 22, 2022

GeoServer 2.22-M0 release is now available with downloads (bin, war, windows), along with docs, extensions, and data directory.

This is a milestone release previewing the GeoServer 2.22.x series for FOSS4G attendees.

Thanks to Jody Garnett and Ian Turton for making this release.

Docker image

This release is also available as an official Docker image:

docker pull docker.osgeo.org/geoserver:2.22-M0
docker run -it -p 80:8080 docker.osgeo.org/geoserver:2.22-M0

Welcome Page Improvements

The welcome page has been improved with the ability to:

  • Select a workspace to browse that workspace's web services
  • Select a layer or layer group to list layer-specific web services

Welcome workspace

GeoPackage Sample data

The sample data directory now includes a small GeoPackage generated from Natural Earth data.

World map

About GeoServer 2.22

Release notes: 2.22-M0

by Jody Garnett at August 22, 2022 12:00 AM

August 18, 2022

Oscar Wilde said that a map of the world that does not include the territories of utopia is not worth looking at, because when Humanity glimpses better lands in the distance, it always sets sail for them. Progress, said the writer, is the realisation of utopias.

This year brings the 18th International gvSIG Conference, for a project that in its beginnings was dismissed as utopian, as unachievable, and which eighteen years later is more active than ever. After two years in which the event had to be held online, it returns to an in-person format in València, the city that has become, in its own right, one of the reference centres for geomatics, the science and technology applied to land management.

We return in the best possible way, joining forces with the GeoLIBERO network, promoted by CYTED, the Ibero-American Programme of Science and Technology for Development. It is a network that brings together some of the leading organisations and people in research on free and open source geomatics. Coordinators of the various GeoLIBERO research groups, from across Ibero-America, will present their work during the conference.

"Boosting the knowledge economy", the motto of the event, highlights one of the main pillars of the gvSIG project. Recent times have made us more aware of the need to be independent in critical sectors such as energy, healthcare, defence and, of course, technology. Technological sovereignty is one of gvSIG's watchwords, and it must be intrinsically linked to economic sovereignty.

Today, more than ever, it is necessary to promote projects that commit to new business models based on collaboration, solidarity and shared knowledge. It is time to break definitively with technological dependence, with models that generate no local economy and only expense. It is time to maintain and strengthen technologies built on the concepts of cooperation and sustainability. It is the moment for gvSIG and free geomatics.

by Alvaro at August 18, 2022 08:31 AM

August 14, 2022