GEO Visual GPU Analytics notes

Update June 2019: an idea on sequencing and replay is mentioned here: https://blog.sumbera.com/2019/06/28/motion-sequence-and-replay-in-dynamic-maps/

****

With some delay, but before the year ends, I want to wrap up my presentation from the GIS Hackathon in Brno (March 2017) called Geo Visual GPU Analytics. It is available here in Czech: https://www.slideshare.net/sumbera/geo-vizualni-gpu-analytika. There are more pictures than text, so here I will try to add some comments to the slides.

slides 3,4: credits to my sources of inspiration – Bret Victor, Oblivion GFX, Nick Qi Zhu.

slide 5: this is a snippet from my "journey log" (working diary); every working day I keep a short memo of what I did, or of anything significant that happened. It serves several purposes – for example, in this case I had given up on trying WebGL, spent a day or two on another subject, and then returned to the problem – and voila, I could resolve it. Every day counts, and it helps to keep discipline and to learn from past entries. Getting to know WebGL really opened "new horizons" of the GPU computing universe.

slide 7: "better a bird in the hand than a pigeon on the roof" (the English equivalent is "a bird in the hand is worth two in the bush"). This proverb is put into the context of edge vs. cloud computing on slide 9. In the hand – that is the edge; on the roof – that is the cloud. So I believe that what users can hold in their hand, wear, or experience "nearby" is "better" (or more exciting) than what exists somewhere far away (despite its better parameters).

slide 8: In Czech we have the same word for tool and instrument – "nástroj" – so the question is: a musical instrument, or just an instrument (i.e. a tool)? This goes to the whole topic of latency in user interaction, described for instance here. I tend to compare the right approach to a musical instrument, where a tight feedback loop forms between the player and the instrument. The instrument must respond in less than 10 ms to close the feedback loop, so that the player can feel the instrument as part of his own "body", forget about the "mechanics", and instead flow with the expressiveness of what he is interpreting or improvising. (right picture credit here) Why not have such tools in visual analytics? Why do we need to wait for a response from a server when the same task can be done quite well on the edge? The mGL library for GPU-powered visualization on the web, or ImpactIN for iOS using the Apple Pencil, reflect this principle. We have real-time rendering; what we need is human-sense-time interaction, and the bloated abstractions of the current software stack do not help here despite the advances in hardware – there is a nice write-up about the latency problem here. As a side note, there are types of computers with very low latency – check any synthesizer or digital instrument, where latency from user interaction must be very low; hence the left picture on that slide shows them (a combination of a MIDI pad and a guitar).
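
To make the 10 ms budget a bit more concrete, here is a minimal browser-side sketch (purely illustrative – not from the slides and not mGL code; it assumes the page contains a canvas element). It coalesces pointer input and redraws on the next animation frame, logging the time from the input event to the start of that redraw:

// Minimal input-to-redraw latency probe: remember when the pointer moved,
// redraw on the next animation frame, and log the elapsed time.
// A "musical instrument" feel needs this whole loop to stay in the ~10 ms range.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;

let pendingInput: { x: number; t: number } | null = null;

canvas.addEventListener('pointermove', (e) => {
  // Keep only the latest input and its timestamp; coalesce to one redraw per frame.
  pendingInput = { x: e.offsetX, t: performance.now() };
});

function drawFrame() {
  if (pendingInput) {
    // Redraw using the latest pointer position (here just a vertical cursor line).
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.fillRect(pendingInput.x, 0, 2, canvas.height);

    const latency = performance.now() - pendingInput.t;
    console.log(`input-to-redraw latency: ${latency.toFixed(1)} ms`);
    pendingInput = null;
  }
  requestAnimationFrame(drawFrame);
}
requestAnimationFrame(drawFrame);

The number logged covers only the input-to-redraw part; the display refresh adds up to roughly one more frame (about 16 ms at 60 Hz), which is exactly why the remaining work – filtering, aggregation, rendering – should stay on the GPU at the edge rather than wait for a server round trip.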

Here is a short video of the Korg Monologue synth, a type of instrument in use since the 70's. I consider this kind of low-latency feedback loop, applied to new domains, a fascinating subject to explore. Notice the real-time filter modification.

slides 9,10: a nice chart from 2012 from britesnow.com on the cyclic nature of server vs. client processing. I stated there that innovation happens on the client (on the edge), since servers (clouds, mainframes) can always do anything and everything. Exaggerated, and related to slide 7 described above. Workstations, PCs, smartphones (the 1st iPhone), AR/VR devices, wearables in general, etc. – it is always about efficiency in the space used. Interestingly, NVIDIA GPU Gems states something similar at the chip level.

slide 11: a chart of the GPU out-performing the CPU, in conjunction with growing video resolution.

slide 12: The trickiest slide, ironically called "Find 10 differences". On the left side is a program I wrote in 1993, in DOS; on the right is one I wrote using WebGL in 2016. Both are great achievements for their time. The right side does GPU-based filtering (or, in marketing terms, "in-memory") with low user latency, so it redraws immediately as the user filters by pointing the mouse at a brush selector. The left one was created in the DOS era, when each graphics card had its own way of mode switching, and that app could squeeze the maximum out of the graphics card at 640×480 resolution with 256 colors – that was something at the time. However, something is wrong with trying to find 10 differences, because the two are basically so similar: both use a monitor, a keyboard/mouse, and a similar layout…
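
The 2016 app itself is not shown here, so the following is only a minimal WebGL sketch of the underlying idea: keep all points in a GPU buffer and pass the brush selection as a uniform, so changing the filter never re-uploads data and a redraw is a single cheap draw call. Names such as u_range, the attribute layout and the random demo data are my own illustration, not the app's actual code (error checking is omitted for brevity):

// Sketch of GPU-side brush filtering: all points stay in a GPU buffer,
// the brush range is a uniform, so changing the filter only re-issues a draw call.
const gl = (document.querySelector('canvas') as HTMLCanvasElement).getContext('webgl')!;

const vertexSrc = `
  attribute vec2 a_pos;     // point position in clip space
  attribute float a_value;  // attribute being filtered (e.g. a measurement)
  uniform vec2 u_range;     // current brush selection [min, max]
  void main() {
    gl_Position = vec4(a_pos, 0.0, 1.0);
    gl_PointSize = 3.0;
    // Push filtered-out points outside the clip volume so they are never rasterized.
    if (a_value < u_range.x || a_value > u_range.y) {
      gl_Position = vec4(2.0, 2.0, 2.0, 1.0);
    }
  }`;

const fragmentSrc = `
  precision mediump float;
  void main() { gl_FragColor = vec4(0.1, 0.6, 0.9, 1.0); }`;

function compile(type: number, src: string): WebGLShader {
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  return shader;
}

const prog = gl.createProgram()!;
gl.attachShader(prog, compile(gl.VERTEX_SHADER, vertexSrc));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fragmentSrc));
gl.linkProgram(prog);
gl.useProgram(prog);

// Interleaved demo data: x, y, value for each point.
const POINTS = 100000;
const data = new Float32Array(POINTS * 3);
for (let i = 0; i < POINTS; i++) {
  data[i * 3] = Math.random() * 2 - 1;
  data[i * 3 + 1] = Math.random() * 2 - 1;
  data[i * 3 + 2] = Math.random();
}
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);

const aPos = gl.getAttribLocation(prog, 'a_pos');
const aValue = gl.getAttribLocation(prog, 'a_value');
gl.enableVertexAttribArray(aPos);
gl.enableVertexAttribArray(aValue);
gl.vertexAttribPointer(aPos, 2, gl.FLOAT, false, 12, 0);   // stride = 3 floats
gl.vertexAttribPointer(aValue, 1, gl.FLOAT, false, 12, 8);

const uRange = gl.getUniformLocation(prog, 'u_range');

// Called whenever the brush selector moves: no data re-upload, just a redraw.
function drawFiltered(min: number, max: number) {
  gl.uniform2f(uRange, min, max);
  gl.clear(gl.COLOR_BUFFER_BIT);
  gl.drawArrays(gl.POINTS, 0, POINTS);
}

drawFiltered(0.25, 0.75);

Wiring drawFiltered to the brush selector's pointer events gives the immediate-redraw behavior from the slide: the hundred thousand points never leave GPU memory, and only two floats change per interaction.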

slide 13: the last slide, titled "Find 1 difference", is the answer to the dilemma from slide 12 – the AR experience: a new way of interaction, a new type of device for new workflows, visual analytics, exploration, etc. As one example of the many possibilities of AR, here is a nice video from HxGN Live 2017:

 

HxGN13: SG&I Perspectives LIVE

Here is a project I have been working on – iKatastr2 (SpatialReader) – using myVR technology to show a terrain model with overlays of OGC WMS services, all from freely available data. Presented at HxGN Live 2013.

At 39:10, watch the myVR multiplatform rendering technology integrated into the mobile app for 3D map visualisation.

Here are also a few pictures from the HxGN Live booth/keynote.

ESA App challenge winner

I participated in the first ESA (European Space Agency) app dev challenge, where 5 teams competed on the best concept/prototype that would bring GMES data sources to the public on mobile devices. Our team (Czech Republic, Germany, Macedonia) won, and each member got an iPad 3. We won not because we had the single best prototype, concept, or presentation, but because we fit the criteria imposed by the challenge best: each piece of the delivery (a 5-page document describing the concept, the presentation, the prototype demo) was pretty good and simple enough to be feasible for final realization. Moreover, the unique value of mobile devices plus the unique value of the GMES satellites were both addressed. The full article can be read here: http://www.esa.int/esaEO/SEMIQOBXH3H_index_0.html

Update 08/08/2012: there is also a press release from my company, Intergraph: http://www.intergraph.com/assets/pressreleases/2012/08-01-2012.aspx

 

Intergraph 2010 session

Here is a video from the prototype I did for the Intergraph 2010 keynote on "Geospatial Workflow Designer", part of the SOA/composite services work.

 

Another video showed a prototype of hosting GeoMedia Web Map on Amazon EC2 and Azure:

It was part of a larger presentation there, #3119:

3119: The Private Cloud: Integrating and Hosting Intergraph Services Using Microsoft Technologies
Abstract:
Service oriented architecture (SOA) has established a solid position within enterprise systems as the predominant way of integration and means of distributed computing. As part of the ongoing Intergraph SOA vision and roadmap, we are moving our research focus from foundation services to composite services and orchestration. The speaker of this session will discuss various aspects of orchestration using Workflow Foundation as well as options for hosting higher order services using Microsoft AppFabric and virtualization technologies – all elements of emerging private cloud architectures.
Date & Time: Wednesday, 9/1/2010 8:00 AM
Length: 45 minutes
Division: Security, Government & Infrastructure (SG&I)
Audience Type: User – General
Track / Sub-Track: Utilities & Communications;Defense & Intelligence;Public Safety & Security;Government & Transportation / SG&I General
Speaker(s): Stanislav Sumbera

 

Appliantization at NUC 2008

I heard the term "Appliantization" for the first time from Justin Lindsey, the CTO of Netezza, at the Netezza User Conference 2008 (September, Orlando). I must admit I love this term, especially since I was involved with the virtual appliance concept for geospatial. At Intergraph I was evaluating the Netezza Performance Server with fascinating results – the performance truly runs in the ranges you read about in Netezza marketing materials, that is, 10-100 times faster than an equivalent general-purpose database. Gartner put Netezza into the leaders section of their magic quadrant for 2008. Netezza has quite good support for spatial types and spatial operations in its database, and with UDXes you can turn the machine into a domain-focused data warehouse appliance.

More about Netezza Spatial  : http://www.netezza.com/data-warehouse-appliance-products/spatial-analytics.aspx

But let’s start from the beginning…

A "one size fits all" approach doesn't fit high performance.

Computing appliances are equipment with a specialized, laser focus on solving targeted IT problems. In contrast to general-purpose hardware and software solutions, computing appliances leverage a high level of coherence, or fidelity, between the wired-together hardware and software pieces. Appliances hide the technical complexity of a system and expose its simplicity. According to the Gartner definition, an appliance is "a prepackaged or preconfigured balanced set of hardware, software, service and support, sold as a unit with built-in redundancy for high availability."

Recently, new appliances have emerged in the data warehouse market with support for geospatial data and processing, and they represent a revolutionary (and disruptive) technology. These new appliances provide a performance boost by tackling the way large amounts of geospatial data can be effectively processed. The boosts reach orders of magnitude in comparison to general-purpose database counterparts like Oracle.

Geospatially empowered data warehouse appliances (DWA) with a massively parallel processing (MPP) architecture can scale out into the hundreds of terabytes, can perform spatial queries in seconds instead of minutes or hours, and give users a new level of experience with affordable, near-instant geospatial analytics.

With a huge volume of geospatially related data, there are many technical reasons to tune and assemble hardware together with software and encapsulate all the complexity into a self-contained "simple" appliance with standard endpoints for interfacing. These self-contained appliances are easier to maintain and manage, keeping the total cost of ownership lower than that of their general-purpose counterparts.

2008 will be known as the year of "Appliantization." In the data warehousing domain, appliances such as Netezza NPS, Oracle Exadata, or Microsoft's project code-named "Madison" (a confluence of DATAllegro and SQL Server) are enabling technologies for high-performance spatial analysis.

Simplicity is managed complexity, and computing appliances do just that.