Category Archives: WebGL

Motion sequence and replay in smart maps

When dynamic maps were still in the early stages of development, or even just concepts, I was already thinking about a ‘replay’ function – a sort of step sequencing along a certain dimension. Some of these ideas were presented in the “Incident Prototype” in 2014 here at 2:24, where you can see sequencing over a day or month period.

Another demonstration took place in the Green Space Analyzer in 2015, when Smart M.Apps were introduced – at 2:04, years are sequenced.

An important part of the idea was the editing functionality – how to sequence and capture ‘user motion’ in general. In multidimensional data there is a nearly unlimited number of combinations and ways the data can be filtered over time. However, instead of continuous time sequencing, we can simplify to ‘step sequencing’, which is well known in the music industry – many digital instruments offer it, especially rhythm, drum, or groove machines. The geospatial industry could learn a lot from looking at how this function, core to music production, is done there, and take inspiration for smart mapping.

For example, the Korg Minilogue xd captures ‘user motion’ of the filters, which can be defined for each of the 16 steps.

Why is this important? Imagine storytelling where someone shows how to filter the data to get certain interesting results – they then save their choice of filters over time (in the form of a motion sequence) so other people can load it, ‘replay’ it, and tweak it further.
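To make the analogy concrete, here is a minimal sketch (an illustration only, not code from any of the prototypes mentioned) of recording and replaying a 16-step ‘motion sequence’ of filter states; applyFilters stands for a hypothetical function that applies one filter state to the map:

// -- hypothetical 16-step 'motion sequence' of map filter states
var sequence = [];                        // one recorded filter state per step
var STEPS = 16, TEMPO_MS = 500;

function recordStep(filterState) {        // e.g. { year: 2014, category: 'fire' }
    if (sequence.length < STEPS) sequence.push(filterState);
}

function replay(map) {
    var step = 0;
    var timer = setInterval(function () {
        map.applyFilters(sequence[step]); // hypothetical API: re-filter + redraw
        step = (step + 1) % sequence.length;
    }, TEMPO_MS);
    return timer;                         // clearInterval(timer) stops the replay
}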

I believe this might be super useful for showcasing how some phenomenon evolves, or how someone (an expert) arrived at certain results.

Or it might be useful as a ‘smart video’ of maps – instead of publishing a dumb video, you publish a dynamic map plus a ‘motion sequence’ of when and what was filtered (including map manipulations like zooms).

Yes, this is like what everyone knows from weather forecasts – a replay or forecast of cloud/pressure/wind motion – except that instead of a video you get a real smart map, one that replays its state based on the sequencer data.

State of state 

Keeping the current state of the filters modified by the user is the first step.

While a lot of user interaction already happens on the client side, most of the front-leading solutions do not keep the state of the selected filters, so refreshing the URL doesn’t keep what the user has previously selected – in short, all UI state is lost. Maybe this is a ‘feature’ to reset state, not an inconsistency, but then you need another button that explicitly gives you a link to the current state…

In iKatastr.cz I keep the state in the URL – this makes it easy to go forward/backward through what the user has previously selected (here, a parcel). Moreover, it enables direct link sharing and one button less in the UI (for sharing a link) – whatever the user sets in the UI is reflected in the URL as a fragment (starting with #), so at any time the user can grab the URL and send it by SMS or email. This plays well with the default ‘share URL’ function of the browsers on iOS/Android. So try this link: https://ikatastr.cz/#kde=49.31166,17.75353,18&mapa=zakladni&vrstvy=parcelybudovy&info=49.31142,17.75423

and try to refresh it – you will see the same state as before the refresh. You can also try selecting other parcels and then pressing the back button in the browser: you get manual back/forward ‘sequencing’ of your previous interactions. This is most likely the way to go, and it can further evolve into full step sequencing.
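A minimal sketch of this pattern (an illustration, not the actual iKatastr.cz code): serialize the UI state into the URL fragment on every change, and restore it on page load and on back/forward navigation. applyState stands for whatever function updates your UI and map:

// -- reflect UI state in the URL fragment; setting location.hash
// -- also creates a history entry, so back/forward works for free
function saveState(state) {
    location.hash = Object.keys(state)
        .map(function (k) { return k + '=' + state[k]; })
        .join('&');
}

function loadState() {
    var state = {};
    location.hash.slice(1).split('&').forEach(function (pair) {
        var kv = pair.split('=');
        if (kv[0]) state[kv[0]] = decodeURIComponent(kv[1] || '');
    });
    return state;
}

// -- restore state on page load and on back/forward navigation
window.addEventListener('hashchange', function () { applyState(loadState()); });
applyState(loadState());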

GEO Visual GPU Analytics notes

Update June 2019: the idea of sequencing and replay is mentioned here: https://blog.sumbera.com/2019/06/28/motion-sequence-and-replay-in-dynamic-maps/

****

With some delay, but before the year ends, I have to wrap up my presentation from the GIS Hackathon in March 2017 in Brno, called Geo Visual GPU Analytics. It is available here in Czech: https://www.slideshare.net/sumbera/geo-vizualni-gpu-analytika. There are more pictures than text, so here I will try to add some comments to the slides.

slide 3,4: credits to my sources of inspiration – Bret Victor, Oblivion GFX, Nick Qi Zhu.

slide 5: this is a snippet from my “journey log” (working diary) – every working day I write a short memo about what I did or anything significant that happened. It serves several purposes; for example, in this case I had given up on trying WebGL, spent a day or two on another subject, and then returned to the problem – and voilà, I could resolve it. Every day counts; the log helps to keep discipline and learn from past entries. Getting to know WebGL opened up real “new horizons” of the GPU computing universe.

slide 7: “better a bird in the hand than a pigeon on the roof” (the English equivalent is ‘a bird in the hand is worth two in the bush’). This proverb is put into the context of edge vs. cloud computing on slide 9: in the hand – that is the edge; on the roof – that is the cloud. I believe that what users can hold in their hand, wear, or experience ‘nearby’ is better (or more exciting) than what exists somewhere far away, despite its better parameters.

slide 8: In Czech we have the same word for tool and instrument – ‘nastroj’ – so the question is: musical instrument, or just an instrument (i.e., a tool)? This leads to the whole topic of latency in user interaction, described for instance here. I like to compare the right approach to a musical instrument, where a tight feedback loop exists between the player and the instrument. The instrument must respond in less than 10 ms to close the feedback loop, so the player can feel the instrument as their own ‘body’, forget about the ‘mechanics’, and instead flow with the expressiveness of what they are interpreting or improvising. (right picture credit here) Why not have such tools in visual analytics? Why do we need to wait for a response from a server when the same task can be done quite well on the edge? The mGL library for GPU-powered visualization on the web, or ImpactIN for iOS using the Apple Pencil, reflect this principle. We have real-time rendering; what we need is human-sense-time interaction, and the bloated abstractions of the current software stack do not help here despite the advances in hardware – there is a nice write-up about the latency problem here. As a side note, there are types of computers with very low latency – check any synthesizer or digital instrument, where latency from user interaction must be very low; hence the left picture on that slide represents them (a combination of MIDI pad + guitar).

Here is a short video of the Korg Monologue synth doing something that has been in use since the 70s; I consider this type of low-latency feedback loop, applied to new domains, a fascinating subject to explore. Notice the real-time filter modification.

slide 9,10: a nice chart from 2012 from britesnow.com on the cyclic nature of server vs. client processing. I stated there that innovation happens on the client (on the edge), since servers (clouds, mainframes) could always do anything and everything. Exaggerated, and related to slide 7 described above. Workstations, PCs, smartphones (the first iPhone), AR/VR devices, wearables in general, etc. – it is always about efficiency in the space used. Interestingly, NVIDIA’s GPU Gems states something similar at the chip level.

slide 11: a chart of the GPU outperforming the CPU, in conjunction with video resolution.

slide 12: the trickiest slide, ironically called “Find 10 differences”. On the left side is a program I wrote in 1993 for DOS; on the right, one I wrote using WebGL in 2016. Both were great achievements for their time. The right side does GPU-based filtering (or, in marketing terms, ‘in-memory’) with low user latency, so it redraws immediately as the user filters by pointing the mouse at a brush selector. The left one was created in the DOS era, when each graphics card had its own way of mode switching, and that app could squeeze the maximum out of the graphics card at 640×480 resolution with 256 colors – that was something at the time. However, something is wrong with trying to find 10 differences, because the two are basically so similar: both use a monitor, a keyboard/mouse, and the same layout…

slide 13: the last slide, titled “Find 1 difference”, is the answer to the dilemma from slide 12 – the AR experience: a new way of interaction, a new type of device for new workflows, visual analytics, exploration, etc. As one example of the many possibilities of AR, here is a nice video from HxGN Live 2017:

 

GPU accelerated visual analytics

In recent months I have had a lot of fun working on a WebGL component called “mGL” for visualizing and filtering large amounts of data in the browser. It has been used in the Incident Analyzer and Area Analyzer Smart M.Apps. Here are two videos of the mGL test app that show its potential. The most interesting part is the filtering, which takes place in the fragment shader. mGL itself has an API that can connect to crossfilter to control filtering, or an adapter so it can be used with dc.js.

The first video shows a 400k-feature road network in North Carolina with fast cross-filtering on several attributes. You can switch between the dimensions represented by the charts by clicking on their labels; the map (road network) reflects the chart’s colors and responds immediately to filter changes made on either the charts or the map. The second video shows 400k parcels in Cincinnati with the same behavior.
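mGL itself is not open source, but the core trick can be illustrated: each feature carries its attribute value as a vertex attribute, the current filter range is a uniform, and the fragment shader simply discards whatever is filtered out – there is no CPU round trip, so the redraw is immediate when the user brushes a chart. A minimal sketch of the idea (my illustration, not mGL code; u_rangeLoc is assumed to be a uniform location looked up beforehand):

// -- illustrative fragment shader: per-feature value filtered against a uniform range
var filterFS = [
    'precision mediump float;',
    'varying float v_value;   // attribute value passed from the vertex shader',
    'uniform vec2  u_range;   // current filter range, set from JS on each change',
    'uniform vec4  u_color;',
    'void main() {',
    '    if (v_value < u_range.x || v_value > u_range.y) discard;',
    '    gl_FragColor = u_color;',
    '}'
].join('\n');

// -- on every chart brush event, just update the uniform and redraw:
gl.uniform2f(u_rangeLoc, brushMin, brushMax);
gl.drawArrays(gl.TRIANGLES, 0, numVertexes);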

 

 

WMS overlay with MapBox-gl-js 0.5.2

Quick and dirty test of the WMS capabilities of the new MapBox-gl-js 0.5.2 API. First of all: yes! It is possible to overlay a (legacy) WMS over the vector WebGL-rendered base map… however, the way there is not straightforward:

 

  • It needs some ‘hacks’, as the current version of the API doesn’t have enough events to supply a custom URL before a tile is loaded. Check the latest version of Mapbox, though – it might have better support for this.
  • Another issue is that the WMS server has to send the Access-Control-Allow-Origin: * HTTP header to avoid a WebGL CORS failure when the image is loaded (gl.texImage2D). Usually WMS servers don’t care about this, since CORS doesn’t apply to normal img tags. Here, however, WebGL gets access to the raw image data, so the WMS provider has to explicitly allow it.
  • The build process of mapbox-gl-js tends to be, like that of many other large JS projects, complicated and slow. It is specifically more difficult to get mapbox-gl-js installed and built on the Windows platform than on a Mac.

The code is documented to guide you through the process; a few highlights:


// -- routine originally found in GlobalMercator.js, simplified
// -- calculates spherical mercator coordinates from tile coordinates
function tileBounds(tx, ty, zoom, tileSize) {
    function pixelsToMeters(px, py, zoom) {
        var res = (2 * Math.PI * 6378137 / 256) / Math.pow(2, zoom),
            originShift = 2 * Math.PI * 6378137 / 2,
            x = px * res - originShift,
            y = py * res - originShift;
        return [Math.abs(x), Math.abs(y)];
    }
    var min = pixelsToMeters(tx * tileSize, ty * tileSize, zoom),
        max = pixelsToMeters((tx + 1) * tileSize, (ty + 1) * tileSize, zoom);
    return min.concat(max);
}

 

// -- save the original _loadTile function so we can call it later
// -- there was no good pre-load event in the mapbox API to hook into and patch the url,
// -- so we need to use the undocumented _loadTile
var origFunc = sourceObj._loadTile;
// -- replace _loadTile with our own implementation
sourceObj._loadTile = function (id) {
    // -- we have to patch sourceObj.url, dirty!
    // -- we basically change the url on the fly with the correct BBOX coordinates
    // -- and leave the rest to the original _loadTile processing
    var origUrl = sourceObj.tiles[0]
                      .substring(0, sourceObj.tiles[0].indexOf('&BBOX'));
    origUrl = origUrl + "&BBOX={mleft},{mbottom},{mright},{mtop}";
    sourceObj.tiles[0] = patchUrl(id, [origUrl]);
    // -- call the original method
    return origFunc.call(sourceObj, id);
}
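The patchUrl helper referenced above (full version in the gist) decodes the internal tile id into tile x/y/zoom and substitutes the mercator bounds from tileBounds into the placeholders. Stripped of the id decoding (tx, ty, zoom are assumed here to be already decoded), it boils down to something like:

// -- sketch: substitute mercator BBOX placeholders in the WMS url template
// -- (tx, ty, zoom assumed decoded from mapbox's internal tile id)
function patchBBox(tx, ty, zoom, urlTemplate) {
    var b = tileBounds(tx, ty, zoom, 256); // [left, bottom, right, top] in meters
    return urlTemplate.replace('{mleft}',   b[0])
                      .replace('{mbottom}', b[1])
                      .replace('{mright}',  b[2])
                      .replace('{mtop}',    b[3]);
}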

 

 

gist available here

WebGL polyline tessellation with MapBox-GL-JS

Update 09/2015: a test of the tesspathy.js library is here. Other sources to look at:

  1.  http://mattdesl.svbtle.com/drawing-lines-is-hard
  2.  https://github.com/mattdesl/extrude-polyline
  3. https://forum.libcinder.org/topic/smooth-thick-lines-using-geometry-shader

*** original post ***

This post attempted to use pixi.js for tessellation of polylines; this time, let’s look at how mapbox-gl-js can do it. In short: much better than pixi.js.

It took slightly more time to get the right routines from mapbox-gl-js and find out where the tessellation is calculated and drawn. It actually happens in two places – in LineBucket.js and in the line shader. The Firefox shader editor helped a lot to simplify and extract the needed calculations and bring them into JavaScript (for simplification; note, however, that the shader-based approach is the right one, since you can dynamically change line thickness, while with a precalculated mesh you have to recalculate the whole mesh and update the buffers each time the thickness changes).

 

// -- module require mockups so we can use the original files unmodified
module = {};
reqMap = {
    './elementgroups.js': 'ElementGroups',
    './buffer.js': 'Buffer'
};
require = function (jsFile) { return eval(reqMap[jsFile]); };

 

<!-- all mapbox dependencies for tessellation of the polyline -->
<script src="http://www.sumbera.com/gist/js/mapbox/pointGeometry.js"></script>
<script src="http://www.sumbera.com/gist/js/mapbox/buffer.js"></script>
<script src="http://www.sumbera.com/gist/js/mapbox/linevertexbuffer.js"></script>
<script src="http://www.sumbera.com/gist/js/mapbox/lineelementbuffer.js"></script>
<script src="http://www.sumbera.com/gist/js/mapbox/elementgroups.js"></script>
<script src="http://www.sumbera.com/gist/js/mapbox/linebucket.js"></script>
<script src="http://www.sumbera.com/gist/data/route.js" charset="utf-8"></script>

// -- we don't use these buffers (they are overridden later), just set them for the addLine func
var bucket = new LineBucket({}, {
    lineVertex: (LineVertexBuffer.prototype.defaultLength = 16, new LineVertexBuffer()),
    lineElement: (LineElementBuffer.prototype.defaultLength = 16, new LineElementBuffer())
});

var u_linewidth = { x: 0.00015 };
// -- override .add to capture the calculated points
LineVertexBuffer.prototype.add = function (point, extrude, tx, ty, linesofar) {
    point.x = point.x + (u_linewidth.x * LineVertexBuffer.extrudeScale * extrude.x * 0.015873);
    point.y = point.y + (u_linewidth.x * LineVertexBuffer.extrudeScale * extrude.y * 0.015873);
    verts.push(point.x, point.y);
    return this.index;
};

// -- pass vertices into the addLine func that will calculate the points
bucket.addLine(rawVerts, "miter", "butt", 2, 1);
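To actually see the output of this sketch, the collected verts can be uploaded to a buffer and drawn. Since addLine emits extruded vertex pairs along the line, drawing them as a triangle strip is the simplest option for a single line (an assumption of this sketch – the posted prototype and the real line shader differ):

// -- sketch: upload collected vertices and draw them
// -- (vertLoc is assumed to be the shader attribute location)
var buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(verts), gl.STATIC_DRAW);
gl.vertexAttribPointer(vertLoc, 2, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(vertLoc);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, verts.length / 2);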

Prototype code posted here.

 

WebGL polyline tessellation with pixi.js

Update 09/2015: other triangulation methods (mapbox, tesspathy) are mentioned here.

pixi.js is an open source 2D library for gaming that includes WebGL support for rendering primitives. Why not utilize it for polyline rendering on a map? It turned out, however, that tessellation of polylines is not handled well.

The most important code snippets:

<script src="Pixi.js"></script>
<script src="Point.js"></script>
<script src="WebGLGraphics.js"></script>

<!-- data -->
<script src="route.js" charset="utf-8"></script>

var graphicsData = {
    points: verts,
    lineWidth: 0.00015,
    lineColor: 0x33FF00,
    lineAlpha: 0.8
};
var webGLData = {
    points: [],
    indices: []
};
// -- from pixi/utils
PIXI.hex2rgb = function (hex) {
    return [(hex >> 16 & 0xFF) / 255,
            (hex >> 8 & 0xFF) / 255,
            (hex & 0xFF) / 255];
};

PIXI.WebGLGraphics.buildLine(graphicsData, webGLData);

I have put a sample here:

Another implementation of polyline tessellation (which seems more functional) is in mapbox-gl-js, in LineBucket. The mapbox-gl-js code took quite a bit more time to get running and to debug on the Windows platform: I had to run npm install from the VS command shell and read carefully what all the npm errors were saying (e.g., the Python version should be < 3). Then Firefox for some reason didn’t trigger a breakpoint on LineBucket.addLine, and it took another while to find out that I should rather debug this in Chrome. See the blog post here. Anyway, it was good experience with all the messy npm modules, their install requirements, and unnecessary complexity. All the npm modules together take more than 200 MB, though some of them are optional in the install.

After all, a basic LINE draw in WebGL (without thickness and styling) is useful too – in the picture above you can see the railways of the Czech city of Ostrava.

 

WebGL polygons fill with libtess.js

(figure: CZ regions – ‘kraje’)

Update 1.6.2015: geojson-vt seems to do a great job of tiling and simplifying polygons. Check this post.

Update 18.1.2015: Vladimir Agafonkin from MapBox released earcut.js – a very fast and reliable triangulation library. Worth checking out. A video is available here:

 

 

Original post:

Brendan Kenny from Google showed here how he rendered polygons using libtess.js on Google Maps, so I tried that too, with a single large enough polygon on Leaflet: the CZ districts. libtess.js is a port of the original C code. Neither plntri.js (update: see also the comments for plntri v2.0 details) nor PolyK.js was able to triangulate as large a set of points as libtess.js.

Update: I looked at poly2tri.js too, with the following results:

I could run 2256 polygons (altogether > 3M vertices) with poly2tri in 16,701 ms vs. 127,834 ms with libtess. However, I had to dirty-fix various other errors from poly2tri (null triangles, or “FLIP failed due to missing triangle…”, so some polygons were wrong), while libtess was fine with the same data.

Here is a test: 3M vertices yielding 1M triangles were generated by libtess in 127 s; poly2tri took 16 s. Drawing is still fine, but this is ‘just enough’ even for WebGL.

 

 

The key part is listed below:


tessy.gluTessNormal(0, 0, 1);
tessy.gluTessBeginPolygon(verts);

tessy.gluTessBeginContour();

// -- see blog comment below on using Array.map
data.features[0].geometry.coordinates[0].map(function (d, i) {
    var pixel = LatLongToPixelXY(d[1], d[0], 0);
    var coords = [pixel.x, pixel.y, 0];
    tessy.gluTessVertex(coords, coords);
});

tessy.gluTessEndContour();
// -- finish polygon (and time the triangulation process)
tessy.gluTessEndPolygon();
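The snippet assumes a tessy instance with callbacks already registered. The setup, roughly following Brendan Kenny’s libtess.js example, looks like this; the vertex callback pushes every emitted triangle vertex into the verts array handed to gluTessBeginPolygon above:

// -- tesselator setup (roughly as in Brendan Kenny's libtess.js example)
var tessy = (function initTesselator() {
    var tessy = new libtess.GluTesselator();
    // called for every vertex of the output triangles; polyVertArray is the
    // 'verts' array passed to gluTessBeginPolygon
    tessy.gluTessCallback(libtess.gluEnum.GLU_TESS_VERTEX_DATA,
        function (data, polyVertArray) {
            polyVertArray.push(data[0], data[1]);
        });
    // a no-op edge-flag callback forces libtess to output plain triangles
    // (no fans or strips)
    tessy.gluTessCallback(libtess.gluEnum.GLU_TESS_EDGE_FLAG, function (flag) {});
    tessy.gluTessCallback(libtess.gluEnum.GLU_TESS_ERROR,
        function (errno) { console.log('tessellation error: ' + errno); });
    return tessy;
})();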

code available here: http://bl.ocks.org/sumbera/01cbd44a77b4283e6dcd

 

There is also an EMSCRIPTEN version of tesslib.c available on GitHub, and I was curious whether this version would increase the speed of the computation. I could run it, but for large polygons (ca. 120k verts of the CZ boundary) I had to increase the module memory to 64 MB for Firefox. Tessellating 120k verts took 21 s in FF 30; IE 11 and Chrome 36 failed, reporting out-of-stack memory :(

Getting back to the version from Brendan (no emscripten), I quickly measured the same data on the browsers: IE 11: 21 s, Chrome 36: 31 s, FF 30: 27 s.

Update Oct/2014: polyline tessellation blog post here.

Modern data visualization on map

For this year’s HxGN14 conference I prepared a web app for modern data visualization. For the general concept of the app (high interactivity, visualization) I was inspired by the great ideas of Bret Victor, his research and talks.

It is exciting to see what is possible to do today inside the browser, with the interactivity provided by various open source projects (e.g., Leaflet, d3 and its plugins) and WebGL technology.

Leaflet WebGL many points rendering

WebGL is funny – programming in a very low-level style in JavaScript. This sample plots 86k points using this technology.

 

 

 

The code is very straightforward; the only notable part is how the points are initially loaded and scaled (instead of being reloaded each time the map moves).

All points are initially transformed to a tile size of 256 × 256 pixels at zoom level 0 and then re-scaled/re-shifted based on the current position of the map. drawingOnCanvas is called from L.CanvasOverlay each time the map needs to be redrawn (move, zoom).

 


function drawingOnCanvas(canvasOverlay, params) {
    gl.clear(gl.COLOR_BUFFER_BIT);
    // -- set base matrix to translate canvas pixel coordinates -> webgl coordinates
    mapMatrix.set(pixelsToWebGLMatrix);
    var bounds = leafletMap.getBounds();
    var topLeft = new L.LatLng(bounds.getNorth(), bounds.getWest());
    var offset = LatLongToPixelXY(topLeft.lat, topLeft.lng);
    // -- scale to the current zoom
    var scale = Math.pow(2, leafletMap.getZoom());
    scaleMatrix(mapMatrix, scale, scale);
    translateMatrix(mapMatrix, -offset.x, -offset.y);
    // -- attach matrix value to the 'mapMatrix' uniform in the shader
    gl.uniformMatrix4fv(u_matLoc, false, mapMatrix);
    gl.drawArrays(gl.POINTS, 0, numPoints);
}
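The LatLongToPixelXY helper used above is the standard Web Mercator projection into pixel coordinates of the single 256 × 256 world tile at zoom level 0; a compact version looks like this:

// -- project lat/lng to pixel coordinates of the 256x256 tile at zoom level 0
function LatLongToPixelXY(latitude, longitude) {
    var sinLat = Math.sin(latitude * Math.PI / 180.0);
    var x = (longitude + 180) / 360 * 256;
    var y = (0.5 - Math.log((1 + sinLat) / (1 - sinLat)) / (4 * Math.PI)) * 256;
    return { x: x, y: y };
}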

I took more information and inspiration from this site.

demo here: http://bl.ocks.org/sumbera/c6fed35c377a46ff74c3

For polygon rendering check here, and for polyline rendering here.

Some good intros to WebGL that might help you understand the code: http://aerotwist.com/presentations/custom-filters/#/6

There is a nice introductory book on WebGL: WebGL Programming Guide by Kouichi Matsuda and Rodger Lea.

To illustrate how variables are passed from JavaScript to the shaders used in the example above, here are two figures from the book: figure 5.7 on p. 149 and figure 5.3 on p. 144.

Stride and Offset

This figure shows a single (interleaved) buffer used for both coordinates and size. A single buffer is constructed in a similar way in the example here:

 

 

 

var vertBuffer = gl.createBuffer();
var vertArray = new Float32Array(verts);
var fsize = vertArray.BYTES_PER_ELEMENT;

gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, vertArray, gl.STATIC_DRAW);
// -- interleaved layout [x, y, r, g, b]: stride is 5 floats (20 bytes) per vertex
gl.vertexAttribPointer(vertLoc, 2, gl.FLOAT, false, fsize * 5, 0);
gl.enableVertexAttribArray(vertLoc);
// -- color starts 2 floats (8 bytes) into each vertex record
gl.vertexAttribPointer(colorLoc, 3, gl.FLOAT, false, fsize * 5, fsize * 2);
gl.enableVertexAttribArray(colorLoc);

 

Behavior of a varying variable