Towards spatial interactivity

At myVR I have been working over the last months to bring to life ARKit and ARCore prototypes/apps using the mMap SDK. At HxGN Live 2018 they were presented as part of Hexagon’s Xalt. You can see the demos on this page: http://www.myvr-software.com/xrpocs/

Looking back, one of the best decisions for me was to go native in 2011 with the iOS app “Spatial Reader” when starting to make mobile apps. It was not because iOS was more difficult and challenging, but rather because the potential of the device as a whole can be exploited only by going deep into the platform. The tight integration Apple is pursuing is about interactivity, user experience and simplicity – and in the case of ARKit, literally ‘spatial interactivity’.

For HxGN Live we have also used an anamorphic image to represent a real excavation – the type of image and projection that makes a perfect 3D illusion from a certain point of view.


iKatastr – tip: measuring

Measuring in the desktop version of iKatastr.cz – press one of the icons at the bottom left and you can start drawing an area or a line. The dimensions are shown automatically. You finish the measurement by clicking on the last point (or by double-clicking). After drawing, you can also manipulate the points and change the polygon/line.
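For the curious, the length of such a drawn line can be computed with plain Leaflet roughly like this – a minimal sketch, not the actual iKatastr measuring code:

```typescript
import * as L from "leaflet";

// Sketch: the length of a drawn line is the sum of geodesic distances between
// consecutive vertices, recomputed whenever the user adds or drags a point.
function lineLengthMeters(points: L.LatLng[]): number {
  let total = 0;
  for (let i = 1; i < points.length; i++) {
    total += points[i - 1].distanceTo(points[i]); // great-circle distance in meters
  }
  return total;
}

// Example: a short line near Prague.
const length = lineLengthMeters([
  L.latLng(50.087, 14.420),
  L.latLng(50.088, 14.425),
  L.latLng(50.090, 14.430),
]);
console.log(`measured length: ${length.toFixed(1)} m`);
```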

iKatastr.cz for the mobile web

A few notes on the new mobile version of iKatastr.cz.

What is interesting about this version is that I eventually decided to resist every temptation to use JavaScript libraries – except for the core map component, which is based on Leaflet 1.0.3, where I still had to make a few tweaks so that everything worked correctly. Leaflet is 48 KB compressed. I also use the Mapy API component, but only for search, and that part is loaded only when the user actually clicks on search.
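The lazy loading of the search component can be sketched roughly like this; the loader URL and the wiring are assumptions for illustration, not the actual iKatastr code:

```typescript
// Sketch: inject the search library's <script> only on first use, so it is not
// part of the cold start payload.
let searchApiPromise: Promise<void> | null = null;

function loadSearchApi(): Promise<void> {
  if (!searchApiPromise) {
    searchApiPromise = new Promise((resolve, reject) => {
      const s = document.createElement("script");
      s.src = "https://api.mapy.cz/loader.js"; // assumption: the real loader URL may differ
      s.async = true;
      s.onload = () => resolve();
      s.onerror = () => reject(new Error("search API failed to load"));
      document.head.appendChild(s);
    });
  }
  return searchApiPromise;
}

// Wired to the search button: the library is downloaded only when really needed.
document.getElementById("search-button")?.addEventListener("click", async () => {
  await loadSearchApi();
  // ...now the search component can be initialized and the query executed.
});
```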

iKatastr’s own code is only 13.3 KB compressed. All the executable code, including HTML, CSS and the font (which is also custom-made), takes under 70 KB on a cold start (without cache). The rest is the actual content – data – mainly tiles from the Mapy.cz and ČÚZK sources.
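For illustration, the layer setup in Leaflet might look roughly like the sketch below; the tile/WMS URLs and layer names are placeholders, not the endpoints iKatastr actually uses:

```typescript
import * as L from "leaflet";

// Sketch of a base map plus a transparent cadastral overlay.
const map = L.map("map").setView([50.0755, 14.4378], 15);

// Base map tiles (placeholder Mapy.cz-style XYZ template).
L.tileLayer("https://example.org/mapycz-base/{z}/{x}/{y}.png", {
  maxZoom: 19,
  attribution: "Mapy.cz",
}).addTo(map);

// Cadastral map from ČÚZK, typically served as a WMS overlay on top of the base.
L.tileLayer.wms("https://services.cuzk.cz/wms/wms.asp", {
  layers: "KN", // assumption: the actual layer name may differ
  format: "image/png",
  transparent: true,
  attribution: "ČÚZK",
}).addTo(map);
```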

On every query into the map, the application sends requests to the ČÚZK web services; each response takes under 10 KB.
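A rough sketch of that per-tap lookup follows; the endpoint, parameters and response shape are hypothetical placeholders, and only the pattern (one small request per tap, with room for a fallback) follows the description above:

```typescript
// Hypothetical shape of the data returned for a tapped parcel.
interface ParcelInfo {
  id: string;
  label: string;
}

async function queryParcelAt(lat: number, lon: number): Promise<ParcelInfo | null> {
  const url = `https://services.cuzk.cz/parcel-info?lat=${lat}&lon=${lon}`; // placeholder URL
  const response = await fetch(url);
  if (!response.ok) return null;                // caller can fall back to another query
  return (await response.json()) as ParcelInfo; // each answer stays well under 10 KB
}
```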

User interface

A vertical menu in the style of expandable panels won in the end, and I think it is quick, nice-looking and functional. Again, I did not want to build a trendy UI or animations that ultimately cost the user time. On mobile, because of the limited space, only one panel may be expanded at a time; in addition, any tap into the map collapses the open panel – something that quickly becomes a habit and is comfortable. Panels can of course also be closed in other ways – with the same icon that opens them and, naturally, with the cross on the right.
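A rough sketch of that accordion behaviour (class names and IDs are made up for illustration; this is not the real iKatastr UI code):

```typescript
// Sketch: one panel open at a time, any tap on the map collapses it.
const panels = Array.from(document.querySelectorAll<HTMLElement>(".panel"));

function openPanel(panel: HTMLElement): void {
  panels.forEach((p) => p.classList.toggle("open", p === panel));
}

function closeAllPanels(): void {
  panels.forEach((p) => p.classList.remove("open"));
}

panels.forEach((panel) => {
  // The icon that opens a panel also closes it when tapped again.
  panel.querySelector(".panel-icon")?.addEventListener("click", () => {
    if (panel.classList.contains("open")) {
      closeAllPanels();
    } else {
      openPanel(panel);
    }
  });
  // The cross on the right closes as well.
  panel.querySelector(".panel-close")?.addEventListener("click", closeAllPanels);
});

// Any tap into the map collapses whatever panel is open.
document.getElementById("map")?.addEventListener("pointerdown", closeAllPanels);
```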

Information from the cadastre of real estate:

This was probably the biggest problem – not the information itself (that is on a completely different level than 8 years ago, when iKatastr started), but rather how to tap into the map: long press, or short tap? A long press is established in mobile apps, while a short click is comfortable and fast on the web. In the end the user has both available (and to my surprise it could be solved), and moreover can choose whether a single tap should directly open Nahlížení (the official cadastre viewer), just as in previous versions. This opens a fairly smooth path to bringing this mobile version to the desktop. It works there, even in Internet Explorer (it just complains that it needs help with permission to open a popup window… again), and in the Edge browser some queries do not work – that is a bug in Edge; fortunately the fallback queries, which do work, always kick in, so from the user’s point of view the only effect is that parcels/buildings are not highlighted.
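A minimal sketch of how both gestures can coexist on the same element (element IDs, the threshold and the handler names are hypothetical):

```typescript
// Sketch: a short tap fires the quick query, a long press (held ~500 ms) fires
// the full lookup; once the long press fires, the later pointerup is ignored.
const LONG_PRESS_MS = 500;
let pressTimer: number | undefined;

const mapEl = document.getElementById("map")!;

mapEl.addEventListener("pointerdown", (e: PointerEvent) => {
  pressTimer = window.setTimeout(() => {
    pressTimer = undefined;
    onLongPress(e.clientX, e.clientY); // the gesture established in mobile apps
  }, LONG_PRESS_MS);
});

mapEl.addEventListener("pointerup", (e: PointerEvent) => {
  if (pressTimer !== undefined) {
    window.clearTimeout(pressTimer);   // released early: treat it as a short tap
    pressTimer = undefined;
    onShortTap(e.clientX, e.clientY);  // fast, web-style click
  }
});

mapEl.addEventListener("pointercancel", () => {
  window.clearTimeout(pressTimer);     // scrolling/zooming cancels both gestures
  pressTimer = undefined;
});

function onShortTap(x: number, y: number): void { /* query parcel/building info */ }
function onLongPress(x: number, y: number): void { /* optionally open Nahlížení directly */ }
```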

I will be glad if you post your observations and comments on this mobile version here – what works, what you like or dislike, and how and for what kind of work you actually use iKatastr.

 

MapKit with ARKit and overlays

Flyover mode in Apple Maps allows AR/VR-style interaction. This is not available by default to iOS developers using the underlying MapKit/ARKit technology. However, it is possible to test it, and the following short video shows this proof of concept – viewing cadastral maps (iKatastr) in a VR-like experience on an iPad. By the way, Flyover mode on iOS 11 has some strange handling of overlays – described here – so loading tiles is a little bit tricky. The iOS 10 version was much better (check the video here).

 

GEO Visual GPU Analytics notes

With some delay, but before the year ends, I have to wrap up my presentation from the GIS Hackathon in Brno (March 2017) called Geo Visual GPU Analytics. It is available here (in Czech): https://www.slideshare.net/sumbera/geo-vizualni-gpu-analytika . There are more pictures than text, so here I will try to add some comments to the slides.

slide 3,4: credits to my sources of inspiration – Bret Victor, Oblivion GFX, Nick Qi Zhu.

slide 5: this is a snippet from my “journey log” (working diary) – every working day I keep a short memo of what I did, or of anything significant that happened. It serves several purposes; in this case, for example, I had given up on trying WebGL, spent a day or two on another subject and then returned to the problem – and voilà, I could resolve it. Every day counts, and it helps to keep discipline and to learn from past entries. Getting to know WebGL really opened ‘new horizons’ of the GPU computing universe.

slide 7: “better a bird in the hand than a pigeon on the roof” (the English equivalent is ‘a bird in the hand is worth two in the bush’). This proverb is put into the context of edge vs. cloud computing on slide 9. In the hand – that is the edge; on the roof – that is the cloud. So I believe that what users can hold in their hand, wear, or experience ‘nearby’ is better (or more exciting) than what exists somewhere far away, despite its better parameters.

slide 8: in Czech we have the same word for tool and instrument – ‘nástroj’ – so the question is: a musical instrument, or just an instrument (i.e. a tool)? This leads to the whole topic of latency in user interaction, described for instance here. I tend to compare the right approach to a musical instrument, where a tight feedback loop forms between the player and the instrument. The instrument must respond in less than 10 ms to tighten the feedback loop, so that the player can feel the instrument as part of his own ‘body’, forget about the ‘mechanics’ and instead flow with the expressiveness of what he is interpreting or improvising (right picture credit here). Why not have such tools in visual analytics? Why do we need to wait for a response from the server if the same task can be done quite well on the edge? The mGL library for GPU-powered visualization on the web, or ImpactIN for iOS using the Apple Pencil, reflect this principle. We have real-time rendering; we need human-sense-time interaction, and the bloated abstractions of the current software stack do not help here despite advances in hardware – there is a nice write-up about the latency problem here. As a side note, there are types of computers with very low latency – check any synthesizer or digital instrument, where latency from user interaction must be very low; hence the left picture on that slide represents them (a combination of a MIDI pad and a guitar).
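Purely to make the feedback-loop numbers concrete, here is a minimal browser-side sketch of measuring the event-to-frame part of that loop (it ignores display latency, which adds more on top):

```typescript
// Sketch: record when the pointer event arrived, render in the next animation
// frame, and log how long the round trip took.
let lastInput: number | null = null;

window.addEventListener("pointermove", (e: PointerEvent) => {
  lastInput = e.timeStamp; // high-resolution timestamp of the input, same origin as performance.now()
  requestAnimationFrame(() => {
    if (lastInput !== null) {
      const latency = performance.now() - lastInput;
      // A "musical instrument" feel needs this to stay around 10 ms;
      // typical web stacks land well above that.
      console.log(`input → frame latency: ${latency.toFixed(1)} ms`);
      lastInput = null;
    }
  });
});
```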

Here is a short video of the Korg Monologue synth doing something that has been in use since the 70s; I consider this type of low-latency feedback loop, applied to new domains, a fascinating subject to explore. Notice the real-time filter modification.

slide 9,10: a nice chart from 2012 from britesnow.com on the cyclic nature of server vs. client processing. I stated there that innovation happens on the client (on the edge), since servers (clouds, mainframes) can always do anything and everything – exaggerated, and related to slide 7 described above. Workstations, PCs, smartphones (the first iPhone), AR/VR devices, wearables in general, etc. – it is always about efficiency in the space used. Interestingly, NVIDIA’s GPU Gems states something similar at the chip level.

slide 11: a chart of the GPU outperforming the CPU, put in the context of video resolution.

slide 12: the trickiest slide, ironically called “Find 10 differences”. On the left side is a program I wrote in 1993, in DOS; on the right, one I wrote using WebGL in 2016. Both were great achievements for their time. The right one does GPU-based filtering (or, in marketing speak, ‘in-memory’) with low user latency, so it redraws immediately as the user filters by pointing the mouse at the brush selector. The left one was created in the DOS era, when every graphics card had its own way of mode switching, and that app could squeeze the maximum out of the graphics card: 640×480 resolution with 256 colors – that was really something at the time. However, there is something wrong with trying to find 10 differences, because the two are basically so similar: both use a monitor, a keyboard/mouse, and the same kind of layout…
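For illustration, here is a minimal sketch of the GPU-side filtering idea – not the original mGL/WebGL code, just the general technique: the point data is uploaded once, the brush selection becomes a uniform, and each brush move is a single cheap redraw.

```typescript
// Sketch: points carry a data value; the fragment shader discards anything
// outside the brushed range, so re-filtering never touches the data again.
const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const gl = canvas.getContext("webgl")!;

const vs = `
  attribute vec2 aPos;     // point position in clip space
  attribute float aValue;  // attribute being filtered
  varying float vValue;
  void main() {
    vValue = aValue;
    gl_Position = vec4(aPos, 0.0, 1.0);
    gl_PointSize = 3.0;
  }`;
const fs = `
  precision mediump float;
  varying float vValue;
  uniform vec2 uRange;     // brush selection [min, max]
  void main() {
    if (vValue < uRange.x || vValue > uRange.y) discard; // filtered out on the GPU
    gl_FragColor = vec4(0.1, 0.6, 0.9, 1.0);
  }`;

function compile(type: number, src: string): WebGLShader {
  const sh = gl.createShader(type)!;
  gl.shaderSource(sh, src);
  gl.compileShader(sh);
  return sh;
}
const prog = gl.createProgram()!;
gl.attachShader(prog, compile(gl.VERTEX_SHADER, vs));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fs));
gl.linkProgram(prog);
gl.useProgram(prog);

// Random test points: x, y in clip space plus one value per point.
const count = 100000;
const data = new Float32Array(count * 3);
for (let i = 0; i < count; i++) {
  data[i * 3] = Math.random() * 2 - 1;
  data[i * 3 + 1] = Math.random() * 2 - 1;
  data[i * 3 + 2] = Math.random(); // the filtered attribute
}
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);

const aPos = gl.getAttribLocation(prog, "aPos");
const aValue = gl.getAttribLocation(prog, "aValue");
gl.enableVertexAttribArray(aPos);
gl.enableVertexAttribArray(aValue);
gl.vertexAttribPointer(aPos, 2, gl.FLOAT, false, 12, 0);
gl.vertexAttribPointer(aValue, 1, gl.FLOAT, false, 12, 8);

// Data stays on the GPU; every brush move only updates the uniform and redraws.
const uRange = gl.getUniformLocation(prog, "uRange");
function onBrush(min: number, max: number): void {
  gl.uniform2f(uRange, min, max);
  gl.clear(gl.COLOR_BUFFER_BIT);
  gl.drawArrays(gl.POINTS, 0, count);
}
onBrush(0.25, 0.75); // e.g. the brush selects the middle half of the value range
```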

slide 13: the last slide, titled “Find 1 difference”, is the answer to the dilemma from slide 12 – the AR experience: a new way of interaction, a new type of device for new workflows, visual analytics, exploration, etc. As one example of the many possibilities of AR, here is a nice video from HxGN Live 2017:

 

3D visual interactive analysis with myVR SDK

In recent weeks I have been working on a concept code-named ‘Impact IN’, where I could apply a few interesting ideas of modern, interactive 3D geospatial analysis. The core 3D functionality and the viewshed analysis are provided by the myVR SDK. While this concept is demonstrated on an iPad Pro with iOS, the myVR SDK is a truly multi-platform SDK, so it can run on any platform (Android, web, desktop – it even runs on the Daqri helmet!). I have used the Apple Pencil to navigate a directional viewshed and to drive a fly-through on the 3D map – it works like a joystick. Another concept shown here is the real-time transition of the analysis from 3D to 2D: while the user interacts with the viewshed on the 3D city model of London, it is at the same time reflected on the 2D map, creating a thematic map of the impacted areas – a classic GIS result that can undergo further analysis in a GIS system of choice, or better, go directly as input into the Smart M.App (e.g. into Studio or Grid Analysis). The following video is what attendees could see at HxGN Live 2017. [will continue next time]