The ninth album by Maxmilián Šumbera, titled Letní imprese, is now available on Audiomack. The pieces Duo for Oboe and Piano No. 1 “Ametystové červánky” and Rozkvetlé louky placed in the gold band of the XIII. Múzy I. Hurníka 2025 competition. In addition, the first of them also received a special prize in the instrumental music category.
Disclosure: AI-assisted content — consult with an organic developer.
When building low-level libraries for the JVM — especially those that interact with JNI, rendering engines, or MethodHandles — the exact bytecode emitted matters. Recently I hit a limitation in Kotlin that reminded me why the JVM world still needs Java for certain things.
Where Kotlin Emits Different Bytecode than Java
Here are the main areas where Kotlin’s generated bytecode diverges from Java’s, and why that matters.
| Area | Java | Kotlin | Takeaway |
| --- | --- | --- | --- |
| Signature-polymorphic calls | Emits the correct signature for MethodHandle.invokeExact | Falls back to (Object[])Object, causing mismatches | Keep these calls in Java (see the sketch below) |
| Default parameters | No defaults → use overloads | Generates synthetic $default methods with a bitmask | Avoid defaults in public APIs for Java clients |
| Companion objects / @JvmStatic | True static methods | Methods live in $Companion unless annotated | Use @JvmStatic or plain Java for static APIs |
| Internal visibility | Package-private supported | internal compiles to public + metadata | Don't rely on internal for cross-language encapsulation |
| SAM interfaces | Any functional interface = lambda | Only fun interface supports SAM; lambdas may create synthetic classes | |
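To make the first row concrete, here is a minimal Java sketch of a signature-polymorphic call. The class and method names (InvokeExactDemo, render) are my own placeholders, not from the original post; the point is that javac compiles the invokeExact call site with the exact (int,int)boolean descriptor derived from the MethodType, the cast, and the argument types.

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public final class InvokeExactDemo {

    // A static method with the exact shape the handle will target: (int, int) -> boolean.
    public static boolean render(int width, int height) {
        return width > 0 && height > 0;
    }

    public static void main(String[] args) throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        MethodType type = MethodType.methodType(boolean.class, int.class, int.class);
        MethodHandle handle = lookup.findStatic(InvokeExactDemo.class, "render", type);

        // javac emits this call site with the exact descriptor (II)Z, so invokeExact
        // matches the handle's type with no boxing and no varargs adaptation.
        boolean ok = (boolean) handle.invokeExact(800, 600);
        System.out.println(ok);
    }
}
```

As the table notes, the equivalent Kotlin call has tended to be compiled against the generic (Object[])Object shape, which invokeExact rejects at runtime with a WrongMethodTypeException.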
This wasn’t an isolated issue. Kotlin differs from Java in other ways that make it risky for core interop code:
| Area | Java | Kotlin limitation |
| --- | --- | --- |
| JNI declarations | static native boolean render(int, int) | Needs @JvmStatic in a companion object; generates synthetic names (see the sketch below) |
| JNI header generation | javac -h works directly | No header generation for Kotlin sources |
| Checked exceptions | Enforced at compile time | Kotlin ignores them (all unchecked) |
| Raw types | Allowed (List) | Always requires generics (List<*>) |
| Wildcards | ? super, ? extends supported | Only in / out; cannot express everything |
| Default params | Not supported (overloads instead) | Compiles to synthetic $default methods |
| Static members | static keyword | Requires @JvmStatic in an object/companion |
| Suspend functions | N/A | Compiled to Continuation-based state machines, awkward for Java callers |
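To illustrate the first two rows, here is a minimal sketch of what the Java side of a JNI bridge can look like. The package, class, and library names are placeholders of mine, not from the original post.

```java
package dev.example.bridge; // hypothetical package name

public final class NativeRenderer {

    static {
        // Placeholder library name; expects librenderer.so / renderer.dll on the library path.
        System.loadLibrary("renderer");
    }

    private NativeRenderer() {}

    // Maps one-to-one onto the JNI symbol:
    // JNIEXPORT jboolean JNICALL Java_dev_example_bridge_NativeRenderer_render(JNIEnv*, jclass, jint, jint)
    public static native boolean render(int width, int height);
}
```

Running javac -h over this file produces the matching C header directly. As the table notes, there is no equivalent built-in step for Kotlin sources, and a companion-object declaration without @JvmStatic moves the method onto the synthetic Companion class, changing the expected native symbol name.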
Why This Matters for Library Code
A low-level library often deals with:
- JNI ↔ JVM bridges
- OpenGL or native rendering loops
- Performance-critical calls that must inline
- Reflection and MethodHandles
All of these require predictable bytecode and signatures. Kotlin often inserts synthetic classes ($Companion, $DefaultImpls, $WhenMappings) or adapts signatures in ways Java clients (and JNI) do not expect.
Why Keeping the Library Core in Java Makes Sense
| Benefit | Why It Matters |
| --- | --- |
| One language to maintain | Single codebase, easier contributor onboarding, faster builds |
| Interop for everyone | Java APIs work in all JVM languages; Kotlin clients lose nothing; Java clients stay safe from Kotlin-only features |
| JNI friendliness | Direct mapping of Java types to JNI (int → jint, boolean → jboolean); javac -h header generation works; avoids $Companion/$DefaultImpls surprises |
| Bytecode predictability | No synthetic baggage ($Companion, $default, $WhenMappings); avoids mismatched signatures; the JIT optimizes exactly what was written (see the sketch below) |
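As a small illustration of the bytecode-predictability row, the usual Java substitute for default parameters is a pair of explicit overloads. The class and parameter names below are hypothetical; the point is that both entry points get fixed descriptors in the class file and no synthetic $default bridge is generated.

```java
public final class SurfaceConfig {

    public final int width;
    public final int height;
    public final int samples;

    private SurfaceConfig(int width, int height, int samples) {
        this.width = width;
        this.height = height;
        this.samples = samples;
    }

    // Overloads stand in for default parameter values: two stable, explicit
    // entry points instead of a synthetic $default method with a bitmask.
    public static SurfaceConfig of(int width, int height) {
        return of(width, height, 1); // the "default" of one sample lives in ordinary code
    }

    public static SurfaceConfig of(int width, int height, int samples) {
        return new SurfaceConfig(width, height, samples);
    }
}
```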
Strategy: Java Core + Optional Kotlin API
The pattern I adopted (and which many frameworks use):
Core in Java:
- Predictable bytecode
- JNI header generation
- Works with MethodHandle, VarHandle, Unsafe (see the sketch below)
- Safe for both Java and Kotlin clients

Optional Kotlin extensions (-ktx):
- Extension functions for ergonomics
- Coroutines (suspend wrappers)
- Null-safety
- DSLs for configuration
This is the same model Android Jetpack follows: androidx.core in Java, androidx.core-ktx in Kotlin.
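To make the split concrete, here is a minimal, hypothetical sketch of a core-module class; the package and class names are placeholders. It is plain Java, uses VarHandle as mentioned in the list above, and exposes nothing Kotlin-specific, so a separate -ktx artifact could later layer extension functions or suspend wrappers on top without changing this bytecode.

```java
package dev.example.render; // hypothetical package

import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;

// Core module in plain Java: a lock-free frame counter built on VarHandle.
// Java and Kotlin clients call it identically; no $Companion or $default members are emitted.
public final class FrameCounter {

    private volatile long frames;

    private static final VarHandle FRAMES;

    static {
        try {
            FRAMES = MethodHandles.lookup()
                    .findVarHandle(FrameCounter.class, "frames", long.class);
        } catch (ReflectiveOperationException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    /** Atomically increments the counter and returns the new value. */
    public long increment() {
        return (long) FRAMES.getAndAdd(this, 1L) + 1L;
    }

    /** Returns the current value (volatile read). */
    public long current() {
        return frames;
    }
}
```

A -ktx module would then be the place for ergonomic extras (extension functions, coroutine wrappers, DSLs), keeping them out of the bytecode that Java and JNI clients depend on.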
Takeaways
| Point | Why |
| --- | --- |
| MethodHandle support | The Java compiler emits exact signatures ((int,int)boolean); Kotlin falls back to (Object[])Object, causing runtime issues |
Almost 30 years after the ‘plain’ piano version, here is an orchestral one. Surprisingly rich, and expanding on the original theme. I especially like the middle and the final, epic-sounding sections. The orchestration of the original piece was created, and fittingly extended, by Maxmilián Šumbera.
Night runner, my favourite of the tracks released so far! I love the uncommon accents forming a distinct rhythm, interleaving with a faster pulse — like a heartbeat echoing the rhythm of feet hitting the ground in that first minute.
This is another nice half-frame camera — the Olympus PEN-D. It has an uncoupled meter and a smart way of setting aperture and shutter speed together. It works great and produces nice photos. Compared to the Canon Demi EE28, it’s a bit bulkier, mostly due to the faster Zuiko 1.9 lens.
It features a Copal leaf shutter, fully mechanical, so no batteries are needed. Focusing is manual via a distance scale. Zone focusing works well once you get a feel for it.
I gave a presentation on where GIS might evolve 20 years from now as part of the GIS Ostrava 2025 conference on 5 March 2025. It was great to see colleagues again, to make eye contact with the audience rather than collect virtual applause. Nobody was physically holding up a thumbs-up or a red heart (in that case I would call the emergency services); instead there were real spoken (I mean real, sound-wave-based) comments, real talk, and smiles. That was the main topic of the talk – Spatial Interactive – with people, tech, discoveries. Step out of the ‘glass-illusion’ trap.
Spatial Interactive
Stanislav Sumbera, GIS Vision 2024, 5.3.2025, GIS Ostrava 2025
What happens here is more important than what happens now.
Space is naturally interactive, enabling collaboration and sharing.
The computer is not behind a 2D glass screen but understands 3D space and interactions within it.
People learn through observation, collaboration, and play.
NPCs have become “thinking machines” (and are we, on the other side, turning into NPCs ourselves, as in Jumanji 2?)
Image from Jumanji 2, driver – Mason Pike?
The Chinese Room paradox – an English speaker perfectly assembles answers in Chinese by following instructions, without understanding the Chinese language or the meaning of its symbols.
AI cannot create true originality but excels at combining and compiling existing inputs – a “super plagiarist” or a “super puzzle solver”?
Might replace a significant amount of human (intellectual + routine) labor – in GIS (georeferencing, recognition/classification), programming/syntax, and more
“Hard work for machines, thinking for people” (Tomáš Baťa) is evolving into “(Pre)thinking* for machines, creativity/ideas for people” (* in Czech: pre-mýšlení)
AI model marketplace – grow (cultivate) your unique “thought twin” that integrates into an open AI network.
Taylor, Fayol, Ford, and Baťa put an end to the old type of entrepreneur for good. They argued that profit does not depend on the numerically lowest wages, but on the highest and most efficient work performance, during which the worker works as if he were working for himself.
“Baťa’s factories produced and supplied most of the construction materials themselves. They had their own brickyards, carpentry, joinery, and locksmith workshops, which supplied standardized interior fittings to all homes, produced neon tubes, flooring, in short, almost everything they needed. And if there was something not included in their production program, they purchased it directly from manufacturers (excluding intermediaries).”
20.6.’25: New 3D version; it includes terrain relief and building outlines with heights, and uses graphics-card acceleration. Try it out in test operation at https://ikatastr.cz/3d
Beskydy
23.2.’25: Improved resolution of the base maps for ‘retina’ screens. Minor bug fixes.
17.2.’25: The Leaflet map component updated to the latest version, 1.9.4.
27.1.’25: Added a layer marking properties that have a price record in the cadastre of real estate. Try it, for example, here. Note that the actual price is disclosed only on request at the cadastral office. See the price list here, items 4021 and 4022.
We have various terms for AR – like Mixed Reality, Metaverse, Spatial-ware, XR. Among these, Apple’s term “Spatial Computing” stands out for its emphasis on integrating physical space and digital interactivity. This resonates with me, as the concept of “Spatial” reflects how we model and interact with space in meaningful ways. Years ago, I coined a new term, “SpatialIn”—an open-ended label where “In” simply means Spatial is “in.” Later, with advancements like ARKit, I extended this idea into “Spatial Interactive,” emphasizing the interactive potential of the space around us behaving like a dynamic canvas. Vision Pro aligns perfectly with this vision. After testing Vision Pro, here are my key observations:
Hands-Free Interaction
Vision Pro’s hands-free interaction feels intuitive. Manipulating virtual objects with gestures or gaze eliminates barriers and enhances usability. Fluent hand movements remind me of my Tai-Chi classes from years ago.
From VR to True Mixed Reality Immersion
Unlike VR, Vision Pro allows safer navigation in real spaces while engaging with virtual elements. It maintains spatial awareness and visual contact with reality, making it both practical and immersive.
Unmatched Persistence
Vision Pro’s capability to retain virtual object placement across sessions is impressive. This feature is critical for practical applications such as architectural design, where models need to stay precisely where placed for accurate spatial referencing, or in education, where persistent virtual setups can create consistent and engaging learning environments. I found a model I placed earlier in the day still standing on the lower floor of the building—exactly where I left it. This happens without any explicit relocalization notification to the user, as the virtual model sticks to the physical space even across different floors. That persistence is a must for advanced spatial-computing design, since virtual space has to keep its integrity just as physical space does.
Feeling Rendering on My Hands – sort of
The wide field of view and detailed lighting ensure a natural integration of digital and physical environments. Soft shadows and consistency make virtual elements feel tangible. Fidelity is so high that it creates the illusion of tactile sensation when interacting with virtual models. While purely subjective, this visual illusion convincingly engages my sense of touch, making the experience feel remarkably real to me. This visual-feel integration adds a layer of immersion that goes beyond sight and sound, engaging the sense of presence in a way that feels almost instinctive.
As there is no 3D map from Apple to test on the device by default, I had to convert a glTF model to USDZ and send it to the device for Quick Look to get a sense of how a 3D city would look there. Here it goes: