Putting it together: hotspots, gestures and large objects

In this experiment we combine our distance-based hotspot detection with gestures to create an interactive experience.

Given our uneven results so far in creating a large NFT marker from an outdoor location, we will conduct this experiment with a Hiro marker at 5 m. Once we have successfully created an NFT marker from, say, a billboard, we will add a second AR demo using it.

Hiro marker with hotspots and gesture-based interactions

Swipe with one finger to move the model up and down, pinch to scale it, and swipe with three fingers to rotate it around its vertical axis. The demo includes a button in the bottom-right corner that initiates interactions on hotspots identified by proximity to the camera. At the moment this interaction is limited to registering the distance to the caravan model, the parent of the yellow and orange hotspots embedded in its side. The aim is for the yellow and orange hotspots themselves to react to the camera's proximity in world space, based on a local-to-world conversion of their positions relative to the parent.
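The local-to-world step can be sketched in plain JavaScript. This is a minimal illustration, not the demo's actual code: it assumes the caravan only rotates about its vertical (Y) axis, whereas in A-Frame you would normally let three.js do the full transform with `el.object3D.getWorldPosition()`. The hotspot names, offsets, and trigger radius below are hypothetical values.

```javascript
// Convert a hotspot's local offset to a world position, assuming the parent
// model only rotates about its vertical (Y) axis. A full engine transform
// would use the parent's 4x4 world matrix instead.
function localToWorld(parent, yawRadians, local) {
  const cos = Math.cos(yawRadians);
  const sin = Math.sin(yawRadians);
  return {
    x: parent.x + local.x * cos + local.z * sin,
    y: parent.y + local.y,
    z: parent.z - local.x * sin + local.z * cos,
  };
}

function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Return the closest hotspot within the trigger radius, or null if the
// camera is not near any of them.
function nearestHotspot(camera, parent, yaw, hotspots, radius) {
  let best = null;
  for (const h of hotspots) {
    const world = localToWorld(parent, yaw, h.offset);
    const d = distance(camera, world);
    if (d <= radius && (best === null || d < best.distance)) {
      best = { name: h.name, distance: d };
    }
  }
  return best;
}

// Hypothetical scene: caravan 5 m in front of the origin, two hotspots
// embedded in its side at local offsets.
const caravan = { x: 0, y: 0, z: -5 };
const hotspots = [
  { name: 'yellow', offset: { x: 1, y: 1, z: 0 } },
  { name: 'orange', offset: { x: -1, y: 1, z: 0 } },
];
const camera = { x: 1, y: 1, z: -4.5 };
console.log(nearestHotspot(camera, caravan, 0, hotspots, 2));
// → { name: 'yellow', distance: 0.5 }
```

Running this check on every frame (for example in an A-Frame component's `tick` handler) would let each hotspot react independently to the camera's world-space proximity, rather than only measuring the distance to the parent model.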