At Novel, we created a series of mixed reality experiments that explore the connection between virtual objects and our physical world.
Augmented reality lets us blend new worlds with our own. But aside from surface planes and shadows, our physical surroundings are mostly foreign to the experience. Shot and rendered in real-time, the following experiments explore how light, sound and motion from outside of our devices can create multi-dimensional AR experiences that challenge our perspective of "here" and "there."

Module 01

Light from non-objects
How can light inform the existence of the non-material? 
Light as an indicator
By definition, light is a natural agent that stimulates sight and makes things visible. By synchronizing DMX-controllable lights with AR objects, we can project the presence, speed, or proximity of virtual objects onto our physical surroundings.
Non-object velocity-triggered lighting
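The exact mapping we used between object motion and light output isn't documented here, but the idea can be sketched in a few lines: sample the virtual object's speed each frame and scale it to an 8-bit DMX dimmer value. The `max_speed` threshold below is a hypothetical tuning value, not one from the installation.

```python
def velocity_to_dmx(speed, max_speed=5.0):
    """Map a virtual object's speed (m/s) to an 8-bit DMX dimmer value (0-255).

    max_speed is a hypothetical calibration constant: any speed at or
    above it drives the fixture to full brightness.
    """
    # Clamp the normalized speed to [0, 1], then scale to the DMX byte range.
    t = max(0.0, min(speed / max_speed, 1.0))
    return round(t * 255)
```

In practice a value like this would be written to one channel of a DMX512 frame each tick, so a fast-moving non-object visibly brightens the room around it.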
Impossible projection
With AR, impossible objects can exist among us. In this example, the projected shadow of an AR Tesseract illuminates the ground as a 3D cube.

Module 02

The dichotomy of sound
How can sound inform or enhance an AR experience?
Sound from non-objects
It's easy to recognize when an environmental sound is coming from a speaker. Here, sound created with a drum beater and synchronized to digital objects produces spatial audio cues of virtual objects hitting a physical surface.
Non-object collisions trigger spatial sound.
Non-objects from sound
In this experiment, we used MIDI to visualize the pressed keys of a piano in real time. Enlarging those objects to room or venue scale allows a user to view the keys up close and from their own perspective.
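To place a 3D key object for each note-on event, the incoming MIDI note number has to be mapped to a position on the keyboard. A minimal sketch of that lookup, assuming a standard 88-key piano starting at A0 (MIDI note 21):

```python
def note_to_key(note):
    """Map a MIDI note number to (octave, pitch_class, is_black).

    Assumes a standard 88-key piano, A0 (MIDI 21) through C8 (MIDI 108).
    The result is enough to position and color a key object in 3D.
    """
    BLACK = {1, 3, 6, 8, 10}     # pitch classes of the black keys (C#, D#, F#, G#, A#)
    if not 21 <= note <= 108:
        raise ValueError("note outside the 88-key range")
    pitch_class = note % 12      # 0 = C, 1 = C#, ... 11 = B
    octave = note // 12 - 1      # MIDI convention: middle C (60) is C4
    return octave, pitch_class, pitch_class in BLACK
```

A visualizer would listen for note-on messages, call this per note, and spawn or scale the corresponding key mesh at that slot.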

Module 03

Coordinated motion
How does motion translate from the physical to the digital?
Non-object inertia
Game engines have replicated our scientific laws to create realistic simulations of the physical world. In a modern twist on the classic "Newton's Cradle," we tested the boundaries of Newton's first law to see how closely we could translate movement across the two worlds.
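The physics a game engine applies in a cradle-style collision can be stated exactly: a 1-D elastic collision conserves both momentum and kinetic energy, and with equal masses the velocities simply swap, which is why one ball stops and the next one launches. A sketch of that textbook rule (not the engine's solver):

```python
def elastic_collision(m1, v1, m2, v2):
    """Post-collision velocities for a 1-D perfectly elastic collision.

    Conserves momentum and kinetic energy. With equal masses the
    velocities swap, reproducing the Newton's cradle behaviour.
    """
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2
```

Comparing a real cradle against these ideal values shows where the physical rig diverges (friction, imperfect elasticity) from the engine's simulation.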
Precision calibration
Robots have the unique ability to "touch" virtual objects based on their 3D spatial awareness. To test the boundaries of this interaction, we built a Unity-based real-time controller for our robot arm, creating a precise connection between virtual and physical objects in motion.
The physical touching the virtual

Science AF

ECD - Jeff Linnell
Creative Director - Jeremy Stewart
Design Technologist - Jonathan Proto
Design Technologist - Brandon Kruysman
Executive Producer - Julia Gottlieb
Robotics - Eli Reekmans
Video Production - Stebs, Natalie Rhold
Special Thanks - Mimic team at Autodesk: Evan Atherton, Nick Cote, Jason Cuenco
Produced at Novel
A brand that speaks robot.