Perceptoscope revives the nostalgic concept of public viewfinders by adding a layer of augmented reality.
It started as a hacker project, received a grant from the National Science Foundation, and now Ben is working with the Los Angeles Natural History Museum to bring a few Perceptoscopes to the La Brea Tar Pits. After this first deployment, the goal is to bring the Perceptoscope to more museums and national parks.
We are working with Ben on a few key software components for the Perceptoscope.
First, we are building a Software Development Kit (SDK) that lets developers easily create experiences for the Perceptoscope. We are creating a Unity SDK and an open-source WebXR SDK built on A-Frame and BabylonJS. Each SDK will ship with dedicated components and a series of documented sample scenes covering animations, timelines, shaders, 3D video, and spatial meshes. We are also exploring a web-based user interface for composing scenes in 3D, a bit like the ThreeJS Editor.
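To give a flavor of what a sample scene might contain, here is a minimal sketch of a timeline helper of the kind such scenes often rely on. This is purely illustrative (the function name and data shape are our own assumptions, not the SDK's API): it linearly interpolates a property between keyframes at a given time.

```javascript
// Hypothetical timeline helper (not the actual SDK API): linearly
// interpolates a scalar property between sorted keyframes at time t.
function sampleTimeline(keyframes, t) {
  // keyframes: array of { time, value }, sorted by time.
  if (t <= keyframes[0].time) return keyframes[0].value;
  const last = keyframes[keyframes.length - 1];
  if (t >= last.time) return last.value;
  for (let i = 0; i < keyframes.length - 1; i++) {
    const a = keyframes[i];
    const b = keyframes[i + 1];
    if (t >= a.time && t <= b.time) {
      // Normalized position between the two keyframes.
      const u = (t - a.time) / (b.time - a.time);
      return a.value + u * (b.value - a.value);
    }
  }
}

// Example: fade an entity's opacity from 0 to 1 over two seconds.
const fade = [{ time: 0, value: 0 }, { time: 2, value: 1 }];
sampleTimeline(fade, 1); // → 0.5
```

In a real A-Frame or BabylonJS scene the same idea would drive entity properties from the render loop; the sketch only shows the interpolation step.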
Second, we are working on a solution to give the Perceptoscope static and dynamic occlusion with the real world. For this, we are currently experimenting with a range of depth cameras, including the ZED Mini, the Intel RealSense 425, the new Occipital Core, and the latest Kinect.
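The core idea behind depth-based occlusion can be sketched in a few lines. This is a conceptual illustration under our own assumptions (in practice this comparison happens per-fragment on the GPU, not in JavaScript): a virtual pixel is hidden whenever the depth camera reports real-world geometry closer to the viewer at that pixel.

```javascript
// Conceptual per-pixel occlusion test (an illustration, not the actual
// pipeline): hide a virtual pixel when the sensed real-world surface is
// closer to the viewer than the virtual geometry at that pixel.
function occludeVirtualPixels(realDepth, virtualDepth, virtualColor) {
  // realDepth / virtualDepth: per-pixel depth in meters (0 = no depth data).
  // Returns virtual colors with occluded pixels replaced by null (transparent).
  return virtualColor.map((color, i) =>
    realDepth[i] > 0 && realDepth[i] < virtualDepth[i] ? null : color
  );
}

// A visitor's hand at 1.2 m occludes virtual content rendered at 3 m,
// while the background at 5 m does not.
occludeVirtualPixels([1.2, 5.0], [3.0, 3.0], ['cat', 'cat']);
// → [null, 'cat']
```

Dynamic occlusion is the hard part: the depth feed must stay aligned and low-latency as people and objects move, which is why we are comparing how each sensor behaves in practice.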
The open-source nature and maker origins of the project make it unique to us. The Perceptoscope comes out of Los Angeles's fabric of innovation, with values that matter to our team. It's exciting to support Ben and Adam in bringing the Perceptoscope to the world.