Merging relevant digital information with one's physical environment is one of the most impactful and valuable functions of AR. By communicating complex ideas and concepts visually, in immediate context with the physical world, viewers can absorb far more information about the topic at hand. One example of this is a scaled-down architectural site model displayed at Mobile World Congress 2019.
Sprint and the Peachtree Corners Innovation Lab have partnered to create a real-world smart-city infrastructure powered by 5G connectivity, enabling companies to test autonomous vehicles and drone delivery, among other emerging technologies such as cloud AI. The testing ground is a 1.5-mile strip in Peachtree Corners, in the Atlanta, Georgia area.
In collaboration with architectural model makers The Model Shop, AfterNow created an interactive augmented reality experience superimposed on a scaled-down physical model of the 1.5-mile strip (sized 8' x 3') to highlight various components of the smart-city infrastructure. With augmented reality, users could see exactly how the technologies interact with one another in the context of the physical model of the site.
Some examples of what was shown in AR include drones taking off from a home base and delivering packages to buildings on the site, and digital autonomous vehicles driving on the physical test tracks with signals radiating from them. AR also made it possible to visualize 5G signals being distributed across the entire site.
Overlaying virtual objects on a real object with a high degree of accuracy is not an easy task, and the complexity grows with the size of the object - such as this architectural model. One existing method uses a 1:1 3D model that mirrors the real object. Another approximates where the real object is in space, then places virtual objects relative to it. Neither method was ideal for our project: we did not have an accurate 3D representation of the physical model, and approximating the real object's pose produced inaccurate tracking results.
We had to get creative with our solution, so we implemented a tracking method recently introduced as part of ARKit (Apple's augmented reality development platform): object detection. In short, users scan any object with a compatible iOS device, then use that scan to recognize the real object during the app experience. This meant we did not need to generate a 1:1 3D model, yet could still achieve the accuracy of true object tracking.
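For readers unfamiliar with this ARKit feature, the flow above can be sketched roughly as follows. This is a minimal illustration, not our production code; it assumes the scans were exported as `.arobject` files and bundled in an AR resource group we'll call "ScannedModels" (a hypothetical name).

```swift
import ARKit

// Run a world-tracking session that watches for previously scanned objects.
// "ScannedModels" is an assumed AR resource group name containing the
// .arobject files produced by Apple's object-scanning workflow.
func startObjectDetection(on sceneView: ARSCNView) {
    guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "ScannedModels", bundle: nil) else {
        fatalError("Missing the expected AR resource group")
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = referenceObjects
    sceneView.session.run(configuration)
}

// When ARKit recognizes a scanned object, it adds an ARObjectAnchor.
// Virtual content attached relative to that anchor stays registered
// to the physical model without needing a 1:1 digital replica.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARObjectAnchor else { return }
    // Attach virtual overlays (drones, vehicles, signal effects) to `node` here.
}
```

Since the scan captures the object's real-world geometry, ARKit resolves both the position and orientation of the physical model, which is what gives this approach its tracking accuracy.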
Although this method worked well for small and medium-sized objects, we ran into challenges producing accurate tracking for our large model. To address this, we devised a way to "average out" the tracking accuracy over time: we scanned the model multiple times, both as a whole and in smaller parts, then used these scans for continuous detection and positional adjustment throughout the session. As a result, tracking accuracy improved the longer the app was used.
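The "averaging out" idea can be illustrated with a simple smoothing filter. The sketch below is a hypothetical simplification of the approach, assuming each new detection reports an observed position for the model's origin: rather than snapping to every fresh detection, the estimate is nudged toward it, so jitter from any single detection is damped and the estimate stabilizes over repeated detections.

```swift
// Hypothetical running-average filter for repeated object detections.
// Each new observation nudges the tracked estimate toward the observed
// position instead of replacing it outright.
struct PositionFilter {
    var estimate: [Float]? = nil  // x, y, z of the model's tracked origin
    let smoothing: Float          // 0 = ignore new data, 1 = snap to it

    mutating func update(with observed: [Float]) {
        guard let current = estimate else {
            estimate = observed   // first detection seeds the estimate
            return
        }
        // Exponential moving average: estimate += smoothing * (observed - estimate)
        estimate = zip(current, observed).map { pair in
            pair.0 + smoothing * (pair.1 - pair.0)
        }
    }
}

var filter = PositionFilter(smoothing: 0.25)
filter.update(with: [0, 0, 0])  // first detection seeds the estimate
filter.update(with: [1, 0, 0])  // later detection pulls it partway over
// filter.estimate is now [0.25, 0.0, 0.0]
```

In the real project, the scans of the whole model and of its smaller parts each contributed detections, so corrections accumulated from multiple sources as the user moved around the table.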
In the end, we were able to implement a tracking method that required minimal setup and produced high accuracy for a large-scale physical object.
This project attracted the attention of attendees, prospects, and executives at Mobile World Congress 2019 who were vital to the success of Peachtree Corners and Sprint's vision for 5G integration. Using AR to visualize the complex workings of a full-scale smart city in its physical context let them engage and immerse their key audience in a simple yet powerful way.