Speaking about the update, Artsiom Ablavatski and Ivan Grishchenko, Research Engineers at Google AI, described the ML pipeline behind the Selfie AR effects in a blog post: "To make all this possible, we employ machine learning (ML) to infer approximate 3D surface geometry to enable visual effects." Google also stated that it is working on improving the accuracy and robustness of the new AR support by refining these processes. It should be noted that Facebook and Instagram already offer AR filters for their Stories, which makes YouTube the last of the three to receive AR filter support.
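The article does not name the underlying library, but Google later open-sourced its face geometry technology as MediaPipe Face Mesh. Purely as an illustration of what "inferring approximate 3D surface geometry" from a single selfie camera can look like in practice (not a description of YouTube's internal pipeline), here is a minimal Python sketch using that public API; the image path is a placeholder:

```python
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

# Placeholder path: substitute any selfie-style photo.
image = cv2.imread("selfie.jpg")
if image is None:
    raise SystemExit("Could not read selfie.jpg")

with mp_face_mesh.FaceMesh(
        static_image_mode=True,
        max_num_faces=1,
        min_detection_confidence=0.5) as face_mesh:
    # MediaPipe expects RGB input; OpenCV loads images as BGR.
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark
    # Each landmark carries normalised x, y plus a relative depth z,
    # forming an approximate 3D surface mesh of the face (468 vertices)
    # that AR effects can be anchored to.
    print(len(landmarks), "mesh vertices")
    nose = landmarks[1]  # index 1 is roughly the tip of the nose
    print(f"nose tip: x={nose.x:.3f} y={nose.y:.3f} z={nose.z:.3f}")
else:
    print("No face detected")
```

This runs on a single RGB frame with no depth sensor, which is the property the blog post highlights.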
Google’s blog post further states, “That way we can grow our dataset to increasingly challenging cases, such as grimaces, oblique angle and occlusions. Dataset augmentation techniques also expanded the available ground truth data, developing model resilience to artefacts like camera imperfections or extreme lighting conditions.” (A generic illustration of this kind of augmentation appears at the end of this article.)

Google is not bringing AR only to YouTube; it is also adding an AR-backed feature to Google Maps that lets users view navigation cues overlaid on the real world through the camera. Last month, Google stated that it is testing the use of AR in Google Maps. Drawing on Street View data, Maps would use the camera to recognise the surroundings and add AR signs such as arrows, pointers and street names to the live image. In short, Google Maps will make it easier for users to find their way by identifying the right street and place and guiding them in the right direction. AR for Google Maps is still in beta and will roll out to users in the coming months, while the AR filters on YouTube Stories will go live in the coming days.

“We are excited to share this new technology with creators, users and developers alike. In the future we plan to broaden this technology to more Google products,” Ablavatski and Grishchenko noted.
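The blog quote above alludes to augmenting training data to cope with camera artefacts and extreme lighting, without describing Google's actual pipeline. As a rough, generic sketch of the idea (the transforms and parameter ranges below are illustrative assumptions, not Google's implementation), this Python snippet applies random brightness/contrast jitter and sensor-style noise to a training image:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Randomly distort a float32 RGB image in [0, 1].

    Brightness/contrast jitter loosely mimics extreme lighting;
    additive Gaussian noise loosely mimics camera imperfections.
    """
    out = image.copy()
    # Random brightness and contrast (extreme lighting conditions).
    brightness = rng.uniform(-0.3, 0.3)
    contrast = rng.uniform(0.6, 1.4)
    out = np.clip((out - 0.5) * contrast + 0.5 + brightness, 0.0, 1.0)
    # Additive Gaussian noise (camera/sensor imperfections).
    out = np.clip(out + rng.normal(0.0, 0.02, out.shape), 0.0, 1.0)
    return out.astype(np.float32)

# Example: augment a synthetic grey image standing in for a selfie frame.
dummy = np.full((256, 256, 3), 0.5, dtype=np.float32)
augmented = augment(dummy)
print(augmented.shape, float(augmented.min()), float(augmented.max()))
```

Because the same face-landmark ground truth stays valid under purely photometric changes like these, such augmentation expands a dataset without any additional annotation work.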