4/1/2023

Snap Lens Studio

All Lens creation starts with Lens Studio, the company's AR creation tool designed for artists and developers to build Lenses for Snapchatters worldwide. Lenses are AR experiences that change and transform the way you look and how you experience the world around you. They're an unparalleled way to use augmented reality and provide a playground for creators, designers, 3D professionals, developers, and end users alike. Lens makers can create "Face Lenses" that use a device's front camera and "World Lenses" for rear-camera experiences.

With the help of external APIs provided by companies like iTranslate, AR developers can create even more immersive, powerful, and helpful Lenses that otherwise couldn't be built or would take too much effort and time. Tools like these let AR developers build more engaging, immersive, and accessible experiences.

How iTranslate's translation API is used in Lens Studio

Lens Studio allows the use of data obtained from external APIs, such as weather, stock market information, or, as in our case, a translation service, via the "Remote Service Module." Developers can then use the data from these APIs in new Lenses. More precisely, the module provides a scripting API that calls the endpoints of iTranslate's translation API to do two things: translate Text objects and translate Text3D objects.

In both use cases, developers use a provided script of the Remote Service Module that translates Text or Text3D objects. To translate simple Text objects, you attach the script to a Scene object that includes a Text component and then select the desired source and target language. For 3D objects, you use the same script, attach a Scene object that contains a Text3D component, and again select the source and target language.

Lens Studio also provides example scripts from which developers can learn to extend the given scripts to even more use cases.
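To make the translation workflow concrete, here is a minimal, self-contained sketch of what such a script might do: build a translation request, hand it to a remote service, and apply the result to a text-bearing object. This is illustrative only. The request shape, the helper names (`buildTranslateRequest`, `parseTranslateResponse`, `applyTranslation`), and the mocked remote service are assumptions for the sketch, not the actual Lens Studio Remote Service Module or iTranslate API.

```javascript
// Hypothetical sketch of a Lens Studio-style translation flow.
// All names and payload shapes here are assumptions, not the real APIs.

// Build the JSON body for a translation request.
function buildTranslateRequest(text, sourceLang, targetLang) {
  return {
    source: { dialect: sourceLang, text: text },
    target: { dialect: targetLang }
  };
}

// Pull the translated string out of a response payload.
function parseTranslateResponse(response) {
  return response.target.text;
}

// Apply a translation to a text-bearing object. In Lens Studio terms this
// would be the same for a Text or a Text3D component, since both carry a
// text value; here it is just a plain object with a `text` property.
function applyTranslation(textComponent, translatedText) {
  textComponent.text = translatedText;
}

// Usage with a mocked remote service (no network involved).
const mockRemoteService = function (requestBody) {
  // Pretend the service translated English to German.
  return { target: { dialect: requestBody.target.dialect, text: "Hallo Welt" } };
};

const textComponent = { text: "Hello world" };
const request = buildTranslateRequest(textComponent.text, "en", "de");
const response = mockRemoteService(request);
applyTranslation(textComponent, parseTranslateResponse(response));
console.log(textComponent.text); // "Hallo Welt"
```

In a real Lens, the mocked service would be replaced by the Remote Service Module's network call, and the selected source and target languages would come from the script's configuration in the Inspector rather than hard-coded strings.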
As officially announced at "Lens Fest," the annual celebration of Snap's global AR community, iTranslate teamed up with Snap Inc. "The technology is here, the potential is huge and the journey is really only just getting started."

The latest product Snap is rolling out to developers via its Lens Studio is Ray Tracing. The technical capability enhances the realism of AR by reflecting light on digital objects, which is especially important when it comes to jewelry and other high-price-point items. Tiffany premiered the technology last year at the LVMH brand's exhibition at London's Saatchi Gallery, where visitors were able to virtually try on its famous 128.54-carat Tiffany Diamond.

While AR projects prior to Covid were more focused on creativity than product, Perez maintains that "the pandemic forced brands to take products outside the physical store and now what we're really seeing is brands exploring how they can merge the two - combining the art and creativity of the brand, and the science of sales." Going forward, the focus is "elevating the shopping experience," helping people discover and try products prior to purchase and "allowing brands to be more entertaining and immersive with their offering." "Merging the playfulness and the utility will be something to look for in the upcoming months and years," he says.