Try-on is an evolution of the shopping experience, seamlessly blending the physical and digital realms in everyday life. It is the technology that enables users to digitally explore and “try on” various products.
Through this technology, users can examine a product closely, assess its dimensions, understand its functionality, and engage with its other features. Augmented Reality plays a pivotal role in this process by overlaying digital objects onto the real world via the device’s camera.
I honestly believe that Augmented Reality (AR) will continue to be prevalent because of its innate ability to seamlessly integrate into our daily lives and surroundings. AR doesn’t demand costly and bulky equipment to function effectively, nor does it drastically alter our environment. Instead, it effortlessly merges with our existence, introducing a delightful and playful component to our interactions with the world around us.
Offering digital products as wearable items fosters informal engagement with our clients, encouraging them to spend more time on our brand’s website or social media platforms. It also deepens the emotional connection, evoking a sense of sentimentality as users catch a glimpse of themselves through the device’s camera.
Creating a digital twin poses a real challenge, particularly when adapting the intricacies of the 3D world to the limitations of mobile hardware. Phones lack the processing power of desktop GPUs, so detailed 3D objects must be rendered in real time within tight performance budgets, which makes the task even more demanding.
Designing a digital fabric is an even greater challenge than hard-surface modeling. Fabrics involve numerous aspects that must be considered when creating a digital twin, including irregularities in yarn shape and color, anisotropic reflectance, and transparency, among others. Some of these challenges are inherent to the nature of fabrics, while others stem from the difficulty of translating the desired fabric into 2D texture maps.
The process of tiling itself presents a challenge: the goal is typically to create the smallest repeating pattern that minimizes computational load while covering a large 3D surface. With fabrics, however, we are compelled to use larger tiles, because small tiles produce visible repetition artifacts that make the digital fabric twin look noticeably different from the original fabric, as the sketch below illustrates.
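To make the trade-off concrete, here is a minimal Python sketch (not part of the original workflow) that repeats a fabric tile across a large texture. The file names and sizes are hypothetical placeholders; the point is that a small tile repeats many times, so its irregularities reappear as a grid-like artifact, while a larger tile repeats less often and hides them better at the cost of a heavier texture.

```python
# Minimal sketch: repeat a small fabric tile across a large texture with NumPy/Pillow.
# File names and resolutions are hypothetical.
import numpy as np
from PIL import Image

def tile_texture(tile_path: str, out_size: tuple[int, int]) -> Image.Image:
    """Repeat a fabric tile until it covers out_size (width, height)."""
    tile = np.asarray(Image.open(tile_path).convert("RGB"))
    reps_y = -(-out_size[1] // tile.shape[0])  # ceiling division
    reps_x = -(-out_size[0] // tile.shape[1])
    big = np.tile(tile, (reps_y, reps_x, 1))   # repeat the pattern in both directions
    return Image.fromarray(big[: out_size[1], : out_size[0]])

# A 256 px tile repeated across a 4096 px texture recurs 16 times per axis, so every
# yarn irregularity reappears every 256 px. A 1024 px tile recurs only 4 times,
# which hides the repetition far better but costs more texture memory.
small = tile_texture("fabric_tile_256.png", (4096, 4096))   # noticeable repetition
large = tile_texture("fabric_tile_1024.png", (4096, 4096))  # subtler repetition
small.save("preview_small_tile.png")
large.save("preview_large_tile.png")
```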
To develop the fabrics within this AR lens, I utilized Seddi Textura, a 3D digital textile software that incorporates an AI engine. This software leverages genuine textile data and employs advanced machine-learning techniques to construct dependable 3D representations.
The initial creation of the 3D hat model was accomplished using Clo 3D. Afterward, I exported the 3D hat model to Rhino 3D, where I merged its individual parts into a single mesh.
The next step was creating the UV maps in Autodesk 3ds Max.
Creating a 3D object tailored for the gaming industry, especially one that must conform to the limitations of mobile devices, is a significant challenge because it requires multiple software programs, each specializing in a different stage of the development process.
With the UV maps in place, I could import the hat model into Substance Painter and apply the desired fabric to the garment pattern accurately. Seddi Textura simplified this step by letting me download the tiled fabric as an SBSAR file, making it convenient to bring the fabric into Substance. It is also possible to download the texture maps directly from the Textura profile library and use them in Lens Studio or Spark AR, but I personally prefer to keep control over how the textures are placed along the 3D surface.
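As a side note, when a texture set travels between this many tools, a quick consistency check saves time. The sketch below is an assumed helper, not part of the original pipeline; the map names follow common PBR conventions and the export folder is hypothetical.

```python
# Assumed helper: sanity-check a downloaded texture set before importing it
# into Substance Painter or an AR authoring tool. Map names are hypothetical.
from pathlib import Path
from PIL import Image

EXPECTED_MAPS = ["basecolor", "normal", "roughness", "opacity"]

def check_texture_set(folder: str) -> None:
    """Verify every expected map exists and that all maps share one resolution."""
    sizes = {}
    for name in EXPECTED_MAPS:
        matches = list(Path(folder).glob(f"*{name}*.png"))
        if not matches:
            print(f"missing map: {name}")
            continue
        with Image.open(matches[0]) as img:
            sizes[name] = img.size
    if not sizes:
        print(f"no maps found in {folder}")
    elif len(set(sizes.values())) > 1:
        print(f"resolution mismatch across maps: {sizes}")
    else:
        print(f"texture set OK, resolution {next(iter(sizes.values()))}")

check_texture_set("textura_export/denim")  # hypothetical export folder
```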
After uploading all the texture maps, I ran into a new technical issue: the combined file size of the four different fabric texture sets was too large to fit into a single AR lens.
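For reference, the sketch below shows the kind of budgeting loop one could script for this situation. It is an assumption, not my actual fix: the 8 MB budget, folder name, and halving strategy are hypothetical placeholders, and the real limit depends on the AR platform.

```python
# Assumed sketch: shrink a folder of texture maps until they fit a size budget.
# The budget, folder name, and strategy are hypothetical.
from pathlib import Path
from PIL import Image

BUDGET_BYTES = 8 * 1024 * 1024   # assumed lens size budget; check your platform's limit
TEXTURE_DIR = Path("textures")    # hypothetical folder of exported fabric maps

def total_size(folder: Path) -> int:
    """Sum the on-disk size of every PNG map in the folder."""
    return sum(p.stat().st_size for p in folder.glob("*.png"))

def downscale_all(folder: Path, factor: float) -> None:
    """Resize every map by `factor` and re-save it, trading detail for size."""
    for p in folder.glob("*.png"):
        img = Image.open(p)
        new_size = (max(1, int(img.width * factor)), max(1, int(img.height * factor)))
        img.resize(new_size, Image.LANCZOS).save(p, optimize=True)

# Halve the resolution repeatedly until the whole set fits the budget.
while total_size(TEXTURE_DIR) > BUDGET_BYTES:
    downscale_all(TEXTURE_DIR, 0.5)
print(f"Final texture payload: {total_size(TEXTURE_DIR) / 1024:.0f} KB")
```

Whether to drop resolution, switch to a compressed format, or reuse maps between fabrics depends on which detail the lens can afford to lose.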
I see the design process as a constant negotiation, a balancing act between reducing model complexity and preserving the final appearance. Striking that balance yields a faster, smoother user experience in the end product.