Meta has filed a patent application for technology that could equip the Facebook parent company's wearable mixed reality headsets with functionality inspired by a specific Apple Vision Pro feature. Apple's first spatial computer has an external display that can indicate what the wearer is doing in immersive mode, or show a version of their facial expressions, using sensors inside the headset. Meta could deliver similar functionality using its avatars, according to the patent document.
An application titled ‘Embedded Sensors in Immersive Reality Headsets to Enable Social Presence’ was recently filed with the European Patent Office (EPO) by Meta Platforms (via Patently Apple). The patent application was also published by the US Patent and Trademark Office (USPTO) two months ago.
In its patent application filed with both offices, Meta describes the use of sensors (including ones used for electrocardiograms and electroencephalograms) placed inside a mixed reality headset that can observe a user's facial expressions in real time and drive a virtual version of the user (which the company calls an avatar) based on those expressions, displayed elsewhere.
The document also contains a flowchart (Figure 4) that describes the entire process. When a person wears the headset, a sensor located on the inside tracks movements in their facial muscles. The system then determines the user's expression by mapping it to the movement of those muscles; 20 of them are displayed in another diagram (Figure 3), along with two anatomical drawings.
After the device has determined the user’s current facial expression, it will automatically “adjust” the avatar for the user, according to the patent application. Finally, it will provide the modified avatar to an “immersive reality application” that is hosted on a remote server.
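The four-step pipeline described above can be sketched in code. This is purely illustrative: the patent does not disclose an implementation, so every name below (the muscle labels, the threshold rule standing in for whatever model Meta might use, and the hand-off function) is a hypothetical assumption, not Meta's actual design.

```python
from dataclasses import dataclass

# Illustrative sketch of the Figure 4 pipeline: sense muscle movement ->
# infer expression -> adjust the avatar -> provide it to an immersive
# reality application. All names and thresholds are invented for this example.

@dataclass
class Avatar:
    expression: str = "neutral"

def infer_expression(muscle_activations: dict) -> str:
    """Map raw muscle-movement readings to a named expression.
    A real system would use a learned model; this is a toy threshold rule."""
    if muscle_activations.get("zygomaticus_major", 0.0) > 0.5:
        return "smile"
    if muscle_activations.get("corrugator_supercilii", 0.0) > 0.5:
        return "frown"
    return "neutral"

def adjust_avatar(avatar: Avatar, expression: str) -> Avatar:
    """Step 3: 'adjust' the user's avatar to match the detected expression."""
    avatar.expression = expression
    return avatar

def provide_to_application(avatar: Avatar) -> dict:
    """Step 4: stand-in for handing the modified avatar to a remote
    immersive reality application (here, just a serialized payload)."""
    return {"avatar_expression": avatar.expression}

# Simulated sensor frame: the zygomaticus (a smiling muscle) is strongly active.
frame = {"zygomaticus_major": 0.8, "corrugator_supercilii": 0.1}
payload = provide_to_application(adjust_avatar(Avatar(), infer_expression(frame)))
print(payload)  # {'avatar_expression': 'smile'}
```

In a real headset the inference step would run continuously on streaming sensor data rather than on single frames, but the flow from sensor reading to remotely hosted avatar would follow the same shape.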
While the Apple Vision Pro shows a user's facial expressions on its outer display, Meta's expression capture appears intended to drive detailed avatar appearances in other services. The patent application also describes using machine learning to associate facial muscle movement with the user's expression.
It's worth noting that a patent application is not an indicator of when, or whether, the feature could make its way to a Meta product. If it does ship, however, it could lead to more responsive avatars across other Meta products and services.