WO2025207460A1 - Virtual eyewear fitting system - Google Patents
Virtual eyewear fitting system
- Publication number
- WO2025207460A1 (PCT/US2025/021054)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eyewear
- user
- virtual
- facial
- fitting parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C13/00—Assembling; Repairing; Cleaning
- G02C13/003—Measuring during assembly or fitting of spectacles
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C13/00—Assembling; Repairing; Cleaning
- G02C13/003—Measuring during assembly or fitting of spectacles
- G02C13/005—Measuring geometric parameters required to locate ophtalmic lenses in spectacles frames
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present disclosure relates to methods and systems for implementing virtual eyewear fitting. In some implementations, a computer system can obtain image data of a user captured by a remote imaging device and extract a set of facial measurements of the user's face based on the image data. The system can obtain visual information of eyewear of the user and adjust a virtual fitting parameter of the eyewear based on the set of facial measurements. The eyewear can be visualized on a first user interface based on the visual information and the adjusted virtual fitting parameter, including display of a virtual representation of the eyewear on the first user interface. An instruction to physically adjust the user's eyewear based on the virtual fitting parameter is generated and displayed on a second user interface.
Description
VIRTUAL EYEWEAR FITTING SYSTEM
BACKGROUND
Field of the Inventions
[0001] The present inventions relate to information systems that facilitate eyewear fitting. More specifically, methods, systems, devices, and non-transitory computer-readable storage media are applied to implement an interactive eyewear fitting process.
Description of the Related Art
[0002] Eyewear plays a critical role in providing vision correction, protection from environmental elements, and fashion enhancement for individuals. Properly fitted eyewear is not only comfortable for the wearer but also provides the desired functionality and aesthetic appeal. Conversely, ill-fitting eyewear can lead to discomfort, decreased visual acuity, long-term damage to the wearer's eyes, or nonuse altogether.
SUMMARY
[0003] In accordance with at least some embodiments disclosed herein is the realization that, when a customer engages in the process of obtaining new prescription eyewear, an optician’s fitting process can increase the likelihood of human error and lead to discrepancies in lens alignment, resulting in imprecise prescriptions, poorly fitted frames, and discomfort and suboptimal vision for the wearer. Moreover, some embodiments also include the realization that an ideal or perfect fit of eyewear may be almost impossible to achieve using traditional methods, given that the optician will not usually be able to obtain perfect feedback or communication from the wearer to confirm that the optician’s intent is fully realized, and that the wearer is often unaware of the goals, consequences, science, and metrics necessary for a proper fit. Instead of an ideal or perfect fit, conventional fitting techniques are only ever able to provide a “best” fit in view of the limitations and drawbacks associated with such techniques.
[0004] Typically, corrective eyewear is fitted to a customer by the optician, based on prescriptions provided by optometrists or ophthalmologists and on available frame styles, lens materials, and coatings, some of which are chosen according to the customer’s preferences and lifestyle requirements. The goal, of course, is to ensure that the eyewear fits comfortably and securely on a wearer’s face while also aligning with their prescription needs. However, in accordance with at least some embodiments disclosed herein is the realization that opticians’ eyewear
fitting processes rely heavily on subjective assessments and manual adjustments and are often plagued by inefficiencies and inaccuracies. Accordingly, the present disclosure addresses this and other challenges that have long plagued the eyewear fitting process.
[0005] Moreover, in accordance with at least some embodiments disclosed herein is the realization that inefficient workflows further exacerbate the problem by prolonging the time required for each fitting and hindering the overall patient experience. The present disclosure addresses these and other deficiencies using novel and innovative solutions that will enable various improvements in vision and health care.
[0006] For example, in some implementations, the present disclosure provides a process that uses digital imaging and computerized measurement techniques to improve efficiency, accuracy, and patient satisfaction in the eyewear fitting process.
[0007] In accordance with some embodiments of the present disclosure, a method is implemented at a computer system for virtual eyewear fitting. The method can include obtaining image data of a user captured by a remote imaging device, generating or extracting a set of facial measurements of a face of the user based on the image data, obtaining visual information of eyewear of the user, and adjusting a virtual fitting parameter of the eyewear of the user based on the set of facial measurements.
[0008] The method can further include providing a visualization of the eyewear. For example, the method can include visualizing, on a first user interface, the eyewear of the user based on the visual information and the adjusted virtual fitting parameter of the eyewear, including causing display of a virtual representation of the eyewear on the first user interface.
[0009] The method can also include generating a prompt, an option, or an instruction to adjust the eyewear of the user physically based on the virtual fitting parameter and causing display of the instruction including the virtual fitting parameter on a second user interface.
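The disclosure does not prescribe any particular software realization of this method. Purely as a non-authoritative sketch of the flow in paragraphs [0007]-[0009], the pipeline could be organized as below; all type, field, and function names (FacialMeasurements, VirtualFittingParameter, adjust_fitting_parameter, generate_instruction) and the numeric clearance value are hypothetical illustrations, not taken from the patent.

```python
# Illustrative sketch only; names and values are hypothetical, not from the disclosure.
from dataclasses import dataclass

@dataclass
class FacialMeasurements:
    pupillary_distance_mm: float   # distance between pupil centers
    nose_bridge_width_mm: float
    temple_to_temple_mm: float

@dataclass
class VirtualFittingParameter:
    bridge_width_mm: float
    temple_length_mm: float
    frame_tilt_deg: float

def adjust_fitting_parameter(m: FacialMeasurements,
                             p: VirtualFittingParameter) -> VirtualFittingParameter:
    """Adjust the virtual fitting parameter so the frame tracks the measured face."""
    return VirtualFittingParameter(
        bridge_width_mm=m.nose_bridge_width_mm + 1.0,  # small clearance; illustrative
        temple_length_mm=p.temple_length_mm,           # unchanged in this sketch
        frame_tilt_deg=p.frame_tilt_deg,
    )

def generate_instruction(p: VirtualFittingParameter) -> str:
    """Produce a human-readable adjustment instruction for the second user interface."""
    return (f"Set bridge width to {p.bridge_width_mm:.1f} mm, "
            f"temple length to {p.temple_length_mm:.1f} mm, "
            f"frame tilt to {p.frame_tilt_deg:.1f} degrees.")
```

In such a sketch, the string returned by generate_instruction is what would be shown on the second user interface to guide the physical adjustment.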
[0010] Moreover, in some embodiments, a physical eyewear frame can be adapted, modified, or created based on the virtual representation displayed on the display device.
[0011] In accordance with some embodiments of the present disclosure, a computer system can be used to achieve some of the advantageous features disclosed herein. The computer system can be configured to include one or more processors and memory for storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing a virtual eyewear fitting method. The computer system can obtain image data of a user captured by a remote imaging device, extract a set of facial measurements of a face of the user based on the image data, obtain visual information of an
eyewear of the user, and adjust a virtual fitting parameter of the eyewear of the user based on the set of facial measurements.
[0012] In some embodiments, the computer system can provide a visualization of the eyewear for the user. For example, the computer system can cause a visualization of the eyewear to be displayed, on a first user interface, based on the visual information and the adjusted virtual fitting parameter of the eyewear, including causing display of a virtual representation of the eyewear on the first user interface. The computer system generates an instruction to physically adjust the eyewear of the user based on the virtual fitting parameter and causes display of the instruction including the virtual fitting parameter on a second user interface.
[0013] In accordance with some embodiments of the present disclosure, a non-transitory computer readable storage medium stores one or more programs for execution by one or more processors of a computer system, the one or more programs including instructions for performing the above virtual eyewear fitting method.
[0014] In accordance with some embodiments of the present disclosure, a method is implemented for custom-fitting eyewear for an online eyewear user, including capturing a set of facial measurements of the eyewear user via a remote imaging device. Some embodiments may also include transmitting the captured measurements to a processing unit. Some embodiments may also include generating a virtual representation of the eyewear adjusted to the captured measurements. Some embodiments may also include displaying the virtual representation on a display device within a production facility. Some embodiments may also include adjusting a physical eyewear frame to match the virtual representation displayed on the display device. Some embodiments may also include shipping the adjusted eyewear frame to the eyewear user.
[0015] In some implementations, the remote imaging device may be a camera of an eyewear user’s mobile computer device. In some implementations, the processing unit may be located remotely from the production facility and the display device may be a tablet computer device. In some implementations, the method may include capturing a video sequence of the eyewear user’s head movements and expressions to refine the eyewear fit based on dynamic facial metrics.
[0016] In some implementations, the remote imaging device includes depth-sensing technology (e.g., a depth sensor or camera) to capture three-dimensional facial contours. In some implementations, the method may include utilizing a feedback loop. In some
implementations, the virtual representation may be adjusted based on real-time eyewear user feedback before the physical eyewear frame is adjusted.
[0017] In some implementations, the method may include simulating environmental conditions in the virtual representation to adjust the eyewear for specific eyewear user use cases such as sports or high-glare environments. In some implementations, the processing unit further customizes the eyewear by selecting frame styles and materials based on the eyewear user’s biometric data and aesthetic preferences. In some implementations, the method may include generating a predictive model of the eyewear user’s future facial changes and adjusting the eyewear to accommodate predicted changes within a predefined time frame.
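How a simulated environmental condition would translate into concrete parameter changes is left open by the disclosure. One minimal way to sketch it is a table of use-case presets applied on top of the baseline fitting parameters; the preset names, keys, and values below are invented for illustration only.

```python
# Hypothetical use-case presets; names and values are illustrative, not from the disclosure.
USE_CASE_PRESETS = {
    "sports":     {"temple_curvature_delta_deg": 5.0, "nose_pad_grip": "high"},
    "high_glare": {"frame_tilt_delta_deg": -2.0, "lens_coating": "polarized"},
}

def apply_use_case(parameters: dict, use_case: str) -> dict:
    """Return a copy of the fitting parameters adjusted for a simulated environment."""
    adjusted = dict(parameters)
    for key, value in USE_CASE_PRESETS.get(use_case, {}).items():
        if key.endswith("_delta_deg"):
            # Relative tweaks are added to the matching baseline angle.
            base_key = key.replace("_delta_deg", "_deg")
            adjusted[base_key] = adjusted.get(base_key, 0.0) + value
        else:
            # Non-numeric options simply overwrite the baseline.
            adjusted[key] = value
    return adjusted

# Example: apply_use_case({"temple_curvature_deg": 10.0}, "sports")
# -> {"temple_curvature_deg": 15.0, "nose_pad_grip": "high"}
```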
[0018] In accordance with some embodiments of the present disclosure, a system for custom-fitting eyewear includes an imaging module configured to capture facial measurements. Some embodiments may also include a processing module configured to receive the facial measurements and generate adjustment data for the eyewear. Some embodiments may also include a display module for presenting the adjustment data to a technician. Some embodiments may also include an adjustment station where the technician adjusts the eyewear in accordance with the adjustment data.
[0019] In some implementations, the imaging module includes a mobile application running on a user’s smartphone capable of capturing and transmitting the measurements. In some implementations, the processing module utilizes a machine learning algorithm to refine the adjustment data based on historical fit data and eyewear user feedback.
[0020] In some implementations, the system may include a feedback module configured to receive post-delivery feedback from eyewear users to improve the adjustment data algorithm. In some implementations, the display module includes augmented reality capabilities to overlay the adjustment data onto a live image of the eyewear frame during adjustment.
[0021] In some implementations, the system may include a calibration module to ensure that the display device presents the virtual representation at a one-to-one scale with the physical eyewear. In some implementations, the system may include a quality assurance module that verifies the physical adjustment of the eyewear frame against the virtual representation using image recognition technology. In some implementations, the adjustment station includes robotic arms controlled by the adjustment data to automate the adjustment of the eyewear frame. In some implementations, the imaging module may be further configured
to capture biometric data such as interpupillary distance and ear-to-ear width using a plurality of images from different angles.
[0022] In accordance with some embodiments of the present disclosure, a non-transitory computer-readable storage medium has instructions stored thereon that, when executed by a processor, cause the processor to perform operations including receiving facial measurements from a mobile device. Some embodiments may also include generating virtual fitting parameters for eyewear based on the received measurements. Some embodiments may also include displaying the virtual fitting parameters on a display device to assist in manual adjustment of the eyewear. Some embodiments may also include recording the adjustments to create a fit profile associated with the eyewear user.
[0023] In accordance with some embodiments of the present disclosure, a method is implemented for verifying the accuracy of eyewear adjustment, and includes capturing an image of the adjusted eyewear frame. Some embodiments may also include comparing the image to the virtual representation. Some embodiments may also include determining deviations between the adjusted frame and the virtual representation. Some embodiments may also include providing feedback to the technician for further adjustment if deviations exceed a predetermined threshold.
[0024] In accordance with some embodiments of the present disclosure, a method is implemented for enhancing eyewear user engagement in an eyewear fitting process, and includes providing a user interface for the eyewear user to view and modify the virtual representation of the eyewear. Some embodiments may also include enabling the eyewear user to select from various adjustment suggestions. Some embodiments may also include finalizing the eyewear adjustment based on the eyewear user’s selections.
[0025] In accordance with some embodiments of the present disclosure, a method is implemented for training an artificial intelligence model for eyewear fitting. The method includes collecting a large dataset of facial measurements and corresponding eyewear user satisfaction ratings. Some embodiments may also include using the dataset to train the artificial intelligence model to predict optimal fit adjustments. Some embodiments may also include continuously updating the model with new data to improve fit predictions over time.
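The disclosure does not identify a model family or training procedure for this artificial intelligence model. Purely as an assumption-laden sketch, a simple regression from facial measurements to a fit-adjustment target, refit periodically as new data arrive, might look like the following; the class name, ridge penalty, and feature layout are all hypothetical.

```python
import numpy as np

class FitAdjustmentModel:
    """Minimal regression sketch: predicts a fit-adjustment value (e.g., a bridge-width
    offset in mm) from facial measurements. The disclosure does not specify a model;
    this is only one possible, deliberately simple choice."""

    def __init__(self, ridge: float = 1e-3):
        self.ridge = ridge
        self.weights = None

    def fit(self, measurements: np.ndarray, adjustments: np.ndarray) -> None:
        # measurements: (n_samples, n_features); adjustments: (n_samples,)
        X = np.hstack([measurements, np.ones((measurements.shape[0], 1))])  # bias term
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.weights = np.linalg.solve(A, X.T @ adjustments)

    def predict(self, measurements: np.ndarray) -> np.ndarray:
        X = np.hstack([measurements, np.ones((measurements.shape[0], 1))])
        return X @ self.weights

# Continuous updating could be approximated by refitting on the growing dataset:
# model.fit(np.vstack([old_X, new_X]), np.concatenate([old_y, new_y]))
```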
[0026] In some embodiments, an eyewear fitting application is implemented by a head-mounted display device (HMD) configured to create an extended reality (XR) environment for a user (e.g., an optician, a wearer of the eyewear). A pair of glasses may be rendered for the user in a three-dimensional format in the XR environment, thereby facilitating eyewear selection and fitting. XR is an umbrella term encapsulating Augmented Reality
(AR), Virtual Reality (VR), Mixed Reality (MR), and everything in between. In this application, any embodiments that apply a VR system can be implemented using an AR or MR system as well.
[0027] Additional features and advantages of the subject technology will be set forth in the description below, and in part will be apparent from the description, or may be learned by practice of the subject technology. The advantages of the subject technology will be realized and attained by the structure particularly pointed out in the written description and embodiments hereof as well as the appended drawings.
[0028] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the subject technology.
BRIEF DESCRIPTION OF THE FIGURES
[0029] For a better understanding of the various described implementations, reference should be made to the Description of Implementations below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0030] Figure 1 is a data processing environment having one or more servers communicatively coupled to a plurality of client devices, in accordance with some embodiments.
[0031] Figure 2A is a visual acuity assessment environment in which an XR device is applied to facilitate eyewear fitting, in accordance with some embodiments.
[0032] Figures 2B and 2C are visual representations of an eyewear including virtual fitting parameters, in accordance with some embodiments.
[0033] Figure 3 is a block diagram of a computer system configured to implement eyewear fitting, in accordance with some embodiments.
[0034] Figure 4 is a flowchart illustrating a method for virtual eyewear fitting, in accordance with some embodiments.
[0035] Figure 5 is a block diagram illustrating a system for virtual eyewear fitting, in accordance with some embodiments.
[0036] Figure 6 is a flowchart illustrating a method for verifying accuracy of eyewear adjustment, in accordance with some embodiments.
[0037] Figure 7 is a flowchart illustrating a method for enhancing eyewear user engagement, in accordance with some embodiments.
[0038] Figure 8 is a flowchart illustrating a computer-implemented method for training an artificial intelligence model for virtual eyewear fitting, in accordance with some embodiments.
[0039] To illustrate how pixel density limitations are overcome through algorithmic enhancement, aspects of the present disclosure can be depicted with a diagram showing the process from optotype selection, through algorithmic enhancement, to display on the VR headset, highlighting the steps taken to adjust for pixel density limitations.
[0040] Before-and-after images can be included as comparative images showing optotypes displayed on VR headsets with and without an algorithmic enhancement, which is optional in some embodiments, thereby clearly demonstrating the improvement in clarity and visibility.
DETAILED DESCRIPTION
[0041] It is understood that various configurations of the subject technology will become readily apparent to those skilled in the art from the disclosure, wherein various configurations of the subject technology are shown and described by way of illustration. As will be realized, the subject technology is capable of other and different configurations and its several details are capable of modification in various other respects, all without departing from the scope of the subject technology. Accordingly, the summary, drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
[0042] The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be apparent to those skilled in the art that the subject technology may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology. Like components are labeled with identical element numbers for ease of understanding.
[0043] Some embodiments of the present disclosure may include a method for custom-fitting eyewear for an online eyewear user, including capturing a set of facial measurements of the eyewear user via a remote imaging device. Some embodiments may also include transmitting the captured measurements to a processing unit. Some embodiments may also include generating a virtual representation of the eyewear adjusted to the captured measurements. Some embodiments may also include displaying the virtual representation on a display device within a production facility. Some embodiments may also include adjusting a physical eyewear frame to match the virtual representation displayed on the display device. Some embodiments may also include shipping the adjusted eyewear frame to the eyewear user.
[0044] Figure 1 is a data processing environment 100 having one or more servers 102 communicatively coupled to a plurality of client devices 140A-140E, in accordance with some embodiments. Each client device 140 can collect data or user inputs (e.g., image data of a user), execute user applications, and present outputs (e.g., an instruction for eyewear fitting, a visual representation 220 of eyewear 160) on its user interface. The collected data or user inputs can be processed locally at the client device 140 and/or remotely by the server(s) 102. The one or more servers 102 provide system data (e.g., boot files, operating system images, and user applications) to the client devices 140, and in some embodiments, process the data and user inputs received from the client device(s) 140 when the user applications are executed on the client devices 140. In some implementations, the data processing environment 100 further includes a storage 106 for storing data related to the servers 102, client devices 140, and applications executed on the client devices 140. For example, storage 106 may store one or more of: video content, static visual content, audio data, facial measurements of users, virtual fitting parameters, and eyewear fitting instructions.
[0045] In some implementations, the one or more servers 102 are configured to enable an eyewear fitting platform having a plurality of user accounts 328 (Figure 3) for technician users 120T and eyewear users 120E. Each of the plurality of client devices 140A-140E is associated with a technician or an eyewear user 120E, and configured to execute a dedicated or browser-based eyewear fitting application 326 (Figure 3). The plurality of client devices 140 may be, for example, desktop computers, laptop computers 140A, tablet computers 140B, mobile phones 140C, or intelligent, multi-sensing, network-connected home devices (e.g., a depth camera, a visible light camera 140E). In some implementations, the plurality of client devices 140 include an XR device 140D (also called a head-mounted display device (HMD) 140D) configured to render extended reality content, e.g., facilitating virtual eyewear fitting.
[0046] The one or more servers 102 can enable real-time data communication with the client devices 140 that are remote from each other or from the one or more servers 102. Further, in some embodiments, the one or more servers 102 can implement data processing tasks that cannot be, or are preferably not, completed locally by the client devices 140. For example, a client device 140 executes an interactive eyewear fitting application 326 (Figure 3). The client device 140 captures the image data of an eyewear user 120E and sends the image data to the server 102. The server 102 generates a set of facial measurements 122 of the eyewear user 120E, a virtual fitting parameter, a visual representation 220 of the eyewear 160, and an instruction 202 (Figure 2A) for physically adjusting the eyewear 160 of the user. The instruction may be sent to a client device 140A-140D associated with a technician user 120T to present the instruction, thereby guiding the technician user 120T to adjust the eyewear 160.
[0047] In some embodiments not shown, the client device 140 includes an eyewear fitting machine. The instruction 202 may be sent to the eyewear fitting machine for adjusting the eyewear 160, and the instruction 202 may be displayed on a display associated with the eyewear fitting machine. Further, in response to the instruction 202, a robotic arm of the eyewear fitting machine may be controlled to adjust the eyewear 160 based on the virtual fitting parameter 162, which can thereafter be further adjusted.
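The command interface between the virtual fitting parameter 162 and such a robotic arm is not specified in the disclosure. The sketch below assumes a hypothetical step-wise command protocol (BEND_IN/BEND_OUT strings) simply to show how a target temple angle could be decomposed into small actuator moves.

```python
# Hypothetical translation of a virtual fitting parameter into machine commands;
# the disclosure does not define a command set, so these names are illustrative.
def temple_bend_commands(current_angle_deg: float,
                         target_angle_deg: float,
                         step_deg: float = 0.5) -> list[str]:
    """Break a temple-arm bend into small robotic steps the fitting machine can execute."""
    delta = target_angle_deg - current_angle_deg
    steps = int(abs(delta) // step_deg)
    direction = "BEND_IN" if delta > 0 else "BEND_OUT"
    commands = [f"{direction} {step_deg:.1f}" for _ in range(steps)]
    remainder = abs(delta) - steps * step_deg
    if remainder > 1e-6:
        commands.append(f"{direction} {remainder:.1f}")
    return commands

# Example: temple_bend_commands(0.0, 1.2) -> ["BEND_IN 0.5", "BEND_IN 0.5", "BEND_IN 0.2"]
```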
[0048] The one or more servers 102, the plurality of client devices 140, and storage 106 are communicatively coupled to each other via one or more communication networks 108, which are the medium used to provide communications links between these devices and computers connected together within the data processing environment 100. The one or more communication networks 108 may include connections, such as wire, wireless communication links, or fiber optic cables. Examples of the one or more communication networks 108 include local area networks (LAN), wide area networks (WAN) such as the Internet, or a combination thereof. The one or more communication networks 108 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Long Term Evolution (LTE), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol. A connection to the one or more communication networks 108 may be established either directly (e.g., using 3G/4G/5G connectivity to a wireless carrier), or through a network interface 110 (e.g., a router, switch, gateway, hub, or an intelligent, dedicated whole-home control node), or
through any combination thereof. As such, the one or more communication networks 108 can represent the Internet, a worldwide collection of networks and gateways that use the Transmission Control Protocol/Internet Protocol (TCP/IP) suite of protocols to communicate with one another. At the heart of the Internet is a backbone of high-speed data communication lines between major nodes or host computers, consisting of thousands of commercial, governmental, educational, and other electronic systems that route data and messages.
[0049] Some implementations of this application are directed to a virtual eyewear fitting process implemented at least partially via the data processing environment 100. A client device 140 (e.g., a remote imaging device) associated with an eyewear user 120E captures image data of the eyewear user 120E. A server 102 or a client device 140 associated with a technician user 120T obtains the image data of the eyewear user 120E, and extracts a set of facial measurements 122 of a face of the eyewear user 120E based on the image data. The server 102 or the client device 140 associated with the technician user 120T obtains visual information of the eyewear 160, e.g., extracted from an eyewear database. Based on the set of facial measurements 122, the server 102 or the client device 140 associated with the technician user 120T adjusts a virtual fitting parameter of the eyewear 160, e.g., to enhance fitting of the eyewear 160 to the face of the eyewear user 120E. The client device 140 associated with the technician user 120T visualizes, on a first user interface 210A (Figure 2A), the eyewear 160 based on the visual information and the adjusted virtual fitting parameter of the eyewear 160, including causing display of a virtual representation of the eyewear 160 on the first user interface 210A. The server 102 or the client device 140 associated with the technician user 120T generates an instruction to physically adjust the eyewear 160 of the user based on the virtual fitting parameter, and causes display of the instruction including the virtual fitting parameter on a second user interface 210B (Figure 2A). The second user interface 210B optionally includes or replaces the first user interface 210A.
[0050] In some embodiments, the image data of the eyewear user 120E include a video sequence of head movements and expressions of the user 120E, and the set of facial measurements 122 includes dynamic facial metrics extracted from the video sequence. The virtual fitting parameter 162 of the eyewear 160 is adjusted based on the dynamic facial metrics to refine an eyewear fit for the user 120E. In some embodiments, the set of facial measurements 122 includes a 3D facial map representing geometric features of the face of the user 120E in a plurality of meshes. The eyewear fitting application may be executed to reconstruct the face of the user in a visual representation or determine geometric features (e.g., a facial contour, a distance between an eyebrow and eyelashes) of the user’s face.
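As one possible reading of how scalar measurements could be derived from such a 3D facial map, the sketch below computes a few distances from named 3D landmark points; the landmark keys and the millimetre units are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

# Hypothetical landmark keys; real facial-landmark models use their own indexing.
def facial_measurements_from_landmarks(landmarks: dict[str, np.ndarray]) -> dict[str, float]:
    """Derive a few scalar measurements from 3D landmark points (assumed in millimetres)."""
    def dist(a: str, b: str) -> float:
        return float(np.linalg.norm(landmarks[a] - landmarks[b]))
    return {
        "interpupillary_distance_mm": dist("left_pupil", "right_pupil"),
        "nose_bridge_width_mm": dist("left_nose_bridge", "right_nose_bridge"),
        "brow_to_lash_mm": dist("left_brow_center", "left_upper_lash_line"),
    }
```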
[0051] In some embodiments, the server 102 or the client device 140 associated with the technician user 120T determines an environmental condition in which the eyewear 160 is configured to be worn by the user 120E. The virtual fitting parameter 162 of the eyewear 160 is adjusted based on the environmental condition.
[0052] Figure 2A is a visual acuity assessment environment in which an XR device 140D is applied to facilitate eyewear fitting, in accordance with some embodiments, and Figures 2B and 2C are visual representations 220 of an eyewear 160 including one or more virtual fitting parameters 162, in accordance with some embodiments. The XR device 140D may be communicatively coupled within the data processing environment 100. The XR device 140D may include one or more cameras (e.g., a visible light camera, a depth camera), a microphone, a speaker, one or more inertial sensors (e.g., gyroscope, accelerometer), and a display. In some situations, the camera captures hand gestures of a user wearing the XR device 140D. In some situations, the microphone records ambient sound, including user’s voice commands. The XR device 140D may execute a client-side eyewear fitting application 326 (Figure 3) via a user account 328 associated with a technician user 120T or an eyewear user 120E.
[0053] In some embodiments, an eyewear user 120E may review eyewear options offered by the eyewear fitting platform in a three-dimensional (3D) format in the XR device 140D. A server 102 or a client device 140 associated with an eyewear user 120E may execute an eyewear fitting application 326 to select a frame style and a frame material based on biometric data (e.g., facial measurements 122) and aesthetic preferences of the user 120E. In some implementations, the XR device 140D may obtain image data of the eyewear user 120E captured by a remote imaging device, extract facial measurements 122 of the eyewear user 120E, create a 3D avatar of a model or the eyewear user 120E, and display a visual representation 220 of a selected eyewear option on a face of the 3D avatar. Particularly, in some implementations, the eyewear fitting application 326 customizes a virtual fitting parameter of the selected eyewear option and reflects it on the visual representation 220 on the face of the 3D avatar. As such, the eyewear fitting application 326 enables personalized eyewear fitting remotely using interactive user interfaces 210, digital imaging, and automated measurement techniques.
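The disclosure does not detail how the visual representation 220 is positioned on the 3D avatar. A minimal geometric sketch, assuming the frame mesh and the face mesh share a millimetre coordinate system, is to scale the frame to the measured bridge width and translate its bridge center onto the nose-bridge landmark; rotation (e.g., frame tilt 246) is omitted here for brevity.

```python
import numpy as np

def place_frame_on_avatar(frame_vertices: np.ndarray,
                          frame_bridge_center: np.ndarray,
                          frame_bridge_width: float,
                          face_nose_bridge: np.ndarray,
                          measured_bridge_width: float) -> np.ndarray:
    """Uniformly scale the frame to the measured bridge width, then translate its
    bridge center onto the avatar's nose-bridge landmark. A real renderer would
    also solve for rotation (tilt, face wrap); this sketch omits that step."""
    scale = measured_bridge_width / frame_bridge_width
    scaled = (frame_vertices - frame_bridge_center) * scale
    return scaled + face_nose_bridge
```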
[0054] Alternatively or additionally, in some embodiments, a technician user 120T may review eyewear requested by the eyewear user 120E in a three-dimensional (3D) format in the XR device 140D. The XR device 140D may obtain image data of the eyewear user 120E captured by the remote imaging device, extract facial measurements 122 of the eyewear user 120E,
create a 3D avatar of a model or the eyewear user 120E, and display a visual representation 220 of a selected eyewear option on a face of the 3D avatar. Further, the XR device 140D and/or an associated server 102 compares the facial measurements 122 of the face of the eyewear user 120E and measurements 122 of the eyewear (e.g., provided by an eyewear factory or extracted by the server 102), determines a difference between the facial measurements 122 of the face of the eyewear user 120E and the measurements 122 of the eyewear, and adjusts a virtual fitting parameter of the eyewear. In some implementations, the XR device 140D or the server 102 generates an instruction 202 to physically adjust the eyewear of the user based on the virtual fitting parameter, and the XR device 140D displays the instruction 202 including the virtual fitting parameter on its display. In some embodiments, the instruction 202 is overlaid on a live image that is generated, rendered, or visualized and representative of the eyewear 160, e.g., in real time while the technician user 120T is adjusting the eyewear 160.
[0055] In some implementations, during an eyewear fitting process, the technician user 120T may wear the XR device 140D, which captures image data of the eyewear 160 that has been adjusted in response to the instruction 202. In some embodiments, the eyewear 160 of the user 120E is visualized at a one-to-one scale on the user interface 210. The image data of the eyewear is compared with the virtual representation of the eyewear 160. Based on a comparison result, the server 102 or the XR device 140D determines whether the adjustment of the eyewear 160 matches the adjusted virtual fitting parameter of the eyewear 160 as shown on the visual representation 220. Further, in some situations, the comparison result includes a difference between the image data of the eyewear 160 and the virtual representation of the eyewear 160. The XR device 140D updates the instruction 202 to adjust the eyewear 160 of the user physically based on the difference, obtains additional image data of the eyewear 160 in response to the instruction 202, and compares the additional image data with the virtual representation of the eyewear 160 to update the difference, iteratively and until the difference is below a predetermined threshold. By these means, the eyewear fitting application 326 can provide objective and real-time instructions to guide the technician user 120T through a closed-loop eyewear fitting process, thereby enhancing efficiency and accuracy in eyewear fitting.
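By way of a non-limiting illustration, the closed-loop comparison described above may be sketched in software as follows. The callables passed into the function (image capture, rendering, difference metric, and instruction display) are hypothetical placeholders supplied by an implementer and are not defined by this disclosure; the threshold and iteration count are likewise illustrative assumptions.

```python
# A minimal, hypothetical sketch of the closed-loop fitting check of paragraph [0055].
# capture_image, render_virtual, frame_difference, and show_instruction are assumed
# callables provided by the caller; they are not part of this disclosure.

def closed_loop_adjustment(capture_image, render_virtual, frame_difference,
                           show_instruction, virtual_params,
                           threshold=0.5, max_iterations=10):
    """Iteratively compare images of the physical eyewear 160 with the virtual
    representation 220 until the difference is below a predetermined threshold."""
    target = render_virtual(virtual_params)            # one-to-one scale rendering
    difference = float("inf")
    for _ in range(max_iterations):
        observed = capture_image()                     # image of the adjusted eyewear
        difference = frame_difference(observed, target)
        if difference < threshold:
            return True, difference                    # adjustment matches the virtual fit
        show_instruction(f"Residual mismatch {difference:.2f}; continue adjusting")
    return False, difference                           # threshold not reached within budget
```

In such a sketch, the returned flag could drive the feedback of step 640 described below, while the residual difference could be surfaced to the technician user 120T on the XR display.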
[0056] It is noted that some implementations are not limited to the XR device 140D and may be implemented via another client device 140 (e.g., a mobile phone 140C) distinct from the XR device 140D. In some embodiments, the first user interface 210A is displayed on an electronic device 104 associated with the eyewear user 120E. The server 102 or a client device 140 associated with a technician user 120T receives, from the first user interface 210A, a user feedback message for the adjusted virtual fitting parameter of the eyewear, and prior to display of the instruction, updates the instruction 202 based on the user feedback message.
[0057] In some embodiments, a client device 140 (e.g., the XR device 140D) receives, on the first user interface 210A, a user input (e.g., a hand gesture, a touch on a touch screen, a voice message) requesting a modification to the virtual representation 220. The user input may select the modification from a plurality of adjustment suggestions. Based on the user input, the virtual representation 220 is modified and updated on the first user interface 210A.
[0058] Referring to Figures 2B and 2C, in some embodiments, the technician user 120T may adjust one or more virtual fitting parameters 162 on the eyewear 160 to ensure a proper fit and optimal comfort for the eyewear user 120E. The one or more virtual fitting parameters 162 include one or more of the following (see the illustrative sketch after this list):
• Bridge Width 242 allowing the eyewear 160 to sit comfortably on the nose without slipping or causing discomfort;
• Temple Length 244 that is a length of the temple arms adjustable to ensure that the eyewear 160 fit securely behind the ears without being too tight or too loose;
• Frame Tilt 246 that is adjustable on the eyewear 160 to ensure that the eyewear 160 are level on the face and sit correctly;
• Frame Angle 248 representing an angle of the frame adjustable to ensure that the eyewear 160 follow the contour of the face properly;
• Nose Pad Position 250 of nose pads that are adjustable to ensure that the nose pads rest comfortably on the nose and distribute weight evenly;
• Temple Curvature 252 of the temple arms that is adjustable to fit a curvature of the eyewear user’s head for a secure fit;
• Lens Alignment 254 of eyewear lenses with the eyewear user’s eyes;
• Frame Width 256 and Frame Height 258 for providing adequate coverage for eyes while not obstructing vision or sitting too low on the face;
• Frame Material that is adjustable for a better fit, such as acetate or metal;
• Nose Pad Material that may be adjusted for added comfort; and
• Overall Fit of the eyewear 160 on the eyewear user 120E.
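As one hypothetical way to represent these parameters in software, the virtual fitting parameters 162 could be grouped into a single record, as in the following sketch. The field names, units (millimeters and degrees), default values, and material strings are illustrative assumptions and do not limit the disclosure.

```python
from dataclasses import dataclass

# Illustrative grouping of the virtual fitting parameters 162; all values are
# example defaults, not prescribed measurements.
@dataclass
class VirtualFittingParameters:
    bridge_width_mm: float = 18.0        # Bridge Width 242
    temple_length_mm: float = 145.0      # Temple Length 244
    frame_tilt_deg: float = 8.0          # Frame Tilt 246
    frame_angle_deg: float = 5.0         # Frame Angle 248
    nose_pad_offset_mm: float = 0.0      # Nose Pad Position 250
    temple_curvature: float = 0.5        # Temple Curvature 252 (unitless)
    lens_alignment_mm: float = 0.0       # Lens Alignment 254 (vertical offset)
    frame_width_mm: float = 135.0        # Frame Width 256
    frame_height_mm: float = 40.0        # Frame Height 258
    frame_material: str = "acetate"      # Frame Material (e.g., acetate or metal)
    nose_pad_material: str = "silicone"  # Nose Pad Material (example value)
```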
[0059] In some embodiments, the virtual fitting parameters 162 are adjusted to match the facial measurements 122 of the eyewear user 120E, which are extracted from image data (e.g., static images, video clips, depth images) of the eyewear user 120E. The facial measurements 122 used to adjust the eyewear (e.g., manually or by a machine) include, but are not limited to, one or more of the following (an illustrative extraction sketch follows this list):
• Pupillary Distance (PD) between centers of two pupils;
• Bridge Width of a nose bridge;
• First Distance from an outer corner of an eye to a rear contour of an ear, which defines a temple length of the eyewear 160;
• Face Width that determines a frame width of the eyewear 160;
• Eye Size that determines a frame height of the eyewear 160;
• Face Shape allowing the technician user 120T to consider an overall shape of the eyewear user’s face (e.g., round, oval, square) to recommend frames that best complement their facial features;
• Nose Bridge Depth;
• Cheekbone Width; and
• Ear Positions of two ears of the eyewear user 120E.
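By way of a non-limiting illustration, the following sketch derives a few of these measurements from two-dimensional facial landmarks detected in the image data. The landmark keys, the calibration factor `mm_per_pixel`, and the function name are assumptions for illustration only; any landmark detector and calibration method (e.g., a depth sensor or a reference object of known size) may be substituted.

```python
import math

# Hypothetical extraction of facial measurements 122 from detected landmarks.
# `landmarks` maps assumed keys to (x, y) pixel coordinates; `mm_per_pixel` is a
# calibration factor that converts pixel distances to millimeters.
def extract_measurements(landmarks, mm_per_pixel):
    def dist(a, b):
        (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
        return math.hypot(x2 - x1, y2 - y1) * mm_per_pixel

    return {
        "pupillary_distance_mm": dist("left_pupil", "right_pupil"),
        "bridge_width_mm": dist("left_nasal_flare", "right_nasal_flare"),
        "face_width_mm": dist("left_zygion", "right_zygion"),
    }

# Example usage with made-up coordinates:
# extract_measurements({"left_pupil": (420, 310), "right_pupil": (580, 312),
#                       "left_nasal_flare": (480, 360), "right_nasal_flare": (520, 360),
#                       "left_zygion": (360, 340), "right_zygion": (640, 340)}, 0.4)
```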
[0060] Figure 3 is a block diagram of a computer system 300 configured to implement eyewear fitting, in accordance with some embodiments. The computer system 300 typically includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset). The computer system 300 includes one or more input devices 310 that facilitate user input, such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a gesture capturing camera, or other input buttons or controls. Furthermore, in some embodiments, the client device 140 of the computer system 300 uses a microphone for voice recognition or an eye tracking device 380 (e.g., a camera) for tracking eyeball movement. In some implementations, the client device 140 includes one or more optical cameras (e.g., an RGB camera), scanners, or photo sensor units for capturing images. The computer system 300 also includes one or more output devices 312 that enable presentation of user interfaces 210 and display content, including one or more speakers and/or one or more visual displays.
[0061] In some embodiments, the one or more input devices 310 include an image module 310A (Figure 5) for generating or extracting facial measurements 122 of an eyewear user 120E. In some embodiments, the one or more output devices 312 include a display module 312A (Figure 5) for presenting the adjustment data to a technician.
[0062] Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally,
includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other nonvolatile solid state storage devices. Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302. Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium. In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
• Operating system 314 including procedures for handling various basic system services and for performing hardware dependent tasks;
• Network communication module 316 for connecting each server 102 or client device 140 to other devices (e.g., server 102, client device 140, or storage 106) via one or more network interfaces 304 (wired or wireless) and one or more communication networks 108, such as the Internet, other wide area networks, local area networks, metropolitan area networks, and so on;
• User interface module 318 for enabling presentation of information (e.g., a graphical user interface for application(s) 324, widgets, websites and web pages thereof, and/or games, audio and/or video content, text, etc.) at each client device 140 via one or more output devices 312 (e.g., displays, speakers, etc.);
• Input processing module 320 for detecting one or more user inputs or interactions from one of the one or more input devices 310 and interpreting the detected input or interaction;
• Web browser module 322 for navigating, requesting (e.g., via HTTP), and displaying websites and web pages thereof, including a web interface for logging into a user account associated with a client device 140 or another electronic device, controlling the client or electronic device if associated with the user account, and editing and reviewing settings and data that are associated with the user account;
• One or more user applications 324 for execution by the computer system 300 (e.g., games, social network applications, smart home applications, extended reality application, and/or other web or non-web based applications for controlling another electronic device and reviewing data captured by such devices), where in some embodiments, an eyewear fitting application 326 is executed to implement eyewear fitting, and has a plurality of user accounts 328 associated with a plurality of users 120 (e.g., technician users 120T and eyewear users 120E in Figure 1); and
• One or more databases 350 for storing at least data including one or more of:
  o Device settings 352 including common device settings (e.g., service tier, device model, storage capacity, processing capabilities, communication capabilities, etc.) of the computer system 300; and
  o User account information 354 for the one or more user applications 324, e.g., user names, security questions, account history data, user preferences, and predefined account settings, where in some embodiments, the user account information 354 includes facial measurements 122 and one or more virtual fitting parameters 162 associated with a plurality of eyewear users 120E.
[0063] In some embodiments, the eyewear fitting application 326 applies an eyewear fitting model 330 to determine a virtual fitting parameter 162 of an eyewear based on facial measurements 122 of an eyewear user 120E. In some embodiments, the eyewear fitting application 326 includes a user account 328 associated with an eyewear user 120E, and a fit profile is created for the user account 328 to store the facial measurements 122 and the adjusted virtual fitting parameter 162 for the user account 328.
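A minimal sketch of how such a fit profile might be assembled for a user account 328 is shown below. The storage layout, the JSON serialization, and the `fitting_model` callable (standing in for the eyewear fitting model 330) are assumptions for illustration; the disclosure does not prescribe a particular persistence format.

```python
import json
import time

# Hypothetical fit-profile record for a user account 328, combining the extracted
# facial measurements 122 with the virtual fitting parameters 162 produced by a
# fitting model supplied by the caller.
def create_fit_profile(account_id, facial_measurements, fitting_model):
    fitting_parameters = fitting_model(facial_measurements)  # e.g., model 330
    profile = {
        "account_id": account_id,
        "created_at": time.time(),
        "facial_measurements": facial_measurements,
        "virtual_fitting_parameters": fitting_parameters,
    }
    return json.dumps(profile)  # e.g., persisted to user account information 354
```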
[0064] Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, modules or data structures, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some implementations, memory 306, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 306, optionally, stores additional modules and data structures not described above.
[0065] Figure 4 is a flowchart that describes a method 400 for virtual eyewear fitting, in accordance with some embodiments. In some implementations, at step 410, the method may include capturing a set of facial measurements 122 of the eyewear user via a remote imaging device. At step 420, the method may include transmitting the captured measurements 122 to a processing unit. At step 430, the method may include generating a virtual representation of the eyewear adjusted to the captured measurements 122. At step 440, the method may include displaying the virtual representation on a display device within a production facility. At step 450, the method may include adjusting a physical eyewear frame to match the virtual representation displayed on the display device. At step 460, the method may include shipping the adjusted eyewear frame to the eyewear user.
[0066] In some implementations, the remote imaging device may be a camera of an eyewear user’s mobile computer device (e.g., mobile devices 140A-140E in Figure 1). The mobile computer device may execute a user application for capturing and transmitting the image data of the user. In some implementations, the processing unit may be located remotely from the production facility and the display device may be a tablet computer device. In some implementations, the method may include capturing a video sequence of the eyewear user’s head movements and expressions to refine the eyewear fit based on dynamic facial metrics.
[0067] In some implementations, the remote imaging device may include depth-sensing technology (e.g., a depth sensor or a depth camera 310D) to capture a three-dimensional (3D) facial contour of the eyewear user. In some implementations, the method may include utilizing a feedback loop. The virtual representation may be adjusted based on real-time eyewear user feedback before the physical eyewear frame may be adjusted. In some implementations, the method may include simulating environmental conditions in the virtual representation to adjust the eyewear for specific eyewear user use cases such as sports or high-glare environments. In some implementations, the processing unit may further customize the eyewear by selecting frame styles and materials based on the eyewear user’s biometric data and aesthetic preferences. In some implementations, the method may include generating a predictive model of the eyewear user’s future facial changes and adjusting the eyewear to accommodate predicted changes within a predefined time frame.
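As a non-limiting illustration of the use-case simulation mentioned above, a rule-based adjustment of the fitting parameters could look like the following sketch. The specific offsets, parameter keys, and use-case labels are illustrative assumptions only; an implementation could equally derive them from simulated environmental conditions or a learned model.

```python
# Hypothetical rule-based tweak of fitting parameters for a simulated use case.
# The numeric offsets are made-up examples, not prescribed values.
def adjust_for_use_case(params: dict, use_case: str) -> dict:
    adjusted = dict(params)
    if use_case == "sports":
        # Assume a tighter wrap and more temple curvature for a secure fit.
        adjusted["temple_curvature"] = adjusted.get("temple_curvature", 0.5) + 0.2
        adjusted["frame_angle_deg"] = adjusted.get("frame_angle_deg", 5.0) + 3.0
    elif use_case == "high_glare":
        # Assume larger lens coverage for high-glare environments.
        adjusted["frame_height_mm"] = adjusted.get("frame_height_mm", 40.0) + 4.0
    return adjusted
```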
[0068] In accordance with some embodiments, a computer system (e.g., a server 102, a client device 140, or a combination thereof in Figure 1) obtains image data of a user 120E (Figure 1) captured by a remote imaging device and extracts a set of facial measurements 122 of a face of the user 120E based on the image data. The image data of the user 120E includes a plurality of images corresponding to a plurality of different angles, and the set of facial measurements 122 of the face of the user 120E includes at least interpupillary distance and ear-to-ear width. The computer system obtains visual information of an eyewear of the user 120E, e.g., from a storage 106 (Figure 1). The computer system adjusts a virtual fitting parameter of the eyewear of the user 120E based on the set of facial measurements 122, and visualizes, on a first user interface 210A (Figure 2A), the eyewear of the user 120E based on the visual information and the adjusted virtual fitting parameter of the eyewear. For example, the computer system causes display of a virtual representation of the eyewear on the first user interface 210A. The computer system generates an instruction to adjust the eyewear of the user 120E physically based on the virtual fitting parameter, and causes display of the instruction including the virtual fitting parameter on a second user interface 210B. In some embodiments,
the first user interface 210A is the second user interface 210B. In some embodiments, the first user interface 210A is replaced with the second user interface 210B.
[0069] In some implementations, the first user interface 210A and the second user interface 210B are presented to a technician user 120T (Figure 1) who adjusts a physical eyewear frame to match the virtual representation displayed on the display device associated with the technician user before the adjusted eyewear frame is provided to the eyewear user 120E. Stated another way, the first user interface 210A and the second user interface 210B are displayed on a display of an electronic device (e.g., client device 140A) located in a production facility.
[0070] Figure 5 is a block diagram that describes a system 500 for virtual eyewear fitting, in accordance with some embodiments. In some implementations, the system 500 may include an imaging module 310A configured to capture facial measurements 122, a processing module 302A configured to receive the facial measurements 122 and generate adjustment data for the eyewear, a display module 312A for presenting the adjustment data to a technician, and an adjustment station 540 where the technician adjusts the eyewear in accordance with the adjustment data.
[0071] In some implementations, the imaging module 310A of the system 500 includes a mobile application running on a user’s smartphone capable of capturing and transmitting the measurements 122. In some implementations, the processing module 302A utilizes a machine learning algorithm to refine the adjustment data based on historical fit data and eyewear user feedback. In some implementations, the system 500 may include a feedback module configured to receive post-delivery feedback from eyewear users to improve the adjustment data algorithm.
[0072] In some implementations, the display module 312A may include augmented reality capabilities to overlay the adjustment data onto a live image of the eyewear frame during adjustment. In some implementations, the system 500 may include a calibration module to ensure that the display device presents the virtual representation at a one-to-one scale with the physical eyewear. In some implementations, the system 500 may include a quality assurance module that verifies the physical adjustment of the eyewear frame against the virtual representation using image recognition technology. In some implementations, the adjustment station 540 may include robotic arms controlled by the adjustment data to automate the adjustment of the eyewear frame. In some implementations, the imaging module 310A may be further configured to capture biometric data such as interpupillary distance and ear-to-ear width using a plurality of images from different angles.
[0073] In some implementations, a non-transitory computer-readable storage medium has instructions stored thereon that, when executed by a processor, cause the processor to perform operations comprising: receiving facial measurements 122 from a mobile device; generating virtual fitting parameters for eyewear based on the received measurements 122; displaying the virtual fitting parameters on a display device to assist in manual adjustment of the eyewear; and recording the adjustments to create a fit profile associated with the eyewear user.
[0074] Figure 6 is a flowchart that describes a method 600 for verifying accuracy of eyewear adjustment, in accordance with some embodiments. In some implementations, at step 610, the method 600 may include capturing an image of the adjusted eyewear frame. At step 620, the method may include comparing the image to the virtual representation 220. At step 630, the method may include determining deviations between the adjusted frame and the virtual representation 220. At step 640, the method may include providing feedback (e.g., an instruction 202 in Figure 2A) to the technician user 120T for further adjustment if deviations exceed a predetermined threshold.
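By way of a non-limiting illustration, steps 620 through 640 could be realized with a per-parameter deviation check such as the sketch below. The tolerance value, the parameter keys, and the assumption that both the measured frame geometry and the virtual representation are available as dictionaries of numeric values are illustrative only.

```python
# Hypothetical per-parameter deviation check corresponding to steps 620-640.
# `measured` holds geometry recovered from the captured image; `virtual` holds the
# target values from the virtual representation 220. Tolerance is an assumed value.
def deviations_exceeding_threshold(measured: dict, virtual: dict, tolerance_mm=1.0):
    """Return the parameters whose measured value deviates from the virtual
    representation by more than the predetermined threshold."""
    feedback = {}
    for key, target in virtual.items():
        if key in measured and abs(measured[key] - target) > tolerance_mm:
            feedback[key] = measured[key] - target  # signed deviation for the technician
    return feedback
```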
[0075] Figure 7 is a flowchart that describes a method 700 for enhancing eyewear user engagement, in accordance with some embodiments. In some implementations, at step 710, the method may include providing a user interface for the eyewear user 120E to view and modify the virtual representation 220 of the eyewear. At step 720, the method may include enabling the eyewear user to select from various adjustment suggestions. At step 730, the method may include finalizing the eyewear adjustment based on the eyewear user’s selections.
[0076] Figure 8 is a flowchart that describes a computer-implemented method 800 for training an artificial intelligence model, in accordance with some embodiments. In some implementations, at 810, the computer-implemented method may include collecting a large dataset of facial measurements 122 and corresponding eyewear user satisfaction ratings. At 820, the computer-implemented method may include using the dataset to train the artificial intelligence model to predict optimal fit adjustments. At 830, the computer-implemented method may include continuously updating the model with new data to improve fit predictions over time. For example, in some embodiments, an eyewear adjustment model 330 (Figure 3) is applied to adjust a virtual fitting parameter 162 (Figure 3) of an eyewear of an eyewear user 120E based on the set of facial measurements 122 extracted from image data provided by a client device 140 associated with the eyewear user. Further, in some embodiments, historical fit data (e.g., historical facial measurements 122 of former users 120E, historical virtual fitting parameters 162 of their eyewear) and associated eyewear user feedback information (e.g., eyewear user satisfaction ratings) are collected and used to train the eyewear adjustment model 330.
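A minimal training sketch, assuming a regression formulation, is shown below. The choice of estimator (a random-forest regressor from scikit-learn), the feature layout, and the toy data values are assumptions for illustration; the disclosure does not mandate any particular model family or library.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training of an eyewear adjustment model 330: each row of X holds
# historical facial measurements 122 (PD, bridge width, face width, in mm); each
# row of y holds the virtual fitting parameters 162 (bridge width, temple length,
# frame tilt) associated with high user satisfaction ratings. Values are toy data.
X = np.array([[62.0, 18.5, 140.0],
              [58.0, 17.0, 132.0]])
y = np.array([[18.0, 145.0, 8.0],
              [17.0, 140.0, 7.0]])

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
predicted_fit = model.predict([[60.0, 18.0, 136.0]])  # fit prediction for a new user
```

In a deployed system, the model would be retrained as new measurements and satisfaction ratings accumulate, consistent with step 830.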
[0077] In some embodiments, an adjustment data algorithm is applied to adjust the virtual fitting parameter 162 of the eyewear of the user based on the set of facial measurements 122. In some embodiments, the adjustment data algorithm does not use any neural network; alternatively, the adjustment data algorithm is based on machine learning and includes a neural network. The computer system receives a post-delivery feedback message from a technician user 120T, and updates the adjustment data algorithm based on the post-delivery feedback message.
[0078] In some embodiments, a computer system (e.g., a server 102, a client device 140, or a combination thereof in Figure 1) applies a facial development model to predict future facial measurements 122 of the face of the eyewear user 120E corresponding to a predefined time frame (e.g., during the next 36 months), wherein the virtual fitting parameter 162 of the eyewear 160 is adjusted based on the future facial measurements 122 of the face.
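One hypothetical, deliberately simple facial development model is a linear extrapolation of a measurement over time, as sketched below. The linear-trend assumption, the example time points, and the 36-month horizon are illustrative only; a production system might instead use growth curves or a learned model.

```python
import numpy as np

# Hypothetical facial development model: linearly extrapolate one facial
# measurement (e.g., pupillary distance in mm) to a future time horizon.
def predict_future_measurement(months, values, horizon_months=36):
    """`months` are the times (in months) at which `values` were observed;
    the prediction is made `horizon_months` after the latest observation."""
    slope, intercept = np.polyfit(months, values, deg=1)
    return slope * (max(months) + horizon_months) + intercept

# Example with made-up observations at 0, 12, and 24 months:
# predict_future_measurement([0, 12, 24], [54.0, 55.2, 56.1])
```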
[0079] As used herein, the word “module” refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpretive language such as BASIC. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM or EEPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware.
[0080] It is contemplated that the modules may be integrated into a fewer number of modules. One module may also be separated into multiple modules. The described modules may be implemented as hardware, software, firmware or any combination thereof. Additionally, the described modules may reside at different locations connected through a wired or wireless network, or the Internet.
[0081] In general, it will be appreciated that the processors can include, by way of example, computers, program logic, or other substrate configurations representing data and instructions, which operate as described herein. In other embodiments, the processors can include controller circuitry, processor circuitry, processors, general purpose single-chip or multi-chip microprocessors, digital signal processors, embedded microprocessors, microcontrollers and the like.
[0082] Furthermore, it will be appreciated that in one embodiment, the program logic may advantageously be implemented as one or more components. The components may advantageously be configured to execute on one or more processors. The components include, but are not limited to, software or hardware components, modules such as software modules, object-oriented software components, class components and task components, processes, methods, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
[0083] The foregoing description is provided to enable a person skilled in the art to practice the various configurations described herein. While the subject technology has been particularly described with reference to the various figures and configurations, it should be understood that these are for illustration purposes only and should not be taken as limiting the scope of the subject technology.
[0084] There may be many other ways to implement the subject technology. Various functions and elements described herein may be partitioned differently from those shown without departing from the scope of the subject technology. Various modifications to these configurations will be readily apparent to those skilled in the art, and generic principles defined herein may be applied to other configurations. Thus, many changes and modifications may be made to the subject technology, by one having ordinary skill in the art, without departing from the scope of the subject technology.
[0085] It is understood that the specific order or hierarchy of steps in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged. Some of the steps may be performed simultaneously. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
[0086] As used herein, the phrase “at least one of” preceding a series of items, with the term “and” or “or” to separate any of the items, modifies the list as a whole, rather than each member of the list (i.e., each item). The phrase “at least one of” does not require selection of at least one of each item listed; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items. By way of example, the phrases “at least one of A, B, and C” or “at least one of A, B, or C” each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.
[0087] Furthermore, to the extent that the term “include,” “have,” or the like is used in the description or the claims, such term is intended to be inclusive in a manner similar to the term “comprise” as “comprise” is interpreted when employed as a transitional word in a claim.
[0088] As used herein, the term “about” is relative to the actual value stated, as will be appreciated by those of skill in the art, and allows for approximations, inaccuracies and limits of measurement under the relevant circumstances. In one or more aspects, the terms “about,” “substantially,” and “approximately” may provide an industry-accepted tolerance for their corresponding terms and/or relativity between items, such as a tolerance of from less than one percent to ten percent of the actual value stated, and other suitable tolerances.
[0089] As used herein, the term “comprising” indicates the presence of the specified integer(s), but allows for the possibility of other integers, unspecified. This term does not imply any particular proportion of the specified integers. Variations of the word “comprising,” such as “comprise” and “comprises,” have correspondingly similar meanings.
[0090] The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.
[0091] A reference to an element in the singular is not intended to mean “one and only one” unless specifically stated, but rather “one or more.” Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. The term “some” refers to one or more. Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the subject technology, and are not referred to in connection with the interpretation of the description of the subject technology. All structural and functional equivalents to the elements of the various configurations described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and intended to be encompassed by the subject technology. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the above description.
[0092] Although the detailed description contains many specifics, these should not be construed as limiting the scope of the subject technology but merely as illustrating different examples and aspects of the subject technology. It should be appreciated that the scope of the subject technology includes other embodiments not discussed in detail above. Various other modifications, changes and variations may be made in the arrangement, operation and details of the method and apparatus of the subject technology disclosed herein without departing from the scope of the present disclosure. In addition, it is not necessary for a device or method to
address every problem that is solvable (or possess every advantage that is achievable) by different embodiments of the disclosure in order to be encompassed within the scope of the disclosure. The use herein of “can” and derivatives thereof shall be understood in the sense of “possibly” or “optionally” as opposed to an affirmative capability.
Illustration of Subject Technology as Clauses
[0093] Various examples of aspects of the disclosure are described as numbered clauses (1, 2, 3, etc.) for convenience. These are provided as examples, and do not limit the subject technology. Identifications of the figures and reference numbers are provided below merely as examples and for illustrative purposes, and the clauses are not limited by those identifications.
[0094] In some implementations, any of the clauses herein may depend from any one of the independent clauses or any one of the dependent clauses. In one aspect, any of the clauses (e.g., dependent or independent clauses) may be combined with any other one or more clauses (e.g., dependent or independent clauses). In one aspect, a claim may include some or all of the words (e.g., steps, operations, means or components) recited in a clause, a sentence, a phrase or a paragraph. In one aspect, a claim may include some or all of the words recited in one or more clauses, sentences, phrases or paragraphs. In one aspect, some of the words in each of the clauses, sentences, phrases or paragraphs may be removed. In one aspect, additional words or elements may be added to a clause, a sentence, a phrase or a paragraph. In one aspect, the subject technology may be implemented without utilizing some of the components, elements, functions or operations described herein. In one aspect, the subject technology may be implemented utilizing additional components, elements, functions or operations.
[0095] Clause 1. A method for virtual eyewear fitting, comprising: obtaining image data of a user captured by a remote imaging device; generating or extracting a set of facial measurements of a face of the user based on the image data; obtaining visual information of an eyewear of the user; adjusting a virtual fitting parameter of the eyewear of the user based on the set of facial measurements; visualizing, on a first user interface, the eyewear of the user based on the visual information and the adjusted virtual fitting parameter of the eyewear, including causing display of a virtual representation of the eyewear on the first user interface; generating an instruction to adjust the eyewear of the user physically based on the virtual fitting parameter; and causing display of the instruction including the virtual fitting parameter on a second user interface.
[0096] Clause 2. The method of clause 1, wherein the remote imaging device includes a camera of a mobile device associated with the user and is configured to execute a user application for capturing and transmitting the image data of the user.
[0097] Clause 3. The method of clause 1 or 2, wherein the remote imaging device includes a depth sensor configured to capture a three-dimensional facial contour of the user.
[0098] Clause 4. The method of any of the preceding clauses, wherein the method is implemented by a computer system, and the first user interface and the second user interface are displayed on a display of an electronic device located in a production facility.
[0099] Clause 5. The method of any of the preceding clauses, wherein an eyewear adjustment model is applied to adjust the virtual fitting parameter of the eyewear of the user based on the set of facial measurements.
[0100] Clause 6. The method of clause 5, further comprising: obtaining historical fit data and associated eyewear user feedback information, the historical fit data including historical virtual fitting parameters and historical facial measurements; and training the eyewear adjustment model based on the historical fit data and the associated eyewear user feedback information.
[0101] Clause 7. The method of any of the preceding clauses, wherein an adjustment data algorithm is applied to adjust the virtual fitting parameter of the eyewear of the user based on the set of facial measurements, the method further comprising: receiving a post-delivery feedback message from the user; and updating the adjustment data algorithm based on the post-delivery feedback message.
[0102] Clause 8. The method of any of the preceding clauses, further comprising: executing an eyewear fitting application including a user account associated with the user; and creating a fit profile including the facial measurements and the adjusted virtual fitting parameter for the user account.
[0103] Clause 9. The method of any of the preceding clauses, wherein the image data of the user include a video sequence of head movements and expressions of the user, and generating or extracting the set of facial measurements further comprises: generating or extracting dynamic facial metrics from the video sequence, wherein the virtual fitting parameter is adjusted based on the dynamic facial metrics to refine an eyewear fit for the user.
[0104] Clause 10. The method of any of the preceding clauses, wherein the first user interface is displayed on an electronic device associated with the user, further comprising: receiving, from the first user interface, a user feedback message for the adjusted virtual fitting
parameter of the eyewear; and prior to display of the instruction, updating the instruction based on the user feedback message.
[0105] Clause 11. The method of any of the preceding clauses, further comprising: determining an environmental condition in which the eyewear is configured to be worn by the user, wherein the virtual fitting parameter of the eyewear is adjusted based on the environmental condition.
[0106] Clause 12. The method of any of the preceding clauses, wherein the first user interface includes the second user interface and is displayed on an augmented reality (AR) device, and the instruction is overlaid on a live image visualized for the eyewear.
[0107] Clause 13. The method of any of the preceding clauses, wherein the eyewear of the user is visualized with a one-to-one scale on the first user interface.
[0108] Clause 14. The method of any of the preceding clauses, further comprising: obtaining image data of the eyewear that has been adjusted in response to the instruction; comparing the image data of the eyewear and the virtual representation of the eyewear; based on a comparison result, determining whether adjustment of the eyewear matches the adjusted virtual fitting parameter of the eyewear.
[0109] Clause 15. The method of clause 14, wherein the comparison result includes a difference between the image data of the eyewear and the virtual representation of the eyewear, the method further comprising, iteratively and until the difference is below a predetermined threshold: updating the instruction to adjust the eyewear of the user physically based on the difference; obtaining additional image data of the eyewear in response to the instruction; and comparing the additional image data and the virtual representation of the eyewear to update the difference.
[0110] Clause 16. The method of any of the preceding clauses, further comprising: selecting a frame style and a material based on biometric data and aesthetic preferences of the user.
[0111] Clause 17. The method of any of the preceding clauses, further comprising: applying a facial development model to predict future facial measurements of the face of the user corresponding to a predefined time frame, wherein the virtual fitting parameter of the eyewear is adjusted based on the future facial measures of the face.
[0112] Clause 18. The method of any of the preceding clauses, further comprising: in response to the instruction, controlling a robotic arm to adjust the eyewear based on the virtual fitting parameter.
[0113] Clause 19. The method of any of the preceding clauses, further comprising: receiving, on the first user interface, a user input requesting a modification to the virtual representation, the user input selecting the modification from a plurality of adjustment suggestions; and based on the user input, modifying the virtual representation on the first user interface.
[0114] Clause 20. The method of any of the preceding clauses, wherein the image data of the user includes a plurality of images corresponding to a plurality of different angles, and the set of facial measurements of the face of the user includes at least interpupillary distance and ear-to-ear width.
[0115] Clause 21. A method for custom-fitting eyewear for an online eyewear user, comprising: capturing a set of facial measurements of the eyewear user via a remote imaging device; transmitting the captured measurements to a processing unit; generating a virtual representation of the eyewear adjusted to the captured measurements; displaying the virtual representation on a display device within a production facility; adjusting a physical eyewear frame to match the virtual representation displayed on the display device; and shipping the adjusted eyewear frame to the eyewear user.
[0116] Clause 22. The method of clause 21, wherein the remote imaging device is a camera of an eyewear user’s mobile computer device.
[0117] Clause 23. The method of clause 21 or 22, wherein the processing unit is located remotely from the production facility and the display device is a tablet computer device.
[0118] Clause 24. The method of any of clauses 21-23, further comprising: capturing a video sequence of the eyewear user’s head movements and expressions to refine the eyewear fit based on dynamic facial metrics.
[0119] Clause 25. The method of any of clauses 21-24, wherein: the remote imaging device includes depth-sensing technology to capture three-dimensional facial contours.
[0120] Clause 26. The method of any of clauses 21-25, further comprising: utilizing a feedback loop wherein the virtual representation is adjusted based on real-time eyewear user feedback before the physical eyewear frame is adjusted.
[0121] Clause 27. The method of any of clauses 21-26, further comprising: simulating environmental conditions in the virtual representation to adjust the eyewear for specific eyewear user use cases such as sports or high-glare environments.
[0122] Clause 28. The method of any of clauses 21-27, wherein: the processing unit further customizes the eyewear by selecting frame styles and materials based on the eyewear user’s biometric data and aesthetic preferences.
[0123] Clause 29. The method of any of clauses 21-28, further comprising: generating a predictive model of the eyewear user’s future facial changes and adjusting the eyewear to accommodate predicted changes within a predefined time frame.
[0124] Clause 30. A system for custom-fitting eyewear, comprising: an imaging module configured to capture facial measurements; a processing module configured to receive the facial measurements and generate adjustment data for the eyewear; a display module for presenting the adjustment data to a technician; and an adjustment station where the technician adjusts the eyewear in accordance with the adjustment data.
[0125] Clause 31. The system of clause 30, where the imaging module includes a mobile application running on a user’s smartphone capable of capturing and transmitting the measurements.
[0126] Clause 32. The system of clause 30 or 31, where the processing module utilizes a machine learning algorithm to refine the adjustment data based on historical fit data and eyewear user feedback.
[0127] Clause 33. The system of any of clauses 30-32, further comprising a feedback module configured to receive post-delivery feedback from eyewear users to improve the adjustment data algorithm.
[0128] Clause 34. The system of any of clauses 30-33, wherein: the display module includes augmented reality capabilities to overlay the adjustment data onto a live image of the eyewear frame during adjustment.
[0129] Clause 35. The system of any of clauses 30-34, further comprising: a calibration module to ensure that the display device presents the virtual representation at a one-to-one scale with the physical eyewear.
[0130] Clause 36. The system of any of clauses 30-35, further comprising: a quality assurance module that verifies the physical adjustment of the eyewear frame against the virtual representation using image recognition technology.
[0131] Clause 37. The system of any of clauses 30-36, wherein: the adjustment station includes robotic arms controlled by the adjustment data to automate the adjustment of the eyewear frame.
[0132] Clause 38. The system of any of clauses 30-37, wherein: the imaging module is further configured to capture biometric data such as interpupillary distance and ear-to-ear width using a plurality of images from different angles.
[0133] Clause 39. A non-transitory computer-readable storage medium having instructions stored thereon that, when executed by a processor, cause the processor to perform operations comprising: receiving facial measurements from a mobile device; generating virtual fitting parameters for eyewear based on the received measurements; displaying the virtual fitting parameters on a display device to assist in manual adjustment of the eyewear; and recording the adjustments to create a fit profile associated with the eyewear user.
[0134] Clause 40. A method for verifying the accuracy of eyewear adjustment, comprising: capturing an image of the adjusted eyewear frame; comparing the image to the virtual representation; determining deviations between the adjusted frame and the virtual representation; and providing feedback to the technician for further adjustment if deviations exceed a predetermined threshold.
[0135] Clause 41. A method for enhancing eyewear user engagement in the eyewear fitting process, comprising: providing a user interface for the eyewear user to view and modify the virtual representation of the eyewear; enabling the eyewear user to select from various adjustment suggestions; finalizing the eyewear adjustment based on the eyewear user’s selections.
[0136] Clause 42. A computer-implemented method for training an artificial intelligence model for eyewear fitting, comprising: collecting a large dataset of facial measurements and corresponding eyewear user satisfaction ratings; using the dataset to train the artificial intelligence model to predict optimal fit adjustments; and continuously updating the model with new data to improve fit predictions over time.
[0137] Clause 43. A non-transitory computer readable storage medium, storing one or more programs for execution by one or more processors of a computer system, the one or more programs including instructions for performing a method in any of clauses 1-29, 40, and 41.
[0138] Clause 44. A computer system, comprising: one or more processors; memory for storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing a method in any of clauses 1-29, 40, and 41.
Claims
1. A method for virtual eyewear fitting, comprising: obtaining image data of a user captured by a remote imaging device; generating a set of facial measurements of a face of the user based on the image data; obtaining visual information of an eyewear of the user; adjusting a virtual fitting parameter of the eyewear of the user based on the set of facial measurements; visualizing, on a first user interface, the eyewear of the user based on the visual information and the adjusted virtual fitting parameter of the eyewear, including causing display of a virtual representation of the eyewear on the first user interface; generating an instruction to adjust the eyewear of the user physically based on the virtual fitting parameter; and causing display of the instruction including the virtual fitting parameter on a second user interface.
2. The method of claim 1, wherein the remote imaging device includes a camera of a mobile device associated with the user and is configured to execute a user application for capturing and transmitting the image data of the user.
3. The method of claim 1, wherein the remote imaging device includes a depth sensor configured to capture a three-dimensional facial contour of the user.
4. The method of claim 1, wherein the method is implemented by a computer system, and the first user interface and the second user interface are displayed on a display of an electronic device located in a production facility.
5. The method of claim 1, wherein an eyewear adjustment model is applied to adjust the virtual fitting parameter of the eyewear of the user based on the set of facial measurements.
6. The method of claim 5, further comprising: obtaining historical fit data and associated eyewear user feedback information, the historical fit data including historical virtual fitting parameters and historical facial measurements; and training the eyewear adjustment model based on the historical fit data and the associated eyewear user feedback information.
7. The method of claim 1, wherein an adjustment data algorithm is applied to adjust the virtual fitting parameter of the eyewear of the user based on the set of facial measurements, the method further comprising: receiving a post-delivery feedback message from the user; and updating the adjustment data algorithm based on the post-delivery feedback message.
8. The method of claim 1, further comprising: executing an eyewear fitting application including a user account associated with the user; and creating a fit profile including the facial measurements and the adjusted virtual fitting parameter for the user account.
9. The method of claim 1, wherein the image data of the user include a video sequence of head movements and expressions of the user, and extracting the set of facial measurements further comprises: extracting dynamic facial metrics from the video sequence, wherein the virtual fitting parameter is adjusted based on the dynamic facial metrics to refine an eyewear fit for the user.
10. The method of claim 1, wherein the first user interface is displayed on an electronic device, further comprising: receiving, from the first user interface, a user feedback message for the adjusted virtual fitting parameter of the eyewear; and prior to display of the instruction, updating the instruction based on the user feedback message.
11. The method of claim 1, further comprising: determining an environmental condition in which the eyewear is configured to be worn by the user, wherein the virtual fitting parameter of the eyewear is adjusted based on the environmental condition.
12. The method of claim 1, wherein the first user interface includes the second user interface and is displayed on an augmented reality (AR) device, and the instruction is overlaid on a live image visualized for the eyewear.
13. The method of claim 1, further comprising: obtaining image data of the eyewear that has been adjusted in response to the instruction;
comparing the image data of the eyewear and the virtual representation of the eyewear; based on a comparison result, determining whether adjustment of the eyewear matches the adjusted virtual fitting parameter of the eyewear.
14. The method of claim 13, wherein the comparison result includes a difference between the image data of the eyewear and the virtual representation of the eyewear, the method further comprising, iteratively and until the difference is below a predetermined threshold: updating the instruction to adjust the eyewear of the user physically based on the difference; obtaining additional image data of the eyewear in response to the instruction; and comparing the additional image data and the virtual representation of the eyewear to update the difference.
15. The method of claim 1, further comprising: applying a facial development model to predict future facial measurements of the face of the user corresponding to a predefined time frame, wherein the virtual fitting parameter of the eyewear is adjusted based on the future facial measures of the face.
16. The method of claim 1, further comprising: in response to the instruction, controlling a robotic arm to adjust the eyewear based on the virtual fitting parameter.
17. The method of claim 1, further comprising: receiving, on the first user interface, a user input requesting a modification to the virtual representation, the user input selecting the modification from a plurality of adjustment suggestions; and based on the user input, modifying the virtual representation on the first user interface.
18. The method of claim 1, wherein the image data of the user includes a plurality of images corresponding to a plurality of different angles, and the set of facial measurements of the face of the user includes at least interpupillary distance and ear-to-ear width.
19. A non-transitory computer readable storage medium, storing one or more programs for execution by one or more processors of a computer system, the one or more programs including instructions for performing the method of any of claims 1-18.
20. A computer system, comprising: one or more processors; memory for storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing the method of any of Claims 1-18.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463569095P | 2024-03-23 | 2024-03-23 | |
| US63/569,095 | 2024-03-23 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025207460A1 (en) | 2025-10-02 |
Family
ID=97216748
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/021054 (WO2025207460A1, pending) | Virtual eyewear fitting system | 2024-03-23 | 2025-03-23 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025207460A1 (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150055085A1 (en) * | 2013-08-22 | 2015-02-26 | Bespoke, Inc. | Method and system to create products |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25777120; Country of ref document: EP; Kind code of ref document: A1 |