WO2022137245A1 - System and method for virtually trying-on clothes
- Publication number
- WO2022137245A1 (PCT/IN2020/051050)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- clothes
- avatar
- photograph
- metrics
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0621—Electronic shopping [e-shopping] by configuring or customising goods or services
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Electronic shopping [e-shopping] by investigating goods or services
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping
- G06Q30/0643—Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping graphically representing goods, e.g. 3D product representation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
Definitions
- Fig. 3 shows a method 300 for generating a 3D avatar of the user.
- the user logs into the user access device 140.
- the user may enter his credentials including username and password.
- the user can be provided with an interface on the display of the user access device 140.
- the interface may allow the user to input one or more parameters as requested by the system 110.
- the system 110 requests the user's gender.
- the form may include a drop-down list of possible answers, such as male, female and the like. The user may select his gender from the drop-down list.
- the system 110 receives the height of the user.
- the user may select “cm” as the unit from a drop-down list and input "167" as the height.
- the user inputs his weight along with the corresponding unit of weight.
- the user may select "pounds" from a drop-down list and input 70 as his weight.
- the user can upload his front pose photograph i.e., a photograph of the front side of the user including his body and face. Alternatively, the user can be provided with an option to capture the photograph using a camera of the user access device 140.
- the user uploads another photograph, a selfie.
- Fig 6 illustrates an embodiment of the interface presented by the system 110 to the user for requesting the front and selfie photographs.
- the first pose in the figure is a front pose 610 and the second pose is a left selfie 620.
- the system 110 renders the details obtained above from the user to generate a 3D avatar of the user.
- the 3D avatar is a simulated 3D model of the user rendered using augmented reality and computer vision, and having a face similar to the face of the user.
- the body of the 3D avatar resembles the body metrics of the user.
- the 3D avatar is stored in the user profile database 130 and can be retrieved later by the system 110.
- the 3D avatar is only stored in the user access device 140.
- the 3D avatar can be stored in the cache memory of the user access device 140. Storing the 3D avatar in the user access device 140 may be required when the user does not want to share or save his details on an external server.
- Fig. 4 shows a method 400 for visualizing clothes superimposed on the 3D avatar. The users, by visualizing the clothes on their 3D avatar, can predict how the corresponding clothes will fit them or how they will look wearing the corresponding clothes.
- the user may log into the user access device 140 at step 410, as discussed for step 310. A user already logged into the system 110 at step 310 need not log in again at step 410 unless the user has logged out of the system 110.
- the user may browse through a catalog of the seller. For example, the seller may list the clothes on their website and the user can browse the clothes by accessing the website. Photographs of the clothes are listed with information such as size, color, fitting, etc.
- Fig. 5 illustrates one such listing which is a photograph of the front side of clothes. Each of the listings of clothes is linked to a corresponding 3D clothes.
- the system 110 retrieves the 3D clothes linked to the selected clothes at step 430. Thereafter, at step 440, the system 110 retrieves the 3D avatar of the user from the user profiles database 130 or the cache memory of the user device 140, as the case may be. At step 450, the system 110, using augmented reality, superimposes the 3D clothes over the 3D avatar, such that the 3D avatar appears to be wearing the clothes. At step 460, the 3D avatar is displayed to the user on their user access device 140. The user is also provided with suitable controls for rotating the avatar at step 470. Step 470 is further illustrated in Fig. 7, which shows the 3D avatar.
- the user can rotate the 3D avatar along its vertical axis in 360 degrees.
- the user is presented with the front side of the avatar as shown in Fig 7A.
- the user thereafter rotates the avatar clockwise 90 degrees, as shown in Fig. 7B.
- the user further rotates the avatar 90 degrees along its vertical axis (height of the avatar), as shown in Fig. 7C.
- the user can further rotate the avatar to full 360 degrees.
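The quarter-turn rotations described above correspond to a standard rotation about the avatar's vertical axis. As a minimal sketch (assuming a y-up coordinate system, which the patent does not specify), each vertex of the avatar can be transformed as follows:

```python
import math

def rotate_about_vertical(x: float, y: float, z: float, degrees: float) -> tuple:
    """Rotate a single point about the vertical (y) axis by the given angle."""
    theta = math.radians(degrees)
    c, s = math.cos(theta), math.sin(theta)
    # The height (y) is unchanged; x and z rotate in the horizontal plane.
    return (c * x + s * z, y, -s * x + c * z)
```

Applying the function with 90 degrees four times returns each vertex to its starting position, matching the full 360-degree rotation shown in Fig. 7.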
Abstract
This invention is directed to a system (110) and method for virtually trying-on clothes. This invention allows a user to virtually predict the fitness and appearance of clothes when worn. The user inputs one or more body metrics, including gender, height, and weight of the user into the system. Furthermore, the user inputs a front pose (610) and a selfie photograph, wherein the photograph includes face and body of the user. The system, based on the input metrics and the photographs, generates a 3D avatar (700) of the user using augmented reality and computer vision. Thereafter, 3D clothes are superimposed on the 3D avatar such that the user can visualize the clothes as worn over the 3D avatar in 360 degrees by rotating the 3D avatar along the vertical axis of the 3D avatar.
Description
SYSTEM AND METHOD FOR VIRTUALLY TRYING-ON CLOTHES
FIELD OF INVENTION AND USE OF INVENTION
The invention relates to a virtual clothing modeling system and a computer-implemented method, more particularly, the invention relates to a system and a method for virtually trying on clothes.
PRIOR ART AND PROBLEM TO BE SOLVED
E-commerce, also known as electronic commerce, refers to the buying or selling of goods or services over the internet. E-commerce has boosted the buying and selling of goods and services by eliminating theoretical geographical limitations. Its advantages include a faster buying/selling procedure and the ability to place orders 24 hours a day, 7 days a week. Customers can also easily select products from different providers without moving around physically.
In the case of clothes, a consumer often travels to a store and tries on several articles of clothing in a trial room. The consumer tries on the clothes to be assured that they will fit properly, and to see how they look wearing the clothes they wish to buy. The inability to try on clothes is a major limitation in the online buying or selling of clothes. The consumer has to rely on the photographs of the clothes and the measurement scales provided by the seller. Overall, the clothes-buying experience of consumers is generally unsatisfactory. Sellers often allow consumers to return clothes that do not fit properly or are not liked. However, returning clothes places an extra burden on the sellers and shrinks profits.
Properly addressing the issues with selling or buying clothes online would improve the efficiency of the retailer's business, increase the satisfaction and experience of the consumer, and positively impact the carbon footprint if fewer items are returned.
Various approaches are known for improving the reliability of online retailing for customers. For example, a US Patent Application, Pub. No. US20120299912A1 discloses a method to help a user visualize how a wearable article will look on the user's body. The method includes creating a head and body model of the user based on depth maps and thereafter creating a three-dimensional avatar based on the model.
The methods of the prior art suffer from one or more disadvantages, such as being too complex in processing or requiring a complex setup. Another major disadvantage of the prior-art methods is that they do not actually allow a user to predict whether the clothes will fit them or how they will look wearing the corresponding clothes. Generally, the clothes are applied on the front side of the model rather than actually worn over the model. Thus, the user does not get a fair idea of the fit of the clothing.
Thus, considering the advantages of e-commerce, a method which allows users to virtually and more accurately predict the fitness and appearance of clothes when worn is needed.
The term clothes herein connotes all forms of wearable articles of clothing and includes apparel and garments. Moreover, the terms clothes, apparel and garment may be used interchangeably.
The term avatar herein connotes a 3D model of a user in accordance with one or more embodiments of the invention.
OBJECTS OF THE INVENTION
An objective of this invention is, therefore, to allow users to virtually and precisely predict the fitness and appearance, when worn, of the clothes they wish to buy.
Another objective of this invention is to allow the users to rotate a 3D model of the user 360 degrees for visualizing the clothes superimposed on the 3D model.
Yet another objective of this invention is to be simpler and more economical to execute than the prior-art approaches.
SUMMARY OF THE INVENTION
Certain embodiments of this invention are directed to a system and method for virtually trying-on clothes. This invention allows a user to virtually and precisely predict the fitness and appearance of clothes on their bodies. The user inputs one or more body metrics into the system. The body metrics include gender, height, and weight of the user. Furthermore, the user inputs a front pose and a selfie photograph, wherein the photograph includes face and body of the user. The system, based on the input metrics and photographs, generates a 3D avatar of the user using augmented reality and computer vision. Thereafter, 3D clothes are superimposed on the 3D avatar such that the user can
visualize the clothes as worn over the 3D avatar in 360 degrees by rotating the 3D avatar along the vertical axis of the 3D avatar.
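The summarised flow can be sketched in code. This is an illustrative outline only: the function names, the `BodyMetrics` fields and the dictionary shapes are assumptions made for the sketch, not identifiers from the patent, and the avatar-generation and superimposition steps are stubs standing in for the augmented-reality and computer-vision processing.

```python
from dataclasses import dataclass

@dataclass
class BodyMetrics:
    gender: str       # e.g. "male" or "female"
    height_cm: float
    weight_kg: float

def generate_avatar(metrics: BodyMetrics, front_photo: str, selfie_photo: str) -> dict:
    # Stub for the AR / computer-vision step that builds the 3D avatar
    # from the body metrics and the two photographs.
    return {"metrics": metrics, "body_from": front_photo, "face_from": selfie_photo}

def superimpose(garment: dict, avatar: dict) -> dict:
    # Stub for draping the 3D garment over the avatar so that the avatar
    # appears to be wearing it.
    return {"avatar": avatar, "wearing": garment}

def try_on(metrics: BodyMetrics, front_photo: str, selfie_photo: str,
           garment_id: str, clothes_db: dict) -> dict:
    """Metrics + photographs -> 3D avatar -> garment superimposed on the avatar."""
    avatar = generate_avatar(metrics, front_photo, selfie_photo)
    garment = clothes_db[garment_id]     # pre-built 3D clothes, cf. database 120
    return superimpose(garment, avatar)  # result is rendered and rotatable in 360 degrees
```

The returned object stands in for the dressed avatar that the user rotates about its vertical axis.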
BRIEF DESCRIPTION OF DRAWINGS
The accompanying figures, which are incorporated herein, form part of the specification and illustrate embodiments of this invention. Together with the description, the figures further serve to explain the principles of this invention and to enable a person skilled in the relevant arts to make and use the invention.
Fig. 1 is a block diagram showing components of the virtual try-on system, in accordance with an exemplary embodiment of this invention.
Fig. 2 is a flow chart showing a method of providing a clothes database in accordance with an exemplary embodiment of this invention.
Fig. 3 is a flow chart showing a method of generating a 3D avatar in accordance with an exemplary embodiment of this invention.
Fig. 4 is a flow chart showing a method of visualizing 3D clothes on a 3D avatar in accordance with an exemplary embodiment of this invention.
Fig. 5 is an embodiment of clothes as displayed to the user, for example in a catalogue or website.
Fig. 6 shows an embodiment of an interface for inputting the front and selfie photographs by the user.
Fig. 7 shows rotation of the 3D avatar in accordance with an exemplary embodiment of this invention.
DETAILED DESCRIPTION OF INVENTION
Certain embodiments of this invention are directed to a virtual trying-on system and a computer implemented method.
Subject matter will now be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments. Subject matter may, however, be embodied in a variety of different forms and, therefore, covered or claimed subject matter is intended to be construed as not being limited to any exemplary embodiments set forth herein; exemplary embodiments are provided merely to be illustrative. Likewise, a reasonably broad scope for claimed or covered subject matter is intended. Among other things, for example, the subject matter may be embodied as methods, devices, components, or systems. The following detailed description is, therefore, not intended to be taken in a limiting sense.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration”. Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Likewise, the term “embodiments of the present invention” does not require that all embodiments of the invention include the discussed feature, advantage or mode of operation.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of embodiments of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes” and/or “including”, when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Embodiments of the invention may be implemented in one or a combination of hardware, firmware, and software. Embodiments of the invention may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by a computing platform to perform the operations described herein. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others.
Unless specifically stated otherwise, and as may be apparent from the following description and claims, it should be appreciated that throughout the specification descriptions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” or the like, refer to the action and/or processes of a computer or
computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
Embodiments of this invention may also include apparatuses and systems for performing the operations described herein. An apparatus or system may be specially constructed for the desired purposes, or it may comprise a general-purpose device selectively activated or reconfigured by a program stored in the device.
The following detailed description includes the best currently contemplated mode or modes of carrying out exemplary embodiments of the invention. The description is not to be taken in a limiting sense but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention will be best defined by the allowed claims of any resulting patent.
Referring to Fig. 1, which shows an embodiment of this invention, one or more user access devices 140 and a system 110 are connected through a communication network 150. Further shown in Fig. 1 are the databases 120 and 130 connected to the system 110. The database 120 comprises 3D clothes, and database 130 comprises user profiles. The user profiles comprise the 3D avatars of the users.
In one case, the user access device 140 is a computing system having a display and an input means. For example, the user device can be a wireless handheld device such as a
mobile phone, smart phone, PDA or the like. The user device can also be a desktop computing device, such as a laptop or other personal computer or the like.
The communication network 150 can be a wired connection or a wireless connection. The communications can be achieved via one or more networks, such as, but not limited to, one or more of WiMax, a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Personal Area Network (PAN), a Campus Area Network (CAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Bluetooth, Wi-Fi, 2G, 2.5G, 3G, 4G, IMT-Advanced, pre-4G, 3G LTE, 3GPP LTE, LTE Advanced, mobile WiMax, WiMax 2, and Wireless MAN-Advanced networks. In addition, communications to and from the user access devices 140 can be achieved over an open network, such as the Internet, or a private network, such as an intranet and/or an extranet. In one exemplary embodiment, communications can be secured by a protocol such as Secure Sockets Layer (SSL) or Transport Layer Security (TLS).
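As a concrete illustration of the SSL/TLS option, a client can wrap its network socket using Python's standard `ssl` module. This sketch is not part of the patent's disclosure, and the host and port arguments are placeholders.

```python
import socket
import ssl

def make_client_context() -> ssl.SSLContext:
    # create_default_context() enables certificate verification and
    # host-name checking, and selects reasonable protocol defaults.
    return ssl.create_default_context()

def open_tls_connection(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS-secured connection to the given host (placeholder usage)."""
    raw = socket.create_connection((host, port), timeout=10)
    return make_client_context().wrap_socket(raw, server_hostname=host)
```

Using the default context rather than a hand-built one keeps certificate verification on, which is the property that makes the channel between the user access device 140 and the system 110 trustworthy.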
The system 110 is a computing system comprising a memory and a processor. The processor may include one or more processor cores to execute the instructions of the system 110. In various embodiments, the processor may include any of various commercially available processors, including but not limited to an AMD® Athlon®, Duron® or Opteron® processor; an ARM® application, embedded or secure processor; an IBM® and/or Motorola® DragonBall® or PowerPC® processor; an IBM and/or Sony Cell® processor; or an Intel® Celeron®, Core (2) Duo®, Core i3®, Core i5®, Core i7®, Atom®, Itanium®, Pentium®, Xeon® or XScale® processor.
The memory includes a machine-readable medium on which is stored one or more sets of data structures and instructions (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions may also reside, completely or at least partially, within the main memory, static memory, and/or within the processor during execution thereof by the computer system, with the main memory, static memory, and the processor also constituting machine-readable media.
While the memory is illustrated in an exemplary embodiment to be a single medium, the term “memory” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine- readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including but not limited to, by way of example,
semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
Now referring to Fig. 2, which illustrates a method 200 of generating and storing 3D clothes. The clothes are generated in 3D using any of the commercially known algorithms. Various parameters, such as size and fitting of the clothes, are annexed to the 3D clothes. The 3D clothes are stored in the clothes database 120, which is connected to the system 110. It is to be understood that although the database 120 is shown as connected to the system 110, the database 120 can be a part of the system 110 itself or externally connected to it.
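As a rough illustration of method 200, the sketch below models one garment record with its annexed parameters and an in-memory stand-in for the clothes database 120. All field names (`size`, `fitting`, `mesh_uri`) are hypothetical, since the patent does not specify a schema.

```python
from dataclasses import dataclass

@dataclass
class Clothes3D:
    """One 3D garment with the parameters 'annexed' to it per method 200."""
    item_id: str
    size: str
    fitting: str
    mesh_uri: str  # illustrative location of the generated 3D mesh

# Minimal stand-in for the clothes database 120.
clothes_db = {}

def store_clothes(item: Clothes3D) -> None:
    clothes_db[item.item_id] = item

def retrieve_clothes(item_id: str) -> Clothes3D:
    return clothes_db[item_id]

store_clothes(Clothes3D("shirt-001", "M", "slim", "meshes/shirt-001.glb"))
print(retrieve_clothes("shirt-001").size)  # prints M
```

The same lookup is what step 430 of method 400 would perform when a listing is selected, since each listing is linked to its 3D clothes.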
Referring to Fig. 3, shown is a method 300 for generating a 3D avatar of the user. At step 310, the user logs into the user access device 140. For example, the user may enter his credentials, including a username and password. On logging into the system 110, the user can be provided with an interface on the display of the user access device 140. The interface may allow the user to input one or more parameters as requested by the system 110. At step 320, the system 110 requests the gender of the user. The form may include a drop-down list of possible answers, such as male, female and the like. The user may select his gender from the drop-down list. At step 330, the system 110 receives the height of the user. For example, the user may select "cm" as the unit from a drop-down list and input "167" as the height. At step 340, the user inputs his weight along with the corresponding unit of weight. For example, the user may select "pounds" from a drop-
down list and input "70" as his weight. At step 350, the user can upload his front-pose photograph, i.e., a photograph of the front side of the user including his body and face. Alternatively, the user can be provided with an option to capture the photograph using a camera of the user access device 140. At step 360, the user uploads another photograph, namely a selfie. Fig. 6 illustrates an embodiment of the interface presented by the system 110 to the user for requesting the front and selfie photographs. The first pose in the figure is a front pose 610 and the second pose is a left selfie 620.
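Because steps 330 and 340 let the user pick a unit for each metric, the inputs would presumably be normalized before avatar generation. A minimal sketch of that normalization follows; the function name is illustrative, and the conversion factor is the standard pound-to-kilogram ratio.

```python
# Standard conversion: one international avoirdupois pound in kilograms.
LB_TO_KG = 0.45359237

def normalize_weight(value: float, unit: str) -> float:
    """Return the weight in kilograms, whatever unit the drop-down supplied."""
    if unit == "pounds":
        return value * LB_TO_KG
    if unit == "kg":
        return value
    raise ValueError(f"unsupported weight unit: {unit}")

# The example from step 340: the user selects "pounds" and inputs 70.
print(round(normalize_weight(70, "pounds"), 2))  # prints 31.75
```

Height could be handled the same way for "cm" versus inches before the metrics are passed to the rendering step.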
On successful uploading of the photographs by the user, the empty outlines illustrating a body are filled with the photographic details of the user. Referring back to Fig. 3, at step 370, the system 110 renders the details obtained above from the user to generate a 3D avatar of the user. The 3D avatar is a simulated 3D model of the user, rendered using augmented reality and computer vision, and having a face similar to the face of the user. The body of the 3D avatar resembles the body metrics of the user. At step 380, the 3D avatar is stored in the user profile database 130 and can be retrieved later by the system 110. Alternatively, the 3D avatar is stored only in the user access device 140. For example, the 3D avatar can be stored in the cache memory of the user access device 140. Storing the 3D avatar in the user access device 140 may be required when the user does not want to share or save his details on an external server.
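The inputs to step 370 and the storage choice of step 380 can be sketched as follows. The rendering itself is stubbed out, since the patent does not disclose the computer-vision algorithm; all class and field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class UserMetrics:
    """User-metrics collected in steps 320-340 (units assumed normalized)."""
    gender: str
    height_cm: float
    weight_kg: float

@dataclass
class Avatar3D:
    user_id: str
    metrics: UserMetrics
    front_photo: bytes
    selfie_photo: bytes

def generate_avatar(user_id: str, metrics: UserMetrics,
                    front_photo: bytes, selfie_photo: bytes) -> Avatar3D:
    # A real implementation would apply computer vision and AR rendering here;
    # this stub only bundles the inputs the avatar is said to be based on.
    return Avatar3D(user_id, metrics, front_photo, selfie_photo)

# Step 380: store server-side, or only on the device when the user does not
# want his details saved on an external server.
profiles_db = {}    # stand-in for the user profiles database 130
device_cache = {}   # stand-in for the cache memory of user access device 140

avatar = generate_avatar("u1", UserMetrics("male", 167.0, 31.75),
                         b"front.jpg", b"selfie.jpg")
store_on_server = False  # this user declined server-side storage
(profiles_db if store_on_server else device_cache)["u1"] = avatar
```

With `store_on_server` false, the avatar lands only in the device cache, mirroring the privacy-preserving alternative described above.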
Now referring to Fig. 4, shown is a method 400 for visualizing clothes superimposed on the 3D avatar. The users, by visualizing the clothes on their 3D avatar, can predict how the corresponding clothes will fit them or how they will look wearing the corresponding clothes. At step 410, the user may log into the user access device 140 as discussed for
step 310. It is to be understood that a user who logged into the system 110 at step 310 need not log in again at step 410 unless the user has logged out of the system 110. At step 420, the user may browse through a catalog of the seller. For example, the seller may list the clothes on their website and the user can browse the clothes by accessing the website. Photographs of the clothes are listed with information such as size, color, fitting, etc. Fig. 5 illustrates one such listing, which is a photograph of the front side of clothes. Each of the listings of clothes is linked to corresponding 3D clothes.
On selection of clothes for visualization by the user at step 420, the system 110 retrieves the 3D clothes linked to the selected clothes at step 430. Thereafter, at step 440, the system 110 retrieves the 3D avatar of the user from the user profiles database 130 or the cache memory of the user device 140, as the case may be. At step 450, the system 110, using augmented reality, superimposes the 3D clothes over the 3D avatar, such that the 3D avatar appears to be wearing the clothes. At step 460, the 3D avatar is displayed to the user on their user access device 140. The user is also provided with suitable controls for rotation of the avatar at step 470. Step 470 is further illustrated in Fig. 7, which shows the 3D avatar. The user, through the controls, can rotate the 3D avatar along its vertical axis through 360 degrees. First, the user is presented with the front side of the avatar, as shown in Fig. 7A. The user thereafter rotates the avatar clockwise by 90 degrees, as shown in Fig. 7B. The user further rotates the avatar by another 90 degrees along its vertical axis (the height of the avatar), as shown in Fig. 7C. Similarly, the user can rotate the avatar through a full 360 degrees.
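The 360-degree rotation control of step 470 reduces to rotating the avatar's geometry about its vertical axis. A minimal sketch of that rotation for a single 3D point follows; this is standard rotation-matrix arithmetic, not the patent's actual renderer.

```python
import math

def rotate_y(point, degrees):
    """Rotate a 3D point about the vertical (y) axis -- the avatar's height axis."""
    theta = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(theta) + z * math.sin(theta),
            y,
            -x * math.sin(theta) + z * math.cos(theta))

# Four successive 90-degree turns (Figs. 7A -> 7B -> 7C -> ...) bring any
# point on the avatar back to where it started: a full 360-degree rotation.
p = (0.0, 1.7, 0.5)
q = p
for _ in range(4):
    q = rotate_y(q, 90)
assert all(math.isclose(a, b, abs_tol=1e-9) for a, b in zip(p, q))
print("full rotation returns to start")
```

Applying the same transform to every vertex of the superimposed avatar-plus-clothes mesh yields the rotated views of Figs. 7A-7C.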
While the foregoing written description of the invention enables one of ordinary skill to make and use what is considered presently to be the best mode thereof, those of ordinary
skill will understand and appreciate the existence of variations, combinations, and equivalents of the specific embodiment, method, and examples herein. The invention should therefore not be limited by the above-described embodiment, method, and examples, but by all embodiments and methods within the scope and spirit of the invention as claimed.
Claims
1. A computer implemented method for virtually trying-on clothes, the method comprising:
a. generating one or more 3D clothes;
b. receiving, from a user, user-metrics, the user-metrics comprising height of the user and weight of the user;
c. receiving, from the user, a first photograph, the first photograph depicting a front side of the user;
d. receiving, from the user, a second photograph, the second photograph depicting a selfie of the user;
e. generating a 3D avatar based on the user-metrics, the first photograph and the second photograph; and
f. applying the one or more 3D clothes to the 3D avatar.
2. The computer implemented method of claim 1, wherein the user-metrics further comprise gender of the user.
3. The computer implemented method of claim 1 further comprises displaying the 3D avatar to the user after step f.
4. The computer implemented method of claim 1 further comprises allowing the user to rotate the 3D avatar in 360 degrees after step f.
5. The computer implemented method of claim 1, wherein the front side comprises body and face.
6. A virtual trying-on system for clothes, the virtual trying-on system comprising: a system (110) in electronic communication (150) with one or more user access devices (140), the system (110) comprising: one or more processors; and a non-transitory, computer-readable medium comprising a set of instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
a) retrieving one or more 3D clothes from a clothes database (120);
b) receiving, from a user, user-metrics, the user-metrics comprising height of the user and weight of the user;
c) receiving, from the user, a first photograph, the first photograph depicting a front side of the user;
d) receiving, from the user, a second photograph, the second photograph depicting a selfie of the user;
e) generating a 3D avatar based on the user-metrics, the first photograph and the second photograph; and
f) applying the one or more 3D clothes to the 3D avatar.
7. The virtual trying-on system of claim 6, wherein the operations further comprise displaying the 3D avatar to the user after step f.
8. The virtual trying-on system of claim 6, wherein the operations further comprise allowing the user to rotate the 3D avatar in 360 degrees after step f.
9. The virtual trying-on system of claim 6, wherein the user access device is a desktop computer.
10. The virtual trying-on system of claim 6, wherein the user access device is a smartphone.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IN2020/051050 WO2022137245A1 (en) | 2020-12-24 | 2020-12-24 | System and method for virtually trying-on clothes |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022137245A1 (en) | 2022-06-30 |
Family
ID=82159149
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IN2020/051050 (Ceased) | System and method for virtually trying-on clothes | 2020-12-24 | 2020-12-24 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022137245A1 (en) |
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10664903B1 (en) * | 2017-04-27 | 2020-05-26 | Amazon Technologies, Inc. | Assessing clothing style and fit using 3D models of customers |
Non-Patent Citations (1)
| Title |
|---|
| TEXEL: "Texel - Virtual try-on in store", YOUTUBE, XP055952770, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=QY4egXwOJyg> * |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024163574A1 (en) * | 2023-02-02 | 2024-08-08 | Snap Inc. | Augmented reality try-on experience for a friend or another user |
| US12340453B2 (en) | 2023-02-02 | 2025-06-24 | Snap Inc. | Augmented reality try-on experience for friend |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20966769; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20966769; Country of ref document: EP; Kind code of ref document: A1 |