CN106803243A - A garment try-on method based on virtual reality technology - Google Patents
A garment try-on method based on virtual reality technology
- Publication number
- CN106803243A CN106803243A CN201510812278.3A CN201510812278A CN106803243A CN 106803243 A CN106803243 A CN 106803243A CN 201510812278 A CN201510812278 A CN 201510812278A CN 106803243 A CN106803243 A CN 106803243A
- Authority
- CN
- China
- Prior art keywords
- rendering
- clothing
- size
- human
- human body
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three-dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/30—Polynomial surface description
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Algebra (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Mathematical Physics (AREA)
- Pure & Applied Mathematics (AREA)
- Image Processing (AREA)
Abstract
The invention provides a garment try-on method based on virtual reality technology. The method includes: obtaining a 3D image of the human body's dimensions; obtaining a 3D image of the garment's dimensions; and matching the 3D image of the body dimensions with the 3D image of the garment dimensions, to obtain a virtual rendering of the body trying on the garment. Compared with the prior art, the method provided by the invention obtains 3D images of the body and garment dimensions and then matches them to produce the virtual try-on effect. In addition, in garment customization, it avoids ill-fitting garments caused by inaccurate measurements. When the body-dimension 3D image is captured, the outer garments the person is wearing can be removed from the model, yielding a 3D image of the body in swimwear, nude, or wearing thin close-fitting clothing, which ensures a better fit during matching.
Description
Technical field
The present invention relates to the technical field of virtual reality, and in particular to a garment try-on method based on virtual reality technology.
Background technology
Online shopping has developed rapidly in recent years and, being convenient and economical, has been widely adopted. When buying clothes online, however, people find it hard to judge whether a garment will fit, and often still have to visit a physical store to try clothes on; trying on many garments one by one is troublesome and time-consuming. In addition, when garments are custom made, a designer must take many measurements, which also wastes time and effort.
Content of the invention
The embodiments of the present invention provide a garment try-on method based on virtual reality technology, to solve the technical problem that trying on clothes in the prior art is complicated and inconvenient.
To solve the above problem, an embodiment of the invention provides a garment try-on method based on virtual reality technology, the method comprising:
obtaining a 3D image of the human body's dimensions;
obtaining a 3D image of the garment's dimensions;
matching the 3D image of the body dimensions with the 3D image of the garment dimensions, to obtain a virtual rendering of the body trying on the garment.
According to a preferred embodiment of the invention, the step of obtaining the 3D image of the body dimensions specifically includes:
a non-contact measuring device captures multiple groups of pictures of the body from different angles using cameras;
the 3D image of the body dimensions is obtained from the pictures taken at the different angles.
According to a preferred embodiment of the invention, after the 3D image of the body dimensions is obtained from the multi-angle pictures, the method further includes:
obtaining the thickness of the garments the body is wearing;
removing that thickness from the 3D image of the body dimensions to obtain a second 3D image of the body dimensions, and matching this second 3D image with the 3D image of the garment dimensions.
According to a preferred embodiment of the invention, after the 3D image of the body dimensions is obtained from the multi-angle pictures, the method further includes:
obtaining the thickness of the garments the body is wearing;
obtaining the feature points of the garment's outer surface;
obtaining the feature points of the garment's inner surface from the thickness and the outer-surface feature points;
connecting the inner-surface feature points together to obtain a third 3D image of the body dimensions, and matching this third 3D image with the 3D image of the garment dimensions.
According to a preferred embodiment of the invention, the step of obtaining the 3D image of the body dimensions from the multi-angle pictures specifically includes:
determining the body to be imaged, and establishing an imaging region according to that body;
selecting feature points in the imaging region according to the multi-angle pictures;
generating an image pyramid from the feature points;
parsing the image pyramid and obtaining the image-space coordinates, object-space coordinates and exterior orientation elements of the feature points, to form a digital elevation model;
converting the digital elevation model obtained by parsing into a Document Object Model.
According to a preferred embodiment of the invention, the step of obtaining the 3D image of the garment dimensions specifically includes:
a non-contact measuring device captures multiple groups of pictures of the garment from different angles using cameras;
the 3D image of the garment dimensions is obtained from the pictures taken at the different angles.
According to a preferred embodiment of the invention, when the 3D image of the garment dimensions is obtained from the multi-angle pictures, an elasticity coefficient a of the garment is set according to the garment material, and during try-on the measured 3D image of the garment dimensions is adjusted according to this coefficient a.
According to a preferred embodiment of the invention, the step of obtaining the 3D image of the garment dimensions may instead specifically include: obtaining a design drawing of the garment, and obtaining the 3D image of the garment dimensions from the design drawing.
According to a preferred embodiment of the invention, the step of matching the second or third 3D image of the body dimensions with the 3D image of the garment dimensions is specifically: matching the second or third 3D image of the body dimensions with the 3D image of the garment dimensions limb part by limb part, in one-to-one correspondence.
According to a preferred embodiment of the invention, after that matching step the method further includes: simulating corresponding motion trajectories according to the laws of human motion, so that the second or third 3D image of the body dimensions and the 3D image of the garment dimensions are dynamically matched.
According to a preferred embodiment of the invention, the garments include clothes, hats, shoes, accessories and scarves.
Compared with the prior art, the garment try-on method based on virtual reality technology provided by the invention obtains 3D images of the body and garment dimensions and then matches the 3D image of the body dimensions with the 3D image of the garment dimensions, to obtain a virtual rendering of the body trying on the garment. In addition, in garment customization, it avoids ill-fitting garments caused by inaccurate measurements.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and those of ordinary skill in the art can obtain other drawings from them without creative work.
Fig. 1 is a flow chart of a preferred embodiment of the garment try-on method based on virtual reality technology of the present invention;
Fig. 2 is a flow chart of obtaining the 3D image of the body dimensions;
Fig. 3 is a schematic flow chart of the specific sub-steps of obtaining the 3D image of the body dimensions in Fig. 2; and
Fig. 4 is a schematic diagram of lens imaging.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It is particularly pointed out that the following embodiments are merely illustrative of the invention and do not limit its scope. Likewise, the following embodiments are only some, not all, of the embodiments of the invention, and all other embodiments obtained by those of ordinary skill in the art without creative work fall within the scope of protection of the invention.
Referring to Fig. 1, a flow chart of a preferred embodiment of the garment try-on method based on virtual reality technology of the present invention, the method includes but is not limited to the following steps.
Step S100: the 3D image of the body dimensions is obtained.
In step S100, referring also to Fig. 2, a flow chart of obtaining the 3D image of the body dimensions, the step further includes:
Step S110: the non-contact measuring device captures multiple groups of pictures of the body from different angles using cameras.
In step S110, the non-contact measuring device may be a mobile terminal device with a camera function, such as a mobile phone, tablet computer, camera or laptop computer. Taking a mobile phone as an example, two identical cameras may be arranged on the front, back or side of the phone, preferably perpendicular or parallel to the axis of the phone so as to make better use of the display space. Of course, the camera locations are not limited to this; cameras arranged at other positions can equally meet the requirements of stereo imaging, and the possible locations will not be enumerated here.
Further, the distance between the cameras is preferably 5 mm to 50 mm and may be adjustable, and the camera angle may be adjusted by means of a rotating mount. Of course, the measurement purpose can also be achieved with a fixed camera distance and no rotating mount. The technical features of the camera mounting structure are within the understanding of those skilled in the art and are not described here. Preferably, the non-contact measuring device includes multiple cameras located on its side; when the cameras are on the side of the device, the number of data acquisitions can be reduced and the imaging effect is more lifelike.
In addition, the non-contact measuring device may also have only one camera. Compared with a two-camera structure, a single-camera device needs more shots: a two-camera device typically needs only 1-3 shots during imaging, while a single-camera device may need 6-8 shots, but it can equally achieve stereo imaging.
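The dual-camera arrangement described above makes depth recoverable from the offset of a feature point between the two views. The patent does not give a formula for this, but a minimal sketch under the standard parallel-stereo assumption might look like:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth of a point seen by two parallel cameras.

    Standard stereo relation Z = f * B / d. The two fixed cameras with a
    5-50 mm baseline make this computation possible; the formula itself
    is an assumption, not quoted from the patent text.
    """
    if disparity_px <= 0:
        raise ValueError("point must appear shifted between the two views")
    return focal_px * baseline_mm / disparity_px

# A feature point shifted 20 px between the two views, with a focal
# length of 1000 px and cameras 50 mm apart, lies about 2500 mm away.
z = depth_from_disparity(1000.0, 50.0, 20.0)
```

A wider baseline or a longer focal length gives finer depth resolution, which is one reason the camera spacing is made adjustable.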
Step S120: the 3D image of the body dimensions is obtained from the multiple groups of pictures taken at different angles. Referring further to Fig. 3, a schematic flow chart of the specific sub-steps of obtaining the 3D image of the body dimensions in Fig. 2, step S120 specifically includes the following steps.
Step S1201: the body to be imaged is determined, and an imaging region is established according to that body.
In step S1201, the body to be imaged may be determined as follows: the non-contact measuring device recognizes a reference marker and then locks onto the body wearing the marker as the body to be imaged.
The body to be imaged may also be determined in other ways: the non-contact measuring device may lock onto the body through feature points on it, or an operator may select the body to be imaged on the device.
The reference marker may be a measuring rod, or a smart wearable device such as a smart watch or smart bracelet. In the embodiments of the invention a smart bracelet is taken as an example; other smart wearable devices are similar.
The process by which the smart bracelet is used to single out the measurement target is as follows. The size of the bracelet is fixed during manufacture, and its size data can be entered into the software of the non-contact measuring device. The bracelet then serves as a specific marker: for example, if only one person in a group wears this model of bracelet, that person can be located directly; or, similar to Bluetooth pairing, if many people in a crowd wear such bracelets, the particular bracelet bound to the phone can be selected through a specific identification code, achieving the screening purpose. The smart bracelet is preferably connected to the non-contact measuring device wirelessly.
Of course, the body to be imaged may also be determined without a reference marker: the user may select the body to be imaged, which is then photographed by the non-contact measuring device to obtain the pictures.
Step S1202: feature points are selected in the imaging region according to the multiple groups of pictures taken at different angles.
Step S1203: the specific identification code of the reference marker is obtained, and the corresponding size of the reference marker is retrieved from a list according to that code.
The non-contact measuring device stores a list of reference-marker identification codes and their corresponding sizes; it obtains the identification code of the reference marker and then retrieves the corresponding size from the list. The size of the smart bracelet can be used to control the measurement error, or to establish a real-space dimension scale, so that the measurement can be achieved with only a small amount of data.
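Establishing a real-space dimension scale from the bracelet's known size can be sketched as follows; the 60 mm bracelet width and the pixel counts are hypothetical numbers, not taken from the patent:

```python
def scale_from_reference(known_width_mm: float, measured_width_px: float) -> float:
    """mm-per-pixel scale derived from a reference object of known size,
    e.g. the smart bracelet whose dimensions are stored in the device list."""
    return known_width_mm / measured_width_px

def pixels_to_mm(length_px: float, scale: float) -> float:
    """Convert any other measured pixel length at the same distance to mm."""
    return length_px * scale

# Hypothetical numbers: a bracelet known (from the stored list) to be
# 60 mm wide spans 300 px in the picture, so 1 px corresponds to 0.2 mm.
scale = scale_from_reference(60.0, 300.0)
shoulder_width_mm = pixels_to_mm(2100.0, scale)
```

This is the sense in which the bracelet lets a single scale factor anchor all other measurements in the same picture.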
Step S1204: an image pyramid is generated from the feature points. In this step, the size of the reference marker obtained in step S1203 is also taken into account. When the 3D image of the body dimensions is obtained, a three-dimensional model needs to be built from the three directions of front, left-rear and right-rear, established from the feature points in the same three-dimensional coordinate system. A 3D image of the body is obtained from the measurement results, and measurements can then be taken on this image, in a manner similar to measuring latitude and longitude on the earth, or the distance and height difference between mountain peaks.
Step S1205: the image pyramid is parsed, and the image-space coordinates, object-space coordinates and exterior orientation elements of the feature points are obtained to form a digital elevation model (DEM).
Because the two fixed cameras image on the same interface, imaging is fixed through common image intersection and common feature points. From the resulting data, the spatial positions of the measured object can be obtained by simple calculation, and because the two pictures are taken simultaneously from almost the same spatial position, the data are highly accurate. Before measuring, the accuracy of the dual cameras is calibrated and correction parameters for the data are obtained, e.g. (X, aY, bZ), i.e. the parameters (1, a, b). In production the two cameras cannot be exactly identical, and because their focal lengths differ slightly, two cameras imaging on the same interface carry a fixed error parameter relative to the assumption that both image identically. The image-space coordinates, object-space coordinates and exterior orientation elements satisfy the following collinearity equations:

x = -f · [a1(XA - XS) + b1(YA - YS) + c1(ZA - ZS)] / [a3(XA - XS) + b3(YA - YS) + c3(ZA - ZS)]
y = -f · [a2(XA - XS) + b2(YA - YS) + c2(ZA - ZS)] / [a3(XA - XS) + b3(YA - YS) + c3(ZA - ZS)]   (formula 1)

where (x, y) are the plane coordinates of the image point in the raw image;
f is the shooting focal length of the camera;
(XS, YS, ZS) are the object-space coordinates of the camera station during photography;
(XA, YA, ZA) are the object-space coordinates of the object point corresponding to the image point;
and ai, bi, ci (i = 1, 2, 3) are the elements of the rotation matrix determined by the angular exterior orientation elements.
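As a runnable illustration of formula 1, the following sketch projects an object-space point to image-plane coordinates, with the rotation-matrix elements a_i, b_i, c_i derived from assumed angular exterior orientation elements (phi, omega, kappa); the angle convention is an assumption, since the patent does not specify one:

```python
import math

def collinearity_project(f, Xs, Ys, Zs, phi, omega, kappa, Xa, Ya, Za):
    """Project an object-space point (Xa, Ya, Za) to image-plane (x, y)
    using the collinearity equations (formula 1). (Xs, Ys, Zs) is the
    camera station; (phi, omega, kappa) are angular exterior orientation
    elements under the common phi-omega-kappa convention (assumed)."""
    cp, sp = math.cos(phi), math.sin(phi)
    co, so = math.cos(omega), math.sin(omega)
    ck, sk = math.cos(kappa), math.sin(kappa)
    # rotation-matrix elements for the phi-omega-kappa angle order
    a1 = cp * ck - sp * so * sk
    a2 = -cp * sk - sp * so * ck
    a3 = -sp * co
    b1 = co * sk
    b2 = co * ck
    b3 = -so
    c1 = sp * ck + cp * so * sk
    c2 = -sp * sk + cp * so * ck
    c3 = cp * co
    dX, dY, dZ = Xa - Xs, Ya - Ys, Za - Zs
    denom = a3 * dX + b3 * dY + c3 * dZ
    x = -f * (a1 * dX + b1 * dY + c1 * dZ) / denom
    y = -f * (a2 * dX + b2 * dY + c2 * dZ) / denom
    return x, y

# Camera at the origin with no rotation: a point at (1, 2, -10) seen
# with focal length 5 lands at image coordinates (0.5, 1.0).
x, y = collinearity_project(5.0, 0, 0, 0, 0, 0, 0, 1.0, 2.0, -10.0)
```

Solving these equations for many feature points across the two views is what yields the digital elevation model of step S1205.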
Step S1206: the digital elevation model obtained by parsing is converted into a Document Object Model (DOM).
The Document Object Model is the standard programming interface recommended by the W3C for processing extensible markup language. It is a platform- and language-independent application programming interface (API) that can dynamically access programs and scripts and update their content, so that the digital elevation model can be easily edited and modified in software.
After step S1206, the method also includes stereo plotting and generation of a DLG (Digital Line Graphic), a vector data set of map elements essentially consistent with existing line drawings, preserving the spatial relationships between the elements and their related attribute information.
In digital mapping the most common product is the digital line graph, and the final result of field surveying and mapping is generally the DLG. Such a product describes surface features more completely, with better visual effect, consistent scale and richer colour. It can satisfy various spatial-analysis requirements: data can be freely selected and displayed, superimposed with other information, and used for spatial analysis and decision-making. Some terrain elements can be drawn as lines in a digital orthophoto topographic map to depict topographic features.
The above is the specific working principle of dual-camera imaging. Its differences from ordinary non-metric single-camera stereo plotting lie mainly in two points: first, the images do not need correction processing; second, the measurement results do not need image matching.
In addition, the non-contact measuring method also includes a verification step, as follows.
Step S1207: the focal length of the camera and the size of the reference marker are obtained.
Step S1208: the distance of the feature points is obtained according to the convex-lens imaging principle, and the actual size of the body to be imaged is thereby measured.
Referring to Fig. 4, a schematic diagram of lens imaging: the focal length of a camera is a known quantity. Depending on the camera, the focal length may be fixed or variable. A fixed focal length is a definite value, while a camera with a variable focal length has a focusing function; its focal length is fixed at a certain position while the picture is taken, so it is likewise a fixed known value. With a wearable device of known size, such as the smart bracelet, the similar triangles of the imaging process can be used: knowing the length of the object AB, the focal length Of, and the length ab of the image of AB, the object distance OF can be calculated. Once the object distance OF is known, the lengths and positions of other objects alongside AB can also be calculated. This is a method of acquiring data about the measured object and serves as a verification of the imaging process; it is one of the functions of wearable devices such as the smart bracelet.
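The similar-triangles verification can be sketched as below; the bracelet width, focal length and image sizes are illustrative numbers only, not values from the patent:

```python
def object_distance(focal_len_mm: float, object_size_mm: float,
                    image_size_mm: float) -> float:
    """Object distance OF from the similar triangles of lens imaging:
    AB / ab = OF / Of  =>  OF = Of * AB / ab."""
    return focal_len_mm * object_size_mm / image_size_mm

def object_size(focal_len_mm: float, distance_mm: float,
                image_size_mm: float) -> float:
    """Invert the same proportion to size another object at the
    now-known distance."""
    return image_size_mm * distance_mm / focal_len_mm

# A bracelet of known 60 mm width whose image measures 0.12 mm on the
# sensor, with a 4 mm focal length, sits 2000 mm from the lens; a body
# part imaged at 0.8 mm at that distance is then 400 mm wide.
d = object_distance(4.0, 60.0, 0.12)
w = object_size(4.0, d, 0.8)
```

The size obtained this way is what is compared against the DEM-derived size in the verification steps that follow.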
Step S1209: the actual size of the body to be imaged obtained above is compared with the size formed by the coordinate parameters of the body in the digital elevation model.
Step S1210: it is judged whether the comparison result is within a predetermined error range.
If the error between the actual size of the body and the size formed by the body's coordinate parameters in the digital elevation model is within the predetermined range, the method proceeds to the next step (step S1211). If the error is not within the predetermined range, the method returns to step S1201: the body to be imaged is determined again, the imaging region is re-established, and the stereo pictures of the body are re-shot.
Step S1211: the 3D image of the body dimensions is accurate, and the 3D image of the body dimensions is generated.
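The check-and-reshoot loop of steps S1209-S1211 might be sketched as follows; the 5 mm tolerance, the attempt limit and the `capture` callable are assumptions for illustration only:

```python
def verify_model(actual_mm: float, model_mm: float,
                 tolerance_mm: float = 5.0) -> bool:
    """Accept the DEM-derived size only if it agrees with the
    lens-imaging check within a predetermined error range (step S1210)."""
    return abs(actual_mm - model_mm) <= tolerance_mm

def build_verified_model(capture, max_attempts: int = 3,
                         tolerance_mm: float = 5.0) -> float:
    """Re-shoot and rebuild until the check passes, as in the loop back
    to step S1201. `capture` is a hypothetical callable returning a pair
    (actual_mm, model_mm) for one shooting attempt."""
    for _ in range(max_attempts):
        actual, model = capture()
        if verify_model(actual, model, tolerance_mm):
            return model
    raise RuntimeError("could not obtain an accurate 3D image")

# Simulated capture whose second attempt falls inside tolerance:
attempts = iter([(420.0, 431.0), (420.0, 423.0)])
result = build_verified_model(lambda: next(attempts))
```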
What the above process generates is a 3D image of the body dimensions with garments on. To match the body with garments more accurately, the garments worn on the body also need to be removed, yielding a 3D image of the body in swimwear, nude, or wearing thin close-fitting clothing.
There are two ways to obtain such an image. The first is to obtain the thickness of the garments the body is wearing, and then remove that thickness from the 3D image of the body dimensions to obtain a second 3D image of the body dimensions; this second 3D image is exactly the swimwear, nude, or thin close-fitting-clothing image of the body that this method is intended to obtain. The second 3D image of the body dimensions is then matched with the 3D image of the garment dimensions. The garment thickness mentioned here generally refers to thicker outer garments, such as coats, hats and trousers, and generally does not include underwear. Of course, if a woman is trying on underwear such as a brassiere, then all garments need to be removed from the clothed body-dimension 3D image. And if, when the data are captured, the body is already in swimwear, nude, or wearing thin close-fitting clothing, no processing is needed. In other cases the surplus garments are removed according to the thickness obtained during shooting; places such as cuffs and collars, where the separation between garment and body is clearly visible, can serve as references for the rest. For example, the gap at a cuff is generally twice the garment thickness elsewhere, so halving the thickness measured there gives the garment thickness over the rest of the body, which is then removed.
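The cuff-based thickness estimate and its removal reduce to simple arithmetic; a hedged sketch, with all widths hypothetical:

```python
def garment_thickness_at_cuff(cuff_gap_mm: float) -> float:
    """At a cuff the garment separates from the body on both sides, so
    the measured gap is roughly twice the single-layer garment thickness
    found elsewhere (as the description states)."""
    return cuff_gap_mm / 2.0

def strip_garment(outer_widths_mm, thickness_mm: float):
    """Shrink each measured width by one garment thickness per side to
    approximate the close-fitting 'second 3D image' of the body."""
    return [max(w - 2.0 * thickness_mm, 0.0) for w in outer_widths_mm]

# Hypothetical widths (mm) measured over a coat; an 8 mm cuff gap
# implies ~4 mm of fabric, removed from both sides of each width.
thickness = garment_thickness_at_cuff(8.0)
body_widths = strip_garment([460.0, 380.0, 420.0], thickness)
```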
The second way of obtaining the swimwear, nude, or thin close-fitting-clothing 3D image is as follows. First the thickness of the worn garments is obtained, in the same way as in the previous method, which is not repeated here. Then the feature points of the garment's outer surface are obtained, which is equivalent to obtaining a 3D image of the garment's outer contour. Next, the feature points of the garment's inner surface are obtained from the thickness and the outer-surface feature points (i.e. the outer-contour 3D image). This step is implemented by subtracting the garment thickness from the outer-contour 3D image, or by using the feature points of folds to obtain the actual thickness at those points (at a fold, the part of the garment close to the body clings to the body). Finally the inner-surface feature points are connected together to obtain a third 3D image of the body dimensions, and this third 3D image is matched with the 3D image of the garment dimensions. The third 3D image of the body dimensions is exactly the swimwear, nude, or thin close-fitting-clothing image of the body that this method is intended to obtain.
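One way to read the inner-surface construction is as an inward offset of each outer-surface feature point along its surface normal; a simplified sketch only (a real garment would also need the fold handling the description mentions):

```python
def inner_surface(outer_points, normals, thickness_mm: float):
    """Offset each outer-surface feature point inward along its unit
    surface normal by the garment thickness, approximating the inner
    surface whose connected points form the 'third 3D image'."""
    return [
        (x - thickness_mm * nx, y - thickness_mm * ny, z - thickness_mm * nz)
        for (x, y, z), (nx, ny, nz) in zip(outer_points, normals)
    ]

# Two points on the garment front with normals pointing straight out
# (+z) move 4 mm inward toward the body.
pts = inner_surface([(0.0, 0.0, 100.0), (10.0, 0.0, 100.0)],
                    [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)], 4.0)
```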
It should be pointed out that the garments described in the embodiments of the invention include clothes, hats, shoes, accessories, scarves, underwear and the like. The removal methods for accessories, hats, shoes and so on are similar and are not repeated here.
Step S200: the 3D image of the garment dimensions is obtained.
The specific sub-steps of step S200 are similar to those for obtaining the 3D image of the body dimensions: the cameras of the non-contact measuring device capture multiple groups of pictures of the garment from different angles, and the 3D image of the garment dimensions is obtained from those pictures.
When the 3D image of the garment dimensions is obtained, however, three cases are generally considered: standard-size garments, non-standard-size garments, and custom-made garments.
First, a standard-size garment is one generated strictly according to the garment's size code, i.e. its actual dimensions are known. But the specific dimensions of each part of the garment, or of a particular picture on it, are not known, so the garment is photographed with the non-contact measuring device, preferably a dual-camera structure (shooting directly with multiple cameras can obtain the dimensions, and can also check whether the standard size is accurate). The flat picture taken is synchronously scaled up or down to the standard size and matched against the actual dimensions of each structure or pattern combination of the garment. Finally the data are converted, from DOM into vector data such as DEM, and these data can be converted into various solid figures. The technical scheme described above is obtained by shooting the front of the garment. Because the garment is standard size its dimensions are fixed, but the details are unclear: how long the collar is, what its specific shape is, how large the patterns on the garment are, and so on. If the garment has several layers, it also needs to be spread open and photographed inside. During this process the phone or other device is held flat, i.e. parallel to the garment, and the photograph is taken from above it. The front, back and inner faces (for multi-layer garments with linings) are all photographed. The pictures taken are then placed over the outline of the standard garment size, and the positions of the feature points (the contour feature points when the garment is laid flat, i.e. the peripheral vertices of the garment outline) are synchronously stretched and scaled, while the patterns and interior structure of the garment are scaled with them; that is, a scale is established. In this process the garment thickness can be calculated with a multi-camera device; if the garment is thin, the thickness can be ignored and a single camera can also complete the task, simply requiring a few more shots.
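The synchronous scaling of the flat photograph to the standard size chart amounts to applying one scale factor to every contour vertex and pattern point; a sketch with invented numbers:

```python
def scale_to_standard(photo_points_px, photo_ref_px: float,
                      standard_ref_mm: float):
    """Scale a flattened-garment photo so that one known contour
    dimension (e.g. the distance between two peripheral vertices)
    matches the standard size chart; patterns and interior structure
    scale with it. Points are (x, y) pairs in pixels, returned in mm."""
    k = standard_ref_mm / photo_ref_px
    return [(x * k, y * k) for x, y in photo_points_px]

# A hem that measures 400 px in the photo is 600 mm on the size chart,
# so every point is scaled by 1.5.
outline_mm = scale_to_standard([(0.0, 0.0), (400.0, 0.0), (200.0, 100.0)],
                               400.0, 600.0)
```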
Second, non-standard-size garments. For these, a non-contact measuring device such as a mobile phone is used to obtain flat dimensional drawings of the garment, and the data are then converted into DEM data; the process is not described in detail here.
Third, custom-made garments, i.e. garments whose design the user sets. This applies to the self-customization and garment-design scenario: the user designs the several flat panels of the garment and assembles them into one, determining the size of each part according to the desired dimensions; once assembled into a complete garment, the data are then converted into DEM data.
This is in fact similar to a combined photograph of the front and back of the clothes we wear: once the lengths of the 'warp and weft' are known, the largest person the garment can fit is known, and the true three-dimensional presentation of the garment is then sketched out from the person's outline. In other words, a flexible material such as a garment finally takes its shape from whatever it is attached to. Indeed, some clothes are just a plane and do not form the closed configuration of normal clothes. In that case we can assume, according to the actual situation, how they would close, for example with buttons or a zipper, so that they combine into a closed structure. If, like a scarf, the garment really forms no closure and no closing assumption is appropriate, then one side or a diagonal (possibly a non-uniform diagonal) can be wound around the wrapped solid to form the three-dimensional presentation of the item, so that both closable and non-closable garments can be represented as 3D images.
In addition, in the step of obtaining the 3D rendering of the clothing size, the 3D rendering can also be obtained from a design drawing: first the design drawing of the clothing is obtained, and the 3D rendering of the clothing size is then derived from it. The design drawing may be produced with computer software or drawn by hand to give a concrete representation of the clothing, and so on; within the understanding of those skilled in the art, the possibilities are not enumerated here.
Considering the forces acting on clothing when worn, the maximum bearing space of each part of the clothing is first calculated and then transferred onto the human-body model to be fitted, and the worn effect of the clothing is obtained according to the principles of the various forces. The stackability of clothing should also be considered; for example, a suit jacket has shoulder pads and the like in addition to the normal garment structure. All of these factors are taken into account when the 3D rendering of the clothing size is generated, and the size of the clothing 3D rendering is adjusted accordingly. In this process, the garment is regarded as a complete garment composed of many tiny points, where each point has weight, each pair of points has a stretch index and a holding-power index (hardness, or degree of automatic bending), and the cloth also has a certain thickness.
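The point model described above resembles a mass-spring cloth model and can be sketched as simple data structures. This is a minimal illustration under assumed field names and values; none of the numbers come from the patent.

```python
from dataclasses import dataclass

# Minimal sketch of the point model: the garment is many small weighted
# points, each pair of linked points carrying a stretch index and a
# holding-power (support) index, and each point a thickness. All values
# below are illustrative assumptions.
@dataclass
class ClothPoint:
    mass: float        # each point has weight
    thickness: float   # the cloth has a certain thickness

@dataclass
class ClothLink:
    a: int             # index of the first point
    b: int             # index of the second point
    stretch: float     # stretch index between the two points
    support: float     # holding-power index (hardness / automatic bending)

points = [ClothPoint(mass=0.001, thickness=0.5),
          ClothPoint(mass=0.001, thickness=0.5)]
links = [ClothLink(a=0, b=1, stretch=1.2, support=0.8)]
```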
The following takes a garment as an example. The matching process is to fill the space between the front and back of the garment. The garment can be modelled as three cylinders, the body and the two sleeves, where the body cylinder is irregular. By regarding these three cylinders as closed, the volume V of each cylinder can be calculated; this is the case in which the elasticity of the clothing is not considered. If the elasticity coefficient is a (a is generally greater than 1), then the volume of the main part is a²V/2 (vertical elasticity is not considered in this process). The junction between the sleeves and the body is usually fairly large, and although this junction is fixed by the seams made during tailoring and cannot itself stretch, we may assume that this part also follows the same elasticity coefficient a. Of course, when the clothing is not in a tight-fitting state, the elasticity ratio a can be ignored; it is considered only for tight-fitting clothing, or when some part of the body is exceptionally large, and then only for the positions that are actually tight. In actual operation, gravity is considered, but vertical elasticity is not, and lateral holding power is not; belts and elastic bands are handled by a force function acting near them, whose functional equation can be obtained by testing.
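The cylinder example above can be sketched numerically. The dimensions and the elasticity coefficient below are illustrative assumptions; the a²V/2 adjustment is applied exactly as stated in the text, not as a verified physical formula.

```python
import math

# Sketch of the worked example: body and two sleeves treated as closed
# cylinders so their volumes can be computed, then the main-part volume is
# scaled by the elasticity coefficient a (a > 1) as the text describes.
# All dimensions (metres) are illustrative assumptions.
def cylinder_volume(radius, height):
    return math.pi * radius ** 2 * height

body = cylinder_volume(radius=0.18, height=0.65)
sleeve = cylinder_volume(radius=0.06, height=0.55)
total_unstretched = body + 2 * sleeve   # the three-cylinder volume, no elasticity

a = 1.3  # assumed elasticity coefficient of the fabric (text: generally a > 1)
stretched_body = a ** 2 * body / 2      # the a^2 * V / 2 adjustment from the text
```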
For clothing that cannot form a complete closed loop, such as scarves and hats, the desired worn shape can be drawn on the computer with a mouse and then assigned to the non-closed scarf; alternatively, several wearing styles of the scarf or the like can be provided for selection.
Step S300: matching the 3D rendering of the human dimensions with the 3D rendering of the clothing size, to obtain the Virtual Realization effect of the human body trying on the clothing.
During the 3D rendering of the 3D rendering of human dimension and clothing size is matched,
The human body bikini or nude or dress that will usually be obtained by above two mode compare
The size 3D rendering of thin close-fitting clothing is matched with the 3D rendering of clothing size, can during matching
To be whole matching, i.e. the 3D rendering of the 3D rendering of the human dimension of complete structure and clothing size
Matched;Can also be by the second of human dimension or the 3rd 3D rendering and clothing size 3D
Image carries out Corresponding matching one by one according to each limb part.Order can for from top to bottom, from rear
Forward or from totally-enclosed face to sealing face etc..Phase can also be simulated according to the characteristics of motion of human body simultaneously
The movement locus answered so that the second of human dimension or the 3rd the 3D of 3D rendering and clothing size scheme
As Dynamic Matching.Bulk can be broken down into similar to the armor, then module is matched one by one.
The clothing try-on method based on Virtual Realization technology provided by the embodiments of the present invention obtains the 3D renderings of the human body and of the clothing size, then matches the 3D rendering of the human dimensions against the 3D rendering of the clothing size, to obtain the Virtual Realization effect of the human body trying on the clothing. In addition, when customizing clothing, it avoids ill-fitting garments caused by inaccurate size measurement. When the 3D rendering of the human dimensions is obtained, removing the outer garments worn by the human body yields the size 3D rendering of the body in a bikini, nude, or wearing relatively thin close-fitting clothing, which ensures that the sizes are more suitable during matching.
The foregoing is only one embodiment of the present invention and does not thereby limit the scope of protection of the invention. Any equivalent device or equivalent process transformation made using the description and drawings of the present invention, or any direct or indirect use in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (11)
1. A clothing try-on method based on Virtual Realization technology, characterized in that the method comprises:
obtaining a 3D rendering of the human dimensions;
obtaining a 3D rendering of the clothing size;
matching the 3D rendering of the human dimensions with the 3D rendering of the clothing size, to obtain the Virtual Realization effect of the human body trying on the clothing.
2. The method according to claim 1, characterized in that the step of obtaining the 3D rendering of the human dimensions specifically comprises:
a non-contact measuring device using a camera to obtain multiple groups of pictures of the human body from different angles;
obtaining the 3D rendering of the human dimensions according to the multiple groups of pictures from different angles.
3. The method according to claim 2, characterized in that after the 3D rendering of the human dimensions is obtained according to the multiple groups of pictures from different angles, the method further comprises:
obtaining the thickness of the clothing worn by the human body;
removing the obtained thickness of the worn clothing from the 3D rendering of the human dimensions to obtain a second 3D rendering of the human dimensions, and matching the second 3D rendering of the human dimensions with the 3D rendering of the clothing size.
4. The method according to claim 2, characterized in that after the 3D rendering of the human dimensions is obtained according to the multiple groups of pictures from different angles, the method further comprises:
obtaining the thickness of the clothing worn by the human body;
obtaining feature points of the outer surface of the clothing;
obtaining feature points of the inner surface of the clothing according to the thickness and the feature points of the outer surface;
connecting the feature points of the inner surface of the clothing to obtain a third 3D rendering of the human dimensions, and matching the third 3D rendering of the human dimensions with the 3D rendering of the clothing size.
5. The method according to claim 2, characterized in that the step of obtaining the 3D rendering of the human dimensions according to the multiple groups of pictures from different angles specifically comprises:
determining the human body to be imaged, and establishing an imaging region according to the human body to be imaged;
selecting feature points in the imaging region according to the multiple groups of pictures from different angles;
generating an image pyramid according to the feature points;
parsing the image pyramid and obtaining the image-space coordinates, object coordinates and exterior orientation elements of the feature points, to form a digital elevation model;
converting the parsed digital elevation model into a digital orthophoto map (DOM).
6. The method according to claim 1, characterized in that the step of obtaining the 3D rendering of the clothing size specifically comprises:
a non-contact measuring device using a camera to obtain multiple groups of pictures of the clothing from different angles;
obtaining the 3D rendering of the clothing size according to the multiple groups of pictures from different angles.
7. The method according to claim 6, characterized in that in the process of obtaining the 3D rendering of the clothing size according to the multiple groups of pictures from different angles, the elasticity coefficient a of the clothing is set according to the clothing material, and the measured 3D rendering of the clothing size is then adjusted according to the elasticity coefficient a.
8. The method according to claim 1, characterized in that the step of obtaining the 3D rendering of the clothing size specifically comprises:
obtaining a design drawing of the clothing;
obtaining the 3D rendering of the clothing size according to the design drawing.
9. The method according to claim 3 or 4, characterized in that the step of matching the second or third 3D rendering of the human dimensions with the 3D rendering of the clothing size specifically comprises: matching the second or third 3D rendering of the human dimensions with the 3D rendering of the clothing size correspondingly, part by part, according to each limb part.
10. The method according to claim 9, characterized in that after the step of matching the second or third 3D rendering of the human dimensions with the 3D rendering of the clothing size part by part according to each limb part, the method further comprises: simulating corresponding movement trajectories according to the laws of human motion, so that the second or third 3D rendering of the human dimensions and the 3D rendering of the clothing size are matched dynamically.
11. The method according to claim 1, characterized in that the clothing includes clothes, hats, shoes, accessories and scarves.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510812278.3A CN106803243A (en) | 2015-11-21 | 2015-11-21 | A kind of clothing based on Virtual Realization technology tries method on |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510812278.3A CN106803243A (en) | 2015-11-21 | 2015-11-21 | A kind of clothing based on Virtual Realization technology tries method on |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN106803243A true CN106803243A (en) | 2017-06-06 |
Family
ID=58975760
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201510812278.3A Pending CN106803243A (en) | 2015-11-21 | 2015-11-21 | A kind of clothing based on Virtual Realization technology tries method on |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN106803243A (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111080800A (en) * | 2019-12-18 | 2020-04-28 | 郭艺斌 | A virtual wearing method and virtual wearing system based on human-computer interaction |
| CN113240481A (en) * | 2021-02-09 | 2021-08-10 | 飞诺门阵(北京)科技有限公司 | Model processing method and device, electronic equipment and readable storage medium |
Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020024517A1 (en) * | 2000-07-14 | 2002-02-28 | Komatsu Ltd. | Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space |
| JP2006331367A (en) * | 2005-05-27 | 2006-12-07 | Jun Shiozawa | Trial fitting system |
| CN101582143A (en) * | 2008-05-16 | 2009-11-18 | 杨政宪 | Terminal fitting simulation system and method for generating fitting image |
| CN101819663A (en) * | 2009-08-27 | 2010-09-01 | 珠海琳琅信息科技有限公司 | System for virtually trying on clothes |
| CN102298797A (en) * | 2011-08-31 | 2011-12-28 | 深圳市美丽同盟科技有限公司 | Three-dimensional virtual fitting method, device and system |
| CN102509354A (en) * | 2011-11-10 | 2012-06-20 | 武汉大学 | Manufacturing method for projection digital elevation model capable of changing together with image |
| CN102902868A (en) * | 2011-07-26 | 2013-01-30 | 赵朔琼 | Three-dimensional simulated clothes fitting system applied to network |
| CN102982581A (en) * | 2011-09-05 | 2013-03-20 | 北京三星通信技术研究有限公司 | Virtual try-on system and method based on images |
| CN102982246A (en) * | 2012-12-06 | 2013-03-20 | 江苏乐买到网络科技有限公司 | Electronic fitting system in internet shopping |
| CN102981603A (en) * | 2011-06-01 | 2013-03-20 | 索尼公司 | Image processing apparatus, image processing method, and program |
| CN103310342A (en) * | 2012-03-15 | 2013-09-18 | 凹凸电子(武汉)有限公司 | Electronic fitting method and electronic fitting device |
| CN103514350A (en) * | 2012-06-27 | 2014-01-15 | 富泰华工业(深圳)有限公司 | Electronic device with virtual fit function and virtual fit method |
| EP2686835A1 (en) * | 2011-02-17 | 2014-01-22 | Metail Limited | Computer implemented methods and systems for generating virtual body models for garment fit visualisation |
| CN103750817A (en) * | 2012-10-20 | 2014-04-30 | 因美吉智能科技(济南)有限公司 | Non-contact pediatric measuring method and measuring device |
| KR20140077820A (en) * | 2012-12-14 | 2014-06-24 | 한국전자통신연구원 | Method for fiiting virtual items using human body model and system for providing fitting service of virtual items |
| CN104182886A (en) * | 2013-05-24 | 2014-12-03 | 比亚迪股份有限公司 | Virtual fitting method and mobile terminal for virtual fitting |
Patent Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020024517A1 (en) * | 2000-07-14 | 2002-02-28 | Komatsu Ltd. | Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space |
| JP2006331367A (en) * | 2005-05-27 | 2006-12-07 | Jun Shiozawa | Trial fitting system |
| CN101582143A (en) * | 2008-05-16 | 2009-11-18 | 杨政宪 | Terminal fitting simulation system and method for generating fitting image |
| CN101819663A (en) * | 2009-08-27 | 2010-09-01 | 珠海琳琅信息科技有限公司 | System for virtually trying on clothes |
| EP2686835A1 (en) * | 2011-02-17 | 2014-01-22 | Metail Limited | Computer implemented methods and systems for generating virtual body models for garment fit visualisation |
| CN102981603A (en) * | 2011-06-01 | 2013-03-20 | 索尼公司 | Image processing apparatus, image processing method, and program |
| CN102902868A (en) * | 2011-07-26 | 2013-01-30 | 赵朔琼 | Three-dimensional simulated clothes fitting system applied to network |
| CN102298797A (en) * | 2011-08-31 | 2011-12-28 | 深圳市美丽同盟科技有限公司 | Three-dimensional virtual fitting method, device and system |
| CN102982581A (en) * | 2011-09-05 | 2013-03-20 | 北京三星通信技术研究有限公司 | Virtual try-on system and method based on images |
| CN102509354A (en) * | 2011-11-10 | 2012-06-20 | 武汉大学 | Manufacturing method for projection digital elevation model capable of changing together with image |
| CN103310342A (en) * | 2012-03-15 | 2013-09-18 | 凹凸电子(武汉)有限公司 | Electronic fitting method and electronic fitting device |
| CN103514350A (en) * | 2012-06-27 | 2014-01-15 | 富泰华工业(深圳)有限公司 | Electronic device with virtual fit function and virtual fit method |
| CN103750817A (en) * | 2012-10-20 | 2014-04-30 | 因美吉智能科技(济南)有限公司 | Non-contact pediatric measuring method and measuring device |
| CN102982246A (en) * | 2012-12-06 | 2013-03-20 | 江苏乐买到网络科技有限公司 | Electronic fitting system in internet shopping |
| KR20140077820A (en) * | 2012-12-14 | 2014-06-24 | 한국전자통신연구원 | Method for fiiting virtual items using human body model and system for providing fitting service of virtual items |
| CN104182886A (en) * | 2013-05-24 | 2014-12-03 | 比亚迪股份有限公司 | Virtual fitting method and mobile terminal for virtual fitting |
Non-Patent Citations (4)
| Title |
|---|
| OLARU SABINA ET AL: "Virtual Fitting–Innovative Technology for Customize Clothing Design", 《PROCEDIA ENGINEERING》 * |
| 崔树芹: "三维虚拟试衣系统中参数化人体建模技术的研究", 《中国优秀硕士学位论文全文数据库信息科技辑》 * |
| 李闯: "个性化虚拟试衣技术研究", 《中国优秀硕士学位论文全文数据库工程科技Ⅰ辑》 * |
| 郭大海等: "《机载POS系统直接地理定位技术理论与实践》", 31 August 2009 * |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111080800A (en) * | 2019-12-18 | 2020-04-28 | 郭艺斌 | A virtual wearing method and virtual wearing system based on human-computer interaction |
| CN113240481A (en) * | 2021-02-09 | 2021-08-10 | 飞诺门阵(北京)科技有限公司 | Model processing method and device, electronic equipment and readable storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10529127B2 (en) | System and method for simulating realistic clothing | |
| CN105387847B (en) | Contactless measurement, measuring apparatus and its measuring system | |
| CN112639875B (en) | Dimensioning system | |
| Markiewicz et al. | 3D anthropometric algorithms for the estimation of measurements required for specialized garment design | |
| CN102982581B (en) | System for virtually trying and method based on image | |
| US20150248583A1 (en) | Image processing apparatus, image processing system, image processing method, and computer program product | |
| KR101672299B1 (en) | Automatic customized service system and method of Clothes pattern and Customized clothes manufactured by using this method | |
| CN108615248A (en) | Method for relocating, device, equipment and the storage medium of camera posture tracing process | |
| CN107180235A (en) | Human Action Recognition Algorithm Based on Kinect | |
| EP3092584B1 (en) | Mapping wearer mobility for clothing design | |
| KR102033161B1 (en) | method for transferring garment draping between avatars | |
| EP1766551A2 (en) | 3d thermal breast cancer detector | |
| KR102534876B1 (en) | Method and apparatus for determining the breathing status of a person using a depth camera | |
| JP6792273B2 (en) | Dimension data calculation device, product manufacturing device, and silhouette image generation device | |
| CN108805135A (en) | A kind of garment dimension data identification method, device and user terminal | |
| CN105956910A (en) | Clothes customized method, image shooting device and control system | |
| US11443486B2 (en) | Mobile 3D body scanning methods and apparatus | |
| CN114627490A (en) | A multi-person pose estimation method based on inertial sensors and multi-function cameras | |
| CN105913300A (en) | Human body data measurement method and measurement clothes for remote clothes customization | |
| CA3018886C (en) | Three-dimensional measuring system and measuring method with multiple measuring modes | |
| CN106803243A (en) | A kind of clothing based on Virtual Realization technology tries method on | |
| CN106773050B (en) | A kind of intelligent AR glasses virtually integrated based on two dimensional image | |
| CN109602113A (en) | Garment making data acquisition-based garment making method | |
| JP5373128B2 (en) | Person identification device, person identification method, and person identification program | |
| WO2016115536A2 (en) | Determining three-dimensional information from projections or placement of two-dimensional patterns |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| TA01 | Transfer of patent application right |
Effective date of registration: 20200515 Address after: 610000 South Tianfu Avenue 888, Huayang Street, Tianfu New District, Chengdu City, Sichuan Province Applicant after: Chengdu ocean circle Technology Co., Ltd Address before: 421001 Hunan Province, Hengyang Zhengxiang District Road No. 28 Changsheng Applicant before: Zhou Bin |
|
| TA01 | Transfer of patent application right | ||
| AD01 | Patent right deemed abandoned |
Effective date of abandoning: 20210115 |
|
| AD01 | Patent right deemed abandoned |