
WO2021086319A1 - Generation of model data for three dimensional printers - Google Patents


Info

Publication number
WO2021086319A1
WO2021086319A1 (PCT application PCT/US2019/058469)
Authority
WO
WIPO (PCT)
Prior art keywords
reference locations
object model
build
source
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2019/058469
Other languages
French (fr)
Inventor
Patrick Daney DE MARCILLAC
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Priority to PCT/US2019/058469
Publication of WO2021086319A1
Current legal status: Ceased

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B33: ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y: ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 50/00: Data acquisition or data processing for additive manufacturing
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B29: WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C: SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C 64/00: Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C 64/30: Auxiliary operations or equipment
    • B29C 64/386: Data acquisition or data processing for additive manufacturing
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00: Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20: Indexing scheme for editing of 3D models
    • G06T 2219/2021: Shape modification

Definitions

  • FIG. 1 shows an example process for generating an adjusted object model.
  • FIG. 2 shows another example process for generating an adjusted object model.
  • FIG. 3 is a flowchart showing an example method for generating object model data based on an adjusted object model.
  • FIG. 4 shows an example controller to generate object model data.
  • FIG. 5 shows an example of a computer readable medium comprising instructions to generate object model data.
  • Many products or objects are manufactured to fit against, or otherwise interface with, another object during their use or operation.
  • Examples include objects that are operated or used by a user and interface with a body part of the user, for example user input devices such as a computer mouse or games console controller, or items designed to fit to or against a body part, such as a prosthesis, eye glasses or the insole of a shoe.
  • Other examples include products designed to fit other types of naturally occurring, non-uniform object or body, such as a saddle for a horse, a birdhouse fitted to a tree branch, or a rock-climbing piton for fitting to a rock formation. There may exist discrepancies between the shape of a product and an object it is designed to interface with.
  • any object to be built which is designed to fit to or against, or interface with, another object, which may be referred to as a source object or reference object, may be customised for improved interaction with the reference object.
  • the present disclosure describes how a customised object may be determined, based on the particular topography of another object.
  • a model of an object may be automatically adjusted by comparing reference locations in model data representing a user’s body part with corresponding reference locations in the object model. This adjustment may facilitate 3D printing of an object that is more accurately fitted to the user’s body part.
  • body part model data may be used to adjust an object model and generate object model data representing an adjusted object model.
  • the process of producing a customised 3D-printed object includes: (i) obtaining body part model data; (ii) adjusting an object model and generating object model data based on the adjusted object model; and (iii) 3D printing the object defined by the model, or generating the object using the generated object model data by any other form of additive manufacturing system.
  • the body part model data may reflect the surface of a human body part, and may be received by a pre-print application. Based on the body part model data, an object model may be adjusted. This adjustment may be performed in accordance with the details provided below. In some examples, adjustment of the object model may be performed automatically, manually, or a combination of both, in the pre-print application.
  • an automatically adjusted object model generated by the pre-print application may be accepted, rejected or modified by a user.
  • This automatic adjustment of the object model may utilize machine learning techniques.
  • An entire object may be printed, of which either the whole or a part has been adjusted from an initial object model, or a part of the object may be printed for attachment to a further, standard object part after 3D printing.
  • a part of a computer mouse may be comprised in the object model data, which could then be attached to a standard mouse base after printing.
  • an interface portion of a prosthesis such as a socket portion of a prosthetic limb, may be comprised in the object model data, which may then be customised to fit a user and attached to other standard components of the prosthesis, such as the pylon of a prosthetic limb.
  • the entire object may be printed.
  • a part of the object may be printed so that it may be attached to another, standard object part after printing.
  • This enables objects to be manufactured comprising a combination of 3D-printed parts, which may be customised, and standardised off-the-shelf parts.
  • the standardised off-the-shelf parts may, for example, be 3D-printed, be molded, or be formed in any suitable manner.
  • the manufactured object may therefore comprise an article, or assembled object, formed by combining a customised printed object with an additional, standard part, such that the assembled article may fit the body part or other reference object.
  • FIG. 1 shows an example process for generating object model data based on body part model data and one of a set of object models.
  • a scan of a body part 102 is used to generate the body part model data. While a hand is shown as the body part in this example, any part of the body may be used.
  • This body part model data may be used to provide a digital representation of the body part 104 within the pre-print application 103. In some examples, this body part model data may be used to provide a visual representation of the body part, which could be moved or modified, or it may simply be provided as a data set based on which an object model may be adjusted.
  • obtaining the scan may comprise scanning the body part in a 3D scanner 101, or scanning at least a portion of the body part which is to interface with an object. While a 3D scanner is shown in FIG. 1, in some examples multiple 2D images of the body part may instead be taken from different positions, in order to capture information about the 3D shape of the body part. These images may be taken by any system capable of taking images or photographs, such as a camera phone. In this example, the use of multiple 2D images, taken sequentially at different times from different angles or positions around the body part, may introduce inconsistencies due to the movement of the body part between different images. These inconsistencies may result in an adjusted object model that is an inaccurate representation of the user’s body part.
  • the body part may be placed in a support to prevent or reduce movement during the time taken to capture the sequence of images. This may enable the body part to remain stationary between images, which may in some cases increase the accuracy of the generated body part model data.
  • errors in the generation of the body part model data may arise as a result of shadows falling on the body part and/or the reflection of light from the surface of the body part, each of which may introduce inaccuracies in the obtained representation of the user’s body part.
  • diffused light or low levels of light may be used so that there are no strong shadows.
  • the brightness of ambient light may be selected to be below a threshold level above which shadows may occur that disrupt the accuracy of the scanning process.
  • the body part may be sprayed or otherwise covered with a low reflectiveness, matt, or speckled material, in order to reduce the effect of surface reflection.
  • the body part may be marked with specific spots or other markers at predetermined locations, in order that these locations may be used to generate the body part model data.
  • these specific spots may be applied to predetermined locations on, for example, the fingers and/or the palm of the hand, and the body part model data may be generated based on the identified positions of these points of the hand.
  • body part model data may be generated from different representations generated in relation to the user’s body part.
  • the representation may be a hand print of the hand, from which data related to the hand itself is generated.
  • body part model data may be generated based on a mould of the body part.
  • a user may grip a soft, plastically deformable object to make the mould of their hand.
  • this soft object may be a modelling clay or similar material, due to its malleability.
  • the representations of the body part may then be obtained by scanning or otherwise photographing the plastically deformed object, or in some examples the object may be used as a support for the body part while the body part itself is scanned or otherwise imaged.
  • Body part model data may be generated by a pre-print application, within which adjustment of an object is performed, or obtained by the pre-print application as an input.
  • the body part model data also need not reflect an entire body part, but may instead represent a portion of that body part.
  • body part model data may represent just the fingers and certain areas of the palm of a hand, which in themselves reflect parts of the body.
  • Body part model data may comprise body part reference locations 105a - 105d, identifying particular locations on the body part.
  • these body part reference locations may be of different categories.
  • these body part reference locations may comprise categories of: press locations, which may represent locations of the body part that are intended to be adjacent a user input location of the computer mouse that is activated by pressing, for example a computer mouse button; and/or rest locations, which may represent locations on the fingers and/or palm that are intended to rest on a corresponding surface of the mouse.
  • the reference locations could instead be any locations identified on the body part or in the body part model data.
  • these body part reference locations may be identified on the body part before generating the body part model data, for example by marking the body part at the desired locations.
  • the mould may be augmented with features that indicate the body part reference locations. For example, these features may be pins or markers that are added to the mould in order to identify the respective reference locations.
  • body part reference locations may be determined automatically, manually or a combination of both.
  • body part reference locations may be determined automatically and then updated by a user.
  • the body part model data may comprise just the reference locations, whereas in some examples the body part model data may represent both the reference locations and at least a part of the surface of the body part.
  • reference locations may be points and/or areas.
  • the reference locations may be different for different categories of reference location, e.g. reference points for press locations and reference areas for rest locations.
  • reference locations may be assigned a level of significance. This level of significance may indicate which reference locations are more or less significant to the generation of the customised object.
  • particular categories of reference location may be allocated a level of significance over other categories. This level of significance may be assigned automatically, manually or a combination of both. For example, particular categories may be automatically assigned as more significant than others, which may then be updated by a user to change the significance level of any of the reference locations.
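The reference-location concepts above (press/rest categories, point versus area locations, and a per-location significance level) can be illustrated with a small data structure. This is only a sketch under assumed names; `ReferenceLocation` and its fields are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ReferenceLocation:
    """One body-part or object reference location.

    `position` is a 3D point. Setting `radius` turns the point
    into a reference *area* (e.g. for rest locations), while
    `significance` weights how strongly this location should
    drive model selection and adjustment.
    """
    position: Tuple[float, float, float]
    category: str = "rest"          # e.g. "press" or "rest"
    radius: Optional[float] = None  # None => point, value => area
    significance: float = 1.0

    def is_area(self) -> bool:
        return self.radius is not None

# A press location modelled as a point; a rest location as an area.
press = ReferenceLocation((10.0, 2.0, 5.0), category="press")
rest = ReferenceLocation((4.0, 0.0, 3.0), category="rest",
                         radius=6.0, significance=0.5)
```

This mirrors the text's suggestion that press locations may be points while rest locations may be areas, with significance levels attached per location or per category.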
  • an object model 106 may be automatically selected from a predetermined set of object models.
  • the body part model data may be linked to a predetermined set of object models for a given category of object, from which the object model may be selected.
  • the category of object may be a computer mouse and so the predetermined set may comprise a number of computer mouse models.
  • different categories of object, each having an associated set of predetermined object models, may be associated with a user’s hand or other body part, and a computer mouse may be merely one of these categories.
  • Linking the body part model data to this predetermined set may comprise determining the category of object when generating the body part model data or selecting the category after receiving the body part model data.
  • an object model 106 may be selected based on a comparison of the body part reference locations 105a - 105d to object reference locations 107a - 107d included in the object model.
  • object reference locations 107a - 107d may correspond to the body part reference locations 105a - 105d when the respective object and body part represented by the models are placed adjacent to one another in the manner in which the object is intended to be interacted with by the user’s body part, e.g. when a user’s hand is placed against the surface of a computer mouse.
  • an object model may be selected in which object reference locations 107a - 107d most closely match the positions of body part reference locations 105a - 105d, during such interaction.
  • the body part reference locations and object reference locations may correspond to the same or different categories of reference location.
  • certain categories of reference location may all be points or areas, respectively, while in other examples each individual reference location can be either a point or an area.
  • Object reference locations may not initially be in the correct positions on a given object model when compared with the body part reference locations.
  • buttons on the object model may not be in the correct positions compared to hand reference locations and so the object reference locations may accordingly be adjusted until they match a respective hand reference location.
  • Each object model may, however, have constraints on the positions of the object reference locations. For example, if the object model is a mouse, the mouse model may be constrained as to how many buttons it may have, where these buttons may be located, the sizes of the buttons and the size of the object. In some examples, any constraint relating to the design of the object may be considered. In some examples, these constraints may arise as a result of particular electronics to be fitted to a printed object.
  • the electronics may include a circuit board with associated wires that may be able to move within a certain range of positions, for example between the circuit board itself and a switch located at a mouse button. This may enable variation within a predetermined range of possible positions on the object that components of the object, e.g. mouse buttons, could have and the overall shape that the object could take, such that the object reference locations may be adjusted within these constraints.
  • positional degrees of freedom 108a - 108d of the object reference locations may comprise a specified positional variance or tolerance of the reference location in specified direction(s). These variances or positional degrees of freedom may be the same for each object reference location, or the specified positional variations may be independent from one another for the different reference locations.
  • Automatic selection of the object model may comprise selecting an object model 106 with object reference locations 107a - 107d that may be adjusted within positional degrees of freedom 108a - 108d. In some examples this may comprise selecting an object model that has object reference locations that may be adjusted until they at least closely match the body part reference locations.
  • automatic selection of the object model may comprise focusing on which of the more significant reference locations are closely matched. For example, when multiple object models are determined as being suitable for the body part model data, an object model may be selected that has closely matching reference locations having a high significance level. In some examples, an object model that does not have all reference locations closely enough located may be selected because significant reference locations are closely enough located. In this way, an object model may be chosen that is more accurately customised to the user’s specifications.
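The selection step described above, choosing the model whose reference locations can be brought closest to the body part reference locations within their positional degrees of freedom and weighted by significance, might be sketched as follows. The function names, the scalar tolerance standing in for the degrees of freedom, and the example models are all illustrative assumptions.

```python
import math

def residual(body_pt, obj_pt, tolerance):
    """Distance left over after the object reference location is
    allowed to move up to `tolerance` toward the body location
    (a scalar stand-in for the positional degrees of freedom)."""
    return max(0.0, math.dist(body_pt, obj_pt) - tolerance)

def select_model(body_locs, models):
    """Pick the model whose reference locations can be brought
    closest to the body reference locations, weighting each pair
    by its significance.

    `body_locs`: list of (point, significance).
    `models`: dict of name -> list of (point, tolerance),
    index-aligned with `body_locs`.
    """
    def score(obj_locs):
        return sum(sig * residual(b, o, tol)
                   for (b, sig), (o, tol) in zip(body_locs, obj_locs))
    return min(models, key=lambda name: score(models[name]))

# Two hand reference points; the first is twice as significant.
body = [((0, 0, 0), 2.0), ((10, 0, 0), 1.0)]
models = {
    "mouse_a": [((0, 0, 1), 0.5), ((14, 0, 0), 2.0)],
    "mouse_b": [((0, 5, 0), 0.5), ((10, 0, 0), 0.0)],
}
best = select_model(body, models)
```

With these numbers, "mouse_a" wins: its highly significant first location is nearly matched, outweighing the looser fit at the second location, which reflects the text's point that significant reference locations may dominate the choice.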
  • the object reference locations may be automatically adjusted. This may result in an adjusted object model 109 having adjusted object reference locations 110a - 110d.
  • This automatic adjustment may comprise automatic adjustment based on any combination of:
  • object reference locations may be adjusted until they match body part reference locations.
  • automatic adjustment of the object reference locations may comprise adjusting the size, shape and/or orientation angle of the object represented by the object model.
  • adjustment of the object reference locations may comprise focusing on adjusting more significant object reference locations over less significant object reference locations.
  • reference locations may comprise reference areas and/or points.
  • object reference areas may be adjusted based on a comparison with body part reference points.
  • automatic adjustment may be performed by adjusting the object reference areas within the positional degrees of freedom specified for the respective object reference area within the object model so that the body part reference point falls within the respective object reference area.
  • object reference areas may be adjusted based on a comparison with body part reference areas.
  • automatic adjustment may comprise adjusting the object reference areas within the positional degrees of freedom so that the body part reference area falls within a respective object reference area, or so that there is as great a degree of overlap as possible of the respective reference areas.
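One way to realise this adjustment step is to move each object reference location toward its body part counterpart but clamp the move to per-axis positional degrees of freedom. The helper names and the per-axis tolerance encoding below are assumptions for illustration, not the disclosure's own scheme.

```python
import math

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def adjust_within_freedom(obj_pos, body_pos, freedom):
    """Move an object reference location toward the body reference
    location, but no further than the per-axis positional degrees
    of freedom allow. `freedom` is a (dx, dy, dz) tolerance around
    the original position."""
    return tuple(
        o + clamp(b - o, -f, f)
        for o, b, f in zip(obj_pos, body_pos, freedom)
    )

def point_in_area(point, center, radius):
    """True if a body part reference point falls within a spherical
    object reference area."""
    return math.dist(point, center) <= radius

# A button centre at (10, 0, 0) may move up to 3 units in x,
# 1 unit in y, and is fixed in z; the fingertip sits at (14, 0.5, 0).
new_center = adjust_within_freedom((10, 0, 0), (14, 0.5, 0), (3, 1, 0))
```

Here x is clamped to its tolerance while y is matched fully, so the adjusted area may still be checked to see whether the body reference point now falls within it.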
  • the object model may be adjusted based on additional criteria specified in the body part model data. For example, where a hand needs to operate a computer mouse, the mouse may need to fit the wrist and arm of the user. It may be possible then to adjust the angular orientation or shape of the object model based on these additional criteria or constraints. These criteria may be added to the body part model data automatically, manually or a combination of both. The body part model data may therefore be updated to more accurately represent the user’s body part. For example, the user may change the angular orientation or shape based on their preference, or based on the particular use that will be made of the object. The user may also be able to correct for errors arising during the generation of the body part model data.
  • Automatic adjustment of the object model based on the reference locations may comprise an automatic adjustment of the shape of the object model. This may comprise assigning each reference location a particular area, for example an area surrounding the reference location. Adjusting the object reference location may then result in automatic adjustment of the surrounding area based on the adjustment made to the reference location, and so will change the overall shape of the object. In some examples a reference location may be assigned a particular reference location shape, such that moving the reference location will move the reference location shape and change the overall object shape.
  • In some examples, the object model may be adjusted using a number of adjustable object features. In the example where the object is a mouse, these adjustable mouse features may comprise:
  • the mouse model may incorporate locations where buttons may be placed during generation of the mouse by 3D printing. These buttons may have associated mouse reference locations, which may be associated with corresponding hand reference locations in the hand model, and the hand reference locations may be used to adjust the mouse reference locations of the buttons in the mouse model.
  • the adjustable mouse features may include the weight of the mouse when printed.
  • the weight may be estimated based on a calculation of the amount of material to be used for printing the mouse.
  • the weight of the mouse may be reduced in the design by reducing solid volumes in the mouse structure, but the design may allow for weights to be added at specified locations in the mouse, once manufactured, in accordance with a user’s preferences regarding weight distribution and total weight of the mouse.
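The weight estimate described above, derived from the amount of material to be printed and allowing for weights added after manufacture, can be sketched as a simple calculation. The function, the density figure, and the infill fraction are illustrative assumptions; a real slicer would also account for shells and supports.

```python
def estimated_weight_g(solid_volume_cm3, material_density_g_cm3,
                       infill_fraction, added_weights_g=0.0):
    """Rough printed weight: the internal volume is printed at
    `infill_fraction` of full density (1.0 = solid, ~0.2 = sparse
    honeycomb), and optional balance weights can be added at
    specified locations after manufacture."""
    return (solid_volume_cm3 * material_density_g_cm3 * infill_fraction
            + added_weights_g)

# Assuming PA12 nylon at roughly 1.01 g/cm^3: a 90 cm^3 mouse body
# at 20% honeycomb infill, plus a 15 g user-added balance weight.
w = estimated_weight_g(90.0, 1.01, 0.20, added_weights_g=15.0)
```

This also shows why a honeycomb filling reduces weight relative to a solid volume: the same geometry at `infill_fraction=1.0` would weigh several times more.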
  • the adjustable mouse features may comprise the filling structure of internal volumes of the mouse model. For example, solid or honeycomb fillings, or other structures having a lower density than a solid volume, may be chosen.
  • the honeycomb or other lower-density filling may provide air ventilation of the electronics in the mouse, reducing heat build-up and potentially reducing risk of faults after long term operation of the mouse. This may also reduce the overall weight of the mouse compared to using a solid filling.
  • a honeycomb filling may be applied to areas surrounding the electronics, while a plain filling may be applied elsewhere.
  • Areas of the surface of the mouse model may be allocated different textures. For example, some areas may be provided with a texture that increases grip. These textured areas may be assigned based on, for example, the rest reference locations. In some examples, these textured areas may be applied to press locations where the user might not want fingers to slip when pressing a mouse button. In some examples the mouse model may include areas that may be personalized with different markers and colours.
  • adjustable object features may comprise any features of the object and the described adjustable object features may be applied to objects other than a mouse.
  • Adjustable object features may be incorporated into the object model either automatically, manually or a combination of both. For example, a particular colour or texture may be applied to a particular area of the object model. A user may want to update this and so may modify the texture and/or colour that is applied to these areas. Automatic inclusion of adjustable object features may use a predetermined set of features, but it may be possible for a user to introduce new features not previously determined. For example, the user may introduce new colours or textures, or even update colours or textures in the predetermined set. Adjustable object features may be incorporated into the object model data when selecting the object model, or by receiving object feature data that may reflect adjustable object features.
  • the methods of adjusting an object model detailed above may be performed in any combination, and may be performed automatically, manually or a combination of both. For example, all of these described methods may be automatically performed, and then updated by a user. In other examples, each of the described adjustments may be performed separately with user update after each respective stage.
  • An adjusted object model 109 may be generated following these adjustments. This adjusted object model may comprise adjustments relating to any combination of the above described forms of adjustment.
  • This adjusted object model may be modified by a user. For example, the user may modify the automatically adjusted object model to further refine the shape. In some examples the user may introduce additional reference locations, enabling further adjustment of the object model.
  • Object model data may be generated based on the adjusted object model. This object model data may be generated by the pre-print application, or may be generated separately, either immediately following generation of the adjusted object model or at a later time.
  • object model data may not represent an entire object, but may instead represent a part of the object.
  • FIG. 2 shows an example in which the object is a mouse, and the mouse model data comprises:
  • the mouse top portion 208 is adjusted based on the hand model data, but may be attached to a standard mouse base portion 209 whose dimensions are not adjusted in dependence on the hand model. In this way the top portion alone may be printed, and then attached to the standard mouse base portion after printing.
  • an adjusted mouse model is generated, wherein the mouse model represents a top portion of the mouse.
  • the mouse model data is obtained by the pre-print application 203.
  • the hand 202 may be scanned by a 3D scanner 201.
  • the hand model data may be used to provide a visual representation 204 of the hand, whereas in other examples the hand model data will simply be a data set representing features of the shape of the hand.
  • the hand model comprises hand reference points 205a - 205d, which may be automatically assigned or may be specified by a user. The reference points may be of different categories.
  • reference point 205a, located on a finger tip of the hand, may be a “press” location or “click” point, which is a location on the hand at which a corresponding point on the mouse should be provided with a user input control such as a mouse button.
  • Other reference points such as points 205b - 205d may be rest locations at which it is specified that corresponding points on the mouse should be provided with a surface on which that part of the user’s hand can rest.
  • a predetermined set of mouse models each comprise alternative models of a mouse, or portion of a mouse, and each include mouse reference locations.
  • mouse reference locations 211a - 211d may be confined to the top portion 208, but have positional degrees of freedom 212a - 212d resulting from constraints of the base portion 209. These constraints may arise from the same features as detailed above.
  • the positional degrees of freedom may arise from electronics 210 that are fitted to a standard base portion 209.
  • the constraints on the reference points of the top portion may include constraints determined by the dimensions of a standard base portion onto which the top portion is intended to fit.
  • the mouse model may be selected based on a comparison of the hand reference locations 205a - 205d with the mouse reference locations 211a - 211d, within specified positional degrees of freedom 212a - 212d. This selection may be based on determining a mouse model from a predetermined set of mouse models in which the mouse reference locations most closely match the specified hand reference locations, or in which the hand reference locations are able to be accommodated adjacent to respective mouse reference locations within the variations permitted by the specified degrees of freedom.
  • the mouse model may then be adjusted in accordance with any combination of the above described adjustments.
  • the mouse reference locations may be adjusted.
  • adjusted mouse reference locations 214a - 214d may be generated by adjusting the mouse reference locations 211a - 211d in relation to the corresponding hand reference locations 205a - 205d.
  • An adjusted top portion 213 may be generated representing the adjusted mouse model.
  • This adjusted mouse model may be used to generate mouse model data, which may be used for printing of the top portion of the mouse.
  • the mouse model may represent any part of a mouse, and is not limited to just the whole mouse or a top portion.
  • the mouse reference locations may then be located on any part of a mouse model. For example, mouse models that comprise only a small part of the mouse may be differently constrained than mouse models that comprise larger parts of the mouse. The mouse reference locations may then be constrained as a result of various parts of the mouse, and not just the base.
  • the adjusted top portion 213 may be used to 3D print a customised mouse top portion, which fits the user’s hand as specified in the hand model data and fits a selected standard mouse base portion and electronics.
  • the customised mouse top portion may have mouse buttons located in positions specified by the user as reference points in the hand model data.
  • FIG. 3 shows an example of a method for generating object model data, which may be used to generate printer data comprising build data to control a 3D printer to generate an object.
  • the method 300 comprises initially obtaining body part model data 301.
  • an object model is automatically selected from a predetermined set of object models.
  • the object model may be selected based on the considerations described above, and in particular an object model may be selected in which the object reference locations most closely match the body part reference locations.
  • the object model may represent either an entire object or a part of the object. In this way, at least a part of the object may be selected for customisation. This may comprise selecting just the part that is to be customised, or customising that part on an entire object.
  • the object reference locations are automatically adjusted at 305, based on the body part reference locations. This automatic adjustment may be performed in accordance with the details described above. In some examples the object reference locations may be adjusted to match the body part reference locations. While the object reference locations may be adjusted to match the body part reference locations, this may not always be possible. For example, the number of object reference locations may not match the number of body part reference locations. In this case automatic adjustment may comprise adjusting the reference locations that match.
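The case where the counts differ, and only the reference locations that match are adjusted, could be handled by pairing each body part reference location with its nearest unused object reference location and leaving the rest alone. This greedy pairing is one possible strategy, not the method the disclosure mandates.

```python
import math

def match_reference_locations(body_locs, obj_locs):
    """Greedily pair each body reference location with its nearest
    unused object reference location; object locations left over
    (when the counts differ) simply remain unadjusted.

    Returns a list of (body_index, object_index) pairs."""
    available = set(range(len(obj_locs)))
    pairs = []
    for bi, b in enumerate(body_locs):
        if not available:
            break  # more body locations than object locations
        oi = min(available, key=lambda i: math.dist(b, obj_locs[i]))
        available.remove(oi)
        pairs.append((bi, oi))
    return pairs

# Three hand reference points but only two mouse reference locations.
body = [(0, 0, 0), (10, 0, 0), (20, 0, 0)]
obj = [(11, 0, 0), (1, 0, 0)]
pairs = match_reference_locations(body, obj)
```

Only the paired locations would then be adjusted toward their body part counterparts; the third body location has no match and is ignored.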
  • the object model is automatically adjusted based on criteria of the body part model data, as described above. For example, this adjustment may comprise adjusting the angular orientation or shape of the object model, where these are not determined by the object reference locations. These criteria may already form part of the body part model data, or they may be incorporated into the body part model data at 306.
  • the object model includes locations for adjustable object features. If these are present, adjustable object features may be obtained at 308 and incorporated into the object model 309.
  • the object model may comprise locations that have different textures, and a number of textures may be applied to the model.
  • the adjusted object model is modified based on user input at 311. For example, the user may modify the particular shape of the adjusted object model and/or modify the adjustable object features.
  • object model data is generated at 312, based on the adjusted object model. Where the object model represents only a part of the object, the generation of the object model data may comprise generating object model data for that part, or the entire object incorporating that customised part.
  • the object model data generated at 312 may be used for generating printer data for a 3D printer.
  • FIG. 3 illustrates an example method for the generation of object model data, which may then be used to generate print data.
  • the user may be able to adjust the body part reference locations immediately after obtaining or determining them.
  • the user may also be able to add or remove body part reference locations.
  • the ability to update automatically generated reference locations may enable the user to correct for errors in this automatic process, or allow for further customisation beyond what would be possible using the initial reference locations.
  • the user may also change the number of body part reference locations after the object model is selected, in order to change the object model that is automatically selected.
  • the adjusted object model may be saved and added to the predetermined set of object models, so that it may be used in future build preparations. This may reduce the processing time for future object customisations, as future users may be able to select an object model that already closely matches their specifications.
  • machine learning techniques may be implemented so that the system learns from previously determined object models and applies the properties of the object model to future builds.
  • the system may apply the properties to a number of identical builds so that the object may be mass produced without the need for individual approval by a user.
  • machine learning may be used to learn from the positioning of body part reference points in previous object build preparations.
  • a user may be able to select how many, and what kind of, reference locations they want to include in the body part model data, and these may be automatically positioned. These may then be updated by a user.
  • Machine learning may also enable the system to learn from previous object reference location adjustments of particular models, such as when a user updates an automatically adjusted object model, so as to be able to more efficiently adjust future object reference locations without the same amount of user input.
  • Automatic adjustment of an object model may save processing time and power. For example, rather than having to build an entire object model in the pre-print application, selecting a predetermined object model based on the body part and object reference locations means that an object model that closely matches the body part may be chosen quickly.
  • the use of specified object reference locations, and the adjustment of these object reference locations based on the corresponding body part reference locations may mean that the adjustment of these locations may be prioritised, and other surrounding areas of the object model may then be fitted around the reference locations accordingly. This is in contrast to having to adjust each individual point of the object model to the body part model.
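The fitting of surrounding areas around adjusted reference locations can be sketched as follows. This is a hypothetical illustration, not a method taken from the disclosure: each surface vertex of the object model follows the displacement of nearby reference locations with an inverse-distance falloff (the `falloff` parameter and all coordinates are assumed example values), so only the reference locations themselves need explicit adjustment.

```python
import math

# Hypothetical sketch: fit the surrounding surface around adjusted
# reference locations. Each vertex is displaced by the offsets of the
# moved reference locations, weighted so that nearby vertices follow
# closely and distant vertices barely move.
def deform_surface(vertices, ref_moves, falloff=5.0):
    """ref_moves: list of (old_position, new_position) tuples, one per
    adjusted reference location. Returns the displaced vertices."""
    out = []
    for v in vertices:
        dx = dy = dz = 0.0
        for old, new in ref_moves:
            w = math.exp(-math.dist(v, old) / falloff)
            dx += w * (new[0] - old[0])
            dy += w * (new[1] - old[1])
            dz += w * (new[2] - old[2])
        out.append((v[0] + dx, v[1] + dy, v[2] + dz))
    return out

# A vertex at a reference location follows its move exactly; a vertex
# far away is essentially unaffected.
original = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0)]
moved = deform_surface(original, [((0.0, 0.0, 0.0), (1.0, 2.0, 0.0))])
```

Under this sketch, adjusting a handful of reference locations propagates smoothly to the rest of the surface, rather than each individual point of the object model having to be adjusted to the body part model.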
  • FIG. 4 shows an example of a controller 400 to generate object model data.
  • the controller 400 comprises a processor 401 and a memory 402. Stored within the memory 402 are instructions 403 for generating object model data according to any of the examples described above.
  • the controller 400 may be part of a computer running the instructions 403.
  • the controller 400 may be part of a 3D printer which may be used to run the instructions 403 after obtaining body part model data.
  • FIG. 5 shows a memory 502, which is an example of a computer readable medium storing instructions 510, 511, 512, 513 that, when executed by a processor 500 communicably coupled to an additive manufacturing system, in this case a 3D printer 501, cause the processor 500 to generate object model data in accordance with any of the examples described above.
  • the computer readable medium 502 may be any form of storage device capable of storing executable instructions, such as a non-transient computer readable medium, for example Random Access Memory (RAM), Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, or the like.
  • references above to a body part relate to examples in which the generated object model data is adjusted on the basis of a body part, for example a body part of a user or intended user of the object.
  • references in the above description to body part model data, body part reference locations and a body part model will be understood to relate instead to the model data, reference locations and model, respectively, of such an object.


Abstract

Body part model data is obtained, representing a human body part and comprising body part reference locations. An object model is selected based on comparison of object reference locations and body part reference locations. The object model is adjusted by adjusting the object reference locations based on the body part reference locations, and object model data is generated based on the adjusted object model, for generation by a three dimensional printer.

Description

Generation of model data for three dimensional printers
BACKGROUND
[0001] Many products are designed to interface with another object during use or operation, and in some cases the object may be a naturally occurring object or a body part of a human user or animal. Different shapes or sizes of such products may be made available to suit different uses or users.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 shows an example process for generating an adjusted object model.
[0003] FIG. 2 shows another example process for generating an adjusted object model.
[0004] FIG. 3 is a flowchart showing an example method for generating object model data based on an adjusted object model.
[0005] FIG. 4 shows an example controller to generate object model data.
[0006] FIG. 5 shows an example of a computer readable medium comprising instructions to generate object model data.
DETAILED DESCRIPTION
[0007] Many products or objects are manufactured to fit against, or otherwise interface with, another object during their use or operation. Examples are objects that are operated or used by a user, and interface with a body part of the user, for example user input devices such as a computer mouse or games console controller, or items designed to fit to or against a body part, such as a prosthesis, eye glasses or the insole of a shoe. Other examples include products designed to fit other types of naturally occurring, non-uniform object or body, such as a saddle for a horse, a birdhouse fitted to a tree branch, or a rock-climbing piton for fitting to a rock formation. There may exist discrepancies between the shape of a product and an object it is designed to interface with. For example, during operation of an object by a user via a physical interface, there may exist inconsistencies between the design of the object and the topography of the part of the body that is being used to operate the object. For example, the body part may experience strain due to having to grip an object in a certain manner. In order to better reflect the specifications of a user, such as their respective body part, objects may be customised for improved interaction with a user. Similarly, any object to be built, which is designed to fit to or against, or interface with, another object, which may be referred to as a source object or reference object, may be customised for improved interaction with the reference object.
[0008] The present disclosure describes how a customised object may be determined, based on the particular topography of another object. In the following description, reference is made to a customised object fitting a user’s body part. However, this is an example, and it will be appreciated that the same principle may be applied to a customised object designed to fit any other type of object.
[0009] Accordingly, in the example of an object customised to fit a body part, a model of an object may be automatically adjusted by comparing reference locations in model data representing a user’s body part with corresponding reference locations in the object model. This adjustment may facilitate 3D printing of an object that is more accurately fitted to the user’s body part.
[0010] The present disclosure describes how, during a pre-print procedure, body part model data may be used to adjust an object model and generate object model data representing an adjusted object model.
[0011] In an example of the disclosure, the process of producing a customised 3D-printed object includes: (i) obtaining body part model data; (ii) adjusting an object model and generating object model data based on the adjusted object model; and (iii) 3D printing the object defined by the model, or generating the object using the generated object model data by any other form of additive manufacturing system. The body part model data may reflect the surface of a human body part, and may be received by a pre-print application. Based on the body part model data, an object model may be adjusted. This adjustment may be performed in accordance with the details provided below. In some examples, adjustment of the object model may be performed automatically, manually, or a combination of both, in the pre-print application. For example, an automatically adjusted object model generated by the pre-print application may be accepted, rejected or modified by a user. This automatic adjustment of the object model may utilize machine learning techniques. An entire object may be printed, of which either the whole or a part has been adjusted from an initial object model, or a part of the object may be printed for attachment to a further, standard object part after 3D printing. For example, a part of a computer mouse may be comprised in the object model data, which could then be attached to a standard mouse base after printing. In another example, an interface portion of a prosthesis, such as a socket portion of a prosthetic limb, may be comprised in the object model data, which may then be customised to fit a user and attached to other standard components of the prosthesis, such as the pylon of a prosthetic limb.
[0012] In some examples, the entire object may be printed. In other examples, a part of the object may be printed so that it may be attached to another, standard object part after printing. This enables objects to be manufactured comprising a combination of 3D-printed parts, which may be customised, and standardised off-the-shelf parts. The standardised off-the-shelf parts may, for example, be 3D-printed, be molded, or be formed in any suitable manner. The manufactured object may therefore comprise an article, or assembled object, formed by combining a customised printed object with an additional, standard part, such that the assembled article may fit the body part or other reference object.
[0013] FIG. 1 shows an example process for generating object model data based on body part model data and one of a set of object models. In this example, a scan of a body part 102 is used to generate the body part model data. While a hand is shown as the body part in this example, any part of the body may be used. This body part model data may be used to provide a digital representation of the body part 104 within the pre-print application 103. In some examples, this body part model data may be used to provide a visual representation of the body part, which could be moved or modified, or it may simply be provided as a data set based on which an object model may be adjusted.
[0014] In some examples, obtaining the scan may comprise scanning the body part in a 3D scanner 101, or scanning at least a portion of the body part which is to interface with an object. While a 3D scanner is shown in FIG. 1, in some examples multiple 2D images of the body part may instead be taken from different positions, in order to capture information about the 3D shape of the body part. These images may be taken by any system capable of taking images or photographs, such as a camera phone. In this example, the use of multiple 2D images, taken sequentially at different times from different angles or positions around the body part, may introduce inconsistencies due to the movement of the body part between different images. These inconsistencies may result in an adjusted object model that is an inaccurate representation of the user’s body part. In order to account for possible movements in the body part between different images, the body part may be placed in a support to prevent or reduce movement during the time taken to capture the sequence of images. This may enable the body part to remain stationary between images, which may in some cases increase the accuracy of the generated body part model data.
[0015] In some examples, errors in the generation of the body part model data may arise as a result of shadows falling on the body part and/or the reflection of light from the surface of the body part, each of which may introduce inaccuracies in the obtained representation of the user’s body part. In order to address the issue of shadows, diffused light or low levels of light may be used so that there are no strong shadows. For example, the brightness of ambient light may be selected to be below a threshold level above which shadows may occur that disrupt the accuracy of the scanning process. In order to address the issue of reflections, in some examples the body part may be sprayed or otherwise covered with a low reflectiveness, matt, or speckled material, in order to reduce the effect of surface reflection. In another example, the body part may be marked with specific spots or other markers at predetermined locations, in order that these locations may be used to generate the body part model data. In the example that the body part is a hand, these specific spots may be applied to predetermined locations on, for example, the fingers and/or the palm of the hand, and the body part model data may be generated based on the identified positions of these points of the hand.
[0016] In some examples, body part model data may be generated from different representations generated in relation to the user’s body part. For example, the representation may be a hand print of the hand, from which data related to the hand itself is generated. In other examples, body part model data may be generated based on a mould of the body part. For example, a user may grip a soft, plastically deformable object to make the mould of their hand. In some examples this soft object may be a modelling clay or similar material, due to its malleability. The representations of the body part may then be obtained by scanning or otherwise photographing the plastically deformed object, or in some examples the object may be used as a support for the body part while the body part itself is scanned or otherwise imaged.
[0017] Body part model data may be generated by a pre-print application, within which adjustment of an object is performed, or obtained by the pre-print application as an input. The body part model data also need not reflect an entire body part, but may instead represent a portion of that body part. For example, body part model data may represent just the fingers and certain areas of the palm of a hand, which in themselves reflect parts of the body.
[0018] Body part model data may comprise body part reference locations 105a - 105d, identifying particular locations on the body part. In some examples, these body part reference locations may be of different categories. For example, where the body part is the hand and the object is a computer mouse, these body part reference locations may comprise categories of:
- Press locations, which may represent locations of the body part that are intended to be adjacent a user input location of the computer mouse that is activated by pressing, for example a computer mouse button; and/or
- Rest locations, which may represent locations on the fingers and/or palm that are intended to rest on a corresponding surface of the mouse.
[0019] This list is not exhaustive, and the reference locations could instead be any locations identified on the body part or in the body part model data. In some examples, these body part reference locations may be identified on the body part before generating the body part model data, for example by marking the body part at the desired locations. If moulding the body part, the mould may be augmented with features that indicate the body part reference locations. For example, these features may be pins or markers that are added to the mould in order to identify the respective reference locations. In some examples body part reference locations may be determined automatically, manually or a combination of both. For example, body part reference locations may be determined automatically and then updated by a user. In some examples the body part model data may comprise just the reference locations, whereas in some examples the body part model data may represent both the reference locations and at least a part of the surface of the body part.
[0020] In some examples, reference locations may be points and/or areas. In some examples the reference locations may be different for different categories of reference location, e.g. reference points for press locations and reference areas for rest locations. In some examples, reference locations may be assigned a level of significance. This level of significance may indicate which reference locations are more or less significant to the generation of the customised object. For example, particular categories of reference location may be allocated a level of significance over other categories. This level of significance may be assigned automatically, manually or a combination of both. For example, particular categories may be automatically assigned as more significant than others, which may then be updated by a user to change the significance level of any of the reference locations.
[0021] Based on the body part model data, an object model 106 may be automatically selected from a predetermined set of object models. The body part model data may be linked to a predetermined set of object models for a given category of object, from which the object model may be selected. For example, the category of object may be a computer mouse and so the predetermined set may comprise a number of computer mouse models. In other examples, different categories of object, each having an associated set of predetermined object models, may be associated with a user’s hand or other body part, and a computer mouse may be merely one of these categories. Linking the object part model data to this predetermined set may comprise determining the category of object when generating the body part model data or selecting the category after receiving the body part model data. From the predetermined set of object models, an object model 106 may be selected based on a comparison of the body part reference locations 105a - 105d to object reference locations 107a - 107d included in the object model. In some examples, object reference locations 107a - 107d may correspond to the body part reference locations 105a - 105d when the respective object and body part represented by the models are placed adjacent to one another in the manner in which the object is intended to be interacted with by the user’s body part, e.g. when a user’s hand is placed against the surface of a computer mouse. In that case, an object model may be selected in which object reference locations 107a - 107d most closely match the positions of body part reference locations 105a - 105d, during such interaction. In some examples the body part reference locations and object reference locations may correspond to the same or different categories of reference location. 
In some examples, certain categories of reference location may all be points or areas, respectively, while in other examples each individual reference location can be either a point or an area.
[0022] Object reference locations may not initially be in the correct positions on a given object model when compared with the body part reference locations. For example, if the object model is a computer mouse, then buttons on the object model may not be in the correct positions compared to hand reference locations and so the object reference locations may accordingly be adjusted until they match a respective hand reference location. Each object model may, however, have constraints on the positions of the object reference locations. For example, if the object model is a mouse, the mouse model may be constrained as to how many buttons it may have, where these buttons may be located, the sizes of the buttons and the size of the object. In some examples, any constraint relating to the design of the object may be considered. In some examples, these constraints may arise as a result of particular electronics to be fitted to a printed object. For example, the electronics may include a circuit board with associated wires that may be able to move within a certain range of positions, for example between the circuit board itself and a switch located at a mouse button. This may enable variation within a predetermined range of possible positions on the object that components of the object, e.g. mouse buttons, could have and the overall shape that the object could take, such that the object reference locations may be adjusted within these constraints.
[0023] These constraints may result in positional degrees of freedom 108a - 108d of the object reference locations, which may comprise a specified positional variance or tolerance of the reference location in specified direction(s). These variances or positional degrees of freedom may be the same for each object reference location, or the specified positional variations may be independent from one another for the different reference locations. Automatic selection of the object model may comprise selecting an object model 106 with object reference locations 107a - 107d that may be adjusted within positional degrees of freedom 108a - 108d. In some examples this may comprise selecting an object model that has object reference locations that may be adjusted until they at least closely match the body part reference locations. In the example that certain reference locations are assigned levels of significance, automatic selection of the object model may comprise focusing on which of the more significant reference locations are closely matched. For example, when multiple object models are determined as being suitable for the body part model data, an object model may be selected that has closely matching reference locations having a high significance level. In some examples, an object model that does not have all reference locations closely enough located may be selected because significant reference locations are closely enough located. In this way, an object model may be chosen that is more accurately customised to the user’s specifications.
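As a rough sketch of such a selection (the data layout, tolerances and significance weights below are assumptions for illustration, not part of the disclosure), each candidate model may be scored by the distance remaining after each object reference location is moved as far as its positional degree of freedom allows, weighted by significance, with the best-scoring model selected:

```python
import math

def residual(obj_pos, body_pos, tolerance):
    # Distance left over after moving obj_pos up to `tolerance`
    # (its positional degree of freedom) towards body_pos.
    return max(0.0, math.dist(obj_pos, body_pos) - tolerance)

def select_model(models, body_refs, significance):
    """models: {name: {ref_key: (position, tolerance)}};
    significance: weight per reference key, higher = more significant."""
    def score(name):
        return sum(significance.get(k, 1.0) * residual(p, body_refs[k], tol)
                   for k, (p, tol) in models[name].items() if k in body_refs)
    return min(models, key=score)

# Illustrative candidate mouse models, each with one button reference
# location and a 2 mm positional tolerance:
models = {
    "mouse_small": {"button_1": ((10.0, 40.0, 18.0), 2.0)},
    "mouse_large": {"button_1": ((30.0, 40.0, 18.0), 2.0)},
}
body_refs = {"button_1": (11.0, 40.0, 18.0)}
chosen = select_model(models, body_refs, {"button_1": 2.0})
```

Here "mouse_small" would be chosen, since its button reference location can be brought onto the hand reference location within its tolerance, whereas "mouse_large" cannot.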
[0024] Once the object model has been selected, the object reference locations may be automatically adjusted. This may result in an adjusted object model 109 having adjusted object reference locations 110a - 110d.
[0025] This automatic adjustment may comprise automatic adjustment based on any combination of:
- Comparison of the body part reference locations and object reference locations;
- Additional criteria of the body part model data; and
- Adjustable object features.
[0026] For example, object reference locations may be adjusted until they match body part reference locations. In some examples, automatic adjustment of the object reference locations may comprise adjusting the size, shape and/or orientation angle of the object represented by the object model. In the example that the reference locations are assigned a significance level, adjustment of the object reference locations may comprise focusing on adjusting more significant object reference locations over less significant object reference locations.
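A minimal sketch of this matching step (the keys and coordinates are illustrative assumptions, not from the disclosure): only object reference locations that have a counterpart among the body part reference locations are moved, and any unmatched locations are left unchanged.

```python
def adjust_matched_locations(object_refs, body_refs):
    """Move each object reference location onto its matching body part
    reference location; unmatched object locations stay as they are."""
    adjusted = dict(object_refs)  # copy so the original model is untouched
    for key, body_pos in body_refs.items():
        if key in adjusted:
            adjusted[key] = body_pos
    return adjusted

# Hypothetical reference locations: the hand model has no "rest_palm"
# counterpart, and the object model has no "rest_thumb" location.
object_refs = {"button_1": (10.0, 42.0, 18.0), "rest_palm": (0.0, 5.0, 9.0)}
body_refs = {"button_1": (11.5, 40.0, 19.0), "rest_thumb": (-8.0, 12.0, 4.0)}
adjusted = adjust_matched_locations(object_refs, body_refs)
```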
[0027] In some examples, as described above, reference locations may comprise reference areas and/or points. For example, object reference areas may be adjusted based on a comparison with body part reference points. In this example, automatic adjustment may be performed by adjusting the object reference areas within the positional degrees of freedom specified for the respective object reference area within the object model so that the body part reference point falls within the respective object reference area. In some examples, object reference areas may be adjusted based on a comparison with body part reference areas. In this example, automatic adjustment may comprise adjusting the object reference areas within the positional degrees of freedom so that the body part reference area falls within a respective object reference area, or so that there is as great a degree of overlap as possible of the respective reference areas.
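The area adjustment described above can be sketched as follows, using circular 2D areas for brevity (the geometry and parameter names are assumptions for illustration): the centre of an object reference area is shifted towards the body part reference point, but only within the area's positional degree of freedom.

```python
import math

def fit_area_to_point(center, radius, max_shift, point):
    """Shift a circular reference area's centre towards `point` by at
    most `max_shift`. Returns the new centre, or None if the point still
    cannot fall within the area inside the allowed degree of freedom."""
    d = math.dist(center, point)
    if d <= radius:
        return center            # point already falls within the area
    needed = d - radius          # minimum shift to bring the point inside
    if needed > max_shift:
        return None              # outside this model's constraints
    t = needed / d               # fraction of the way towards the point
    return (center[0] + t * (point[0] - center[0]),
            center[1] + t * (point[1] - center[1]))

# A point 4 units away from a unit-radius area needs a shift of 3 units,
# which is within a 5-unit degree of freedom:
shifted = fit_area_to_point((0.0, 0.0), 1.0, 5.0, (4.0, 0.0))
```

A `None` result would correspond to a candidate model whose constraints cannot accommodate the body part reference point, which might then be rejected during model selection.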
[0028] In some examples, the object model may be adjusted based on additional criteria specified in the body part model data. For example, where a hand needs to operate a computer mouse, the mouse may need to fit the wrist and arm of the user. It may be possible then to adjust the angular orientation or shape of the object model based on these additional criteria or constraints. These criteria may be added to the body part model data automatically, manually or a combination of both. The body part model data may therefore be updated to more accurately represent the user’s body part. For example, the user may change the angular orientation or shape based on their preference, or based on the particular use that will be made of the object. The user may also be able to correct for errors arising during the generation of the body part model data.
[0029] Automatic adjustment of the object model based on the reference locations may comprise an automatic adjustment of the shape of the object model. This may comprise assigning each reference location a particular area, for example an area surrounding the reference location. Adjusting the object reference location may then result in automatic adjustment of the surrounding area based on the adjustment made to the reference location, and so will change the overall shape of the object. In some examples a reference location may be assigned a particular reference location shape, such that moving the reference location will move the reference location shape and change the overall object shape.
[0030] In some examples, the object model may be adjusted using a number of adjustable object features. In the example where the object is a mouse, these adjustable mouse features may comprise:
- Buttons and their locations on the mouse model;
- Weight of the mouse;
- Filling structure of internal volumes of the mouse; and/or
- Texture of the mouse surface.
[0031] For example, the mouse model may incorporate locations where buttons may be placed during generation of the mouse by 3D printing. These buttons may have associated mouse reference locations, which may be associated with corresponding hand reference locations in the hand model, and the hand reference locations may be used to adjust the mouse reference locations of the buttons in the mouse model.
[0032] The adjustable mouse features may include the weight of the mouse when printed. For example, the weight may be estimated based on a calculation of the amount of material to be used for printing the mouse. The weight of the mouse may be reduced in the design by reducing solid volumes in the mouse structure, but the design may allow for weights to be added at specified locations in the mouse, once manufactured, in accordance with a user’s preferences regarding weight distribution and total weight of the mouse.
[0033] In some examples, the adjustable mouse features may comprise the filling structure of internal volumes of the mouse model. For example, solid or honeycomb fillings, or other structures having a lower density than a solid volume, may be chosen. The honeycomb or other lower-density filling may provide air ventilation of the electronics in the mouse, reducing heat build-up and potentially reducing risk of faults after long term operation of the mouse. This may also reduce the overall weight of the mouse compared to using a solid filling. In some examples there may not be a single filling type for the entire mouse model, but different areas may be assigned different filling types. For example, a honeycomb filling may be applied to areas surrounding the electronics, while a plain filling may be applied elsewhere. There may be a library of different filling types, which may be applied automatically or with user input to different areas of the mouse structure.
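As a hedged illustration of such a weight estimate (the density and fill fractions below are invented example values, not figures from the disclosure), the printed weight can be approximated from the material volume, with honeycomb or hollow internal regions contributing only a fraction of their solid weight:

```python
PLA_DENSITY_G_PER_CM3 = 1.24  # assumed example material density
FILL_FRACTION = {"solid": 1.0, "honeycomb": 0.3, "hollow": 0.0}

def estimate_weight_g(shell_volume_cm3, internal_volumes):
    """internal_volumes: list of (volume_cm3, fill_type) pairs, one per
    internal region of the model. The outer shell is taken as solid."""
    volume = shell_volume_cm3
    for v, fill in internal_volumes:
        volume += v * FILL_FRACTION[fill]
    return volume * PLA_DENSITY_G_PER_CM3

# e.g. a 20 cm3 shell, a 40 cm3 honeycomb region around the electronics
# and a 30 cm3 hollow region elsewhere:
weight = estimate_weight_g(20.0, [(40.0, "honeycomb"), (30.0, "hollow")])
```

An estimate of this kind could be recomputed as the user switches filling types for different regions, so the effect on total weight is visible before printing.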
[0034] Areas of the surface of the mouse model may be allocated different textures. For example, some areas may be provided with a texture that increases grip. These textured areas may be assigned based on, for example, the rest reference locations. In some examples, these textured areas may be applied to press locations where the user might not want fingers to slip when pressing a mouse button. In some examples the mouse model may include areas that may be personalized with different markers and colours.
[0035] These examples of adjustable object features detailed above are not an exhaustive list, and instead the adjustable object features may comprise any features of the object and the described adjustable object features may be applied to objects other than a mouse. Adjustable object features may be incorporated into the object model either automatically, manually or a combination of both. For example, a particular colour or texture may be applied to a particular area of the object model. A user may want to update this and so may modify the texture and/or colour that is applied to these areas. Automatic inclusion of adjustable object features may use a predetermined set of features, but it may be possible for a user to introduce new features not previously determined. For example, the user may introduce new colours or textures, or even update colours or textures in the predetermined set. Adjustable object features may be incorporated into the object model data when selecting the object model, or by receiving object feature data that may reflect adjustable object features.
[0036] The methods of adjusting an object model detailed above may be performed in any combination, and may be performed automatically, manually or a combination of both. For example, all of these described methods may be automatically performed, and then updated by a user. In other examples, each of the described adjustments may be performed separately with user update after each respective stage.

[0037] An adjusted object model 109 may be generated following these adjustments. This adjusted object model may comprise adjustments relating to any combination of the above described forms of adjustment. This adjusted object model may be modified by a user. For example, the user may modify the automatically adjusted object model to further refine the shape. In some examples the user may introduce additional reference locations, enabling further adjustment of the object model.
[0038] Object model data may be generated based on the adjusted object model. This object model data may be generated by the pre-print application, or may be generated separately, either immediately following generation of the adjusted object model or at a later time.
[0039] In some examples object model data may not represent an entire object, but may instead represent a part of the object. FIG. 2 shows an example in which the object is a mouse, and the mouse model data comprises:
- A base portion 209; and
- A top portion 208.
[0040] In this example, the mouse top portion 208 is adjusted based on the hand model data, but may be attached to a standard mouse base portion 209 whose dimensions are not adjusted in dependence on the hand model. In this way the top portion alone may be printed, and then attached to the standard mouse base portion after printing.
[0041] In the example of FIG. 2, an adjusted mouse model is generated, wherein the mouse model represents a top portion of the mouse. This process may be similar to the process depicted in FIG. 1. In FIG. 2, the mouse model data is obtained by the pre-print application 203. As in FIG. 1, the hand 202 may be scanned by a 3D scanner 201. In some examples the hand model data may be used to provide a visual representation 204 of the hand, whereas in other examples the hand model data will simply be a data set representing features of the shape of the hand. The hand model comprises hand reference points 205a - 205d, which may be automatically assigned or may be specified by a user. The reference points may be of different categories. For example, reference point 205a, located on a fingertip of the hand, may be a “press” location or “click” point, which is a location on the hand at which a corresponding point on the mouse should be provided with a user input control such as a mouse button. Other reference points such as points 205b - 205d may be rest locations at which it is specified that corresponding points on the mouse should be provided with a surface on which that part of the user’s hand can rest.
[0042] A predetermined set of mouse models comprises alternative models of a mouse, or of a portion of a mouse, each including mouse reference locations. In this example, mouse reference locations 211a - 211d may be confined to the top portion 208, but have positional degrees of freedom 212a - 212d resulting from constraints of the base portion 209. These constraints may arise from the same features as detailed above. For example, the positional degrees of freedom may result from electronics 210 that are fitted to a standard base portion 209. In the case where the mouse model comprises a base portion and a top portion, the constraints on the reference points of the top portion may include constraints determined by the dimensions of a standard base portion onto which the top portion is intended to fit.
[0043] The mouse model may be selected based on a comparison of the hand reference locations 205a - 205d with the mouse reference locations 211a - 211d, within specified positional degrees of freedom 212a - 212d. This selection may be based on determining a mouse model from a predetermined set of mouse models in which the mouse reference locations most closely match the specified hand reference locations, or in which the hand reference locations are able to be accommodated adjacent to respective mouse reference locations within the variations permitted by the specified degrees of freedom. Once the mouse model is selected, the mouse model may then be adjusted in accordance with any combination of the above described adjustments. For example, the mouse reference locations may be adjusted. In this example, adjusted mouse reference locations 214a - 214d may be generated by adjusting the mouse reference locations 211a - 211d in relation to the corresponding hand reference locations 205a - 205d.
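The selection described in this paragraph might be sketched as follows. The dictionary layout, the Euclidean scoring, and the feasibility test against each reference location's positional degree of freedom are illustrative assumptions, not an implementation defined by the application.

```python
import math

# Hypothetical sketch of the selection step: choose the predetermined model
# whose reference locations, within their specified positional degrees of
# freedom, best accommodate the hand reference locations.

def select_model(hand_refs, models):
    """hand_refs maps a name to an (x, y, z) position; each model carries
    'refs': {name: {"pos": (x, y, z), "freedom": max_displacement}}."""
    best, best_score = None, float("inf")
    for model in models:
        score, feasible = 0.0, True
        for name, hand_pos in hand_refs.items():
            ref = model["refs"].get(name)
            # infeasible if the model lacks this reference location, or the
            # hand location falls outside the allowed degree of freedom
            if ref is None or math.dist(hand_pos, ref["pos"]) > ref["freedom"]:
                feasible = False
                break
            score += math.dist(hand_pos, ref["pos"])  # closer -> lower score
        if feasible and score < best_score:
            best, best_score = model, score
    return best
```

Returning `None` when no model can accommodate the hand within its degrees of freedom reflects the case where no predetermined model is suitable; a real pre-print application might instead fall back to the nearest model and flag the mismatch for the user.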
[0044] An adjusted top portion 213 may be generated representing the adjusted mouse model. This adjusted mouse model may be used to generate mouse model data, which may be used for printing of the top portion of the mouse. It should be understood that the mouse model may represent any part of a mouse, and is not limited to just the whole mouse or a top portion. The mouse reference locations may then be located on any part of a mouse model. For example, mouse models that comprise only a small part of the mouse may be differently constrained than mouse models that comprise larger parts of the mouse. The mouse reference locations may then be constrained as a result of various parts of the mouse, and not just the base. Once the adjusted mouse model is generated, the adjusted top portion 213 may be used to 3D print a customised top portion of a mouse, which is customised to fit the user’s hand as specified in the hand model data, and fits a selected standard mouse base portion and electronics. As well as fitting against the user’s hand, the customised mouse top portion may have mouse buttons located in positions specified by the user as reference points in the hand model data.
[0045] FIG. 3 shows an example of a method for generating object model data, which may be used to generate printer data comprising build data to control a 3D printer to generate an object.
[0046] The method 300 comprises initially obtaining body part model data 301. At 302 it is determined whether the body part model data includes body part reference locations. If they are not included in the body part model data, these body part reference locations are determined at 303. This may be done automatically, manually, or a combination of both. For example, body part reference locations may be added automatically, and then approved or updated by a user. It may also be possible to introduce or remove reference locations at later stages of the method 300.
[0047] At 304 an object model is automatically selected from a predetermined set of object models. The object model may be selected based on the considerations described above, and in particular an object model may be selected in which the object reference locations most closely match the body part reference locations. The object model may represent either an entire object or a part of the object. In this way, at least a part of the object may be selected for customisation. This may comprise selecting just the part that is to be customised, or customising that part on an entire object.
[0048] The object reference locations are automatically adjusted at 305, based on the body part reference locations. This automatic adjustment may be performed in accordance with the details described above. In some examples the object reference locations may be adjusted to match the body part reference locations, although this may not always be possible. For example, the number of object reference locations may not match the number of body part reference locations. In this case automatic adjustment may comprise adjusting only the reference locations that do match. At 306 the object model is automatically adjusted based on criteria of the body part model data, as described above. For example, this adjustment may comprise adjusting the angular orientation or shape of the object model, where these are not determined by the object reference locations. These criteria may already form part of the body part model data, or they may be incorporated into the body part model data at 306.
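The adjustment at 305, including the handling of mismatched reference-location counts, might be sketched as follows. The data layout, the pairing by name and category, and the choice to leave unmatched locations unchanged are assumptions made for illustration.

```python
# Illustrative sketch of the automatic adjustment at 305: each build object
# reference location is moved to its body part counterpart, pairing by name
# and category; locations without a counterpart, as when the counts differ,
# are left unchanged.

def adjust_refs(object_refs, body_refs):
    """Both arguments map a name to {"pos": (x, y, z), "category": str};
    returns a new mapping with adjusted positions."""
    adjusted = {}
    for name, obj in object_refs.items():
        body = body_refs.get(name)
        if body is not None and body["category"] == obj["category"]:
            # matched pair of the same category: adopt the body part position
            adjusted[name] = {**obj, "pos": body["pos"]}
        else:
            # no counterpart of the same category: leave unchanged
            adjusted[name] = dict(obj)
    return adjusted
```

A fuller implementation would also respect the positional degrees of freedom of each build object reference location, clamping the adjusted position rather than adopting the body part position outright.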
[0049] At 307 it is determined whether the object model includes locations for adjustable object features. If these are present, adjustable object features may be obtained at 308 and incorporated into the object model 309. For example, the object model may comprise locations that have different textures, and a number of textures may be applied to the model.
[0050] Once the adjustable object features have been incorporated, or in the event that it is determined at 307 that the object model does not include such features, at 310 it is determined whether a user wants to modify the adjusted object model. If they do, the adjusted object model is modified based on user input at 311. For example, the user may modify the particular shape of the adjusted object model and/or modify the adjustable object features. Once the adjusted object model has been modified at 311, or in the event that the user does not wish to modify the adjusted object model at 310, object model data is generated at 312, based on the adjusted object model. Where the object model represents only a part of the object, the generation of the object model data may comprise generating object model data for that part, or the entire object incorporating that customised part. The object model data generated at 312 may be used for generating printer data for a 3D printer.
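The flow of method 300 described above might be condensed into a runnable sketch as follows, using plain dictionaries. The field names and the deliberately simplified stage implementations are assumptions for illustration; the application does not define this API.

```python
# A condensed sketch of method 300 (FIG. 3); numbered comments refer to
# the stages described in the text. All names are illustrative.

def generate_object_model_data(body_model, model_library, user_edit=None):
    # 301-303: obtain the body part reference locations (assumed here to
    # already be present in the body part model data)
    refs = body_model.get("refs") or {}
    # 304: select the model sharing the most reference location names
    # (a stand-in for the closest-match comparison described above)
    model = max(model_library,
                key=lambda m: len(refs.keys() & m["refs"].keys()))
    model = {**model, "refs": dict(model["refs"])}
    # 305-306: adjust matching object reference locations to the body
    # part reference locations
    for name in refs.keys() & model["refs"].keys():
        model["refs"][name] = refs[name]
    # 307-309: incorporate adjustable object features where slots exist
    if model.get("feature_slots"):
        model["features"] = {slot: "default" for slot in model["feature_slots"]}
    # 310-311: optional user modification of the adjusted object model
    if user_edit is not None:
        model = user_edit(model)
    # 312: the adjusted model serves as the object model data
    return model
```

Passing a `user_edit` callable corresponds to the optional modification at 310-311; omitting it yields the fully automatic path through the method.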
[0051] It should be understood that FIG. 3 illustrates an example method for the generation of object model data, which may then be used to generate print data. In some examples, it may be possible to have user input at various stages. For example, the user may be able to adjust the body part reference locations immediately after obtaining or determining them. The user may also be able to add or remove body part reference locations. The ability to update automatically generated reference locations may enable the user to correct for errors in this automatic process, or allow for further customisation beyond what would be possible using the initial reference locations. The user may also change the number of body part reference locations after the object model is selected, in order to change the object model that is automatically selected.

[0052] In some examples the adjusted object model may be saved and added to the predetermined set of object models, so that it may be used in future build preparations. This may reduce the processing time for future object customisations, wherein future users may be able to select an object model that already closely matches their specifications.
[0053] In some examples, machine learning techniques may be implemented so that the system learns from previously determined object models and applies the properties of the object model to future builds. In one application of this, the system may apply the properties to a number of identical builds so that the object may be mass produced without the need for individual approval by a user. In some examples, where body part reference locations are determined automatically, machine learning may be used to learn from the positioning of body part reference points in previous object build preparations. In this example, a user may be able to select how many, and what kind of, reference locations they want to include in the body part model data, and these may be automatically positioned. These may then be updated by a user. Machine learning may also enable the system to learn from previous object reference location adjustments of particular models, such as when a user updates an automatically adjusted object model, so as to be able to more efficiently adjust future object reference locations without the same amount of user input.
[0054] Automatic adjustment of an object model may save processing time and power. For example, rather than having to build an entire object model in the pre-print application, selecting a predetermined object model based on the body part and object reference locations means that an object model that closely matches the body part may be chosen quickly. The use of specified object reference locations, and the adjustment of these object reference locations based on the corresponding body part reference locations may mean that the adjustment of these locations may be prioritised, and other surrounding areas of the object model may then be fitted around the reference locations accordingly. This is in contrast to having to adjust each individual point of the object model to the body part model.
[0055] It should be understood that while a computer mouse has been referred to above as an example, any type of object could be adjusted through the described methods.
[0056] FIG. 4 shows an example of a controller 400 to generate object model data. The controller 400 comprises a processor 401 and a memory 402. Stored within the memory 402 are instructions 403 for generating object model data according to any of the examples described above. In one example, the controller 400 may be part of a computer running the instructions 403. In another example, the controller 400 may be part of a 3D printer which may be used to run the instructions 403 after obtaining body part model data.
[0057] FIG. 5 shows a memory 502, which is an example of a computer readable medium storing instructions 510, 511, 512, 513 that, when executed by a processor 500 communicably coupled to an additive manufacturing system, in this case a 3D printer 501, cause the processor 500 to generate object model data in accordance with any of the examples described above. The computer readable medium 503 may be any form of storage device capable of storing executable instructions, such as a non-transient computer readable medium, for example Random Access Memory (RAM), Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, or the like.
[0058] It will be appreciated that the references above to a body part relate to examples in which the generated object model data is adjusted on the basis of a body part, for example a body part of a user or intended user of the object. However, it will be appreciated that the same examples may also be applied to the generation of object model data which is adjusted on the basis of any other type of object, in which case references in the above description to body part model data, body part reference locations and a body part model will be understood to relate instead to the model data, reference locations and model, respectively, of such an object.

Claims

1. A method comprising:
obtaining source object model data representing at least a part of the surface of a reference object, wherein the source object model data comprises a set of source object reference locations;
selecting a build object model, representing at least part of an object to be built and comprising a set of build object reference locations, from a predetermined set of build object models based on a comparison of positions of the source object reference locations and the build object reference locations;
automatically adjusting the build object model by adjusting the build object reference locations based on the source object reference locations; and
generating build object model data based on the adjusted build object model, for generation by a three dimensional printer.
2. The method of claim 1, wherein the source object reference locations and the build object reference locations each comprise at least two different categories of reference location, wherein automatically adjusting the build object model comprises adjusting individual build object reference locations based on corresponding source object reference locations of the same respective categories.

3. The method of claim 1, wherein the comparison of the source object reference locations and the build object reference locations comprises determining whether positions of the source object reference locations fall within specified positional degrees of freedom of the build object reference locations.

4. The method of claim 1, wherein selecting the build object model comprises determining a build object model from the predetermined set of build object models in which the positions of the build object reference locations most closely correspond to the respective positions on the reference object, represented by the source object reference locations in the source object model data.

5. The method of claim 1, wherein obtaining the source object model data comprises identifying source object reference locations on a source object model.

6. The method of claim 1, wherein automatically adjusting the build object model further comprises automatically adjusting the build object reference locations to match the source object reference locations.

7. The method of claim 1, wherein the build object reference locations have specified positional degrees of freedom, and automatically adjusting the build object model comprises automatically adjusting the build object reference locations within the positional degrees of freedom, in order to match the positions of respective source object reference locations of the source object model.

8. The method of claim 1, wherein automatically adjusting the build object model further comprises:
determining whether the source object model data comprises a set of additional criteria for the build object model; and
automatically adjusting the build object model to satisfy these criteria.

9. The method of claim 2, wherein the reference object comprises a user’s hand and the build object model represents at least part of a computer mouse, wherein a first category of reference location represents, in the build object model, the position of a user input control on the computer mouse and, in the source object model, a location on the user’s hand used to activate the user input control, and wherein a second category of reference location represents, in the build object model, the position of a rest surface to support the user’s hand and, in the source object model, a location on the user’s hand to be provided with a corresponding rest surface on the computer mouse.

10. The method of claim 1, further comprising modifying the automatically adjusted build object model on the basis of user input.

11. The method of claim 1, further comprising generating an object using the generated build object model data.

12. The method of claim 11, further comprising combining the generated object with an additional part to form an assembled object that fits the reference object.
13. A system comprising a controller configured to:
obtain source object model data representing a surface of a reference object, wherein the source object model data comprises a set of source object reference locations;
select a build object model representing at least part of an object to be built and comprising a set of build object model reference locations, wherein the build object model is selected from a predetermined set of build object models based on a comparison of positions of the source object reference locations and the build object reference locations;
automatically adjust the build object model by adjusting the build object reference locations based on the source object reference locations; and
generate build object model data based on the adjusted build object model, for generation by an additive manufacturing system.
14. The system of claim 13, wherein the source object reference locations and the build object reference locations comprise two different categories of reference location, wherein automatically adjusting the build object model comprises adjusting individual build object reference locations based on corresponding source object reference locations of the same respective categories.
15. An article comprising:
an object generated by an additive manufacturing process using object model data, wherein the object model data is generated by:
obtaining source object model data representing at least a part of the surface of a reference object, wherein the source object model data comprises a set of source object reference locations;
selecting an object model, representing at least part of the object to be generated and comprising a set of object reference locations, from a predetermined set of object models based on a comparison of positions of the source object reference locations and the object reference locations;
automatically adjusting the object model by adjusting the object reference locations based on the source object reference locations; and
generating the object model data based on the adjusted object model; and
an additional part, wherein the object and the additional part are assembled such that the article fits the reference object.
PCT/US2019/058469 (priority and filing date 2019-10-29): Generation of model data for three dimensional printers

Priority Applications (1)

- PCT/US2019/058469 (WO2021086319A1), priority and filing date 2019-10-29: Generation of model data for three dimensional printers


Publications (1)

- WO2021086319A1, published 2021-05-06

Family ID: 75716165


Citations (4) — * cited by examiner, † cited by third party

- EP1356709B1 (Phonak AG): "Manufacturing methods and systems for rapid production of hearing-aid shells", published 2008-10-22 *
- CN107187059A (成都智创华信科技有限公司): "A kind of method of 3D printing mouse case", published 2017-09-22 *
- WO2017163000A1 (Sony Interactive Entertainment Inc.): "3D printing system", published 2017-09-28 *
- US20190146246A1 (Bespoke, Inc.): "Method and system to create custom, user-specific eyewear", published 2019-05-16 *



Legal Events

- 121 (EP): the EPO has been informed by WIPO that EP was designated in this application (ref document 19950571, country EP, kind code A1)
- NENP: non-entry into the national phase (ref country code DE)
- 122 (EP): PCT application non-entry in European phase (ref document 19950571, country EP, kind code A1)