
WO2007004448A1 - Pattern embedding program, device, and method - Google Patents

Pattern embedding program, device, and method

Info

Publication number
WO2007004448A1
WO2007004448A1 (PCT/JP2006/312637; JP2006312637W)
Authority
WO
WIPO (PCT)
Prior art keywords
polygon
hidden
light
specific
symbol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2006/312637
Other languages
English (en)
Japanese (ja)
Inventor
Yoshiyuki Sakaguchi
Koji Imao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Digital Fashion Ltd
Original Assignee
Digital Fashion Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Digital Fashion Ltd filed Critical Digital Fashion Ltd
Publication of WO2007004448A1 publication Critical patent/WO2007004448A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0021Image watermarking
    • G06T1/0028Adaptive watermarking, e.g. Human Visual System [HVS]-based watermarking
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2201/00General purpose image data processing
    • G06T2201/005Image watermarking
    • G06T2201/0051Embedding of the watermark in the spatial domain

Definitions

  • The present invention relates to a symbol embedding program, a symbol embedding device, and a symbol embedding method using computer graphics techniques, and in particular to a program, device, and method for embedding a predetermined hidden symbol in a polygon mesh arranged in a virtual three-dimensional space.
  • Patent Document 1 discloses a technique for embedding a digital watermark in a three-dimensional shape model, comprising: a step of setting parameters for the watermark embedding process; a step of inputting pre-conversion data of the 3D shape model; a step of selecting, from the curved-surface meshes of the input 3D shape model, meshes that are to undergo information embedding and meshes that are not; and a step of embedding information, according to the specified embedding parameters, into given meshes of each selected set of curved-surface meshes.
  • Patent Document 1: Japanese Patent Laid-Open No. 2003-99805
  • An object of the present invention is to provide a symbol embedding program, a symbol embedding device, and a symbol embedding method capable of embedding a hidden symbol in an actual object or in a virtual three-dimensional model.
  • To achieve this object, a pattern embedding program causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; specifying means for specifying, as a specific polygon, a polygon located within an area in which a predetermined hidden pattern is to be embedded, from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • Similarly, a pattern embedding device includes: a setting unit that sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; a specifying unit that specifies, as a specific polygon, a polygon located within an area for embedding a predetermined hidden pattern; and a changing unit that changes the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • A pattern embedding method includes: a setting step in which a computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; a specifying step of specifying, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern, from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • In each case, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed is set in a virtual three-dimensional space, a specific polygon located in the area where a hidden symbol is to be embedded is specified, and the orientation of the specific polygon is changed so that the regular (specular) reflection direction of light irradiated from the predetermined ray direction faces the predetermined line-of-sight direction; a sketch of the underlying geometry follows.
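  • The geometry behind all of these aspects is a half-vector construction: a facet specularly reflects light arriving from the ray direction into the line-of-sight direction exactly when its normal bisects the angle between the two. The following is a minimal Python/NumPy sketch of that target-direction computation; the function and variable names are ours, not the patent's.

      import numpy as np

      def normalize(v):
          """Scale v to unit length."""
          return v / np.linalg.norm(v)

      def target_direction(ray_dir, view_dir):
          """Direction bisecting the angle between the ray direction and the
          line-of-sight direction (both pointing away from the surface).
          A facet whose normal equals this half-vector specularly reflects
          light from ray_dir toward view_dir."""
          return normalize(normalize(ray_dir) + normalize(view_dir))

      # Example: light and eye placed symmetrically about the vertical.
      LD = np.array([0.0, -1.0, 1.0])   # toward the light source
      VD = np.array([0.0,  1.0, 1.0])   # toward the viewer
      OD = target_direction(LD, VD)     # -> (0, 0, 1): a horizontal facet reflects LD into VD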
  • FIG. 1 is a block diagram showing a configuration of a symbol embedding apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart for explaining symbol embedding processing by the symbol embedding device shown in FIG. 1.
  • FIG. 3 is a schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space.
  • FIG. 4 is a diagram showing how a specific polygon is specified from the polygon mesh.
  • FIG. 6 is a diagram illustrating an example of a polygon mesh rendered when the light ray direction and the line-of-sight direction are not set to a direction in which a hidden symbol can be visually recognized.
  • FIG. 7 is a schematic diagram showing an example of a positional relationship between three light ray directions and a line-of-sight direction.
  • FIG. 8 is a diagram illustrating an example of a rendering result of a polygon mesh with respect to a light ray direction.
  • FIG. 9 is a diagram showing another example of a polygon mesh rendering result with respect to a light ray direction.
  • FIG. 10 is a diagram showing another example of a rendering result of a polygon mesh with respect to a light ray direction.
  • FIG. 11 is a cross-sectional view of an example of a polygon mesh in which a hidden symbol is embedded.
  • FIG. 12 is a flowchart for explaining a process in which the texture generation unit shown in FIG. 1 generates a base texture.
  • FIG. 13 is a schematic diagram showing sample points arranged in a grid in a virtual three-dimensional space.
  • FIG. 14 is a schematic diagram for explaining a light ray direction and a line-of-sight direction.
  • FIG. 15 is a schematic diagram when the sample points arranged on the XY plane are viewed from the Z direction.
  • FIG. 17 is a diagram showing an example of the base texture when viewed from the Z direction.
  • FIG. 18 is a schematic diagram for explaining a state in which a plurality of hidden symbols are embedded in a polygon mesh by the symbol embedding device according to the second embodiment of the present invention.
  • FIG. 19 is a schematic diagram showing the relationship between the light ray direction and the line-of-sight direction in the second embodiment of the present invention.
  • FIG. 1 shows a block configuration diagram of a symbol embedding apparatus according to a first embodiment of the present invention.
  • This symbol embedding device is configured from a known computer and includes a processing unit 10, a storage unit 20, an input unit 30, a display unit 40, and an optical characteristic acquisition device 50.
  • The processing unit 10 is configured from a CPU and provides the functions of a polygon mesh setting unit 11, a polygon specifying unit 12, a target direction calculation unit 13, a direction changing unit 14, a rendering processing unit 15, a texture generation unit 16, and a 3D data output unit 17.
  • the storage unit 20 includes a storage device such as a hard disk, and includes functions of a base texture storage unit 21, a symbol storage unit 22, and an optical characteristic storage unit 23.
  • The polygon mesh setting unit 11 through the three-dimensional data output unit 17 and the base texture storage unit 21 through the optical characteristic storage unit 23 are realized by the CPU executing a symbol embedding program stored on the hard disk serving as a recording medium.
  • The optical characteristic acquisition device 50 is a known device invented by the applicant of the present invention: it generates a BRDF (bidirectional reflectance distribution function) from sample images obtained by photographing an actual sample while changing the light irradiation direction and the photographing direction. Details are disclosed in JP-A-2005-115645.
  • the optical characteristic acquisition device 50 generates a BRDF of an object (material) that is a base texture creation target (modeling target).
  • the optical property storage unit 23 stores the BRDF generated by the optical property acquisition device 50.
  • The base texture is composed of a plurality of sample points.
  • Each sample point consists of three components set in the virtual three-dimensional space: an X component indicating the X-axis value, a Y component indicating the Y-axis value, and a Z component indicating the Z-axis value.
  • The X and Y components of each sample point are determined so that their projections onto the XY plane are arranged in a square grid, and the Z component of each sample point represents the height data of the modeled object; a data-layout sketch follows.
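  • As a rough illustration of this layout, the base texture can be held as a heightfield over a square grid. The grid size, spacing, and random heights below are placeholder values of ours.

      import numpy as np

      d = 0.1          # mesoscale grid interval (placeholder value)
      nx, ny = 64, 64  # number of sample points per axis (placeholder)

      # X and Y components: projections onto the XY plane form a square grid.
      xs, ys = np.meshgrid(np.arange(nx) * d, np.arange(ny) * d, indexing="ij")

      # Z component: height data of the modeled object (random stand-in here;
      # the noise added in step S3 below would further disperse these values).
      zs = np.random.default_rng(0).normal(scale=0.02, size=(nx, ny))

      sample_points = np.stack([xs, ys, zs], axis=-1)   # shape (nx, ny, 3)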
  • The input unit 30 comprises known input devices such as a keyboard and a mouse.
  • the polygon mesh setting unit 11 reads the base texture specified by the user from the base texture storage unit 21 using the input unit 30, and plots each sample point constituting the read base texture in the virtual three-dimensional space.
  • the polygon mesh setting unit 11 connects adjacent sample points with straight lines, and sets a polygon mesh composed of polygons such as triangles or quadrangles in the virtual three-dimensional space. This reproduces the surface shape of an object in which fine irregularities are repeatedly formed in a fixed pattern in a virtual three-dimensional space.
  • the symbol storage unit 22 stores in advance hidden symbol data indicating image data of a hidden symbol embedded in the base texture.
  • The hidden symbol data may be created in advance by the user using drawing software or the like, obtained by the user via the Internet, or supplied in advance by the provider of this symbol embedding program.
  • The polygon specifying unit 12 reads the hidden symbol data designated by the user using the input unit 30 from the symbol storage unit 22 and, from among the polygons constituting the polygon mesh set by the polygon mesh setting unit 11, specifies as a specific polygon each polygon located in the area where the hidden symbol data is embedded.
  • the target direction calculation unit 13 calculates, as the target direction, a straight line direction that bisects the angle between the light ray direction and the line-of-sight direction specified by the user using the input unit 30.
  • The direction changing unit 14 extracts, from among the specific polygons, those whose normal vector makes an angle with the target direction that is no greater than a specified value, and changes the orientation of each extracted specific polygon so that its normal vector matches the target direction.
  • The rendering processing unit 15 renders the specific polygons whose orientation has been changed by the direction changing unit 14, using the BRDF of the object that is the base texture modeling target, and displays the result on the display unit 40.
  • the display unit 40 includes a known display device such as a CRT, a liquid crystal panel, or a plasma panel.
  • the rendering processing unit 15 performs rendering according to the line-of-sight direction and the ray direction input by the user using the input unit 30.
  • The three-dimensional data output unit 17 generates NC (Numerical Control) data from the three-dimensional data representing the position of each sample point of the polygon mesh PM in which the orientation of the specific polygons has been changed by the direction changing unit 14, and outputs it to the stereolithography apparatus 60.
  • The stereolithography apparatus 60 is composed of a well-known stereolithography apparatus and forms the shape of the polygon mesh PM in resin according to the NC data generated by the three-dimensional data output unit 17.
  • Any device other than the stereolithography apparatus may be used, as long as it can form, on an actual object, a shape from three-dimensional data representing the shape of an object created on the computer.
  • the polygon mesh setting unit 11 corresponds to an example of a setting unit
  • the polygon specifying unit 12 corresponds to an example of a specifying unit
  • the target direction calculating unit 13 and the direction changing unit 14 correspond to an example of a changing unit,
  • the rendering processing unit 15 corresponds to an example of a rendering processing unit
  • the texture generation unit 16 corresponds to an example of a polygon mesh generation unit
  • the 3D data output unit 17 corresponds to an example of an output unit.
  • In step S1, when the input unit 30 receives an operation command designating a base texture, the polygon mesh setting unit 11 reads the designated base texture from the base texture storage unit 21 and acquires it.
  • In step S2, the polygon mesh setting unit 11 plots each sample point of the base texture read in step S1 in the virtual 3D space, connects adjacent sample points by straight lines, and sets a polygon mesh in the virtual 3D space; a triangulation sketch follows.
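  • A sketch of this setting step, assuming a triangle mesh (the patent also allows quadrangles) and a row-major flattening of the grid; the diagonal split chosen below is our own.

      import numpy as np

      def grid_triangles(nx, ny):
          """Connect an nx-by-ny grid of sample points (flattened row-major,
          index = i * ny + j) into two triangles per grid cell."""
          tris = []
          for i in range(nx - 1):
              for j in range(ny - 1):
                  a = i * ny + j          # lower-left corner of cell (i, j)
                  b = a + 1               # neighbor in +Y
                  c = a + ny              # neighbor in +X
                  d2 = c + 1              # opposite corner
                  tris.append((a, b, c))  # lower-left triangle
                  tris.append((b, d2, c)) # upper-right triangle
          return np.asarray(tris)         # shape (2*(nx-1)*(ny-1), 3)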
  • FIG. 3 is a schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space.
  • The polygon mesh PM is composed of a plurality of polygons PL having the sample points P as vertices.
  • the X and Y components are arranged so that each projected point SP 'is arranged in a square grid with an interval d of mesoscale (scale of micron order to millimeter order). The value of is determined.
  • Polygon mesh PM represents the surface shape of an object with irregularities repeatedly formed on the surface, and it can be seen that the Z component of each sample point P is dispersed.
  • In step S3, the polygon mesh setting unit 11 adds noise to the Z component of each sample point P to further disperse the irregularities on the surface of the polygon mesh PM.
  • In step S4, the polygon specifying unit 12 reads the hidden symbol data designated by the user from the symbol storage unit 22, sets an area in which the read hidden symbol data is embedded in the polygon mesh PM, and identifies the polygons PL located within the set area, among the polygons constituting the polygon mesh PM, as specific polygons TPL.
  • FIG. 4 is a diagram showing how the specific polygons TPL are specified from the polygon mesh PM. As shown in FIG. 4, an area D1 in which a hidden symbol representing the letter “D” is embedded is set in the polygon mesh PM, and the polygons PL located in the area D1 are specified as the specific polygons TPL; a selection sketch follows.
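  • A sketch of this specifying step, assuming the hidden symbol is supplied as a boolean mask aligned with the sample grid; the centroid test and index rounding are our own choices.

      import numpy as np

      def specific_polygons(tris, verts, mask, d):
          """Indices of triangles whose centroid falls on a grid cell where the
          hidden-symbol mask (a 2D bool array, e.g. the letter "D") is True.
          tris: (M, 3) vertex indices; verts: (N, 3) sample points."""
          chosen = []
          for t, tri in enumerate(tris):
              cx, cy = verts[tri, :2].mean(axis=0)           # triangle centroid (XY)
              i, j = int(round(cx / d)), int(round(cy / d))  # nearest sample index
              if 0 <= i < mask.shape[0] and 0 <= j < mask.shape[1] and mask[i, j]:
                  chosen.append(t)
          return np.asarray(chosen)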
  • In step S5, the target direction calculation unit 13 sets a ray direction and a line-of-sight direction in which the hidden symbol G can be visually recognized, in accordance with the operation command received from the user via the input unit 30.
  • In step S6, the target direction calculation unit 13 calculates, as the target direction OD, the direction of the straight line that bisects the angle θ1 formed by the ray direction LD and the line-of-sight direction VD, as shown in FIG. 5.
  • In step S7, the direction changing unit 14 obtains the normal vector n of each specific polygon TPL as shown in FIG. 5 and extracts every specific polygon TPL for which the angle θ between the normal vector n and the target direction OD is no greater than a predetermined value (for example, 10 to 20 degrees). The predetermined value is determined according to how scattered the normal directions of the polygons PL are and how visible the hidden pattern becomes given the BRDF of the material.
  • In step S8, the direction changing unit 14 changes the Z component values of the sample points P constituting each specific polygon TPL extracted in step S7 so that its normal vector n matches the target direction OD, thereby changing the orientation of the specific polygon TPL; a sketch of steps S7 and S8 follows.
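  • A sketch of steps S7 and S8 together. Keeping the facet centroid at its height while re-tilting the facet into the target plane is our own choice; the patent only requires that the normal end up matching OD.

      import numpy as np

      def facet_normal(verts, tri):
          """Unit normal of the triangle whose vertex indices are tri."""
          a, b, c = verts[list(tri)]
          n = np.cross(b - a, c - a)
          return n / np.linalg.norm(n)

      def within_threshold(n, od, max_deg=15.0):
          """Step S7: keep facets whose unit normal lies within roughly 10 to
          20 degrees of the unit target direction od (threshold is a placeholder)."""
          ang = np.degrees(np.arccos(np.clip(abs(np.dot(n, od)), 0.0, 1.0)))
          return ang <= max_deg

      def tilt_to_target(verts, tri, od):
          """Step S8: change only the Z components of the facet's sample points
          so the facet lies in the plane through its centroid with normal od
          (requires od[2] != 0)."""
          idx = list(tri)
          cx, cy, cz = verts[idx].mean(axis=0)   # facet centroid, height preserved
          for i in idx:
              x, y = verts[i, 0], verts[i, 1]
              verts[i, 2] = cz - (od[0] * (x - cx) + od[1] * (y - cy)) / od[2]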
  • In step S9, the rendering processing unit 15 renders the polygon mesh PM whose orientation has been changed, using the BRDF of the object to be modeled by the polygon mesh PM, and displays the result on the display unit 40.
  • In step S10, the three-dimensional data output unit 17 converts the three-dimensional data of each sample point P of the polygon mesh PM whose orientation has been changed into NC data and outputs it to the stereolithography apparatus 60.
  • The stereolithography apparatus 60 forms the resin according to the NC data, producing a texture with the hidden pattern embedded in the resin.
  • FIG. 6 is a diagram illustrating an example of a polygon mesh PM rendered when the light ray direction and the line-of-sight direction are not set to a direction in which a hidden symbol can be visually recognized.
  • Here, because the ray direction and the line-of-sight direction are not set to directions in which the hidden symbol can be visually recognized, only the surface shape of the object to be modeled is displayed and the hidden symbol G is not displayed.
  • FIG. 7 is a schematic diagram showing an example of the positional relationship between the three ray directions LD1 to LD3 and the line-of-sight direction VD.
  • FIGS. 8 to 10 are diagrams showing examples of rendering results (rendered images) of the polygon mesh PM for the ray directions LD1 to LD3, respectively.
  • Since the normal vectors of the specific polygons match the target direction that bisects the angle between ray direction LD2 and line-of-sight direction VD, the hidden symbol G is not displayed for ray directions LD1 and LD3, as shown in FIG. 8 and FIG. 10, whereas for ray direction LD2 the hidden symbol G, composed of the characters “DFL”, is displayed, as shown in FIG. 9.
  • FIG. 11 shows a cross-sectional view of an example of a polygon mesh PM in which a hidden symbol is embedded.
  • The plane K1 shown in FIG. 11 indicates the polygon surface of the specific polygons TPL whose orientation was changed in step S8.
  • For light from ray direction LD2, the specular reflection direction at the plane K1 coincides with the line-of-sight direction VD, so the hidden symbol G is displayed.
  • For the other ray directions, only light reflected in the irregular (diffuse) reflection directions of the plane K1 reaches the line-of-sight direction VD, so the hidden symbol G is not displayed.
  • In step S21, when the input unit 30 receives a user operation command designating one BRDF from the optical characteristic storage unit 23, the texture generation unit 16 reads the designated BRDF from the optical characteristic storage unit 23 and acquires it.
  • In step S22, the texture generation unit 16 arranges a plurality of sample points in a lattice in the virtual three-dimensional space and calculates, using the BRDF, the regular reflection direction of light when light from a virtual light source is irradiated onto each sample point.
  • FIG. 13 is a schematic diagram showing sample points P arranged in a lattice pattern in a virtual three-dimensional space.
  • X, Y, and Z axes orthogonal to one another are set in the virtual three-dimensional space; the Z axis indicates the vertical direction, and the XY plane indicates the horizontal plane.
  • The texture generation unit 16 arranges the sample points P in a grid pattern on the XY plane of the virtual three-dimensional space, at a mesoscale interval d.
  • The sample point P currently being processed is referred to as the sample point of interest CP.
  • The texture generation unit 16 obtains the ray direction LD from the virtual light source VL for the sample point of interest CP, inputs the obtained ray direction LD to the BRDF acquired in step S21, varies the line-of-sight direction VD, computes the reflectance for each line-of-sight direction VD, and takes the line-of-sight direction giving the maximum reflectance among the computed reflectances as the regular reflection direction of light at the sample point of interest CP. In the example of FIG. 13, the line-of-sight direction VDMAX is calculated as the regular reflection direction RD.
  • the specular reflection direction RD is calculated for other sample points P in the same way.
  • The normal vector n at the sample point of interest CP is drawn exaggerated relative to the normal vectors n at the other sample points P.
  • FIG. 14 is a schematic diagram for explaining the light beam direction LD and the line-of-sight direction VD.
  • A local coordinate system is set with the sample point of interest CP as its origin, with an X′ axis parallel to the X axis, a Y′ axis parallel to the Y axis, and a Z′ axis parallel to the Z axis.
  • The line-of-sight direction VD is expressed by the angle φ formed by the X′ axis and the projection VD′ of the line-of-sight direction VD onto the horizontal plane, and the angle θ formed by the projection VD′ and the line-of-sight direction VD itself.
  • The texture generation unit 16 varies the angle θ within a range of 0 to 90 degrees at a predetermined resolution (for example, 5 degrees) and the angle φ within a range of −90 to 90 degrees at a predetermined resolution (for example, 10 degrees), thereby sweeping the line-of-sight direction VD; a sweep sketch follows.
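  • A sketch of this sweep, assuming the BRDF is available as a callable brdf(light_dir, view_dir) returning a reflectance, with θ treated as elevation and φ as azimuth in the X′Y′Z′ frame; that interface is our assumption. The normal of step S23 below is then the half-vector of the returned direction and the ray direction, with both vectors taken pointing away from the sample point.

      import numpy as np

      def specular_direction(brdf, light_dir, d_theta=5.0, d_phi=10.0):
          """Step S22: sweep candidate view directions over elevation theta in
          [0, 90] degrees and azimuth phi in [-90, 90] degrees, and return the
          direction with maximum reflectance for the given light direction."""
          best_r, best_v = -np.inf, None
          for theta in np.arange(0.0, 90.0 + 1e-9, d_theta):
              for phi in np.arange(-90.0, 90.0 + 1e-9, d_phi):
                  t, p = np.radians(theta), np.radians(phi)
                  v = np.array([np.cos(t) * np.cos(p),   # X' component
                                np.cos(t) * np.sin(p),   # Y' component
                                np.sin(t)])              # Z' (vertical) component
                  r = brdf(light_dir, v)
                  if r > best_r:
                      best_r, best_v = r, v
          return best_v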
  • In step S23, the texture generation unit 16 calculates, as the normal vector n of the plane containing the sample point of interest CP, the direction that bisects the angle θ2 formed between the regular reflection direction RD and the ray direction LD at the sample point of interest CP.
  • In step S24, the texture generation unit 16 calculates the unevenness information of the sample points P as follows.
  • FIG. 15 is a schematic diagram when the sample points P arranged on the XY plane are viewed from the Z direction
  • FIG. 16 is a schematic diagram of a virtual three-dimensional space in which the sample points P are set.
  • First, the texture generation unit 16 identifies, as the target polygon, the polygon PL1 located at the lower left of the two polygons constituting the lower-left grid cell K1.
  • It then moves the sample points P2 and P3 in the Z direction so that the plane of the polygon PL1 becomes orthogonal to the normal vector n1 of the lower-left sample point P1, and takes the Z component values of the moved sample points P2 and P3 as the unevenness information of the sample points P2 and P3.
  • Next, as shown in FIG. 16, the texture generation unit 16 identifies, as the target polygon, the polygon PL2 located at the lower left among the polygons constituting the grid cell K2 adjacent to the cell K1, moves the sample points P4 and P5 in the Z direction so that the plane of the polygon PL2 becomes orthogonal to the normal vector n2 of the lower-left sample point P2 among the three sample points constituting the polygon PL2, and takes the Z component values of the moved sample points P4 and P5 as their unevenness information.
  • The texture generation unit 16 then identifies, as the target polygon, the polygon PL3 in the grid cell K3 adjacent above the cell K1, moves the sample point P6 in the Z direction so that the plane of the polygon PL3 becomes orthogonal to the normal vector n3 of the sample point P3, and takes the Z component value of the moved sample point P6 as its unevenness information.
  • The texture generation unit 16 likewise identifies the polygon PL4 in the grid cell K4 adjacent to the right of the cell K2 as the target polygon and changes the orientation of the polygon PL4 in the same manner as the polygon PL2.
  • In this way, the texture generation unit 16 sequentially identifies target polygons starting from the polygon PL1 of the lower-left cell K1 so as to meander diagonally up and to the right, changes the orientation of each identified target polygon as described above, and takes the changed Z component value of each sample point as its unevenness information, thereby generating a 3D texture.
  • FIG. 17 is a diagram illustrating an example of a texture generated by the texture generation unit 16.
  • The texture generation unit 16 may instead start from a polygon of the lower-right, upper-left, or upper-right cell rather than the lower-left one, sequentially identifying target polygons so as to meander toward the upper left, lower right, or lower left, respectively; a simplified integration sketch follows.
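  • The unevenness calculation of steps S23 and S24 amounts to a discrete integration of the normal field: each step moves the next sample point onto the plane through its predecessor that is orthogonal to the predecessor's normal. The sketch below does one plain row-by-row sweep instead of the patent's meandering order; that simplification is ours. Normals are assumed to have a positive Z component.

      import numpy as np

      def heights_from_normals(normals, d):
          """Turn a field of unit normals, shape (nx, ny, 3), into Z heights on
          a square XY grid of spacing d. Each new point lies on the plane
          through its predecessor orthogonal to the predecessor's normal."""
          nx, ny, _ = normals.shape
          z = np.zeros((nx, ny))
          for i in range(nx):
              if i > 0:                          # step in +X from the row below
                  n = normals[i - 1, 0]
                  z[i, 0] = z[i - 1, 0] - d * n[0] / n[2]
              for j in range(1, ny):             # steps in +Y along the row
                  n = normals[i, j - 1]
                  z[i, j] = z[i, j - 1] - d * n[1] / n[2]
          return z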
  • As described above, the specific polygons located in the region where the hidden design is embedded are specified, and their orientation is changed so that the regular reflection direction of light irradiated from the specific ray direction faces the predetermined line-of-sight direction.
  • According to the present design embedding apparatus, a hidden design can therefore be embedded in a grain (texture) formed on a resin: when light is applied to the grain from the specific direction (the ray direction) and the grain is observed from the specific direction (the line-of-sight direction), the hidden symbol appears.
  • Even if a third party imitates the grain and forms a resin, the hidden pattern is not embedded in that resin, so the hidden pattern does not appear even when the resin is irradiated from the specific direction and observed from the specific direction. Imitation of the grain by a third party can thereby be prevented.
  • Although noise is applied to the Z components in step S3 of FIG. 2, the present invention is not limited to this, and the processing shown in step S3 may be omitted.
  • The invention is also not limited to preventing imitation of the grain: the object may be a three-dimensionally or two-dimensionally decorated object, and by embedding a hidden pattern in the entire surface of the object, or in the partial area where the decoration is displayed, the decoration can be protected from imitation by third parties.
  • Whereas the symbol embedding device according to the first embodiment embeds one type of hidden symbol in the polygon mesh PM, the symbol embedding device according to the second embodiment embeds a plurality of types of hidden symbols in the polygon mesh. Since the symbol embedding device according to the second embodiment has substantially the same configuration as that according to the first embodiment, it will be described with reference to the block diagram of the symbol embedding device according to the first embodiment shown in FIG. 1.
  • FIG. 18 is a schematic diagram for explaining how the hidden symbols G1 to G3 are embedded in the polygon mesh, and FIG. 19 is a schematic diagram showing the relationship between the ray directions and the line-of-sight direction in the second embodiment.
  • As shown in FIG. 19, the target direction calculation unit 13 calculates, as the target direction OD1, the direction that bisects the angle formed by ray direction LD1 and the line-of-sight direction VD; as the target direction OD2, the direction that bisects the angle formed by ray direction LD2 and the line-of-sight direction VD; and as the target direction OD3, the direction that bisects the angle formed by ray direction LD3 and the line-of-sight direction VD.
  • As shown in FIG. 19, the direction changing unit 14 extracts, from among the specific polygons TPL1, those whose normal vector n1 makes an angle with the target direction OD1 no greater than the specified value; from among the specific polygons TPL2, those whose normal vector n2 makes an angle with the target direction OD2 no greater than the specified value; and from among the specific polygons TPL3, those whose normal vector n3 makes an angle with the target direction OD3 no greater than the specified value.
  • The orientation of each extracted specific polygon TPL1 is changed so that its normal vector n1 matches the target direction OD1, the orientation of each extracted specific polygon TPL2 is changed so that its normal vector n2 matches the target direction OD2, and the orientation of each extracted specific polygon TPL3 is changed so that its normal vector n3 matches the target direction OD3; a per-symbol sketch follows.
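  • The second embodiment is thus the first embodiment's alignment applied once per symbol, each symbol with its own ray direction but a shared line of sight. A sketch reusing the hypothetical target_direction and tilt_to_target helpers from the sketches above:

      def embed_symbols(verts, regions, ray_dirs, view_dir):
          """regions: {symbol: iterable of triangle index-triples}, e.g. the
          specific polygons TPL1 to TPL3 for "D", "F", "L";
          ray_dirs: {symbol: ray direction vector}. Each symbol's facets are
          aligned to the half-vector of its own ray direction and view_dir."""
          for symbol, tris in regions.items():
              od = target_direction(ray_dirs[symbol], view_dir)  # per-symbol OD
              for tri in tris:
                  tilt_to_target(verts, tri, od)                 # align facet normal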
  • When the rendering processing unit 15 sets the ray directions to LD1 to LD3 and the line-of-sight direction to VD and renders the polygon mesh PM using the BRDF, the hidden symbol G1 of “D”, the hidden symbol G2 of “F”, and the hidden symbol G3 of “L” are displayed on the display unit 40, as shown in FIG. 18.
  • When the rendering processing unit 15 renders the polygon mesh PM with the ray direction set to LD1 alone and the line-of-sight direction set to VD, only the hidden symbol G1 of “D” is displayed on the display unit 40; the hidden symbols G2 and G3 of “F” and “L” are not displayed.
  • When the rendering processing unit 15 sets the ray direction to a direction other than the ray directions LD1 to LD3 and renders the polygon mesh PM, none of the hidden symbols G1 to G3 is displayed.
  • In this way, a plurality of hidden symbols are embedded in the polygon mesh PM, and the directions in which the respective hidden symbols can be visually recognized differ, making it even more difficult for a third party to visually recognize the hidden symbols embedded in the polygon mesh.
  • In the above embodiments, the orientations of the specific polygons TPL and TPL1 to TPL3 are changed so that their normal vectors n and n1 to n3 match the target directions OD and OD1 to OD3.
  • Alternatively, the Z component of a sample point SP may be corrected so that the normal vector at the sample point SP matches the target direction OD, thereby adjusting all the specific polygons TPL1 to TPL5 that share the sample point SP.
  • The normal vector n of the sample point SP can be calculated by averaging the normal vectors n1 to n5 of the specific polygons TPL1 to TPL5 that include the sample point SP; a sketch follows.
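  • A minimal sketch of that per-vertex averaging; the function name is ours.

      import numpy as np

      def vertex_normal(facet_normals):
          """Normal at a sample point SP shared by several specific polygons:
          the normalized mean of their unit facet normals (n1 to n5 above)."""
          m = np.asarray(facet_normals, dtype=float).mean(axis=0)
          return m / np.linalg.norm(m)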
  • If the rendering processing unit 15 performs rendering with the light colors of the ray directions LD1 to LD3 set to mutually different colors, the letters “D”, “F”, and “L” can be displayed in different colors.
  • Although the hidden symbol here is composed of three types of hidden symbols, the present invention is not limited to this; two types, or four or more types, of hidden symbols may be used.
  • Although the number of line-of-sight directions VD here is one, the present invention is not limited to this, and a plurality of line-of-sight directions VD may be set.
  • In summary, the design embedding program causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; specifying means for specifying, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern, from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • Likewise, a pattern embedding device includes: a setting unit that sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; a specifying unit that specifies, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern; and a changing unit that changes the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • A corresponding pattern embedding method includes: a setting step in which the computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; a specifying step of specifying, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern, from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of the specific polygon so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • In each configuration, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed is set in a virtual three-dimensional space, a specific polygon located in the region where a hidden symbol is embedded is specified, and the orientation of the specific polygon is changed so that the specular reflection direction of light irradiated from the predetermined ray direction faces the predetermined line-of-sight direction.
  • It is preferable to further cause the computer to function as output means for outputting three-dimensional data defining the shape of the polygon mesh whose orientation has been changed by the changing means to a three-dimensional modeling apparatus. In this case, it is possible to embed a hidden symbol in an actual object having irregularities repeatedly formed on its surface.
  • It is also preferable to cause the computer to function as: acquisition means for acquiring the optical characteristics of the object; and rendering processing means for rendering, using the optical characteristics acquired by the acquisition means, the polygon mesh whose orientation has been changed by the changing means. In this case, it is possible to embed a hidden symbol in a virtual three-dimensional model of an object with irregularities repeatedly formed on the surface.
  • It is preferable that the changing means changes the orientation of only those specific polygons whose required orientation change is no greater than a predetermined angle.
  • In this case, polygons whose orientation would have to be changed greatly are left unchanged, so a hidden pattern can be embedded without greatly altering the original shape of the polygon mesh.
  • If many polygons had their orientation changed greatly, the hidden symbol might become visible from directions other than the predetermined line-of-sight direction; the above configuration prevents the hidden symbol from being visually recognized from such directions.
  • It is preferable that the changing means sets, as the target direction, the direction that bisects the angle between the predetermined ray direction and the predetermined line-of-sight direction, and changes the orientation of the specific polygon so that its normal vector matches the target direction.
  • Since the direction bisecting the angle between the predetermined ray direction and the line-of-sight direction is set as the target direction and the orientation of the specific polygon is changed so that its normal vector coincides with the target direction, the orientation of the specific polygon can be changed accurately toward the target direction.
  • The hidden symbol may be composed of n (n being an integer of 2 or more) types of hidden symbols. In that case, it is preferable that the specifying means specifies the specific polygons corresponding to each hidden symbol, and that the changing means changes the orientation of the specific polygons corresponding to each hidden symbol so that the specular reflection direction for light from the predetermined ray direction corresponding to that hidden symbol faces the predetermined line-of-sight direction.
  • In this case, since the hidden symbol is composed of n types of hidden symbols, each hidden symbol can be visually recognized only when light is irradiated from the ray direction in which that hidden symbol is visible and the mesh is observed from the corresponding line-of-sight direction. The entire hidden symbol therefore cannot be visually recognized unless the ray and line-of-sight directions in which all n types of hidden symbols are visible are known. As a result, the probability that the entire hidden symbol is visually recognized can be reduced, and the possibility that the hidden symbol is seen by a third party can be further reduced.
  • Alternatively, with the hidden symbol composed of n (n being an integer of 2 or more) types of hidden symbols, the specifying means specifying the specific polygons for each hidden symbol, and the changing means changing their orientation so that the specular reflection direction for light from the ray direction corresponding to each hidden symbol faces the predetermined line-of-sight direction, the polygon mesh whose orientation has been changed by the changing means may be rendered with the color of the light from the ray direction corresponding to each hidden symbol set to a different color. In this case, the n types of hidden symbols can be represented in different colors.
  • It is preferable to further cause the computer to function as polygon mesh generating means that: arranges a plurality of sample points in the virtual three-dimensional space; calculates, using the bidirectional reflectance distribution function of the object, the regular reflection direction of light irradiated from a virtual light source onto each arranged sample point; calculates the normal vector of each sample point from the calculated regular reflection direction and the incident direction of the light from the virtual light source; and calculates height data of each sample point based on the calculated normal vectors, thereby generating the polygon mesh.
  • In this case, the user can obtain a polygon mesh representing the surface shape of the object simply by supplying the bidirectional reflectance distribution function of the object, without modeling the polygon mesh.
  • the object is preferably an object having a texture.
  • Since the polygon mesh represents the surface shape of an object having a texture, the orientations of the polygons other than the specific polygons are dispersed, so the hidden symbol can be expressed more clearly. It also becomes possible to embed a hidden design in a grain and to prevent imitation of the grain by a third party.

Industrial applicability
  • As described above, the symbol embedding device can embed a hidden symbol in an actual object or in a virtual three-dimensional model, and is therefore useful as a symbol embedding device using computer graphics technology.


Abstract

A polygon mesh setting unit sets a polygon mesh representing the surface shape of an object in a virtual three-dimensional space. A polygon specifying unit sets, among the polygons constituting the polygon mesh, a region in which a hidden pattern is to be embedded, and specifies the polygons located in that region as specific polygons. A direction changing unit changes the orientation of each specific polygon so that the direction of specular reflection of light from the ray direction is directed toward the line-of-sight direction.
PCT/JP2006/312637 2005-07-01 2006-06-23 Pattern embedding program, device, and method Ceased WO2007004448A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-194252 2005-07-01
JP2005194252 2005-07-01

Publications (1)

Publication Number Publication Date
WO2007004448A1 true WO2007004448A1 (fr) 2007-01-11

Family

ID=37604318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/312637 Ceased WO2007004448A1 (fr) 2005-07-01 2006-06-23 Pattern embedding program, device, and method

Country Status (1)

Country Link
WO (1) WO2007004448A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10334272A (ja) * 1997-05-27 1998-12-18 Ibm Japan Ltd Method and system for embedding information in a three-dimensional shape model
JP2000082156A (ja) * 1998-09-04 2000-03-21 Osamu Kanai Method and apparatus for embedding and extracting electronic information data, and recording medium recording programs of the methods
JP2003099805A (ja) * 2001-09-21 2003-04-04 Rikogaku Shinkokai Method for embedding a digital watermark in a three-dimensional shape model and method for restoring the digital watermark


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010103942A1 (fr) * 2009-03-09 2010-09-16 Calsonic Kansei Corporation Method and device for creating surface processing data
CN102348551A (zh) * 2009-03-09 2012-02-08 康奈可关精株式会社 用于形成表面加工数据的方法和装置
CN102348551B (zh) * 2009-03-09 2014-05-28 康奈可关精株式会社 用于形成表面加工数据的方法和装置
US9275497B2 2009-03-09 2016-03-01 Calsonic Kansei Corporation Method and device for forming surface processing data

Similar Documents

Publication Publication Date Title
JP4276178B2 (ja) Method for digitally rendering skin or a similar material
US7446778B2 (en) Dynamically adjusted brush for direct paint systems on parameterized multi-dimensional surfaces
JP5299173B2 (ja) Image processing apparatus, image processing method, and program
JP2003256865A (ja) Method and program for generating two-dimensional cartoon-style images from three-dimensional object data
TWI406186B (zh) 2D editing metaphor for 3D graphics
US10475230B2 (en) Surface material pattern finish simulation device and surface material pattern finish simulation method
US20180005432A1 (en) Shading Using Multiple Texture Maps
US9317967B1 (en) Deformation of surface objects
KR100942026B1 (ko) 다중 감각 인터페이스에 기반한 가상의 3차원 얼굴메이크업 시스템 및 방법
US7609275B2 (en) System and method for mosaic rendering of three dimensional image
JPH06236440A (ja) 画像処理方法
WO2007004448A1 (fr) Pattern embedding program, device, and method
Levene A framework for non-realistic projections
ATE433172T1 (de) Rendering of 3D computer graphics using 2D computer graphics capabilities
KR101921706B1 (ko) System for generating and providing mixed reality according to real-image perspective
WO2007108288A1 (fr) Texture production program, apparatus, and method
GB2341529A (en) Three-dimensional embroidery design simulator
JP4172556B2 (ja) Two-dimensional scalar field design method and system
JP2006202066A (ja) Apparatus and system for generating three-dimensional computer graphics curve and curved-surface models
Martín et al. Flattening 3D objects using silhouettes
CN101263529B (zh) 2D editing metaphor for 3D graphics
JP4736239B2 (ja) Pattern image creation method and apparatus
JP2025059730A (ja) Method for manufacturing a shadow-picture projection object and computer program for a terminal device
Ji Design and Modeling of Chinese Classical Lanterns Based on Different Processes
JP6720523B2 (ja) Surface material pattern finish simulation system and surface material pattern finish simulation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06767254

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP