
WO2007004448A1 - Pattern embedding program, pattern embedding device, and pattern embedding method - Google Patents


Info

Publication number
WO2007004448A1
WO2007004448A1 (application PCT/JP2006/312637)
Authority
WO
WIPO (PCT)
Prior art keywords
polygon
hidden
light
specific
symbol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2006/312637
Other languages
French (fr)
Japanese (ja)
Inventor
Yoshiyuki Sakaguchi
Koji Imao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Digital Fashion Ltd
Original Assignee
Digital Fashion Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Digital Fashion Ltd filed Critical Digital Fashion Ltd


Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/50 - Lighting effects
    • G06T 1/00 - General purpose image data processing
    • G06T 1/0021 - Image watermarking
    • G06T 1/0028 - Adaptive watermarking, e.g. Human Visual System [HVS]-based watermarking
    • G06T 2201/00 - General purpose image data processing
    • G06T 2201/005 - Image watermarking
    • G06T 2201/0051 - Embedding of the watermark in the spatial domain

Definitions

  • The present invention relates to a pattern embedding program, a pattern embedding device, and a pattern embedding method using computer graphics techniques, and in particular to a pattern embedding program, a pattern embedding device, and a pattern embedding method for embedding a predetermined hidden pattern in a polygon mesh arranged in a virtual three-dimensional space.
  • Patent Document 1 discloses a step of setting parameters for watermark embedding processing, a step of inputting pre-conversion data of a three-dimensional shape model, a step of selecting, from the curved-surface meshes of the input three-dimensional shape model, the meshes that undergo information embedding processing and the meshes that do not, and a step of deforming given meshes in each selected set of curved-surface meshes according to the specified embedding parameters; that is, it discloses a technique for embedding a digital watermark in a three-dimensional shape model.
  • Patent Document 1 Japanese Patent Laid-Open No. 2003-99805
  • An object of the present invention is to provide a pattern embedding program, a pattern embedding device, and a pattern embedding method capable of embedding a hidden pattern in an actual object or a virtual three-dimensional model.
  • To achieve this object, a pattern embedding program causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; specifying means for specifying, as a specific polygon, a polygon located within an area for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • A pattern embedding device includes: a setting unit that sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; a specifying unit that identifies, as a specific polygon, a polygon located within an area for embedding a predetermined hidden pattern; and a changing unit that changes the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • A pattern embedding method includes: a setting step in which a computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on which irregularities are repeatedly formed; a specifying step of specifying, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • According to the present invention, a polygon mesh representing the surface shape of an object on which irregularities are repeatedly formed is set in a virtual three-dimensional space, the specific polygon located in the area where a hidden pattern is embedded is specified, and the orientation of the specific polygon is changed so that the specular reflection direction of light irradiated from the predetermined ray direction faces the predetermined line-of-sight direction.
  • FIG. 1 is a block diagram showing a configuration of a symbol embedding apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart for explaining symbol embedding processing by the symbol embedding device shown in FIG. 1.
  • FIG. 3 is a schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space.
  • FIG. 4 is a diagram showing how a specific polygon is specified from the polygon mesh.
  • FIG. 6 is a diagram illustrating an example of a polygon mesh rendered when the light ray direction and the line-of-sight direction are not set to a direction in which a hidden symbol can be visually recognized.
  • FIG. 7 is a schematic diagram showing an example of a positional relationship between three light ray directions and a line-of-sight direction.
  • FIG. 8 is a diagram illustrating an example of a rendering result of a polygon mesh with respect to a light ray direction.
  • FIG. 9 is a diagram showing another example of a polygon mesh rendering result with respect to a light ray direction.
  • FIG. 10 is a diagram showing another example of a rendering result of a polygon mesh with respect to a light ray direction.
  • FIG. 11 is a cross-sectional view of an example of a polygon mesh in which a hidden symbol is embedded.
  • FIG. 12 is a flowchart for explaining a process in which the texture generation unit shown in FIG. 1 generates a base texture.
  • FIG. 13 is a schematic diagram showing sample points arranged in a grid in a virtual three-dimensional space.
  • FIG. 14 is a schematic diagram for explaining a light ray direction and a line-of-sight direction.
  • FIG. 15 is a schematic diagram when the sample points arranged on the XY plane are viewed from the Z direction.
  • FIG. 17 is a diagram showing an example of the base texture when viewed from the Z direction.
  • FIG. 18 is a schematic diagram for explaining a state in which a plurality of hidden symbols are embedded in a polygon mesh by the symbol embedding device according to the second embodiment of the present invention.
  • FIG. 19 is a schematic diagram showing the relationship between the light ray direction and the line-of-sight direction in the second embodiment of the present invention.
  • FIG. 1 shows a block configuration diagram of a symbol embedding apparatus according to a first embodiment of the present invention.
  • This symbol embedding device is configured from a known computer, and includes a processing unit 10, a storage unit 20, an input unit 30, a display unit 40, and an optical characteristic acquisition device 50.
  • The processing unit 10 is configured with a CPU, and has the functions of a polygon mesh setting unit 11, a polygon specifying unit 12, a target direction calculating unit 13, a direction changing unit 14, a rendering processing unit 15, a texture generating unit 16, and a three-dimensional data output unit 17.
  • the storage unit 20 includes a storage device such as a hard disk, and includes functions of a base texture storage unit 21, a symbol storage unit 22, and an optical characteristic storage unit 23.
  • The polygon mesh setting unit 11 through the three-dimensional data output unit 17, and the base texture storage unit 21 through the optical characteristic storage unit 23, are realized by the CPU executing a symbol embedding program stored on the hard disk serving as a recording medium.
  • The optical characteristic acquisition device 50 is a known device invented by the applicant of the present invention; it photographs an actual sample while changing the light irradiation direction and the photographing direction, and generates a BRDF (bidirectional reflectance distribution function) from the resulting sample images. Detailed contents are disclosed in JP-A-2005-115645.
  • the optical characteristic acquisition device 50 generates a BRDF of an object (material) that is a base texture creation target (modeling target).
  • the optical property storage unit 23 stores the BRDF generated by the optical property acquisition device 50.
  • The base texture is composed of a plurality of sample points.
  • Each sample point consists of three components: an X component indicating the X-axis value, a Y component indicating the Y-axis value, and a Z component indicating the Z-axis value set in the virtual three-dimensional space.
  • The values of the X and Y components of each sample point are determined so that their projections onto the XY plane are arranged in a square grid, and the value of the Z component of each sample point indicates the height data of the modeled object.
  • The input unit 30 includes known input devices such as a keyboard and a mouse.
  • the polygon mesh setting unit 11 reads the base texture specified by the user from the base texture storage unit 21 using the input unit 30, and plots each sample point constituting the read base texture in the virtual three-dimensional space.
  • the polygon mesh setting unit 11 connects adjacent sample points with straight lines, and sets a polygon mesh composed of polygons such as triangles or quadrangles in the virtual three-dimensional space. This reproduces the surface shape of an object in which fine irregularities are repeatedly formed in a fixed pattern in a virtual three-dimensional space.
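The grid construction performed by the polygon mesh setting unit 11 can be sketched as follows. This is an illustrative reconstruction only, assuming triangular polygons and a square grid of height samples; the function name and the diagonal split are assumptions, not the patent's implementation:

```python
import numpy as np

def build_polygon_mesh(heights, d=1.0):
    """Build a triangulated polygon mesh from a square grid of height samples.

    heights: 2D array of Z values; d is the mesoscale grid interval.
    Returns (vertices, triangles): vertices is an (N, 3) array of sample
    points, triangles is an (M, 3) array of vertex-index triples, with two
    triangles per grid cell (adjacent sample points connected by straight lines).
    """
    rows, cols = heights.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    vertices = np.stack([xs.ravel() * d, ys.ravel() * d, heights.ravel()], axis=1)

    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c  # lower-left sample of the cell
            triangles.append((i, i + 1, i + cols))              # lower-left triangle
            triangles.append((i + 1, i + cols + 1, i + cols))   # upper-right triangle
    return vertices, np.array(triangles)
```

Quadrangular polygons, which the text also mentions, would simply keep each grid cell as one four-vertex face instead of splitting it.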
  • the symbol storage unit 22 stores in advance hidden symbol data indicating image data of a hidden symbol embedded in the base texture.
  • The hidden symbol data includes data created in advance by the user using drawing software or the like, data obtained by the user via the Internet, and data prepared in advance by the provider of this symbol embedding program.
  • The polygon specifying unit 12 reads out the hidden symbol data designated by the user using the input unit 30 from the symbol storage unit 22, and specifies, as a specific polygon, the polygon located in the area where the hidden symbol data is embedded from among the polygons constituting the polygon mesh set by the polygon mesh setting unit 11.
  • the target direction calculation unit 13 calculates, as the target direction, a straight line direction that bisects the angle between the light ray direction and the line-of-sight direction specified by the user using the input unit 30.
  • The direction changing unit 14 extracts, from among the specific polygons, those whose normal vector forms an angle with the target direction that is less than or equal to a specified value, and changes the orientation of each extracted specific polygon so that its normal vector matches the target direction.
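The target direction is simply the normalized half-vector between the ray direction and the line-of-sight direction, and the extraction step compares each polygon normal against it. A minimal sketch, assuming unit direction vectors and a hypothetical 15-degree threshold (the embodiment later mentions values on the order of 10 to 20 degrees):

```python
import numpy as np

def target_direction(light_dir, view_dir):
    """Direction of the line bisecting the angle between the ray and
    line-of-sight directions (the normalized half-vector)."""
    l = np.asarray(light_dir, dtype=float)
    v = np.asarray(view_dir, dtype=float)
    h = l / np.linalg.norm(l) + v / np.linalg.norm(v)
    return h / np.linalg.norm(h)

def within_threshold(normal, target, max_deg=15.0):
    """True if the polygon normal is within max_deg of the target direction,
    i.e. the polygon would be extracted for orientation change."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    cos_a = np.clip(np.dot(n, target), -1.0, 1.0)
    return np.degrees(np.arccos(cos_a)) <= max_deg
```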
  • The rendering processing unit 15 renders the specific polygon whose orientation has been changed by the direction changing unit 14, using the BRDF of the object modeled by the base texture, and displays the result on the display unit 40.
  • the display unit 40 includes a known display device such as a CRT, a liquid crystal panel, or a plasma panel.
  • the rendering processing unit 15 performs rendering according to the line-of-sight direction and the ray direction input by the user using the input unit 30.
  • The three-dimensional data output unit 17 generates NC (Numerical Control) data from the three-dimensional data representing the position of each sample point of the polygon mesh PM in which the orientation of the specific polygons has been changed by the direction changing unit 14, and outputs the NC data to the stereolithography apparatus 60.
  • The stereolithography apparatus 60 is a well-known stereolithography apparatus, and forms the shape of the polygon mesh PM in resin according to the NC data generated by the three-dimensional data output unit 17.
  • A device other than a stereolithography apparatus may be used, as long as it can form, on an actual object, the shape represented by the three-dimensional data of an object created on the computer.
  • the polygon mesh setting unit 11 corresponds to an example of a setting unit
  • the polygon specifying unit 12 corresponds to an example of a specifying unit
  • the target direction calculating unit 13 and the direction changing unit 14 correspond to an example of a changing unit
  • the rendering processing unit 15 corresponds to an example of a rendering processing unit
  • the texture generation unit 16 corresponds to an example of a polygon mesh generation unit
  • the 3D data output unit 17 corresponds to an example of an output unit.
  • In step S1, when the input unit 30 receives an operation command designating a base texture, the polygon mesh setting unit 11 reads the designated base texture from the base texture storage unit 21.
  • In step S2, the polygon mesh setting unit 11 plots, in the virtual three-dimensional space, each sample point constituting the base texture read in step S1, connects adjacent sample points with straight lines, and sets a polygon mesh in the virtual three-dimensional space.
  • FIG. 3 is a schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space.
  • The polygon mesh PM is composed of a plurality of polygons PL having the sample points P as vertices.
  • The values of the X and Y components are determined so that the projected points SP' are arranged in a square grid with a mesoscale interval d (on the order of microns to millimeters).
  • Polygon mesh PM represents the surface shape of an object with irregularities repeatedly formed on the surface, and it can be seen that the Z component of each sample point P is dispersed.
  • In step S3, the polygon mesh setting unit 11 adds noise to the Z component of each sample point P to further disperse the irregularities on the surface of the polygon mesh PM.
  • In step S4, the polygon specifying unit 12 reads out the hidden symbol data designated by the user from the symbol storage unit 22, sets, in the polygon mesh PM, an area in which the read hidden symbol data is embedded, and identifies, as the specific polygons TPL, the polygons PL located within the set area from among the polygons constituting the polygon mesh PM.
  • FIG. 4 is a diagram showing how the specific polygons TPL are specified from the polygon mesh PM. As shown in FIG. 4, an area D1 in which a hidden symbol representing the letter D is embedded is set in the polygon mesh PM, and the polygons PL located in the area D1 are specified as the specific polygons TPL.
  • In step S5, the target direction calculation unit 13 sets a ray direction and a line-of-sight direction in which the hidden symbol G can be visually recognized, in accordance with the user's operation command received by the input unit 30.
  • In step S6, the target direction calculation unit 13 calculates, as the target direction OD, the direction of the straight line that bisects the angle θ1 formed by the ray direction LD and the line-of-sight direction VD, as shown in FIG. 5.
  • In step S7, the direction changing unit 14 obtains the normal vector n of each specific polygon TPL as shown in FIG. 5, and extracts the specific polygons TPL for which the angle θ between the normal vector n and the target direction OD is less than or equal to a predetermined value (for example, 10 to 20 degrees). The predetermined value is set according to how widely the normal directions of the polygons PL are scattered and how visible the hidden pattern should be, which depends on the BRDF of the material.
  • In step S8, the direction changing unit 14 changes the Z component values of the sample points P constituting each specific polygon TPL extracted in step S7 so that the normal vector n of that specific polygon TPL matches the target direction OD, thereby changing the orientation of the specific polygon TPL.
  • In step S9, the rendering processing unit 15 renders the polygon mesh PM whose orientation has been changed, using the BRDF of the object modeled by the polygon mesh PM, and causes the display unit 40 to display the result.
  • In step S10, the three-dimensional data output unit 17 converts the three-dimensional data of each sample point P of the polygon mesh PM whose orientation has been changed into NC data, and outputs the NC data to the stereolithography apparatus 60.
  • The stereolithography apparatus 60 forms resin according to the NC data, and forms a texture with the hidden pattern embedded in the resin.
  • FIG. 6 is a diagram illustrating an example of a polygon mesh PM rendered when the light ray direction and the line-of-sight direction are not set to a direction in which a hidden symbol can be visually recognized.
  • In FIG. 6, the ray direction and the line-of-sight direction are not set to directions in which the hidden symbol can be visually recognized, so only the surface shape of the object to be modeled is displayed and the hidden symbol G is not displayed.
  • Fig. 7 is a schematic diagram showing an example of the positional relationship between the three ray directions LD1 to LD3 and the line-of-sight direction VD.
  • FIGS. 8 to 10 are diagrams showing examples of the rendering results (rendered images) of the polygon mesh PM for the ray directions LD1 to LD3, respectively.
  • Since the normal vectors of the specific polygons are matched to the target direction that bisects the angle between the ray direction LD2 and the line-of-sight direction VD, the hidden pattern cannot be visually recognized from the ray directions LD1 and LD3 shown in FIG. 7, and the hidden symbol G is not displayed, as shown in FIG. 8 and FIG. 10. For the ray direction LD2, as shown in FIG. 9, the hidden symbol G consisting of the characters DFL is displayed.
  • FIG. 11 shows a cross-sectional view of an example of a polygon mesh PM in which a hidden symbol is embedded.
  • The plane K1 shown in FIG. 11 indicates the polygon surface of a specific polygon TPL whose orientation has been changed in step S8.
  • For light from the ray direction LD2, the specular reflection direction of the plane K1 coincides with the line-of-sight direction VD, so the hidden symbol G is displayed.
  • For light from the other ray directions, only diffusely reflected light from the plane K1 reaches the line-of-sight direction VD, so the hidden symbol G is not displayed.
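The geometry behind FIG. 11 can be checked numerically with the standard mirror-reflection formula R = 2(N·L)N - L. This is a hedged sketch; the vector conventions (L and V as vectors pointing away from the surface toward the light and the viewer) are assumptions, not the patent's notation:

```python
import numpy as np

def specular_reflection(light_dir, normal):
    """Mirror-reflect the direction toward the light about the plane normal."""
    l = np.asarray(light_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return 2.0 * np.dot(n, l) * n - l

def reaches_viewer(light_dir, normal, view_dir, tol_deg=1.0):
    """True if the specular reflection of light_dir off a plane with the given
    normal lies within tol_deg of the line-of-sight direction."""
    r = specular_reflection(light_dir, normal)
    v = np.asarray(view_dir, dtype=float)
    cos_a = np.dot(r, v) / (np.linalg.norm(r) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))) <= tol_deg
```

With the normal set to the target direction, light from the chosen ray direction reflects into the line-of-sight direction; from other ray directions the test fails, matching the behavior described for the plane K1.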
  • In step S21, when the input unit 30 receives a user operation command designating one BRDF in the optical characteristic storage unit 23, the texture generation unit 16 reads the designated BRDF from the optical characteristic storage unit 23.
  • In step S22, the texture generation unit 16 arranges a plurality of sample points in a lattice in the virtual three-dimensional space, and calculates, using the BRDF, the regular reflection direction of light when each sample point is irradiated with light from a virtual light source.
  • FIG. 13 is a schematic diagram showing sample points P arranged in a lattice pattern in a virtual three-dimensional space.
  • Mutually orthogonal X, Y, and Z axes are set in the virtual three-dimensional space.
  • the Z axis indicates the vertical direction
  • the XY plane indicates the horizontal plane.
  • The texture generation unit 16 arranges the sample points P in a grid pattern on the XY plane in the virtual three-dimensional space.
  • The sample points P are arranged on the XY plane at a mesoscale interval d.
  • the sample point P to be processed is referred to as a target sample point CP.
  • The texture generation unit 16 obtains the ray direction LD from the virtual light source VL to the target sample point CP, inputs the obtained ray direction LD to the BRDF obtained in step S21, calculates the reflectance for each line-of-sight direction VD while varying the line-of-sight direction VD, and determines, as the regular reflection direction of light at the target sample point CP, the line-of-sight direction that gives the maximum reflectance among the calculated reflectances. In the example of FIG. 13, the line-of-sight direction VDMAX is calculated as the regular reflection direction RD.
  • the specular reflection direction RD is calculated for other sample points P in the same way.
  • In FIG. 13, the normal vector n at the target sample point CP is drawn exaggerated relative to the normal vectors n at the other sample points P.
  • FIG. 14 is a schematic diagram for explaining the light beam direction LD and the line-of-sight direction VD.
  • A local coordinate system is set with the target sample point CP as the origin, an X' axis parallel to the X axis, a Y' axis parallel to the Y axis, and a Z' axis parallel to the Z axis.
  • The line-of-sight direction VD is expressed by an angle θ formed between the line-of-sight direction VD and its projection VD' onto the horizontal plane, and an angle φ formed between the projection VD' and the X' axis.
  • The texture generation unit 16 changes the angle θ within a range of 0 to 90 degrees at a predetermined resolution (for example, 5 degrees), changes the angle φ within a range of -90 to 90 degrees at a predetermined resolution (for example, 10 degrees), and thereby varies the line-of-sight direction VD.
  • In step S23, the texture generation unit 16 calculates, as the normal vector n of the plane including the target sample point CP, the direction that bisects the angle θ2 formed between the specular reflection direction RD and the ray direction LD at the target sample point CP shown in FIG. 13.
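Steps S22 and S23 amount to a brute-force search over the sampled line-of-sight directions followed by a half-vector computation. The sketch below uses a stand-in brdf(light_dir, view_dir) callable in place of the measured BRDF (an assumption, since the actual BRDF data format is not given), and the 5-degree and 10-degree resolutions mentioned in the text:

```python
import numpy as np

def spherical_to_dir(theta_deg, phi_deg):
    """Unit vector from elevation theta (above the XY plane) and azimuth phi."""
    t, p = np.radians(theta_deg), np.radians(phi_deg)
    return np.array([np.cos(t) * np.cos(p), np.cos(t) * np.sin(p), np.sin(t)])

def find_specular_direction(brdf, light_dir):
    """Sample line-of-sight directions on the grid described in the text
    (0..90 deg elevation in 5 deg steps, -90..90 deg azimuth in 10 deg steps)
    and return the direction of maximum reflectance."""
    best, best_val = None, -np.inf
    for theta in range(0, 91, 5):
        for phi in range(-90, 91, 10):
            v = spherical_to_dir(theta, phi)
            val = brdf(light_dir, v)
            if val > best_val:
                best, best_val = v, val
    return best

def normal_from_bisector(light_dir, reflect_dir):
    """Normal vector of the plane: the direction bisecting the angle between
    the ray direction and the regular reflection direction (step S23)."""
    l = np.asarray(light_dir, dtype=float)
    r = np.asarray(reflect_dir, dtype=float)
    h = l / np.linalg.norm(l) + r / np.linalg.norm(r)
    return h / np.linalg.norm(h)
```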
  • In step S24, the texture generation unit 16 calculates the unevenness information of each sample point P as follows.
  • FIG. 15 is a schematic diagram when the sample points P arranged on the XY plane are viewed from the Z direction
  • FIG. 16 is a schematic diagram of a virtual three-dimensional space in which the sample points P are set.
  • the texture generating unit 16 specifies the polygon PL1 located at the lower left of the two polygons constituting the lower left grid K1 as the target polygon.
  • The sample points P2 and P3 are moved in the Z direction so that the plane of the polygon PL1 is orthogonal to the normal vector n1 of the lower-left sample point P1, and the Z component values of the moved sample points P2 and P3 are calculated as the unevenness information of the sample points P2 and P3.
  • Next, as shown in FIG. 15, the texture generation unit 16 identifies, as the target polygon, the polygon PL2 located at the lower left among the polygons constituting the grid K2 adjacent to the grid K1. The sample points P4 and P5 are moved in the Z direction so that the plane of the polygon PL2 is orthogonal to the normal vector n2 of the sample point P2, which is at the lower left of the three sample points P constituting the polygon PL2, and the Z component values of the moved sample points P4 and P5 are calculated as the unevenness information of the sample points P4 and P5.
  • Next, the texture generation unit 16 identifies, as the target polygon, the polygon PL3 in the grid K3 adjacent above the grid K1, moves the sample point P6 in the Z direction so that the plane of the polygon PL3 is orthogonal to the normal vector n3 of the sample point P3, and calculates the Z component value of the moved sample point P6 as the unevenness information of the sample point P6.
  • The texture generation unit 16 identifies the polygon PL4 in the grid K4 adjacent to the right of the grid K2 as the target polygon, and changes the orientation of the polygon PL4 in the same manner as for the polygon PL2.
  • The texture generation unit 16 sequentially identifies target polygons starting from the polygon PL1 of the lower-left grid K1 so as to meander diagonally up and to the right, changes the orientation of each identified target polygon as described above, and calculates the changed Z component value of each sample point as that sample point's unevenness information, thereby generating a three-dimensional texture.
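The height propagation of step S24 can be illustrated with a simplified row-by-row variant: each new sample's Z value is chosen so that it lies on the plane through the previously fixed neighbor that is orthogonal to that neighbor's normal vector. This is a sketch of the idea only; the patent's diagonal meandering traversal order is not reproduced:

```python
import numpy as np

def heights_from_normals(normals, d=1.0):
    """Given a (rows, cols, 3) array of unit normals on a square grid with
    spacing d, compute Z values so each sample lies on the plane through its
    left (or lower) neighbor that is orthogonal to that neighbor's normal.

    From n . (p - p0) = 0, one grid step of d in X changes Z by -d*nx/nz,
    and one step in Y changes Z by -d*ny/nz.
    """
    rows, cols = normals.shape[:2]
    z = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            if r == 0 and c == 0:
                continue  # anchor the first sample at z = 0
            if c > 0:     # step right from the left neighbor
                nx, ny, nz = normals[r, c - 1]
                z[r, c] = z[r, c - 1] - d * nx / nz
            else:         # start of row: step up from the lower neighbor
                nx, ny, nz = normals[r - 1, c]
                z[r, c] = z[r - 1, c] - d * ny / nz
    return z
```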
  • FIG. 17 is a diagram illustrating an example of a texture generated by the texture generation unit 16.
  • The texture generation unit 16 may also start from a polygon in the lower-right, upper-left, or upper-right grid instead of the lower-left one, sequentially identifying target polygons so as to meander toward the upper left, lower right, or lower left, respectively.
  • As described above, the specific polygons located in the region where the hidden pattern is embedded are specified, and the orientation of the specific polygons is changed so that the regular reflection direction of light irradiated from the specific ray direction faces the specific line-of-sight direction.
  • According to the present pattern embedding apparatus, it is possible to embed a hidden pattern in a grain formed in resin, and when light is applied to the grain from a specific direction (the ray direction) and the grain is observed from a specific direction (the line-of-sight direction), the hidden symbol appears.
  • Even if a third party imitates the grain and forms a resin, the hidden pattern is not embedded in that resin, so the hidden pattern does not appear even when the resin is irradiated from the specific direction and observed from the specific direction. Thereby, imitation of the grain by a third party can be prevented.
  • In step S3 of FIG. 2, noise is applied to the Z component, but the present invention is not limited to this, and the processing shown in step S3 may be omitted.
  • The above description concerns preventing imitation of the grain, but the present invention is not limited to this: for an object decorated three-dimensionally or two-dimensionally, embedding a hidden pattern over the entire surface of the object, or in the partial area where the decoration is displayed, makes it possible to protect the decoration from imitation by third parties.
  • The symbol embedding device according to the second embodiment is characterized by embedding a plurality of types of hidden symbols in the polygon mesh, in contrast to the symbol embedding device according to the first embodiment, which embeds one type of hidden symbol in the polygon mesh PM. Since the symbol embedding device according to the second embodiment has substantially the same configuration as that according to the first embodiment, it will be described with reference to the block diagram of the symbol embedding device according to the first embodiment shown in FIG. 1.
  • FIG. 18 is a schematic diagram for explaining how the hidden symbols G1 to G3 are embedded in the polygon mesh, and FIG. 19 is a schematic diagram showing the relationship between the ray directions and the line-of-sight direction in the second embodiment.
  • The target direction calculation unit 13 calculates, as the target direction OD1, the direction that bisects the angle formed by the ray direction LD1 and the line-of-sight direction VD; as the target direction OD2, the direction that bisects the angle formed by the ray direction LD2 and the line-of-sight direction VD; and as the target direction OD3, the direction that bisects the angle formed by the ray direction LD3 and the line-of-sight direction VD.
  • As shown in FIG. 19, the direction changing unit 14 extracts, from among the specific polygons TPL1, those whose normal vector n1 forms an angle with the target direction OD1 that is less than or equal to the specified value; from among the specific polygons TPL2, those whose normal vector n2 forms an angle with the target direction OD2 that is less than or equal to the specified value; and from among the specific polygons TPL3, those whose normal vector n3 forms an angle with the target direction OD3 that is less than or equal to the specified value.
  • The orientation of each extracted specific polygon TPL1 is changed so that its normal vector n1 matches the target direction OD1; the orientation of each extracted specific polygon TPL2 is changed so that its normal vector n2 matches the target direction OD2; and the orientation of each extracted specific polygon TPL3 is changed so that its normal vector n3 matches the target direction OD3.
  • When the rendering processing unit 15 sets the ray directions to LD1 to LD3, sets the line-of-sight direction to VD, and renders the polygon mesh PM using the BRDF, the hidden symbol G1 of “D”, the hidden symbol G2 of “F”, and the hidden symbol G3 of “L” are displayed on the display unit 40, as shown in FIG. 18.
  • When the rendering processing unit 15 renders the polygon mesh PM with the ray direction set to LD1 and the line-of-sight direction set to VD, only the hidden symbol G1 of “D” is displayed on the display unit 40, and the hidden symbols G2 and G3 of “F” and “L” are not displayed.
  • When the rendering processing unit 15 sets the ray direction to a direction other than the ray directions LD1 to LD3 and renders the polygon mesh PM, none of the hidden symbols G1 to G3 is displayed.
  • In this way, a plurality of hidden symbols is embedded in the polygon mesh PM, and each hidden symbol can be visually recognized only from a different direction, so it is more difficult for a third party to visually recognize the hidden symbols embedded in the polygon mesh.
  • In the above description, the orientations of the specific polygons TPL and TPL1 to TPL3 are changed so that the normal vectors n and n1 to n3 of the specific polygons match the target directions OD and OD1 to OD3.
  • Alternatively, the Z component of a sample point SP may be corrected so that the normal vector at the sample point SP matches the target direction OD, thereby changing together the specific polygons TPL1 to TPL5 that share the sample point SP.
  • The normal vector n at the sample point SP can be calculated by averaging the normal vectors n1 to n5 of the specific polygons TPL1 to TPL5 that include the sample point SP.
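Averaging the normal vectors of the polygons sharing a sample point is the standard vertex-normal computation; a minimal sketch:

```python
import numpy as np

def vertex_normal(face_normals):
    """Normal vector at a sample point: the renormalized average of the unit
    normals of the specific polygons that include the sample point."""
    n = np.mean(np.asarray(face_normals, dtype=float), axis=0)
    return n / np.linalg.norm(n)
```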
  • When the rendering processing unit 15 performs rendering with the colors of the light from the ray directions LD1 to LD3 set to different colors, the letters “D”, “F”, and “L” can be displayed in different colors.
  • In the above description, three types of hidden symbols are embedded, but the present invention is not limited to this, and two types or four or more types of hidden symbols may be used.
  • the number of the line-of-sight directions VD is one, but the present invention is not limited to this, and a plurality of line-of-sight directions VD may be set.
  • The pattern embedding program causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; specifying means for specifying, as a specific polygon, a polygon located in an area for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygon so that the regular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • a pattern embedding device includes: a setting unit that sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; a specifying unit that specifies, as specific polygons, the polygons located in a region for embedding a predetermined hidden pattern; and a changing unit that changes the orientation of the specific polygons so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • a pattern embedding method includes: a setting step in which the computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; a specifying step of specifying, as specific polygons, the polygons located in a region for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of the specific polygons so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.
  • a polygon mesh representing the surface shape of an object in which irregularities are repeatedly formed on the surface is set in a virtual three-dimensional space, the polygons located in a region where a hidden symbol is to be embedded are specified as specific polygons, and the orientation of the specific polygons is changed so that the specular reflection direction of light irradiated from a predetermined ray direction faces a predetermined line-of-sight direction.
  • it is preferable to further cause the computer to function as output means for outputting three-dimensional data, which defines the shape of the polygon mesh whose orientation has been changed by the changing means, to a three-dimensional modeling apparatus. In this case, it is possible to embed a hidden symbol in an actual object having irregularities repeatedly formed on its surface.
  • it is preferable to further cause the computer to function as an acquisition unit that acquires the optical characteristics of the object, and a rendering processing unit that renders, using the acquired optical characteristics, the polygon mesh whose orientation has been changed by the changing unit. In this case, it is possible to embed a hidden symbol in a virtual three-dimensional model of an object with irregularities repeatedly formed on its surface.
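The optical characteristics referred to here are stored, later in the description, as a BRDF lookup table mapping a (ray direction, line-of-sight direction) pair to a reflectance. A minimal nearest-neighbor lookup might look like the following sketch (the class name, entry layout, and distance metric are our assumptions, not specified by the patent):

```python
import numpy as np

class BrdfLut:
    """Minimal BRDF lookup table: (light dir, view dir) -> reflectance.

    Entries are ((lx, ly, lz), (vx, vy, vz), reflectance) triples;
    lookup returns the reflectance of the entry whose direction pair
    is closest to the query.
    """
    def __init__(self, entries):
        self.entries = [(np.asarray(l, dtype=float), np.asarray(v, dtype=float), r)
                        for l, v, r in entries]

    def reflectance(self, light_dir, view_dir):
        l = np.asarray(light_dir, dtype=float)
        v = np.asarray(view_dir, dtype=float)
        # Nearest neighbour over the stored (light, view) direction pairs.
        best = min(self.entries,
                   key=lambda e: np.linalg.norm(e[0] - l) + np.linalg.norm(e[1] - v))
        return best[2]
```

A real implementation would interpolate between table entries rather than snap to the nearest one; the sketch only shows the table-driven shape of the data.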
  • it is preferable that the changing unit changes the orientation of only those specific polygons whose required change in orientation is equal to or less than a predetermined angle.
  • since polygons whose orientation would have to be changed greatly are left unchanged, it is possible to embed a hidden pattern without greatly changing the original shape of the polygon mesh.
  • when there are many polygons whose orientation has been changed greatly, the hidden symbol may become visible from directions other than the predetermined line-of-sight direction; by adopting the above configuration, the hidden symbol can be prevented from being visually recognized from directions other than the predetermined line-of-sight direction.
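This selection step can be sketched as an angle test against the target direction (the function name and the exact test are our illustration; the patent only specifies that polygons needing a change above a predetermined angle are skipped):

```python
import numpy as np

def changeable_polygons(normals, target_dir, max_angle_deg):
    """Indices of polygons whose normal lies within max_angle_deg of the target.

    Only these polygons are reoriented; the rest are left alone so that
    the original shape of the mesh is preserved.
    """
    t = np.asarray(target_dir, dtype=float)
    t = t / np.linalg.norm(t)
    keep = []
    for i, n in enumerate(normals):
        n = np.asarray(n, dtype=float)
        n = n / np.linalg.norm(n)
        angle = np.degrees(np.arccos(np.clip(np.dot(n, t), -1.0, 1.0)))
        if angle <= max_angle_deg:
            keep.append(i)
    return keep
```

With a 10–20 degree threshold, as the description later suggests, only polygons already roughly facing the target direction are nudged into exact alignment.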
  • it is preferable that the changing unit sets, as a target direction, the direction that bisects the angle between the predetermined ray direction and the predetermined line-of-sight direction, and changes the orientation of each specific polygon so that its normal vector matches the target direction.
  • since the direction that bisects the angle between the predetermined ray direction and the line-of-sight direction is set as the target direction, and the orientation of each specific polygon is changed so that its normal vector coincides with the target direction, the orientation of the specific polygons can be changed to the target direction accurately.
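The bisecting target direction described above is the normalized half-vector familiar from specular shading models: if a polygon's normal equals the half-vector of the light and view directions, light from the ray direction is mirrored exactly into the line of sight. A minimal sketch (the function name and the convention that both vectors point away from the surface are ours):

```python
import numpy as np

def target_direction(light_dir, view_dir):
    """Direction bisecting the angle between the light and view directions.

    Both inputs point from the surface toward the light / the viewer.
    The result is the normal a polygon needs so that light from
    light_dir is specularly reflected toward view_dir.
    """
    l = np.asarray(light_dir, dtype=float)
    v = np.asarray(view_dir, dtype=float)
    h = l / np.linalg.norm(l) + v / np.linalg.norm(v)
    return h / np.linalg.norm(h)
```

For a light at 45 degrees on one side and a viewer at 45 degrees on the other, the target direction is straight up, which matches the mirror-reflection intuition.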
  • it is preferable that the hidden symbol is composed of n (n is an integer of 2 or more) types of hidden symbols, that the specifying unit specifies the specific polygons corresponding to each hidden symbol, and that the changing unit changes the orientation of the specific polygons corresponding to each hidden symbol so that the specular reflection direction for light from the predetermined ray direction corresponding to that hidden symbol faces the predetermined line-of-sight direction.
  • since the hidden symbol is composed of n types of hidden symbols, each hidden symbol can be visually recognized only when light is irradiated from the ray direction assigned to that symbol and the symbol is observed from the line-of-sight direction in which it can be recognized. The entire hidden symbol therefore cannot be visually recognized unless the directions in which all n types of hidden symbols become visible are known. As a result, the probability that the entire hidden symbol is visually recognized can be reduced, and the possibility that the hidden symbol is visually recognized by a third party can be further reduced.
  • the hidden symbol may be composed of n (n is an integer of 2 or more) types of hidden symbols, the specifying unit may specify the specific polygons corresponding to each hidden symbol, and the changing unit may change the orientation of the specific polygons corresponding to each hidden symbol so that the specular reflection direction for light from the predetermined ray direction corresponding to that hidden symbol faces the predetermined line-of-sight direction. The polygon mesh whose orientation has been changed by the changing means may then be rendered with the light from the ray direction corresponding to each hidden symbol set to a different color. In this case, the n types of hidden symbols can be represented in different colors.
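An illustrative sketch of the per-symbol coloring idea (the function name, light list layout, and angular tolerance are all our assumptions): each hidden symbol's light gets its own color, and a polygon shows that color only when the half-vector of that light and the viewing direction lines up with the polygon's normal:

```python
import numpy as np

def highlight_color(normal, view_dir, lights, tol_deg=5.0):
    """Color of the light whose half-vector matches the polygon normal.

    lights: list of (light_dir, (r, g, b)) pairs. Returns the color of
    the first light whose bisector with view_dir is within tol_deg of
    the normal, or black when no light produces a visible highlight.
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    v = np.asarray(view_dir, dtype=float)
    v = v / np.linalg.norm(v)
    for light_dir, color in lights:
        l = np.asarray(light_dir, dtype=float)
        l = l / np.linalg.norm(l)
        h = l + v
        h = h / np.linalg.norm(h)
        ang = np.degrees(np.arccos(np.clip(np.dot(n, h), -1.0, 1.0)))
        if ang <= tol_deg:
            return color
    return (0, 0, 0)
```

A polygon reoriented for the red light's ray direction would then render red, while unmodified background polygons return black (no highlight), which is how the letters could appear in distinct colors.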
  • it is preferable to further cause the computer to function as polygon mesh generating means that arranges a plurality of sample points in the virtual three-dimensional space, calculates, from the bidirectional reflectance distribution function of the object, the specular reflection direction of the light emitted from a virtual light source to each arranged sample point, calculates the normal vector of each sample point from the calculated specular reflection direction and the incident direction of the light from the virtual light source, calculates the height data of each sample point based on the calculated normal vectors, and thereby generates the polygon mesh.
  • in this case, the user can obtain a polygon mesh representing the surface shape of the object simply by supplying the bidirectional reflectance distribution function of the object, without modeling the polygon mesh.
  • the object is preferably an object having a texture.
  • since the polygon mesh represents the surface shape of an object having a texture, the orientations of the polygons other than the specific polygons are dispersed, so the hidden symbol can be expressed more clearly. It also becomes possible to embed a hidden symbol in the grain pattern itself and to prevent imitation of the grain by a third party.

Industrial Applicability
  • the symbol embedding device according to the present invention can embed a hidden symbol in an actual object or in a virtual three-dimensional model, and is therefore useful as a symbol embedding device using computer graphics technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A polygon mesh setting unit sets a polygon mesh representing the surface shape of an object in a virtual three-dimensional space. A polygon specifying unit sets, on the polygons constituting the polygon mesh, a region where a hidden pattern is to be embedded, and specifies the polygons positioned in that region as specific polygons. A direction changing unit changes the orientation of the specific polygons so that the specular reflection direction for light from a given ray direction points toward a given line-of-sight direction.

Description

Specification

Symbol embedding program, symbol embedding device, and symbol embedding method

Technical Field

[0001] The present invention relates to a symbol embedding program, a symbol embedding device, and a symbol embedding method using computer graphics technology, and in particular to a symbol embedding program, symbol embedding device, and symbol embedding method for embedding a predetermined hidden symbol in a polygon mesh arranged in a virtual three-dimensional space.

Background Art

[0002] In recent years, from the viewpoint of copyright protection, various attempts have been made to protect copyrighted works by embedding hidden symbols in the physical objects subject to copyright and in three-dimensional shape models generated in a virtual three-dimensional space.

[0003] For example, Patent Document 1 discloses a technique for embedding a digital watermark in a three-dimensional shape model by executing: a step of setting parameters for embedding watermark information; a step of inputting the pre-conversion data of the three-dimensional shape model; a step of selecting, from among the curved-surface meshes of the input three-dimensional shape model, the meshes in which watermark information is to be embedded and the meshes in which it is not; and a step of newly generating, for a predetermined mesh in the set of selected curved-surface meshes, control points having the division ratio specified by the embedding parameters, and generating a divided mesh whose control points lie on the predetermined curved surface and which has the same number of control points as the predetermined curved-surface mesh.

[0004] However, with the technique described in Patent Document 1, the digital watermark information embedded in the three-dimensional shape model cannot be obtained unless a predetermined restoration process is applied to the model, so there is the problem that the watermark information cannot be visually recognized directly from the three-dimensional shape model.

Patent Document 1: Japanese Patent Laid-Open No. 2003-99805

Disclosure of the Invention

[0005] An object of the present invention is to provide a symbol embedding program, a symbol embedding device, and a symbol embedding method capable of embedding a hidden symbol in an actual object or in a virtual three-dimensional model.

[0006] A symbol embedding program according to one aspect of the present invention causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; specifying means for specifying, as specific polygons, the polygons located in a region for embedding a predetermined hidden symbol from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygons so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.

[0007] A symbol embedding device according to another aspect of the present invention includes: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; specifying means for specifying, as specific polygons, the polygons located in a region for embedding a predetermined hidden symbol from among the polygons constituting the polygon mesh; and changing means for changing the orientation of the specific polygons so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.

[0008] A symbol embedding method according to still another aspect of the present invention includes: a setting step in which a computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object having irregularities repeatedly formed on its surface; a specifying step of specifying, as specific polygons, the polygons located in a region for embedding a predetermined hidden symbol from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of the specific polygons so that the specular reflection direction for light from a predetermined ray direction faces a predetermined line-of-sight direction.

[0009] According to each of the above configurations, a polygon mesh representing the surface shape of an object in which irregularities are repeatedly formed on the surface is set in a virtual three-dimensional space, the polygons located in the region where a hidden symbol is to be embedded are specified as specific polygons, and the orientation of the specific polygons is changed so that the specular reflection direction of light irradiated from the predetermined ray direction faces the predetermined line-of-sight direction. As a result, the portion corresponding to the specific polygons can be visually recognized as a hidden symbol only when light is irradiated from the specific direction and that portion is observed from the specific direction, so a hidden symbol can be embedded in an actual object or in a virtual three-dimensional model.

Brief Description of the Drawings

[0010]
[Fig. 1] A block diagram showing the configuration of a symbol embedding device according to a first embodiment of the present invention.
[Fig. 2] A flowchart for explaining the symbol embedding processing performed by the symbol embedding device shown in Fig. 1.
[Fig. 3] A schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space.
[Fig. 4] A diagram showing how specific polygons are specified from the polygon mesh.
[Fig. 5] A schematic diagram for explaining how the orientation of a specific polygon is changed.
[Fig. 6] A diagram showing an example of the polygon mesh rendered when the ray direction and the line-of-sight direction are not set to directions in which the hidden symbol can be visually recognized.
[Fig. 7] A schematic diagram showing an example of the positional relationship between three ray directions and the line-of-sight direction.
[Fig. 8] A diagram showing an example of a rendering result of the polygon mesh for a ray direction.
[Fig. 9] A diagram showing another example of a rendering result of the polygon mesh for a ray direction.
[Fig. 10] A diagram showing another example of a rendering result of the polygon mesh for a ray direction.
[Fig. 11] A cross-sectional view of an example of a polygon mesh in which a hidden symbol is embedded.
[Fig. 12] A flowchart for explaining the processing by which the texture generation unit shown in Fig. 1 generates a base texture.
[Fig. 13] A schematic diagram showing sample points arranged in a grid in the virtual three-dimensional space.
[Fig. 14] A schematic diagram for explaining the ray direction and the line-of-sight direction.
[Fig. 15] A schematic diagram of the sample points arranged on the XY plane as viewed from the Z direction.
[Fig. 16] A schematic diagram of the virtual three-dimensional space in which the sample points are set.
[Fig. 17] A diagram showing an example of the base texture as viewed from the Z direction.
[Fig. 18] A schematic diagram for explaining how a plurality of hidden symbols are embedded in a polygon mesh by a symbol embedding device according to a second embodiment of the present invention.
[Fig. 19] A schematic diagram showing the relationship between the ray directions and the line-of-sight direction in the second embodiment of the present invention.
[Fig. 20] A schematic diagram for explaining how the orientation of a specific polygon is changed.

発明を実施するための最良の形態 [0011] 以下、図面を参照して本発明の各実施の形態について説明する。 BEST MODE FOR CARRYING OUT THE INVENTION Hereinafter, embodiments of the present invention will be described with reference to the drawings.

[0012] (第 1の実施の形態)  [0012] (First embodiment)

図 1は、本発明の第 1の実施の形態による図柄埋込装置のブロック構成図を示して いる。この図柄埋込装置は、公知のコンピュータ力も構成され、処理部 10、記憶部 2 0、入力部 30、表示部 40、及び光学特性取得装置 50を備えている。処理部 10は、 CPU力 構成され、ポリゴンメッシュ設定部 11、ポリゴン特定部 12、目標方向算出部 13、方向変更部 14、レンダリング処理部 15、テクスチャ生成部 16、及び 3次元デー タ出力部 17の機能を備えている。記憶部 20は、ハードディスク等の記憶装置から構 成され、ベーステクスチャ記憶部 21、図柄記憶部 22、及び光学特性記憶部 23の機 能を備えている。  FIG. 1 shows a block configuration diagram of a symbol embedding apparatus according to a first embodiment of the present invention. This symbol embedding device is also configured with a known computer power, and includes a processing unit 10, a storage unit 20, an input unit 30, a display unit 40, and an optical characteristic acquisition device 50. The processing unit 10 is configured with CPU power, and includes a polygon mesh setting unit 11, a polygon specifying unit 12, a target direction calculating unit 13, a direction changing unit 14, a rendering processing unit 15, a texture generating unit 16, and a 3D data output unit 17 It has the function of. The storage unit 20 includes a storage device such as a hard disk, and includes functions of a base texture storage unit 21, a symbol storage unit 22, and an optical characteristic storage unit 23.

[0013] なお、ポリゴンメッシュ設定部 11〜3次元データ出力部 17及びベーステクスチャ記 憶部 21〜光学特性記憶部 23は、 CPUが記録媒体であるハードディスクに格納され た図柄埋込プログラムを実行することで実現される。  The polygon mesh setting unit 11 to the three-dimensional data output unit 17 and the base texture storage unit 21 to the optical characteristic storage unit 23 execute a symbol embedding program stored in a hard disk as a recording medium by the CPU. This is realized.

[0014] 光学特性取得装置 50は、本出願人によって発明された公知の装置であり、現物の 試料に対して光の照射方向と撮影方向とを変更しながら、試料を撮影していき、得ら れた試料の画像から BRDF (双方向反射率分布関数: Bidirectional Reflectance distr ibution fonction)を生成する装置である。詳しい内容は特開 2005— 115645号に開 示されている。光学特性取得装置 50は、ベーステクスチャの作成対象 (モデリング対 象)となる物体 (素材)の BRDFを生成する。光学特性記憶部 23は、光学特性取得 装置 50が生成した BRDFを記憶する。  [0014] The optical characteristic acquisition device 50 is a known device invented by the applicant of the present invention, and obtains a sample by photographing the sample while changing the light irradiation direction and the photographing direction with respect to the actual sample. This is a device that generates a BRDF (bidirectional reflection distribution function) from the sample image. Detailed contents are disclosed in JP-A-2005-115645. The optical characteristic acquisition device 50 generates a BRDF of an object (material) that is a base texture creation target (modeling target). The optical property storage unit 23 stores the BRDF generated by the optical property acquisition device 50.

[0015] ここで、 BRDFは、光線方向と視線方向とを入力とし、光線方向と視線方向とに対 応付けられた反射率を出力する関数である。本実施の形態では、 BRDFは、入出力 関係が LUT (ルックアップテーブル)形式で表されている。反射率は、仮想光源から 照射される光の光量に対する反射光の光量の比率により表されている。なお、 BRD F記憶部 21は、光学特性取得装置 50が取得した BRDF以外の BRDFを記憶しても よい。ここで、光学特性取得装置 50により取得された BRDF以外の BRDFとは、例え ばユーザが BRDFを設定するためのアプリケーションソフトウェアを用いて、数値入 力する等して作成された BRDFや、数式によって表される BRDFが含まれる。 [0016] テクスチャ生成部 16は、ベーステクスチャの作成対象となる物体の BRDFを光学特 性記憶部 23から読み出し、読み出した BRDFに従って当該物体の 3次元形状を表 すベーステクスチャを生成し、ベーステクスチャ記憶部 21に記憶させる。このようにし て、ベーステクスチャ記憶部 21は、表面に細かな凹凸(シボ)が一定のパターンで繰 り返し形成された物体の表面形状をモデリングして得られたベーステクスチャを記憶 する。 [0015] Here, BRDF is a function that receives the light ray direction and the line-of-sight direction as input, and outputs a reflectance associated with the light ray direction and the line-of-sight direction. In the present embodiment, the BRDF represents the input / output relationship in the LUT (lookup table) format. The reflectance is represented by the ratio of the amount of reflected light to the amount of light emitted from the virtual light source. The BRD F storage unit 21 may store a BRDF other than the BRDF acquired by the optical characteristic acquisition device 50. Here, the BRDF other than the BRDF acquired by the optical characteristic acquisition device 50 is, for example, a BRDF created by inputting numerical values using an application software for setting the BRDF by the user, or a mathematical expression. Contains the represented BRDF. [0016] The texture generation unit 16 reads the BRDF of the object for which the base texture is to be created from the optical property storage unit 23, generates a base texture representing the three-dimensional shape of the object according to the read BRDF, and generates the base texture. Store in storage unit 21. In this way, the base texture storage unit 21 stores a base texture obtained by modeling the surface shape of an object in which fine irregularities (textures) are repeatedly formed in a constant pattern on the surface.

[0017] ベーステクスチャは、複数のサンプル点力も構成されている。各サンプル点は、仮 想 3次元空間に設定された X軸の値を示す X成分、 Y軸の値を示す Y成分、及び Z軸 の値を示す Z成分の 3つの成分から構成されている。また、各サンプル点の X成分及 ひ Ύ成分の値は、 XY平面上への射影が正方格子状に配列されるように定められ、各 サンプル点の Z成分の値は、モデリングされた物体の高さデータを示して!/、る。  [0017] The base texture also includes a plurality of sample point forces. Each sample point consists of three components: an X component indicating the X-axis value, a Y component indicating the Y-axis value, and a Z component indicating the Z-axis value set in the virtual three-dimensional space. . In addition, the values of the X and Ύ components at each sample point are determined so that the projections on the XY plane are arranged in a square grid, and the value of the Z component at each sample point is the value of the modeled object. Show height data!

[0018] 入力部 30は、キーボード、マウス等の公知の入力装置力も構成されて 、る。ポリゴ ンメッシュ設定部 11は、入力部 30を用いてユーザにより指定されたベーステクスチャ をベーステクスチャ記憶部 21から読み出し、読み出したベーステクスチャを構成する 各サンプル点を仮想 3次元空間内にプロットする。次に、ポリゴンメッシュ設定部 11は 、隣接するサンプル点間を直線で結び、三角形又は四角形等のポリゴンから構成さ れるポリゴンメッシュを仮想 3次元空間内に設定する。これにより、仮想 3次元空間内 に細かな凹凸が一定のパターンで繰り返し形成された物体の表面形状が再現される  [0018] The input unit 30 includes a known input device force such as a keyboard and a mouse. The polygon mesh setting unit 11 reads the base texture specified by the user from the base texture storage unit 21 using the input unit 30, and plots each sample point constituting the read base texture in the virtual three-dimensional space. Next, the polygon mesh setting unit 11 connects adjacent sample points with straight lines, and sets a polygon mesh composed of polygons such as triangles or quadrangles in the virtual three-dimensional space. This reproduces the surface shape of an object in which fine irregularities are repeatedly formed in a fixed pattern in a virtual three-dimensional space.

[0019] 図柄記憶部 22は、ベーステクスチャに埋め込まれる隠し図柄の画像データを示す 隠し図柄データを予め記憶している。ここで、隠し図柄データは、描画ソフトウェア等 を用いてユーザにより予め作成されたもの、ユーザによりインターネットを介して取り 込まれたもの、或いは本図柄埋込プログラムの提供者により予め作成されたもの等を 含む。 [0019] The symbol storage unit 22 stores in advance hidden symbol data indicating image data of a hidden symbol embedded in the base texture. Here, the hidden symbol data is created in advance by the user using drawing software or the like, captured by the user via the Internet, or previously created by the provider of this symbol embedding program, etc. including.

[0020] ポリゴン特定部 12は、入力部 30を用いてユーザにより指定された隠し図柄データ を図柄記憶部 22から読み出し、ポリゴンメッシュ設定部 11により設定されたポリゴンメ ッシュを構成するポリゴンのうち、読み出した隠し図柄データが埋め込まれる領域内 に位置するポリゴンを特定ポリゴンとして特定する。 [0021] 目標方向算出部 13は、入力部 30を用いてユーザにより指定された光線方向と視 線方向との角度を二等分する直線の方向を目標方向として算出する。方向変更部 1 4は、特定ポリゴンのうち、 目標方向に対する法線ベクトルの角度が規定値以下の特 定ポリゴンを抽出し、抽出した特定ポリゴンの法線ベクトルが目標方向と一致するよう に当該特定ポリゴンの向きを変更する。 [0020] The polygon specifying unit 12 reads out the hidden symbol data designated by the user using the input unit 30 from the symbol storage unit 22, and reads out the polygons constituting the polygon mesh set by the polygon mesh setting unit 11. The polygon located in the area where the hidden symbol data is embedded is specified as a specific polygon. The target direction calculation unit 13 calculates, as the target direction, a straight line direction that bisects the angle between the light ray direction and the line-of-sight direction specified by the user using the input unit 30. The direction changing unit 14 extracts a specific polygon whose normal vector angle with respect to the target direction is less than or equal to a specified value from the specific polygon, and specifies the normal vector of the extracted specific polygon so that it matches the target direction. Change the orientation of the polygon.

[0022] レンダリング処理部 15は、ベーステクスチャのモデリング対象となった物体の BRD Fを用いて、 目標方向算出部 13により向きが変更された特定ポリゴンをレンダリングし 、表示部 40に表示する。表示部 40は、 CRT、液晶パネル、プラズマパネル等の公 知の表示装置から構成されている。なお、レンダリング処理部 15は、入力部 30を用 V、てユーザにより入力された視線方向と光線方向とに従ってレンダリングする。  The rendering processing unit 15 renders the specific polygon whose direction has been changed by the target direction calculation unit 13 using the BRD F of the object that is the base texture modeling target, and displays it on the display unit 40. The display unit 40 includes a known display device such as a CRT, a liquid crystal panel, or a plasma panel. The rendering processing unit 15 performs rendering according to the line-of-sight direction and the ray direction input by the user using the input unit 30.

[0023] 3次元データ出力部 17は、方向変更部 14により特定ポリゴンの向きが変更された ポリゴンメッシュ PMの各サンプル点の位置を表す 3次元データから NC (Numerical C ontrol)データを生成し、光造形装置 60に出力する。  [0023] The three-dimensional data output unit 17 generates NC (Numerical Control) data from the three-dimensional data representing the position of each sample point of the polygon mesh PM in which the direction of the specific polygon has been changed by the direction changing unit 14, Output to stereolithography apparatus 60.

[0024] 光造形装置 60は、公知の光造形装置から構成され、 3次元データ出力部 17により 生成された NCデータに従って、ポリゴンメッシュ PMの形状を榭脂に形成する。なお 、光造形装置に代えて、コンピュータ上で作成された物体の形状を表す 3次元データ 力もその形状を現物の物体上に形成することができる装置であれば、光造形装置以 外の装置を用いてもよい。  The stereolithography apparatus 60 is composed of a well-known stereolithography apparatus, and forms the shape of the polygon mesh PM into a grease according to the NC data generated by the three-dimensional data output unit 17. In place of the stereolithography device, a device other than the stereolithography device can be used as long as the device can form the shape on the actual object with a three-dimensional data force representing the shape of the object created on the computer. It may be used.

[0025] 本実施の形態では、ポリゴンメッシュ設定部 11が設定手段の一例に相当し、ポリゴ ン特定部 12が特定手段の一例に相当し、 目標方向算出部 13及び方向変更部 14が 変更手段の一例に相当し、レンダリング処理部 15がレンダリング処理手段の一例に 相当し、テクスチャ生成部 16がポリゴンメッシュ生成手段の一例に相当し、 3次元デ ータ出力部 17が出力手段の一例に相当する。  In the present embodiment, the polygon mesh setting unit 11 corresponds to an example of a setting unit, the polygon specifying unit 12 corresponds to an example of a specifying unit, and the target direction calculating unit 13 and the direction changing unit 14 are changing units. The rendering processing unit 15 corresponds to an example of a rendering processing unit, the texture generation unit 16 corresponds to an example of a polygon mesh generation unit, and the 3D data output unit 17 corresponds to an example of an output unit. To do.

[0026] 次に、図 1に示す図柄埋込装置の処理について図 2に示すフローチャートを用いて 説明する。まず、ステップ S1において、ポリゴンメッシュ設定部 11は、入力部 30がべ ーステクスチャを指定するための操作指令を受け付けたとき、指定されたベーステク スチヤをベーステクスチャ記憶部 21から読み出し、ベーステクスチャを取得する。  Next, the process of the symbol embedding apparatus shown in FIG. 1 will be described using the flowchart shown in FIG. First, in step S1, when the input unit 30 receives an operation command for designating a base texture, the polygon mesh setting unit 11 reads the designated base texture from the base texture storage unit 21 and acquires the base texture. .

[0027] 次に、ステップ S2において、ポリゴンメッシュ設定部 11は、ステップ S1において読 み出されたベーステクスチャを構成する各サンプル点を仮想 3次元空間内にプロット し、各サンプル点を直線で結び、仮想 3次元空間内にポリゴンメッシュを設定する。 [0027] Next, in step S2, the polygon mesh setting unit 11 reads in step S1. Each sample point composing the extracted base texture is plotted in the virtual 3D space, each sample point is connected by a straight line, and a polygon mesh is set in the virtual 3D space.

[0028] 図 3は、仮想 3次元空間内に設定されたポリゴンメッシュの一例を示す模式図である 。図 3に示すようにポリゴンメッシュ PMは、サンプル点 Pを頂点とする複数のポリゴン P Lから構成されている。各サンプル点 Pは、 XY平面上へ射影したとき、各射影点 SP' がメゾスケール (ミクロンオーダー以上ミリオーダー以下のスケール)の間隔 dで正方 格子状に配列されるように X成分と Y成分との値が定められて ヽる。  FIG. 3 is a schematic diagram showing an example of a polygon mesh set in a virtual three-dimensional space. As shown in FIG. 3, the polygon mesh PM is composed of a plurality of polygons P L having the sample point P as a vertex. When each sample point P is projected onto the XY plane, the X and Y components are arranged so that each projected point SP 'is arranged in a square grid with an interval d of mesoscale (scale of micron order to millimeter order). The value of is determined.

[0029] なお、各射影点の間隔 dとしては、 10 μ m以上 100 μ m以下が好ましい。また、ポ リゴンメッシュ PMは、表面に凹凸が繰り返し形成された物体の表面形状を表すもの であるため、各サンプル点 Pの Z成分は分散して 、ることが分かる。  [0029] The distance d between the projection points is preferably 10 μm or more and 100 μm or less. Polygon mesh PM represents the surface shape of an object with irregularities repeatedly formed on the surface, and it can be seen that the Z component of each sample point P is dispersed.

[0030] 次に、ステップ S3において、ポリゴンメッシュ設定部 11は、各サンプル点 Pの Z成分 にノイズを付与し、ポリゴンメッシュ PMの表面の凹凸をより分散させる。  [0030] Next, in step S3, the polygon mesh setting unit 11 adds noise to the Z component of each sample point P to further disperse the irregularities on the surface of the polygon mesh PM.

[0031] Next, in step S4, the polygon specifying unit 12 reads the hidden pattern data designated by the user from the pattern storage unit 22, sets on the polygon mesh PM a region in which the read hidden pattern data is to be embedded, and specifies, among the polygons constituting the polygon mesh PM, the polygons PL located within the set region as specific polygons TPL.

[0032] FIG. 4 is a diagram showing how the specific polygons TPL are specified from the polygon mesh PM. As shown in FIG. 4, a region D1 in which a hidden pattern representing the letter D is to be embedded is set on the polygon mesh PM. The polygons PL located within the region D1 are then specified as the specific polygons TPL.

[0033] Next, in step S5, the target direction calculation unit 13 sets, in accordance with the user's operation command received by the input unit 30, the light ray direction and the viewing direction from which the hidden pattern G is to be visible. Next, in step S6, as shown in FIG. 5, the target direction calculation unit 13 calculates, as the target direction OD, the direction of the straight line bisecting the angle θ1 formed between the light ray direction LD and the viewing direction VD.
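The bisection in step S6 can be sketched in a few lines. The following is a minimal Python illustration and not part of the disclosure: the function names and the representation of directions as 3-component tuples are assumptions, and the target direction OD is obtained as the normalized sum of the unit vectors LD and VD (the half vector of the two directions).

```python
import math

def normalize(v):
    # Scale a 3-component direction to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def target_direction(light_dir, view_dir):
    # Step S6: the target direction OD is the direction of the straight
    # line bisecting the angle theta-1 between the light ray direction LD
    # and the viewing direction VD, i.e. the normalized sum of the two
    # unit vectors.
    l = normalize(light_dir)
    v = normalize(view_dir)
    return normalize(tuple(lc + vc for lc, vc in zip(l, v)))
```

For example, with LD and VD symmetric about the Z axis, the computed OD points straight up along Z.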

[0034] Next, in step S7, the direction changing unit 14 obtains the normal vector n of each specific polygon TPL as shown in FIG. 5, and extracts the specific polygons TPL for which the angle φ between the normal vector n and the target direction OD is equal to or less than a predetermined value (a value between 10 and 20 degrees). The predetermined value is adjusted as appropriate according to how dispersed the normal directions of the polygons PL are and how distinctly the hidden pattern should be made visible given the BRDF of the material.

[0035] Next, in step S8, the direction changing unit 14 changes the Z-component values of the sample points P constituting each specific polygon TPL extracted in step S7 so that the normal vector n of that specific polygon TPL coincides with the target direction OD, thereby changing the orientation of the specific polygon TPL.
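Step S7 amounts to an angle-threshold filter over the normals of the specific polygons. A minimal Python sketch follows, with illustrative function names and normals as 3-component tuples; the threshold corresponds to the predetermined value of 10 to 20 degrees:

```python
import math

def normalize(v):
    # Scale a 3-component direction to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def angle_deg(a, b):
    # Angle, in degrees, between two directions; the dot product is
    # clamped against rounding error before acos.
    dot = sum(x * y for x, y in zip(normalize(a), normalize(b)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def extract_specific_polygons(normals, od, threshold_deg=15.0):
    # Step S7: keep the indices of the specific polygons TPL whose normal
    # vector n lies within threshold_deg of the target direction OD.
    # Step S8 then snaps each kept normal onto OD by moving the Z
    # components of the polygon's sample points.
    return [i for i, n in enumerate(normals)
            if angle_deg(n, od) <= threshold_deg]
```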

[0036] Next, in step S9, the rendering processing unit 15 renders the reoriented polygon mesh PM using the BRDF of the object modeled by the polygon mesh PM, and causes the display unit 40 to display the result.

[0037] Alternatively, in step S10 in place of step S9, the three-dimensional data output unit 17 converts the three-dimensional data of the sample points P of the reoriented polygon mesh PM into NC data and outputs the NC data to the stereolithography apparatus 60. The stereolithography apparatus 60 forms a resin according to the NC data, thereby forming in the resin a grain in which the hidden pattern is embedded.

[0038] FIG. 6 is a diagram showing an example of the polygon mesh PM rendered when the light ray direction and the viewing direction are not set to the directions from which the hidden pattern is visible. In FIG. 6, since the light ray direction and the viewing direction are not set to those directions, only the surface shape of the modeled object is displayed and the hidden pattern G is not displayed.

[0039] FIG. 7 is a schematic diagram showing an example of the positional relationship between three light ray directions LD1 to LD3 and the viewing direction VD, and FIGS. 8 to 10 are diagrams each showing an example of the rendering result (rendered image) of the polygon mesh PM for the corresponding light ray direction LD1 to LD3. When the normal vectors of the specific polygons coincide with the target direction bisecting the angle formed between the light ray direction LD2 and the viewing direction VD, the light ray directions LD1 and LD3 are, as shown in FIG. 7, not directions from which the hidden pattern is visible; therefore, even if the polygon mesh PM is rendered with the light ray direction set to LD1 or LD3, the hidden pattern G is not displayed, as shown in FIG. 8 or FIG. 10. On the other hand, when the polygon mesh PM is rendered with the light ray direction and the viewing direction set to LD2 and VD, the hidden pattern G consisting of the letters DFL is displayed, as shown in FIG. 9.

[0040] FIG. 11 shows a cross-sectional view of an example of the polygon mesh PM in which a hidden pattern is embedded. The plane K1 shown in FIG. 11 is the polygon surface of a specific polygon TPL whose orientation was changed in step S8. Since the specular reflection direction, at the plane K1, of light from the light ray direction LD2 is the viewing direction VD, when rendering is performed with the light ray direction and the viewing direction set to LD2 and VD, the specularly reflected light from the plane K1 travels toward the viewing direction VD, and as a result the hidden pattern G is displayed. On the other hand, when rendering is performed with the light ray direction set to a direction other than LD2, for example LD1, only diffusely reflected light from the plane K1 travels toward the viewing direction VD, so the hidden pattern G is not displayed.
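The geometry of FIG. 11 is the ordinary mirror-reflection law: the hidden pattern is visible exactly when the reflection of the light direction about the polygon normal coincides with the viewing direction. A minimal Python sketch, with illustrative names; both directions are unit vectors pointing away from the surface:

```python
import math

def normalize(v):
    # Scale a 3-component direction to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def reflect(to_light, normal):
    # Mirror the unit direction toward the light, L, about the surface
    # normal n: R = 2 (L . n) n - L.  When n is the half vector of L and
    # the viewing direction V (the reoriented plane K1), R equals V and
    # the specular lobe reaches the viewer.
    l = normalize(to_light)
    n = normalize(normal)
    d = sum(x * y for x, y in zip(l, n))
    return tuple(2.0 * d * nc - lc for lc, nc in zip(l, n))
```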

[0041] Next, the processing in which the texture generation unit 16 shown in FIG. 1 generates a base texture will be described with reference to the flowchart shown in FIG. 12. First, in step S21, when the input unit 30 receives a user operation command designating one BRDF from the optical characteristic storage unit 23, the texture generation unit 16 reads the designated BRDF from the optical characteristic storage unit 23, thereby acquiring the BRDF.

[0042] Next, in step S22, the texture generation unit 16 arranges a plurality of sample points in a lattice in the virtual three-dimensional space and calculates, using the BRDF, the specular reflection direction of light when each sample point is irradiated with light from a virtual light source.

[0043] FIG. 13 is a schematic diagram showing the sample points P arranged in a lattice in the virtual three-dimensional space. As shown in FIG. 13, mutually orthogonal X, Y, and Z axes are set in the virtual three-dimensional space; the Z axis indicates the vertical direction and the XY plane indicates the horizontal plane. The texture generation unit 16 arranges the sample points P in a lattice on the XY plane in the virtual three-dimensional space, at the mesoscale interval d.

[0044] Hereinafter, among the plurality of sample points P arranged on the XY plane, the sample point P to be processed is referred to as the sample point of interest CP. The texture generation unit 16 obtains the light ray direction LD from the virtual light source VL toward the sample point of interest CP, inputs the obtained light ray direction LD into the BRDF acquired in step S21, varies the viewing direction VD, obtains the reflectance for each viewing direction VD, and calculates, as the specular reflection direction of light at the sample point of interest CP, the viewing direction that yields the maximum of the obtained reflectances. In the example of FIG. 13, the arrows labeled VD are drawn longer as the reflectance is higher, so the viewing direction VDMAX is calculated as the specular reflection direction RD. The specular reflection direction RD is calculated for the other sample points P in the same manner. In FIG. 13, the normal vector n at the sample point of interest CP is drawn exaggerated relative to the normal vectors n at the other sample points P.

[0045] FIG. 14 is a schematic diagram for explaining the light ray direction LD and the viewing direction VD. As shown in FIG. 14, when an X' axis (parallel to the X axis), a Y' axis (parallel to the Y axis), and a Z' axis (parallel to the Z axis) are set with the sample point of interest CP as the origin, the light ray direction LD is expressed by the angle α formed between the projection LD' of the light ray direction LD onto the XY plane and the X' axis, and by the angle β formed between the projection LD' and the light ray direction LD. Similarly, the viewing direction VD is expressed by the angle γ formed between the projection VD' of the viewing direction VD onto the XY plane and the X' axis, and by the angle δ formed between the projection VD' and the viewing direction VD.

[0046] Here, the texture generation unit 16 varies the viewing direction VD by changing the angle δ within a range of 0 to 90 degrees at a predetermined resolution (for example, 5 degrees) and changing the angle γ within a range of −90 to 90 degrees at a predetermined resolution (for example, 10 degrees).
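The scan over viewing directions in step S22, at the resolutions given above, can be sketched as an exhaustive grid search. A minimal Python illustration follows; `brdf` is an assumed callable that returns a reflectance for a light direction and a (γ, δ) viewing direction, not an interface defined by the disclosure:

```python
def specular_direction(brdf, light_dir, delta_step=5, gamma_step=10):
    # Step S22: vary delta over 0..90 degrees and gamma over -90..90
    # degrees at the stated resolutions, and return the viewing direction
    # (gamma, delta) that yields the maximum reflectance.
    best, best_r = None, float("-inf")
    for delta in range(0, 91, delta_step):
        for gamma in range(-90, 91, gamma_step):
            r = brdf(light_dir, (gamma, delta))
            if r > best_r:
                best, best_r = (gamma, delta), r
    return best
```

A sharper BRDF simply produces a more sharply peaked reflectance over this grid, so the returned direction approximates the specular reflection direction RD to within the grid resolution.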

[0047] Referring again to FIG. 12, in step S23, the texture generation unit 16 calculates, as the normal vector n to the plane containing the sample point of interest CP, the direction bisecting the angle θ2 formed between the specular reflection direction RD and the light ray direction LD of the sample point of interest CP shown in FIG. 13.

[0048] Next, in step S24, the texture generation unit 16 calculates relief information for the sample points P as follows. FIG. 15 is a schematic diagram of the sample points P arranged on the XY plane as viewed from the Z direction, and FIG. 16 is a schematic diagram of the virtual three-dimensional space in which the sample points P are set.

[0049] First, as shown in FIG. 15, the texture generation unit 16 specifies, as the polygon of interest, the polygon PL1 located at the lower left of the two polygons constituting the lower-left cell K1. Next, as shown in FIG. 16, among the three sample points P constituting the polygon PL1, the texture generation unit 16 moves the sample points P2 and P3 in the Z direction so that the orientation of the polygon PL1 becomes orthogonal to the normal vector n1 of the lower-left sample point P1, and calculates the Z-component values of the moved sample points P2 and P3 as the relief information of the sample points P2 and P3.

[0050] Next, as shown in FIG. 15, the texture generation unit 16 specifies, as the polygon of interest, the polygon PL2 located at the lower left of the polygons constituting the cell K2 adjacent to the cell K1, and, as shown in FIG. 16, moves the sample points P4 and P5 in the Z direction so that the orientation of the polygon PL2 becomes orthogonal to the normal vector n2 of the sample point P2 at the lower left of the three sample points P constituting the polygon PL2, and calculates the Z-component values of the moved sample points P4 and P5 as the relief information of the sample points P4 and P5.

[0051] Next, as shown in FIG. 15, the texture generation unit 16 specifies, as the polygon of interest, the polygon PL3 in the cell K3 adjacent above the cell K1, and, as shown in FIG. 16, moves the sample point P6 in the Z direction so that the orientation of the polygon PL3 becomes orthogonal to the normal vector n3 of the sample point P3, and calculates the Z-component value of the moved sample point P6 as the relief information of the sample point P6.

[0052] Thereafter, in the same manner, the texture generation unit 16 specifies, as the polygon of interest, the polygon PL4 in the cell K4 adjacent to the right of the cell K2, and changes the orientation of the polygon PL4 in the same manner as the polygon PL2. In this way, the texture generation unit 16 sequentially specifies polygons of interest starting from the polygon PL1 of the lower-left cell K1 so as to meander diagonally upward to the right, changes the orientation of each specified polygon of interest as described above, calculates the changed Z-component value of each sample point as the relief information of that sample point, and thereby generates a three-dimensional texture.
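The vertex movement in step S24 reduces to evaluating a plane height: once one vertex of the polygon of interest has a fixed Z and the target normal is known, each remaining vertex is moved onto the plane through that vertex orthogonal to the normal. A minimal Python sketch with illustrative names; it assumes a normal with a nonzero Z component, as for the near-horizontal polygons here:

```python
def project_height(anchor, normal, x, y):
    # Height Z of the plane through anchor = (x1, y1, z1) with normal
    # (nx, ny, nz), evaluated at grid position (x, y).  In step S24 the
    # other sample points of the polygon of interest are moved to this Z
    # so that the polygon becomes orthogonal to the per-point normal.
    x1, y1, z1 = anchor
    nx, ny, nz = normal
    return z1 - (nx * (x - x1) + ny * (y - y1)) / nz
```

Sweeping this computation across the grid in the meandering order described above propagates the heights from cell to cell without gaps.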

[0053] FIG. 17 is a diagram showing an example of a texture generated by the texture generation unit 16. As shown in FIG. 17, the surface shape of the object is realistically reproduced. Note that the texture generation unit 16 may instead start from a polygon of the lower-right, upper-left, or upper-right cell and sequentially specify polygons of interest so as to meander diagonally toward the upper left, the lower right, or the lower left, respectively.

[0054] As described above, according to the present pattern embedding device, the specific polygons located within the region in which the hidden pattern is to be embedded are specified, and the orientation of each specific polygon is changed so that the specular reflection direction of light irradiated from a specific light ray direction faces a specific viewing direction. As a result, when the polygon mesh is rendered using the BRDF of the object to be modeled, the hidden pattern can be displayed only when rendering is performed with the light ray direction and the viewing direction set to that specific light ray direction and viewing direction.

[0055] Therefore, a person who does not know the specific light ray direction and viewing direction cannot cause the embedded hidden pattern to be displayed, so the hidden pattern can be concealed in the polygon mesh. On the other hand, a person who knows the specific light ray direction and viewing direction can display the hidden pattern by setting that light ray direction and viewing direction and rendering the polygon mesh, and can immediately view the hidden pattern embedded in the polygon mesh without performing a restoration process such as that of Patent Document 1.

[0056] Further, according to the present pattern embedding device, a hidden pattern can be embedded in a resin on which a grain is formed; when the grained resin is irradiated with light from a specific direction (the light ray direction) and observed from a specific direction (the viewing direction), the hidden pattern appears. On the other hand, when a third party imitates the grain and forms a resin, the hidden pattern is not embedded in that resin, so the hidden pattern does not appear even when light is irradiated from the specific direction and the resin is observed from the specific direction. This makes it possible to prevent imitation of the grain by third parties.

[0057] Note that although noise is added to the Z components in step S3 of FIG. 2, the present invention is not limited to this, and the processing shown in step S3 may be omitted. Also, although the above description concerns preventing imitation of a grain, the present invention is not limited to this; for an object bearing a three-dimensional or two-dimensional decoration in which a grain is expressed over all or part of the object's surface, the decoration can also be protected from imitation by third parties by embedding a hidden pattern in the region in which the grain is expressed.

[0058] (Second Embodiment)

Whereas the pattern embedding device according to the first embodiment embeds one type of hidden pattern in the polygon mesh PM, the pattern embedding device according to the second embodiment is characterized by embedding a plurality of types of hidden patterns in the polygon mesh. Since the pattern embedding device according to the second embodiment has substantially the same configuration as that of the first embodiment, it will be described with reference to the block diagram of the pattern embedding device according to the first embodiment shown in FIG. 1.

[0059] Hereinafter, a case in which three types of hidden patterns G1 to G3 are embedded in the polygon mesh PM will be described as an example. FIG. 18 is a schematic diagram for explaining how the hidden patterns G1 to G3 are embedded in the polygon mesh, and FIG. 19 is a schematic diagram showing the relationship between the light ray directions and the viewing direction in the second embodiment.

[0060] As shown in FIG. 18, the hidden pattern G1 is a pattern representing the letter "D", the hidden pattern G2 is a pattern representing the letter "F", and the hidden pattern G3 is a pattern representing the letter "L". The light ray directions from which the hidden patterns G1 to G3 are respectively visible from the viewing direction VD shown in FIG. 19 are the light ray directions LD1 to LD3 shown in FIG. 19. In this case, as shown in FIG. 18, the polygon specifying unit 12 specifies, on the polygon mesh PM, the regions D1 to D3 in which the hidden patterns G1 to G3 are respectively embedded, and specifies the polygons located within the regions D1 to D3 as the specific polygons TPL1 to TPL3.

[0061] As shown in FIG. 18, the target direction calculation unit 13 calculates, as the target direction OD1, the direction bisecting the angle formed between the light ray direction LD1 and the viewing direction VD; as the target direction OD2, the direction bisecting the angle formed between the light ray direction LD2 and the viewing direction VD; and as the target direction OD3, the direction bisecting the angle formed between the light ray direction LD3 and the viewing direction VD.

[0062] As shown in FIG. 18, the direction changing unit 14 extracts, from the specific polygons TPL1, those for which the angle between the normal vector n1 and the target direction OD1 is less than the predetermined value; from the specific polygons TPL2, those for which the angle between the normal vector n2 and the target direction OD2 is less than the predetermined value; and from the specific polygons TPL3, those for which the angle between the normal vector n3 and the target direction OD3 is less than the predetermined value. The direction changing unit 14 then changes the orientation of each extracted specific polygon TPL1 so that its normal vector n1 coincides with the target direction OD1, changes the orientation of each extracted specific polygon TPL2 so that its normal vector n2 coincides with the target direction OD2, and changes the orientation of each extracted specific polygon TPL3 so that its normal vector n3 coincides with the target direction OD3.

[0063] When the rendering processing unit 15 sets the light ray directions to LD1 to LD3, sets the viewing direction to VD, and renders the polygon mesh PM using the BRDF, the hidden pattern G1 of "D", the hidden pattern G2 of "F", and the hidden pattern G3 of "L" are displayed on the display unit 40, as shown in FIG. 18. On the other hand, when the rendering processing unit 15 sets only the light ray direction LD1 and the viewing direction VD and renders the polygon mesh PM, only the hidden pattern of "D" in FIG. 18 is displayed on the display unit 40, and the hidden patterns G2 and G3 of "F" and "L" are not displayed. Further, when the rendering processing unit 15 sets the light ray direction to a direction other than LD1 to LD3 and renders the polygon mesh PM, none of the hidden patterns G1 to G3 is displayed.

[0064] As described above, according to the pattern embedding device of the second embodiment, a plurality of hidden patterns are embedded in the polygon mesh PM and the directions from which the respective hidden patterns are visible differ from one another, which makes it even more difficult for a third party to view the hidden patterns embedded in the polygon mesh.

[0065] In the above embodiments, the orientations of the specific polygons TPL and TPL1 to TPL3 are changed so that their normal vectors n and n1 to n3 coincide with the target directions OD and OD1 to OD3, but the present invention is not limited to this. For example, as shown in FIG. 20, the Z component of a sample point SP may be corrected so that the normal vector n' at the sample point SP coincides with the target direction OD, thereby changing the orientations of the specific polygons TPL1 to TPL5 sharing the sample point SP. In this case, the normal vector n' at the sample point SP can be calculated by averaging the normal vectors n1 to n5 of the specific polygons TPL1 to TPL5 containing the sample point SP.
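The averaging just described can be sketched in a few lines of Python (illustrative function name; face normals are 3-component tuples):

```python
import math

def vertex_normal(face_normals):
    # Normal n' at a shared sample point SP: the sum of the normals
    # n1..n5 of the specific polygons containing SP, renormalized to
    # unit length (equivalent to their average direction).
    summed = [sum(cs) for cs in zip(*face_normals)]
    length = math.sqrt(sum(c * c for c in summed))
    return tuple(c / length for c in summed)
```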

[0066] In the second embodiment, when the rendering processing unit 15 performs rendering with the colors of the light ray directions LD1 to LD3 set to mutually different colors, the letters "D", "F", and "L" shown in FIG. 18 can each be displayed in a different color. Although the hidden pattern in the second embodiment consists of three types of hidden patterns, the present invention is not limited to this, and two types, or four or more types, of hidden patterns may be used. Further, although the second embodiment uses a single viewing direction VD, the present invention is not limited to this, and a plurality of viewing directions VD may be set.

[0067] As described above, a pattern embedding program according to one aspect of the present invention causes a computer to function as: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; specifying means for specifying, as specific polygons, polygons located within a region for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and changing means for changing the orientation of each specific polygon so that the specular reflection direction for light from a predetermined light ray direction faces a predetermined viewing direction.

[0068] A pattern embedding device according to another aspect of the present invention comprises: setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; specifying means for specifying, as specific polygons, polygons located within a region for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and changing means for changing the orientation of each specific polygon so that the specular reflection direction for light from a predetermined light ray direction faces a predetermined viewing direction.

[0069] In a pattern embedding method according to another aspect of the present invention, a computer performs: a setting step of setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed; a specifying step of specifying, as specific polygons, polygons located within a region for embedding a predetermined hidden pattern from among the polygons constituting the polygon mesh; and a changing step of changing the orientation of each specific polygon so that the specular reflection direction for light from a predetermined light ray direction faces a predetermined viewing direction.

[0070] According to each of the above configurations, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed is set in a virtual three-dimensional space, the specific polygons located within the region in which the hidden pattern is to be embedded are specified, and the orientation of each specific polygon is changed so that the specular reflection direction of light irradiated from the predetermined light ray direction faces the predetermined viewing direction.

[0071] Therefore, when the three-dimensional data of the polygon mesh with the reoriented polygons is output to a stereolithography apparatus and the shape defined by the polygon mesh is formed in an object such as a resin, the hidden pattern can be observed only when the object is irradiated with light from the specific direction and observed from the specific direction; the hidden pattern can thus be embedded in the object.

[0072] Likewise, the hidden pattern can be observed only when the polygon mesh whose polygon orientations have been changed is rendered with the specific light-ray direction and viewing direction, so a hidden pattern can also be embedded in a virtual three-dimensional model.

[0073] Consequently, if a third party imitates the surface irregularities of the object and forms them on another object, the hidden pattern cannot be observed on that object, so imitation of the object's surface irregularities by third parties can be prevented. Alternatively, a hidden pattern can be embedded in the virtual three-dimensional model representing the surface irregularities of the object, preventing imitation of the virtual three-dimensional model by third parties.

[0074] Furthermore, anyone who knows the light-ray direction and viewing direction from which the hidden pattern can be observed can view the pattern directly simply by setting those directions and rendering the polygon mesh. The hidden pattern embedded in the polygon mesh can thus be seen immediately, without the restoration processing required in Patent Document 1.

[0075] Preferably, the computer is further caused to function as output means for outputting, to a three-dimensional modeling apparatus, three-dimensional data defining the shape of the polygon mesh whose orientations have been changed by the changing means. In this case, the hidden pattern can be embedded in a physical object on whose surface irregularities are repeatedly formed.

[0076] Preferably, the computer is further caused to function as acquisition means for acquiring optical characteristics of the object, and rendering processing means for rendering, using the optical characteristics acquired by the acquisition means, the polygon mesh whose orientations have been changed by the changing means. In this case, the hidden pattern can be embedded in a virtual three-dimensional model of an object on whose surface irregularities are repeatedly formed.

[0077] Preferably, the acquisition means acquires a bidirectional reflectance distribution function (BRDF) as the optical characteristic. In this case, since the polygon mesh is rendered using the BRDF, the surface appearance of the object can be reproduced realistically.
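To illustrate why such rendering reveals the embedded pattern, the sketch below evaluates a simple Phong-style specular lobe as a stand-in for a measured BRDF (the function and its shininess parameter are assumptions for illustration, not the patent's acquired data). Intensity peaks only when the mirror reflection of the light direction coincides with the viewing direction, which is exactly the condition the reoriented polygons are made to satisfy.

```python
import numpy as np

def specular_intensity(normal, light_dir, view_dir, shininess=64.0):
    """Phong-like specular lobe: intensity peaks when the mirror
    reflection of the light direction lines up with the view direction."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    l = np.asarray(light_dir, float)     # surface-to-light direction
    l = l / np.linalg.norm(l)
    v = np.asarray(view_dir, float)      # surface-to-viewer direction
    v = v / np.linalg.norm(v)
    r = 2.0 * np.dot(n, l) * n - l       # mirror reflection of l about n
    return float(max(np.dot(r, v), 0.0)) ** shininess
```

A facet tilted so that its normal is the half-vector of the light and view directions (for example, normal `[1, 0, 1]` with light from `[0, 0, 1]` and viewer at `[1, 0, 0]`) returns full intensity, while an unmodified facet in the same configuration returns almost none.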

[0078] Preferably, the changing means changes the orientation of only those specific polygons whose required amount of orientation change is no more than a predetermined angle. In this case, polygons whose orientation would have to change greatly are left unchanged, so the hidden pattern can be embedded without greatly distorting the original shape of the polygon mesh. Moreover, if many polygons had their orientation changed greatly, the hidden pattern might become visible from directions other than the predetermined viewing direction; adopting this configuration prevents the hidden pattern from being observed from such directions.
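This thresholding can be sketched as follows (a minimal illustration; the 15-degree default is an arbitrary assumed value, not one specified by the patent): compute the rotation angle each facet normal would need to reach the target direction, and keep only the facets within the limit.

```python
import numpy as np

def filter_by_rotation(normals, target, max_deg=15.0):
    """Return indices of facets whose normal needs to rotate by at
    most max_deg degrees to reach the target direction."""
    t = np.asarray(target, float)
    t = t / np.linalg.norm(t)
    keep = []
    for i, n in enumerate(normals):
        n = np.asarray(n, float)
        n = n / np.linalg.norm(n)
        # angle between the current normal and the target direction
        ang = np.degrees(np.arccos(np.clip(np.dot(n, t), -1.0, 1.0)))
        if ang <= max_deg:
            keep.append(i)
    return keep
```

Facets rejected here keep their original orientation, preserving the mesh's overall relief.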

[0079] Preferably, the changing means sets, as a target direction, the direction that bisects the angle between the predetermined light-ray direction and the predetermined viewing direction, and changes the orientation of each specific polygon so that its normal vector coincides with the target direction.

[0080] In this case, since the direction bisecting the angle between the predetermined light-ray direction and the predetermined viewing direction is set as the target direction and the orientation of each specific polygon is changed so that its normal vector coincides with this target direction, the orientation of the specific polygons can be changed to the target direction accurately.
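The target direction described in [0079] and [0080] is the classic half-vector: with unit vectors L (surface to light) and V (surface to viewer), the normal N = (L + V) / |L + V| bisects their angle, and a facet with that normal mirrors light arriving from L exactly into V. A minimal sketch:

```python
import numpy as np

def target_normal(light_dir, view_dir):
    """Half-vector bisecting the angle between the surface-to-light
    and surface-to-viewer directions; a facet with this normal
    specularly reflects light from light_dir into view_dir."""
    l = np.asarray(light_dir, float)
    l = l / np.linalg.norm(l)
    v = np.asarray(view_dir, float)
    v = v / np.linalg.norm(v)
    h = l + v
    return h / np.linalg.norm(h)
```

For example, with light overhead (`[0, 0, 1]`) and the viewer to the side (`[1, 0, 0]`), the target normal is tilted 45 degrees between the two.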

[0081] Preferably, the hidden pattern consists of n (n is an integer of 2 or more) kinds of hidden patterns, the specifying means specifies the specific polygons corresponding to each hidden pattern, and the changing means changes the orientation of the specific polygons corresponding to each hidden pattern so that the specular reflection direction of light from the predetermined light-ray direction corresponding to that hidden pattern points in the predetermined viewing direction.

[0082] In this case, the hidden pattern consists of n kinds of hidden patterns, and each hidden pattern can be observed only when light is cast from the light-ray direction predetermined for that pattern and the object is viewed from the viewing direction from which that pattern can be observed. Therefore, the entire hidden pattern cannot be observed unless the viewing conditions for all n kinds of hidden patterns are known. This lowers the probability that the entire hidden pattern will be observed, and further reduces the possibility of the hidden pattern being discovered by a third party.

[0083] Alternatively, the hidden pattern may consist of n (n is an integer of 2 or more) kinds of hidden patterns, the specifying means may specify the specific polygons corresponding to each hidden pattern, the changing means may change the orientation of the specific polygons corresponding to each hidden pattern so that the specular reflection direction of light from the predetermined light-ray direction corresponding to that hidden pattern points in the predetermined viewing direction, and the rendering processing means may set the color of the light from the light-ray direction corresponding to each hidden pattern to a different color and render the polygon mesh whose orientations have been changed by the changing means. In this case, the n kinds of hidden patterns can each be displayed in a different color.

[0084] Preferably, the computer is further caused to function as polygon mesh generating means that generates the polygon mesh by: arranging a plurality of sample points in the virtual three-dimensional space; calculating, using the bidirectional reflectance distribution function of the object, the specular reflection direction of the light cast from a virtual light source onto each of the arranged sample points; calculating the normal vector of each sample point from the calculated specular reflection direction and the incident direction of the light from the virtual light source; and calculating height data for each sample point on the basis of the calculated normal vectors.
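The last step of [0084], turning per-sample normals into height data, can be sketched as a simple scanline integration. This is an assumed scheme for illustration; the patent does not fix a particular integration method. The normal at each sample would first be obtained as the half-vector of the reversed incident direction and the BRDF-derived specular direction; only the normals-to-heights step is shown here, using the slopes dz/dx = -nx/nz and dz/dy = -ny/nz of a surface with unit normal (nx, ny, nz).

```python
import numpy as np

def height_from_normals(normals):
    """Integrate a grid of unit normals (H, W, 3) into height data by
    accumulating the slopes dz/dx = -nx/nz and dz/dy = -ny/nz along
    scanlines -- a simple path integration, not a least-squares fit."""
    h, w, _ = normals.shape
    z = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            nx, ny, nz = normals[y, x]
            if x > 0:                      # step right along the row
                z[y, x] = z[y, x - 1] - nx / nz
            elif y > 0:                    # first column: step down
                z[y, x] = z[y - 1, x] - ny / nz
    return z
```

A uniform 45-degree tilt in x, for instance, integrates to a plane that drops one height unit per grid step.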

[0085] In this case, the user can obtain a polygon mesh representing the surface shape of the object simply by supplying the object's bidirectional reflectance distribution function, without modeling the polygon mesh.

[0086] Preferably, the object is an object having a grain (an embossed surface texture). In this case, since the polygon mesh represents the surface shape of a grained object, the orientations of the polygons other than the specific polygons are dispersed, with the result that the hidden pattern can be displayed more distinctly. Moreover, a hidden pattern can be embedded in the grain itself, preventing imitation of the grain by third parties.

Industrial Applicability

Since the pattern embedding device and the other aspects of the present invention can embed a hidden pattern in a real object or in a virtual three-dimensional model, they are useful as pattern embedding devices and the like that employ computer graphics technology.

Claims

[1] A pattern embedding program for causing a computer to function as:
setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed;
specifying means for specifying, from among the polygons constituting the polygon mesh, those polygons located within a region for embedding a predetermined hidden pattern, as specific polygons; and
changing means for changing the orientation of the specific polygons so that the specular reflection direction of light from a predetermined light-ray direction points in a predetermined viewing direction.

[2] The pattern embedding program according to claim 1, further causing the computer to function as output means for outputting, to a three-dimensional modeling apparatus, three-dimensional data defining the shape of the polygon mesh whose orientations have been changed by the changing means.

[3] The pattern embedding program according to claim 1 or 2, further causing the computer to function as:
acquisition means for acquiring optical characteristics of the object; and
rendering processing means for rendering, using the optical characteristics acquired by the acquisition means, the polygon mesh whose orientations have been changed by the changing means.

[4] The pattern embedding program according to claim 3, wherein the acquisition means acquires a bidirectional reflectance distribution function as the optical characteristic.

[5] The pattern embedding program according to any one of claims 1 to 4, wherein the changing means changes the orientation of only those specific polygons whose amount of orientation change is no more than a predetermined angle.

[6] The pattern embedding program according to any one of claims 1 to 5, wherein the changing means sets, as a target direction, the direction bisecting the angle between the predetermined light-ray direction and the predetermined viewing direction, and changes the orientation of each specific polygon so that its normal vector coincides with the target direction.

[7] The pattern embedding program according to any one of claims 1 to 6, wherein the hidden pattern consists of n (n is an integer of 2 or more) kinds of hidden patterns, the specifying means specifies the specific polygons corresponding to each hidden pattern, and the changing means changes the orientation of the specific polygons corresponding to each hidden pattern so that the specular reflection direction of light from the predetermined light-ray direction corresponding to that hidden pattern points in the predetermined viewing direction.

[8] The pattern embedding program according to claim 3, wherein the hidden pattern consists of n (n is an integer of 2 or more) kinds of hidden patterns, the specifying means specifies the specific polygons corresponding to each hidden pattern, the changing means changes the orientation of the specific polygons corresponding to each hidden pattern so that the specular reflection direction of light from the predetermined light-ray direction corresponding to that hidden pattern points in the predetermined viewing direction, and the rendering processing means sets the color of the light from the light-ray direction corresponding to each hidden pattern to a different color and renders the polygon mesh whose orientations have been changed by the changing means.

[9] The pattern embedding program according to any one of claims 1 to 8, further causing the computer to function as polygon mesh generating means that generates the polygon mesh by arranging a plurality of sample points in the virtual three-dimensional space, calculating, using the bidirectional reflectance distribution function of the object, the specular reflection direction of the light cast from a virtual light source onto each of the arranged sample points, calculating the normal vector of each sample point from the calculated specular reflection direction and the incident direction of the light from the virtual light source, and calculating height data for each sample point on the basis of the calculated normal vectors.

[10] The pattern embedding program according to any one of claims 1 to 9, wherein the object is an object having a grain.

[11] A pattern embedding device comprising:
setting means for setting, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed;
specifying means for specifying, from among the polygons constituting the polygon mesh, those polygons located within a region for embedding a predetermined hidden pattern, as specific polygons; and
changing means for changing the orientation of the specific polygons so that the specular reflection direction of light from a predetermined light-ray direction points in a predetermined viewing direction.

[12] A pattern embedding method comprising:
a setting step in which a computer sets, in a virtual three-dimensional space, a polygon mesh representing the surface shape of an object on whose surface irregularities are repeatedly formed;
a specifying step of specifying, from among the polygons constituting the polygon mesh, those polygons located within a region for embedding a predetermined hidden pattern, as specific polygons; and
a changing step of changing the orientation of the specific polygons so that the specular reflection direction of light from a predetermined light-ray direction points in a predetermined viewing direction.
PCT/JP2006/312637 2005-07-01 2006-06-23 Pattern embedding program, pattern embedding device, and pattern embedding method Ceased WO2007004448A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-194252 2005-07-01
JP2005194252 2005-07-01

Publications (1)

Publication Number Publication Date
WO2007004448A1 (en) 2007-01-11

Family

ID=37604318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/312637 Ceased WO2007004448A1 (en) 2005-07-01 2006-06-23 Pattern embedding program, pattern embedding device, and pattern embedding method

Country Status (1)

Country Link
WO (1) WO2007004448A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10334272A (en) * 1997-05-27 1998-12-18 Ibm Japan Ltd Method and system for embedding information in three-dimensional shape model
JP2000082156A (en) * 1998-09-04 2000-03-21 Osamu Kanai Method for embedding electronic information data and method for extracting, device for embedding electronic information data and device for extracting, and storage medium recording program for the same method
JP2003099805A (en) * 2001-09-21 2003-04-04 Rikogaku Shinkokai Digital watermark embedding method and digital watermark restoring method for three-dimensional shape model

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010103942A1 (en) * 2009-03-09 2010-09-16 カルソニックカンセイ株式会社 Method and device for creating surface treatment data
CN102348551A (en) * 2009-03-09 2012-02-08 康奈可关精株式会社 Method and device for creating surface treatment data
CN102348551B (en) * 2009-03-09 2014-05-28 康奈可关精株式会社 Method and device for creating surface treatment data
US9275497B2 (en) 2009-03-09 2016-03-01 Calsonic Kansei Corporation Method and device for forming surface processed


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase (ref country code: DE)
122 EP: PCT application non-entry in European phase (ref document number: 06767254; country of ref document: EP; kind code of ref document: A1)
NENP Non-entry into the national phase (ref country code: JP)