
WO2017195299A1 - Simulation system - Google Patents

Simulation system

Info

Publication number
WO2017195299A1
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
size
article
unit
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/064021
Other languages
English (en)
Japanese (ja)
Inventor
幸宏 陽奥
鈴木 達也
遠藤 康浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to PCT/JP2016/064021 priority Critical patent/WO2017195299A1/fr
Publication of WO2017195299A1 publication Critical patent/WO2017195299A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • The present invention relates to a simulation system.
  • A known information input device allows a user to control a graphic cursor displayed on a display.
  • The device has a first sensor that generates first sensor data in response to a first type of user action, which is a motion of a part of the user's body, and a second sensor that generates second sensor data in response to a second type of user action, which is a finer motion of a part of the user's body than the first type.
  • The device further includes at least one processor that calculates a hybrid cursor movement signal, a signal for moving the graphic cursor that has a wide-range movement component corresponding to the first type of user action and a high-precision movement component that corresponds to the second type of user action and represents finer movement than the wide-range component.
  • The at least one processor calculates the wide-range movement component based on a first sensitivity parameter, representing the sensitivity of the first sensor to the first type of user action, which is determined by adding the second sensor data to the first sensor data.
  • Likewise, the at least one processor calculates the high-precision movement component based on a second sensitivity parameter, representing the sensitivity of the second sensor to the second type of user action, which is determined by adding the first sensor data to the second sensor data (see, for example, Patent Document 1).
  • When both the first and second types of user action are executed, the at least one processor sets the first sensitivity parameter smaller as the second type of user action becomes more intense, thereby suppressing the wide-range movement component, and sets the second sensitivity parameter smaller as the first type of user action becomes more intense, thereby suppressing the high-precision movement component.
  • The graphic cursor includes at least a first cursor and a second cursor that is located within the first cursor and is smaller than the first cursor.
  • Because this conventional information input device requires the user to operate two cursors, the first cursor and the second cursor, its usability is poor.
  • An object of the present invention is therefore to provide a simulation system that is easy to use.
  • A simulation system according to one aspect includes: a display unit that displays a pointer whose position is manipulated by a user and an image of an article based on article data representing the shape and position of the article; a data storage unit that stores the article data; a first detection unit that detects an instruction operation in which the user indicates the position of the pointer; a second detection unit that detects the position of the pointer in the coordinate system of the display unit based on the instruction operation detected by the first detection unit; an area generation unit that generates a detection area that contains the pointer and is larger than the pointer; a size setting unit that sets the size of the pointer according to the type of an article at least partially located within the detection area; and an output unit that causes the display unit to display the pointer at the size set by the size setting unit.
  • A simulation system with good usability can thereby be provided.
  • FIG. 1 is a diagram illustrating a simulation system 100 according to an embodiment.
  • FIG. 2 is a diagram illustrating a configuration of the processing device 120 of the simulation system 100.
  • the simulation system 100 includes a screen 110A, a projection device 110B, 3D (3-dimensional) glasses 110C, a processing device 120, and a position measurement device 140.
  • The simulation system 100 can be applied, for example, to an assembly support system for assessing assembly workability in a virtual space.
  • For example, an operation of assembling an electronic component such as a CPU (Central Processing Unit) module, a memory module, a communication module, or a connector onto a motherboard can be performed in the virtual space.
  • The simulation system 100 can be applied not only to an assembly support system but also to various systems for confirming workability in a three-dimensional space.
  • a projector screen can be used as the screen 110A.
  • the size of the screen 110A may be set as appropriate according to the application.
  • An image projected by the projection device 110B is displayed on the screen 110A.
  • images of the article 111, the buttons 111A and 111B, and the pointer 130A are displayed on the screen 110A.
  • the pointer 130A is displayed in the direction in which the user 1 points the hand toward the screen 110A.
  • The fingertips of the user 1's right hand may be either open or closed.
  • the pointer 130A is displayed on the screen 110A in the direction in which the user 1 moves the right arm and points with the hand.
  • An operation in which the user 1 points with the right arm to move the pointer 130A is referred to as an instruction operation.
  • the projection device 110B may be any device that can project an image on the screen 110A.
  • a projector can be used.
  • the projection device 110B is connected to the processing device 120 via a cable 110B1, and projects an image input from the processing device 120 onto the screen 110A.
  • the projection device 110B is of a type that can project a 3D image (stereoscopic image) onto the screen 110A.
  • the screen 110A and the projection device 110B are examples of a display unit.
  • the user 1 who uses the simulation system 100 wears the 3D glasses 110C.
  • The 3D glasses 110C may be any glasses that can convert an image projected on the screen 110A by the projection device 110B into a 3D image.
  • For example, polarized glasses that polarize incident light, or liquid crystal shutter glasses having a liquid crystal shutter, can be used.
  • a liquid crystal display panel may be used instead of the screen 110A and the projection device 110B.
  • the 3D glasses 110C may not be used.
  • a head mounted display capable of viewing a 3D image may be used instead of the screen 110A and the projection device 110B.
  • the processing device 120 includes a human body detection unit 121, a position detection unit 122, a detection region generation unit 123, an operation detection unit 124, an object determination unit 125, a pointer generation unit 126, a data holding unit 127, and a video output unit 128.
  • the processing device 120 is realized by a computer having a memory, for example.
  • The human body detection unit 121 determines whether or not the body of the user 1 exists based on data, input from the position measurement device 140, that three-dimensionally represents the position and shape of the body of the user 1. When the body exists, it obtains coordinates indicating the position of each part of the body of the user 1. As an example, the position of each part of the body of the user 1 is represented by the position of the skeleton of the user 1; the skeleton positions include, for example, the positions of the head, shoulders, elbows, wrists, and hands.
  • the human body detection unit 121 is an example of a second detection unit together with the position detection unit 122.
  • The position detection unit 122 obtains coordinates P(Px, Py, Pz) based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121.
  • The position detection unit 122 is an example of a second detection unit together with the human body detection unit 121.
  • Specifically, the position detection unit 122 obtains a straight line connecting the right shoulder and the right wrist of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121, and finds the coordinates of the intersection of that straight line and the screen 110A.
  • The position detection unit 122 converts the coordinate value of this intersection into coordinates in the image projected on the screen 110A and outputs them as the coordinates P(Px, Py, Pz). Note that the position measurement device 140 may instead detect the coordinates P(Px, Py, Pz).
  • An X axis is defined in the horizontal direction parallel to the screen 110A, a Y axis in the vertical direction, and a Z axis in the horizontal direction perpendicular to the screen 110A.
  • The magnitude of the vector SH from the right shoulder to the right wrist is expressed by equation (2) using the right-shoulder coordinates S(Sx, Sy, Sz) and the right-wrist coordinates H(Hx, Hy, Hz), i.e., |SH| = sqrt((Hx - Sx)^2 + (Hy - Sy)^2 + (Hz - Sz)^2).
  • With ΔL denoting the amount (offset amount) by which the user 1 is offset from the screen 110A in the Z-axis direction, coordinates P1(P1x, P1y, P1z) represented by expression (3) are obtained.
  • The coordinates P1(P1x, P1y, P1z) obtained by equations (1), (2), and (3) are the coordinates of the intersection of the straight line connecting the right shoulder and the right wrist of the user 1 with the screen 110A.
  • The coordinates of this intersection are coordinates in the real space.
  • The position detection unit 122 converts the intersection coordinates P1(P1x, P1y, P1z) into coordinates in the image projected on the screen 110A and outputs them as the coordinates P(Px, Py, Pz).
  • In other words, the position detection unit 122 converts the real-space intersection coordinates P1(P1x, P1y, P1z) into values in the coordinate system of the virtual space and outputs them as the coordinates P(Px, Py, Pz).
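  • As an illustration, the intersection computation of equations (1) to (3) might be sketched as follows, assuming the screen lies in the real-space plane z = screen_z; the function and parameter names are hypothetical, and the conversion of P1 into the projected-image coordinates P is omitted.

```python
import numpy as np

def pointer_position(shoulder, wrist, screen_z, delta_l=0.0):
    """Extend the line from the right shoulder S to the right wrist H
    until it meets the screen plane z = screen_z (shifted by the
    offset amount delta_l), in the spirit of equations (1) to (3)."""
    s = np.asarray(shoulder, dtype=float)   # S(Sx, Sy, Sz)
    h = np.asarray(wrist, dtype=float)      # H(Hx, Hy, Hz)
    d = h - s                               # direction vector SH
    if abs(d[2]) < 1e-9:
        return None                         # arm parallel to the screen
    t = (screen_z + delta_l - s[2]) / d[2]  # ray parameter at the plane
    return s + t * d                        # intersection P1(P1x, P1y, P1z)
```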
  • The detection area generation unit 123 generates a detection area 130B centered on the coordinates P(Px, Py, Pz).
  • The detection area generation unit 123 is an example of an area generation unit.
  • The pointer 130A is a sphere centered on the coordinates P(Px, Py, Pz), and the radius of the detection area 130B is larger than the radius of the pointer 130A. The detection area 130B is therefore arranged concentrically with the pointer 130A.
  • The detection area 130B is the area outside the surface of the pointer 130A and included in a sphere defined by a predetermined radius with the coordinates P(Px, Py, Pz) as the center.
  • The detection area 130B includes the spherical surface defined by that predetermined radius.
  • the pointer 130A and the detection area 130B are the same in that they move according to the user's instruction operation.
  • the pointer 130A is displayed as an image on the screen 110A, but the detection area 130B is not displayed on the screen 110A.
  • The pointer 130A is used to determine contact with the article 111 or the buttons 111A and 111B displayed on the screen 110A, whereas the detection area 130B is used to determine the presence of the article 111 or the buttons 111A and 111B around the pointer 130A.
  • The operation detection unit 124 detects the operation of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121.
  • The operation of the user 1 is a gesture or similar action performed by the user 1; as examples, there are an operation for resetting the size of the pointer 130A, operations for determining an input to the button 111A or 111B, and operations for canceling an input to the button 111A or 111B.
  • The operation for determining an input to the button 111A or 111B is an operation of tapping the respective button, and the operation for canceling an input is a swiping motion across the respective button.
  • Specifically, the operation for canceling the input to the button 111A is an operation of moving the pointer 130A leftward while the pointer 130A touches the button 111A.
  • Likewise, the operation for canceling the input to the button 111B is an operation of moving the pointer 130A rightward while the pointer 130A touches the button 111B.
  • The operation for resetting the size of the pointer 130A is an operation of crossing both arms.
  • The operation detection unit 124 is an example of a first determination unit and a second determination unit.
  • the object determination unit 125 determines the type of article at least a part of which is located inside the detection area 130B.
  • the type of article is, for example, the kind of article 111 or button 111A or 111B.
  • Whether at least a part of an article is located inside the detection area 130B may be determined by whether the detection area 130B and the display area of the article 111 or the button 111A or 111B have an intersection. The case where the article lies on the outer peripheral surface (boundary) of the detection area 130B also counts as being at least partially located inside the detection area 130B.
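  • This intersection test might be sketched as follows, approximating an article's display area by an axis-aligned bounding box for illustration; the same test, run with the pointer's own radius instead of the detection-area radius, would serve the contact determination described below. All names are assumptions, not the patent's API.

```python
import numpy as np

def sphere_intersects_box(center, radius, box_min, box_max):
    """True when a sphere and an axis-aligned box share at least one
    point (contact with the boundary counts, as described above)."""
    c = np.asarray(center, dtype=float)
    closest = np.clip(c, box_min, box_max)  # box point nearest the center
    return float(np.sum((closest - c) ** 2)) <= radius ** 2
```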
  • The pointer generator 126 generates the pointer 130A as a spherical image centered on the coordinates P(Px, Py, Pz).
  • The radius of the pointer 130A is smaller than the radius of the detection area 130B. Since the pointer 130A and the detection area 130B are arranged concentrically, the detection area 130B surrounds the pointer 130A.
  • the pointer generation unit 126 is an example of a size setting unit.
  • the pointer 130A moves according to the user's instruction operation, and is used to determine contact with the article 111 or the button 111A or 111B displayed on the screen 110A.
  • Whether the pointer 130A is in contact with the article 111 or the button 111A or 111B may be determined by whether at least a part of the display area of the article 111 or the button 111A or 111B is included within the pointer 130A, which has a predetermined radius centered on the coordinates P(Px, Py, Pz).
  • The interior of the pointer 130A includes the surface of the pointer 130A.
  • Whether at least a part of the display area of the article 111 or the button 111A or 111B is included within the pointer 130A may be determined by whether the pointer 130A and that display area have an intersection; the case where the display area lies on the surface (boundary) of the pointer 130A is also included.
  • In such cases, the pointer 130A is in contact with the article 111 or the button 111A or 111B, and the contact may be conveyed to the user 1 by, for example, changing the color of the pointer 130A.
  • The size of the pointer 130A generated by the pointer generator 126 is set based on data representing the size of the pointer 130A held in the data holding unit 127. In addition, the pointer generation unit 126 sets the size of the pointer 130A according to the type of article existing inside the detection area 130B, and also according to the operation of the user 1. A specific method for setting the size of the pointer 130A will be described later.
  • the data holding unit 127 holds data such as article data representing the coordinates and shape of the article 111 or the button 111A or 111B, image data of the pointer 130A, and data representing the size of the pointer 130A.
  • the data holding unit 127 is realized by a memory and is an example of a data storage unit.
  • the output terminal of the video output unit 128 is connected to the projection device 110B by a cable 110B1.
  • the video output unit 128 outputs an image specified by the article data of the article 111 held in the data holding unit 127 to the projection device 110B and displays it on the screen 110A.
  • the video output unit 128 displays the pointer 130A on the projection device 110B.
  • the image of the pointer 130A is generated by the pointer generator 126.
  • the position measuring device 140 is a device that acquires data that three-dimensionally represents the position and shape of the body of the user 1.
  • the position measuring device 140 is installed above the screen 110A and has a detection range 142 in front of the screen 110A.
  • the detection range 142 extends from the camera 140A of the position measurement device 140 to the front of the screen 110A.
  • the position measuring device 140 is connected to the processing device 120 by a cable 141.
  • The position measurement device 140 is, for example, a device that calculates the distance (depth) to each point included in a captured image based on the time between irradiating the subject with an infrared laser and receiving the reflected light.
  • the position measurement device 140 acquires an image of the user 1 who performs an instruction operation toward the screen 110A, and acquires three-dimensional distance image data representing the posture of the user 1, a gesture, and the like.
  • the position measurement device 140 transmits the acquired three-dimensional data to the processing device 120 via the cable 141.
  • FIG. 3 is a perspective view of a computer system to which the processing device 120 of the embodiment is applied.
  • a computer system 10 shown in FIG. 3 includes a main body 11, a display 12, a keyboard 13, a mouse 14, and a modem 15.
  • the main unit 11 includes a CPU (Central Processing Unit), an HDD (Hard Disk Drive), a disk drive, and the like.
  • the display 12 displays an analysis result or the like on the screen 12A according to an instruction from the main body 11.
  • the display 12 may be a liquid crystal monitor, for example.
  • the keyboard 13 is an input unit for inputting various information to the computer system 10.
  • The mouse 14 is an input unit that designates an arbitrary position on the screen 12A of the display 12.
  • the modem 15 accesses an external database or the like and downloads a program or the like stored in another computer system.
  • A program for causing the computer system 10 to function as the processing device 120 may be stored in a portable recording medium such as the disk 17, or downloaded from the recording medium 16 of another computer system using a communication device such as the modem 15, and is input to the computer system 10 and compiled.
  • a program that causes the computer system 10 to have a function as the processing device 120 causes the computer system 10 to operate as the processing device 120.
  • This program may be stored in a computer-readable recording medium such as the disk 17.
  • The computer-readable recording medium is not limited to a portable recording medium such as the disk 17, an IC card memory, a magnetic disk such as a floppy (registered trademark) disk, a magneto-optical disk, a CD-ROM, or a USB (Universal Serial Bus) memory.
  • the computer-readable recording medium includes various recording media accessible by a computer system connected via a communication device such as a modem 15 or a LAN.
  • FIG. 4 is a block diagram illustrating a configuration of a main part in the main body 11 of the computer system 10.
  • the main body 11 includes a CPU 21 connected by a bus 20, a memory unit 22 including a RAM or a ROM, a disk drive 23 for the disk 17, and a hard disk drive (HDD) 24.
  • the display 12, the keyboard 13, and the mouse 14 are connected to the CPU 21 via the bus 20, but these may be directly connected to the CPU 21.
  • the display 12 may be connected to the CPU 21 via a known graphic interface (not shown) that processes input / output image data.
  • the keyboard 13 and the mouse 14 are input units of the processing device 120.
  • the display 12 is a display unit that displays input contents and the like for the processing device 120 on the screen 12A.
  • the computer system 10 is not limited to the configuration shown in FIGS. 3 and 4, and various known elements may be added or alternatively used.
  • FIG. 5 is a diagram showing shape data.
  • the article data is data representing the coordinates and shape of the article displayed on the screen 110A.
  • the article data has an article ID, a shape type, reference coordinates, a size, and a rotation angle.
  • the shape type represents the outer shape of the article.
  • In FIG. 5, the shape types are Cuboid (rectangular parallelepiped) and Cylinder (cylindrical body).
  • the reference coordinate indicates the coordinate value of a point that serves as a reference for coordinates representing the entire article.
  • the unit of the coordinate value is meter (m).
  • An XYZ coordinate system is used as the coordinate system.
  • the size represents the length of the article in the X-axis direction, the length in the Y-axis direction, and the length in the Z-axis direction.
  • the unit is meters (m).
  • The length in the X-axis direction represents the width, the length in the Y-axis direction represents the height, and the length in the Z-axis direction represents the depth (the length in the horizontal direction).
  • the rotation angle is represented by rotation angles ⁇ x, ⁇ y, and ⁇ z with respect to the X-axis direction, the Y-axis direction, and the Z-axis direction.
  • the unit is degree (deg.).
  • the rotation angle ⁇ x is an angle for rotating the article about the X axis as a rotation axis.
  • the rotation angles ⁇ y and ⁇ z are angles at which the article is rotated about the Y axis and the Z axis as rotation axes, respectively.
  • the positive directions of the rotation angles ⁇ x, ⁇ y, and ⁇ z may be determined in advance.
  • an image specified by the article data can be represented in the same manner as the article image displayed by the CAD data.
  • the article data is stored in the data holding unit 127 of the processing device 120.
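  • As a concrete reading of this schema, a hypothetical record type might look as follows; the class and field names are assumptions, with the example values taken from the article with the article ID 001 in FIG. 6.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Article:
    """Record mirroring the article data of FIG. 5."""
    article_id: str                        # e.g. "001"
    shape_type: str                        # "Cuboid" or "Cylinder"
    reference: Tuple[float, float, float]  # reference coordinates (m)
    size: Tuple[float, float, float]       # X/Y/Z lengths (m)
    rotation: Tuple[float, float, float]   # rotation angles θx, θy, θz (deg)

article_001 = Article("001", "Cuboid",
                      (0.0, 0.0, 0.0), (0.8, 0.2, 0.4), (0.0, 0.0, 0.0))
```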
  • FIG. 6 is a diagram illustrating an example of an image of an article.
  • FIG. 6 shows three articles represented by the article data in FIG.
  • An article with an article ID of 001 has a shape type of Cuboid (cuboid), reference coordinates (X, Y, Z) of (0.0, 0.0, 0.0), and a size of (0.8, 0.2, 0.4), and the rotation angles ⁇ x, ⁇ y, ⁇ z are (0.0, 0.0, 0.0).
  • An article with an article ID of 002 has a shape type of Cuboid (cuboid), reference coordinates (X, Y, Z) of (0.6, 0.2, 0.0), and a size of (0.2, 0.2, 0.1), and the rotation angles ⁇ x, ⁇ y, ⁇ z are (0.0, 0.0, 0.0).
  • the article with the article ID 002 is arranged on the article with the article ID 001.
  • The article with the article ID 003 has a shape type of Cylinder, reference coordinates (X, Y, Z) of (0.8, 0.3, 0.1), a size of (0.2, 1.0, 0.3), and rotation angles θx, θy, θz of (0.0, 0.0, 90.0).
  • The article with the article ID 003 is connected to the positive X-axis side of the article with the article ID 002 in a state where it is rotated 90 degrees about the Z axis.
  • In this manner, the article data having the article ID, shape type, reference coordinates, size, and rotation angle shown in FIG. 5 defines the coordinates and shape of each article in the image projected on the screen 110A.
  • For an article whose shape type is Cuboid, the coordinates of its eight vertices can be obtained by adding or subtracting the lengths in the X-axis, Y-axis, and Z-axis directions to or from the reference coordinates.
  • The coordinates of the eight vertices represent the coordinates of the corners of the article whose shape type is Cuboid.
  • The expressions representing the twelve sides are expressions representing the coordinates of the edges of the article whose shape type is Cuboid.
  • Once the expressions representing the eight vertices and/or the twelve sides are obtained, the expressions representing the six surfaces of the article whose shape type is Cuboid can be obtained, and the coordinates of those surfaces can be determined.
  • When the shape type is Cylinder (cylindrical body), an expression representing the circle (or ellipse) at each end can be obtained.
  • Using the expressions representing the circles (or ellipses) at both ends together with the reference coordinates, expressions representing the coordinates of those end circles (or ellipses) can be obtained.
  • The coordinates of the side surface of the cylinder can then be obtained using the expressions representing the coordinates of the circles (or ellipses) at both ends.
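  • For the Cuboid case, the vertex computation described above might be sketched as follows, assuming for illustration that the reference coordinates denote one corner of the cuboid; the rotation angles are ignored and all names are hypothetical.

```python
import itertools
import numpy as np

def cuboid_vertices(reference, size):
    """Eight vertices of a Cuboid article, obtained by adding the X-,
    Y-, and Z-axis lengths to the reference coordinates (rotation
    angles are ignored in this sketch)."""
    ref = np.asarray(reference, dtype=float)
    corners = itertools.product(*[(0.0, s) for s in size])  # 2^3 offsets
    return [ref + np.asarray(c) for c in corners]

# e.g. the Cuboid with the article ID 001 from FIG. 6:
vertices = cuboid_vertices((0.0, 0.0, 0.0), (0.8, 0.2, 0.4))
```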
  • FIG. 7 is a diagram showing size data.
  • the size data is data in a table format in which the article ID of the article displayed on the screen 110A is associated with the pointer size of the pointer 130A.
  • The pointer size (Xp, Yp, Zp) represents the width Xp in the X-axis direction, the height Yp in the Y-axis direction, and the depth Zp in the Z-axis direction.
  • The pointer 130A is displayed as an ellipsoid having the width Xp, the height Yp, and the depth Zp.
  • The pointer size (Xp, Yp, Zp) associated with the article with the article ID 001 is (0.05, 0.02, 0.05).
  • The pointer size (Xp, Yp, Zp) associated with the article with the article ID 002 is (0.01, 0.01, 0.01).
  • The pointer size (Xp, Yp, Zp) associated with the article with the article ID 003 is (0.015, 0.05, 0.015).
  • FIGS. 8 to 16 are diagrams showing a method for determining the size of the pointer 130A according to the size of the article.
  • The surface (ellipsoidal surface) of the pointer 130A is represented by expression (4), an ellipsoid equation of the form x^2/a^2 + y^2/b^2 + z^2/c^2 = 1 with semi-axes a, b, and c.
  • the parameter k is 0.05.
  • Article 111-2 has a shape in which three L-shaped blocks are connected.
  • the size of the pointer 130A is a size that can enter the gap between the three blocks of the article 111-2.
  • the parameter k is 0.05.
  • the article 111-3 is provided with a rectangular parallelepiped hole in the rectangular parallelepiped 111-31, and a prismatic portion 111-32 is provided in the hole.
  • the size of the pointer 130A is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
  • In this case, the parameters a, b, and c of the ellipsoid are expressed by equation (5), and may be set to the value l1 obtained there.
  • the size of the pointer 130A obtained by Expression (6) is a size that can enter the gap between the three blocks of the article 111-2.
  • the size of the pointer 130A obtained by Expression (7) is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
  • the parameter k is 0.5.
  • X2A is the size of the gap between the three L-shaped blocks of the article 111-2 and is the minimum value of the outer dimensions of the article 111-2.
  • the parameters a, b, and c may be set to a value that is half the minimum value of the outer dimensions of the article 111-2.
  • the size of the pointer 130A obtained in this way is a size that can enter the gap between the three blocks of the article 111-2.
  • the parameter k is 0.5.
  • Y3A is the height in the Y-axis direction of the rectangular column part 111-32 of the rectangular parallelepiped 111-31, and is the minimum value of the outer dimensions of the article 111-3.
  • the parameters a, b, and c may be set to a value that is half the minimum value of the outer dimensions of the article 111-3.
  • the size of the pointer 130A thus obtained is a size that can enter the gap between the hole of the rectangular parallelepiped 111-31 and the prism portion 111-32.
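  • The sizing rule of this passage (k = 0.5 times the minimum outer dimension of the article) might be sketched as follows; the function name and the use of the article's size tuple are assumptions.

```python
def pointer_semi_axes(article_size, k=0.5):
    """Sizing rule sketched above: set the ellipsoid parameters a, b, c
    to k times the minimum outer dimension of the article, so that the
    pointer can enter the article's narrowest gap."""
    m = min(article_size)          # minimum outer dimension (e.g. X2A, Y3A)
    return (k * m, k * m, k * m)   # a = b = c
```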
  • FIG. 17 is a diagram showing size data in which an article ID is associated with a pointer size.
  • Here, the article IDs of the buttons 111A and 111B and the article 111 illustrated in FIG. 1, and the pointer sizes associated with those article IDs, will be described.
  • The article ID of the button 111A is 011, the article ID of the button 111B is 012, and the article ID of the article 111 is 013.
  • The pointer size (Xp, Yp, Zp) for the button 111A with the article ID 011 is (0.01, 0.01, 0.01).
  • The pointer size (Xp, Yp, Zp) for the button 111B with the article ID 012 is (0.01, 0.01, 0.01).
  • The pointer size (Xp, Yp, Zp) for the article 111 with the article ID 013 is (0.04, 0.04, 0.04).
  • FIGS. 18 and 19 are diagrams showing the relationship between the articles and the size of the pointer 130A.
  • Here, how the pointer 130A is displayed based on the size data shown in FIG. 17 when the image of the article 111 and the buttons 111A and 111B is displayed on the screen 110A as shown in FIG. 1 will be described.
  • a pointer 130A is displayed on the screen 110A, and a detection area 130B is set around the pointer 130A.
  • the detection area 130B is not displayed on the screen 110A.
  • the size of the pointer 130A is set to an initial value.
  • the initial value may be set according to, for example, the size of the screen 110A and the appropriate position of the user 1 with respect to the screen 110A.
  • When the buttons 111A and 111B enter the detection area 130B, the pointer 130A is displayed using the minimum value among the pointer sizes associated with the articles in the area.
  • Here, the pointer size (Xp, Yp, Zp) is therefore set to (0.01, 0.01, 0.01).
  • buttons 111A and 111B are relatively small articles, the user 1 can easily select the buttons 111A and 111B by making the pointer 130A smaller than the initial value.
  • When the article 111 enters the detection area 130B, the pointer size (Xp, Yp, Zp) becomes (0.04, 0.04, 0.04).
  • the article 111 is larger than the buttons 111A and 111B and is a relatively large article, so that the user 1 can easily select the article 111 by making the pointer 130A larger than the initial value.
  • the simulation system 100 changes the size of the pointer 130A according to the size of the article existing in the detection area 130B. This is to improve the usability of the simulation system 100 by making it easy for the user 1 to see the pointer 130A.
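  • The selection behavior just described might be sketched as follows, using the values of FIG. 17; the initial value, the volume-based ordering, and all names are assumptions.

```python
# Size data in the spirit of FIG. 17 (article ID -> pointer size in meters).
SIZE_DATA = {
    "011": (0.01, 0.01, 0.01),  # button 111A
    "012": (0.01, 0.01, 0.01),  # button 111B
    "013": (0.04, 0.04, 0.04),  # article 111
}
INITIAL_SIZE = (0.02, 0.02, 0.02)  # assumed initial value

def volume(size):
    return size[0] * size[1] * size[2]

def select_pointer_size(ids_in_detection_area):
    """No article in the detection area -> initial value; otherwise the
    smallest (here: smallest-volume) associated pointer size."""
    sizes = [SIZE_DATA[i] for i in ids_in_detection_area if i in SIZE_DATA]
    return min(sizes, key=volume) if sizes else INITIAL_SIZE
```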
  • FIG. 20 is a flowchart illustrating processing executed by the processing device 120 according to the embodiment.
  • the flowchart shown in FIG. 20 shows processing for setting the pointer size of the pointer 130A and displaying the pointer 130A on the screen 110A.
  • The processing device 120 starts processing after the power is turned on (start).
  • the processing apparatus 120 acquires article data from the data holding unit 127 (step S1).
  • the processing device 120 generates a video signal using the article data, and causes the projection device 110B to project an image (step S2).
  • an image of the stereoscopic model of the article 111 and the buttons 111A and 111B is displayed on the screen 110A.
  • the image of the article 111 and the buttons 111A and 111B displayed on the screen 110A represents a virtual object existing in the virtual space.
  • steps S1 and S2 are performed by the video output unit 128.
  • the processing device 120 acquires data representing the position and shape of the body of the user 1 three-dimensionally from the position measurement device 140 (step S3).
  • the process of step S3 is performed by the human body detection unit 121.
  • the processing device 120 determines whether the body of the user 1 exists based on the data acquired in step S3 (step S4).
  • the process of step S4 is performed by the human body detection unit 121.
  • If the processing device 120 determines that the body of the user 1 exists (S4: YES), it obtains coordinates representing the position of each part of the body of the user 1 (step S5).
  • the process of step S5 is performed by the human body detection unit 121.
  • The processing device 120 detects the coordinates P(Px, Py, Pz) (step S6).
  • The coordinates P(Px, Py, Pz) are obtained by the position detection unit 122 by converting the coordinates of the intersection of the straight line connecting the right shoulder and the right wrist of the user 1 with the screen 110A into coordinates in the image projected on the screen 110A.
  • the process of step S6 is performed by the position detection unit 122.
  • the processing device 120 generates the detection area 130B (step S7).
  • the processing in step S7 is performed by the detection area generation unit 123.
  • The detection area generator 123 generates a detection area 130B having a predetermined radius centered on the coordinates P(Px, Py, Pz).
  • The detection area 130B is the area outside the surface of the pointer 130A and included in the sphere defined by that predetermined radius with the coordinates P(Px, Py, Pz) as the center.
  • the processing device 120 determines whether or not an article exists in the detection area 130B (step S8).
  • the process of step S8 is performed by the object determination unit 125.
  • The object determination unit 125 determines whether the detection area 130B and the display area of the article 111 or the button 111A or 111B have an intersection, and thereby whether there is an article at least partially located inside the detection area 130B. When an article exists inside the detection area 130B, the object determination unit 125 determines the type of the article.
  • If no article exists in the detection area 130B (S8: NO), the processing device 120 sets the pointer size to the initial value (step S9).
  • the processing in step S9 is performed by the pointer generator 126.
  • the initial value of the pointer size is held in the data holding unit 127.
  • the processing device 120 displays the pointer 130A on the screen 110A (step S10).
  • the processing in step S10 is performed by the video output unit 128.
  • the video output unit 128 causes the projection device 110B to display the image of the pointer 130A generated by the pointer generation unit 126 on the screen 110A.
  • If the processing device 120 determines that an article exists in the detection area 130B (S8: YES), it acquires the article ID of the article existing in the detection area 130B (step S11).
  • the process of step S11 is performed by the object determination unit 125.
  • step S11 when there are a plurality of articles in the detection area 130B, a plurality of article IDs are acquired.
  • the object determination unit 125 acquires the article ID of the article 111 or the button 111A or 111B having an intersection with the detection area 130B.
  • the processing device 120 reads the pointer size corresponding to the article ID acquired in step S11 from the size data (see FIGS. 7 and 18) (step S12).
  • the processing in step S12 is performed by the pointer generation unit 126.
  • When a plurality of article IDs have been acquired, the pointer generation unit 126 reads a plurality of pointer sizes.
  • When a plurality of pointer sizes have been read in step S12, the processing device 120 selects the smallest pointer size among them (step S13). The processing in step S13 is performed by the pointer generation unit 126. Note that when only one article ID was acquired in step S11, the pointer generation unit 126 performs no particular processing in step S13.
  • the processing device 120 determines whether the pointer size is smaller than a predetermined lower limit (step S14).
  • the processing in step S14 is performed by the pointer generator 126.
  • The pointer generation unit 126 reads a predetermined lower limit value of the pointer size from the data holding unit 127 and compares it with the single pointer size read in step S12 or with the pointer size selected in step S13.
  • the reason why the pointer size is compared with the predetermined lower limit in this way is to prevent the pointer 130A from becoming too small by reducing the pointer size in the process of step S109 described later.
  • the predetermined lower limit value may be set according to the size of the screen 110A, the appropriate position of the user 1 with respect to the screen 110A, and the like.
  • If the pointer size is smaller than the predetermined lower limit (S14: YES), the processing device 120 corrects the pointer size to the predetermined lower limit value (step S15).
  • The processing in step S15 is performed by the pointer generator 126. A pointer smaller than the predetermined lower limit value would be difficult for the user 1 to see, so the pointer size is increased to the lower limit value before being displayed on the screen 110A.
  • the processing device 120 displays the pointer 130A having the pointer size corrected to the lower limit value on the screen 110A (step S10).
  • the processing in step S10 is performed by the video output unit 128.
  • If the pointer size is not smaller than the lower limit (S14: NO), the processing device 120 displays the pointer 130A having the pointer size set in step S12 or S13 on the screen 110A (step S10).
  • In this way, the pointer 130A having the pointer size set by the processing device 120 is displayed on the screen 110A.
  • the flow shown in FIG. 20 is repeatedly executed.
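  • Composing the helper sketches above (pointer_position, sphere_intersects_box, select_pointer_size, volume), one pass through the FIG. 20 flow might look as follows; the detection radius, the lower-limit value, and the volume-based comparison in step S14 are assumptions.

```python
def update_pointer(shoulder, wrist, articles,
                   screen_z=0.0, detect_r=0.1,
                   lower=(0.005, 0.005, 0.005)):
    """One pass through the FIG. 20 flow (steps S3 to S15, then S10).
    `articles` maps article IDs to (box_min, box_max) bounding boxes."""
    p = pointer_position(shoulder, wrist, screen_z)        # S5-S6
    if p is None:
        return None
    hit = [aid for aid, (lo, hi) in articles.items()
           if sphere_intersects_box(p, detect_r, lo, hi)]  # S7-S8, S11
    size = select_pointer_size(hit)                        # S9 / S12-S13
    if volume(size) < volume(lower):                       # S14
        size = lower                                       # S15
    return p, size                                         # drawn in S10
```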
  • FIG. 21 is a diagram showing the relationship between the article and the size of the pointer 130A.
  • images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A.
  • Here, a case will be described in which the determination operation on the button 111B is performed erroneously, the operation canceling the input to the button 111B is then performed, and the size of the pointer 130A is changed as a result.
  • The operations for determining an input to the buttons 111A and 111B and the operations for canceling an input to the buttons 111A and 111B are determined in advance as operations that can be detected by the operation detection unit 124.
  • the size of the pointer 130A is set to an initial value.
  • When the buttons 111A and 111B enter the detection area 130B, the pointer size (Xp, Yp, Zp) becomes (0.01, 0.01, 0.01).
  • The user 1 then wants to tap the button 111A, but the right hand 2 shakes, the pointer 130A moves from the point B3 to the point B4, and the pointer 130A ends up touching the button 111B instead of the button 111A.
  • If the user 1 taps in this state, the input to the button 111B is determined. Since the user 1 wants to determine the input to the button 111A, the user 1 performs a cancel operation: specifically, the user 1 swipes the right hand 2 to the right side while the pointer 130A touches the button 111B. The cancel operation is thereby detected by the operation detection unit 124.
  • When the cancel operation is performed, the pointer size (Xp, Yp, Zp) associated with the button 111B is reduced to 90%; that is, the pointer 130A becomes 10% smaller.
  • Such a function of reducing the pointer size in response to the cancel operation is referred to as a learning function.
  • User 1 moves the pointer 130A from point B4 to point B5, and the pointer 130A is in a state of touching the button 111A.
  • the user 1 may tap the button 111A in this state. Since the pointer 130A is 10% smaller, it is easier to select the button 111A, and erroneous input can be suppressed.
  • FIG. 22 is a diagram showing size data in which an article ID is associated with a pointer size. Here, a change in size data before and after the cancel operation is performed will be described.
  • the size data shown on the left side of FIG. 22 is the size data before the cancel operation is performed, and is the same as the size data shown in FIG.
  • When the cancel operation is performed, the pointer size associated with the article ID 012 corresponding to the button 111B is reduced by 10%, as shown on the right side of FIG. 22.
  • The pointer size is reduced only when the cancel operation is performed within a predetermined time after the determination operation on the same button. This is because, when the user 1 notices an erroneous input, the cancel operation is considered to follow shortly after the determination operation; conversely, a cancel operation performed after a certain amount of time has elapsed since the determination operation is considered an intentional cancellation rather than the correction of an erroneous input.
  • the predetermined time for determining whether or not the canceling operation is performed may be set to an appropriate value through experiments or simulations.
  • the predetermined time is 5 seconds.
  • the determination operation is an operation (selection operation) for determining selection of the button 111A or 111B.
  • FIG. 23 is a flowchart illustrating processing executed by the processing device 120 according to the embodiment.
  • With reference to FIG. 23, a case will be described in which images of the article 111 and the buttons 111A and 111B are displayed on the screen 110A as shown in FIG. 1.
  • The processing device 120 starts processing after the power is turned on (start).
  • the processing device 120 displays the pointer 130A on the screen 110A (step S101).
  • the process of step S101 is the process shown in FIG.
  • the processing device 120 detects the operation of the user 1 based on the coordinates representing the position of each part of the body of the user 1 input from the human body detection unit 121 (step S102).
  • the process in step S102 is performed by the operation detection unit 124.
  • the operation of the user 1 is an operation such as a gesture performed by the user 1, and here, as an example, an operation for resetting the size of the pointer 130A, an operation for determining an input to the button 111A or 111B, and The operation for canceling the input to the button 111A or 111B is determined in advance as an operation that can be detected by the operation detection unit 124.
  • the processing device 120 determines whether or not the operation of the user 1 detected in step S102 is a reset operation (step S103).
  • the process in step S103 is performed by the operation detection unit 124.
  • If the operation is not a reset operation (S103: NO), the processing device 120 determines whether or not the operation of the user 1 detected in step S102 is a determination operation (step S104).
  • the process in step S104 is performed by the operation detection unit 124.
  • If the operation is not a determination operation (S104: NO), the processing device 120 determines whether the operation of the user 1 detected in step S102 is a cancel operation (step S105). The process in step S105 is performed by the operation detection unit 124.
  • If the processing device 120 determines that the operation is not a cancel operation (S105: NO), the series of processing ends (end). That is, if the operation is none of the reset, determination, and cancel operations, the process ends; while the power remains on, the process is repeated from the start.
  • If the operation is a determination operation (S104: YES), the processing device 120 stores the article ID of the article for which the determination operation has been performed (step S106).
  • The processing in step S106 is performed by the pointer generator 126. For example, when the article for which the determination operation has been performed is the button 111A, the article ID of the button 111A is stored.
  • The processing device 120 then ends the series of processing (end). While the power remains on, the process is repeated from the start.
  • If the operation is a cancel operation (S105: YES), the processing device 120 determines whether the cancel operation has been performed on the article whose article ID is already stored (step S107).
  • the processing in step S107 is performed by the pointer generator 126.
  • If the processing device 120 determines that the article is the same as the one with the already stored article ID (S107: YES), it determines whether the elapsed time from when the article ID of the determined article was stored until the cancel operation was performed is within a predetermined time (step S108).
  • step S108 is performed by the pointer generator 126. Whether or not the elapsed time is within the predetermined time may be determined by whether or not the elapsed time is equal to or shorter than the predetermined time.
  • When determining that the elapsed time is within the predetermined time (S108: YES), the processing device 120 reduces the pointer size associated with the canceled article to 90% of the pointer size included in the size data (step S109). The processing in step S109 is performed by the pointer generator 126.
  • For example, when the input to the button 111B is canceled, the pointer size (Xp, Yp, Zp) associated with the button 111B is reduced to 90%.
  • the processing device 120 ends the series of processing (end). If the power is on, the process is repeated from the start.
  • If the operation is a reset operation (S103: YES), the processing device 120 sets the pointer size to the initial value (step S110).
  • the processing in step S110 is performed by the pointer generation unit 126.
  • the initial value of the pointer size is held in the data holding unit 127 similarly to the size data shown in FIG.
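  • The decision logic of steps S104 to S109 might be sketched as follows; the class structure and the timing API are assumptions, and the reset branch (step S110) and the lower-limit clamp of FIG. 20 are omitted for brevity.

```python
import time

class LearningFunction:
    """Shrink the pointer size for an article by 10% when a cancel
    operation follows a determination operation on the same article
    within a predetermined time (5 seconds in the text)."""
    def __init__(self, size_data, window_s=5.0):
        self.size_data = size_data       # article ID -> (Xp, Yp, Zp)
        self.window_s = window_s
        self.last_determined = None      # (article ID, timestamp)

    def on_determine(self, article_id):  # steps S104, S106
        self.last_determined = (article_id, time.monotonic())

    def on_cancel(self, article_id):     # steps S105, S107 to S109
        if self.last_determined is None:
            return
        last_id, t0 = self.last_determined
        if article_id == last_id and time.monotonic() - t0 <= self.window_s:
            x, y, z = self.size_data[article_id]
            self.size_data[article_id] = (0.9 * x, 0.9 * y, 0.9 * z)
```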
  • FIG. 24 is a diagram showing a display on the screen 110A when an instruction operation is actually performed in the simulation system 100.
  • On the screen 110A, nine markers are arranged in three rows and three columns. Each marker is displayed as a sphere like the pointer 130A, and the center marker is the marker C1.
  • The interval between the markers is 100 pixels; when the width of the screen 110A in the X-axis direction is 3 meters, 100 pixels corresponds to an interval of 150 mm.
  • the radius of the marker is 30 mm, and the radius of the detection region 130B is 100 mm.
  • the initial value of the radius of the pointer 130A is 40 mm, and the radius of the pointer 130A when the marker exists in the detection area 130B is 20 mm.
  • the radius of the pointer 130A shown in FIG. 24 is 20 mm.
  • FIG. 25 is a diagram illustrating the locus of the pointer 130A displayed by the simulation system 100 when the instruction operation described with reference to FIG. 24 is performed. That is, FIG. 25 shows the locus of the coordinates P(Px, Py, Pz) of the pointer 130A obtained by the simulation system 100.
  • The points surrounded by the broken-line circle indicate the coordinates P(Px, Py, Pz) when an instruction operation is performed to keep the pointer 130A aligned with the marker C1 after the pointer 130A reaches the marker C1.
  • In FIG. 25, the points scattered from the upper right toward the broken-line circle indicate the locus when the pointer 130A is moved from the start point A6 to the marker C1 shown in FIG. 24.
  • The axes show the numerical values (pixel values) of the XY coordinates used in the calculation.
  • The position of the points surrounded by the broken-line circle fluctuates because the right hand 2 shakes while the user tries to keep the pointer 130A aligned with the marker C1 after the pointer 130A reaches it.
  • FIG. 26 is a diagram showing the result of the instruction operation for moving the pointer 130A to the marker C1 in this way.
  • The case where the radius of the pointer 130A is changed from the initial value of 40 mm to 20 mm when a marker is in the detection area 130B is denoted "with processing".
  • The result when the pointer 130A is additionally reduced to 90% size by the learning function is denoted "with processing & learning function".
  • The radius of the pointer 130A in the case of "with processing & learning function" is 18 mm.
  • In the case of "with processing", the number of frames in which the pointer 130A touched only the marker C1 was 582, and the number of frames in which it touched a marker other than the marker C1 was 59; the probability of touching a marker other than the marker C1 was therefore 9%.
  • Compared with the case of "without processing", the probability that the pointer 130A touches a marker other than the marker C1 is thus significantly reduced.
  • In the case of "with processing & learning function", the number of frames in which the pointer 130A touched only the marker C1 was 540, and the number of frames in which it touched a marker other than the marker C1 was 35; the probability of touching a marker other than the marker C1 was therefore 6%.
  • As described above, in the simulation system 100, the size of the pointer 130A is changed according to the type of the article.
  • When the article is relatively small, the pointer size is made smaller than the initial value; when the article is relatively large, the pointer size is made larger than the initial value.
  • Since the pointer size is set according to the size of the articles existing around the pointer 130A, the usability of the simulation system 100 is greatly improved.
  • In addition, when the cancel operation is performed, the pointer size is reduced to 90% by the learning function, so that when the user 1 tries to move the pointer 130A to the same article again, it is easier to move the pointer 130A to the desired article.
  • When the cancel operation is performed twice on the same article, the pointer size is reduced to 81%; each cancel operation on the same article decreases the size by a further 10%.
  • The operability when the user 1 operates the pointer 130A is thus improved after the learning function has been applied.
  • The determination operation and the cancel operation have been described above as modes of determining or canceling an input to the button 111A or 111B, but they may also be performed on the article 111.
  • For example, a determination operation may be performed on the article 111, and after the determination operation, the article 111 may be moved together with the pointer 130A.
  • When the cancel operation is performed, the pointer size associated with the article ID 011 corresponding to the button 111A located in the detection area 130B may also be reduced by 10%.
  • Alternatively, the pointer sizes associated with the article IDs 011 and 012 corresponding to the buttons 111A and 111B located in the detection area 130B may both be reduced by 10%.
  • the pointer 130A is displayed as an ellipsoid.
  • the pointer 130A may be displayed as an image having a shape other than the ellipsoid.
  • the detection area 130B is an ellipsoidal area, but the detection area 130B may be an area having a shape other than an ellipsoid. Further, although the detection area 130B has been described as an area that exists outside the pointer 130A, it may be an area that also includes the inside of the pointer 130A.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a simulation system that is easy to use. The simulation system includes: a display unit that displays a pointer whose position can be manipulated by a user and an image of an article based on article data representing the shape and position of the article; a data storage unit that stores the article data; a first detection unit that detects an instruction operation in which the user indicates the position of the pointer; a second detection unit that detects the position of the pointer in the coordinate system of the display unit based on the instruction operation detected by the first detection unit; an area generation unit that generates a detection area that contains the pointer and is larger than the pointer; a size setting unit that sets the size of the pointer according to the type of an article at least partially positioned inside the detection area; and an output unit that causes the display unit to display the pointer at the size set by the size setting unit.
PCT/JP2016/064021 2016-05-11 2016-05-11 Simulation system Ceased WO2017195299A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/064021 WO2017195299A1 (fr) 2016-05-11 2016-05-11 Simulation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/064021 WO2017195299A1 (fr) 2016-05-11 2016-05-11 Simulation system

Publications (1)

Publication Number Publication Date
WO2017195299A1 (fr) 2017-11-16

Family

Family ID: 60267552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/064021 Ceased WO2017195299A1 (fr) 2016-05-11 2016-05-11 Simulation system

Country Status (1)

Country Link
WO (1) WO2017195299A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10161841A * 1996-12-02 1998-06-19 Mitsubishi Heavy Ind Ltd Pointer display control device
JPH10254675A * 1997-03-14 1998-09-25 Matsushita Electric Ind Co Ltd Data input method and data input device using the method
WO2013098869A1 * 2011-12-26 2013-07-04 Hitachi Ltd Computer for repositioning an object, and method and program for repositioning the object
JP2013143144A * 2012-01-09 2013-07-22 Samsung Electronics Co Ltd Display apparatus and item selection method thereof
JP2013152697A * 2011-12-28 2013-08-08 Alps Electric Co Ltd Input device and electronic apparatus


Similar Documents

Publication Publication Date Title
KR100851977B1 Method and apparatus for controlling a user interface of an electronic device using a virtual plane
EP2656181B1 Three-dimensional tracking of a user control device in a volume
US6198485B1 (en) Method and apparatus for three-dimensional input entry
US8305365B2 (en) Mobile device and area-specific processing executing method
TWI512548B (zh) 移動軌跡產生方法
Schneider et al. Accuracy evaluation of touch tasks in commodity virtual and augmented reality head-mounted displays
JP5117418B2 Information processing apparatus and information processing method
JP2014512530A Coordinate positioning apparatus
TW201911133A Controller tracking for multiple degrees of freedom
JP6110893B2 Virtual space position designation method, program, recording medium recording the program, and apparatus
JP5802247B2 Information processing apparatus
JP2005227876A Image processing method and image processing apparatus
US11579711B2 (en) Three-dimensional object position tracking system
CN101627356A Interactive input system and method
US20230418431A1 (en) Interactive three-dimensional representations of objects
CN120322650A Passive accessory
WO2017195299A1 Simulation system
JP4868044B2 Object attribute change processing apparatus, object attribute change processing method, 3D model processing apparatus, and 3D model processing method
KR101598807B1 Method for measuring the tilt of a pen and digitizer therefor
JP7452917B2 Operation input device, operation input method, and program
WO2019017900A1 Projecting inputs to three-dimensional object representations
JP4997265B2 Component data simplification apparatus, component data simplification method, and component data simplification program
JP2017204126A Simulation system
JP2002098520A Real-time monitoring three-dimensional measurement system using a computer
CN115239798B Data labeling method and apparatus, electronic device, and computer storage medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16901647

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16901647

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP