WO2015111481A1 - Information processing device, information processing system, block system, and information processing method
- Publication number
- WO2015111481A1 (PCT/JP2015/050791)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- block
- communication
- information processing
- information
- block set
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/1093—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
Definitions
- the present invention relates to information processing technology using an object in real space.
- a perceived affordance is given in order to produce a sense of reality and enable intuitive operation.
- There are devices, such as steering-wheel-shaped or handgun-shaped controllers, that allow the same operation as the real thing in a shape close to it, but their applications are limited. If the shape of the device is made variable, the range of applications expands, but ingenuity is required to measure the resulting shape changes and movement.
- In Non-Patent Document 1, an infrared LED and a photosensor that receives its light are incorporated in the joint portion of a component, thereby measuring the rotation angle of the component and specifying the shape.
- Because the rotation angle that can be measured is limited, the variable range of the shape is also limited.
- The manufacturing cost also becomes high. As the configuration of the apparatus is made more flexible in this way, the mechanism for measuring it becomes more complicated, and as a result the manufacturing cost and the processing cost tend to increase.
- the present invention has been made in view of such problems, and an object of the present invention is to realize various and advanced expressions using an apparatus that can be freely formed.
- An aspect of the present invention relates to an information processing apparatus.
- This information processing apparatus includes: a structure information receiving unit that acquires, from an assembly-type apparatus formed by connecting individually prepared blocks and comprising a communication block that has a communication mechanism and can transmit information related to the connection state together with other non-communication blocks, information related to the structure of the communication block; a structural analysis unit that extracts an overall image and position of the assembly-type apparatus from an image obtained by photographing it and integrates these with the information related to the structure of the communication block to generate state information including the shape, position, and orientation of the entire assembly-type apparatus; and an information processing unit that performs information processing based on the state information.
- This information processing system comprises an assembly-type device composed of a plurality of blocks that can be connected to each other, and an information processing device that performs information processing based on an input signal from the assembly-type device.
- The assembly-type device includes a communication block that has a communication mechanism and is configured to be able to transmit information related to the connection state, and other non-communication blocks.
- The information processing device includes a structure information receiving unit that acquires such information, a structural analysis unit that extracts an overall image and position of the assembly-type device from an image obtained by photographing it and integrates these with the information relating to the structure of the communication block to generate state information including the overall shape, position, and orientation, and an information processing unit that performs information processing based on the state information.
- Still another aspect of the present invention relates to a block system.
- This block system is composed of a plurality of blocks that can be connected to each other. The plurality of blocks includes communication blocks having a communication mechanism that can transmit and receive information related to the connection state between blocks to and from other blocks, and non-communication blocks other than those. At least one of the communication blocks transmits the information related to the connection state to an external information processing apparatus, where it is integrated with information acquired from a captured image of the block system.
- Still another aspect of the present invention relates to an information processing method. This information processing method acquires, from an assembly-type device formed by connecting individually prepared blocks and including a communication block that has a communication mechanism and can transmit information relating to the connection state together with other non-communication blocks, information related to the structure of the communication block, extracts an overall image and position of the assembly-type device from a captured image, integrates these pieces of information to generate state information, and performs information processing based on the state information.
- A diagram illustrating the information required to make the motion of a block set correspond to that of a 3D object in this embodiment; a flowchart showing the processing sequence by which the information processing apparatus establishes the correspondence between the motions of the block set and the 3D object; an example of the screen displayed on the display device to receive the user's model selection input in S46 of FIG.; and an example of the screen displayed on the display device to set a coordinate system common to the block set and the model selected in S48 of FIG.
- In this embodiment, a diagram showing an example of transitions of the block set when creating an object with two blocks, and of the display screen at the time of registration.
- In this embodiment, a diagram showing a case where joints matched one to one move by the same angle.
- In this embodiment, a diagram showing another example of such a correspondence.
- In this embodiment, a diagram showing an example in which one joint of the block set is matched with several joints of the 3D object, and an example of the screen displayed on the display device for setting the correspondence.
- a plurality of blocks are assembled or deformed, and their shapes, postures, and positions are used as input values for information processing. That is, such a block can be positioned as an input device for the information processing apparatus. Furthermore, the shape, posture, and position of the assembled block may be changed to reflect the result of processing performed by the information processing apparatus. In this case, the block is positioned as an output device for the information processing device.
- the processing performed by the information processing apparatus is not particularly limited, but a suitable aspect will be exemplified later.
- such a group of blocks or an assembly of the blocks will be collectively referred to as a “block set”.
- the block set may include objects other than blocks in a general sense such as an article imitating clothing and clay work, and the shape and material thereof are not limited. Hereinafter, these are also referred to as “blocks”.
- FIG. 1 shows a configuration example of an information processing system to which this embodiment can be applied.
- the information processing system 2 includes a block set 120, a camera 122 that captures the block set 120, an information processing device 10 that performs predetermined information processing using the block set 120 as an input device or an output device, an input device 14 that receives user operations on the information processing device 10, and a display device 16 that displays data output from the information processing device as an image.
- the information processing apparatus 10 may be a game device or a personal computer, for example, and may implement an information processing function by loading a necessary application program.
- the display device 16 may be a general display such as a liquid crystal display, a plasma display, or an organic EL display, or a television provided with such a display and a speaker.
- the input device 14 may be any one of general input devices such as a game controller, a keyboard, a mouse, a joystick, or a touch pad provided on the screen of the display device 16, or any combination thereof.
- the connection between the information processing apparatus 10, the camera 122, the input device 14, and the display device 16 may be wired or wireless, and may be via various networks. Alternatively, any two or more of the camera 122, the information processing apparatus 10, the input apparatus 14, and the display apparatus 16, or all of them may be combined and integrally provided. Further, the camera 122 does not necessarily have to be mounted on the display device 16. There may be a plurality of block sets 120 depending on the contents processed by the information processing apparatus 10. The block set 120 and the information processing apparatus 10 establish a wireless connection using a Bluetooth (registered trademark) protocol, an IEEE802.11 protocol, or the like. Alternatively, one block of the block set 120 and the information processing apparatus 10 may be connected via a cable.
- the block set 120 of this embodiment may be used as an input device for the information processing apparatus 10 or an output device. That is, in the former case, the information processing apparatus 10 performs information processing using the result of the user changing the position, posture, and shape of the block set 120 as an input value, and displays the result on the display device 16 as an image. In the latter case, the information processing apparatus 10 performs information processing by the user operating the input device 14, and as a result, the block set 120 itself is moved.
- the present embodiment may be configured so that both of these modes can be realized, or so that only one of them can be realized.
- Fig. 2 shows an example of the appearance of individual blocks that make up a block set.
- the blocks are roughly classified into two types.
- One is a block configured to be communicable with another block and the information processing apparatus 10, and the other is a block having no communication means.
- the former is called “communication block” and the latter is called “non-communication block”.
- the communication block may incorporate various sensors for measuring physical quantities such as the direction, angle, and position of the block in addition to a communication mechanism with other blocks.
- As shown in the figure, the blocks can have various shapes, such as the rectangular parallelepiped block 102j.
- Each block is provided with convex portions 104 and concave portions 106 having a predetermined size and shape, and by inserting a convex portion 104 into a concave portion 106, blocks can be connected at a desired position. Alternatively, a block such as the rectangular parallelepiped block 102j or the cylindrical block 102k may be provided with a recess 107 or the like shaped so that another block itself can be inserted, allowing it to contain another block.
- the block set may further include joint blocks 102g and 102h that can be inserted into the recesses 106 of the different blocks at both ends in order to adjust the interval between the blocks to be connected. Further, the position / posture relationship between the connected blocks may be changed by rotating the joint block.
- the convex portion 104 and the concave portion 106 of the communication block also serve as terminals that enable signal transmission between the blocks.
- each end is provided with a connector having a structure according to a standard such as a bus provided inside the block.
- Signal transmission and physical connection between blocks can be achieved at the same time by employing various commonly used connectors or by providing a dedicated special connector.
- the connection means between the blocks is not limited to the connection between the convex portion 104 and the concave portion 106, and may be realized by a hook-and-loop fastener, a magnet, adhesive tape, an adhesive, or the like.
- In that case, a signal transmission path is prepared separately, and it may be a wireless communication mechanism.
- Among the communication blocks, a certain block (the quadrangular prism block 102b in the case of FIG. 2) is composed of two blocks, a bending/extending shaft 110 that enables them to bend and extend, and a potentiometer that detects the angle between the two blocks.
- the bending and stretching mechanism includes a form in which protrusions of the other block are joined to both ends of the bending and stretching shaft 110 penetrating one block, and a form in which two blocks are joined by a hinge or a metal fitting that can bend and stretch.
- a mechanism having a plurality of degrees of freedom similar to a joint in a ball joint doll may be used, and is not particularly limited.
- the angle between the blocks may be changed continuously or may be changed in a plurality of stages.
- the direction of the shaft is not limited to that shown in the figure.
- The angle between the constituent blocks is preferably structured so that it is maintained even if the user releases the blocks.
- the angle between the blocks may be measured by an angle sensor other than the potentiometer.
- the blocks do not necessarily have to be connected.
- one block may be configured to be able to bend and stretch and be rotatable, and its bending and stretching angle and rotation angle may be measured.
- the mechanism for making the angle variable in this way is sometimes referred to as “joint”, and the two blocks whose relative angles change depending on the movement of the joint may be referred to as “link”.
- the communication block having the joint as described above may be able to control the joint angle according to a request from the information processing apparatus 10.
- the communication block is provided with an actuator such as a servo motor for controlling the joint angle.
- a certain block (in the case of FIG. 2, the plate-like block 102i) of the communication blocks may have an axis 109 that can rotate and protrudes from the side surface.
- By mounting wheels on a plurality of such shafts 109, the block can be moved like a car. The movement may be produced by the user pushing the block, or may be realized by a request from the information processing apparatus 10.
- the communication block is provided with an actuator such as a motor that rotates the shaft.
- the shaft 109 is an axle, a mechanism such as a rack and pinion that changes the direction of the wheel may be provided, and this can also be controlled by an actuator in accordance with a request from the information processing apparatus 10.
- Some communication blocks are equipped with motion sensing functions such as acceleration sensors, gyro sensors, geomagnetic sensors, or techniques that track the posture using a camera and markers or shapes attached to the object; one of these, or a combination of two or more, is mounted.
- the block in which a sensor is mounted, and the type and combination of sensors to be mounted, are determined according to the information processing to be realized using the block set. Alternatively, the user selects from various variations when assembling.
- a marker 108 may be provided in a certain block (in the case of FIG. 2, the quadrangular prism type block 102a) of the communication blocks.
- the marker 108 is for specifying a position in a three-dimensional space from a position and a size in an image photographed by a camera described later. Therefore, it is formed with a size, shape, and color that can be detected from the captured image by matching processing or the like.
- a sphere provided with a general light emitter such as a light emitting diode or a light bulb inside a spherical resin having light transmittance, a bar code, a two-dimensional code, or the like may be used.
- the marker 108 is provided in a plurality of blocks, the color may be changed for each block.
- the outer shells of the communication block and the non-communication block are typically made of synthetic resin, but the material is not limited and may be metal, glass, or the like.
- Since the non-communication block does not incorporate a communication mechanism or the like, its material, shape, and size can be freely determined.
- various parts such as clothes made of cloth and the head of a doll made of rubber may be used, or additional items such as weapons and accessories may be used.
- a user's own work may be used.
- it may be a solid object made by carving an eraser, a clay work, a paper craft, or origami.
- It may also be a block provided with an LED that emits light of a predetermined color when supplied with electricity from a communication block, or with a display device that displays an image.
- the information processing apparatus 10 uses, in a complementary manner, information on the skeleton shape and posture of the block set that can be acquired through communication with a communication block, and information on the external shape captured by the camera 122, so that the state of the block set is specified with high accuracy. Therefore, the appearance of the block set can be freely expressed using non-communication blocks.
- a block having a shape including a communication block such as a rectangular parallelepiped block 102j and a cylindrical block 102k in FIG. 2, may be realized as a non-communication block.
- FIG. 3 shows only the structure of the communication blocks in the block set 120 shown in FIG. 1. That is, the block set 120a in the figure is obtained by removing the rectangular parallelepiped block 102j and the cylindrical block 102k, which are non-communication blocks, from the block set 120 in FIG. 1, and is composed of the quadrangular prism blocks 102a and 102b shown in FIG. 2, a cubic block 102d, and a joint block 102h. Among these blocks, the lower block of the quadrangular prism block 102b and the cubic block 102d are contained in the rectangular parallelepiped block 102j and the cylindrical block 102k, respectively, which are non-communication blocks in the block set 120 of FIG. 1, and therefore cannot be seen.
- Such a structure composed of communication blocks can be regarded as constituting the skeleton of the entire block set 120.
- a part composed of communication blocks in the assembled block set 120 is referred to as a “core”.
- the posture and shape of the block set 120 are efficiently calculated by detecting the necessary parameters with the motion sensor and potentiometer provided in the communication blocks constituting the core.
- Specifically, from (1) the connection position and block type of each block, (2) the inclination vector m1 of the quadrangular prism block 102a or 102b, (3) the angle θ between the two blocks constituting the quadrangular prism block 102b, and (4) the lengths L1, L2, L3, L4, and L5 of each block, the direction of each block, and hence the shape and orientation of the central axis of the block set 120, can be derived.
- the above (1) and (4) are found by signal transmission between blocks, and the above (3) can be measured by a potentiometer.
- For the above (2), it is necessary and sufficient that a motion sensor is built into the quadrangular prism block 102a or 102b. Alternatively, a block that already has a built-in motion sensor may be selected as the quadrangular prism block 102a or 102b.
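- As a rough, non-authoritative sketch of how such a derivation could be organized in code (the symbols L1 to L5, θ, and m1 follow FIG. 3, while the planar single-joint treatment and all function and variable names are assumptions made only for illustration):

```python
import numpy as np

def core_central_axis(lengths, joint_angle_rad, m1):
    """Illustrative sketch: derive the polyline of the core's central axis.

    lengths         : block lengths along the chain, e.g. [L1, L2, ...] (cf. FIG. 3)
    joint_angle_rad : angle theta measured by the potentiometer at the joint
    m1              : inclination vector of the upper block from its motion sensor
    Assumption: the chain bends at a single joint after the first link, and the
    bend is modelled as a rotation about the z-axis for simplicity.
    """
    direction = np.asarray(m1, dtype=float)
    direction = direction / np.linalg.norm(direction)   # unit vector of the first link

    points = [np.zeros(3)]                               # start of the central axis
    for i, length in enumerate(lengths):
        points.append(points[-1] + length * direction)   # extend along the current link
        if i == 0:
            # after the first link, bend the chain by the measured joint angle
            c, s = np.cos(joint_angle_rad), np.sin(joint_angle_rad)
            rot_z = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
            direction = rot_z @ direction
    return np.array(points)                              # vertices of the central axis

# Example: a two-link chain with lengths 8 cm and 4 cm bent by 30 degrees.
axis = core_central_axis([8.0, 4.0], np.radians(30.0), m1=[0.0, 1.0, 0.0])
```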
- the position coordinates of the block set in the real-world three-dimensional space are specified using the image captured by the camera 122.
- By using the camera 122 as a stereo camera, the absolute position of the block set in the three-dimensional space constituted by the depth direction with respect to the camera 122 and the field-of-view plane of the camera can be acquired.
- a technique for acquiring the position of a target object in a three-dimensional space based on the principle of triangulation using parallax in images taken by a stereo camera from different left and right viewpoints is widely known.
- depth or three-dimensional information acquisition means other than binocular stereoscopic vision may be used.
- a viewpoint moving camera may be used, and the position of the block set may be specified by a TOF (Time Of Flight) method using an infrared irradiation mechanism and an infrared sensor that detects the reflected light.
- a touch panel may be provided on the upper surface of the table on which the block set 120 is placed, and the position placed by the touch panel may be detected.
- the position may be specified based on a still image, or a frame image of a moving image, captured by the monocular camera 122, by using the quadrangular prism block 102a provided with the marker 108.
- If the marker 108 is a light emitter having a known color, luminance, and size, the marker image can easily be detected from the captured image.
- the position coordinates (x1, y1, z1) of the marker in the three-dimensional space can be specified from the position and size of the marker image on the image. Even when other markers are used, general image recognition techniques such as pattern matching and feature point extraction can be applied.
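- As one hedged illustration of this pinhole-camera relationship (the focal length and the physical marker size are assumed to be known from calibration; the function below is a sketch, not part of the disclosed apparatus):

```python
def marker_position_from_image(u, v, apparent_diameter_px,
                               marker_diameter_m, focal_length_px, cx, cy):
    """Estimate the 3D position of a spherical marker from a single image.

    Assumes a pinhole camera whose intrinsics (focal_length_px and the
    principal point cx, cy) are known, and a marker of known real diameter.
    """
    # depth follows from the ratio of the real size to the apparent size
    z = focal_length_px * marker_diameter_m / apparent_diameter_px
    # back-project the image position (u, v) into camera coordinates
    x = (u - cx) * z / focal_length_px
    y = (v - cy) * z / focal_length_px
    return (x, y, z)   # corresponds to (x1, y1, z1) in the description

# Example: a 4 cm marker seen 50 px wide at pixel (700, 300) with f = 1000 px.
print(marker_position_from_image(700, 300, 50, 0.04, 1000, 640, 360))
```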
- efficient detection can be performed by applying an existing tracking technique.
- the marker 108 may be a device that emits invisible light such as infrared rays. In this case, a device for detecting invisible light is introduced separately, and the position of the marker 108 is detected. Similarly, a depth sensor, an ultrasonic sensor, a sound sensor, or the like may be used.
- the final position coordinates may be calculated by combining any two or more of the absolute position detection methods described above.
- FIG. 4 schematically shows the central axis of the core derived as described above. As illustrated, the position, posture, and shape of the central axis 124 are specified in a three-dimensional space.
- the three-dimensional space may be the camera coordinate system of the camera 122 or may be a transformation of the desired coordinate system.
- the position, posture and shape of the core and block set may be collectively referred to as “state”.
- FIG. 5 schematically shows an example of the internal configuration of the communication block.
- the block 126a includes a battery 128a, a communication mechanism 130a, a memory 132a, a position sensor 134, and a motion sensor 136a.
- the communication mechanism 130a includes not only a wired communication mechanism that receives signals from other blocks via connection terminals but also a mechanism that performs wireless communication with the information processing apparatus 10.
- the memory 132a holds the identification number of the block 126a. The identification number is associated in the information processing apparatus 10 with information such as the size of the block 126a and the positions of its concave and convex portions, and the same identification number may be assigned to blocks of the same type. Alternatively, it may be determined uniquely for every block so that it can be used for routing signal transmission within the assembled block set.
- the position sensor 134 is a sensor for acquiring the absolute position of the block 126a, and includes a marker for image recognition. However, in the case of a marker, the absolute position is detected by a combination with the camera 122 installed outside as described above.
- the motion sensor 136a is one of an acceleration sensor, a gyro sensor, and a geomagnetic sensor, or a combination of any two or more or a technique using a camera.
- the block 126b includes a battery 128b, a communication mechanism 130b, a memory 132b, and a motion sensor 136b. Each mechanism may be the same as described above for the block 126a, but the communication mechanism 130b may be configured only by a wired communication mechanism that receives signals from other blocks. Such a block is used in combination with a block 126a capable of communicating with the information processing apparatus 10. The same applies to the communication mechanisms of the other blocks.
- the block 126c includes a battery 128c, a communication mechanism 130c, a memory 132c, an angle sensor 138, and an actuator 139a.
- the block 126c is a communication block having a joint like the quadrangular prism block 102b in FIG. 2, and the angle sensor 138 is a sensor that detects a joint angle such as a potentiometer.
- the actuator 139a changes the joint angle according to a control signal from the information processing apparatus 10. For driving the actuator by the control signal, a general technique corresponding to the type of the actuator can be adopted.
- the block 126d includes a battery 128d, a communication mechanism 130d, a memory 132d, a rotary encoder 141, and an actuator 139b.
- the block 126d is a communication block having a rotatable shaft protruding outward like the plate-like block 102i in FIG. 2, and the block 126d itself can be propelled manually or automatically by mounting wheels. Alternatively, the shaft and the wheel may be integrally provided in advance.
- the rotary encoder 141 is a sensor that detects the amount of rotation of the wheel.
- the actuator 139b is a motor that rotates a wheel by a control signal from the information processing apparatus 10.
- the block 126e includes a communication mechanism 130e and a memory 132e. That is, the block 126e is a block that does not include a battery or a sensor. Therefore, it is used in combination with other blocks 126a and 126b equipped with batteries.
- the communication block in FIG. 5 is merely an example, and various sensors and other mechanisms may be combined in any manner.
- As portions where the block set moves, not only a joint and an axle but also a mechanism for changing the steering direction or for displacing some of the blocks may be provided.
- These mechanisms may be moved by an actuator driven by a control signal from the information processing apparatus 10.
- An LED or a display device may also be provided.
- a mechanism for energizing the connected non-communication block may be provided.
- any practical sensor may be incorporated.
- FIG. 6 shows the configuration of the block set 120 and the information processing apparatus 10 in detail.
- in terms of hardware, each element described as a functional block performing various processes can be configured by a CPU (Central Processing Unit), memory, and other LSIs; in terms of software, it is realized by a program loaded into memory or the like.
- In addition, each block of the block set 120 includes a communication mechanism, a memory, various sensors, and an actuator. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and they are not limited to any one of them.
- the block set 120 is formed by selecting and assembling individual blocks.
- FIG. 6 shows the functional blocks of the core portion of the assembled block set, and the communication blocks constituting the core block are the first block 142a, the second block 142b, the third block 142c,... .
- Of the communication blocks constituting the block set 120, basically only one block establishes communication with the information processing apparatus 10. Therefore, the role of a hub is given to the first block 142a. Information is then transmitted from communication blocks that are farther from the first block 142a in the connection relationship, and the information of the entire core is aggregated in the first block 142a.
- a block relatively close to the first block 142a is referred to as “upper”, and a far block is referred to as “lower”.
- The first block 142a may be determined in advance, or a block having a communication mechanism with the information processing apparatus 10 may be provided with a switch or the like (not shown), and the block switched on by the user may serve as the first block 142a.
- Alternatively, the first block 142a may be the block that first establishes communication with the information processing apparatus 10 in the assembly stage.
- When another block is connected to the first block 142a, that block becomes the second block 142b.
- When a further block is connected to it, that block becomes the third block 142c.
- the number of communication blocks constituting the core is not limited, and the configuration and operation can be considered in the same way even when the number is one or four or more.
- the first block 142a, the second block 142b, and the third block 142c include first communication units 143a, 143b, 143c, element information acquisition units 144a, 144b, 144c, and second communication units 146a, 146b, 146c, respectively.
- the second block 142b further includes a drive unit 148.
- the drive unit 148 may be provided in any other communication block.
- the first communication units 143a, 143b, 143c receive information transmitted from the directly connected lower blocks.
- the information received here includes an identification number of a block connected lower than the block, an identification number of a connection location, and a measurement result by a built-in sensor. When a plurality of blocks are connected, information is superimposed every time the block passes from the lowest block.
- the element information acquisition units 144a, 144b, and 144c each include a sensor built in the block and a terminal provided at a location where another block is connected, and a measurement result of the sensor and a location where a lower block is connected Information related to is acquired.
- the second communication units 146a, 146b, and 146c add the information acquired by the element information acquisition units 144a, 144b, and 144c of their own block to the information received by the first communication units 143a, 143b, and 143c, namely the identification numbers of the lower blocks, the identification numbers of the connected locations, and the measurement results of the built-in sensors, and transmit the result as a signal to the directly connected upper block.
- the second communication unit 146a of the first block 142a transmits the information to the information processing apparatus 10. Further, the second communication unit 146a receives processing start/end request signals, various signals necessary for establishing communication, and control signals for driving the actuators of the block set from the information processing apparatus 10, and thus functions as an interface with the information processing apparatus 10.
- the signal is sequentially transferred from the first block 142a to the lower block. That is, the first communication units 143a, 143b, and 143c of each block transmit the signal to the directly connected lower block.
- the second communication units 146b and 146c in each block receive the signal from the directly connected higher-order block.
- the drive unit 148 of the second block 142b includes an actuator that changes the joint angle or rotates the axle, and when the second block 142b is designated as the drive target in a control signal transmitted from the upper block, it moves the actuator by the corresponding amount.
- the information processing apparatus 10 includes a core information receiving unit 20 that receives information related to the state of the core from the first block 142a of the block set 120, a structure analysis unit 22 that identifies the shape, posture, and position of the block set 120 based on the image captured by the camera 122 and the information related to the state of the core, an information processing unit 30 that performs predetermined information processing according to the shape, posture, and position of the block set 120 or a user operation on the input device 14, a display processing unit 32 that generates the image to be displayed as a result and outputs it to the display device 16, and a drive control unit 34 that transmits a signal for controlling the operation of the block set 120.
- the information processing apparatus 10 further includes a block information storage unit 24 that stores information related to individual blocks, a model data storage unit 26 that stores model data of a 3D object to be displayed on the display device 16, and a correspondence information storage unit 28 that stores correspondence information between the parts and movements of the block set and the 3D object.
- the core information receiving unit 20 receives signals including the identification numbers of the communication blocks constituting the core, the connection locations thereof, and the information related to the measurement result by the built-in sensor, which are aggregated by the first block 142a of the block set 120.
- the structure analysis unit 22 acquires from the camera 122 data of moving images and still images obtained by capturing the block set 120. Then, the information received by the core information receiving unit 20 and the information obtained from the captured image are integrated, and the position, posture, and shape of the entire block set 120 are specified. Since the signal from the block set 120 and the image data from the camera 122 are input with little delay, it is assumed that they correspond in time, but synchronization processing or the like may be performed depending on the required time resolution.
- the structure analysis unit 22 identifies the shape and orientation of the core in the block set 120 based on information from the core information reception unit 20.
- the information of L1 to L5 in FIG. 3 is derived based on the identification numbers of the communication blocks constituting the core.
- the connection position and the angle formed by the blocks are specified from the identification number of the actual connection location and the information of the angle sensor.
- the vector m1 in FIG. 3 is derived from the information of the motion sensor.
- Information relating to the position of the block set 120 in the three-dimensional space and the superficial shape of the block set 120 including non-communication blocks is specified based on a captured image transmitted from the camera 122, a depth image generated from the captured image, or the like. .
- an image of a communication block included in the core such as the marker 108 in FIG. 3 is detected from the image, and the position is derived from the depth image and the like as a reference portion.
- Then, from the structure of the core connected to the reference region and the positional relationship of the non-communication block images with respect to the reference region image, the positional relationship between the core and the non-communication blocks is determined, whereby the position, posture, and shape of the entire block set 120 can be specified.
- the block information storage unit 24 stores basic information of each block used as a block set.
- the basic information is information in which an identification number given in advance to the block is associated with information regarding a shape, size, and a location where another block can be connected.
- It also includes information in which the identification number previously assigned to a block is associated with appearance features such as color, pattern, material, and texture.
- the more detailed such appearance features are, the more accurately blocks can be identified.
- information on non-communication blocks may not be stored.
- the information processing unit 30 executes processing to be performed according to the state of the block set 120 specified by the structure analysis unit 22 or a user operation via the input device 14. For example, when the block set 120 is assembled, a 3D object representing the shape of the block set and a model 3D object associated with the 3D object are displayed. Then, the displayed 3D object is moved in accordance with the movement of the block set 120. Alternatively, a computer game is started and progressed according to a user operation via the input device 14, and the block set 120 is moved accordingly.
- the model data storage unit 26 stores data necessary for rendering the object model that the information processing unit 30 displays on the display device 16.
- This object model may be designed in advance such as a character appearing in a game, or may be created by a user in accordance with an assembled block set.
- the information processing unit 30 further performs processing for associating parts of the block set, such as joints and wheels, with parts of the object, and further associating the movements of both.
- the information processing unit 30 may set all the correspondences itself, or may display a setting screen so that the user makes the associations and accept the setting input, or these approaches may be combined as appropriate.
- the correspondence information storage unit 28 stores information related to the correspondence relationship between the parts and movements set in this way.
- Even when the user creates a block set freely, not only its position but also its shape and posture can thereby be linked with the object on the screen.
- the movements of both may not necessarily be completely the same, and various changes can be set by associating the movements.
- the movement may not be reflected in real time.
- by storing the temporal change of the state of the block set moved by the user, it is possible to realize a mode in which the corresponding object reproduces the movement at an arbitrary timing. As a result, the motion of a character in a computer game or animation can be created with an easy operation.
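- A minimal sketch of this record-and-replay idea, under the assumption that each state sample is simply a dictionary of joint angles and a pose, and that a caller-supplied callback applies a state to the 3D object (all names here are illustrative):

```python
import time

class MotionRecorder:
    """Illustrative sketch: store the time series of block-set states and
    replay it on a corresponding 3D object at an arbitrary later timing."""

    def __init__(self):
        self.samples = []                    # list of (timestamp, state) pairs

    def record(self, state):
        # state: e.g. {"joint_angles": {...}, "position": (...), "posture": (...)}
        self.samples.append((time.time(), state))

    def replay(self, apply_to_object, speed=1.0):
        """Apply the recorded states to a 3D object via a caller-supplied callback."""
        if not self.samples:
            return
        t0 = self.samples[0][0]
        start = time.time()
        for timestamp, state in self.samples:
            # wait until the (scaled) recorded offset has elapsed
            target = start + (timestamp - t0) / speed
            delay = target - time.time()
            if delay > 0:
                time.sleep(delay)
            apply_to_object(state)           # e.g. update the character's joints
```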
- the drive control unit 34 transmits a control signal to the block set 120 in accordance with a request from the information processing unit 30 in a mode in which the information processing apparatus 10 moves the block set 120.
- the signal to be transmitted varies depending on the control method, and a technique generally used in the field of robot engineering or the like may be appropriately employed.
- the transmitted control signal is received by the second communication unit 146a of the first block 142a in the block set 120, and through signal transmission within the block set 120 it is reflected in the operation of the drive unit 148 of the target block (the second block 142b in the case of FIG. 6).
- the control signal may be directly transmitted to the target block by wireless communication or the like.
- the display processing unit 32 creates image data as a result of the processing performed by the information processing unit 30 and displays the image data on the display device 16.
- the object is drawn according to the movement of the block set 120 at the output frame rate of the display device 16 and is output to the display device 16 as a video signal.
- a general computer graphics technique can be applied to the drawing process itself.
- the display processing unit 32 further causes the display device 16 to display a screen for setting the correspondence between the block set 120 and the object part and the movement of the object.
- a screen on which the user confirms or corrects the set correspondences may be displayed.
- the display processing unit 32 appropriately displays an image corresponding to information processing executed by the information processing unit 30 such as a game screen.
- FIG. 7 schematically shows an example of information transmission paths and information transmitted in the block set 120.
- each circle with a number written therein represents a block, and a straight line between the circles represents a state in which the blocks are connected.
- the number in the circle is the identification number of each block.
- the block with the identification number “1” corresponds to the first block 142 a in FIG. 6 and establishes communication with the information processing apparatus 10.
- the blocks with the identification numbers “2” and “3” in FIG. 7 are connected in series to the block with the identification number 1, it can be understood that they correspond to the second block 142b and the third block 142c in FIG. it can.
- a plurality of blocks may be connected to one block.
- the block with the identification number “2” and the block with the “5” are connected to the block with the identification number “1”.
- the block with the identification number “3” and the block with the “4” are connected in series in this order to the block with the identification number “2”.
- the block with the identification number “5” is connected in parallel with the block with the identification number “6” and the block with the “7”.
- a block having no identification number is connected to the block having the identification number “6”, and a block having the identification number “8” is connected to the block.
- the block having no identification number corresponds to a non-communication block.
- information transmission is basically transmitted from the lower block to the upper block.
- the content of the information to be transmitted is shown with an arrow indicating the transmission direction.
- information transmitted from the block having the identification number “3” to the block “2” is indicated as [3: J2 (4)].
- This is a signal having a format of “own identification number: identification number of a connection location provided in a block (identification number of a block connected to the block)”.
- the block of the identification number “4” is connected to the location of the identification number “J2”.
- However, the format and contents of the information are not limited to those shown in the figure.
- When the block having the identification number "3" receives the signal from the block "4", it associates the number of the terminal that received the signal as the identification number of the connection location, further associates its own identification number "3", and transmits the result to the block with the identification number "2", which is one level higher. The content of this signal is [3: J2(4)] as described above.
- the block with the identification number "2" also generates a signal [2: J5(3)] in which its own identification number, the identification number of the connection location ("J5" in the example in the figure), and the identification number "3" of the connected block are associated with each other.
- Since the block with the identification number "2" has a built-in sensor, a signal in which the measurement result is associated with its own identification number is also generated.
- the measurement result is represented as “result”, but actually, a specific numerical value is substituted according to the type of sensor.
- the block with the identification number "2" transmits the data generated in this way, together with the data transmitted from the lower block, that is, [3: J2(4)], to the block with the identification number "1", which is one level higher.
- these signals need not always be transmitted at the same time, and only the information may be transmitted when the contents of the signal once transmitted are changed.
- Since the blocks with the identification numbers "6" and "7" connected to the block with the identification number "5" do not have built-in sensors and their connection locations are uniquely determined, only the identification numbers are obtained from these blocks. Similarly to the block "4", the signals [6: -] and [7: -] are transmitted to the block having the identification number "5", respectively. Another block is connected to the block with the identification number "6"; however, since that block is a non-communication block, information from it cannot be obtained.
- the block with the identification number "5" generates a signal in which the identification numbers of the connection locations and of the connected blocks are associated with its own identification number, and transmits it to the block with the identification number "1", which is one level higher. As shown in the figure, when a plurality of blocks are connected, they are collectively expressed as [5: J3(6), J8(7)] or the like.
- “J3” and “J8” are identification numbers of connection locations to which the blocks of identification numbers in parentheses are connected.
- the core information of the block set is collected in the block having the identification number “1”.
- the block with the identification number "1" likewise generates a signal in which the identification numbers of its connection locations are associated with the identification numbers of the blocks connected to them, and transmits it to the information processing apparatus 10 together with the signals transmitted from the lower blocks.
- the information processing apparatus 10 can sequentially acquire the identification numbers of the blocks constituting the core, the connection relationship between the blocks, and the measurement results in the blocks incorporating the sensors.
- the block having the identification number “8” may directly transmit its own data to the information processing apparatus 10.
- By transmitting its own identification number and measurement results directly to the information processing apparatus 10, the block with the identification number "8" allows the information processing apparatus 10 to grasp that some block is connected beyond the block with the identification number "6", and to estimate the shape of that block and the approximate connection status. As the number of sensors built into the block with the identification number "8" increases, the accuracy of this information improves. By combining blocks from which a plurality of pieces of position information can be acquired, the structure of blocks in the blind spot of the camera 122 can be specified with high accuracy.
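- To make the aggregation concrete, the following hypothetical parser turns signals written in the form "own identification number: connection location (connected block)" into a connection map; the textual syntax is only an illustrative rendering of the signals shown in FIG. 7, not a normative wire format:

```python
import re

def parse_core_signals(signals):
    """Build a connection map {block id: {connection location: connected id}}
    from aggregated signals such as "5: J3(6), J8(7)".

    The string syntax merely mirrors the notation used in FIG. 7; terminal
    blocks that report nothing but their own number yield an empty map.
    """
    tree = {}
    for sig in signals:
        owner, _, body = sig.partition(":")
        entries = {loc: child for loc, child in re.findall(r"(\w+)\((\w*)\)", body)}
        tree[int(owner)] = entries
    return tree

# Signals corresponding to the example of FIG. 7 (the connection locations of
# block "1" are hypothetical, since they are not given in the description):
signals = ["3: J2(4)", "2: J5(3)", "5: J3(6), J8(7)", "1: J1(2), J4(5)"]
print(parse_core_signals(signals))
# -> {3: {'J2': '4'}, 2: {'J5': '3'}, 5: {'J3': '6', 'J8': '7'}, 1: {'J1': '2', 'J4': '5'}}
```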
- FIG. 8 shows a data structure example of basic information of communication blocks stored in the block information storage unit 24 of the information processing apparatus 10.
- the communication block information table 160 includes an identification number column 162, a shape column 164, a size column 166, and a connection location column 168.
- In the identification number column 162, an identification number assigned in advance to each communication block constituting the block set is described.
- In the shape column 164, the shape type of each communication block, that is, the block type as illustrated in FIG. 2, such as "quadrangular prism" or "cube", is described.
- In the size column 166, the horizontal width, depth, and vertical length of each communication block are described.
- In the connection location column 168, the connection locations provided in each communication block are described in association with their identification numbers.
- Each connection location entry takes the form: connection location identification number → (face number, x coordinate and y coordinate within that face).
- the face number is uniquely determined in advance for each face of the block.
- the communication block with the identification number “1” is a quadrangular prism block having a width of 4 cm, a depth of 4 cm, and a vertical length of 8 cm.
- the connection location with the identification number “J1” is at the position of the coordinates (2, 2) on the first surface.
- the connection location with the identification number “J2” is at the position of coordinates (1, 2) on the second surface.
- the format of the notation is not particularly limited.
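- One way such basic information could be held in code (a hypothetical sketch mirroring the columns of FIG. 8; the class names are illustrative and the sample values simply restate the example of identification number "1" above):

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class ConnectionLocation:
    face: int          # face number, fixed in advance for each face of the block
    x: int             # x coordinate within the face
    y: int             # y coordinate within the face

@dataclass
class CommBlockInfo:
    identification: int
    shape: str                         # e.g. "quadrangular prism", "cube"
    size_cm: Tuple[int, int, int]      # width, depth, vertical length
    connections: Dict[str, ConnectionLocation] = field(default_factory=dict)

# Sample entry mirroring the example in the description (identification number "1"):
block_table = {
    1: CommBlockInfo(
        identification=1,
        shape="quadrangular prism",
        size_cm=(4, 4, 8),
        connections={
            "J1": ConnectionLocation(face=1, x=2, y=2),
            "J2": ConnectionLocation(face=2, x=1, y=2),
        },
    ),
}
```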
- FIG. 9 is a diagram for explaining basic processing for specifying the state of a block set including non-communication blocks.
- the upper left of the figure shows the state of the core 170 specified based on the information received by the core information receiving unit 20. What is specified from this information is the connection relationship of the communication blocks and the core shape based thereon. However, if a position sensor is provided inside, the position in real space can also be determined.
- the structure analysis unit 22 generates a depth image 172 from the image captured by the camera 122.
- the depth image is an image in which the object in the field of view of the camera 122 is represented by the distance from the camera as a pixel value, and can be created by using the camera 122 as a stereo camera as described above.
- the depth image 172 in the figure schematically shows an image having a lower brightness as the distance is longer, and an image of the entire block set 120 appears without distinction between communication blocks and non-communication blocks.
- In addition, a color image taken by the camera 122 may be used. Then, by taking the volume difference between the core 170 viewed from the camera 122 side and the block set appearing as the image in the depth image 172, the state of the non-communication blocks excluding the core portion of the block set can be specified.
- In the figure, the shaded portion represents the non-communication blocks obtained as the difference.
- In this way, the position, posture, and shape of the entire block set 120 including the core and the non-communication blocks can be specified. If the image of the block set 120 can be identified by background separation or the like, and the image and the core can be aligned based on the apparent size of the core, a general captured image alone may be used instead of the depth image.
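- A minimal sketch of the volume-difference step, assuming both the observed block set and the reconstructed core have already been converted into boolean voxel occupancy grids of the same resolution (that conversion is outside this sketch):

```python
import numpy as np

def non_communication_volume(block_set_voxels, core_voxels):
    """Return the voxels occupied by non-communication blocks only.

    block_set_voxels : boolean occupancy grid of the whole block set, obtained
                       by back-projecting the depth image into a voxel grid
    core_voxels      : boolean occupancy grid of the core, rendered from the
                       shape reconstructed out of the communication-block data
    """
    return np.logical_and(block_set_voxels, np.logical_not(core_voxels))

# Example with a tiny 2x2x2 grid: one observed voxel does not belong to the core.
observed = np.array([[[True, True], [False, False]], [[False, False], [False, False]]])
core     = np.array([[[True, False], [False, False]], [[False, False], [False, False]]])
print(non_communication_volume(observed, core))
```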
- FIG. 10 shows an example data structure of basic information of non-communication blocks stored in the block information storage unit 24 of the information processing apparatus 10.
- the non-communication block information table 180 includes an identification number column 182, a shape column 184, a size column 186, and a color column 188.
- In the identification number column 182, identification numbers assigned in advance to the non-communication blocks constituting the block set are described. Blocks having the same shape, size, and color may have the same identification number.
- In the shape column 184, the shape type of each non-communication block, that is, the type of block as illustrated in FIG. 2, such as "cuboid" or "cylinder", is described.
- In the size column 186, the horizontal width, depth (or diameter), and vertical length of each non-communication block are described.
- In the color column 188, the color of each non-communication block is described.
- the information in the shape column 184, the size column 186, and the color column 188 may be information such as polygons and textures, as in object model data in 3D graphics. Further, the information held in the non-communication block information table 180 is not limited to that shown in the figure. For example, if the communication blocks that can be connected are limited by the shape of a recess or the like, retaining the identification numbers of the connectable communication blocks makes it possible to narrow down the non-communication blocks to which a communication block belonging to the specified core is connected.
- the structure analysis unit 22 refers to the non-communication block information table 180 and identifies non-communication blocks that respectively match the images of portions other than the core in the depth image 172 shown in FIG.
- Alternatively, the exact shape is identified in a time-developing manner by tracking the movement of the block set, without assuming any particular shape in the initial state. If a part is concealed and the shape cannot be determined even by referring to the non-communication block information table 180, one of the candidate shapes, or a certain shape for the concealed part, is assumed, and the accuracy of shape recognition is improved by gradually correcting it over time, as sketched below.
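- The following hypothetical helper sketches the candidate-narrowing step: dimensions estimated from the residual (non-core) volume are compared with the entries of the non-communication block information table, and dimensions that are still hidden are simply left unchecked (all names and the tolerance are illustrative):

```python
def match_non_communication_block(observed_size, candidates, tolerance=0.1):
    """Sketch of narrowing down candidate non-communication blocks by size.

    observed_size : (width, depth, height) estimated from the residual volume;
                    axes hidden from the camera may be None (unobserved)
    candidates    : iterable of (identification_number, (w, d, h)) pairs taken
                    from the non-communication block information table
    Returns the identification numbers whose known dimensions match the
    observed ones within the tolerance; several matches mean a shape is
    assumed provisionally and corrected in later frames (cf. FIG. 11).
    """
    matches = []
    for ident, dims in candidates:
        ok = all(
            obs is None or abs(obs - ref) <= tolerance * ref
            for obs, ref in zip(observed_size, dims)
        )
        if ok:
            matches.append(ident)
    return matches

# Example: only the front face was observed, so the depth is unknown (None).
print(match_non_communication_block((4.0, None, 4.0),
                                    [(11, (4, 4, 4)), (12, (8, 4, 4))]))
```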
- FIG. 11 is a diagram for explaining processing for specifying the shape of a block set in a time-development manner.
- the vertical axis in the figure represents time, and it is assumed that time has elapsed from time “T1” to “T2”.
- As shown in the uppermost part of the figure, the block set 190 has a cylindrical non-communication block (shown with shading) attached to the lower half of a quadrangular prism communication block with a marker (shown in white outline).
- At time T1, it is assumed that the camera 122 has photographed the block set 190 placed on the horizontal plane from the front, as shown in the drawing. In this case, as shown in the photographed image 192a, only the side surface of each block appears as an image.
- When the core is removed, the remaining portion is apparently a quadrangle that is an image of the side surface of the non-communication block (depth image 198a). That is, at time T1, the three-dimensional shape of the non-communication block may not be determinable. However, depending on the resolution of the depth image, whether it is a cylinder or a rectangular parallelepiped may be determined from the presence or absence of curvature of the front surface. Further, if only one non-communication block registered in the non-communication block information table 180 has a matching size, aspect ratio, and so on, the shape can be specified.
- the depth image 198a represents volume data obtained by taking a volume difference between the image of the block set in the depth image acquired from the photographed image and the core, and is not necessarily generated as an image. The same applies to the subsequent drawings.
- the structure analysis unit 22 detects a candidate non-communication block from the non-communication block information table 180 and assumes that it is one of them.
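- A rough sketch of this candidate-narrowing step, assuming the apparent width and height of the unexplained image region have already been measured from the depth image; the table entries, tolerance, and matching rule are all hypothetical.

```python
# Hypothetical registered blocks: (id, shape, width_mm, depth_mm, height_mm)
TABLE = [
    (1, "cuboid", 40.0, 40.0, 80.0),
    (2, "cylinder", 40.0, 40.0, 80.0),
    (3, "cuboid", 20.0, 20.0, 20.0),
]

def find_candidates(observed_w, observed_h, table=TABLE, tol=0.1):
    """Return entries whose visible face could explain the observed
    quadrangle (width x height) within a relative tolerance. Curvature
    cues mentioned in the text would further separate cylinders from
    cuboids when the resolution allows."""
    matches = []
    for block_id, shape, w, d, h in table:
        for face_w in (w, d):
            if (abs(face_w - observed_w) / face_w <= tol and
                    abs(h - observed_h) / h <= tol):
                matches.append((block_id, shape))
                break
    return matches

print(find_candidates(40.0, 80.0))
# Two candidates remain here, so one is assumed provisionally and
# corrected at a later time step, as described in the text.
```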
- the same plane as the communication block whose shape is specified is assumed as an unspecified surface of the non-communication block.
- a block set shape 200 when a non-communication block is assumed to be a rectangular parallelepiped is shown.
- the block set shape 200 shown in the figure is a shape recognized by the information processing apparatus 10 at the time T1, and is not necessarily used for display. For example, while executing an application that displays the state of the block set as a 3D object as it is, the assumed shape shown in the figure may be drawn. Alternatively, it may be used as a base for correcting the shape in the next time step without any display.
- the assumed shape is used to identify the shape of the non-communication block in a time-developing manner and to recognize the shape change of the block set being assembled by the user in real time and efficiently. This information is stored for at least a predetermined period and used for later processing.
- the shape of the block set is managed by giving structural identification numbers (hereinafter referred to as “element numbers”) to the communication blocks and non-communication blocks constituting the block set.
- Element numbers "#C1" and "#N1" are assigned to the communication block and the non-communication block of the block set, respectively.
- Here, the communication block and the non-communication block are distinguished by the letters "C" and "N", but this does not limit the format of the element numbers.
- The element numbers are associated with the identification numbers assigned in advance to each block as shown in FIGS. 8 and 10, and are recorded together with information on the connection relationship between the non-communication block and the communication block (connection surface, connection position, and orientation).
- Hereinafter, information regarding the structure of the entire block set is referred to as "structure data".
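- For illustration, a minimal sketch of what such structure data might look like in memory for the block set of FIG. 11; the element numbers follow the text, while the field names and values are assumptions.

```python
# Hypothetical structure data for the block set of FIG. 11 at time T1.
# Element numbers (#C1, #N1) follow the text; field names are assumptions.
structure_data = {
    "#C1": {
        "kind": "communication",
        "block_id": 101,          # identification number assigned in advance
    },
    "#N1": {
        "kind": "non_communication",
        "block_id": 1,            # provisional: assumed cuboid at time T1
        "connected_to": "#C1",    # counterpart element
        "connection": {
            "surface": "bottom",  # connection surface on #C1
            "position": (0, 0),   # connection position on that surface
            "orientation_deg": 0, # orientation of the connected block
        },
    },
}

# When the cylinder is confirmed at time T2 (FIG. 11), only the block_id
# associated with #N1 is corrected; the connection relationship is reused.
structure_data["#N1"]["block_id"] = 2
```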
- the captured image 192b at time T2 is an image that includes some upper surfaces of the communication block 194 and the non-communication block 196, as illustrated.
- the remaining portion includes the upper surface of the cylinder of the non-communication block (depth image 198b). From the shape of the image, it can be determined that the non-communication block is not a rectangular parallelepiped assumed at time T1, but is likely to be a cylinder. By repeating such correction as the posture of the block set changes, the reliability of the shape recognition of the block set increases.
- The structure analysis unit 22 therefore replaces the shape of the non-communication block, which was assumed to be a rectangular parallelepiped at time T1, with a cylinder.
- As a result, the accurate block set shape 202 is recognized.
- This processing is actually processing for correcting the identification number of the rectangular parallelepiped block associated with the element number # N1 to the identification number of the cylindrical block.
- the polygon model may be corrected.
- FIG. 11 illustrates a block set having a very simple structure. In practice, however, another non-communication block may be connected behind a non-communication block, or blocks may overlap, so that only part of a non-communication block is visible.
- In such a case, the visible parts may be identified gradually as the user tilts the block set or changes its orientation.
- a display that prompts the user to rotate the block set with respect to the camera 122 so that photographing can be performed from a plurality of directions may be performed.
- the user may be able to specify an actual block.
- the shape may be specified from the photographed image by attaching a two-dimensional bar code representing the shape or a graphic marker to each non-communication block.
- FIG. 12 is a diagram for explaining processing for specifying the shape of a block set whose structure changes during assembly or the like.
- the representation of the figure is the same as in FIG. 11, with the vertical axis as the time axis, the captured images 192b to 192d at each time, the depth images 198b to 198d of the image excluding the core portion, and the recognized block set shape 202. , 210, 216 are shown in order from the left.
- the time T2 at the top of the figure corresponds to the time T2 in FIG. 11, and the captured image 192b, the depth image 198b, and the recognized block set shape 202 are the same.
- the structure analysis unit 22 recognizes the connection of the new non-communication block 204 by comparing the depth image 198b at the previous time T2 with the depth image 198c at the current time T3.
- the non-communication block 196 that existed even at the previous time may have changed orientation with respect to the camera, and may not be distinguished from the newly connected block. Therefore, the structure analysis unit 22 continues to perform position and orientation tracking on the non-communication block that has been recognized once, so that the same block can be recognized as the same even if the position and orientation change.
- a general tracking technique using an active contour model may be applied.
- the change in the orientation of the non-communication block connected to the core may be derived from the change in the position and orientation of the core that can be specified by the signal from the core.
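- A simplified sketch of how the comparison between the volumes derived from the depth images at successive time steps might flag a newly connected or removed block; the voxel-grid representation and the threshold are assumptions, and tracking of already-recognized blocks is omitted.

```python
import numpy as np

def detect_volume_change(prev_occ: np.ndarray, curr_occ: np.ndarray, min_voxels=20):
    """Compare boolean occupancy grids (core volume already subtracted)
    from time t-1 and t, and report whether a significant region has
    appeared or disappeared, as when a non-communication block is newly
    connected or removed."""
    added = curr_occ & ~prev_occ
    removed = prev_occ & ~curr_occ
    return {
        "block_added": int(added.sum()) >= min_voxels,
        "block_removed": int(removed.sum()) >= min_voxels,
        "added_voxels": int(added.sum()),
        "removed_voxels": int(removed.sum()),
    }

# Toy example: a small slab of volume appears between the two time steps.
prev_grid = np.zeros((16, 16, 16), dtype=bool)
curr_grid = prev_grid.copy()
curr_grid[4:8, 4:8, 0:3] = True
print(detect_volume_change(prev_grid, curr_grid))
```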
- the non-communication block information table 180 is referred to as described with reference to FIG.
- the shape of the non-communication block is specified.
- the connection relationship with the core is specified.
- the block set shape 210 at time T3 can be recognized as shown.
- A new element number "#N2" is assigned to the added non-communication block, it is associated with the identification number assigned in advance to that block, and the connection relationship with the communication block is recorded, thereby updating the structure data.
- the relative speed between the block set including the core and the block in the field of view may be monitored. In this case, when the relative speed becomes 0, it is determined that a new block is connected.
- At time T4 after time T3, it is assumed that the user replaces the previously connected non-communication block 196 with a non-communication block 214 of another shape (captured image 192d).
- When the depth image 198d at this time is compared with the depth image 198c at the previous time T3, it can be seen that the shape of the image 215 of the non-communication block has changed.
- the non-communication block information table 180 is referred to as before.
- the shape of the new non-communication block is specified. Since the connection relationship with the core is the same as that previously connected, the previous information can be used as it is. Therefore, by updating only the block identification number associated with the same element number “# N1” as before in the structure data to the one specified this time, the shape 216 of the block set at time T4 is as illustrated. It will be recognized.
- FIG. 13 is a diagram for explaining processing for specifying the shape of a block set that deforms due to a change in the joint angle of the core.
- the representation of the figure is the same as in FIGS. 11 and 12, with the vertical axis as the time axis, the captured images 192e to 192g, the depth images 198e to 198g at each time, and the recognized block set shapes 218, 222, and 224 to the left. They are shown in order.
- the shape of the block set is different from that shown in FIGS. 11 and 12, and the non-communication blocks 228 and 230 are respectively mounted so as to include the upper and lower links of the communication block 226 having markers and joints.
- The structure analysis unit 22 assigns the element numbers "#C1" and "#N1" to the series of communication blocks and to the non-communication blocks regarded as one, respectively, associates them with the identification numbers of the respective blocks, and records the connection relationship between the two as structure data, whereby the shape 218 of the block set is recognized as shown in the figure.
- the structure analysis unit 22 acquires the core state included in the non-communication block from the core information separately transmitted from the block set. In other words, the joint angle of the communication block is also grasped. Therefore, when the joint angle of the communication block inside changes at an angle corresponding to the deformation of the non-communication block, it can be determined that the deformation of the non-communication block is caused by the bending and stretching of the core.
- The distinction between the upper and lower blocks is usually obvious from the tilt angle and the block shapes; however, if the tilt angle is small, or the block has a shape or material in which a change in angle is not conspicuous, the boundary between the blocks may be difficult to identify.
- In such a case, by assuming a dividing plane 220 perpendicular to the core axis at the position of the joint when the joint is not bent, and dividing the block by this dividing plane, the block originally regarded as one may be divided into two blocks.
- the plane of the dividing surface 220 is assumed as the top surface of the lower block and the bottom surface of the upper block that were originally in contact with each other, and correction is performed in the subsequent shape specifying process.
- Since the non-communication block is treated as two blocks by the processing at time t2, even if the joint of the block set returns to the unbent state of time t1 (captured image 192g, depth image 198g), the information processing apparatus 10 can recognize that two non-communication blocks are connected (block set 224). Thereafter, even if the core joint bends and stretches, the non-communication blocks including the links are managed individually, so that the shape need not be specified anew.
- each block may be managed individually based on the information.
- In this way, the modification of the structure data can be limited to the part where the change has occurred, which is advantageous.
- each non-communication block may be associated with another object model, and at the time of display, the portion of the non-communication block may be replaced with the object model associated therewith and rendered. .
- Even if the actual block has only a rough shape such as a rectangular parallelepiped, it can thus be displayed as a realistic object at the time of display.
- the time change of the block set is acquired for both the core and the non-communication block. Therefore, various modes can be realized by storing the time change as a history for a predetermined time and reading and using it as necessary. For example, when it is desired to return the block set being assembled to the state several stages before, when the user makes a request via the input device 14, the state of the block set at the stage is displayed as a 3D object. The user can return the actual block set to the previous state while viewing it. If the virtual viewpoint for the block set rendered as a 3D object is changed by the input device 14, the state of the previous block set can be confirmed from a plurality of directions. If the history is stored for a long time, a block set created in the past can be displayed as an object for each production stage, and an actual block set can be assembled and reproduced following that.
- The information of a block that has been removed need not be deleted immediately from the structure data; the block may instead keep its original element number and be managed with a flag or the like indicating that it has been removed. In this case, when another block is attached to the same location, the same element number is given to a plurality of non-communication blocks, but whether the information is current or past can be determined by the flag.
- the object of the block set in the previous state can be displayed by detecting the original block from the structure data and returning it.
- FIG. 14 is a flowchart showing a processing procedure for specifying the state of a block set including non-communication blocks.
- The processing starts, for example, when the user turns on one of the blocks of the block set 120 that includes a battery and inputs an instruction to start processing, such as selecting an application in the information processing apparatus 10, via the input device 14.
- the structure analysis unit 22 causes the camera 122 to start photographing a block set (S10).
- a predetermined initial image is displayed on the display device 16 by the cooperation of the information processing unit 30 and the display processing unit 32 (S12).
- The image displayed at this time may be a live image captured by the camera 122, an image created in advance as part of an application, such as a game image, or the like.
- Based on this information, the structure analysis unit 22 specifies the posture and shape of the core in the three-dimensional space (S14, S16).
- the structure analysis unit 22 acquires a captured image from the camera 122, generates a depth image based on the acquired image, and acquires the entire image and position of the block set (S18). Since there is a possibility that an object other than the block set, the background, the user's hand, and the like are reflected in the photographed image, processing for removing these images is performed at any stage.
- Techniques such as SLAM (Simultaneous Localization And Mapping), color segmentation, and dictionary-based object recognition may be used for such processing.
- It is then specified how the core is positioned with respect to the overall image of the block set thus extracted (S20). Specifically, as described above, a part having a characteristic shape, color, pattern, or the like, such as the core marker, is detected from the image of the block set, and the core position is determined based on that part. Then, based on the shape and posture of the core specified in S16, the appearance of the core as seen from the camera is specified. Since this corresponds to the image of the block set when no non-communication block is present, the image and position of the non-communication blocks are obtained by taking the volume difference from the entire image of the actual block set (S22).
- If the state of the non-communication blocks has been acquired at the preceding time step t-1, it is compared with the image of the current non-communication blocks (S24), and it is confirmed whether there is any change (S26).
- If there is a change, the shape specification of the non-communication block and the update of the structure data of the block set are performed (S30).
- At the first time step t = 0, the state of the non-communication blocks acquired in S22 is regarded as a change, and structure data is newly created based on it.
- the processing of S30 is performed by adding or removing a non-communication block, replacing it with another non-communication block, or changing the shape of a non-communication block due to bending of the core joint. Cases are divided and processing to be performed for each is performed. That is, when an element is added, an element number is assigned, and the connection relationship with the core is recorded after associating the element number with the shape and the connection direction. If excluded, it is managed by deleting it from the structure data or setting a flag indicating that it has been excluded. When the block is replaced with another non-communication block, the block shape or connection direction corresponding to the corresponding element number is updated.
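- A schematic sketch of the case split performed in S30, assuming the structure data is a dictionary keyed by element numbers as sketched earlier; the event format, field names, and helper variables are hypothetical.

```python
def update_structure_data(structure_data, event, next_element_seq):
    """Apply one detected change to the structure data (cf. S30).

    `event` is a hypothetical dict such as
      {"type": "added", "block_id": 2, "connected_to": "#C1", "connection": {...}}
      {"type": "removed", "element": "#N1"}
      {"type": "replaced", "element": "#N1", "block_id": 3}
    """
    if event["type"] == "added":
        element = f"#N{next_element_seq}"
        structure_data[element] = {
            "kind": "non_communication",
            "block_id": event["block_id"],
            "connected_to": event["connected_to"],
            "connection": event["connection"],
            "removed": False,
        }
        next_element_seq += 1
    elif event["type"] == "removed":
        # Keep the entry and flag it, so that past states can be restored later.
        structure_data[event["element"]]["removed"] = True
    elif event["type"] == "replaced":
        # The connection relationship is reused; only the block identity changes.
        structure_data[event["element"]]["block_id"] = event["block_id"]
    return structure_data, next_element_seq
```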
- the structure data may be represented as 3D graphics model data.
- the focus has mainly been on the shape change of the non-communication block, but the structural data can be updated by the same process even when the color or texture of the non-communication block changes.
- In that case, the image area of the remaining non-communication blocks is fed back to the captured color image to identify changes in color and texture.
- FIG. 15 exemplifies the relationship between the block set and display that can be realized by the mode described so far.
- A block set 240 including communication blocks and non-communication blocks is rendered by the information processing apparatus 10 as a 3D object 242 and displayed on the display device 16. Since the camera 122 acquires the shape, posture, color, and so on of the non-communication blocks, even a block of the user's own making that is not registered in the non-communication block information table can be recognized. For example, in the figure, even if the user draws the face of a doll assembled as the block set 240 with a marker pen or attaches a sticker, a doll with the same face or sticker is drawn as the 3D object.
- the block set in the field of view of the actual camera 122 is reversed and displayed as an object by mirror processing. Since the block set 240 faces the camera 122 on the display device 16, the 3D object 242 on the display is displayed in a state where the block set 240 is reflected in a mirror. On the other hand, since the information processing apparatus 10 recognizes the three-dimensional shape of the block set 240 as described above, the virtual viewpoint with respect to the 3D object 242 can be freely changed. Therefore, an image on the back side of the doll that is a blind spot from the camera 122 at this time can also be displayed without rotating the block set 240. However, depending on the shape specifying process so far, the details of the shape on the back side may not be deterministic.
- The modes described above are based on recognition of the state of the actually assembled block set, but it may also be possible to virtually connect items that are not actually connected.
- images 244 and 246 of options for the items to be equipped are displayed.
- When the user selects one of them, the information processing apparatus 10 moves the selected item to the corresponding part of the recognized block set.
- The structure data may then be updated to indicate that the item has been connected.
- the item as an option holds a three-dimensional shape in the block information storage unit 24 or the model data storage unit 26 of the information processing apparatus 10.
- Regardless of whether an item is actually connected as part of the block set or only virtually connected, the item on the screen can be expressed as moving in the virtual world together with the 3D object 242 in accordance with the movement of the block set in real space. If the user scans an object of his or her own making with a 3D scanner and acquires its three-dimensional shape information, it can be used in place of such an item. That is, even an actual non-communication block can be displayed on the screen as if it were connected, without actually connecting it.
- FIG. 16 illustrates the relationship between the block set and the display when the appearance is set for the block set.
- the block set 248 itself has a simple configuration such as only a communication block.
- the information processing apparatus 10 displays the recognized state of the block set on the display device 16 as it is or as a 3D object 250 by inverting the left and right.
- candidate 3D object option images 252 a, 252 b, and 252 c are displayed for each part such as a face, for example.
- the user virtually connects to the 3D object 250 by performing selection input from the images 252a, 252b, and 252c for each block of the block set or for each part composed of a plurality of blocks. Finalize.
- 3D objects having various appearances can be freely created, and the 3D objects can be moved in accordance with the actual movement of the block set 248, or the virtual viewpoint can be changed.
- virtual 3D objects may be connected on the display.
- It is also possible to realize a mode in which the user creates a block set and sets the corresponding appearance for each part of the target 3D object, and finally connects the block sets, whereby the completed shape is displayed with a detailed appearance set for each part.
- If a created 3D object with a virtual appearance is saved and another 3D object with a different appearance is then created using the same block set, the plurality of 3D objects created in this way may be connected only in the virtual world. In this case, a mode can be realized in which, as the 3D object finally associated with the block set is linked with the block set, all of the connected 3D objects move in accordance with the block set.
- the motion of the block set is reflected in the 3D object in the virtual world.
- the motion of the 3D object may be reflected in the motion of the block set.
- For example, as a character in a game executed by the information processing unit 30, a 3D object representing the block set assembled by the user, or a 3D object virtually created by the user in correspondence with the block set, is displayed.
- the user performs a game operation using the input device 14, so that when the 3D object moves, the actual block set also moves.
- the drive control unit 34 transmits a signal for controlling the drive unit 148 of the block set so as to match the movement of the 3D object.
- FIG. 17 illustrates the relationship between the block set and the display when one 3D object is associated with the assembled block set.
- the user creates a block set 260 that looks like a crane truck and selects a crane truck 262 as a 3D object on the display.
- a plurality of joints 264a, 264b, and 264c are provided in a block that looks like a crane, and wheels 266a, 266b, 266c, and 266d are attached to a block that becomes a base.
- the block set 260 is a thing that the user can assemble freely, and therefore, the portion that the user wants to move as a crane, the front-rear relationship of the crane truck, and the like largely depend on the user's intention. Therefore, a method for setting the correspondence relationship between the freely assembled object and the motion of the 3D object prepared in advance will be described.
- FIG. 18 illustrates information necessary for associating the motion of the block set with the 3D object.
- the left side of the figure shows a schematic diagram of a block set 270 corresponding to the block set 260 of FIG. 17, and the right side shows a schematic diagram of a crane vehicle 272 corresponding to the crane vehicle 262 of the 3D object of FIG.
- the coordinate system used when the information processing apparatus 10 recognizes the shape is set in the block set 270, and the local coordinate system for the 3D model is set in the crane vehicle 262. . Since both are set independently, the directions of the block set 270 and the crane vehicle 272 in each coordinate system also vary.
- The figure shows the shapes of the block set 270 and the crane vehicle 272 in the two-dimensional plane composed of the x-axis and y-axis, each defined in a state parallel to the x-axis of its own coordinate system; that is, it represents the side view of each.
- In this example, the orientations of the block set 270 and the crane vehicle 272 with respect to the x-axes of their respective coordinate systems are opposite to each other.
- the three joints of the block set 270 are RJ1, RJ2, and RJ3, and the three joints of the crane vehicle 272 are VJ1, VJ2, and VJ3.
- the four wheels of the block set 270 are RA1, RA2, RA3, RA4, and the caterpillar of the crane truck is VA1, VA2.
- the wheels RA3 and RA4 and the caterpillar VA2 are on the side opposite to the display surface and are originally concealed.
- The joints RJ1, RJ2, and RJ3 of the block set 270 are associated with the joints VJ1, VJ2, and VJ3 of the crane vehicle 272, respectively, the wheels RA1 and RA2 are associated with the caterpillar VA2 (not shown), and the wheels RA3 and RA4 are associated with the caterpillar VA1.
- However, the positions of the joints of the two cranes differ. Therefore, if the corresponding joints of both are simply bent at the same angle, the movement may not be what the user expects.
- In addition, a physical movable range exists for the joint angles of the block set 270, and a movable range on the model also exists for the joints of the crane vehicle 272. If such constraint conditions are not taken into account, the 3D model may bend at an impossible angle, or the joint angle of the block set may reach its limit and stop moving. Further, because of the difference in direction with respect to the coordinate systems, the crane vehicle 272 on the display might move backward while the block set 270 is moved forward. In order to prevent such problems, the real object and the object in the virtual world are harmonized and linked through the processes of standardizing the coordinate system, setting the corresponding parts, and correlating the specific movements of the corresponding parts.
- FIG. 19 is a flowchart illustrating a processing procedure in which the information processing apparatus 10 associates a block set with a motion of a 3D object.
- the user inputs an instruction request for starting association with a 3D object via the input device 14.
- the information processing unit 30 of the information processing apparatus 10 receives the request (S40)
- the information processing unit 30 acquires information on the structure of the assembled block set from the structure analysis unit 22 (S42).
- a suitable model is extracted as a candidate based on the shape of the block set, etc., among the 3D objects for which drawing data is prepared in the model data storage unit 26 (S44).
- the model data storage unit 26 stores drawing data of various object models such as crane vehicles and metadata representing the characteristics of each model in association with each other.
- This metadata is roughly divided into features as objects, structural features, and appearance features.
- Features as objects include categories such as people, animals, vehicles, and food; proper names such as the titles of movies, animations, or games in which a character appears, and character names; and related eras such as primitive times, the Middle Ages, modern times, the future, or specific years.
- the structural features include the number of joints, the movable angle and degree of freedom of each joint, the length and thickness of the link, the joint connection relationship, the driving force, the tire diameter, and the like.
- Appearance features include color, surface shape, number and volume of non-communication blocks, coverage of non-communication blocks in the block set, number of LEDs and display devices if equipped, type of display device, etc. .
- the model data storage unit 26 associates such various features with each model. The more features that are associated, the better the accuracy of candidate extraction. However, this is not intended to associate all features.
- the information processing unit 30 extracts a candidate model having a high similarity to the actual block set based on the structural features and appearance features from the information related to the shape and structure of the block set acquired in S42.
- Similarity evaluation value = (N_RJ − N_VJ) × w_J + (N_RA − N_VA) × w_A
- w J and w A are weights for the evaluation of the number of joints and the evaluation of the wheels, and are determined by the importance of both. The closer this evaluation value is to 0, the higher the degree of similarity.
- When this evaluation value is positive, the block set tends to have more joints and wheels; when it is negative, the 3D object tends to have more joints and wheels. If there is a 3D object with an evaluation value of 0, it is extracted as the most promising candidate model.
- Otherwise, 3D objects having negative evaluation values are extracted with priority. This is because a 3D object with more joints and wheels can express more detailed movement on the screen, and can therefore express the movement of the block set in more detail. Prior to such evaluation of similarity, candidates may be narrowed down based on features as objects selected by the user. Features as objects, structural features, and appearance features may also be combined as appropriate for extraction, or the user may specify features other than features as objects.
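- As a sketch of how candidates might be ranked with the evaluation value above: here N_RJ and N_RA are taken to be the numbers of joints and wheels of the block set, N_VJ and N_VA those of each 3D object, and the weights and example data are purely illustrative.

```python
def similarity_score(n_rj, n_ra, n_vj, n_va, w_j=1.0, w_a=0.5):
    """Evaluation value from the text: closer to 0 means more similar;
    negative means the 3D object has more joints/wheels than the block set."""
    return (n_rj - n_vj) * w_j + (n_ra - n_va) * w_a

def rank_candidates(block_set, models):
    """Sort candidate models: exact matches (score 0) first, then models
    with negative scores (richer than the block set), then the rest."""
    def key(model):
        s = similarity_score(block_set["joints"], block_set["wheels"],
                             model["joints"], model["wheels"])
        return (s != 0, s >= 0, abs(s))  # prefer 0, then negative, then small |s|
    return sorted(models, key=key)

block_set = {"joints": 3, "wheels": 4}
models = [
    {"name": "crane A", "joints": 3, "wheels": 4},
    {"name": "crane B", "joints": 2, "wheels": 2},
    {"name": "crane C", "joints": 5, "wheels": 4},
]
print([m["name"] for m in rank_candidates(block_set, models)])
# -> ['crane A', 'crane C', 'crane B'] under these assumptions
```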
- the information processing unit 30 displays the plurality of candidate models extracted in this manner on the display device 16 and accepts a selection input made by the user via the input device 14 or the like (S46).
- the example of the model selected by this is the crane vehicle 272 shown in FIG.
- a common coordinate system is set for the object on the block set and the image (S48).
- the parameters that determine the posture and movement direction of the block set can be handled in the same manner in the 3D object. For example, if the forward movement of the block set 270 in FIG. 18 is the negative direction of the x axis, the coordinate system is set so that the forward movement of the 3D object in the crane vehicle 272 is also the negative direction of the x axis.
- the definition of the joint angle can be made common, and when the joint angle changes, it can be properly determined which link has moved with respect to the joint.
- the various parameters defined in the common coordinate system are reflected in the drawing of the 3D object by converting the coordinates to the values in the local coordinate system that were originally set for the 3D object.
- In this case, the information processing unit 30 sets a common coordinate system so that the directions of both are the same.
- the information processing unit 30 may set the coordinate system by adjusting the orientation of the 3D object on the screen so that the user matches the orientation of the block set, and using the orientation as a reference. Thereby, even if it is a block set in which front and rear are unclear, it can be linked with the 3D object in the direction intended by the user.
- a corresponding part is set between the block set and the 3D object (S50).
- this processing can be performed collectively by the information processing unit 30. That is, the information processing unit 30 geometrically derives and associates the corresponding joints of both based on the structure of the block set acquired in S42 and the structure of the object model selected in S46. Alternatively, a setting screen is displayed so that the user can make settings.
- The locations to be associated here are typically the joints and wheels shown in FIG. 18, but for any location of the block set whose changes, such as bending, rotation, or displacement, are to be reflected, the user may be allowed to freely set the corresponding point on the 3D object side. This makes it possible to move an animal that has no joints, such as a mollusc, or to make an object bow even though it does not actually bend.
- the location to be associated does not necessarily have a one-to-one relationship. That is, a plurality of joints of the block set may be associated with one joint of the 3D object, or a plurality of joints of the 3D object may be associated with one joint of the block set. The same applies to the wheels. Even if the number of joints between the block set and the 3D object is different, the information processing unit 30 can group the joints as such if the overall structure can be matched by considering a plurality of joints as one joint. May be.
- the information processing unit 30 sets the correspondence in consideration of the movable angle of each joint in addition to the overall structure of both. For example, joints having the same movable angle are associated with each other. As described above, the degree of similarity between joints may be evaluated from various viewpoints such as the overall structure and the movable angle, and joints having an evaluation value higher than the threshold may be associated with each other. Information on the corresponding part set by the information processing unit 30 or the user is stored in the correspondence information storage unit 28.
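- One possible reading of this joint-matching step, sketched below: joint pairs are scored by how close their movable angles are, and pairs whose score exceeds a threshold are greedily associated; the scoring function, threshold, and example ranges are assumptions.

```python
def match_joints(block_joints, object_joints, threshold=0.5):
    """Greedily pair block-set joints with 3D-object joints whose movable
    angles are similar. Each joint dict has a name and a movable range (deg).
    Returns a list of (block joint, object joint) pairs."""
    pairs = []
    used = set()
    for bj in block_joints:
        best, best_score = None, 0.0
        for oj in object_joints:
            if oj["name"] in used:
                continue
            # similarity in [0, 1]: 1 when the movable ranges are equal
            score = min(bj["range"], oj["range"]) / max(bj["range"], oj["range"])
            if score > best_score:
                best, best_score = oj, score
        if best is not None and best_score >= threshold:
            pairs.append((bj["name"], best["name"]))
            used.add(best["name"])
    return pairs

block_joints = [{"name": "RJ1", "range": 90}, {"name": "RJ2", "range": 45}]
object_joints = [{"name": "VJ1", "range": 100}, {"name": "VJ2", "range": 40}]
print(match_joints(block_joints, object_joints))
# -> [('RJ1', 'VJ1'), ('RJ2', 'VJ2')] under these assumptions
```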
- the correspondence of the movement is set for the location associated in this way (S52). That is, whether or not the change in the joint angle is reflected as it is, or the ratio for distributing the change in the joint angle when the joint is not associated one-to-one is set.
- In the most basic case, the structures of the block set and the 3D object match; if the movable angles of the corresponding joints are also the same, the joint angles can basically be made identical, so the information processing unit 30 sets them as such.
- When the structures are matched by grouping joints as described above, a change in one joint angle is distributed to the movements of the joints belonging to the corresponding group.
- the distribution ratio may be determined according to the ratio of the movable angle of each joint.
- the information processing unit 30 may set it. Further, a setting screen is displayed so that the user can freely set the correspondence of the movement or correct the already set correspondence.
- the vehicle speed on the display of the 3D object is associated with the rotation speed of the block set axle. Since this relationship changes depending on the diameter of the wheel connected to the block set, the information processing unit 30 can determine by calculation if the wheel diameter is known. If it is not known, a necessary parameter is acquired by the user actually moving the block set as will be described later.
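- For instance, under the simple assumption of a wheel rolling without slip, the relation between ground speed and axle rotation might be sketched as follows; nothing beyond the dependence on wheel diameter is specified in the text, and the numbers are illustrative.

```python
import math

def axle_rpm_for_speed(speed_mm_per_s: float, wheel_diameter_mm: float) -> float:
    """Axle rotation speed (rpm) needed for a desired ground speed,
    assuming the wheel rolls without slipping."""
    circumference = math.pi * wheel_diameter_mm
    return speed_mm_per_s / circumference * 60.0

# e.g. a 40 mm wheel at a ground speed of 200 mm/s
print(round(axle_rpm_for_speed(200.0, 40.0), 1))  # ~95.5 rpm
```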
- the processing of S50 and S52 is repeated for all locations to be associated (N in S54), and when all locations are associated, the processing is terminated (Y in S54).
- FIG. 20 shows an example of a screen displayed on the display device 16 in order to accept an input of model selection by the user in S46 of FIG.
- the model selection acceptance screen 280 includes a plurality of models “model 1”, “model 2”, “model 3” images, a character string 282 that prompts selection input, and a cursor 284 that can be moved by the input device 14.
- “Model 1”, “Model 2”, and “Model 3” are candidate models extracted in S44 of FIG. 19, and are at least one of a feature as an object, a structural feature, and an appearance feature as described above. It is the result filtered by. For example, when the user designates “crane truck” as a feature as an object, all the extracted models are crane trucks. In this case, before displaying the model selection acceptance screen 280, a screen for accepting the selection of “crane truck” is also displayed.
- the number of extractions is not limited to three, and all the items that match the conditions may be extracted, or the above-mentioned similarity evaluation value is used to rank the models, and a predetermined number of models that are higher are extracted. May be.
- model images may be arranged in descending order from the left.
- “Model 1”, “Model 2”, and “Model 3” are all crane vehicles, but the shapes of the crane parts such as the number of joints are different. The user selects one model by pointing a desired model or a model similar to the block set with the cursor 284 and making a definite input with the input device 14.
- means other than the selection input means via the cursor 284 may be used for selecting the model.
- a technique has been proposed in which a user is photographed by the camera 122 and the position of the user's fingertip, and thus the position pointed on the display screen, is detected from the depth image or the like.
- the information processing apparatus 10 may recognize the selected model when the user points to the model to be selected.
- the input by the user on each setting screen shown in FIGS. 21 and 22 may be similarly realized.
- FIG. 21 shows an example of a screen displayed on the display device 16 in order to set a common coordinate system for the block set and the selected model in S48 of FIG.
- the common coordinate system is specified by the user adjusting the orientation of the 3D object on the screen according to the orientation of the block set.
- the model display direction setting reception screen 290 includes an image 292 in which the front of the crane truck is directed to the left, an image 294 that is directed in the right direction, a character string 296 that prompts designation of the orientation, and a cursor 298 that can be moved by the input device 14.
- the information processing unit 30 sets a coordinate system for both so that the direction of the block set 260 acquired by the structure analysis unit 22 and the direction of the 3D object in the selected image are defined in the same direction.
- the information processing unit 30 may set the coordinate system as described above.
- FIG. 22 shows an example of a screen displayed on the display device 16 in order to set the corresponding part of the block set and the 3D object in S50 of FIG.
- the corresponding location setting screen 300 includes a 3D object image 302, a command list field 304, a character string 306 that prompts setting input, and a cursor 308 that can be moved by the input device 14. Since the 3D object on the screen is in the same direction as the block set via the model display direction setting reception screen 290, the correspondence with the block set is easily understood by the user.
- the corresponding part setting screen 300 may further display an image 310 of the block set 260.
- the image 310 is an image captured by the camera 122 or an image in which the state of the block set 260 recognized by the information processing apparatus 10 is drawn as a 3D object.
- the captured image is reversed horizontally and the block set 260 viewed from the user is reproduced.
- the core image may be displayed.
- The user moves the joint to be associated by bending and stretching the block set 260 (for example, arrow A).
- This movement is recognized by the structure analysis unit 22.
- In this way, the designation input of the joint (for example, RJ2) to be associated on the block set side is realized.
- the user designates a joint (for example, VJ3) to be made to correspond on the 3D object side in the image 302 of the 3D object with the cursor 308, and performs a definite input with the input device.
- the joint RJ2 of the block set 260 is associated with the joint VJ3 of the 3D object.
- the information processing unit 30 records these correspondences and stores them in the correspondence information storage unit 28.
- the command list field 304 includes a “prev” button for invalidating and resetting the corresponding location set immediately before, a “next” button for confirming the current setting and setting the next corresponding location, A GUI such as a “stop” button for saving all the settings and closing the setting process is displayed. Furthermore, a “menu” button for shifting to a screen that displays various setting screens as menus is also displayed. It should be noted that the movement confirmation process may be performed after setting the movement correspondence described later, and in accordance therewith, whether or not the setting is invalidated may be determined by a “prev” button or a “next” button.
- Alternatively, the information processing unit 30 may present candidate joints to the user. That is, at the time when the joint to be associated on the block set side is designated, the color of the joints that are candidates for the correspondence destination is changed in the 3D object image 302 displayed on the corresponding location setting screen 300, so that the user can select one of them. This makes it possible to avoid unreasonable associations even when the user is allowed to set correspondences freely.
- a plurality of joints of the block set can be associated with one joint of the 3D object, and one joint of the block set can be associated with a plurality of joints of the 3D object.
- two marks 312 indicating that two joints of the block set are associated with the joint VJ3 of the 3D model are given. This is realized, for example, by moving the joint RJ1 of the block set 260 to associate with the joint VJ3 of the 3D object, and further moving the joint RJ2 to associate with the same joint VJ3 of the 3D object.
- the joints VJ1 and VJ2 of the 3D object are grouped into one and surrounded by an ellipse 314 indicating that it is associated with one joint of the block set.
- a GUI of a “group” button for drawing an ellipse 314 indicating the group is also displayed in the command list column 304.
- the user moves the joint RJ3 of the block set 260.
- the “group” button on the screen is selected by the cursor 308, and an ellipse 314 surrounding the joints VJ2 and VJ1 of the 3D object is drawn.
- the joint RJ3 of the block set 260 is associated with the joints VJ2 and VJ1 of the 3D object.
- the information processing unit 30 may determine some suitable grouping patterns and display them as candidates on the corresponding location setting screen 300 from the viewpoint of the entire structure of the block set and the 3D object, the movable angle, and the like. In this case, the user selects one pattern from there to determine the grouping. Alternatively, a grouping pattern candidate may be created as 3D object metadata and displayed on the corresponding location setting screen 300 to be selected by the user.
- the joint of the block set may be designated on the image 310 with the cursor 308.
- the associated joints may be displayed in the same color or connected with lines in the 3D object image 302 and the block set image 310 so that the correspondence is clearly indicated.
- FIG. 23 shows, as a basic mode, a case where the joints of the block set and the 3D object are associated one-to-one and each joint moves by the same angle; the left shows the joint portion 320a of the block set, and the right shows the corresponding joint portion 322a of the 3D object.
- That is, when the joint portion 320a of the block set is bent by the angle θ from the state where the joint is not bent, the corresponding joint portion 322a of the 3D object is also bent by the angle θ.
- Conversely, when the motion of the 3D object is reflected in the block set, the actuator serving as the driving unit 148 of the block set may be driven via the drive control unit 34 so that the joint corresponding to the joint portion 322a is bent by the angle θ.
- FIG. 24 shows a correspondence example of the angle of each joint when two grouped joints are associated with one joint; the left shows the joint portions 320b of the block set, and the right shows the corresponding joint portion 322b of the 3D object.
- The two joint portions 320b of the block set are associated with the one joint portion 322b of the 3D object.
- When the two joints of the block set are bent by the angles θ1 and θ2, the corresponding joint portion 322b of the 3D object is bent by the total angle (θ1 + θ2). That is, the joint angle changes of the two joints of the block set are realized by one joint of the 3D object.
- This mode is effective when the movable angle of each joint of the block set is smaller than the change in joint angle required for the corresponding joint of the 3D object. Changes in three or more joint angles of the block set may be summed.
- the correspondence relationship of angles is the same. In this case, it is effective when the number of joints of the block set is smaller than the number of joints of the 3D object.
- the corresponding two joint portions 320b of the block set may be bent by angles ⁇ 1 and ⁇ 2, respectively.
- the ratio of the angles ⁇ 1 and ⁇ 2 is determined by the information processing unit 30 according to the constraint conditions such as the movable angle of each joint and the realistic movement of the object that the block set represents, that is, the object represented by the 3D object. it can. For example, if the movable angle of two joints of the block set is 1: 2, the angles ⁇ 1 and ⁇ 2 are also set to a ratio of 1: 2.
- the information processing unit 30 calculates the angles ⁇ 1 and ⁇ 2 so that such a state is maintained. Such a distribution rule is created together with the data of the 3D object.
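- A sketch of such a distribution rule, assuming the angle change required of the grouped block-set joints is split in proportion to each joint's movable range (the 1:2 example above) and clipped to that range; the function and values are illustrative.

```python
def distribute_angle(total_deg, movable_ranges_deg):
    """Distribute a joint-angle change of the 3D object over grouped
    block-set joints in proportion to each joint's movable range
    (e.g. ranges of 30 and 60 degrees give a 1:2 split), clipping each
    share to the physical limit of its joint."""
    total_range = sum(movable_ranges_deg)
    angles = []
    for r in movable_ranges_deg:
        share = total_deg * r / total_range
        angles.append(max(-r, min(r, share)))
    return angles

print(distribute_angle(45.0, [30.0, 60.0]))  # -> [15.0, 30.0]
```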
- FIG. 25 shows another example of the correspondence of the angle of each joint when two joints are grouped and associated with one joint; the left shows the joint portions 320c of the block set, and the right shows the corresponding joint portion 322c of the 3D object.
- the rotation axes of the two joints of the block set are different, one angle changing around an axis perpendicular to the plane of the figure and the other changing angle around the link axis.
- one joint of the 3D object associated with the joint is a joint having two degrees of freedom in which the angle can be changed around two axes corresponding to the joint.
- FIG. 26 shows a case where the angle changes of the joints associated one-to-one are made different, the left is the joint part 320d of the block set, and the right is the corresponding joint part 322d of the 3D object.
- In the illustrated example, when the joint of the block set is bent by an angle θ, the corresponding joint of the 3D object is changed by the angle 3θ, three times as large.
- There are cases where it is desired to move the 3D object dynamically without much effort in moving the block set.
- Such a demand can be met by changing the joint angle of the 3D object by an angle obtained by multiplying the angle change of the joint of the block set by a predetermined value larger than 1.
- the joint angle of the 3D object may be changed by an angle smaller than the block set by multiplying the angle change of the block set by a predetermined value smaller than 1.
- This mode is effective when, for example, manipulator operation is desired to make a fine movement of the 3D object with higher accuracy than the movement of the hand moving the block set.
- the same setting may be used when the angle change of the joint of the 3D object is reflected in the angle change of the block set.
- a mode in which one angle change is multiplied by a predetermined value to obtain the other angle change can be combined with the mode shown in FIGS.
- all grouped joints may be multiplied by the same value, or different values may be multiplied depending on the joint.
- FIG. 27 shows another example in which one joint of a block set is associated with a plurality of joints of a 3D object.
- the left part is a joint part 320e of the block set and the right part is a joint part 322e corresponding to the 3D object.
- one joint part 320e of the block set is associated with three joint parts 322e of the 3D object.
- For example, even with a block set including only the joint portion 320e shown in the figure, the 3D object can represent the movement of a snake moving forward.
- the joints associated with the 3D object are not limited to being adjacent to each other as shown in the figure, and may be a plurality of joints at different locations of the 3D object.
- FIG. 28 shows an example of a screen displayed on the display device 16 in order to set the correspondence between the block set and the movement of the 3D object in S52 of FIG.
- The motion correspondence setting screen 330 is a screen on which a dialog box 332 for inputting the correspondence of movement is overlaid each time the user sets a corresponding location on the corresponding location setting screen 300 shown in FIG. 22.
- This example assumes that, after the joints RJ1 and RJ2 of the block set have been grouped and associated with the joint VJ3 of the 3D object, the ratio for distributing the angle change of VJ3 to the angle changes of RJ1 and RJ2 of the block set is to be set. The dialog box 332 therefore displays a character string 334 that prompts the user to input the ratio, and text boxes 336 for inputting the numerical values of the ratio.
- When the user inputs specific numerical values into the text boxes via the input device 14 or the like, the information processing unit 30 adds the ratio to the correspondence between the joints RJ1 and RJ2 of the block set and the joint VJ3 of the 3D object, and stores it in the correspondence information storage unit 28.
- the information set in the dialog box 332 is not limited to this, and necessary information can be sequentially set in associations as shown in FIGS. 23 to 27 according to whether or not joints are grouped. However, as described above, the setting by the user can be omitted if the information processing unit 30 can perform movement correspondence according to a predetermined rule. Alternatively, the association performed by the information processing unit 30 may be corrected by the user using a similar dialog box.
- FIG. 29 shows an example of the data structure of the information stored in the correspondence information storage unit 28 and corresponding to the movements of the corresponding portions of the block set and the 3D object.
- The correspondence information table 340 includes a block set information column 342 indicating the corresponding locations and movements on the block set side, and a 3D object information column 344 indicating the corresponding locations and movements on the 3D object side for the locations and movements of the block set described in that column.
- the joints of the block set and the joints of the 3D object are associated with each other, and the correspondence between the angle changes is set.
- the correspondence is basically described in units of steps. For example, looking at each “joint” column in the block set information column 342 and the 3D object information column 344, the joints RJ1 and RJ2 of the block set are associated with the joint VJ3 of the 3D object.
- When the motion of the block set is reflected in the 3D object, it is defined that the sum of the angle changes of the joints RJ1 and RJ2 becomes the angle change of the joint VJ3 of the 3D object.
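- As an illustration, one possible in-memory form of such correspondence information; the record layout and the conversion functions in both directions are assumptions modeled on the RJ1/RJ2-to-VJ3 example above.

```python
# Hypothetical in-memory form of the correspondence information table 340.
correspondence_table = [
    {
        "block_set": {"joints": ["RJ1", "RJ2"]},
        "object":    {"joints": ["VJ3"]},
        # Block set -> 3D object: the sum of the RJ1 and RJ2 angle changes
        # becomes the angle change of VJ3 (cf. FIG. 24).
        "to_object": lambda d_rj1, d_rj2: d_rj1 + d_rj2,
        # 3D object -> block set: distribute VJ3's change at a set ratio.
        "to_block_set": lambda d_vj3, ratio=(0.5, 0.5): (d_vj3 * ratio[0],
                                                         d_vj3 * ratio[1]),
    },
]

entry = correspondence_table[0]
print(entry["to_object"](10.0, 5.0))   # 15.0 degrees applied to VJ3
print(entry["to_block_set"](15.0))     # (7.5, 7.5) applied to RJ1 and RJ2
```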
- the movement amount and movement direction of the block set may be acquired from the captured image of the camera 122, and the 3D object may be represented as moving based on the acquired image.
- In this case, setting according to constraint conditions is required; for example, the vehicle cannot travel unless the wheels cooperate.
- Also, since the rotational speed of the wheels required to obtain a desired speed varies depending on the wheel diameter, the speed may become too high and the block set may collide with a wall unless it is adjusted appropriately.
- the information processing unit 30 mainly associates movements. Specifically, when the front, rear, left and right of the block set are clear, the driving wheel, the driven wheel, and the steering wheel are determined according to a preset driving method. Furthermore, the left and right drive wheels and steering wheels are grouped so as to satisfy the constraint condition that they must have the same rotational speed and steering angle. Also, the rotation speed and rudder angle of the wheel of the block set are set so that the block set runs at a suitable speed and direction change corresponding to the speed and direction change of the 3D object expressed in the virtual world. Correspond to speed and direction change.
- FIG. 30 is a flowchart showing a processing procedure for setting the correspondence between the 3D object and the motion of the block set in a mode in which the motion of the 3D object is reflected in the block set. This process corresponds to the processes of S50 and S52 of FIG.
- First, roles such as driving wheel and steering wheel are assigned to the front wheels or the rear wheels according to the front/rear and left/right directions of the block set and the driving method (S58). Further, at least two wheels that are arranged in parallel and assigned the role of driving wheel or steering wheel are operated as a group (S60).
- control parameters for obtaining a suitable moving speed and direction change reflecting the virtual traveling of the 3D object are acquired.
- the control parameters are the rotational speed of the actuator that rotates the axle of the driving wheel, the amount of movement of the actuator that changes the steering angle of the steering wheel, and the like.
- the plurality of actuators that control the drive wheels and the steering wheels grouped in S60 are controlled so as to perform the same operation by a control signal from the information processing apparatus 10.
- the user actually moves the block set in a predetermined direction, measures the amount of movement, and measures the amount of rotation and the steering angle of the axle at that time (S62, S64, S66). .
- the user may electronically drive the block set by a separately prepared control mechanism, or may move the block set manually, for example, by pushing the main body with his / her hand. In the latter case, the mechanism for changing the axle and steering angle is released from the control of the actuator.
- the amount of movement of the block set can be acquired from an image captured by the camera 122 or a depth image generated therefrom.
- the amount of rotation can be acquired by a signal from a rotary encoder provided on the wheel, and the steering angle can be obtained by a steering angle sensor provided on the wheel. When a sensor such as a rotary encoder is not provided, the calculation may be performed based on the wheel diameter, the moving distance, and the moving direction.
- the control amount such as the motor rotation amount and the actuator movement amount for obtaining the movement amount is acquired (S68).
- the moving speed is a suitable moving speed of the block set itself, but is a value determined by the moving speed in the virtual world when interlocking with the 3D object on the screen. The same applies to the rudder angle. Therefore, the process of S70 associates the movement of the 3D object with the movement of the block set.
- If the diameter of the wheels connected to the block set is known from the identification numbers of the wheels, the correspondence between the moving speed and the control parameters can be obtained by calculation, so the processing from S62 to S68 may be omitted.
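- A sketch of the kind of calibration implied by S62 to S68, assuming the block set is moved by a few known control amounts while the camera measures the resulting displacement; the linear-gain model and all numbers are illustrative.

```python
def calibrate_drive(control_amounts, measured_moves_mm):
    """Estimate a linear gain (mm of travel per unit of control) from a few
    trial moves, so that a desired virtual-world speed can be translated
    into an actuator control amount (cf. S62 to S68)."""
    pairs = list(zip(control_amounts, measured_moves_mm))
    gains = [move / ctrl for ctrl, move in pairs if ctrl != 0]
    return sum(gains) / len(gains)

def control_for_speed(target_mm_per_s, gain_mm_per_unit, interval_s=0.1):
    """Control amount per control interval needed to realize the target speed."""
    return target_mm_per_s * interval_s / gain_mm_per_unit

gain = calibrate_drive([10, 20, 30], [52.0, 99.0, 151.0])   # ~5 mm per unit
print(round(control_for_speed(200.0, gain), 2))             # units per 0.1 s
```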
- Here, the association of wheel movement has been described for the case where a block set resembling a crane truck is associated with the 3D object of a crane truck, but the procedure of the flowchart shown in FIG. 30 is the same even if the 3D object has no wheels.
- the block set can be moved at a speed and direction corresponding to the virtual movement of the 3D object.
- the present invention can be used not only in a mode in which the movement of the 3D object is reflected, but also in a mode in which the block set is simply moved according to the result of processing performed by the information processing apparatus 10.
- FIG. 31 is a diagram for explaining a case where the setting related to the wheel of the block set described above is extended to a composite link.
- the joints are not completely independent, but the links are linked. That is, if one joint of the composite link 350 is moved as shown by an arrow, all four joints are moved (composite link 352). In such a case, since only one joint cannot be moved by the actuator, four joints are grouped to perform a cooperative operation as in the case of the drive wheel.
- the information processing unit 30 recognizes the presence of the composite link on the correspondence setting screen or the like by moving the composite link in the actual block set. Then, when setting the correspondence of movement with the 3D object, all joints included in the 3D object are grouped.
- a block that can be freely assembled is used as an input device or an output device for processing in the information processing apparatus. Details such as the skeleton and position of the assembled block are obtained by using various sensors for obtaining the position and orientation and a communication block having a communication function. Further, the surface shape of the assembled block is acquired using a means for detecting the presence of the block in the real space such as a photographed image by the camera. By integrating these pieces of information, the position, posture, and shape of the assembled block can be identified with high accuracy even when a non-communication block having no communication function is used as a part of the block.
- the shape, material, color, etc. of the block are not limited, and even a user's own product can be used, and an object having an appearance suitable for the user's purpose can be freely created.
- the information processing apparatus can acquire the position, posture, and shape with high accuracy regardless of the appearance, various information processing can be performed using an action that the user assembles or moves as input information. Moreover, the assembled block can be moved as a result of information processing.
- a 3D object having the same appearance as the assembled block can be displayed, or a 3D object having a more realistic appearance corresponding to the assembled block can be displayed.
- the user may compose the 3D object itself by designating a part of the 3D object for each block part, or the entire block set may correspond to a single 3D object.
- the corresponding parts and the correspondence of movement can be set based on constraint conditions such as the shapes of the block set and the 3D object, the number of joints, the movable angles of the joints, and the cooperative operation of the wheels.
- By having the information processing device establish such associations automatically, or by providing an environment in which the user can set them, a link between the real world and the virtual world can be created freely while reducing the burden on the user.
- a block set is assumed as the target to be associated with the 3D object on the screen, but an object other than a block set, for example the user himself or herself, can be associated in the same way.
- the camera 122 captures the user, a depth image is generated from the captured image, and the positions of the skeleton are estimated.
- the position of the skeleton can be tracked by applying conventional techniques such as skeleton tracking.
- the information processing apparatus 10 recognizes the joint as in the case of the block set.
- the user and the joint of the 3D object can be associated with each other.
- an image showing the user and the positions of the skeleton may be generated in real time from the captured image and displayed on the correspondence setting screen, and the user's joints and the joints of the 3D object may be associated on that screen. At this time, both may be displayed at the same time, or the joints to be associated may be displayed alternately and their designations accepted one at a time.
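- The correspondence set on such a screen can be thought of as a simple mapping from skeleton joints to 3D-object joints, as in the minimal sketch below; the joint names are hypothetical, and the one-by-one registration mirrors the alternating designation described above.

```python
# Hypothetical joint names; the mapping is filled in as the user designates
# one skeleton joint and one 3D-object joint at a time on the setting screen.
correspondence = {}

def associate(user_joint: str, object_joint: str) -> None:
    correspondence[user_joint] = object_joint

associate("user_right_elbow", "arm_joint_1")
associate("user_right_wrist", "arm_joint_2")
print(correspondence)  # {'user_right_elbow': 'arm_joint_1', 'user_right_wrist': 'arm_joint_2'}
```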
- the information processing apparatus determines the roles of drive wheel, driven wheel, and steered wheel when a wheel is attached to an axle while the front and rear of the vehicle are known, and determines the motion of the actuator that controls the operation of each wheel in accordance with that role.
- This may be extended to realize a block containing an actuator that switches its movement according to the object connected to it. For example, when a component comprising an axle and wheels is attached, the block-set vehicle is driven by rotating the axle. When a component consisting of a cam and a spring is attached, rotation of the cam releases the spring so that an arrow that has been loaded is shot. When a component composed of a joint as described above is attached, the actuator operates it as a joint.
- the information processing apparatus 10 recognizes the type of the connected object from the captured image and transmits a corresponding control signal, so that the movement of the actuator is switched appropriately.
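- A minimal sketch of such mode switching is given below; the component-type labels and the contents of the control signal are assumptions chosen only to illustrate selecting the actuator's behavior from the recognized attachment.

```python
# Hypothetical mapping from the recognized attachment type to an actuator mode.
def control_signal_for(attachment_type: str) -> dict:
    if attachment_type == "axle_and_wheels":
        return {"mode": "continuous_rotation", "purpose": "drive the vehicle"}
    if attachment_type == "cam_and_spring":
        return {"mode": "single_rotation", "purpose": "release the spring"}
    if attachment_type == "joint":
        return {"mode": "position_hold", "purpose": "act as a joint"}
    return {"mode": "idle", "purpose": "unrecognized attachment"}

print(control_signal_for("cam_and_spring"))
```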
- this increases the versatility of the block provided with the actuator, and a block set rich in variations can be realized at lower cost than preparing a different block for each type.
- Information processing system, 10 Information processing device, 14 Input device, 16 Display device, 20 Core information receiving unit, 22 Structure analysis unit, 24 Block information storage unit, 26 Model data storage unit, 28 Corresponding information storage unit, 30 Information processing unit, 32 Display processing unit, 34 Drive control unit, 102a Square prism block, 122 Camera, 120 Block set, 126a Block, 128a Battery, 130a Communication mechanism, 132a Memory, 134 Position sensor, 136a Motion sensor, 138 Angle sensor, 139a Actuator, 141 Rotary encoder, 142a First block, 143a First communication unit, 144a Element information acquisition unit, 146a Second communication unit, 48 Drive unit.
- the present invention can be used for toys, game devices, assembly-type devices, learning materials, content display terminals, information processing devices, robots, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Evolutionary Computation (AREA)
- Computer Hardware Design (AREA)
- Toys (AREA)
- User Interface Of Digital Computer (AREA)
- Architecture (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Similarity evaluation value = (N_RJ - N_VJ) × w_J + (N_RA - N_VA) × w_A
Here, w_J and w_A are weights applied to the joint-count evaluation and the wheel evaluation, respectively, and are determined according to the relative importance of the two. The closer this evaluation value is to 0, the higher the similarity. A positive value means that the block set tends to have more joints or wheels, while a negative value means that the 3D object tends to have more. If there is a 3D object whose evaluation value is 0, it is extracted as the strongest candidate model.
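To make the evaluation concrete, the sketch below scores candidate 3D models against a block set using the formula above; the joint and wheel counts, weights, and model names are illustrative assumptions only.

```python
# N_RJ, N_RA: joint and wheel counts of the real block set;
# N_VJ, N_VA: joint and wheel counts of a candidate 3D object (illustrative values).
def similarity_score(n_rj, n_ra, n_vj, n_va, w_j=1.0, w_a=1.0):
    return (n_rj - n_vj) * w_j + (n_ra - n_va) * w_a

block_set = {"joints": 6, "wheels": 4}
candidates = {"crane_truck": (6, 4), "robot_arm": (8, 0), "buggy": (2, 4)}

scores = {name: similarity_score(block_set["joints"], block_set["wheels"], j, w)
          for name, (j, w) in candidates.items()}
# The candidate whose score is closest to 0 is the strongest candidate model.
best = min(scores, key=lambda name: abs(scores[name]))
print(scores, best)
```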
Claims (14)
- An information processing device comprising:
a structure information receiving unit that acquires information relating to the structure of communication blocks from an assembly-type device formed by connecting individually prepared blocks, the assembly-type device consisting of communication blocks that have a communication mechanism and are configured to be able to transmit information relating to their connection state, and other non-communication blocks;
a structure analysis unit that acquires an overall image and the position of the assembly-type device from a captured image of the assembly-type device and, by integrating these with the information relating to the structure of the communication blocks, generates state information including the shape, position, and posture of the entire assembly-type device; and
an information processing unit that performs information processing based on the state information.
- The information processing device according to claim 1, wherein the structure analysis unit extracts an image of the non-communication blocks by identifying, based on the information relating to the structure, the portion of the overall image occupied by the communication blocks, including any hidden portions, and generates the state information by identifying, based on the extraction result, the shapes of the connected non-communication blocks and their positional relationship to the communication blocks.
- The information processing device according to claim 2, wherein the structure analysis unit generates the state information at a predetermined frequency, assumes a shape according to a predetermined rule for any portion of the connected non-communication blocks not captured in the image, and corrects the assumed shape when that portion appears in the image at a later time step.
- The information processing device according to claim 2 or 3, wherein the structure analysis unit generates the state information at a predetermined frequency and, when the number of connected non-communication blocks changes over time, recognizes the attachment or detachment of a non-communication block and updates the state information.
- The information processing device according to any one of claims 2 to 4, wherein the structure analysis unit identifies the shapes of the connected non-communication blocks by searching, based on the extracted images of the non-communication blocks, basic non-communication block information in which the features of each non-communication block are recorded.
- The information processing device according to any one of claims 2 to 5, wherein the structure analysis unit manages the connected non-communication blocks individually in the state information and, when a non-communication block that was connected to a communication block becomes separated as a result of a change in the shape of that communication block, updates the state information so that the two non-communication blocks are managed separately.
- The information processing device according to any one of claims 1 to 6, wherein the information processing unit draws the appearance of the assembly-type device as an object based on the state information and causes a display device to display it.
- The information processing device according to any one of claims 1 to 7, wherein the structure analysis unit retains, in the state information, information on previously removed non-communication blocks, and
the information processing unit refers to the state information in accordance with a request from the user, draws as an object the appearance the assembly-type device would have if a previously removed non-communication block were reattached, and causes a display device to display it.
- The information processing device according to any one of claims 1 to 7, wherein the information processing unit causes a display device to display the appearance of the assembly-type device as an object, accepts from the user an input for connecting another object model on the screen, and reflects the virtual connection made by that input in the state information.
- An information processing system comprising an assembly-type device formed by connecting individually prepared blocks and an information processing device that performs information processing based on input signals from the assembly-type device, wherein
the assembly-type device includes
communication blocks that have a communication mechanism and are configured to be able to transmit information relating to the connection state between blocks, and other non-communication blocks, and
the information processing device includes
a structure information receiving unit that acquires information relating to the structure of the communication blocks from the assembly-type device,
a structure analysis unit that acquires an overall image and the position of the assembly-type device from a captured image of the assembly-type device and, by integrating these with the information relating to the structure of the communication blocks, generates state information including the shape, position, and posture of the entire assembly-type device, and
an information processing unit that performs information processing based on the state information.
- A block system composed of a plurality of mutually connectable blocks, wherein
the plurality of blocks include communication blocks provided with a communication mechanism capable of exchanging information relating to the connection state between blocks with other blocks, and other non-communication blocks, and
at least one of the communication blocks further includes a communication mechanism that transmits, to an external information processing device, information relating to the structure of the communication blocks obtained by aggregating the information relating to the connection state, so that the information processing device can generate state information including the shape, position, and posture of the block system by integrating that information with information acquired from a captured image of the block system.
- An information processing method performed by an information processing device, comprising:
a step of acquiring information relating to the structure of communication blocks from an assembly-type device formed by connecting individually prepared blocks, the assembly-type device consisting of communication blocks that have a communication mechanism and are configured to be able to transmit information relating to their connection state, and other non-communication blocks;
a step of acquiring an overall image and the position of the assembly-type device from a captured image of the assembly-type device, generating state information including the shape, position, and posture of the entire assembly-type device by integrating these with the information relating to the structure of the communication blocks, and storing the state information in a memory; and
a step of reading the state information from the memory and performing information processing based on it.
- A computer program that causes a computer to realize:
a function of acquiring information relating to the structure of communication blocks from an assembly-type device formed by connecting individually prepared blocks, the assembly-type device consisting of communication blocks that have a communication mechanism and are configured to be able to transmit information relating to their connection state, and other non-communication blocks;
a function of acquiring an overall image and the position of the assembly-type device from a captured image of the assembly-type device and, by integrating these with the information relating to the structure of the communication blocks, generating state information including the shape, position, and posture of the entire assembly-type device; and
a function of performing information processing based on the state information.
- A computer-readable recording medium on which is recorded a computer program that causes a computer to realize:
a function of acquiring information relating to the structure of communication blocks from an assembly-type device formed by connecting individually prepared blocks, the assembly-type device consisting of communication blocks that have a communication mechanism and are configured to be able to transmit information relating to their connection state, and other non-communication blocks;
a function of acquiring an overall image and the position of the assembly-type device from a captured image of the assembly-type device and, by integrating these with the information relating to the structure of the communication blocks, generating state information including the shape, position, and posture of the entire assembly-type device; and
a function of performing information processing based on the state information.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201580004639.XA CN105917386B (zh) | 2014-01-21 | 2015-01-14 | 信息处理设备、信息处理系统、块系统和信息处理方法 |
| KR1020167019113A KR101810415B1 (ko) | 2014-01-21 | 2015-01-14 | 정보 처리 장치, 정보 처리 시스템, 블록 시스템, 및 정보 처리 방법 |
| US15/110,873 US10146332B2 (en) | 2014-01-21 | 2015-01-14 | Information processing device, information processing system, block system, and information processing method |
| EP15739885.0A EP3098783B1 (en) | 2014-01-21 | 2015-01-14 | Information processing device, information processing system, block system, and information processing method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014-008894 | 2014-01-21 | ||
| JP2014008894A JP6027554B2 (ja) | 2014-01-21 | 2014-01-21 | 情報処理装置、情報処理システム、ブロックシステム、および情報処理方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015111481A1 true WO2015111481A1 (ja) | 2015-07-30 |
Family
ID=53681288
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/050791 Ceased WO2015111481A1 (ja) | 2014-01-21 | 2015-01-14 | 情報処理装置、情報処理システム、ブロックシステム、および情報処理方法 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US10146332B2 (ja) |
| EP (1) | EP3098783B1 (ja) |
| JP (1) | JP6027554B2 (ja) |
| KR (1) | KR101810415B1 (ja) |
| CN (1) | CN105917386B (ja) |
| WO (1) | WO2015111481A1 (ja) |
Families Citing this family (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017039348A1 (en) * | 2015-09-01 | 2017-03-09 | Samsung Electronics Co., Ltd. | Image capturing apparatus and operating method thereof |
| JP6288060B2 (ja) | 2015-12-10 | 2018-03-07 | カシオ計算機株式会社 | 自律移動装置、自律移動方法及びプログラム |
| JP6311695B2 (ja) | 2015-12-16 | 2018-04-18 | カシオ計算機株式会社 | 自律移動装置、自律移動方法及びプログラム |
| JP6323439B2 (ja) * | 2015-12-17 | 2018-05-16 | カシオ計算機株式会社 | 自律移動装置、自律移動方法及びプログラム |
| JP6187623B1 (ja) | 2016-03-14 | 2017-08-30 | カシオ計算機株式会社 | 自律移動装置、自律移動方法及びプログラム |
| EP3451292B1 (en) * | 2016-04-28 | 2020-02-12 | Fujitsu Limited | Skeleton estimation device, skeleton estimation method, and skeleton estimation program |
| JP6509279B2 (ja) * | 2017-05-31 | 2019-05-08 | 本田技研工業株式会社 | 物標認識システム、物標認識方法、およびプログラム |
| CN107261490A (zh) * | 2017-07-06 | 2017-10-20 | 腾讯科技(深圳)有限公司 | 实现智能玩具互动的方法、客户端及智能玩具 |
| US10880365B2 (en) * | 2018-03-08 | 2020-12-29 | Ricoh Company, Ltd. | Information processing apparatus, terminal apparatus, and method of processing information |
| CN108919954B (zh) * | 2018-06-29 | 2021-03-23 | 蓝色智库(北京)科技发展有限公司 | 一种动态变化场景虚实物体碰撞交互方法 |
| CN111783187B (zh) * | 2019-04-03 | 2023-12-22 | 京灯(广东)信息科技有限公司 | 一种亮化共享平台应用系统 |
| JP7331769B2 (ja) * | 2020-04-30 | 2023-08-23 | トヨタ自動車株式会社 | 位置推定システム、及び位置推定方法 |
| JP2022077067A (ja) * | 2020-11-11 | 2022-05-23 | 前田建設工業株式会社 | 会議用什器の自動配置システム |
| WO2023073759A1 (ja) * | 2021-10-25 | 2023-05-04 | 株式会社ソニー・インタラクティブエンタテインメント | 操作デバイス |
| JP7612627B2 (ja) * | 2022-02-25 | 2025-01-14 | キヤノン株式会社 | 主被写体判定装置、撮像装置、主被写体判定方法、及びプログラム |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005125460A (ja) * | 2003-10-24 | 2005-05-19 | Sony Corp | ロボット装置のためのモーション編集装置及びモーション編集方法、並びにコンピュータ・プログラム |
| WO2007050885A2 (en) | 2005-10-26 | 2007-05-03 | Sony Computer Entertainment America Inc. | System and method for interfacing with a computer program |
| WO2013122798A1 (en) * | 2012-02-17 | 2013-08-22 | Technology One, Inc. | Baseplate assembly for use with toy pieces |
| WO2014010004A1 (ja) * | 2012-07-13 | 2014-01-16 | 株式会社ソニー・コンピュータエンタテインメント | 入力装置、情報処理システム、情報処理装置、および情報処理方法 |
Family Cites Families (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3091135B2 (ja) * | 1995-05-26 | 2000-09-25 | 株式会社バンダイ | ゲーム装置 |
| JPH10302085A (ja) | 1997-04-30 | 1998-11-13 | Yamaha Corp | Cgモデルの動作記録システム |
| US6290565B1 (en) | 1999-07-21 | 2001-09-18 | Nearlife, Inc. | Interactive game apparatus with game play controlled by user-modifiable toy |
| JP2001134362A (ja) * | 1999-11-01 | 2001-05-18 | Atr Media Integration & Communications Res Lab | 人形型インタフェース仮想空間ウォークスルーシステム |
| US20020196250A1 (en) | 2001-06-20 | 2002-12-26 | Gateway, Inc. | Parts assembly for virtual representation and content creation |
| US7596473B2 (en) | 2003-05-20 | 2009-09-29 | Interlego Ag | Method of constructing a virtual construction model |
| JP4418468B2 (ja) | 2003-05-20 | 2010-02-17 | レゴ エー/エス | 3次元オブジェクトのデジタル表現を操作するための方法およびシステム |
| KR20070026820A (ko) | 2004-06-17 | 2007-03-08 | 레고 에이/에스 | 빌딩 블록 모델들에 대한 빌딩 지침서들의 자동 생성 |
| US7874921B2 (en) | 2005-05-11 | 2011-01-25 | Roblox Corporation | Online building toy |
| JP2007004732A (ja) | 2005-06-27 | 2007-01-11 | Matsushita Electric Ind Co Ltd | 画像生成装置及び画像生成方法 |
| JP4660357B2 (ja) | 2005-11-18 | 2011-03-30 | 任天堂株式会社 | 画像処理プログラムおよび画像処理装置 |
| FR2897680B1 (fr) | 2006-02-17 | 2008-12-05 | Commissariat Energie Atomique | Dispositif de capture de mouvement et procede associe |
| JP4567805B2 (ja) | 2006-05-04 | 2010-10-20 | ソニー コンピュータ エンタテインメント アメリカ リミテッド ライアビリテイ カンパニー | 1つ以上の視覚、音響、慣性およびミックスデータに基づく入力にギアリング効果を与える方法ならびに装置 |
| WO2008139482A2 (en) | 2007-05-16 | 2008-11-20 | Eyecue Vision Technologies Ltd. | System and method for physically interactive board games |
| US8257157B2 (en) | 2008-02-04 | 2012-09-04 | Polchin George C | Physical data building blocks system for video game interaction |
| US8690631B2 (en) | 2008-09-12 | 2014-04-08 | Texas Instruments Incorporated | Toy building block with embedded integrated circuit |
| US9498721B2 (en) | 2009-08-04 | 2016-11-22 | Eyecue Vision Technologies Ltd. | System and method for object extraction |
| WO2011039041A2 (en) | 2009-10-02 | 2011-04-07 | Lego A/S | Connectivity depended geometry optimization for real-time rendering |
| DK2714223T3 (en) | 2011-05-23 | 2015-09-14 | Lego As | Production of building instructions for building element models |
| US9609031B1 (en) * | 2013-12-17 | 2017-03-28 | Amazon Technologies, Inc. | Propagating state information to network nodes |
- 2014
  - 2014-01-21 JP JP2014008894A patent/JP6027554B2/ja active Active
- 2015
  - 2015-01-14 US US15/110,873 patent/US10146332B2/en active Active
  - 2015-01-14 KR KR1020167019113A patent/KR101810415B1/ko active Active
  - 2015-01-14 CN CN201580004639.XA patent/CN105917386B/zh active Active
  - 2015-01-14 EP EP15739885.0A patent/EP3098783B1/en active Active
  - 2015-01-14 WO PCT/JP2015/050791 patent/WO2015111481A1/ja not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005125460A (ja) * | 2003-10-24 | 2005-05-19 | Sony Corp | ロボット装置のためのモーション編集装置及びモーション編集方法、並びにコンピュータ・プログラム |
| WO2007050885A2 (en) | 2005-10-26 | 2007-05-03 | Sony Computer Entertainment America Inc. | System and method for interfacing with a computer program |
| WO2013122798A1 (en) * | 2012-02-17 | 2013-08-22 | Technology One, Inc. | Baseplate assembly for use with toy pieces |
| WO2014010004A1 (ja) * | 2012-07-13 | 2014-01-16 | 株式会社ソニー・コンピュータエンタテインメント | 入力装置、情報処理システム、情報処理装置、および情報処理方法 |
Non-Patent Citations (1)
| Title |
|---|
| MICHAEL PHILETUS WELLER; ELLEN YI-LUEN DO; MARK D GROSS: "Posey: Instrumenting a Poseable Hub and Strut Construction Toy", PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON TANGIBLE AND EMBEDDED INTERACTION, 2008, pages 39 - 46, XP002745438, DOI: doi:10.1145/1347390.1347402 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3098783A4 (en) | 2017-09-13 |
| JP2015138345A (ja) | 2015-07-30 |
| EP3098783A1 (en) | 2016-11-30 |
| CN105917386B (zh) | 2020-08-18 |
| US20160334885A1 (en) | 2016-11-17 |
| EP3098783B1 (en) | 2020-07-22 |
| CN105917386A (zh) | 2016-08-31 |
| JP6027554B2 (ja) | 2016-11-16 |
| KR20160099667A (ko) | 2016-08-22 |
| KR101810415B1 (ko) | 2017-12-19 |
| US10146332B2 (en) | 2018-12-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6027554B2 (ja) | 情報処理装置、情報処理システム、ブロックシステム、および情報処理方法 | |
| JP6193135B2 (ja) | 情報処理装置、情報処理システム、および情報処理方法 | |
| EP3098782B1 (en) | Information processing device and information processing method | |
| JP5977231B2 (ja) | 情報処理システム、情報処理装置、および情報処理方法 | |
| CN109069929B (zh) | 用于玩具识别的系统和方法 | |
| US8144148B2 (en) | Method and system for vision-based interaction in a virtual environment | |
| KR101881620B1 (ko) | 게임플레이에서의 3차원 환경 모델 사용 | |
| JP6039594B2 (ja) | 情報処理装置および情報処理方法 | |
| CN109829976A (zh) | 一种基于全息技术实时表演方法及其系统 | |
| JP6177145B2 (ja) | 情報処理装置および情報処理方法 | |
| JP6177146B2 (ja) | 情報処理装置および情報処理方法 | |
| JP6177147B2 (ja) | 情報処理装置および情報処理方法 | |
| JP7752134B2 (ja) | 3d環境の合成表現を算出するための算出上効率的方法 | |
| TW200947347A (en) | Volume recognition method and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15739885 Country of ref document: EP Kind code of ref document: A1 |
|
| REEP | Request for entry into the european phase |
Ref document number: 2015739885 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2015739885 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15110873 Country of ref document: US |
|
| ENP | Entry into the national phase |
Ref document number: 20167019113 Country of ref document: KR Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |