WO2018123431A1 - Interactive robot (Robot interactif) - Google Patents
Interactive robot
- Publication number
- WO2018123431A1 (PCT application PCT/JP2017/043112; JP2017043112W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- robot
- image
- upper unit
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
Description
- This disclosure relates to an interactive robot that can communicate with a user through conversation.
- Patent Document 1 discloses a robot apparatus that communicates with people (including adults, children, and pets) in a home environment.
- The robot apparatus includes a drive unit having a plurality of links and joints connecting the links; a task instruction input unit for inputting a task instruction; a drive control unit that controls the operation of the drive unit based on the input task and determines a no-entry area consisting of the space required for the operation; and an area display unit that displays the no-entry area. According to this robot apparatus, appropriate safety measures can be taken for people in the home environment.
- Patent Document 2 discloses a robot apparatus that can identify a person.
- The robot apparatus includes a person identification device comprising an image acquisition means that acquires an image, a head detection and tracking means that detects a human head in the image, a front face matching means that obtains a frontal face image from the detected partial image of the head, a face feature extraction means that converts the frontal face image into feature values, a face identification means that identifies the person from the feature values using an identification dictionary, and an identification dictionary storage means that stores the identification dictionary; an overall control unit that controls the operation of the robot; and a moving unit that moves the robot in response to instructions from the overall control unit.
- With this configuration, a person can be identified even in an environment where lighting conditions are not constant, such as a home environment.
- This disclosure provides an interactive robot that can communicate with a user through conversation.
- the interactive robot includes an upper unit, a lower unit, and a joint part that connects the upper unit and the lower unit and is foldable so as to raise and lower the upper unit relative to the lower unit.
- the upper unit includes a projection device that projects an image and an imaging device that captures an image.
- the lower unit includes a drive device that drives a wheel for moving the robot, and a control device that controls the operation of the robot.
- The robot can take a first state in which the upper unit is placed in contact with the lower unit and the projection device, the imaging device, and the joint part are housed inside the upper and lower units, and a second state in which the upper unit is lifted away from the lower unit and at least a part of the projection device, the imaging device, and the joint part is exposed.
- FIG. 1 is a perspective view of the home robot according to one embodiment of the present disclosure (open state, with the head unit raised).
- FIG. 2 is a perspective view of the home robot (head unit raised and away from the body unit).
- FIG. 3 is a perspective view of the home robot (head unit lowered and in contact with the body unit).
- FIG. 4 is an exploded view of the home robot.
- FIG. 5 is a perspective view of the home robot with the head unit lifted up from the body unit.
- FIG. 6 is a perspective view of the joint part that connects the head unit and the body unit.
- FIG. 7(A) is a perspective view of the home robot in the state shown in FIG. 5, viewed from the front; FIG. 7(B) is the same view from the side.
- FIGS. 8 to 10 are diagrams explaining the posture of the home robot.
- FIG. 11 is a block diagram showing the internal configuration of the home robot
- FIG. 12 is a diagram illustrating the home robot projecting video.
- FIG. 13 is a diagram illustrating the home robot interacting with the user.
- FIGS. 1 to 3 are perspective views of the home robot according to the first embodiment of the present disclosure.
- The home robot 100 blends naturally into the living space and has an egg-shaped design that people find mentally and psychologically approachable.
- the home robot 100 is divided into two parts, a head unit 10 (an example of an upper unit) and a body unit 50 (an example of a lower unit).
- The boundary of this division (the joint surface) is located above the center in the height direction.
- This allows the radius of the head unit 10 to be small and its weight to be reduced, so that the center of gravity of the entire robot 100 sits low and the robot is unlikely to tip over.
- In addition, the appearance of the head unit 10 separated from the body unit 50, as shown in FIGS. 1 and 2, is reminiscent of an eggshell cracking open, which makes the robot feel familiar.
- the head part 10 and the body part 50 are connected by a joint part 30. Details of the configuration of the joint unit 30 will be described later.
- The robot can take two states: one in which the head unit 10 is lifted away from the body unit 50 as shown in FIGS. 1 and 2 (hereinafter the "open state"), and one in which the head unit 10 is placed in contact with the top of the body unit 50 as shown in FIG. 3 (hereinafter the "closed state").
- a projector 11 for projecting an image and a camera 13 are fixedly attached to the head unit 10.
- A recess 41a for accommodating the joint part 30 (32) in the closed state is provided in the head unit 10 (see FIG. 1).
- The body unit 50 is likewise provided with a recess 41b for accommodating the joint part 30 (32) together with the projector 11 and camera 13 of the head unit 10 in the closed state (see FIG. 2).
- FIG. 4 is an exploded view of the home robot 100.
- the head unit 10 includes a first upper housing 10a and a second upper housing 10b as outer shell cases.
- the body part 50 includes a first lower casing 50a, a second lower casing 50b, and a third lower casing 50c as outer shell cases.
- The housings 10a, 10b, and 50a to 50c are made of resin.
- the projector 11 and the camera 13 are attached to the first upper housing 10a.
- The first upper housing 10a is provided with an arm attachment portion 10d for attaching the joint part 30.
- a speaker hole 17b is provided in the upper part of the second upper housing 10b.
- the second upper housing 10b is a case that covers the projector 11 and the like mounted on the first upper housing 10a.
- Lens covers 11b and 13b for protecting the respective lenses are attached to the projector 11 and the camera 13 attached to the head unit 10.
- A base 51 for mounting hardware components (circuits, etc.) is disposed in the body unit 50.
- An infrared sensor 55b (see FIG. 11) is disposed at position 55 of the second lower housing 50b. The infrared sensors 55b are arranged at equal intervals at four locations on the second lower housing 50b.
- a wheel 57 for moving the home robot 100 is attached in the lower region of the body portion 50.
- Wheels 57 are attached at four locations in the lower region of the body unit 50.
- A wheel cover 57b that covers the wheel opening is attached over each wheel 57.
- Next, the joint part 30, the mechanism that connects the head unit 10 and the body unit 50, will be described.
- FIG. 5 is a perspective view of the home robot 100 with the head unit 10 raised from the body unit 50.
- FIG. 6 is a perspective view of the joint unit 30 in the home robot 100 in the posture shown in FIG.
- FIG. 7A is a perspective view of the home robot 100 in the state shown in FIG. 5, and is a view seen from the front of the home robot 100.
- FIG. 7B is a perspective view of the home robot 100 in the same state, as viewed from the side of the home robot 100.
- the joint part 30 is composed of a first arm 31 attached to the body part 50 side and a second arm 32 attached to the head part 10 side.
- the first and second arms 31 and 32 are made of metal.
- the first arm 31 is rotatably connected to the first lower housing 50a of the body part 50.
- The second arm 32 is rotatably connected to the arm attachment portion 10d of the first upper housing 10a of the head unit 10.
- The first arm 31 and the second arm 32 are connected so as to be mutually rotatable.
- The joint part 30 thus connects the head unit 10 and the body unit 50 through three joint mechanisms using the two arms.
- This allows the projection angle (tilt angle) of the projector to be changed over a wide range while keeping the center of gravity optimally balanced.
- An arm drive actuator 35 is provided at the joint where the first arm 31 connects to the first lower housing 50a; an arm drive actuator 36 at the joint where the first arm 31 and the second arm 32 are coupled; and an arm drive actuator 37 at the joint where the second arm 32 connects to the arm attachment portion 10d of the head unit 10. These actuators 35 to 37 drive the first and second arms 31 and 32 under the control of the controller 51.
- the actuators 35 to 37 are servo motors, for example.
- By driving the actuator 35, the angle of the first arm 31 with respect to the body unit 50 can be adjusted.
- By driving the actuator 36, the angle of the second arm 32 with respect to the first arm 31 can be adjusted.
- By driving the actuator 37, the angle of the head unit 10 with respect to the second arm 32 can be adjusted.
- In this way, the height of the projection position of the projector 11 and the height of the imaging area of the camera 13 can be adjusted.
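- As an illustration of how these three joint angles set the head pose, here is a minimal planar forward-kinematics sketch in Python; the link lengths, angle convention, and example values are assumptions, since the patent gives no dimensions:

```python
import math

L1 = 0.10  # length of the first arm 31 [m], assumed
L2 = 0.10  # length of the second arm 32 [m], assumed

def head_pose(theta1, theta2, theta3):
    """Planar forward kinematics for the two-arm joint part 30.

    theta1: angle of arm 31 vs. the body (actuator 35), radians
    theta2: angle of arm 32 vs. arm 31 (actuator 36), radians
    theta3: angle of the head vs. arm 32 (actuator 37), radians
    Returns (x, z, tilt): horizontal offset and height of the head
    pivot relative to the body-side joint, and the projector tilt.
    """
    a1 = theta1
    a2 = theta1 + theta2
    x = L1 * math.cos(a1) + L2 * math.cos(a2)
    z = L1 * math.sin(a1) + L2 * math.sin(a2)
    tilt = a2 + theta3  # all three joints contribute to the final tilt
    return x, z, tilt

# Example: head raised high, projector tilted slightly downward.
print(head_pose(math.radians(80), math.radians(-30), math.radians(-20)))
```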
- FIGS. 8A to 8C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected toward the front lower side.
- FIGS. 9A to 9C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected toward the front.
- FIGS. 10A to 10C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected upward.
- FIGS. 8(A), 9(A), and 10(A) are front views; FIGS. 8(B), 9(B), and 10(B) are side views; and FIGS. 8(C), 9(C), and 10(C) are rear views.
- the projection direction of the projector 11 can be freely changed by adjusting the angles of the first and second arms 31 and 32 of the joint portion 30.
- When the image is projected upward, the opening angle of the head unit 10 is increased.
- The controller 51 controls the angles of the first and second arms 31 and 32 so that the weight of the head unit 10 does not unbalance the robot 100 and cause it to tip over.
- In particular, when the image is projected directly upward, where the weight of the head unit 10 could upset the balance, the angles of the first and second arms 31 and 32 are adjusted to keep the robot 100 from tipping over.
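- The tip-over condition can be sketched as a center-of-gravity check over the wheel footprint; the masses and dimensions below are hypothetical, and head_x would come from a pose computation like head_pose() above:

```python
# A minimal stability sketch: the combined center of gravity must stay
# over the wheel footprint. The body's CoG is assumed at x = 0.
M_HEAD = 0.6      # head unit mass [kg], assumed
M_BODY = 1.4      # body unit mass [kg], assumed
FOOTPRINT = 0.07  # half-width of the wheel footprint [m], assumed

def is_stable(head_x):
    """Return True if the combined center of gravity stays over the
    footprint when the head pivot is offset horizontally by head_x [m]."""
    cog_x = (M_HEAD * head_x) / (M_HEAD + M_BODY)
    return abs(cog_x) <= FOOTPRINT

# The controller would reject arm angles whose head_pose() x-offset
# fails this check, e.g. when aiming the projector straight up.
print(is_stable(0.05), is_stable(0.30))
```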
- FIG. 11 is a block diagram showing the electrical configuration of the home robot 100. As shown in FIG. 11, the home robot 100 includes a projector 11, a camera 13, a microphone 15, a speaker 17, a temperature sensor 19, and a gyro sensor 21 in the head unit 10.
- the home robot 100 includes a controller 51, a communication device 53, a data storage unit 54, an infrared sensor 55b, a wheel drive unit 56, and a battery 59 in the body unit 50.
- the home robot 100 is provided with arm drive actuators 35 to 37 at the connecting portions of the head portion 10, the body portion 50 and the joint portion 30.
- the projector 11 is a projection device that projects a high-quality image such as 4K.
- the projector 11 receives the video signal from the controller 51 and projects the video indicated by the video signal.
- the camera 13 is an imaging device that includes an optical system and an image sensor, and shoots a subject to generate image data.
- the microphone 15 inputs external sound, converts it into a sound signal, and transmits it to the controller 51.
- the speaker 17 outputs sound based on the sound signal from the controller 51.
- the temperature sensor 19 detects the temperature inside the head unit 10 (particularly, around the projector 11).
- the gyro sensor 21 detects the movement (angular acceleration) of the head unit 10.
- The infrared sensors 55b are provided below the housing 50b of the body unit 50 and detect the presence or absence of obstacles around the home robot 100.
- the data storage unit 54 is a recording medium that stores control parameters, data, control programs, and the like necessary for realizing the functions of the home robot 100.
- the data storage unit 54 can be composed of, for example, a hard disk (HDD), a semiconductor storage device (SSD), or an optical disk medium.
- the controller 51 is a control device that controls the operation of the home robot 100.
- the controller 51 includes a CPU, a RAM, a video signal processing circuit, an audio signal processing circuit, and the like that control the operation of the home robot 100 as a whole.
- the CPU realizes the function of the home robot 100 by executing a control program (software).
- the function of the home robot 100 may be realized by cooperation of software and hardware, or may be realized only by a hardware circuit designed exclusively. That is, an MPU, DSP, FPGA, ASIC, or the like may be mounted instead of the CPU.
- the wireless communication unit 53 is a communication module (communication circuit) for performing communication according to the Bluetooth (registered trademark) standard, the WiFi standard, or the like.
- the wireless communication unit 53 may include a communication module (circuit) for performing communication according to a communication standard for wide-area communication such as 3G, 4G, LTE, WiMAX (registered trademark).
- the home robot 100 has four independent wheels 57.
- the wheel drive unit 56 is a drive circuit that generates control signals for independently driving the four wheels 57.
- Each wheel 57 is an omni wheel, which enables versatile movement.
- An omni wheel has a plurality of barrel-shaped rollers arranged around its circumference.
- An omni wheel realizes movement in multiple directions by combining rotation of the wheel body about its axle (forward and backward movement) with rotation of the barrel-shaped rollers on the circumference (left and right movement).
- the wheel driving unit 56 independently drives the four wheels 57 under the control of the controller 51, thereby realizing various movements including linear movement and turning in the front-rear and left-right directions.
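- A minimal sketch of how such four-wheel omni mixing could work; the mounting geometry and this Python formulation are assumptions, not the patent's implementation:

```python
import math

R_M = 0.08  # wheel mounting radius from the robot center [m], assumed
# Mounting angles of the four omni wheels, assumed evenly spaced.
MOUNT_RAD = [math.radians(a) for a in (45, 135, 225, 315)]

def wheel_speeds(vx, vy, omega):
    """Mix a desired body velocity (vx forward, vy left [m/s], omega yaw
    rate [rad/s]) into four omni-wheel surface speeds. Each wheel drives
    tangentially, so its speed is the body velocity projected onto its
    drive direction plus the rotation term R_M * omega."""
    return [-vx * math.sin(p) + vy * math.cos(p) + R_M * omega
            for p in MOUNT_RAD]

# Pure forward motion: the four wheels get symmetric, diagonal speeds.
print(wheel_speeds(0.2, 0.0, 0.0))
```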
- the battery 59 supplies power for causing each part of the home robot 100 to function.
- The battery 59 is a rechargeable secondary battery.
- the home robot 100 has a voice recognition function and can accept voice instructions from the user. Specifically, the voice from the user is input through the microphone 15 mounted on the head unit 10.
- the controller 51 performs AD conversion on the audio signal from the microphone 15 and recognizes the content of the user instruction by performing voice recognition based on the converted audio data.
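- To make this flow concrete, here is a minimal sketch of the microphone-to-text pipeline using the off-the-shelf SpeechRecognition package; the patent names no recognition engine, so the library and language setting are assumptions:

```python
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:        # analogous to microphone 15
    audio = recognizer.listen(source)  # captures and digitizes the audio

try:
    # Cloud recognizer as a stand-in; Japanese assumed for this robot.
    text = recognizer.recognize_google(audio, language="ja-JP")
    print("User said:", text)
except sr.UnknownValueError:
    print("Speech was not recognized.")
```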
- the home robot 100 can interact with the user by outputting sound from the speaker 17.
- Various voice types can be output, such as adult or child voices, male or female voices, and beep sounds.
- the home robot 100 stores various types of vocabulary information in the data storage unit 54, and can output information having the same meaning in different expressions depending on situations in a conversation with the user.
- When the home robot 100 is not operating, the head unit 10 is lowered and brought into contact with the body unit 50 as shown in FIG. 3. In this closed state, the projector 11 and the camera 13 are housed inside the head unit 10 and the body unit 50 and are not exposed to the outside. The home robot 100 then has the egg shape shown in FIG. 3.
- In this state, the voice recognition function via the microphone 15 remains on, while the other functions are turned off in a standby state for energy saving.
- When a voice instruction is received, the controller 51 wakes up, the head unit 10 rises from the body unit 50, and the home robot 100 enters the open state as shown in FIG. 1 or FIG. 2. In this open state, the projector 11 and the camera 13 are exposed from the head unit 10 and the body unit 50, enabling video projection and image capture.
- The home robot 100 of the present embodiment thus transforms from the egg shape shown in FIG. 3 to the shapes shown in FIGS. 1 and 2.
- Such a transformation can give the user the impression that a new creature has hatched from a cracked egg, evoking a sense of a living thing.
- the home robot 100 can be connected to an access point via the wireless communication unit 53 and can be connected to a network (for example, the Internet) via the access point. Therefore, the home robot 100 can access a server and other devices connected on the network (cloud), and can acquire various content information.
- The controller 51 of the home robot 100 determines obstacles and surrounding conditions based on the detection signals from the infrared sensors 55b. For example, when the controller 51 determines that there is an obstacle nearby, it determines a movement path that avoids the obstacle and controls the wheel drive unit 56 accordingly. If, while moving on a floor or table, the controller 51 detects a hole in the floor ahead or the absence of a table surface ahead, it determines a movement path that prevents the robot from falling into the hole or off the table, and controls the wheel drive unit 56.
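- A minimal sketch of such an avoidance decision from infrared distance readings; the sensor roles and thresholds are hypothetical:

```python
# Thresholds and sensor layout are assumptions for illustration only.
OBSTACLE_NEAR_M = 0.15    # treat anything closer as an obstacle [m]
FLOOR_EXPECTED_M = 0.05   # expected distance to the floor just ahead [m]

def plan_motion(ir_front, ir_left, ir_right, ir_floor):
    """Pick a motion command from infrared distance readings [m]:
    stop at a table edge or hole, steer around obstacles, else go on."""
    if ir_floor > FLOOR_EXPECTED_M:    # no surface ahead: edge or hole
        return "stop_and_back_away"
    if ir_front < OBSTACLE_NEAR_M:     # obstacle ahead: pick the open side
        return "turn_left" if ir_left > ir_right else "turn_right"
    return "go_forward"

print(plan_motion(0.10, 0.50, 0.20, 0.05))  # -> "turn_left"
```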
- When the home robot 100 receives a shutdown (power-off) instruction from the user, it changes from the open state to the closed state. At this time, to prevent the user's fingers from being pinched between the head unit 10 and the body unit 50, the head unit 10 may be held with a certain gap from the body unit 50 rather than brought into full contact. In addition, a warning message or warning sound may be output before shutdown begins.
- The home robot 100 has a function of shutting itself down (that is, stopping the projector 11) when the temperature of the head unit 10 becomes high. To this end, the controller 51 determines whether the robot 100 has become hot based on the temperature detected by the temperature sensor 19. Specifically, when the detected temperature exceeds a predetermined value, the controller 51 determines that the robot 100 is hot and shuts it down (stops the projector 11). At that time, the controller 51 may output a voice warning message such as "Please let me cool down a little" from the speaker 17 before automatically shutting down, so that the user understands the robot 100 will shut down soon because its temperature is high.
- In this case as well, the home robot 100 shuts down after changing its posture from the open state to the closed state, which prevents the user's fingers from being pinched.
- The controller 51 may also determine whether the robot 100 is hot based on the continuous operating time of the projector 11 in addition to the temperature of the head unit 10.
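- A minimal sketch of this overheat check, assuming hypothetical limits for both criteria (the patent specifies neither a temperature threshold nor a time limit):

```python
import time

TEMP_LIMIT_C = 70.0        # hypothetical shutdown threshold [deg C]
MAX_ON_TIME_S = 2 * 3600   # hypothetical continuous-operation limit [s]

def should_shut_down(temp_c, projector_on_since):
    """Return True when either the head temperature or the projector's
    continuous operating time exceeds its limit."""
    too_hot = temp_c > TEMP_LIMIT_C
    too_long = (time.monotonic() - projector_on_since) > MAX_ON_TIME_S
    return too_hot or too_long

# On True, the controller would speak the warning, close the head unit,
# and then power off, in that order.
```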
- The home robot 100 performs cute, emotionally expressive gestures (nodding, shaking its body, and so on) according to the scene. For example, the home robot 100 recognizes a question from the user and the content of the question. To indicate "YES" in response, the home robot 100 moves the head unit 10 up and down so that it appears to nod. Specifically, the controller 51 drives the actuator 37 at the connection between the second arm 32 and the head unit 10 to make the head unit 10 vibrate slightly up and down with respect to the second arm 32. Conversely, to indicate "NO" in response to the user's question, the entire home robot 100 is swung left and right.
- the controller 51 controls the movement of the four wheels 57 so that the entire home robot 100 is swung left and right.
- Embarrassment or anger may be expressed by bringing the head unit 10 into the closed state in contact with the body unit 50. Emotions can thus be expressed through these various movements.
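- For illustration, a sketch of the "YES" and "NO" gestures against a purely hypothetical actuator and wheel API (the patent defines no software interface):

```python
import time

def nod_yes(actuator37, cycles=2, amplitude_deg=5):
    """Indicate "YES": oscillate the head-side actuator 37 so the head
    unit vibrates slightly up and down against the second arm."""
    for _ in range(cycles):
        actuator37.move_relative(+amplitude_deg)  # hypothetical call
        time.sleep(0.15)
        actuator37.move_relative(-amplitude_deg)
        time.sleep(0.15)

def shake_no(wheel_drive, cycles=2, swing_deg=10):
    """Indicate "NO": swing the whole robot left and right by turning
    the four wheels in place."""
    for _ in range(cycles):
        wheel_drive.turn_in_place(+swing_deg)  # hypothetical call
        time.sleep(0.2)
        wheel_drive.turn_in_place(-swing_deg)
        time.sleep(0.2)
```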
- the home robot 100 has a function of capturing an image in front of the robot 100 by the camera 13.
- An image captured by the camera 13 may be stored in the data storage unit 54.
- The controller 51 can continuously acquire images from the camera 13 and recognize the situation in front of the robot 100 by analyzing them. For example, based on the image analysis results, the controller 51 can detect whether a user (person) is present in front of the robot, the user's position, the number of people, the orientation of the user's face, and so on.
- the home robot 100 can project a video (content information 80) from the projector 11 as shown in FIG.
- the projection position of the image can be changed by changing the direction of the projector 11.
- The vertical (tilt) direction of the projector 11 can be changed by controlling the arm drive actuators 35 to 37 to change the height and tilt angle of the head unit 10 (i.e., of the projector 11). In the vertical direction, the projection position can be changed over a range of approximately 180 degrees, from almost directly below to directly above.
- The horizontal direction (pan direction) of the projector 11 can be changed by controlling the wheels 57 to change the horizontal orientation of the entire robot 100.
- an image can be projected on various places such as a table surface, a wall, a floor, and a ceiling.
- The controller 51 of the home robot 100 may recognize the user's position from the image analysis results and control the projection operation of the projector 11 based on it. For example, when the controller 51 detects that a user is at the position where video is to be projected, it may, for safety, control the projector 11 to stop projecting or to reduce the intensity of the projection light. This prevents the user from being exposed to strong projection light from the projector 11.
- Alternatively, when the home robot 100 (controller 51) detects that the user is at the position where video is to be projected, it may control the orientation of the projector 11 so as to project the video at a position where the user is not present. This likewise prevents the user from being exposed to the projection light.
- When the orientation of the home robot 100 is changed, the image is projected within a predetermined time after the change, so that the user does not feel ignored by the robot turning away.
- The controller 51 may project the image at a position easy for the user to see (for example, the area in front of the user). In doing so, the controller 51 detects the user's face in the captured image, determines its orientation, and adjusts the orientation of the projected video (flipping it vertically and horizontally as needed) so that the user sees the content right side up. For example, the controller 51 controls the projection position and the orientation of the video so that content appearing in the area in front of the user is correctly oriented from the user's point of view. The user can thus always view the video in the correct orientation regardless of the position of the robot 100.
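- One way to sketch this orientation adjustment: rotate the projected frame so that "up" in the content points away from the user; the coordinate inputs are assumed to come from the image analysis, and the convention is hypothetical:

```python
import math

def content_rotation_deg(user_xy, projection_xy):
    """Return the rotation to apply to the projected frame so a user at
    user_xy, looking toward projection_xy on the floor, sees the content
    right side up (floor coordinates assumed)."""
    dx = projection_xy[0] - user_xy[0]
    dy = projection_xy[1] - user_xy[1]
    # Content "up" should face away from the user along this direction.
    return math.degrees(math.atan2(dx, dy))

# Example: a user directly "south" of the area sees the content unrotated.
print(content_rotation_deg((0.0, 0.0), (0.0, 1.0)))  # -> 0.0
```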
- When projecting video from the projector 11, the controller 51 of the home robot 100 performs keystone correction: based on the relative positional relationship between the projection surface and the projector 11, it transforms the image to be projected so that a correctly shaped image is displayed on the projection surface.
- Based on the original video signal, the home robot 100 can also generate and project an image that is flipped vertically or horizontally.
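- Keystone correction of this kind can be sketched as a perspective pre-warp; the use of OpenCV and the corner estimate are assumptions, not the patent's stated method:

```python
import cv2
import numpy as np

def keystone_correct(frame, corners_on_surface):
    """Pre-warp the frame so that, after projection onto a tilted
    surface, it appears rectangular. corners_on_surface are the four
    corners (in frame pixels) where an uncorrected frame would land,
    estimated from the projector pose (an assumed input)."""
    h, w = frame.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(corners_on_surface)
    # Apply the inverse mapping so the projection cancels the skew.
    H = cv2.getPerspectiveTransform(dst, src)
    return cv2.warpPerspective(frame, H, (w, h))
```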
- The projector 11 may also be provided with a shake correction function. To this end, the projector 11 includes a correction lens for changing the optical path and a drive mechanism that moves the correction lens in a plane orthogonal to the optical axis of the projection light in accordance with the shake.
- the projector 11 receives a signal indicating the shake of the head unit 10 from the gyro sensor 21, and moves the correction lens in a plane orthogonal to the optical axis of the projection light so as to cancel the shake.
- As a result, the home robot 100 can keep the projected image steady at the desired position without blur, even when the robot swings or pans during projection.
- Such shake correction can be realized with known image stabilization techniques used in ordinary cameras.
- Alternatively, the video signal itself may be adjusted according to the detected shake; that is, the position of the image (object) represented by the video signal may be shifted so as to cancel the shake.
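- A minimal sketch of this electronic (video-signal) compensation path; the calibration constant is hypothetical:

```python
def stabilizing_shift_px(gyro_rate_dps, dt_s, px_per_degree):
    """One step of electronic shake compensation: integrate the gyro's
    angular rate into the head's tilt change over dt_s, then return the
    image shift (in pixels) that cancels it. px_per_degree is a
    hypothetical calibration constant mapping projector tilt to pixels."""
    shake_deg = gyro_rate_dps * dt_s   # small-angle integration
    return -shake_deg * px_per_degree  # shift opposite to the shake

# Example: 2 deg/s of shake over a 10 ms frame, 30 px per degree.
print(stabilizing_shift_px(2.0, 0.01, 30.0))  # -> -0.6 px
```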
- The home robot 100 can photograph a book or document with the camera 13 and recognize the information written in it from the captured image.
- the controller 51 has an OCR (Optical Character Recognition) function and recognizes text from a captured image.
- The controller 51 may synthesize speech based on the recognized text and output it from the speaker 17. In this way, the text of a book can be read aloud.
- The controller 51 may also read an answer sheet filled in by the user, grade it, and convey the result to the user by video or audio.
- The controller 51 may also analyze the captured image, recognize musical notation, and play the corresponding music.
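- As an illustration of the read-aloud flow above, a minimal sketch with off-the-shelf OCR and text-to-speech packages (pytesseract and pyttsx3 are stand-ins; the patent names no engines, and the file name is hypothetical):

```python
import cv2
import pytesseract
import pyttsx3

frame = cv2.imread("captured_page.jpg")                # image from camera 13
text = pytesseract.image_to_string(frame, lang="jpn")  # OCR step

engine = pyttsx3.init()   # local TTS engine as a stand-in
engine.say(text)          # speak the recognized text via speaker 17
engine.runAndWait()
```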
- the home robot 100 configured as described above can present various information (video, audio) to the user while talking to the user.
- the home robot 100 is an interactive robot that can communicate with a user through conversation.
- The home robot 100 includes the head unit 10 (an example of an upper unit), the body unit 50 (an example of a lower unit), and the bendable joint part 30, which connects the head unit 10 and the body unit 50 and raises and lowers the head unit 10 relative to the body unit 50.
- the head unit 10 includes a projector 11 (an example of a projection device) that projects an image, and a camera 13 (an example of an imaging device) that captures an image.
- the body unit 50 includes a wheel drive unit 56 (an example of a drive device) that drives a wheel 57 for moving the robot 100, and a controller 51 (an example of a control device) that controls the operation of the robot 100.
- The home robot 100 can take a closed state in which the head unit 10 is placed in contact with the body unit 50 and the projector 11, the camera 13, and the joint part 30 are housed inside the head unit 10 and the body unit 50 (an example of the first state; see FIG. 3), and an open state in which the head unit 10 is lifted away from the body unit 50 and at least part of the projector 11, the camera 13, and the joint part 30 is exposed (an example of the second state; see FIGS. 1 and 2).
- The home robot 100 configured as described above can present various information (video, audio) while conversing with the user. It can therefore provide support in many forms: nursing care support for the elderly (such as helping prevent dementia and depression), child education (distance education, learning support, answering for absent parents, etc.), and recreation (games, reading aloud, etc.) (see FIGS. 13A and 13B).
- The home robot 100 has an egg shape when closed. Thanks to this egg-shaped design, the home robot 100 blends naturally into the living space and feels mentally and psychologically approachable.
- the first embodiment has been described as an example of the technique disclosed in the present application.
- the technology in the present disclosure is not limited to this, and can also be applied to an embodiment in which changes, replacements, additions, omissions, and the like are appropriately performed.
- the home robot of the present disclosure is an interactive robot, and can provide various user support such as recreation, education, and nursing care at home.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
This disclosure concerns an interactive robot (100) capable of communicating with a user through conversation, comprising: an upper unit (10); a lower unit (50); and a joint part (30) that connects the upper unit and the lower unit and can bend freely so as to raise and lower the upper unit relative to the lower unit. The upper unit (10) includes a projection device (11) that projects video and an imaging device (13) that captures images. The lower unit includes a drive device that drives wheels for moving the robot, and a control device that controls the operation of the robot. The robot can take a first state in which the upper unit rests in contact with the lower unit and the projection device, the imaging device, and the joint part are housed within the upper and lower units, and a second state in which the upper unit is raised relative to the lower unit and at least parts of the projection device, the imaging device, and the joint part are exposed.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662439575P | 2016-12-28 | 2016-12-28 | |
| US62/439,575 | 2016-12-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018123431A1 (fr) | 2018-07-05 |
Family
ID=62707219
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/043112 (WO2018123431A1, Ceased) | Robot interactif | 2016-12-28 | 2017-11-30 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018123431A1 (fr) |
- 2017-11-30: WO application PCT/JP2017/043112 filed (published as WO2018123431A1); status: not active, Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005084789A (ja) * | 2003-09-05 | 2005-03-31 | Advanced Telecommunication Research Institute International | パーソナルコンピュータ |
| JP2005313308A (ja) * | 2004-03-30 | 2005-11-10 | Nec Corp | ロボット、ロボット制御方法、ロボット制御プログラム、ならびに思考装置 |
| JP2012519264A (ja) * | 2009-02-27 | 2012-08-23 | アール. ブルックス アソシエイツ インコーポレーティッド | 磁気検査車両を利用する検査システムおよび検査プロセス |
| WO2013099104A1 (fr) * | 2011-12-28 | 2013-07-04 | パナソニック株式会社 | Bras de robot |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019070050A1 (fr) * | 2017-10-06 | 2019-04-11 | 本田技研工業株式会社 | Robot mobile |
| CN108877347A (zh) * | 2018-08-02 | 2018-11-23 | 安徽硕威智能科技有限公司 | 基于机器人投影功能的课堂实景再现交互式教学系统 |
| CN109015685A (zh) * | 2018-08-28 | 2018-12-18 | 广东海翔教育科技有限公司 | 一种用于教育机器人 |
| WO2020253118A1 (fr) * | 2019-06-20 | 2020-12-24 | 深圳前海微众银行股份有限公司 | Procédé et appareil de développement commercial |
| US20230191632A1 (en) * | 2021-12-22 | 2023-06-22 | Lg Electronics Inc. | Mobile robot |
| US11981025B2 (en) * | 2021-12-22 | 2024-05-14 | Lg Electronics Inc. | Mobile robot |
| JP2023177780A (ja) * | 2022-06-03 | 2023-12-14 | 三菱ロジスネクスト株式会社 | 荷役車両及びその制御方法 |
| JP2024098057A (ja) * | 2022-06-03 | 2024-07-19 | 三菱ロジスネクスト株式会社 | 荷役車両及びその制御方法 |
| JP7534064B2 (ja) | 2022-06-03 | 2024-08-14 | 三菱ロジスネクスト株式会社 | 荷役車両及びその制御方法 |
| JP7632987B2 (ja) | 2022-06-03 | 2025-02-19 | 三菱ロジスネクスト株式会社 | 荷役車両及びその制御方法 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018123431A1 (fr) | Robot interactif | |
| US10921818B2 (en) | Robot | |
| US10120386B2 (en) | Robotic creature and method of operation | |
| EP3624442A1 (fr) | Robot et son procédé de fonctionnement | |
| US20180154513A1 (en) | Robot | |
| CN109262606B (zh) | 装置、方法、记录介质以及机器人 | |
| CN111736585B (zh) | 机器人以及机器人的控制方法 | |
| CN111163906A (zh) | 能够移动的电子设备及其操作方法 | |
| CN104800950A (zh) | 自闭症儿童辅助机器人及系统 | |
| CN210155626U (zh) | 信息处理装置 | |
| CN106217393B (zh) | 移动式远端临场交互平台 | |
| JP6565853B2 (ja) | コミュニケーション装置 | |
| US20180376069A1 (en) | Erroneous operation-preventable robot, robot control method, and recording medium | |
| US12236152B2 (en) | Information processing apparatus and information processing method for displaying a feeling parameter associated with an autonomous moving body | |
| JP2005335053A (ja) | ロボット、ロボット制御装置およびロボットの制御方法 | |
| US11065769B2 (en) | Robot, method for operating the same, and server connected thereto | |
| JP6586810B2 (ja) | 玩具 | |
| US12197188B2 (en) | Information processing apparatus and information processing method | |
| CN110382174A (zh) | 一种用于执行情绪姿势以与用户交互作用的装置 | |
| WO2023243431A1 (fr) | Robot de soins infirmiers, procédé de commande de robot de soins infirmiers et dispositif de traitement d'informations | |
| Maheux et al. | T-Top, an open source tabletop robot with advanced onboard audio, vision and deep learning capabilities | |
| JP6686583B2 (ja) | ロボット及びプログラム | |
| US12311539B2 (en) | Information processing apparatus and control method | |
| US20250244749A1 (en) | Information processing apparatus and information processing method | |
| US12001226B2 (en) | Information processing apparatus, control method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17885696; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17885696; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |