
WO2022034839A1 - Robot system - Google Patents

Robot system

Info

Publication number
WO2022034839A1
WO2022034839A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
robot
operated
light emission
control data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2021/028903
Other languages
French (fr)
Japanese (ja)
Inventor
クリス フランシス クリストファーズ
崇博 太田
真一 仲川
健太郎 清水
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ipresence
Ipresence Ltd
Original Assignee
Ipresence
Ipresence Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ipresence, Ipresence Ltd filed Critical Ipresence
Publication of WO2022034839A1 publication Critical patent/WO2022034839A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 Self-movable toy figures
    • A63H30/00 Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02 Electrical arrangements
    • A63H30/04 Electrical arrangements using wireless transmission
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to a robot system that can be controlled remotely.
  • Japanese Patent Application Laid-Open No. 2017-50018 discloses a self-propelled robot that can be equipped with a tablet. The user can remove this tablet from the self-propelled robot and control the movement of the camera and each part provided in the self-propelled robot at hand. At this time, the control signal from the tablet is given to the robot via the server device of the Internet.
  • The robot system according to the present invention comprises an operation-side portable device, a robot-side portable device, and a robot substrate device.
  • The operation-side portable device includes: captured image display means for displaying, on a touch display, the robot-side captured image transmitted from the robot-side portable device; captured image transmitting means for transmitting, by a transmission unit, the operation-side image captured by a camera to the robot-side portable device, either via a server device or directly; operation receiving means for receiving a user operation input on the touch display; and control data transmission means for transmitting, by the transmission unit, the received operation as control data to the robot-side portable device, either via the server device or directly.
  • The robot-side portable device includes: captured image display means for displaying, on a touch display, the operation-side captured image transmitted from the operation-side portable device; captured image transmitting means for transmitting, by a transmission unit, the robot-side image captured by a camera to the operation-side portable device, either via the server device or directly; control data receiving means for receiving the control data transmitted from the operation-side portable device; and light emission control means that uses at least a part of the touch display as a control light emission region, controls the light emission of that region based on the control data, and sends out control light.
  • The robot substrate device includes: a main body; a holding unit that can be moved relative to the main body by a drive unit and that has a holding wall for physically holding the touch display of the robot-side portable device while facing that touch display; light emission detecting means that uses a light emission detection unit to detect the control light from the control light emission region of the robot-side portable device held on the robot substrate device by the holding unit, and obtains control light data; and robot-side control means that, based on the control light data from the light emission detecting means, controls the drive unit to at least move the holding unit that holds the robot-side portable device.
  • the system according to the present invention is characterized in that the drive unit includes a motor for rotating the holding unit that holds the robot-side portable device.
  • the system according to the present invention is characterized in that the drive unit drives at least an arm member provided on the robot substrate device.
  • the arm member can be driven by remote control.
  • The system according to the present invention is characterized in that the light emission control means of the robot-side portable device outputs, from the control light emission region, control light whose duty ratio differs according to the received control data.
  • control can be performed with a simple configuration that is not easily affected by noise.
  • The light emission control means changes the duty ratio of the control light at least depending on whether the control data is an operation for driving the drive unit or an operation for stopping it.
  • The drive unit includes a motor that rotates the holding unit holding the robot-side portable device, and the light emission control means changes the duty ratio of the control light at least depending on whether the control data is an operation to rotate the motor clockwise, to rotate it counterclockwise, or to stop it.
  • the system according to the present invention is characterized in that the mobile device on the operation side or the mobile device on the robot side is a smartphone or a tablet computer.
  • The control system according to the present invention comprises an operation-side device, an operated-side display device, and an operated-side main body device.
  • The operation-side device includes: operation receiving means for receiving a user operation input on an operation input unit; and control data transmission means for transmitting, by a transmission unit, the received operation as control data to the operated-side display device, either via a server device or directly.
  • The operated-side display device includes: control data receiving means for receiving the control data transmitted from the operation-side device; and light emission control means that uses at least a part of a display as a control light emitting region and outputs control light from that region based on the control data.
  • The operated-side main body device includes: a holding unit having a holding wall for physically holding the display of the operated-side display device while facing that display; a light emission detection unit that detects the control light from the control light emitting region of the operated-side display device held on the operated-side main body device by the holding unit, and obtains control light data; and operated-side main body control means that performs control processing based on the control light data from the light emission detection unit.
  • The videophone system according to the present invention has an operation-side videophone device and an operated-side videophone device.
  • The operation-side videophone device includes: operation receiving means for receiving a user operation input on an operation input unit; and control data transmission means for transmitting, by a transmission unit, the received operation as control data to the operated-side videophone device.
  • The operated-side videophone device includes: control data receiving means for receiving the control data transmitted from the operation-side videophone device; and operated-side control means for changing the orientation of the camera or microphone of the operated-side videophone device based on the control data.
  • the direction of the camera or microphone of the other party can be controlled, and an image that is easy for oneself to see and a voice that is easy to hear can be obtained.
  • step S3 corresponds to the "captured image display means" of the mobile device on the operation side.
  • step S1 corresponds to the "captured image transmitting means" of the mobile device on the operation side.
  • steps S4 and S5 correspond to the "operation receiving means".
  • step S7 corresponds to the "control data transmission means".
  • step S43 corresponds to the "captured image display means" of the mobile device on the robot side.
  • step S41 corresponds to the "captured image transmitting means" of the mobile device on the robot side.
  • step S44 corresponds to the "control data receiving means".
  • step S45 corresponds to the "light emission control means".
  • step S61 corresponds to the "light emission detecting means".
  • step S62 corresponds to the "robot side control means".
  • the "program” is a concept that includes not only a program that can be directly executed by the CPU, but also a source format program, a compressed program, an encrypted program, and the like.
  • System Configuration: FIG. 1 shows the overall configuration of a robot system according to an embodiment of the present invention.
  • the robot-side portable device 30 is mounted on the robot substrate device 50.
  • The operation-side portable device 10 is provided at a remote location so that it can communicate with the robot substrate device 50 via the server device 70.
  • FIG. 2 shows the functional configuration of the robot system.
  • the camera 14 of the operation-side mobile device 10 captures an image of the operation-side user having the operation-side mobile device 10.
  • the captured image transmitting means 20 transmits the captured image to the robot-side portable device 30 by the communication unit 24.
  • the captured image display means 38 of the robot-side portable device 30 displays the received captured image on the display 32. As a result, the robot side user can see the state of the operation side user.
  • the camera 34 of the robot-side mobile device 30 captures an image of a robot-side user near the robot-side mobile device 30.
  • the captured image transmitting means 40 transmits the captured image to the operating side portable device 10 by the communication unit 44.
  • the captured image display means 18 of the operation-side portable device 10 displays the received captured image on the display 12. As a result, the operation side user can see the state of the robot side user.
  • control data transmission means 22 transmits this control data to the robot side mobile device 30 by the communication unit 24.
  • the control data receiving means 42 of the robot-side portable device 30 receives this control data and gives it to the light emission controlling means 36. Based on this control data, the light emission control means 36 blinks a control light emission region provided in a part of the display 32 to emit control light.
  • the light emission detecting means 32 of the robot substrate device 50 holding the robot-side portable device 30 receives this control light and obtains control light data.
  • This control light data is given to the robot-side control means 55.
  • The robot-side control means 55 controls the drive unit 57 based on the control light data.
  • the drive unit 57 moves the holding unit that holds the robot-side mobile device 30.
  • As a result, the orientation of the camera 34 of the robot-side mobile device 30 changes, and the orientation of the image displayed on the operation-side portable device 10 changes as well.
  • the direction of the camera 34 of the robot-side mobile device 30 can be changed to control the imaging direction based on the operation command (control data) of the operation-side user. Therefore, the operation side user can perform the operation as if his / her alter ego is on the side of the robot side portable device 30.
  • FIG. 3 shows the appearance of the smartphone 10 which is an operation side terminal device. Since the appearance of the smartphone 30 which is a terminal device on the robot side is the same, the reference numerals are shown in parentheses.
  • a touch screen 64 (84) is provided on the front surface of the smartphone 10 (30).
  • a speaker 66 (86) and a camera 72 (92) are provided on the right side.
  • a microphone 70 (90) is provided on the left side.
  • the touch screen 84 of the smartphone 30, which is a terminal device on the robot side, is provided with a control light emitting region 31 in the lower central portion thereof.
  • the control light emitting region 31 is used as a region for emitting control light.
  • FIG. 4 shows the hardware configuration of the smartphone 10 (30).
  • A memory 62 (82), a touch display 64 (84), a speaker 66 (86), a non-volatile memory 68 (88), a microphone 70 (90), a camera 72 (92), and a communication circuit 74 (94) are connected to the CPU 60 (80).
  • the call circuit for the telephone is omitted.
  • the communication circuit 74 (94) is a circuit for connecting to the Internet.
  • the operating system 76 (96) and the operation side terminal program 78 (robot side terminal program 98) are recorded in the non-volatile memory 68 (88).
  • the operation side terminal program 78 (robot side terminal program 98) exerts its function in cooperation with the operating system 76 (96).
  • FIG. 5a shows the appearance of the robot substrate device 50.
  • the main body 52 is provided with four legs 58 so that it can be stably placed on a desk or the like.
  • a holding portion 54 is provided on the upper portion of the main body portion 52.
  • a recess 56 for holding the smartphone 30 is provided on the upper surface of the holding portion 54.
  • FIG. 5b shows a cross section of the robot substrate device 50 near the center of the smartphone 30 mounted in the recess 56.
  • the touch display 84 of the smartphone 30 can be tilted and held so as to face upward.
  • The holding unit 54 is provided with a phototransistor 51, which is a photodetector, facing the control light emitting region 31 of the touch display 84 of the mounted smartphone 30.
  • FIG. 5c shows a cross-sectional view of the robot substrate device 50.
  • the main body 52 is provided with a stepping motor 108 and a control circuit 101.
  • the holding portion 54 is fixed to the shaft of the stepping motor 108 provided in the main body portion 52. Therefore, the orientation of the holding portion 54 can be changed by rotating the stepping motor 108. As a result, the orientation of the smartphone 30 mounted in the recess 56 of the holding portion 54 can be changed.
  • the phototransistor 51 shown in FIG. 5b is provided on the wall opposite to the recess 56 shown in FIG. 5c.
  • FIG. 6 shows the hardware configuration of the control circuit 101 of the robot substrate device 50.
  • a memory 102, a non-volatile memory 104, a waveform shaping circuit 107, and a driver circuit 109 are connected to the CPU 100.
  • the signal from the phototransistor 106 is shaped by the waveform shaping circuit 107 and given to the CPU 100 as data.
  • the driver circuit 109 is a circuit for driving the stepping motor 108 based on the command of the CPU 100.
  • the operating system 110 and the robot substrate program 112 are recorded in the non-volatile memory 104.
  • the robot substrate program 112 cooperates with the operating system 110 to exert its function. It should be noted that the robot substrate program 112 may be configured to operate independently without using an operating system.
  • FIG. 7 shows the hardware configuration of the server device.
  • a memory 122, a hard disk 124, and a communication circuit 126 are connected to the CPU 120.
  • the communication circuit 126 is for connecting to the Internet.
  • the operating system 128 and the server program 130 are recorded on the hard disk 124.
  • the server program 130 exerts its function in cooperation with the operating system 128.
  • FIGS. 8 and 9 show a flowchart of robot operation.
  • the CPU 60 of the operation-side smartphone 10 (hereinafter, may be abbreviated as the operation-side smartphone 10) acquires voice data from the microphone 70 and image data from the camera 72, and transmits them to the server device 70 by the communication circuit 74. (Step S1).
  • The CPU 120 of the server device 70 receives the voice data and image data via the Internet by the communication circuit 126 and transfers them to the robot-side smartphone 30 (step S21).
  • the CPU 80 of the robot-side smartphone 30 receives voice data and image data via the Internet by the communication circuit 94 (step S42).
  • the robot-side smartphone 30 outputs the received voice data from the speaker 86, and displays the received image data on the touch display 84 (step S43).
  • the robot-side user can obtain the image and voice of the operation-side user by the smartphone 30.
  • The control light emitting area 31 of the touch display 84 of the robot-side smartphone 30 and its vicinity are hidden by the holding unit 54, so the robot-side user cannot see that part of the display. However, because the hidden area is kept small, visibility is not significantly impaired.
  • the robot-side smartphone 30 acquires voice data from the microphone 90 and image data from the camera 92 and transmits them to the server device 70 by the communication circuit 94 (step S41).
  • the server device 70 receives the voice data and the image data via the Internet by the communication circuit 126, and transfers the voice data and the image data to the operating smartphone 10 (step S21).
  • the operation-side smartphone 10 receives voice data and image data via the Internet through the communication circuit 74 (step S2).
  • the operating smartphone 10 outputs the received voice data from the speaker 66, and displays the received image data on the touch display 64 (step S3).
  • the operation side user can obtain the image and voice of the robot side user by the smartphone 10. For example, as shown in FIG. 10, the face of the robot-side user and its background can be seen.
  • the operation side smartphone 10 displays the operation button (control mark) 13 on the screen (step S3).
  • the operation button 13 includes a right rotation button 13a and a left rotation button 13b.
  • When the operation-side user taps the right rotation button 13a, the operation-side smartphone 10 receives the right-rotation control input and transmits right-rotation control data to the server device 70 (steps S5 and S7).
  • When the operation-side user taps the left rotation button 13b, the operation-side smartphone 10 receives the left-rotation control input and transmits left-rotation control data to the server device 70 (steps S5 and S7).
  • While the operation-side user is not tapping either of the buttons 13a and 13b, the operation-side smartphone 10 continuously transmits stop control data to the server device 70 (steps S6 and S7). This is to prevent accidental rotation due to the influence of noise or the like.
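  • As an illustration only, the following sketch shows one way the operation-side sending loop described above (steps S4 to S7) could be written. The `ui` and `server` objects, their method names, and the polling interval are assumptions introduced for this example; they are not part of the disclosed embodiment.

```python
import time

RIGHT, LEFT, STOP = "right", "left", "stop"

def current_button(ui):
    """Return RIGHT or LEFT while the corresponding on-screen button is tapped, else None."""
    if ui.is_pressed("rotate_right"):   # right rotation button 13a (hypothetical API)
        return RIGHT
    if ui.is_pressed("rotate_left"):    # left rotation button 13b (hypothetical API)
        return LEFT
    return None

def sending_loop(ui, server):
    """Steps S4-S7: map the button state to control data and keep sending it."""
    while True:
        pressed = current_button(ui)
        # While no button is tapped, stop control data is still sent continuously,
        # so that noise cannot be mistaken for a rotation command.
        control_data = pressed if pressed is not None else STOP
        server.send_control_data(control_data)  # relayed to the robot-side smartphone 30
        time.sleep(0.1)                         # assumed polling interval
```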
  • the robot-side smartphone 30 receives this control data (step S44). In the following, the description will be made assuming that the right rotation control data has been received.
  • the robot-side smartphone 30 blinks the control light emitting area 31 (see FIG. 3) of the touch display 84 with the control light corresponding to the right rotation control data (step S45).
  • In this embodiment, control light blinking at 10 Hz is used, and the lighting duty ratio of the control light (the ratio of the lighting time to one cycle) is changed according to the control content.
  • the duty ratio is set to 15% for right rotation control data, and the duty ratio is set to 85% for left rotation control data. Further, in the case of stop control data, the duty ratio is set to 50%.
  • the control light having a duty ratio of 15% is emitted.
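  • A minimal sketch of the light emission control described above, assuming a hypothetical `region` object that can fill the control light emitting region 31 with white or black; the 10 Hz period and the 15%/85%/50% duty ratios are taken from the embodiment, everything else is illustrative.

```python
import time

# Duty ratios from the embodiment: 15% = right rotation, 85% = left rotation, 50% = stop.
DUTY = {"right": 0.15, "left": 0.85, "stop": 0.50}
PERIOD = 0.1  # one cycle of the 10 Hz control light, in seconds

def emit_one_cycle(region, command):
    """Blink the control light emitting region for one cycle (step S45)."""
    duty = DUTY[command]
    region.set_white()                  # lit portion of the cycle
    time.sleep(PERIOD * duty)
    region.set_black()                  # dark portion of the cycle
    time.sleep(PERIOD * (1.0 - duty))
```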
  • the control light emitted from the control light emitting region 31 of the touch display 84 is received by the phototransistor 51 provided in the holding portion 54 of the robot substrate device 50 (see FIG. 5b).
  • the phototransistor 51 outputs a pulse signal (control light data) corresponding to the control light.
  • The CPU 100 of the robot substrate device 50 (hereinafter sometimes abbreviated as the robot substrate device 50) acquires this control light data via the waveform shaping circuit 107 (see FIG. 6) (step S61).
  • The robot substrate device 50 detects the on-duty ratio of the control light data and thereby determines whether the content of the control data is clockwise rotation, counterclockwise rotation, or stop.
  • the content of the control data is determined based on the table as shown in FIG.
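  • The duty-ratio decision could be implemented roughly as follows. This is a sketch under the assumption that `samples` is a list of 0/1 readings of the shaped phototransistor signal covering one control-light cycle; the tolerance bands around the nominal 15%, 50%, and 85% values are chosen here only for illustration.

```python
def decode_command(samples):
    """Step S61: measure the on-duty ratio and classify it as right, stop, or left."""
    if not samples:
        return "stop"
    duty = sum(samples) / len(samples)   # measured on-duty ratio over one cycle
    if duty < 0.30:
        return "right"                   # nominal 15%
    if duty < 0.70:
        return "stop"                    # nominal 50%
    return "left"                        # nominal 85%

def control_step(samples, motor):
    """Step S62: drive the stepping motor 108 according to the decoded command."""
    command = decode_command(samples)
    if command == "right":
        motor.step_clockwise()           # via driver circuit 109 (hypothetical API)
    elif command == "left":
        motor.step_counterclockwise()
    # "stop": no step pulses are issued
```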
  • the robot substrate device 50 determines that the right rotation command has been given, controls the driver circuit 109, and rotates the stepping motor 108 to the right (step S62). The robot substrate device 50 continues to rotate the stepping motor 108 while the right rotation command is given.
  • the holding portion 54 rotates clockwise, and the robot-side smartphone 30 mounted on the holding portion 54 also rotates as a whole. Therefore, the imaging angle of the camera 92 of the robot-side smartphone 30 also changes.
  • the operating side user can freely adjust the direction of the camera 92 of the robot side user who is the other party of the call and obtain an image of the angle desired by himself / herself.
  • the operation side smartphone 10 and the robot side smartphone 30 communicate with each other via the server device 70.
  • Direct communication may be performed by Bluetooth, Wi-Fi, or the like without going through the server device 70.
  • the communication may be performed via a device other than the server device.
  • the phototransistor 51 is used as the light emission detection unit. However, it may be detected by using a camera or the like.
  • the rotation is performed only while the right rotation button 13a and the left rotation button 13b are tapped.
  • the rotation may be performed by a predetermined angle (for example, 5 degrees).
  • the content of the control data (right rotation, stop, left rotation) is shown by changing the duty ratio of the control light.
  • the phase, frequency, amplitude, and the like of the control light may be changed or combined with these to indicate the content of the control data.
  • The holding unit 54 may be configured to be driven not only left and right but also up and down. In that case, right rotation, left rotation, stop, upward rotation, and downward rotation can be assigned duty ratios of 10%, 30%, 50%, 70%, and 90%, respectively. In this way, not only left-right but also up-down control becomes possible.
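  • A sketch of how the decision table could be extended to the five duty levels mentioned above (10% right, 30% left, 50% stop, 70% up, 90% down); the ±10% acceptance windows are an assumption for illustration.

```python
LEVELS = [(0.10, "right"), (0.30, "left"), (0.50, "stop"), (0.70, "up"), (0.90, "down")]

def decode_five_level(duty):
    """Map a measured on-duty ratio to one of the five commands."""
    for nominal, command in LEVELS:
        if abs(duty - nominal) <= 0.10:
            return command
    return "stop"   # treat an unrecognized duty ratio as stop for safety
```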
  • FIG. 13 shows a robot substrate device 50 for rotating the smartphone 30 up / down / left / right.
  • a holding portion 54 is provided on the main body portion 52.
  • the holding portion 54 includes a first base member 54a, a vertical member 54b, a second base member 54c, and a groove-shaped member 54d.
  • The first base member 54a is provided so that it can rotate in the direction of arrow X.
  • the rotation mechanism is the same as in FIG. 5c.
  • a vertical member 54b extending in the vertical direction is provided at one end of the circumference of the first base member 54a, and rotates in the same manner as the first base member 54a rotates.
  • the vertical member 54b is provided with a second base member 54c that rotates in the Y direction of the arrow.
  • the rotation mechanism is the same as in FIG. 5c.
  • A groove-shaped member 54d is provided at one end of the circumference of the second base member 54c and rotates together with the second base member 54c.
  • The groove-shaped member 54d has a U-shaped groove, and the lower end of the smartphone 30 is held in this groove. Therefore, the camera (not shown) provided in the smartphone 30 is rotated in the vertical and horizontal directions by the rotation of the first base member 54a and the second base member 54c.
  • the groove-shaped member 54d is provided with a phototransistor 51 as in FIG. 5b.
  • control light emitting region 31 is provided at only one place. However, if a plurality of locations are provided, more control contents can be realized. For example, as shown in FIG. 14, by providing two control light emitting regions 31a and 31b, two types of control signals can be transmitted at the same time.
  • the holding unit 54 is provided with a phototransistor 51 corresponding to each of the control light emitting regions 31a and 31b.
  • For example, the control light emitted from the control light emission region 31a may be used as clock light, and data light representing "1" and "0" may be emitted from the control light emission region 31b.
  • "1" or "0" is transmitted depending on the value of the data light at the rising edge of the clock light. As a result, a large amount of information can be transmitted by combinations of "1" and "0", enabling complicated control.
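  • A sketch of how the two-region scheme described above could be decoded, assuming `clock_samples` and `data_samples` are equally spaced 0/1 readings from the two phototransistors facing the control light emission regions 31a and 31b.

```python
def decode_bits(clock_samples, data_samples):
    """Sample the data light at each rising edge of the clock light."""
    bits = []
    previous_clock = 0
    for clock, data in zip(clock_samples, data_samples):
        if clock == 1 and previous_clock == 0:   # rising edge of the clock light
            bits.append(data)                     # captured data bit: 1 or 0
        previous_clock = clock
    return bits
```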
  • the smartphone 10 (30) is used as the mobile terminal device.
  • a tablet computer, PDA, or the like may be used.
  • A stationary PC or the like may be used instead of the mobile terminal device. In this case, on the robot side, at least the camera 92 may be held and driven by the holding unit 54.
  • the operation side user controls to change the direction of the robot side smartphone 30.
  • other drive control may be performed.
  • the arm structure (arm mechanism) as shown in FIG. 16A may be driven and controlled.
  • This structure can be attached to the left and right of the holding portion 54 of the robot substrate device 50.
  • the arm fixing portion 152 is fixed to the holding portion 54.
  • a first rotating portion 154 is attached to the arm fixing portion 152 so as to be rotatable in the A direction.
  • a second rotating portion 156 is attached to the first rotating portion 154 so as to be rotatable in the B direction.
  • FIG. 16B shows the internal mechanism of the arm structure. Inside the arm fixing portion 152, an arm fixing portion base member 152a is provided. The arm fixing portion 152 is configured by covering the outside of the arm fixing portion base member 152a with a housing.
  • a rotation shaft 162 is rotatably provided on the arm fixing portion base member 152a.
  • the rotary shaft 162 is configured to be rotatable in the direction of arrow A by a stepping motor (not shown).
  • the first rotating portion base member 154a of the first rotating portion 154 is fixed to the rotating shaft 162.
  • the first rotating portion 154 is configured by covering the outside of the first rotating portion base member 154a with a housing. Therefore, by controlling the stepping motor, the first rotating portion 154 can be rotated in the direction of arrow A with respect to the arm fixing portion 152.
  • the second rotating portion base member 156a is rotatably fixed to the first rotating portion base member 154a by the rotating shaft 164.
  • the rotary shaft 164 is configured to be rotatable in the direction of arrow B by another stepping motor (not shown).
  • a second rotating portion base member 156a is fixed to the rotating shaft 164.
  • The second rotating portion 156 is configured by covering the outside of the second rotating portion base member 156a with a housing. Therefore, by controlling the stepping motor, the second rotating portion 156 can be rotated in the direction of arrow B with respect to the first rotating portion 154.
  • the holding portion 54 of the robot substrate device 50 may be fixed to the main body portion 52 to control only the arm mechanism.
  • In the embodiments described above, the orientation of the robot-side smartphone 30 mounted on the robot substrate device 50 is controlled from the operation side.
  • The operation side can also instruct the robot to perform tasks. For example, in the case of a cooking robot, the operation side can control the cooking robot to cook while teaching the robot-side user how to cook using both video and voice calls.
  • According to the present invention, by attaching a smartphone to the robot, it is possible to make a video and voice call while remotely controlling the robot. Further, since the angle of the camera and microphone of the robot-side smartphone 30 can be controlled remotely, the situation on the robot side can be known accurately.
  • control for driving the mechanism of the robot substrate device 50 from the operation side has been described. However, it may be controlled to drive a mechanism of a device other than the robot substrate device 50. For example, the operation of the cleaning robot configured separately from the robot substrate device 50 may be controlled.
  • FIG. 17 shows the functional configuration.
  • the operation reception means 16 accepts the input.
  • the control data transmission means 22 transmits the control data corresponding to the received input by the communication unit 24 to the operated side display device 30.
  • the control data receiving means 42 of the operated side display device 30 receives this by the communication unit 44.
  • the light emission control means 36 emits control light for the control data from a part of the display 32.
  • the light emission detecting means 52 of the main body device 50 on the operated side receives the control light.
  • the control means 54 on the operated main body side performs predetermined control based on the control light.
  • the lower part of the screen 184 of the display 180 (video conference display) on the robot side is covered with the support member 186.
  • the screen 184 is provided with control light emitting regions 31a, 31b, 31c, 31d, 31e.
  • A phototransistor is provided inside the support member 186 so as to face each of these regions. The state of the room is transmitted by the camera 182 to the tablet or the like of the other party (the operation side).
  • the user on the operating side can operate the tablet to send control data, and the phototransistor can acquire this and change the brightness of the lighting in this room, for example.
  • the magnification and focus of the camera 182 can be controlled.
  • In this specification, the robot is a concept that includes not only devices that perform driving but also devices that perform control.
  • blinking light is used as control light.
  • the color of light may be used to convey information.
  • a phototransistor capable of detecting color, a camera, or the like is used.
  • the rotation command is given by tapping the buttons 13a and 13b as shown in FIG.
  • the position of the controlled light emitting region 31 on the touch display 84 of the smartphone 30 is predetermined.
  • Alternatively, the smartphone 30 itself may detect the change in capacitance at the positions covered when it is mounted in the recess 56 (preferably a change at a level that is not judged to be a touch input) and set that portion as the control light emitting region 31. In this way, the control light can be transmitted reliably even if the mounting position is displaced.
  • control light emitting region 31 is provided in the lower part of the smartphone 30, and the smartphone 30 is supported by this portion.
  • the control light emitting region 31 may be provided in the upper portion, the side portion, or the like, and the smartphone 30 may be supported by this portion.
  • the clip 33 may cover an arbitrary position of the touch display 84 of the robot-side smartphone 30.
  • the clip 33 is provided with a photodiode, the signal of which is transmitted to the robot by a control line (not shown) or by radio.
  • the control light emitting region 31 is set by the smartphone 30 itself as described above.
  • the robot substrate device 50 and the robot-side portable device 30 are configured as separate devices. However, these may be integrated into a robot device. In this case, the control data may not be converted into the control light, and the received control data may be used as it is.
  • FIG. 20 shows the functional configuration.
  • the operation reception means 16 accepts the input.
  • the control data transmission means 22 transmits the control data corresponding to the received input by the communication unit 24 to the videophone device 30 on the operated side.
  • the control data receiving means 42 of the videophone device 30 on the operated side receives this by the communication unit 44.
  • the operated side control means 36 changes the direction of the camera or the microphone 34 based on the control data. Therefore, the operating side user can adjust the image and sound transmitted from the operated side videophone device 30.
  • the videophone system as described above can also be realized by the configuration shown in FIG.
  • a stepping motor is used.
  • a normal motor such as a servo motor may be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manipulator (AREA)
  • Toys (AREA)
  • Telephonic Communication Services (AREA)
  • Selective Calling Equipment (AREA)

Abstract

[Problem] Provided is a robot system capable of performing remote control even with a simple arrangement. [Means] When control data is input, by an operation side user, from an input unit 16 of an operation side portable device 10, a control data sending means 22 sends the control data to a robot side portable device 30. A control data receiving means 42 of the robot side portable device 30 receives the control data and provides the same to a light emission control means 36. The light emission control means 36 causes a control light emission region provided in a part of a display 32 to flash on the basis of the control data to emit control light. A light emission detection means 52 of a robot base body device 50 holding the robot side portable device 30 receives the control light to obtain control light data. A robot side control means 54 controls a driving part 56 on the basis of the control light data. The driving part 56 moves a holding part holding the robot side portable device 30. The orientation of a camera 34 of the robot side portable device 30 is changed thereby, and the orientation of an image displayed on the operation side portable device 10 is also changed.

Description

Robot system

The present invention relates to a robot system that can be controlled remotely.

Systems for controlling a robot from a remote location have been proposed. For example, Japanese Patent Application Laid-Open No. 2017-50018 discloses a self-propelled robot to which a tablet can be attached. The user can remove the tablet from the self-propelled robot and, at hand, control the camera and the movement of each part provided on the self-propelled robot. In this case, the control signal from the tablet is given to the robot via a server device on the Internet.

This allows the user to control the robot at hand, which makes operation easy.

However, in conventional techniques such as JP-A-2017-50018, a communication device must be provided on the robot side in order to give control signals to the robot from the tablet, which is a mobile terminal device, via the Internet.

Further, in JP-A-2017-50018, the tablet used for operation is normally attached to the robot, so it was difficult to perform full-scale remote operation from a tablet owned by another user.

It is an object of the present invention to solve the above problems and to provide a robot system that can be operated remotely while having a simple configuration.

Some independently applicable features of this invention are listed below.

(1)-(7) The robot system according to the present invention is a robot system comprising an operation-side portable device, a robot-side portable device, and a robot substrate device.
The operation-side portable device includes: captured image display means for displaying, on a touch display, the robot-side captured image transmitted from the robot-side portable device; captured image transmitting means for transmitting, by a transmission unit, the operation-side image captured by a camera to the robot-side portable device, either via a server device or directly; operation receiving means for receiving a user operation input on the touch display; and control data transmission means for transmitting, by the transmission unit, the received operation as control data to the robot-side portable device, either via the server device or directly.
The robot-side portable device includes: captured image display means for displaying, on a touch display, the operation-side captured image transmitted from the operation-side portable device; captured image transmitting means for transmitting, by a transmission unit, the robot-side image captured by a camera to the operation-side portable device, either via the server device or directly; control data receiving means for receiving the control data transmitted from the operation-side portable device; and light emission control means that uses at least a part of the touch display as a control light emission region, controls the light emission of that region based on the control data, and sends out control light.
The robot substrate device includes: a main body; a holding unit that can be moved relative to the main body by a drive unit and that has a holding wall for physically holding the touch display of the robot-side portable device while facing that touch display; light emission detecting means that uses a light emission detection unit to detect the control light from the control light emission region of the robot-side portable device held on the robot substrate device by the holding unit, and obtains control light data; and robot-side control means that, based on the control light data from the light emission detecting means, controls the drive unit to at least move the holding unit that holds the robot-side portable device.

Therefore, despite its simple structure, the orientation of the camera of the robot-side portable device can be controlled from a remote location.

(8) The system according to the present invention is characterized in that the drive unit includes a motor that rotates the holding unit holding the robot-side portable device.

Therefore, rotational driving by the motor can be performed by remote control.

(9) The system according to the present invention is characterized in that the drive unit drives at least an arm member provided on the robot substrate device.

Therefore, the arm member can be driven by remote control.

(10) The system according to the present invention is characterized in that the light emission control means of the robot-side portable device outputs, from the control light emission region, control light whose duty ratio differs according to the received control data.

Therefore, control can be performed with a simple configuration that is not easily affected by noise.

(11) The system according to the present invention is characterized in that the light emission control means changes the duty ratio of the control light at least depending on whether the control data is an operation for driving the drive unit or an operation for stopping it.

Therefore, since control data is also sent for stopping, erroneous judgments can be reduced.

(12) The system according to the present invention is characterized in that the drive unit includes a motor that rotates the holding unit holding the robot-side portable device, and the light emission control means changes the duty ratio of the control light at least depending on whether the control data is an operation to rotate the motor clockwise, to rotate it counterclockwise, or to stop it.

Therefore, clockwise rotation, counterclockwise rotation, and stopping can be controlled reliably.

(13) The system according to the present invention is characterized in that the operation-side portable device or the robot-side portable device is a smartphone or a tablet computer.

Therefore, a robot system can be constructed using the camera and display of a smartphone or tablet computer.

(14)-(18) The control system according to the present invention is a control system comprising an operation-side device, an operated-side display device, and an operated-side main body device.
The operation-side device includes: operation receiving means for receiving a user operation input on an operation input unit; and control data transmission means for transmitting, by a transmission unit, the received operation as control data to the operated-side display device, either via a server device or directly.
The operated-side display device includes: control data receiving means for receiving the control data transmitted from the operation-side device; and light emission control means that uses at least a part of a display as a control light emitting region and outputs control light from that region based on the control data.
The operated-side main body device includes: a holding unit having a holding wall for physically holding the display of the operated-side display device while facing that display; a light emission detection unit that detects the control light from the control light emitting region of the operated-side display device held on the operated-side main body device by the holding unit, and obtains control light data; and operated-side main body control means that performs control processing based on the control light data from the light emission detection unit.

Therefore, despite its simple structure, the operated device can be controlled from a remote location.

(19)-(23) The videophone system according to the present invention is a videophone system having an operation-side videophone device and an operated-side videophone device.
The operation-side videophone device includes: operation receiving means for receiving a user operation input on an operation input unit; and control data transmission means for transmitting, by a transmission unit, the received operation as control data to the operated-side videophone device.
The operated-side videophone device includes: control data receiving means for receiving the control data transmitted from the operation-side videophone device; and operated-side control means for changing the orientation of the camera or microphone of the operated-side videophone device based on the control data.

Therefore, during a videophone call, the orientation of the other party's camera or microphone can be controlled, and an image that is easy to see and a voice that is easy to hear can be obtained.

In the present invention, the "captured image display means" of the operation-side portable device corresponds to step S3 in the embodiment.

The "captured image transmitting means" of the operation-side portable device corresponds to step S1 in the embodiment.

The "operation receiving means" corresponds to steps S4 and S5 in the embodiment.

The "control data transmission means" corresponds to step S7 in the embodiment.

The "captured image display means" of the robot-side portable device corresponds to step S43 in the embodiment.

The "captured image transmitting means" of the robot-side portable device corresponds to step S41 in the embodiment.

The "control data receiving means" corresponds to step S44 in the embodiment.

The "light emission control means" corresponds to step S45 in the embodiment.

The "light emission detecting means" corresponds to step S61 in the embodiment.

The "robot-side control means" corresponds to step S62 in the embodiment.

The "program" is a concept that includes not only a program that can be directly executed by the CPU but also a source-format program, a compressed program, an encrypted program, and the like.

Brief description of the drawings:
System configuration of a robot system according to an embodiment of the present invention.
Functional configuration diagram of the robot system.
External view of the smartphone 10 (30).
Hardware configuration of the smartphone 10 (30).
External view of the robot substrate device 50.
Cross-sectional view of the holding unit of the robot substrate device 50.
Cross-sectional view of the robot substrate device 50.
Hardware configuration of the robot substrate device 50.
Hardware configuration of the server device 70.
Flowchart of the operation process.
Flowchart of the operation process.
Example of an image displayed on the operation-side smartphone 10.
Example of an image displayed on the operation-side smartphone 10.
Table for determining the control content.
Diagram showing a mechanism for vertical driving.
Diagram showing a case where a plurality of control light emitting regions are provided.
Example in which clock light and data light are provided.
Diagram showing the structure of the arm.
Functional configuration diagram of a control system according to another embodiment.
Configuration example according to another embodiment.
Diagram showing an example in which control light is received by the phototransistor of the clip 33.
Functional configuration diagram of a videophone system according to another embodiment.

1. System Configuration

FIG. 1 shows the overall configuration of a robot system according to an embodiment of the present invention. The robot-side portable device 30 is mounted on the robot substrate device 50. The operation-side portable device 10 is provided at a remote location so that it can communicate with the robot substrate device 50 via the server device 70.

FIG. 2 shows the functional configuration of the robot system. The camera 14 of the operation-side portable device 10 captures an image of the operation-side user holding the operation-side portable device 10. The captured image transmitting means 20 transmits the captured image to the robot-side portable device 30 via the communication unit 24.

The captured image display means 38 of the robot-side portable device 30 displays the received captured image on the display 32. As a result, the robot-side user can see the state of the operation-side user.

The camera 34 of the robot-side portable device 30 captures an image of the robot-side user near the robot-side portable device 30. The captured image transmitting means 40 transmits the captured image to the operation-side portable device 10 via the communication unit 44.

The captured image display means 18 of the operation-side portable device 10 displays the received captured image on the display 12. As a result, the operation-side user can see the state of the robot-side user.

When the operation-side user inputs control data from the input unit 16 of the operation-side portable device 10, the control data transmission means 22 transmits this control data to the robot-side portable device 30 via the communication unit 24.

The control data receiving means 42 of the robot-side portable device 30 receives this control data and gives it to the light emission control means 36. Based on this control data, the light emission control means 36 blinks a control light emission region provided in a part of the display 32 to emit control light.

The light emission detecting means 32 of the robot substrate device 50 holding the robot-side portable device 30 receives this control light and obtains control light data. This control light data is given to the robot-side control means 55. The robot-side control means 55 controls the drive unit 57 based on the control light data. The drive unit 57 moves the holding unit that holds the robot-side portable device 30. As a result, the orientation of the camera 34 of the robot-side portable device 30 changes, and the orientation of the image displayed on the operation-side portable device 10 also changes.

As described above, the orientation of the camera 34 of the robot-side portable device 30 can be changed, and the imaging direction controlled, based on the operation command (control data) of the operation-side user. Therefore, the operation-side user can operate as if his or her alter ego were at the side of the robot-side portable device 30.

2. Configuration of Each Device

FIG. 3 shows the appearance of the smartphone 10, which is the operation-side terminal device. Since the appearance of the smartphone 30, which is the robot-side terminal device, is the same, its reference numerals are shown in parentheses.

A touch screen 64 (84) is provided on the front surface of the smartphone 10 (30). A speaker 66 (86) and a camera 72 (92) are provided on the right side, and a microphone 70 (90) is provided on the left side.

The touch screen 84 of the smartphone 30, which is the robot-side terminal device, has a control light emitting region 31 in its lower central portion. This control light emitting region 31 is used as a region for emitting control light.

FIG. 4 shows the hardware configuration of the smartphone 10 (30). A memory 62 (82), a touch display 64 (84), a speaker 66 (86), a non-volatile memory 68 (88), a microphone 70 (90), a camera 72 (92), and a communication circuit 74 (94) are connected to the CPU 60 (80). The call circuit for telephony is omitted.

The communication circuit 74 (94) is a circuit for connecting to the Internet. The operating system 76 (96) and the operation-side terminal program 78 (robot-side terminal program 98) are recorded in the non-volatile memory 68 (88).

 操作側端末プログラム78(ロボット側端末プログラム98)は、オペレーティングシステム76(96)と協働してその機能を発揮するものである。 The operation side terminal program 78 (robot side terminal program 98) exerts its function in cooperation with the operating system 76 (96).

 図5aに、ロボット基体装置50の外観を示す。本体部52には、4本の脚58が設けられ、机などの上に安定して載置できるようになっている。本体部52の上部には、保持部54が設けられている。保持部54の上面には、図1に示すようにスマートフォン30を保持するための凹部56が設けられている。 FIG. 5a shows the appearance of the robot substrate device 50. The main body 52 is provided with four legs 58 so that it can be stably placed on a desk or the like. A holding portion 54 is provided on the upper portion of the main body portion 52. As shown in FIG. 1, a recess 56 for holding the smartphone 30 is provided on the upper surface of the holding portion 54.

 図5bに、凹部56に装着されたスマートフォン30の中央付近の、ロボット基体装置50の断面を示す。図に示すように、スマートフォン30のタッチディスプレイ84が上に向くように、傾けて保持できるようになっている。保持部54には、装着されたスマートフォン30のタッチディスプレイ84の制御発光領域31に対向するように、光検出部であるフォトトランジスタ51が設けられている。 FIG. 5b shows a cross section of the robot substrate device 50 near the center of the smartphone 30 mounted in the recess 56. As shown in the figure, the touch display 84 of the smartphone 30 can be tilted and held so as to face upward. The holding unit 54 is provided with a phototransistor 51, which is a photodetector, so as to face the controlled light emitting region 31 of the touch display 84 of the attached smartphone 30.

 図5cに、ロボット基体装置50の断面図を示す。本体部52には、ステッピングモータ108と制御回路101が設けられている。保持部54は、本体部52に設けられたステッピングモータ108の軸に固定されている。したがって、ステッピングモータ108を回転させることで、保持部54の向きを変えることができる。これにより、保持部54の凹部56に装着されたスマートフォン30の向きを変えることができる。 FIG. 5c shows a cross-sectional view of the robot substrate device 50. The main body 52 is provided with a stepping motor 108 and a control circuit 101. The holding portion 54 is fixed to the shaft of the stepping motor 108 provided in the main body portion 52. Therefore, the orientation of the holding portion 54 can be changed by rotating the stepping motor 108. As a result, the orientation of the smartphone 30 mounted in the recess 56 of the holding portion 54 can be changed.

 なお、図5cに示された凹部56の反対側の壁には、図5bに示すフォトトランジスタ51が設けられている。 The phototransistor 51 shown in FIG. 5b is provided on the wall opposite to the recess 56 shown in FIG. 5c.

 図6に、ロボット基体装置50の制御回路101のハードウエア構成を示す。CPU100には、メモリ102、不揮発性メモリ104、波形整形回路107、ドライバ回路109が接続されている。 FIG. 6 shows the hardware configuration of the control circuit 101 of the robot substrate device 50. A memory 102, a non-volatile memory 104, a waveform shaping circuit 107, and a driver circuit 109 are connected to the CPU 100.

 フォトトランジスタ106からの信号は、波形整形回路107にて整形されてデータとしてCPU100に与えられる。ドライバ回路109は、CPU100の指令に基づいて、ステッピングモータ108を駆動させる回路である。 The signal from the phototransistor 106 is shaped by the waveform shaping circuit 107 and given to the CPU 100 as data. The driver circuit 109 is a circuit for driving the stepping motor 108 based on the command of the CPU 100.

 不揮発性メモリ104には、オペレーティングシステム110、ロボット基体プログラム112が記録されている。ロボット基体プログラム112は、オペレーティングシステム110と協働してその機能を発揮するものである。なお、オペレーティングシステムを用いずに、ロボット基体プログラム112単独で動作するように構成してもよい。 The operating system 110 and the robot substrate program 112 are recorded in the non-volatile memory 104. The robot substrate program 112 cooperates with the operating system 110 to exert its function. It should be noted that the robot substrate program 112 may be configured to operate independently without using an operating system.

 図7に、サーバ装置のハードウエア構成を示す。CPU120には、メモリ122、ハードディスク124、通信回路126が接続されている。通信回路126は、インターネットに接続するためのものである。 FIG. 7 shows the hardware configuration of the server device. A memory 122, a hard disk 124, and a communication circuit 126 are connected to the CPU 120. The communication circuit 126 is for connecting to the Internet.

 ハードディスク124には、オペレーティングシステム128、サーバプログラム130が記録されている。サーバプログラム130は、オペレーティングシステム128と協働してその機能を発揮するものである。
 
The operating system 128 and the server program 130 are recorded on the hard disk 124. The server program 130 exerts its function in cooperation with the operating system 128.

3.ロボット操作
 図8、9に、ロボット操作のフローチャートを示す。操作側スマートフォン10のCPU60(以下、操作側スマートフォン10と省略することがある)は、マイク70からの音声データ、カメラ72からの画像データを取得し、通信回路74によって、サーバ装置70に送信する(ステップS1)。
3. Robot Operation
FIGS. 8 and 9 show flowcharts of the robot operation. The CPU 60 of the operation-side smartphone 10 (hereinafter sometimes abbreviated as the operation-side smartphone 10) acquires voice data from the microphone 70 and image data from the camera 72, and transmits them to the server device 70 through the communication circuit 74 (step S1).

 サーバ装置70のCPU120(以下、サーバ装置70と省略することがある)は、通信回路126によって、インターネットを介してこの音声データ、画像データを受信し、ロボット側スマートフォン30に転送する(ステップS21)。 The CPU 120 of the server device 70 (hereinafter sometimes abbreviated as the server device 70) receives the voice data and the image data via the Internet through the communication circuit 126, and transfers them to the robot-side smartphone 30 (step S21).

 ロボット側スマートフォン30のCPU80(以下、ロボット側スマートフォン30と省略することがある)は、通信回路94によって、インターネットを介して、音声データ、画像データを受信する(ステップS42)。ロボット側スマートフォン30は、受信した音声データをスピーカ86から出力し、受信した画像データをタッチディスプレイ84に表示する(ステップS43)。 The CPU 80 of the robot-side smartphone 30 (hereinafter, may be abbreviated as the robot-side smartphone 30) receives voice data and image data via the Internet by the communication circuit 94 (step S42). The robot-side smartphone 30 outputs the received voice data from the speaker 86, and displays the received image data on the touch display 84 (step S43).

 したがって、ロボット側ユーザは、スマートフォン30によって、操作側ユーザの画像や音声を得ることができる。なお、ロボット側スマートフォン30のタッチディスプレイ84の制御発光領域31およびその近傍は、保持部54によって隠された状態となり、その部分の表示をロボット側ユーザは見ることはできない。しかし、隠された領域を小さくすることで、視認性を損なわないようにしている。 Therefore, the robot-side user can obtain the image and voice of the operation-side user by the smartphone 30. The controlled light emitting area 31 of the touch display 84 of the robot side smartphone 30 and its vicinity are hidden by the holding unit 54, and the robot side user cannot see the display of that part. However, by making the hidden area smaller, visibility is not impaired.

 ロボット側スマートフォン30は、マイク90からの音声データ、カメラ92からの画像データを取得し、通信回路94によって、サーバ装置70に送信する(ステップS41)。 The robot-side smartphone 30 acquires voice data from the microphone 90 and image data from the camera 92 and transmits them to the server device 70 by the communication circuit 94 (step S41).

 サーバ装置70は、通信回路126によって、インターネットを介してこの音声データ、画像データを受信し、操作側スマートフォン10に転送する(ステップS21)。 The server device 70 receives the voice data and the image data via the Internet by the communication circuit 126, and transfers the voice data and the image data to the operating smartphone 10 (step S21).
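
The relay performed in step S21 can be pictured with a short Python sketch. This is only an illustration of the forwarding behaviour, not the actual server program 130: the Peer class, the packet dictionaries and the queue-based transport are assumptions standing in for the real network connections handled by the communication circuits 74, 94 and 126.

import asyncio

class Peer:
    """Hypothetical peer endpoint modelled with asyncio queues."""
    def __init__(self, name):
        self.name = name
        self.outgoing = asyncio.Queue()   # data this peer sends
        self.incoming = asyncio.Queue()   # data delivered to this peer

async def relay(src: Peer, dst: Peer):
    # Forward audio/image/control packets from src to dst (step S21).
    while True:
        packet = await src.outgoing.get()
        await dst.incoming.put(packet)

async def main():
    operator = Peer("operation-side smartphone 10")
    robot = Peer("robot-side smartphone 30")

    # The two forwarding directions run independently and in parallel.
    asyncio.create_task(relay(operator, robot))
    asyncio.create_task(relay(robot, operator))

    # Demo: the operator sends one video frame and one control command.
    await operator.outgoing.put({"type": "image", "payload": b"..."})
    await operator.outgoing.put({"type": "control", "payload": "rotate_right"})
    for _ in range(2):
        print(robot.name, "received", await robot.incoming.get())

asyncio.run(main())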

 操作側スマートフォン10は、通信回路74によって、インターネットを介して、音声データ、画像データを受信する(ステップS2)。操作側スマートフォン10は、受信した音声データをスピーカ66から出力し、受信した画像データをタッチディスプレイ64に表示する(ステップS3)。 The operation-side smartphone 10 receives voice data and image data via the Internet through the communication circuit 74 (step S2). The operating smartphone 10 outputs the received voice data from the speaker 66, and displays the received image data on the touch display 64 (step S3).

 したがって、操作側ユーザは、スマートフォン10によって、ロボット側ユーザの画像や音声を得ることができる。たとえば、図10に示すように、ロボット側ユーザの顔やその背景を見ることができる。 Therefore, the operation side user can obtain the image and voice of the robot side user by the smartphone 10. For example, as shown in FIG. 10, the face of the robot-side user and its background can be seen.

 また、操作側スマートフォン10は、画面上に操作ボタン(制御マーク)13を表示する(ステップS3)。操作ボタン13には、右回転ボタン13aと左回転ボタン13bが含まれる。 Further, the operation side smartphone 10 displays the operation button (control mark) 13 on the screen (step S3). The operation button 13 includes a right rotation button 13a and a left rotation button 13b.

 操作側ユーザが右回転ボタン13aをタップすると、そのタップの期間だけロボット側スマートフォン30の向きが右側に回転される。同様に、左回転ボタン13bをタップすると、そのタップの期間だけロボット側スマートフォン30の向きが左側に回転される。 When the user on the operation side taps the right rotation button 13a, the direction of the smartphone 30 on the robot side is rotated to the right for the period of the tap. Similarly, when the left rotation button 13b is tapped, the orientation of the robot-side smartphone 30 is rotated to the left for the tap period.

 たとえば、図10の状態において、操作側ユーザが右回転ボタン13aをタップしたとする。操作側スマートフォン10は、この右回転の制御入力を受けて、右回転制御データをサーバ装置70に送信する(ステップS5、S7)。 For example, suppose that the user on the operating side taps the right rotation button 13a in the state shown in FIG. The operation-side smartphone 10 receives the right-handed rotation control input and transmits the right-handed rotation control data to the server device 70 (steps S5 and S7).

 同様に、図10の状態において、操作側ユーザが左回転ボタン13bをタップしたとする。操作側スマートフォン10は、この左回転の制御入力を受けて、左回転制御データをサーバ装置70に送信する(ステップS5、S7)。 Similarly, in the state of FIG. 10, it is assumed that the operating side user taps the left rotation button 13b. The operation side smartphone 10 receives the left rotation control input and transmits the left rotation control data to the server device 70 (steps S5 and S7).

 また、操作側ユーザが、いずれのボタン13a、13bもタップしていない時には、操作側スマートフォン10は、常時、停止制御データをサーバ装置70に送信する(ステップS6、S7)。これは、ノイズなどの影響で、誤って、回転が行われないようにするためである。 Further, when the operating side user has not tapped any of the buttons 13a and 13b, the operating side smartphone 10 always transmits the stop control data to the server device 70 (steps S6 and S7). This is to prevent accidental rotation due to the influence of noise or the like.
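
As a rough illustration of steps S5 to S7, the loop below sends a rotation command while a button is held and otherwise keeps sending stop data, so that a lost or noise-corrupted packet cannot leave the robot rotating. The get_pressed_button() and send_control_data() helpers are hypothetical placeholders for the touch-display input handling and for transmission via the server device 70; the 0.1-second period is likewise an assumption.

import time

def get_pressed_button():
    """Return 'right', 'left', or None depending on which button is held."""
    return None  # placeholder for real touch input on buttons 13a / 13b

def send_control_data(command: str):
    print("sending control data:", command)  # placeholder for the network send

def control_loop(period_s: float = 0.1):
    while True:
        button = get_pressed_button()
        if button == "right":
            send_control_data("rotate_right")   # step S5
        elif button == "left":
            send_control_data("rotate_left")    # step S5
        else:
            send_control_data("stop")           # step S6: default to stop
        time.sleep(period_s)                    # step S7 repeats continuously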

 ロボット側スマートフォン30は、この制御データを受信する(ステップS44)。以下では右回転制御データを受けたものとして説明を行う。ロボット側スマートフォン30は、右回転制御データに対応する制御光にて、タッチディスプレイ84の制御発光領域31(図3参照)を点滅させる(ステップS45)。 The robot-side smartphone 30 receives this control data (step S44). In the following, the description will be made assuming that the right rotation control data has been received. The robot-side smartphone 30 blinks the control light emitting area 31 (see FIG. 3) of the touch display 84 with the control light corresponding to the right rotation control data (step S45).

 この実施形態では、10Hzにて点滅する制御光を用いるようにし、制御内容に応じて、制御光の点灯デューティ比(1周期の時間に対する点灯時間の比率)を変えるようにしている。たとえば、この実施形態では、右回転制御データであればデューティ比を15%とし、左回転制御データであればデューティ比を85%としている。また、停止制御データであればデューティ比を50%としている。ここでは、上述のように右回転制御データであるから、15%のデューティ比の制御光を出す。 In this embodiment, the control light that blinks at 10 Hz is used, and the lighting duty ratio (ratio of the lighting time to the time of one cycle) of the control light is changed according to the control content. For example, in this embodiment, the duty ratio is set to 15% for right rotation control data, and the duty ratio is set to 85% for left rotation control data. Further, in the case of stop control data, the duty ratio is set to 50%. Here, since it is the right rotation control data as described above, the control light having a duty ratio of 15% is emitted.
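
The encoding just described can be sketched in a few lines of Python. The duty-ratio table follows the 15% / 50% / 85% values given above; set_region_brightness() is a hypothetical placeholder for actually painting the control light emitting region 31 on the touch display 84, and the number of cycles per command is an arbitrary choice for illustration.

import time

DUTY_BY_COMMAND = {"rotate_right": 0.15, "stop": 0.50, "rotate_left": 0.85}
BLINK_HZ = 10.0  # blink frequency of the control light

def set_region_brightness(on: bool):
    pass  # placeholder: paint control region 31 bright (on) or dark (off)

def emit_control_light(command: str, cycles: int = 10):
    period = 1.0 / BLINK_HZ
    duty = DUTY_BY_COMMAND[command]
    for _ in range(cycles):
        set_region_brightness(True)
        time.sleep(period * duty)          # on-time = duty ratio x period
        set_region_brightness(False)
        time.sleep(period * (1.0 - duty))  # off-time = remainder of the cycle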

 タッチディスプレイ84の制御発光領域31から放出された制御光は、ロボット基体装置50の保持部54に設けられたフォトトランジスタ51によって受光される(図5b参照)。フォトトランジスタ51は、制御光に応じたパルス信号(制御光データ)を出力する。ロボット基体装置50のCPU100(以下、ロボット基体装置50と省略することがある)は、波形整形回路107(図6参照)を介して、この制御光データを取得する(ステップS61)。 The control light emitted from the control light emitting region 31 of the touch display 84 is received by the phototransistor 51 provided in the holding portion 54 of the robot substrate device 50 (see FIG. 5b). The phototransistor 51 outputs a pulse signal (control light data) corresponding to the control light. The CPU 100 of the robot substrate device 50 (hereinafter, may be abbreviated as the robot substrate device 50) acquires this control optical data via the waveform shaping circuit 107 (see FIG. 6) (step S61).

 ロボット基体装置50は、制御光データのオン・デューティー比を検出し、これによって制御データの内容が、右回転、左回転、停止のいずれであるかを判断する。この実施形態では、図12に示すようなテーブルに基づいて、制御データの内容を判断している。 The robot substrate device 50 detects the on-duty ratio of the control optical data, and thereby determines whether the content of the control data is clockwise rotation, counterclockwise rotation, or stop. In this embodiment, the content of the control data is determined based on the table as shown in FIG.

 ここでは、15%のデューティ比の制御光に基づく制御光データであるから、この制御光データもデューティ比は15%となる。したがって、ロボット基体装置50は、右回転指令が与えられたと判断し、ドライバ回路109を制御して、ステッピングモータ108を右回転させる(ステップS62)。なお、ロボット基体装置50は、右回転指令が与えられている間は継続して、ステッピングモータ108を回転し続ける。 Here, since the control light data is based on the control light having a duty ratio of 15%, the duty ratio of this control light data is also 15%. Therefore, the robot substrate device 50 determines that the right rotation command has been given, controls the driver circuit 109, and rotates the stepping motor 108 to the right (step S62). The robot substrate device 50 continues to rotate the stepping motor 108 while the right rotation command is given.
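
Steps S61 and S62 amount to measuring the on-duty ratio of the shaped pulse train and looking the value up in a table such as FIG. 12. The exact thresholds of that table are not given, so the bands in the sketch below are assumptions centred on the 15% / 50% / 85% ratios, and step_motor() is a placeholder for the driver circuit 109 pulsing the stepping motor 108.

def duty_ratio(on_time_s: float, period_s: float) -> float:
    return on_time_s / period_s

def decode_command(duty: float) -> str:
    if duty < 0.30:
        return "rotate_right"   # around 15%
    if duty < 0.70:
        return "stop"           # around 50%
    return "rotate_left"        # around 85%

def step_motor(command: str):
    pass  # placeholder for driving stepping motor 108 one step in that direction

def control_cycle(measured_on_s: float, measured_period_s: float):
    command = decode_command(duty_ratio(measured_on_s, measured_period_s))
    if command != "stop":
        step_motor(command)     # keep stepping while the command persists

# Example: a 15 ms on-time in a 100 ms (10 Hz) cycle decodes as rotate_right.
print(decode_command(duty_ratio(0.015, 0.100)))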

 これにより、保持部54が右回転し、保持部54に装着されているロボット側スマートフォン30も全体的に回転する。したがって、ロボット側スマートフォン30のカメラ92の撮像アングルも変化する。 As a result, the holding portion 54 rotates clockwise, and the robot-side smartphone 30 mounted on the holding portion 54 also rotates as a whole. Therefore, the imaging angle of the camera 92 of the robot-side smartphone 30 also changes.

 図8、図9のフローチャートでは、音声・画像の送信と、制御データの送信が順次行われるように示されているが、実際には、双方の処理は並列して行われる。したがって、図10の状態から、操作側ユーザが右回転ボタン13aをタップすると、当該タップの時間に応じて、たとえば図11のような、右回転したロボット側スマートフォン30のカメラ92が撮像した画像を得ることができる。操作側ユーザは、表示された画像を見ながら、所望の位置まで回転をさせることができる。 In the flowcharts of FIGS. 8 and 9, the transmission of audio and images and the transmission of control data are shown as sequential, but in practice both processes are performed in parallel. Therefore, when the operation-side user taps the right rotation button 13a from the state of FIG. 10, an image captured by the camera 92 of the robot-side smartphone 30 rotated to the right, for example as shown in FIG. 11, can be obtained according to the duration of the tap. The operation-side user can rotate it to the desired position while viewing the displayed image.

 これにより、操作側ユーザは、通話相手であるロボット側ユーザのカメラ92の向きを自由に調整して、自らが希望するアングルの画像を得ることができる。
 
As a result, the operating side user can freely adjust the direction of the camera 92 of the robot side user who is the other party of the call and obtain an image of the angle desired by himself / herself.

4.その他
(1)上記実施形態では、サーバ装置70を介して、操作側スマートフォン10とロボット側スマートフォン30がやりとりをしている。しかし、サーバ装置70を介さずに、ブルーツース、Wifiなどによって直接やりとりを行うようにしてもよい。あるいは、サーバ装置以外の装置を介してやり取りを行うようにしてもよい。
4. Others
(1) In the above embodiment, the operation side smartphone 10 and the robot side smartphone 30 communicate with each other via the server device 70. However, direct communication may be performed by Bluetooth, Wifi, or the like without going through the server device 70. Alternatively, the communication may be performed via a device other than the server device.

(2)上記実施形態では、発光検出部としてフォトトランジスタ51を用いている。しかし、カメラなどを用いて検出するようにしてもよい。 (2) In the above embodiment, the phototransistor 51 is used as the light emission detection unit. However, it may be detected by using a camera or the like.

(3)上記実施形態では、右回転ボタン13a、左回転ボタン13bをタップしている間だけ、回転を行うようにしている。しかし、ボタンをタップするごとに、所定角度(たとえば5度)だけ回転を行うようにしてもよい。 (3) In the above embodiment, the rotation is performed only while the right rotation button 13a and the left rotation button 13b are tapped. However, each time the button is tapped, the rotation may be performed by a predetermined angle (for example, 5 degrees).

(4)上記実施形態では、制御光のデューティ比を変えることで、制御データの内容(右回転、停止、左回転)を示すようにしている。しかし、制御光の位相、周波数、振幅などを変えたり、これらと組み合わせたりして、制御データの内容を示すようにしてもよい。 (4) In the above embodiment, the content of the control data (right rotation, stop, left rotation) is shown by changing the duty ratio of the control light. However, the phase, frequency, amplitude, and the like of the control light may be changed or combined with these to indicate the content of the control data.

 また、上記では、右回転、左回転、停止だけであるが、デューティ比を細かく(10%ごとなどに)設定することで、さらに多くの制御内容を示すことができる。 Also, in the above, only clockwise rotation, counterclockwise rotation, and stop are performed, but by setting the duty ratio finely (every 10%, etc.), more control contents can be shown.

 たとえば、ロボット基体装置50において、保持部54を左右だけでなく、上下にも駆動するように構成する。その上で、右回転、左回転、停止、上回転、下回転を、それぞれ、10%、30%、50%、70%、90%のデューティ比に設定する。このようにすれば、左右だけでなく上下も制御することができる。 For example, in the robot substrate device 50, the holding portion 54 is configured to be driven not only left and right but also up and down. Then, right rotation, left rotation, stop, up rotation, and down rotation are assigned duty ratios of 10%, 30%, 50%, 70%, and 90%, respectively. In this way, not only the left-right direction but also the up-down direction can be controlled.
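
A possible decoding table for this five-command variant is sketched below; the band edges are assumptions placed halfway between the nominal 10% / 30% / 50% / 70% / 90% duty ratios.

BANDS = [
    (0.20, "rotate_right"),  # around 10%
    (0.40, "rotate_left"),   # around 30%
    (0.60, "stop"),          # around 50%
    (0.80, "rotate_up"),     # around 70%
    (1.01, "rotate_down"),   # around 90%
]

def decode_extended(duty: float) -> str:
    # Return the first command whose upper band edge exceeds the measured duty.
    for upper, command in BANDS:
        if duty < upper:
            return command
    return "stop"

print(decode_extended(0.72))  # -> rotate_up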

 上下左右にスマートフォン30を回転させるためのロボット基体装置50を、図13に示す。本体部52の上に、保持部54が設けられている。保持部54は、第1ベース部材54a、垂直部材54b、第2ベース部材54c、溝状部材54dを備えている。 FIG. 13 shows a robot substrate device 50 for rotating the smartphone 30 up / down / left / right. A holding portion 54 is provided on the main body portion 52. The holding portion 54 includes a first base member 54a, a vertical member 54b, a second base member 54c, and a groove-shaped member 54d.

 矢印X方向に回転可能なように第1ベース部材54aが設けられている。その回転機構は、図5cと同様である。第1ベース部材54aの円周の一端に、垂直方向に延びる垂直部材54bが設けられており、第1ベース部材54aが回転すると同じように回転する。 The first base member 54a is provided so that it can rotate in the direction of arrow X. The rotation mechanism is the same as in FIG. 5c. A vertical member 54b extending in the vertical direction is provided at one end of the circumference of the first base member 54a, and rotates together with the first base member 54a.

 垂直部材54bには、矢印Y方向に回転する第2ベース部材54cが設けられている。その回転機構は、図5cと同様である。第2ベース部材54cの円周の一端に、溝状部材54dが設けられており、第2ベース部材54cが回転すると同じように回転する。 The vertical member 54b is provided with a second base member 54c that rotates in the direction of arrow Y. The rotation mechanism is the same as in FIG. 5c. A groove-shaped member 54d is provided at one end of the circumference of the second base member 54c, and rotates together with the second base member 54c.

 溝状部材54dはコ字状の溝を有しており、スマートフォン30の下端をこの溝によって保持する。したがって、スマートフォン30に設けられたカメラ(図示せず)は、第1ベース部材54a、第2ベース部材54cの回転によって、上下左右方向に回転することになる。なお、溝状部材54dには、図5bと同じようにフォトトランジスタ51が設けられている。 The groove-shaped member 54d has a U-shaped groove, and the lower end of the smartphone 30 is held in this groove. Therefore, the camera (not shown) provided in the smartphone 30 is rotated vertically and horizontally by the rotation of the first base member 54a and the second base member 54c. The groove-shaped member 54d is provided with a phototransistor 51 as in FIG. 5b.

 図13のようなロボット基体装置50を制御する場合、図10に示す左右の矢印ボタン13a、13bだけでなく、上下の矢印ボタンを設けることが好ましい。 When controlling the robot substrate device 50 as shown in FIG. 13, it is preferable to provide not only the left and right arrow buttons 13a and 13b shown in FIG. 10 but also the up and down arrow buttons.

(5)上記実施形態では、制御発光領域31を1カ所だけ設けている。しかし、複数箇所設ければ、さらに多くの制御内容を実現することができる。たとえば、図14に示すように、2つの制御発光領域31a、31bを設けることで、2種類の制御信号を同時に送ることができる。この場合、保持部54には、制御発光領域31a、31bのそれぞれに対応して、フォトトランジスタ51を設ける。 (5) In the above embodiment, the control light emitting region 31 is provided at only one place. However, if a plurality of locations are provided, more control contents can be realized. For example, as shown in FIG. 14, by providing two control light emitting regions 31a and 31b, two types of control signals can be transmitted at the same time. In this case, the holding unit 54 is provided with a phototransistor 51 corresponding to each of the control light emitting regions 31a and 31b.

 また、制御発光領域31aから発する制御光をクロック光とし、制御発光領域31bから、「1」「0」によって示されるデータ光を発するようにしてもよい。クロック光の立ち上がりの際のデータ光の値によって、「1」「0」を送信するものである。これにより「1」「0」の組合せによって多くの情報を伝達して複雑な制御を行うことができる。 Further, the control light emitted from the control light emission region 31a may be used as clock light, and the data light indicated by "1" and "0" may be emitted from the control light emission region 31b. "1" and "0" are transmitted depending on the value of the data light at the rising edge of the clock light. As a result, a large amount of information can be transmitted by the combination of "1" and "0" to perform complicated control.
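
One possible realisation of this clock-and-data scheme is sketched below. The sender puts the data level in place before raising the clock region, and the receiver samples the data level on each rising clock edge; the set_region() callback and the in-memory trace used for the round-trip demo are stand-ins for the display and the two phototransistors.

def send_bits(bits, set_region):
    # set_region(name, on) paints region "31a" (clock) or "31b" (data).
    for bit in bits:
        set_region("31b", bit == 1)  # put the data level in place first
        set_region("31a", True)      # rising clock edge: receiver samples now
        set_region("31a", False)     # falling edge ends the bit cell

def receive_bits(samples):
    # samples: iterable of (clock_level, data_level) pairs over time.
    bits, last_clock = [], False
    for clock, data in samples:
        if clock and not last_clock:      # rising edge of the clock light
            bits.append(1 if data else 0)
        last_clock = clock
    return bits

# Round-trip demo using an in-memory channel instead of real light.
trace = []
send_bits([1, 0, 1, 1], lambda name, on: trace.append((name, on)))

# Rebuild (clock, data) samples from the trace for the receiver side.
state = {"31a": False, "31b": False}
samples = []
for name, on in trace:
    state[name] = on
    samples.append((state["31a"], state["31b"]))
print(receive_bits(samples))  # -> [1, 0, 1, 1]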

 (6)上記実施形態では、携帯端末装置としてスマートフォン10(30)を用いている。しかし、タブレットコンピュータやPDAなどを用いてもよい。また、携帯端末装置に代えて、据え置き型のPCなどを用いてもよい。この場合、ロボット側においては、少なくともカメラ92を保持部54に保持して駆動させるようにしてもよい。 (6) In the above embodiment, the smartphone 10 (30) is used as the mobile terminal device. However, a tablet computer, a PDA, or the like may be used. A stationary PC or the like may also be used instead of the mobile terminal device. In this case, on the robot side, at least the camera 92 may be held in the holding unit 54 and driven.

(7)上記実施形態では、操作側ユーザがロボット側スマートフォン30の向きを変える制御を行うようにしている。しかし、その他の駆動制御を行うようにしてもよい。 (7) In the above embodiment, the operation side user controls to change the direction of the robot side smartphone 30. However, other drive control may be performed.

 たとえば、図16Aに示すような腕構造(アーム機構)を駆動制御するようにしてもよい。この構造は、ロボット基体装置50の保持部54の左右に取り付けることができる。腕固定部152は、保持部54に固定される。腕固定部152には、A方向に回転可能に、第1回転部154が取り付けられている。第1回転部154には、B方向に回転可能に、第2回転部156が取り付けられている。 For example, the arm structure (arm mechanism) as shown in FIG. 16A may be driven and controlled. This structure can be attached to the left and right of the holding portion 54 of the robot substrate device 50. The arm fixing portion 152 is fixed to the holding portion 54. A first rotating portion 154 is attached to the arm fixing portion 152 so as to be rotatable in the A direction. A second rotating portion 156 is attached to the first rotating portion 154 so as to be rotatable in the B direction.

 図16Bに、腕構造の内部機構を示す。腕固定部152の内部には、腕固定部ベース部材152aが設けられている。腕固定部ベース部材152aの外側を筐体で覆うことにより、腕固定部152が構成されている。 FIG. 16B shows the internal mechanism of the arm structure. Inside the arm fixing portion 152, an arm fixing portion base member 152a is provided. The arm fixing portion 152 is configured by covering the outside of the arm fixing portion base member 152a with a housing.

 腕固定部ベース部材152aには、回転軸162が回転可能に設けられている。この回転軸162は、ステッピングモータ(図示せず)によって矢印A方向に回転可能に構成されている。回転軸162には、第1回転部154の第1回転部ベース部材154aが固定されている。第1回転部ベース部材154aの外側を筐体で覆うことにより、第1回転部154が構成されている。したがって、ステッピングモータを制御することで、腕固定部152に対して、第1回転部154を矢印A方向に回転させることができる。 A rotation shaft 162 is rotatably provided on the arm fixing portion base member 152a. The rotary shaft 162 is configured to be rotatable in the direction of arrow A by a stepping motor (not shown). The first rotating portion base member 154a of the first rotating portion 154 is fixed to the rotating shaft 162. The first rotating portion 154 is configured by covering the outside of the first rotating portion base member 154a with a housing. Therefore, by controlling the stepping motor, the first rotating portion 154 can be rotated in the direction of arrow A with respect to the arm fixing portion 152.

 第1回転部ベース部材154aには、回転軸164によって、第2回転部ベース部材156aが回転可能に固定されている。この回転軸164は、他のステッピングモータ(図示せず)によって矢印B方向に回転可能に構成されている。回転軸164には、第2回転部ベース部材156aが固定されている。第2回転ベース部材156aの外側を筐体で覆うことにより、第2回転部156が構成されている。したがって、ステッピングモータを制御することで、第1回転部154に対して、第2回転部156を矢印B方向に回転させることができる。 The second rotating portion base member 156a is rotatably fixed to the first rotating portion base member 154a by the rotating shaft 164. The rotary shaft 164 is configured to be rotatable in the direction of arrow B by another stepping motor (not shown). A second rotating portion base member 156a is fixed to the rotating shaft 164. The second rotation portion 156 is configured by covering the outside of the second rotation base member 156a with a housing. Therefore, by controlling the stepping motor, the second rotating portion 156 can be rotated in the direction of arrow B with respect to the first rotating portion 154.

 このように、腕機構を設けることで、ロボットの表現力を増すことができる。なお、ロボット基体装置50の保持部54を本体部52に対して固定構造として、腕機構のみを制御するようにしてもよい。 By providing the arm mechanism in this way, the expressive power of the robot can be increased. The holding portion 54 of the robot substrate device 50 may be fixed to the main body portion 52 to control only the arm mechanism.

(8)上記実施形態では、ロボット基体装置50に装着されたロボット側スマートフォン30の向きを、操作側から制御するようにしている。しかし、ロボット側基体装置50が、種々の作業を行うロボットである場合、操作側からロボットに対して指示を与えて、作業を行うことができる。たとえば、お料理ロボットである場合、操作側からお料理ロボットを制御して料理を行いつつ、ロボット側ユーザに対して、画像・音声による通話を併用して料理の指導を行うことができる。 (8) In the above embodiment, the orientation of the robot-side smartphone 30 mounted on the robot-based device 50 is controlled from the operation side. However, when the robot-side substrate device 50 is a robot that performs various tasks, the operation side can give an instruction to the robot to perform the tasks. For example, in the case of a cooking robot, while controlling the cooking robot from the operation side to cook, it is possible to teach the robot side user the cooking by using both image and voice calls.

 すなわち、この発明によれば、ロボットにスマートフォンを装着することで、ロボットを遠隔で制御しつつ、画像・音声による通話を行うことができる。また、遠隔から、ロボット側スマートフォン30のカメラやマイクの角度を制御できるので、ロボット側の状況を正確に知ることができる。 That is, according to the present invention, by attaching a smartphone to the robot, it is possible to make an image / voice call while remotely controlling the robot. Further, since the angle of the camera and the microphone of the robot side smartphone 30 can be controlled remotely, the situation on the robot side can be accurately known.

(9)上記実施形態では、ロボット基体装置50の機構を、操作側から駆動する制御を説明した。しかし、ロボット基体装置50とは別の装置の機構を駆動するように制御してもよい。たとえば、ロボット基体装置50とは別体として構成されているお掃除ロボットの動作を、制御するようにしてもよい。 (9) In the above embodiment, the control for driving the mechanism of the robot substrate device 50 from the operation side has been described. However, it may be controlled to drive a mechanism of a device other than the robot substrate device 50. For example, the operation of the cleaning robot configured separately from the robot substrate device 50 may be controlled.

 また、機構を駆動させる制御ではなく、その他の制御一般に用いることができる。図17に、その機能構成を示す。 Also, it can be used not for the control that drives the mechanism but for other controls in general. FIG. 17 shows the functional configuration.

 操作側装置10の入力部14を操作して、操作側ユーザが入力を行うと、操作受付手段16はこれを受け付ける。制御データ送信手段22は、通信部24によって、受け付けた入力に応じた制御データを、被操作側表示装置30に送信する。 When the operation side user inputs an input by operating the input unit 14 of the operation side device 10, the operation reception means 16 accepts the input. The control data transmission means 22 transmits the control data corresponding to the received input by the communication unit 24 to the operated side display device 30.

 被操作側表示装置30の制御データ受信手段42は、通信部44によってこれを受信する。発光制御手段36は、制御データに対する制御光を、ディスプレイ32の一部領域から放出する。 The control data receiving means 42 of the operated side display device 30 receives this by the communication unit 44. The light emission control means 36 emits control light for the control data from a part of the display 32.

 被操作側本体装置50の発光検出手段52は制御光を受光する。被操作本体側制御手段54は、制御光に基づいて、所定の制御をおこなう。 The light emission detecting means 52 of the main body device 50 on the operated side receives the control light. The control means 54 on the operated main body side performs predetermined control based on the control light.

 たとえば、図18に示すように、ロボット側のディスプレイ180(テレビ会議のディスプレイ)の画面184の下部を支持部材186で覆う。画面184には、制御発光領域31a、31b、31c、31d、31eが設けられている。これに対向するように、支持部材186の内部には、フォトトランジスタが設けられる。カメラ182により、この部屋の様子は、相手方(操作側)タブレットなどに送信される。 For example, as shown in FIG. 18, the lower part of the screen 184 of the display 180 (video conference display) on the robot side is covered with the support member 186. The screen 184 is provided with control light emitting regions 31a, 31b, 31c, 31d, 31e. A phototransistor is provided inside the support member 186 so as to face this. The state of this room is transmitted to the other party (operation side) tablet or the like by the camera 182.

 操作側ユーザは、タブレットを操作して制御データを送ることで、フォトトランジスタがこれを取得し、たとえば、この部屋の照明の輝度を変更することができる。あるいは、カメラ182の倍率やフォーカスなどを制御することもできる。 The user on the operating side can operate the tablet to send control data, and the phototransistor can acquire this and change the brightness of the lighting in this room, for example. Alternatively, the magnification and focus of the camera 182 can be controlled.

 以上のように、この発明においてロボットとは、駆動するものだけでなく、何らかの制御を行うものを含む概念である。 As described above, in the present invention, the robot is a concept including not only a driving one but also a controlling one.

(10)上記実施形態では、制御光として光の点滅を利用している。しかし、これに代えて、あるいはこれに加えて、光の色を用いて情報を伝達するようにしてもよい。この場合、カラーを検出可能なフォトトランジスタや、カメラなどを用いることになる。 (10) In the above embodiment, blinking light is used as control light. However, instead of or in addition to this, the color of light may be used to convey information. In this case, a phototransistor capable of detecting color, a camera, or the like is used.

(11)上記実施形態では、光の点滅によって情報を伝達するようにしている。しかし、制御発光領域31にQRコード(商標)などのコードを表示し、これを読み取るようにしてもよい。 (11) In the above embodiment, information is transmitted by blinking light. However, a code such as a QR code (trademark) may be displayed in the control light emitting area 31 and read.

(12)上記実施形態では、図10に示すようなボタン13a、13bをタップすることで回転指令を与えるようにしている。しかし、画面上を指でスライドすることで、その方向にカメラを回転させる指令を与えるようにしてもよい。 (12) In the above embodiment, the rotation command is given by tapping the buttons 13a and 13b as shown in FIG. However, you may give a command to rotate the camera in that direction by sliding it on the screen with your finger.

 (13)上記実施形態では、スマートフォン30のタッチディスプレイ84における制御発光領域31の位置を予め定めている。しかし、凹部56への装着による位置的な静電容量の変化(タッチ入力があったとは判定されないレベルの変化であることが好ましい)を、スマートフォン30が自ら検出し、当該部分を制御発光領域31とするようにしてもよい。このようにすれば、装着位置がずれたとしても制御光を確実に伝達することができる。 (13) In the above embodiment, the position of the control light emitting region 31 on the touch display 84 of the smartphone 30 is predetermined. However, the smartphone 30 may itself detect the positional change in capacitance caused by mounting in the recess 56 (preferably a change at a level that is not judged to be a touch input) and set that portion as the control light emitting region 31. In this way, the control light can be reliably transmitted even if the mounting position is displaced.
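
A purely conceptual sketch of this self-calibration idea is given below. The grid of capacitance readings, both thresholds and the centroid rule are invented for illustration; an actual implementation would depend on what low-level touch data the smartphone platform exposes.

TOUCH_THRESHOLD = 1.00   # readings above this count as a real touch input
COVER_THRESHOLD = 0.20   # readings above this (but below touch) suggest coverage

def find_control_region(readings):
    """readings: dict mapping (row, col) display cells to capacitance deltas."""
    covered = [
        cell for cell, value in readings.items()
        if COVER_THRESHOLD <= value < TOUCH_THRESHOLD
    ]
    if not covered:
        return None  # fall back to the predetermined region 31
    # Use the centroid of the covered cells as the centre of the region.
    rows = [r for r, _ in covered]
    cols = [c for _, c in covered]
    return (sum(rows) / len(rows), sum(cols) / len(cols))

# Example: the bottom row of cells is partly covered by the holding wall.
example = {(9, 3): 0.35, (9, 4): 0.42, (9, 5): 0.38, (2, 2): 0.05}
print(find_control_region(example))  # -> roughly (9, 4)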

(14)上記実施形態では、スマートフォン30の下部に制御発光領域31を設け、この部分にてスマートフォン30を支えるようにしている。しかし、上部、側部などに制御発光領域31を設けて、この部分にてスマートフォン30を支えるようにしてもよい。 (14) In the above embodiment, the control light emitting region 31 is provided in the lower part of the smartphone 30, and the smartphone 30 is supported by this portion. However, the control light emitting region 31 may be provided in the upper portion, the side portion, or the like, and the smartphone 30 may be supported by this portion.

 また、図19A、図19Bに示すように、ロボット側スマートフォン30のタッチディスプレイ84の任意の位置をクリップ33で覆うようにしてもよい。クリップ33には、フォトダイオードが設けられ、その信号は、制御線(図示せず)あるいは無線によって、ロボットに伝達される。この場合、制御発光領域31は、前述のようにスマートフォン30自らが設定するようにすることが好ましい。 Further, as shown in FIGS. 19A and 19B, the clip 33 may cover an arbitrary position of the touch display 84 of the robot-side smartphone 30. The clip 33 is provided with a photodiode, the signal of which is transmitted to the robot by a control line (not shown) or by radio. In this case, it is preferable that the control light emitting region 31 is set by the smartphone 30 itself as described above.

(15)上記実施形態では、ロボット基体装置50とロボット側携帯装置30を別装置として構成している。しかし、これらを一体としてロボット装置としてもよい。この場合には、制御データを制御光に変換せず、受信した制御データをそのまま用いるようにしてもよい。 (15) In the above embodiment, the robot substrate device 50 and the robot-side portable device 30 are configured as separate devices. However, these may be integrated into a robot device. In this case, the control data may not be converted into the control light, and the received control data may be used as it is.

 これにより、たとえば、テレビ電話において、相手方のカメラ(やマイク)の向きを制御したりすることができる。図20にその機能構成を示す。 This makes it possible to control the direction of the other party's camera (or microphone), for example, in a videophone. FIG. 20 shows the functional configuration.

 操作側テレビ電話装置10の入力部14を操作して、操作側ユーザが入力を行うと、操作受付手段16はこれを受け付ける。制御データ送信手段22は、通信部24によって、受け付けた入力に応じた制御データを、被操作側テレビ電話装置30に送信する。 When the operation side user inputs an input by operating the input unit 14 of the operation side videophone device 10, the operation reception means 16 accepts the input. The control data transmission means 22 transmits the control data corresponding to the received input by the communication unit 24 to the videophone device 30 on the operated side.

 被操作側テレビ電話装置30の制御データ受信手段42は、通信部44によってこれを受信する。被操作側制御手段36は、制御データに基づいて、カメラやマイク34の向きを変える。したがって、被操作側テレビ電話装置30から送信される画像や音声を、操作側ユーザが調整することができる。 The control data receiving means 42 of the videophone device 30 on the operated side receives this by the communication unit 44. The operated side control means 36 changes the direction of the camera or the microphone 34 based on the control data. Therefore, the operating side user can adjust the image and sound transmitted from the operated side videophone device 30.

 なお、上記のようなテレビ電話システムは、図1の構成によっても実現することができる。 The videophone system as described above can also be realized by the configuration shown in FIG.

(16)上記実施形態では、ステッピングモータを用いている。しかし、これに代えて、サーボモータなどの通常のモータを用いるようにしてもよい。 (16) In the above embodiment, a stepping motor is used. However, instead of this, a normal motor such as a servo motor may be used.

(17)上記に示した各変形例は、互いに組み合わせて実施可能である。 (17) Each of the above-mentioned modifications can be implemented in combination with each other.

Claims (23)

 操作側携帯装置とロボット側携帯装置とロボット基体装置とを備えたロボットシステムにおいて、
 前記操作側携帯装置は、
 ロボット側携帯装置から送信されてきたロボット側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像した操作側撮像画像を、サーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する撮像画像送信手段と、
 タッチディスプレイに入力されたユーザの操作を受け付ける操作受付手段と、
 前記受け付けた操作を制御データとして、サーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する制御データ送信手段とを備え、
 前記ロボット側携帯装置は、
 操作側携帯装置から送信されてきた操作側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像したロボット側撮像画像を、サーバ装置を介してあるいは直接に、送信部によって前記操作側携帯装置に送信する撮像画像送信手段と、
 前記操作側携帯装置から送信されてきた制御データを受信する制御データ受信手段と、
 前記タッチディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域の発光を制御し、制御光を送出する発光制御手段とを備え、
 前記ロボット基体装置は、
 本体部と、
 駆動部によって前記本体部に対して移動可能に設けられた保持部であって、前記ロボット側携帯装置のタッチディスプレイに対向してタッチディスプレイを物理的に保持するための保持壁を有する保持部と、
 発光検出部によって、前記保持部により前記ロボット基体装置に保持された前記ロボット側携帯装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出手段と、
 前記発光検出手段からの制御光データに基づいて、少なくとも、前記ロボット側携帯装置を保持する保持部を前記駆動部によって移動させるよう制御するロボット側制御手段と、
 を備えたロボットシステム。
In a robot system equipped with a mobile device on the operation side, a mobile device on the robot side, and a robot substrate device,
The operation-side portable device is
An image capture image display means for displaying the image captured on the robot side transmitted from the mobile device on the robot side on a touch display, and
An image transmitting means for transmitting an operation-side image captured by a camera to the robot-side portable device by a transmission unit via a server device or directly.
An operation reception means that accepts user operations input to the touch display,
A control data transmission means for transmitting the received operation as control data to the robot-side mobile device by the transmission unit via the server device or directly is provided.
The robot-side portable device is
An image capture image display means for displaying an operation side image transmitted from an operation side mobile device on a touch display, and
An image transmitting means for transmitting a robot-side image captured by a camera to the operation-side portable device by a transmission unit via a server device or directly.
A control data receiving means for receiving the control data transmitted from the operation-side mobile device, and
A control light emitting region is provided at least a part of the touch display, and a light emission control means for controlling light emission in the control light emission region based on the control data and transmitting the control light is provided.
The robot substrate device is
With the main body
A holding unit that is movably provided with respect to the main body by a driving unit and has a holding wall for physically holding the touch display facing the touch display of the robot-side mobile device. ,
A light emission detecting means for detecting control light in the control light emission region of the robot-side portable device held by the holding unit in the robot substrate device by the light emission detection unit and obtaining control light data.
Based on the control light data from the light emission detecting means, at least, a robot-side control means for controlling the holding portion for holding the robot-side portable device to be moved by the drive unit, and a robot-side control means.
A robot system equipped with.
 ロボット側携帯装置から送信されてきたロボット側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像した操作側撮像画像を、サーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する撮像画像送信手段と、
 タッチディスプレイに入力されたユーザの操作を受け付ける操作受付手段と、
 ロボット側携帯装置が、制御データに基づいてタッチディスプレイの制御発光領域の発光状態を制御して、当該ロボット側携帯装置が保持されたロボット基体装置の駆動部を制御光データによって制御するために、前記受け付けた操作を制御データとして、サーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する制御データ送信手段と、
 を備えた操作側携帯装置。
An image capture image display means for displaying the image captured on the robot side transmitted from the mobile device on the robot side on a touch display, and
An image transmitting means for transmitting an operation-side image captured by a camera to the robot-side portable device by a transmission unit via a server device or directly.
An operation reception means that accepts user operations input to the touch display,
In order for the robot-side mobile device to control the light emission state of the control light emitting area of the touch display based on the control data and to control the drive unit of the robot substrate device in which the robot-side portable device is held by the control light data. A control data transmission means for transmitting the received operation as control data to the robot-side mobile device by the transmission unit via the server device or directly.
Operation side mobile device equipped with.
 ロボットシステムに用いる操作側携帯装置をコンピュータによって実現するための操作側携帯装置プログラムであって、コンピュータを、
 ロボット側携帯装置から送信されてきたロボット側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像した操作側撮像画像を、前記操作側携帯装置にサーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する撮像画像送信手段と、
 タッチディスプレイに入力されたユーザの操作を受け付ける操作受付手段と、
 ロボット側携帯装置が、制御データに基づいてタッチディスプレイの制御発光領域の発光状態を制御して、当該ロボット側携帯装置が保持されたロボット基体装置の駆動部を制御光データによって制御するために、前記受け付けた操作を制御データとして、サーバ装置を介してあるいは直接に、送信部によって前記ロボット側携帯装置に送信する制御データ送信手段として機能させるための操作側携帯装置プログラム。
An operation-side mobile device program for realizing an operation-side mobile device used in a robot system by a computer.
An image capture image display means for displaying the image captured on the robot side transmitted from the mobile device on the robot side on a touch display, and
An image pickup transmitting means for transmitting an operation-side image captured by a camera to the operation-side mobile device via a server device or directly by a transmission unit to the robot-side mobile device.
An operation reception means that accepts user operations input to the touch display,
In order for the robot-side mobile device to control the light emission state of the control light emitting area of the touch display based on the control data and to control the drive unit of the robot substrate device in which the robot-side portable device is held by the control light data. An operation-side mobile device program for functioning as a control data transmission means for transmitting the received operation as control data to the robot-side mobile device by a transmission unit via a server device or directly.
 操作側携帯装置から送信されてきた操作側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像したロボット側撮像画像を、サーバ装置を介してあるいは直接に、送信部によって前記操作側携帯装置に送信する撮像画像送信手段と、
 前記操作側携帯装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該ロボット側携帯装置を保持するロボット基体装置に設けられた発光検出部によって制御光を受光し、制御光による制御光データに基づいてロボット基体装置の駆動部を制御させるために、前記タッチディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域の発光を制御して、制御光を送出する発光制御手段と、
 を備えたロボット側携帯装置。
An image capture image display means for displaying an operation side image transmitted from an operation side mobile device on a touch display, and
An image transmitting means for transmitting a robot-side image captured by a camera to the operation-side portable device by a transmission unit via a server device or directly.
A control data receiving means for receiving the control data transmitted from the operation-side mobile device, and
In order to receive the control light by the light emission detection unit provided in the robot base device holding the robot-side portable device and to control the drive unit of the robot base device based on the control light data by the control light, the touch display is used. A light emission control means that controls light emission in the control light emission region based on the control data and emits control light, with at least a part thereof as a control light emission region.
Robot side mobile device equipped with.
 ロボット側携帯装置をコンピュータによって実現するためのロボット側携帯装置プログラムであって、コンピュータを、
 操作側携帯装置から送信されてきた操作側撮像画像をタッチディスプレイに表示する撮像画像表示手段と、
 カメラによって撮像したロボット側撮像画像を、サーバ装置を介してあるいは直接に、送信部によって前記操作側携帯装置に送信する撮像画像送信手段と、
 前記操作側携帯装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該ロボット側携帯装置を保持するロボット基体装置に設けられた発光検出部によって制御光を受光し、制御光による制御光データに基づいてロボット基体装置の駆動部を制御させるために、前記タッチディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域の発光を制御して、制御光を送出する発光制御手段として機能させるためのロボット側携帯装置プログラム。
A robot-side mobile device program for realizing a robot-side mobile device by a computer.
An image capture image display means for displaying an operation side image transmitted from an operation side mobile device on a touch display, and
An image transmitting means for transmitting a robot-side image captured by a camera to the operation-side portable device by a transmission unit via a server device or directly.
A control data receiving means for receiving the control data transmitted from the operation-side mobile device, and
In order to receive the control light by the light emission detection unit provided in the robot base device holding the robot-side portable device and to control the drive unit of the robot base device based on the control light data by the control light, the touch display is used. A robot-side portable device program for controlling at least a part of the light emission in the control light emission region based on the control data and functioning as a light emission control means for transmitting the control light.
 本体部と、
 駆動部によって前記本体部に対して移動可能に設けられた保持部であって、前記ロボット側携帯装置のタッチディスプレイに対向してタッチディスプレイを物理的に保持するための保持壁を有する保持部と、
 発光検出部によって、前記保持部により前記ロボット基体装置に保持された前記ロボット側携帯装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出手段と、
 前記発光検出手段からの制御光データに基づいて、少なくとも、前記ロボット側携帯装置を保持する保持部を前記駆動部によって移動させるよう制御するロボット側制御手段と、
 を備えたロボット基体装置。
With the main body
A holding unit that is movably provided with respect to the main body by a driving unit and has a holding wall for physically holding the touch display facing the touch display of the robot-side mobile device. ,
A light emission detecting means for detecting control light in the control light emission region of the robot-side portable device held by the holding unit in the robot substrate device by the light emission detection unit and obtaining control light data.
Based on the control light data from the light emission detecting means, at least, a robot-side control means for controlling the holding portion for holding the robot-side portable device to be moved by the drive unit, and a robot-side control means.
Robot base device equipped with.
 コンピュータによってロボット基体装置を実現するためのロボット基体装置プログラムであって、コンピュータを、
 発光検出部によって、保持部により前記ロボット基体装置に保持された前記ロボット側携帯装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出手段と、
 前記発光検出手段からの制御光データに基づいて、少なくとも、前記ロボット側携帯装置を保持する保持部を前記駆動部によって移動させるよう制御するロボット側制御手段として機能させるためのロボット基体装置プログラム。
A robot-based device program for realizing a robot-based device by a computer.
A light emission detecting means for detecting the control light in the control light emission region of the robot-side portable device held by the holding unit in the robot substrate device by the light emission detection unit and obtaining control light data.
A robot substrate device program for functioning as a robot-side control means for controlling at least a holding portion for holding the robot-side portable device to be moved by the drive unit based on control optical data from the light emission detecting means.
 請求項1~7のいずれかのシステム、装置またはプログラムにおいて、
 前記駆動部は、ロボット側携帯装置を保持した保持部を回転させるモータを備えることを特徴とするシステム、装置またはプログラム。
In any of the systems, devices or programs of claims 1-7.
The drive unit is a system, device, or program including a motor for rotating a holding unit that holds the robot-side portable device.
 請求項1~8のいずれかのシステム、装置またはプログラムにおいて、
 前記駆動部は、少なくとも、ロボット基体装置に設けられたアーム部材を駆動させることを特徴とするシステム、装置またはプログラム。
In any of the systems, devices or programs of claims 1-8.
The drive unit is, at least, a system, device, or program that drives an arm member provided on the robot substrate device.
 請求項1~9のいずれかのシステム、装置またはプログラムにおいて、
 前記ロボット側携帯装置の発光制御手段は、受信した制御データに応じて、ディユーティ比の異なる制御光を前記制御発光領域から出力することを特徴とするシステム、装置またはプログラム。
In any of the systems, devices or programs of claims 1-9.
The light emission control means of the robot-side portable device is a system, device or program characterized in that control light having a different duty ratio is output from the control light emission region according to received control data.
 請求項10のシステム、装置またはプログラムにおいて、
 前記発光手段は、少なくとも、前記制御データが駆動部を駆動させる操作内容であるか、停止させる操作内容であるかに応じて、前記制御光のデューティ比を変えることを特徴とするシステム、装置またはプログラム。
In the system, apparatus or program of claim 10.
A system, device or program characterized in that the light emission means changes the duty ratio of the control light at least according to whether the control data is an operation content for driving the drive unit or an operation content for stopping it.
 請求項11のシステム、装置またはプログラムにおいて、
 前記駆動部は、ロボット側携帯装置を保持した保持部を回転させるモータを備え、
 前記発光御手段は、少なくとも、前記制御データがモータを右回転させる操作内容であるか、左回転させる操作内容であるか、停止させる操作内容であるかに応じて、前記制御光のデューティ比を変えることを特徴とするシステム、装置またはプログラム。
In the system, apparatus or program of claim 11.
The drive unit includes a motor for rotating the holding unit that holds the robot-side mobile device.
A system, device or program characterized in that the light emission control means changes the duty ratio of the control light at least according to whether the control data is an operation content for rotating the motor clockwise, an operation content for rotating it counterclockwise, or an operation content for stopping it.
 請求項1~12のいずれかのシステム、装置またはプログラムにおいて、
 操作側携帯装置またはロボット側携帯装置は、スマートフォンまたはタブレットコンピュータであることを特徴とするシステム、装置またはプログラム。
In any of the systems, devices or programs of claims 1-12.
An operating-side portable device or a robot-side portable device is a system, device, or program characterized by being a smartphone or tablet computer.
 操作側装置と被操作側表示装置と被操作側本体装置とを備えた制御システムにおいて、
 前記操作側装置は、
 操作入力部に入力されたユーザの操作を受け付ける操作受付手段と、
 前記受け付けた操作を制御データとして、サーバ装置を介してあるいは直接に、送信部によって前記被操作側表示装置に送信する制御データ送信手段とを備え、
 前記被操作側表示装置は、
 前記操作側装置から送信されてきた制御データを受信する制御データ受信手段と、
 ディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域から制御光を出力する発光制御手段とを備え、
 前記被操作側本体装置は、
 前記被操作側表示装置のディスプレイに対向してディスプレイを物理的に保持するための保持壁を有する保持部と、
 前記保持部によって前記被操作側本体装置に保持された前記被操作側表示装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出部と、
 前記発光検出部からの制御光データに基づいて、制御処理を行う被操作本体装置側制御手段と、
 を備えた制御システム。
In a control system including an operation side device, an operated side display device, and an operated side main body device,
The operation side device is
An operation receiving means that accepts the user's operation input to the operation input unit,
A control data transmission means for transmitting the received operation as control data to the operated side display device by the transmission unit via the server device or directly is provided.
The operated side display device is
A control data receiving means for receiving the control data transmitted from the operation side device, and
A control light emitting area is provided at least a part of the display, and a light emission control means for outputting control light from the control light emission area based on the control data is provided.
The operated main body device is
A holding portion having a holding wall for physically holding the display facing the display of the operated-side display device, and
A light emitting detection unit that detects the control light in the controlled light emitting region of the operated side display device held by the holding unit in the operated side main body device and obtains control light data.
A control means on the side of the main unit to be operated that performs control processing based on the control light data from the light emission detection unit.
Control system with.
 操作側表示装置から送信されてきた操作側撮像画像をディスプレイに表示する操作側撮像画像表示手段と、
 前記操作側表示装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該被操作側表示装置を保持する被操作側本体装置に設けられた発光検出部によって制御光を受光し、制御光による制御光データに基づいて被操作側本体装置の駆動部を制御させるために、 前記ディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域から制御光を出力する発光制御手段と、
 を備えた被操作側表示装置。
An operation-side captured image display means for displaying an operation-side captured image transmitted from an operation-side display device on a display, and an operation-side captured image display means.
A control data receiving means for receiving the control data transmitted from the operation side display device, and
In order to receive the control light by the light emission detection unit provided in the operated side main body device holding the operated side display device and to control the drive unit of the operated side main body device based on the control light data by the control light. A light emission control means that outputs control light from the control light emission region based on the control data, wherein at least a part of the display is a control light emission region.
Operated side display device equipped with.
 被操作側表示装置をコンピュータによって実現するための被操作側プログラムであって、コンピュータを、
 操作側表示装置から送信されてきた操作側撮像画像をディスプレイに表示する操作側撮像画像表示手段と、
 前記操作側表示装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該被操作側表示装置を保持する被操作側本体装置に設けられた発光検出部によって制御光を受光し、制御光による制御光データに基づいて被操作側本体装置の駆動部を制御させるために、 前記ディスプレイの少なくとも一部を制御発光領域とし、前記制御データに基づいて当該制御発光領域から制御光を出力する発光制御手段として機能させるための被操作側プログラム。
The operated side program for realizing the operated side display device by the computer, and the computer.
An operation-side captured image display means for displaying an operation-side captured image transmitted from an operation-side display device on a display, and an operation-side captured image display means.
A control data receiving means for receiving the control data transmitted from the operation side display device, and
In order to receive the control light by the light emission detection unit provided in the operated side main body device holding the operated side display device and to control the drive unit of the operated side main body device based on the control light data by the control light. , A program on the operated side for using at least a part of the display as a control light emission region and functioning as a light emission control means for outputting control light from the control light emission region based on the control data.
 被操作側表示装置のディスプレイに対向してディスプレイを物理的に保持するための保持壁を有する保持部と、
 前記保持部によって被操作側本体装置に保持された前記被操作側表示装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出部と、
 前記発光検出部からの制御光データに基づいて、制御処理を行う被操作本体装置側制御手段と、
 を備えた被操作側本体装置。
A holding unit having a holding wall for physically holding the display facing the display of the display device on the operated side,
A light emitting detection unit that detects the control light in the controlled light emitting region of the operated side display device held by the holding unit in the main body device on the operated side and obtains control light data.
A control means on the side of the main unit to be operated that performs control processing based on the control light data from the light emission detection unit.
Operated side main unit equipped with.
 被操作側表示装置のディスプレイに対向してディスプレイを物理的に保持するための保持壁を有する保持部と、前記保持部によって被操作側本体装置に保持された前記被操作側表示装置の前記制御発光領域の制御光を検出し、制御光データを得る発光検出部とを有する被操作側本体装置をコンピュータによって実現するための被操作側本体プログラムであって、コンピュータを、
 前記発光検出部からの制御光データに基づいて、制御処理を行う被操作本体装置側制御手段として機能させるための被操作側本体プログラム。
An operated-side main body program for realizing, by a computer, an operated-side main body device having a holding portion with a holding wall for physically holding a display facing the display of an operated-side display device, and a light emission detection unit that detects the control light in the control light emission region of the operated-side display device held by the holding portion in the operated-side main body device and obtains control light data, the program causing the computer to function as
operated-main-body-side control means for performing control processing based on the control light data from the light emission detection unit.
 操作側テレビ電話装置と被操作側テレビ電話装置を有するテレビ電話システムにおいて、
 前記操作側テレビ電話装置は、
 操作入力部に入力されたユーザの操作を受け付ける操作受付手段と、
 前記受け付けた操作を制御データとして、送信部によって前記被操作側テレビ電話装置に送信する制御データ送信手段とを備え、
 被操作側テレビ電話装置は、
 前記操作側テレビ電話装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該制御データに基づいて、被操作側テレビ電話装置のカメラまたはマイクの向きを変える被操作側制御手段と、
 を備えたことを特徴とするテレビ電話システム。
In a videophone system having a videophone device on the operating side and a videophone device on the operated side,
The operation side videophone device
An operation reception means that accepts user operations input to the operation input unit,
A control data transmission means for transmitting the received operation as control data to the operated videophone device by the transmission unit is provided.
The videophone device on the operated side is
A control data receiving means for receiving the control data transmitted from the videophone device on the operating side, and
Based on the control data, the operated side control means that changes the direction of the camera or microphone of the operated side videophone device, and
A videophone system featuring a videophone system.
 操作入力部に入力されたユーザの操作を受け付ける操作受付手段と、
 被操作側テレビ電話装置のカメラまたはマイクの向きを変えるために、前記受け付けた操作を制御データとして、送信部によって前記被操作側テレビ電話装置に送信する制御データ送信手段と、
 を備えた操作側テレビ電話装置。
An operation reception means that accepts user operations input to the operation input unit,
A control data transmitting means for transmitting the received operation as control data to the operated videophone device by the transmission unit in order to change the direction of the camera or microphone of the operated videophone device.
Operation side videophone device equipped with.
 操作側テレビ電話装置をコンピュータによって実現するための操作側テレビ電話プログラムであって、コンピュータを、
 操作入力部に入力されたユーザの操作を受け付ける操作受付手段と、
 被操作側テレビ電話装置のカメラまたはマイクの向きを変えるために、前記受け付けた操作を制御データとして、送信部によって前記被操作側テレビ電話装置に送信する制御データ送信手段として機能させるための操作側テレビ電話プログラム。
An operation-side videophone program for realizing an operation-side videophone device by a computer, which is a computer.
An operation reception means that accepts user operations input to the operation input unit,
The operation side for functioning as a control data transmission means for transmitting the received operation as control data to the operated videophone device by the transmitting unit in order to change the direction of the camera or microphone of the operated videophone device. Videophone program.
 操作側テレビ電話装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該制御データに基づいて、被操作側テレビ電話装置のカメラまたはマイクの向きを変える被操作側制御手段と、
 を備えた被操作側テレビ電話装置。
Control data receiving means for receiving control data transmitted from the videophone device on the operating side, and
Based on the control data, the operated side control means that changes the direction of the camera or microphone of the operated side videophone device, and
The operated side videophone device equipped with.
 被操作側テレビ電話装置をコンピュータによって実現するための被操作側テレビ電話プログラムであって、コンピュータを、
 操作側テレビ電話装置から送信されてきた制御データを受信する制御データ受信手段と、
 当該制御データに基づいて、被操作側テレビ電話装置のカメラまたはマイクの向きを変える被操作側制御手段として機能させるための被操作側テレビ電話プログラム。
 
 
 
 
A videophone program on the operated side for realizing a videophone device on the operated side by a computer.
Control data receiving means for receiving control data transmitted from the videophone device on the operating side, and
An operated videophone program for functioning as an operated control means for changing the direction of the camera or microphone of the operated videophone device based on the control data.



PCT/JP2021/028903 2020-08-11 2021-08-04 Robot system Ceased WO2022034839A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-135946 2020-08-11
JP2020135946A JP7012272B1 (en) 2020-08-11 2020-08-11 Robot system

Publications (1)

Publication Number Publication Date
WO2022034839A1 true WO2022034839A1 (en) 2022-02-17

Family

ID=80247892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/028903 Ceased WO2022034839A1 (en) 2020-08-11 2021-08-04 Robot system

Country Status (2)

Country Link
JP (1) JP7012272B1 (en)
WO (1) WO2022034839A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024509342A (en) * 2021-02-10 2024-03-01 インテリジェント レーシング インコーポレイテッド Devices, systems, and methods for operating intelligent vehicles using separate equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11136652A (en) * 1997-10-31 1999-05-21 Aisin Seiki Co Ltd Television camera remote control device
JP2007088771A (en) * 2005-09-21 2007-04-05 Toshiba Corp Video compression transmitter and video compression method
JP2013162345A (en) * 2012-02-06 2013-08-19 Sony Corp Imaging control device, imaging control method, program, remote operation device, remote operation method, and imaging system
US20140320542A1 (en) * 2013-04-29 2014-10-30 Sony Mobile Communications, Inc. Device and method of information transfer

Also Published As

Publication number Publication date
JP7012272B1 (en) 2022-01-28
JP2022032308A (en) 2022-02-25

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21855918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21855918

Country of ref document: EP

Kind code of ref document: A1