
WO2018221204A1 - Mobile body provided with radio antenna, and vehicle dispatch system - Google Patents


Info

Publication number
WO2018221204A1
Authority
WO
WIPO (PCT)
Prior art keywords
beacon
processing circuit
signal
signal processing
array antenna
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/018724
Other languages
French (fr)
Japanese (ja)
Inventor
伊藤 順治
華璽 劉
朋彦 友金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nidec Corp
Original Assignee
Nidec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec Corp filed Critical Nidec Corp
Publication of WO2018221204A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/02 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
    • G01S 3/14 Systems for determining direction or deviation from predetermined direction
    • G01S 3/46 Systems for determining direction or deviation from predetermined direction using antennas spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/123 Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present application relates to a mobile body including a movable array antenna, and to a vehicle dispatch system.
  • Research on indoor positioning systems that estimate the position of mobile terminals in indoor environments where satellite radio waves cannot be received is actively underway. For example, when a beacon built into a mobile terminal emits a signal wave (an electromagnetic wave such as a microwave or a millimeter wave), the position of the mobile terminal can be estimated by receiving the signal wave with a plurality of array antennas fixed in the environment.
  • A single array antenna can estimate the direction of a beacon that radiates electromagnetic waves, that is, the arrival direction of the signal wave.
  • However, the exact distance from the array antenna to the beacon cannot be obtained. Therefore, in order to accurately estimate the position of the beacon, it is necessary to use a plurality of array antennas arranged at different positions and to perform a geometric calculation from the arrival directions of the signal waves observed at each array antenna.
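The geometric calculation mentioned above can be sketched as intersecting two bearing rays in a horizontal plane. The function below is an illustrative sketch, not taken from the patent; the antenna positions and bearings are hypothetical examples.

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays to estimate a beacon position in 2D.

    p1, p2             -- (x, y) positions of two array antennas
    bearing1, bearing2 -- arrival directions in radians, measured
                          counter-clockwise from the +x axis
    Returns the (x, y) intersection, or None when the bearings are
    (near-)parallel and no unique intersection exists.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t*d1 == p2 + s*d2 for t via the 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two antennas at (0, 0) and (6, 0) both sight a beacon placed at (3, 4).
est = triangulate((0, 0), math.atan2(4, 3), (6, 0), math.atan2(4, -3))
```

With exact bearings the two rays meet at the true beacon position; in practice, bearing noise turns the intersection into an estimate whose error grows as the rays approach parallel.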
  • Japanese Patent Application Laid-Open No. 2007-19828 discloses a technique for estimating the direction of an electromagnetic wave radiation source with one array antenna and displaying the estimated position in an image acquired by a camera. According to such a technique, it is possible to estimate the direction or position of the radio wave radiation source with reference to the arrangement of buildings and the like included in the image acquired by the camera.
  • An embodiment of the present disclosure provides an array antenna that can receive a signal wave radiated from a beacon, and estimate its arrival direction, even while the user freely moves the position of the array antenna.
  • Another embodiment of the present disclosure provides a vehicle dispatch system having beacons and mobile bodies.
  • In an exemplary, non-limiting embodiment, a mobile body of the present disclosure includes: an imaging device that outputs image data; an array antenna having a plurality of antenna elements that receive signal waves radiated periodically or intermittently from a beacon; and a signal processing circuit that estimates the arrival direction of the signal wave based on the signal output from the array antenna and determines coordinates defining the arrival direction. The signal processing circuit outputs a video signal in which information indicating the arrival direction is added to the image data.
  • In an exemplary, non-limiting embodiment, a vehicle dispatch system of the present disclosure includes a plurality of beacons and a plurality of vehicles. Each vehicle includes: an imaging device that outputs image data; an array antenna having a plurality of antenna elements that receive signal waves radiated periodically from any of the plurality of beacons, the signal waves including additional information having identification information about the beacon or the person carrying the beacon; a signal processing circuit that estimates the arrival direction of the signal wave based on the signal output from the array antenna and determines coordinates defining the arrival direction; and a communication circuit that acquires the additional information from the signal wave. The signal processing circuit outputs a video signal in which information indicating the arrival direction is added to the image data, and position information of the person carrying the beacon is acquired and transmitted to the vehicle.
  • According to an embodiment of the present disclosure, a signal wave radiated periodically or intermittently from a beacon can be received, and its arrival direction estimated, while the user freely moves the position of the array antenna.
  • Since the mobile body has an imaging device that outputs image data, it can output a video signal in which information indicating the arrival direction is added to the image data.
  • FIG. 1 is a front view illustrating a basic configuration example of a mobile device according to an embodiment of the present disclosure.
  • FIG. 2 is a side view illustrating a basic configuration example of the mobile device according to the embodiment of the present disclosure.
  • FIG. 3 is a perspective view illustrating a basic configuration example of the mobile device according to the embodiment of the present disclosure.
  • FIG. 4 is a hardware block diagram of the mobile device.
  • FIG. 5 is a diagram illustrating a coordinate system based on the imaging apparatus.
  • FIG. 6 is a diagram illustrating the two-dimensional coordinates uv spanned on the image plane SI of the imaging apparatus.
  • FIG. 7 is a diagram schematically showing the image plane SA of the array antenna in the coordinate system based on the imaging device.
  • FIG. 8 is a diagram illustrating the angles θ and φ that define the estimated value of the arrival direction of the signal wave.
  • FIG. 9 is a diagram schematically showing a luggage rack in the warehouse and an image of the luggage rack displayed on the display device.
  • FIG. 10A is a diagram for illustrating an example of a posture change of the mobile device.
  • FIG. 10B is a diagram for illustrating an example of the posture change of the mobile device.
  • FIG. 11A is a diagram illustrating dots and identification information indicating beacon positions displayed on a display device of a mobile device.
  • FIG. 11B is a diagram showing dots with corrected position coordinates and identification information.
  • FIG. 11C is a diagram illustrating an example of “display misalignment” when correction processing is not performed.
  • FIG. 12 is a flowchart illustrating an example of a beacon position display process.
  • FIG. 13 is a hardware block diagram of the mobile device according to the second example.
  • FIG. 14 is a hardware block diagram of the mobile device according to the third example.
  • FIG. 15 is a hardware block diagram of the mobile device according to the fourth example.
  • FIG. 16 is a schematic diagram for explaining a vehicle allocation system 1000 including a plurality of beacons 10 and a plurality of vehicles 200.
  • FIG. 17 is an external view of the vehicle 200.
  • FIG. 18 is a diagram illustrating an internal configuration of the electronic apparatus 300 connected to the display device 310.
  • FIG. 19 is a schematic diagram showing the smartphone 240 and the electronic device 300 of the passenger 230 with the connection established.
  • FIG. 20 is a diagram showing a display example of the display device 310.
  • FIG. 21 is a diagram showing a transport system 1100 having a plurality of AGVs 400 each mounting the electronic device 300 (FIG. 18).
  • FIG. 22 is an external view of an exemplary AGV 400.
  • FIG. 23 is a diagram illustrating a hardware configuration of the AGV 400.
  • FIG. 24 is a diagram illustrating a relationship between the AGV 400 and the direction P in which the beacon 10 exists.
  • FIG. 25 is a diagram illustrating a relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists.
  • FIG. 26 is a diagram illustrating a configuration example of the search system 1200.
  • FIG. 27 is an external perspective view of an exemplary multicopter 600.
  • FIG. 28 is a side view of the multicopter 600.
  • FIG. 1 and FIG. 2 are a front view and a side view, respectively, showing a basic configuration example of a mobile device according to an embodiment of the present disclosure, and FIG. 3 is a perspective view.
  • The mobile device 100 includes a portable array antenna 20 and has a shape and size that allow it to be carried easily.
  • the array antenna 20 has a plurality of antenna elements 22 that receive signal waves radiated periodically or intermittently from the beacon 10 shown in FIGS.
  • the beacon 10 is also called a tag.
  • In this example, the beacon 10 emits a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard.
  • the beacon 10 may be a device that operates according to another standard.
  • the frequency of the signal wave is, for example, a microwave band or a millimeter wave band.
  • the beacon 10 radiates a 2.4 GHz signal wave at a time interval of, for example, 10 milliseconds to 200 milliseconds, typically 100 milliseconds.
  • The frequency of the signal wave need not be constant as long as the array antenna 20 can receive it; frequency hopping over a plurality of frequencies may be used.
  • In the drawings, the signal wave radiated from the beacon 10 is depicted schematically; no particular directivity is given to the actual radiation of the signal wave from the beacon 10. Although it is desirable that the signal wave emitted by the beacon 10 be isotropic, it may be anisotropic depending on the antenna of the beacon 10.
  • the signal wave emitted by the beacon 10 may include additional information having identification information regarding the beacon 10 or the person carrying the beacon 10.
  • An example of the additional information is a beacon ID and / or an owner ID of the beacon 10.
  • the beacon 10 may incorporate various sensors such as a heart rate monitor, a thermometer, an altimeter, and / or an acceleration sensor.
  • The beacon 10 may be electrically connected to various external sensors. In such a case, the beacon 10 can include the measurement values acquired by these sensors in the radiated signal wave.
  • a typical example of the beacon 10 includes an antenna for radiating a signal wave, a high-frequency circuit, a battery for driving the high-frequency circuit, and a processor for controlling these operations.
  • the beacon 10 may be directly carried by a person, or may be incorporated in an electronic device such as a smartphone. Further, the beacon 10 may be used by being attached to a circulated article or case, like a general IC tag.
  • The array antenna 20 has a diameter of, for example, about 20 centimeters and includes seven antenna elements 22 arranged two-dimensionally in a plane.
  • the weight of the array antenna 20 is, for example, about 500 grams.
  • the configuration and size of the array antenna 20 are not limited to this example as long as a person can carry it with one or both hands.
  • the external shape of the array antenna 20 viewed from the front is not necessarily circular, and may be an ellipse, a rectangle, a polygon, a star, or other shapes.
  • The number of antenna elements 22 may be eight or more, or may be in the range of three to six.
  • the antenna elements 22 in this example are arranged in a plane extending in both the horizontal direction and the vertical direction (vertical direction) in the drawing. Specifically, six antenna elements 22 are concentrically arranged at equal intervals around one antenna element 22 located at the center of the array antenna 20. This arrangement is only an example.
  • The antenna elements 22 may instead be arranged in a straight line along, for example, the horizontal direction. When a plurality of antenna elements 22 are arranged linearly along one direction, the arrival direction of the signal wave cannot be estimated with respect to directions intersecting that direction. In the illustrated example, the arrival direction can be estimated in both the horizontal and vertical directions.
  • the array antenna 20 may incorporate a high-frequency circuit such as a monolithic microwave integrated circuit (not shown) and an AD conversion circuit. Such a circuit may be connected between the signal processing circuit 30 described later and the array antenna 20 instead of being provided in the array antenna 20.
  • the mobile device 100 includes a signal processing circuit 30 that estimates the arrival direction of a signal wave based on a signal output from the array antenna 20 and determines coordinates that define the arrival direction of the signal wave.
  • the signal processing circuit 30 is configured to execute an array signal processing algorithm and estimate the arrival direction of the signal wave.
  • Any array signal processing algorithm may be used.
  • the direction of arrival of the signal wave may be referred to as DOA (Direction Of Arrival) or AOA (Angle Of Arrival).
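The disclosure leaves the array signal processing algorithm open. As an illustrative sketch of the underlying principle (not the patent's method), the arrival angle of a plane wave can be estimated from the phase difference measured between two antenna elements; the names and values below are hypothetical.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def arrival_angle(phase_diff_rad, freq_hz, spacing_m):
    """Estimate the arrival angle (radians from broadside) of a plane
    wave from the phase difference between two antenna elements.

    The path difference between the elements is spacing * sin(angle),
    which corresponds to a phase difference of 2*pi*spacing*sin(angle)
    / wavelength. Spacing of at most half a wavelength keeps the
    mapping from phase to angle unambiguous.
    """
    wavelength = C / freq_hz
    s = phase_diff_rad * wavelength / (2 * math.pi * spacing_m)
    if abs(s) > 1.0:
        raise ValueError("phase difference inconsistent with spacing")
    return math.asin(s)

# A 2.4 GHz wave arriving at 30 degrees on elements half a wavelength
# apart produces a phase difference of pi * sin(30 deg) = pi/2.
wavelength = C / 2.4e9
theta = arrival_angle(math.pi / 2, 2.4e9, wavelength / 2)
```

With more than two elements, as in the seven-element array described here, the same relation is exploited jointly across all element pairs, which improves accuracy and allows estimation in two angular dimensions.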
  • the signal processing circuit 30 is disposed inside the casing of the base 72 in the illustrated example.
  • the base 72 is coupled to one end of the columnar grip 70.
  • the grip 70 supports the array antenna 20 via a fixture 74.
  • the grip 70 has a shape and a size (length and diameter) suitable for being gripped by a human hand.
  • When the grip 70 is not held by a person, in other words, when the mobile device 100 is not being carried by the user, the mobile device 100 may be placed so that the base 72 rests on a fixed object such as a table or a floor.
  • the mobile device 100 can be used even when it is mounted on or attached to a mobile body such as a mobile robot, an automatic guided vehicle, a drone, or a car.
  • the mobile device 100 of the present embodiment includes a communication circuit 40 that acquires the above-described additional information from the signal wave of the beacon 10. Therefore, the mobile device 100 can also operate as a “handy scanner” that reads a signal or data emitted from the beacon 10 wirelessly in a contactless manner.
  • a communication circuit 40 is disposed inside the casing of the base 72 in the illustrated example.
  • the communication circuit 40 may transmit and receive other signal waves by an antenna (not shown) other than the array antenna 20, or may be connected to a telephone line or the Internet.
  • the signal processing circuit 30 and the communication circuit 40 are realized by a single or a plurality of semiconductor integrated circuits.
  • the signal processing circuit 30 may be referred to as a CPU (Central Processing Unit) or a computer.
  • the signal processing circuit 30 can be realized by a circuit including a computer such as a general-purpose microcontroller or a digital signal processor, and a memory in which a computer program for causing the computer to execute various instructions is incorporated.
  • the signal processing circuit 30 may include a register (not shown), a cache memory, and / or a buffer.
  • the mobile device 100 includes an imaging device 50 that outputs image data.
  • the imaging device 50 includes a lens 52 and an image sensor 54 as shown in FIGS.
  • A typical example of the imaging device 50 is a digital camera or a digital video camera.
  • the relative arrangement relationship between the imaging device 50 and the array antenna 20 is fixed.
  • The optical axis (camera axis) of the imaging device 50, that is, the Z axis, is parallel to the Za axis of the array antenna 20. "Parallel" in this specification need not be mathematically strictly parallel; some misalignment is allowed.
  • the position or orientation of the imaging device 50 changes together with the array antenna 20.
  • the user can point the array antenna 20 in an arbitrary direction.
  • the signal processing circuit 30 outputs a video signal in which information indicating the arrival direction is added to the image data.
  • the mobile device 100 of this embodiment includes a display device 60 that displays image data. As shown in FIG. 2, the display device 60 in this embodiment is supported by the grip 70 via an angle adjustment device 76.
  • the display device 60 can display information indicating the direction of arrival and image data based on the video signal output from the signal processing circuit 30.
  • the information indicating the arrival direction includes marks such as dots or lines displayed at the estimated position coordinates of the arrival direction in the image defined by the image data. Typical examples of such marks can be graphics such as dots, circles, crosses, and arrows, symbols, letters, numbers, or combinations thereof.
  • the signal processing circuit 30 may cause the display device 60 to display a part or all of the selected additional information.
  • the display device 60 may be a liquid crystal display, an OLED display, or various flexible displays.
  • the display device 60 may be a projector that projects an image on the surface of an object such as a wall surface or a desktop.
  • the display device 60 includes a driver circuit, a memory circuit, an input / output interface circuit, and the like (not shown). All or part of the signal processing circuit 30, the storage device 32, and the communication circuit 40 may be mounted on the same printed circuit board together with the driver circuit of the display device 60 and the like.
  • the mobile device 100 may take the form of a smartphone, a tablet terminal, or a laptop computer provided with the array antenna 20.
  • the mobile device of the present disclosure may take the form of a wearable device such as a wristwatch or a head-mounted display.
  • FIG. 4 is a hardware block diagram of the mobile device.
  • the mobile device 100 includes various components. Each component is connected by an internal bus 34, and each component can exchange data with other components.
  • the mobile device 100 includes a storage device 32.
  • the storage device 32 includes a random access memory (RAM) 32a, a read only memory (ROM) 32b, and a storage 32c.
  • the RAM 32a is a volatile memory that can be used as a work memory when the signal processing circuit 30 performs an operation.
  • the read only memory (ROM) 32b is, for example, a non-volatile memory that stores a computer program.
  • The storage 32c is a non-volatile memory that holds information acquired by the mobile device 100. An example of such information is registration information including identification information of the beacon 10, or of a user who carries the beacon 10, described later. In the present disclosure, the storage device 32 may be referred to simply as "memory".
  • The signal processing circuit 30 reads the computer program stored in the ROM 32b, expands it in the RAM 32a, and reads and executes the instructions constituting the computer program from the RAM 32a. The signal processing circuit 30 can thereby realize the various kinds of processing described in the present disclosure.
  • the signal processing circuit 30 performs processing for displaying an image on the display device 60.
  • The signal processing circuit 30 receives the video (moving images) output from the imaging device 50 described later, and outputs it to the display device 60 after, for example, adjusting its luminance or reducing its noise. Further, the signal processing circuit 30 displays an image indicating the position of the beacon 10, for example an icon, at the corresponding position coordinates on the display device 60. Note that the processing for displaying an image on the display device 60 may be performed by a circuit different from the signal processing circuit 30, for example, an image processing circuit.
  • the mobile device 100 in this embodiment includes a motion sensor 80.
  • An example of the motion sensor 80 is a gyro sensor.
  • a three-axis gyro sensor is employed as the motion sensor 80.
  • The motion sensor 80 detects angular velocities around three mutually orthogonal axes. When the mobile device 100 is placed on a horizontal plane, the three axes are typically an axis parallel to the vertical direction (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
  • the motion sensor 80 outputs a detected value of angular velocity around each axis, for example, every 1 millisecond.
  • the mobile device 100 may include an acceleration sensor.
  • the signal processing circuit 30 can acquire the rotation angle around each axis by time-integrating the detected value of the angular velocity around each axis output from the three-axis gyro sensor.
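The time integration described above can be sketched as follows. This is a minimal illustration with hypothetical sample data, not the device's firmware.

```python
def integrate_rotation(angular_velocities, dt=0.001):
    """Accumulate a rotation angle (radians) from gyro angular-velocity
    samples (rad/s) taken every dt seconds: the rotation angle is the
    time integral of the angular velocity around that axis.

    Simple rectangular integration is used here; real firmware would
    also subtract the gyro's bias and might use trapezoidal or
    quaternion-based integration to reduce drift.
    """
    return sum(w * dt for w in angular_velocities)

# 100 samples at a constant 1 rad/s, one per millisecond as in the
# text, accumulate 0.1 rad of rotation over 0.1 s.
angle = integrate_rotation([1.0] * 100, dt=0.001)
```

The same integration runs independently for each of the three axes, yielding the yaw, pitch, and roll changes used later for display-position correction.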
  • FIG. 5 shows a coordinate system based on the imaging device 50.
  • the XYZ coordinates shown in the figure are so-called camera coordinates (right-handed system).
  • the origin O of this coordinate is the optical center (principal point) of the imaging device 50, and the Z-axis is the camera optical axis.
  • a point P (X, Y, Z) is assumed to indicate the position of the beacon 10.
  • FIG. 5 shows the image plane SC of the imaging device 50.
  • the image plane SC is separated from the origin O by the focal length f in the Z-axis direction.
  • The two-dimensional coordinates xy spanned on the image plane SC are the coordinates of the camera image.
  • The corresponding point of the point P (X, Y, Z) on the image plane SC is determined by a perspective projection transformation based on the pinhole camera model.
  • When the beacon 10 is visible from the imaging device 50, the beacon 10 is observed at the position of the point Mc (x, y) on the image plane SC.
  • FIG. 6 shows the two-dimensional coordinates uv spanned on the image plane SI of the imaging device 50.
  • This image plane SI corresponds to a pixel area of the image sensor 54.
  • The two-dimensional coordinates uv are coordinates in pixel units.
  • the coordinates (u0, v0) are the intersections of the Z axis and the image plane SI.
  • Mc (u, v) can be obtained by converting (X, Y, Z) by perspective projection.
  • A basic example of the matrix that defines this perspective projection is shown in the following Equation 1:

    s [u v 1]^T = [[α, 0, u0], [0, β, v0], [0, 0, 1]] [X Y Z]^T, s = Z  (Equation 1)

    which gives u = αX/Z + u0 and v = βY/Z + v0.
  • α and β in Equation 1 are internal parameters of the imaging apparatus 50; specifically, they are determined by the focal length f of the lens, the pixel size of the image sensor 54, and the like.
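Under the pinhole model just described, projecting a camera-coordinate point to pixel coordinates can be sketched as below; the numeric parameter values are hypothetical examples, not values from the patent.

```python
def project_to_pixels(X, Y, Z, alpha, beta, u0, v0):
    """Perspective projection of a camera-coordinate point (X, Y, Z)
    onto pixel coordinates (u, v), following the pinhole model:

        u = alpha * X / Z + u0
        v = beta  * Y / Z + v0

    alpha and beta are the focal length expressed in pixel units
    (they fold together the lens focal length f and the pixel size),
    and (u0, v0) is the principal point, where the Z axis meets the
    image plane SI.
    """
    if Z <= 0:
        raise ValueError("point is behind the camera")
    return alpha * X / Z + u0, beta * Y / Z + v0

# A point 2 m ahead and 0.5 m to the right, with alpha = beta = 800 px
# and a principal point at (320, 240):
u, v = project_to_pixels(0.5, 0.0, 2.0, 800, 800, 320, 240)
```

Dividing by Z is what makes the projection perspective: the same lateral offset maps to fewer pixels the farther the point is from the camera.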
  • FIG. 7 is a diagram schematically showing the image plane SA of the array antenna 20 in the coordinate system with the imaging device 50 as a reference.
  • the point oa is the center of the array antenna 20, and the Za axis is the center axis of the array antenna 20.
  • the point oa is at a position shifted by a distance d in the positive direction of the Y axis.
  • the distance d is, for example, about 10 centimeters, but may be 10 centimeters or less.
  • the distance from the center oa of the array antenna 20 to the position P of the beacon 10 is unknown.
  • the direction of the position P of the beacon 10 (the arrival direction of the signal wave) can be estimated from the center oa of the array antenna 20.
  • The estimated value of the arrival direction of the signal wave is defined by, for example, the angles θ and φ shown in FIG. 8.
  • The angle parameters that define the arrival direction of the signal wave are not limited to the angles θ and φ.
  • The arrival direction can also be expressed using the inclination angle θ1 from the Za axis toward the X-axis direction and the inclination angle θ2 from the Za axis toward the Y-axis direction.
  • the antenna elements 22 constituting the array antenna 20 can be arranged on a straight line parallel to the X axis.
  • When an embodiment of the mobile device according to the present disclosure is realized as a miniaturized device such as a smartphone, for example, three or four antenna elements 22 can be arranged on a straight line or a curve extending along a direction parallel to the long side of the housing.
  • the image plane SA of the array antenna 20 can be virtually set at a position shifted by, for example, 1 meter from the point oa in the positive direction of the Za axis.
  • the image plane SA is a virtual screen.
  • the image plane SA is, for example, a rectangle 4 meters long and 4 meters wide.
  • A point Ma (xa, ya) on the image plane SA is determined by the angles θ and φ.
  • the coordinates on the virtual screen can be correlated with the coordinates on the image plane SC or SI of the imaging device 50 by scaling using the value of the Za coordinate of the screen.
  • a point Ma (xa, ya) on the image plane SA is at coordinates (X, Y, Z) when the beacon 10 on the line segment oa-P is located 1 meter in front of the array antenna 20. It corresponds to the X component and the Y component. However, there is a difference by a distance d between the center of the image plane SA and the center of the image plane SC. By correcting this difference, it is possible to calculate the position of the beacon 10 in the camera coordinates, and thus the position coordinates in the image plane SC and the image plane SI.
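The correction of the offset d described above can be sketched as follows, assuming (as an illustration, not the patent's exact implementation) that the virtual screen sits 1 meter ahead of the antenna and that the bearing lines can be treated as parallel; the intrinsic parameters are hypothetical.

```python
def antenna_to_camera_pixels(xa, ya, d, alpha, beta, u0, v0, depth=1.0):
    """Map a point (xa, ya) on the array antenna's virtual screen,
    placed `depth` metres ahead on the Za axis, into camera pixel
    coordinates.

    The antenna centre oa is offset by d metres along +Y from the
    camera's optical centre, so the virtual-screen point corresponds
    to camera coordinates (xa, ya + d, depth); projecting that point
    with the pinhole model gives the pixel at which to draw the
    direction marker.
    """
    X, Y, Z = xa, ya + d, depth
    return alpha * X / Z + u0, beta * Y / Z + v0

# Beacon straight ahead of the antenna (xa = ya = 0), antenna offset
# d = 0.1 m, alpha = beta = 800 px, principal point (320, 240):
u, v = antenna_to_camera_pixels(0.0, 0.0, 0.1, 800, 800, 320, 240)
```

In this hypothetical setup the marker lands beta * d / depth = 80 pixels away from the principal point along v, which is exactly the parallax the text says must be corrected.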
  • When the distance from the array antenna 20 to the beacon 10 is sufficiently large, the line segment OP and the line segment oa-P may be approximated as being parallel to each other.
  • the position of the beacon 10 on the image plane SC calculated by the above method almost coincides with the position of the point Mc.
  • The closer the actual position P of the beacon 10 is to the mobile device 100, the more the angle between the line segment OP and the line segment oa-P deviates from parallel; consequently, the position of the beacon 10 on the image plane SC calculated from the coordinates of the point Ma on the image plane SA is shifted in the vertical direction from the position of the point Mc.
  • the accuracy of the direction of the beacon 10 is preferably higher in the horizontal direction than in the vertical direction. This is because the movable range of the beacon 10 or the person having the beacon 10 tends to be constrained in a plane substantially parallel to the horizontal plane. For this reason, the positional relationship between the imaging device 50 and the array antenna 20 is preferably a vertical relationship as in this embodiment.
  • the center of the imaging device 50 and the center of the array antenna 20 may coincide with each other.
  • the Z axis may coincide with the Za axis. If such an arrangement relationship is realized, the estimated direction of the beacon 10 can be superimposed on the image plane SC of the imaging device 50 with a simpler calculation.
  • FIG. 9 is a diagram schematically showing the luggage rack 200 in the warehouse and the image of the luggage rack 200 displayed on the display device 60.
  • On the display device 60, a dot 90 indicating the position of the beacon 10 and identification information 92 acquired from the additional information included in the signal wave emitted by the beacon 10 are displayed. Based on such an image, a package carrying the beacon 10 can be identified. For simplicity, the case of a single beacon 10 is illustrated, but each of a plurality of packages may carry a beacon 10. Since the identification information 92 is unique to each beacon 10, a plurality of packages can be properly distinguished based on the identification information 92. Additional information other than the identification information may be selectively displayed on the image, or may be hidden. Further, only the beacon 10 having specific identification information 92 may be displayed on the display device 60.
  • A member that transmits electromagnetic waves is substantially transparent to the signal wave radiated from the beacon 10.
  • A wall or the like formed mainly of an insulating material transmits electromagnetic waves. For example, when the mobile device 100 is used to search for a person carrying a specific beacon 10, the signal wave emitted by the beacon 10 can be received, and its arrival direction determined, even if the person is on the other side of a wall. When searching for a person carrying the beacon 10 inside a building, it is only necessary to find a direction in which an incoming wave can be detected while changing the orientation of the mobile device 100. Once the arrival direction has been detected and estimated, the person carrying the beacon 10 can eventually be reached by moving the mobile device 100 in that direction. Even if many people or objects each carry a beacon 10, the intended person or object can be found from the identification information included in the signal wave emitted by each beacon 10.
  • The use of the mobile device according to the present disclosure is not limited to indoor environments; it may also be used outdoors. When a person carrying the beacon 10 becomes lost, the mobile device according to the present disclosure can be used to quickly locate and rescue that person.
  • <Correction of display position 1 (correction of misalignment due to camera shake, etc.)> As described above, a signal wave is emitted intermittently from the beacon 10. For this reason, the estimated value of the arrival direction based on the signal wave is also calculated intermittently.
  • Suppose the estimated value of the arrival direction is updated at intervals of 100 milliseconds. Meanwhile, at least one of the position and posture of the mobile device 100 may change due to camera shake or the like.
  • Since the image data of each frame is acquired at intervals shorter than 100 milliseconds (for example, about 8 to 16 milliseconds), the video is updated more frequently than the estimated arrival direction. This difference in update rates can cause a "display position shift".
  • The signal processing circuit 30 determines, from the arrival direction estimated based on the signals output from the array antenna 20, the position coordinates of the information indicating the arrival direction to be displayed on the display device 60. In parallel with this position-coordinate determination, when the signal processing circuit 30 detects camera shake or the like, it performs a correction so as to compensate for its influence. Hereinafter, an example of such correction will be described.
  • In this example, the output of the motion sensor 80 is used.
  • FIG. 10A and FIG. 10B show examples of changes in the posture of the mobile device 100 due to the user's intentional movement or camera shake.
  • Assume that the attitude of the mobile device 100 changes from the state shown in FIG. 10A to the state shown in FIG. 10B within a period shorter than about 100 milliseconds.
  • Specifically, the mobile device 100 is rotated clockwise about the illustrated Y axis (yaw axis) by an angle R due to camera shake or the like.
  • As noted above, the time interval at which the beacon 10 emits a signal wave is about 100 milliseconds.
  • When the attitude of the mobile device 100 changes, the field of view of the imaging device 50 also changes.
  • The imaging device 50 outputs image data (a group of frames) that follows the change in the field of view.
  • Since the frames are updated at intervals of about 8 to 16 milliseconds, the imaging device 50 outputs 6 to 12 frames during the period of about 100 milliseconds from when the mobile device 100 starts rotating until it stops.
  • Therefore, the video (background video) from the imaging device 50 displayed on the display device 60 follows the rotation relatively quickly. More specifically, the background video follows the rotation and flows from right to left.
  • In contrast, the position of the beacon 10 is updated only once during the rotation period of about 100 milliseconds. Until the update is performed, the image indicating the position of the beacon 10 remains displayed at the same position on the display device 60. That is, while the background video flows from right to left on the display device 60, the image indicating the position of the beacon 10 stays fixed at one point. This is the "display misalignment". It is preferable for the user that the position of the beacon 10 always be displayed on the display device 60 with a certain degree of accuracy.
  • Therefore, the mobile device 100 performs display position correction processing to increase the accuracy of the position of the beacon 10 displayed on the display device 60.
  • Specifically, the signal processing circuit 30 detects that the position and/or posture of the mobile device 100 has changed, based on the detection values output from the motion sensor 80. Further, the signal processing circuit 30 calculates the rotation angle R by time-integrating the detection values.
  • Assume that a virtual image plane SA for the array antenna 20 is set at a position one meter away from the position of the mobile device 100.
  • The signal processing circuit 30 then calculates position coordinates shifted to the left, on that image plane, from the position coordinates of the beacon 10 shown in FIG. 10A by a distance corresponding to the rotation angle R.
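The correction described above can be sketched as follows; this is a minimal, non-limiting illustration (function and parameter names are hypothetical) assuming a motion sensor that outputs the yaw rate in degrees per second, with the marker shift on the virtual plane one meter ahead approximated by the tangent of the integrated rotation angle:

```python
import math

def corrected_x(x_prev_m, gyro_rate_dps, dt_s, plane_dist_m=1.0):
    """Shift the beacon marker on a virtual image plane set
    plane_dist_m ahead, compensating a yaw rotation of the device.

    x_prev_m: previous horizontal marker position on the plane (m).
    gyro_rate_dps: yaw angular-rate samples (deg/s) from the motion sensor.
    dt_s: sampling interval of those samples (s).
    Returns the corrected horizontal position (m) on the image plane.
    """
    # Time-integrate the angular rate to obtain the rotation angle R.
    r_deg = sum(gyro_rate_dps) * dt_s
    # A clockwise yaw of R moves the marker left on the plane by
    # approximately plane_dist * tan(R).
    return x_prev_m - plane_dist_m * math.tan(math.radians(r_deg))
```

For small angles tan(R) is approximately R in radians, so the shift is nearly proportional to the integrated gyro output.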
  • FIG. 11A shows dots 90 indicating the position of the beacon 10 and identification information 92 displayed on the display device 60 of the mobile device 100 in the posture shown in FIG. 10A.
  • FIG. 11B shows the dot 90a and the identification information 92a whose position coordinates are corrected.
  • With this correction, the "display position shift" on the display device 60 can be corrected at intervals shorter than the time interval at which the signal wave is emitted from the beacon 10.
  • The update based on the signal wave emitted intermittently from the beacon 10 is performed in parallel. For this reason, the correction error of the position coordinates based on the output of the motion sensor 80 does not accumulate; the error is reset every time the update is performed.
  • FIG. 11C shows an example of “display misalignment” when correction processing is not performed.
  • In this case, the position coordinates of the dot 90a indicating the position of the beacon 10 on the display device 60 are updated based only on the signal wave emitted intermittently from the beacon 10. Even if the posture of the mobile device 100 changes due to camera shake before the next update, the position coordinates of the dot 90a on the display device 60 are maintained without being updated. Since a shift occurs between the background video, which changes with the camera shake, and the position coordinates of the dot 90a indicating the position of the beacon 10, the display lacks smoothness and is inferior in visibility compared with the case where the correction processing is performed.
  • The signal processing circuit 30 can correct the position coordinates of the dot 90 or the like indicating the position of the beacon 10 with respect to rotation about each of the three axes.
  • Alternatively, the signal processing circuit 30 may estimate a change in the position and/or orientation of the mobile device 100 based on the image data, and correct the arrival direction or the position coordinates of the dot 90 according to the change.
  • For example, the signal processing circuit 30 acquires image data of two frames at times t and (t + Δt) from the imaging device 50.
  • The signal processing circuit 30 determines an arbitrary pattern (referred to as a "feature pattern") that is commonly included in the two frames.
  • The signal processing circuit 30 acquires information on the relative position between the feature pattern in the frame at time t and the dot 90 indicating the position of the beacon 10 superimposed on that frame.
  • The relative position information is, for example, the difference values in the X-axis and Y-axis directions.
  • The signal processing circuit 30 determines the position coordinates of the dot 90 using the position of the feature pattern in the frame at time (t + Δt) and the acquired relative position information, and displays the dot on the display device 60. Since a series of consecutive frames reflects the influence of camera shake, correcting the position of the dot 90 by the above method yields position coordinates that take the camera shake into account.
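The re-anchoring step described above can be sketched as follows; a minimal illustration (the function name and tuple representation are hypothetical) that keeps the dot at a fixed offset from a feature pattern tracked across two frames:

```python
def corrected_dot(feature_t, dot_t, feature_t2):
    """Re-anchor the beacon dot to a feature pattern tracked across frames.

    feature_t / feature_t2: (x, y) pixel position of the same feature
    pattern at time t and at time t + dt.
    dot_t: (x, y) pixel position of the dot 90 at time t.
    Returns the corrected dot position at time t + dt.
    """
    # Relative offset of the dot from the feature pattern at time t.
    dx = dot_t[0] - feature_t[0]
    dy = dot_t[1] - feature_t[1]
    # Apply the same offset to the feature position at t + dt.
    return (feature_t2[0] + dx, feature_t2[1] + dy)
```

If the feature pattern moved left by 10 pixels between frames (as when the device yaws right), the dot is moved left by the same amount, so dot and background stay aligned.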
  • <Display position correction 2 (distortion correction by lens)>
  • In the image data output from the imaging device 50, distortion due to the lens 52 may occur. Such distortion becomes prominent when a wide-angle lens is employed. The distortion causes misalignment of the beacon 10 in the image.
  • The distortion can be corrected by calculation or by a table if the internal parameters unique to the imaging device 50 are known.
  • For example, a table that associates each position on a virtual image plane set one meter ahead of the imaging device 50 with the position coordinates of the image data output by the imaging device 50 can be prepared in advance and stored in the storage device 32.
  • The signal processing circuit 30 determines the position coordinates on the virtual image plane from the arrival direction of the signal wave estimated based on the signals output from the array antenna 20, and refers to the above table to determine the position coordinates on the display device 60.
  • Note that, instead of correcting the distortion of the entire image, only the position of the beacon 10 may be corrected and superimposed on the image data. If the accuracy of the position or direction of the beacon 10 in the image is high, there is no particular problem even if the image itself is somewhat distorted in its vicinity. Since only the position of the beacon 10 is corrected, without correcting the distortion of each frame, the computational load is greatly reduced.
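The table-based lookup can be sketched as follows; a minimal illustration in which the table contents are hypothetical placeholder values (a real table would be derived from the camera's internal parameters and cover a dense grid):

```python
# Hypothetical pre-computed table: position on the virtual image plane
# (one meter ahead, in grid units) -> pixel coordinates in the distorted
# image output by the imaging device.
DISTORTION_TABLE = {
    (0, 0): (320, 240),
    (1, 0): (380, 240),
    (0, 1): (320, 180),
}

def dot_pixel(plane_pos):
    """Look up the display pixel coordinates for a position on the
    virtual image plane. Only the beacon marker is corrected; each
    video frame itself is left distorted, keeping the load small."""
    return DISTORTION_TABLE[plane_pos]
```

Since the table is built once offline, the per-frame cost is a single lookup rather than a full image-warping pass.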
  • FIG. 12 is a flowchart showing an example of a process for displaying the position of the beacon 10. The process shown in FIG. 12 includes the display position correction process described above.
  • In step S1, the signal processing circuit 30 receives data of the signal wave radiated from the beacon 10 and received by the array antenna 20.
  • In step S2, the signal processing circuit 30 estimates the arrival direction of the signal wave based on the signal wave data.
  • In step S3, the signal processing circuit 30 displays information indicating the arrival direction, together with the image data output from the imaging device 50, on the display device 60.
  • In step S4, the signal processing circuit 30 corrects the display position shift caused by shake of the mobile device 100. Thereafter, the process returns to step S1.
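The control flow of steps S1 to S4 can be sketched as a single pass of a loop; each step is injected as a callable here (a hypothetical structure chosen only so that the flow itself can be exercised, not the patent's implementation):

```python
def display_loop_once(receive, estimate, render, correct):
    """One pass of the flowchart in FIG. 12 (steps S1-S4).

    receive():  S1 - obtain signal-wave data from the array antenna.
    estimate(): S2 - estimate the arrival direction from that data.
    render():   S3 - superimpose the direction on the camera video.
    correct():  S4 - correct the display position shift.
    """
    data = receive()            # S1
    direction = estimate(data)  # S2
    frame = render(direction)   # S3
    return correct(frame)       # S4; the caller then loops back to S1
```

In an actual device the loop would repeat indefinitely, with S4's correction running at the frame rate while S1-S2 update only when a new beacon signal arrives.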
  • FIG. 13 is a hardware block diagram of the mobile device 110 according to the second example.
  • The mobile device 110 differs from the mobile device 100 in that it includes a haptic device 82.
  • The haptic device 82 receives a command indicating a vibration pattern from the signal processing circuit 30 and generates a stimulus to be given to the user according to the command.
  • An example of the command is a PWM signal.
  • The haptic device 82 includes a vibration motor 82a and a motor drive circuit 82b connected to the vibration motor 82a.
  • The vibration motor 82a is, for example, a horizontal linear actuator.
  • The motor drive circuit 82b supplies current to the vibration motor 82a in accordance with the PWM signal of the command, causing the vibration motor 82a to operate with a predetermined vibration pattern.
  • The vibration pattern can be determined by, for example, the rise speed, the amplitude, and the frequency of the applied current or voltage, and/or the frequency of the amplitude.
  • The signal processing circuit 30 estimates the arrival direction of the signal wave of the beacon 10 and drives the vibration motor 82a of the haptic device 82 based on the estimated direction.
  • Thereby, an illusory lateral force, that is, the sensation of being pulled in a particular direction, can be given to the user.
  • In other words, the haptic device 82 can convey information indicating the arrival direction through the user's sense of touch.
  • As a result, the mobile device 110 can guide the user to the position of the beacon 10.
  • When the signal processing circuit 30 displays the dot 90 on the display device 60, the direction of the beacon 10 can also be presented to the user visually.
  • Vibration patterns for giving the illusion of being pulled are known, as disclosed in JP2012-143054A, JP2010-2101010, and the like.
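The pulling illusion relies on an asymmetric vibration: a fast stroke in one direction followed by a slow return. A minimal sketch of generating one period of such a drive waveform is shown below (the function, its parameters, and the normalized drive levels are all hypothetical illustrations, not the waveforms of the cited publications):

```python
def asymmetric_wave(n_samples, rise_frac=0.2, amplitude=1.0):
    """One period of an asymmetric drive waveform: a fast rise followed
    by a slow fall. The asymmetry between the two strokes is what
    produces the sensation of being pulled in one direction.

    n_samples: samples per period.
    rise_frac: fraction of the period spent on the fast rise.
    Returns a list of normalized drive levels in [0, amplitude].
    """
    n_rise = max(1, int(n_samples * rise_frac))
    # Fast stroke: reach full amplitude in a small fraction of the period.
    rise = [amplitude * i / n_rise for i in range(1, n_rise + 1)]
    # Slow return stroke over the remainder of the period.
    n_fall = n_samples - n_rise
    fall = [amplitude * (1 - i / n_fall) for i in range(1, n_fall + 1)]
    return rise + fall
```

Reversing `rise_frac` toward 0.8 would flip the perceived pull direction, which is how the estimated arrival direction could select the pattern.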
  • FIG. 14 is a hardware block diagram of the mobile device 120 according to the third example.
  • The mobile device 120 is configured by omitting the imaging device 50, the display device 60, and the motion sensor 80 from the mobile device 100.
  • The mobile device 120 can receive a signal wave emitted from the beacon 10 and transmit data indicating the position or direction of the beacon 10, together with additional information, to an external device via the communication circuit 40.
  • The mobile device 120 can operate as a "wireless handy scanner" for searching for the position of the beacon 10 and acquiring the additional information of the beacon 10. Data indicating the position or direction of the beacon 10 may be stored in the storage device 32.
  • The external device can be a smartphone, a tablet terminal, or a laptop computer.
  • The external device can receive data indicating the position or direction of the beacon 10 from the mobile device 120 and display information indicating the arrival direction of the signal wave from the beacon 10 on its own display.
  • The external device may also be a small electronic device having a haptic device.
  • As the haptic device, the haptic device 82 built into the mobile device 110 (FIG. 13) can be employed.
  • FIG. 15 is a hardware block diagram of the mobile device 130 according to the fourth example.
  • The mobile device 130 is the mobile device 120 (FIG. 14) with the haptic device 82 added.
  • The mobile device 130 can receive the signal wave emitted by the beacon 10, estimate its arrival direction, and give the user the illusion of being pulled in the estimated direction.
  • <Moving object> The mobile devices described so far are based on the assumption that a person carries them. However, portability by a person is not essential.
  • The imaging device, the array antenna, and the signal processing circuit may be attached to, for example, a moving body and move together with the moving body.
  • Examples of the moving body include vehicles such as taxis and automatic guided vehicles, and flying bodies such as multicopters.
  • Typically, an electronic device including an imaging device, an array antenna, and a signal processing circuit is attached to the moving body.
  • Such electronic devices can be manufactured and sold in a size that can be mounted on a moving body.
  • Hereinafter, a moving body equipped with an electronic device having the configuration of a mobile device will be described.
  • Examples of the moving body include a vehicle such as an automobile, an automatic guided vehicle, and a multicopter.
  • The electronic device described below has the same components as the mobile device 100 (FIG. 4) and the mobile device 120 (FIG. 14) described above.
  • FIG. 16 is a schematic diagram for explaining a vehicle allocation system 1000 including a plurality of beacons 10 and a plurality of vehicles 200.
  • The vehicle allocation system 1000 is used to dispatch a taxi, which is the vehicle 200, to a passenger who has made a vehicle allocation request and to pick up the passenger.
  • In FIG. 16, the solid lines indicate roads.
  • In this example, the beacon 10 is built into a smartphone.
  • An application program provided by the operator of the vehicle allocation system 1000 is installed on the smartphone.
  • The application program controls a communication circuit built into the smartphone and radiates a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard. That is, the beacon 10 is realized by the smartphone's communication circuit and the application program.
  • The signal wave emitted by the beacon 10 includes additional information carrying identification information about the person carrying the smartphone.
  • An example of the additional information is a smartphone owner ID.
  • The owner ID may be, for example, a unique value issued for each user by the application program.
  • A passenger requests dispatch using a mobile communication network.
  • Consider a passenger having a beacon 10 within the broken-line circle shown in the upper left of FIG. 16. The passenger activates the application program and makes a vehicle allocation request.
  • Under the control of the application program, the smartphone transmits the vehicle allocation request via the base station 220a to the server 212 installed at the base 210 of the operator of the vehicle allocation system 1000.
  • The smartphone sends location information indicating the location of the passenger along with the dispatch request.
  • The position information can be measured and acquired by a GPS module mounted on the smartphone.
  • Alternatively, the smartphone may acquire position information by using access to a public Wi-Fi (registered trademark) spot whose position is known in advance, or by using communication with the base stations 220a and 220b, whose positions are fixed.
  • In response to receiving the dispatch request, the server 212 determines the empty vehicle closest to the position of the passenger indicated by the position information. In the example shown in FIG. 16, it is assumed that the server 212 selects the vehicle 200 within the broken-line circle shown near the center of the drawing. The server 212 transmits, via the base station 220b of the mobile communication network, an instruction to the vehicle 200 to go to the position indicated by the position information.
  • At this time, the electronic device 300 receives, from the server 212, registration information including the identification information of the passenger's beacon 10 or of the passenger.
  • The storage device 32 of the electronic device 300 stores the received registration information.
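The server-side selection of the closest empty vehicle can be sketched as follows; a minimal illustration in which the vehicle record shape (`id`, `pos`, `empty`) is a hypothetical assumption, since the patent only states that the nearest empty vehicle is chosen:

```python
import math

def nearest_empty_vehicle(passenger_pos, vehicles):
    """Pick the empty vehicle closest to the passenger's reported position.

    passenger_pos: (x, y) position from the dispatch request.
    vehicles: list of dicts with 'id', 'pos' (x, y), and 'empty' flag.
    Returns the id of the chosen vehicle.
    """
    # Only empty (available) vehicles are candidates for dispatch.
    empty = [v for v in vehicles if v["empty"]]
    # Choose the candidate with the smallest Euclidean distance.
    return min(empty, key=lambda v: math.dist(passenger_pos, v["pos"]))["id"]
```

A production system would use road-network travel time rather than straight-line distance, but the selection structure is the same.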
  • FIG. 17 is an external view of the vehicle 200.
  • An electronic device 300 is installed on the ceiling of the vehicle 200.
  • A display device 310 is installed inside the vehicle.
  • The electronic device 300 includes the array antenna 20 and the imaging device 50.
  • FIG. 18 shows an internal configuration of the electronic device 300 connected to the display device 310.
  • The internal configuration of the electronic device 300 is substantially the same as that of the mobile device 100 (FIG. 4).
  • The display device 310 corresponds to the display device 60, but is provided separately from the electronic device 300.
  • Components having the same structure and/or function are denoted by the same reference numerals, and their description is omitted.
  • A difference in configuration between the mobile device 100 and the electronic device 300 is that the electronic device 300 has a driving device 302.
  • The drive device 302 incorporates a motor (not shown) and changes the attitudes of the array antenna 20 and the imaging device 50 with respect to the moving body.
  • In this example, the array antenna 20 and the imaging device 50 are integrally provided in one housing.
  • The drive device 302 can simultaneously rotate the array antenna 20 and the imaging device 50 about an axis parallel to the vertical direction (yaw axis), an axis parallel to the lateral direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
  • The communication circuit 40 of the electronic device 300 can perform communication using a mobile communication network in addition to communication in accordance with the Bluetooth (registered trademark) Low Energy standard.
  • A communication module that performs communication using the mobile communication network may be provided independently of the communication circuit 40.
  • When the vehicle 200 enters a predetermined range from the position of the passenger specified based on the position information, for example within 300 m, the electronic device 300 operates the drive device 302 to change the attitudes of the array antenna 20 and the imaging device 50.
  • By changing the attitudes and scanning the surroundings, the electronic device 300 searches for the beacon 10 having the identification information of the passenger who requested the dispatch. When the beacon 10 is eventually discovered, a connection in accordance with the Bluetooth (registered trademark) Low Energy standard is established between the passenger's smartphone and the electronic device 300. Thereby, communication in accordance with that standard becomes possible between the smartphone and the electronic device 300.
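The attitude scan can be sketched as a loop over candidate antenna headings; a minimal illustration in which the scan pattern and the `detect` check are hypothetical stand-ins for the drive device 302 and the receiver's identification-information match:

```python
def scan_for_beacon(attitudes, detect):
    """Step the antenna/camera attitude through a scan pattern and stop
    at the first attitude where the target beacon's identification
    information is detected.

    attitudes: iterable of candidate attitudes (e.g. yaw angles in deg).
    detect(attitude) -> bool: hypothetical receiver check for the
    target beacon at that attitude.
    Returns the first matching attitude, or None if the scan fails.
    """
    for attitude in attitudes:
        if detect(attitude):
            return attitude
    return None
```

With a three-axis drive device the scan pattern would cover a 2-D grid of yaw and pitch angles, but the stopping rule is the same.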
  • FIG. 19 is a schematic diagram showing the smart phone 240 of the passenger 230 and the electronic device 300 with the connection established.
  • In FIG. 19, the smartphone 240 is depicted for convenience as radiating a signal wave only in the direction of the electronic device 300, but the signal wave is actually radiated substantially isotropically.
  • FIG. 20 shows a display example of the display device 310.
  • The signal processing circuit 30 of the electronic device 300 displays on the display device 310 a mark indicating the position of the beacon 10 (the arrival direction of the signal wave from the beacon 10), for example the image 250. The details of the processing are as described with reference to FIGS. 1 to 9. As a result, the driver of the vehicle 200 can accurately find the passenger 230 even if other people are present around the passenger 230.
  • The signal processing circuit 30 may additionally display the identification information 252 on the display device 310.
  • The identification information 252 may include the smartphone owner ID, or the user ID of the vehicle allocation system 1000 and the passenger's name. As shown in FIG. 20, a leader line connecting the image of the passenger 230 and the identification information 252 may be provided so that the identification information 252 of the passenger 230 can be easily grasped. With this additional display, the passenger 230 who requested the dispatch can be identified more accurately.
  • The electronic device 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B using the detection values of the motion sensor 80.
  • Alternatively, display position correction processing using changes in the image data may be performed. This is because, as with camera shake, display misalignment may occur while the vehicle 200 travels. By correcting the display position, the driver of the vehicle 200 can find the passenger 230 more easily and reliably.
  • The accuracy of the position information transmitted from the passenger 230 may be rough. Even in such a case, if the electronic device 300 receives the signal wave of the beacon 10 and can estimate its arrival direction, the approximate position where the passenger 230 is waiting can be displayed on the display device 310. Thereby, the driver of the vehicle 200 can determine whether the passenger 230 is waiting on the right or left side of the road, or whether the passenger 230 is waiting beyond a right or left turn.
  • The server 212 of the base 210 may register the owner ID of the smartphone, or the user ID of the vehicle allocation system 1000, in association with each passenger's facial photograph.
  • When the electronic device 300 of the vehicle 200 receives an instruction from the server 212 to go to the position of the passenger 230, it also receives image data of the registered facial photograph of the passenger 230.
  • The signal processing circuit 30 of the electronic device 300 extracts a feature pattern indicating the facial features of the passenger 230 from the facial photograph data, and collates it against the persons appearing in the image defined by the image data output from the imaging device 50. If a match is found as a result of the collation, the signal processing circuit 30 displays a mark indicating the passenger 230, such as the image 250 shown in FIG. 20.
  • The relative positional relationship between the array antenna 20 and the imaging device 50 may be fixed or movable. In the example of FIG. 17, both are fixed.
  • Electronic device 300 may be removed from vehicle 200 and carried by the driver of the vehicle. In this case, the electronic device 300 may transmit an image to, for example, a driver's smartphone instead of the display device 310.
  • <Automatic Guided Vehicle (AGV)>
  • FIG. 21 shows a transport system 1100 having a plurality of AGVs 400 each mounting the electronic device 300 (FIG. 18).
  • The transport system 1100 can be installed, for example, in a factory 500.
  • The factory 500 is provided with a plurality of shelves 510.
  • The AGV 400 searches for the target beacon 10 using the identification information of the beacon 10 attached to the target package, and thereby locates the package.
  • The walls of the factory 500 are provided with beacons 10N, 10S, 10W, and 10E indicating directions.
  • The electronic device 300 of the AGV 400 can receive and distinguish the signal waves from the beacons 10N, 10S, 10W, and 10E, and can estimate the position of each beacon. Using the estimated beacon positions, the AGV 400 can recognize its current position and the direction it is facing, that is, its own position and attitude.
  • The four beacons 10N, 10S, 10W, and 10E are merely an example. Even with two beacons, the self-position and attitude can be recognized. Furthermore, using five or more beacons 10 can further improve the accuracy of the self-position and attitude.
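Recovering a position from two fixed beacons can be sketched as a ray intersection; a minimal, non-limiting illustration (function name and argument shapes are hypothetical) assuming the device heading is already known, so the measured arrival directions have been converted to absolute bearings:

```python
import math

def agv_position(b1, theta1, b2, theta2):
    """Estimate the AGV position from two beacons at known positions.

    b1, b2: known (x, y) positions of the two beacons.
    theta1, theta2: absolute bearings (rad) from the AGV to each beacon.
    Returns the estimated (x, y) position of the AGV.
    """
    # Unit direction vectors from the AGV toward each beacon.
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    rx, ry = b1[0] - b2[0], b1[1] - b2[1]
    # The AGV lies on both rays: p = b1 - t1*d1 = b2 - t2*d2.
    # Solve t1*d1 - t2*d2 = b1 - b2 by Cramer's rule.
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    t1 = (-rx * d2[1] + d2[0] * ry) / det
    return (b1[0] - t1 * d1[0], b1[1] - t1 * d1[1])
```

With only relative arrival directions (heading unknown), a third beacon or the angle between two bearings would be needed; extra beacons also let the estimate be refined by least squares, which matches the accuracy improvement noted above.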
  • The AGV 400 can also recognize its path using the image data output from the imaging device, and determine a route. For example, a specific color is given to the floor surface, while colors different from it are given to the shelves 510, packages, walls, and other objects.
  • The AGV 400 can recognize the specific color of the floor surface from the image data output from the imaging device 50, and can thereby recognize the region of that color as a travelable path.
  • With this configuration, the AGV 400 does not need to perform processing that has conventionally been required to acquire self-position information.
  • For example, the AGV 400 does not need to hold map data of the factory 500 in which it travels, or to be equipped with a laser range finder. Since it is not necessary to estimate the current position by comparing sensor data output from a laser range finder with map data, the processing load of the arithmetic circuit is greatly reduced. Since the arithmetic circuit need not have high performance, the cost can be reduced. Further, the AGV 400 does not need to estimate or interpolate the current position using odometry.
  • If a laser range finder is additionally used, the self-position estimation accuracy can be improved.
  • Further, obstacles can be avoided by using the image data output from the imaging device 50 included in the electronic device 300.
  • FIG. 22 is an external view of an exemplary AGV 400 according to the present embodiment.
  • The AGV 400 includes the electronic device 300, four wheels 411a to 411d, a frame 412, a transfer table 413, a travel control device 414, and a laser range finder 415.
  • The electronic device 300 and the laser range finder 415 are installed on the traveling-direction side of the AGV 400.
  • The AGV 400 also has a plurality of motors, which are not shown. FIG. 22 shows the front wheel 411a, the rear wheel 411b, and the rear wheel 411c; the front wheel 411d is hidden behind the frame 412 and therefore not clearly shown.
  • The travel control device 414 is a device that controls the operation of the AGV 400, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
  • The laser range finder 415 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 415a and detecting the reflected light of the laser light 415a.
  • The laser range finder 415 of the AGV 400 emits pulsed laser light 415a while changing its direction every 0.25 degrees within a range of 135 degrees to the left and right of the front of the AGV 400 (270 degrees in total), and detects the reflected light of each pulse.
  • Thereby, data of the distance to the reflection point can be obtained for each direction, at a total of 1080 steps of 0.25 degrees each.
  • The AGV 400 can create a map of the factory 500 based on its own position and attitude and the scan results of the laser range finder 415.
  • the map can reflect the arrangement of objects such as walls around the AGV, structures such as pillars, and shelves placed on the floor.
  • The map data is stored in a storage device provided in the AGV 400.
  • In general, the position and attitude of a moving body are called a pose.
  • The position and attitude of a moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and attitude of the AGV 400, that is, the pose (x, y, θ), may hereinafter be referred to simply as "position".
  • The position of the reflection point viewed from the emission position of the laser light 415a can be expressed using polar coordinates determined by an angle and a distance.
  • In this example, the laser range finder 415 outputs sensor data expressed in polar coordinates.
  • The laser range finder 415 may instead convert the positions expressed in polar coordinates into orthogonal coordinates and output the result.
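The polar-to-orthogonal conversion for the 270-degree, 0.25-degree-step scan described above can be sketched as follows (a minimal illustration; the function name, the sensor frame with x forward and y to the left, and the start angle are assumptions):

```python
import math

def scan_to_points(distances, start_deg=-135.0, step_deg=0.25):
    """Convert laser range finder output (one distance per angular step)
    from polar to orthogonal coordinates in the sensor frame.

    distances: range readings, one per step (e.g. 1080 values covering
    135 degrees to each side of the front direction).
    Returns a list of (x, y) reflection points.
    """
    points = []
    for i, r in enumerate(distances):
        a = math.radians(start_deg + i * step_deg)
        # Polar (r, a) -> orthogonal (x, y); x is the front direction.
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

The resulting point set is what gets matched against the map data (or accumulated into a map) in the positioning step described later.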
  • Examples of objects that can be detected by the laser range finder 415 are people, luggage, shelves, and walls.
  • The laser range finder 415 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
  • The travel control device 414 can estimate its current position by comparing the measurement results of the laser range finder 415 with the map data it holds.
  • The map data may be acquired by the AGV 400 itself using SLAM (Simultaneous Localization and Mapping) technology.
  • FIG. 23 shows the hardware configuration of AGV400.
  • FIG. 23 also shows a specific configuration of the travel control device 414.
  • The AGV 400 includes the travel control device 414, the laser range finder 415, two motors 416a and 416b, and a drive device 417.
  • The travel control device 414 includes a microcomputer 414a, a memory 414b, a storage device 414c, a communication circuit 414d, and a positioning device 414e.
  • The microcomputer 414a, the memory 414b, the storage device 414c, the communication circuit 414d, and the positioning device 414e are connected by a communication bus 414f and can exchange data with one another.
  • The laser range finder 415 is also connected to the communication bus 414f via a communication interface (not shown), and transmits its measurement data to the microcomputer 414a, the positioning device 414e, and/or the memory 414b.
  • The electronic device 300 is also connected to the communication bus 414f.
  • The electronic device 300 transmits information indicating the position of the beacon 10 as a target position to the microcomputer 414a via the communication bus 414f, and also outputs the image data from the imaging device 50.
  • The microcomputer 414a is a processor or control circuit (computer) that performs calculations for controlling the entire AGV 400 including the travel control device 414.
  • Typically, the microcomputer 414a is a semiconductor integrated circuit.
  • The microcomputer 414a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 417, thereby controlling the drive device 417 and adjusting the voltage applied to the motors.
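The relationship between the PWM signal and the motor voltage can be sketched as follows; a minimal illustration (function and parameter names are hypothetical) of the averaging effect of switching the supply on for a fraction of each PWM period:

```python
def average_motor_voltage(supply_v, duty):
    """Average voltage applied to a motor when the inverter switches the
    supply voltage on for a fraction `duty` of each PWM period.

    supply_v: supply voltage (V).
    duty: on-time fraction of the PWM period, in [0.0, 1.0].
    """
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty must be within [0, 1]")
    # Over one period, the motor sees supply_v for duty*T and 0 for the
    # rest, so the average applied voltage is simply supply_v * duty.
    return supply_v * duty
```

Varying the duty cycle of the PWM signal from the microcomputer thus varies the effective motor voltage, and with it the wheel speed.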
  • The memory 414b is a volatile storage device that stores a computer program executed by the microcomputer 414a.
  • The memory 414b can also be used as a work memory when the microcomputer 414a and the positioning device 414e perform calculations.
  • The storage device 414c is a nonvolatile semiconductor memory device.
  • However, the storage device 414c may instead be a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disk.
  • The storage device 414c may include a head device for writing and/or reading data on such a recording medium, and a control device for the head device.
  • The storage device 414c stores map data M of the factory 500 in which the AGV travels.
  • The map data M is created in advance and stored in the storage device 414c.
  • the AGV 400 can travel toward the position of the target beacon 10 while estimating its own position using the created map and the sensor data acquired during traveling from the laser range finder 415.
  • the positioning device 414e receives the sensor data from the laser range finder 415, and reads the map data M stored in the storage device 414c.
  • the positioning device 414e matches the local map data created from the scan results of the laser range finder 415 against the wider-range map data M to identify the self-position (x, y, θ) on the map data M.
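As one illustrative sketch (the disclosure leaves the matching method open), the self-position (x, y, θ) can be found by scoring candidate poses: transform the scan points of the laser range finder into the map frame under each candidate pose and count how many land on occupied map cells. The function name, brute-force search, and toy data below are assumptions made for illustration only.

```python
# Hypothetical sketch of matching local scan data against the map data M.
import math

def match_pose(scan_points, map_cells, candidates):
    """Return the candidate pose (x, y, theta) whose transformed scan
    points best overlap the occupied map cells (a set of (ix, iy))."""
    best_pose, best_score = None, -1
    for (x, y, theta) in candidates:
        c, s = math.cos(theta), math.sin(theta)
        score = 0
        for (px, py) in scan_points:
            wx = x + c * px - s * py  # rotate the scan point by theta
            wy = y + s * px + c * py  # and translate by the candidate pose
            if (round(wx), round(wy)) in map_cells:
                score += 1
        if score > best_score:
            best_pose, best_score = (x, y, theta), score
    return best_pose

# Toy map: a wall occupying cells (2,3) .. (5,3); the true pose is (2, 0, 0).
map_cells = {(ix, 3) for ix in range(2, 6)}
scan = [(0, 3), (1, 3), (2, 3), (3, 3)]  # wall as seen in the robot frame
candidates = [(0, 0, 0.0), (2, 0, 0.0), (2, 1, 0.0)]
print(match_pose(scan, map_cells, candidates))  # (2, 0, 0.0)
```

In practice the candidate set would come from odometry plus a local search window, and the score would use a probabilistic occupancy grid rather than a binary cell set.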
  • the microcomputer 414a and the positioning device 414e are shown as separate components, but this is only an example. They may be realized as a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both the microcomputer 414a and the positioning device 414e.
  • FIG. 23 shows a chip circuit 414g including the microcomputer 414a and the positioning device 414e.
  • the microcomputer 414a and the positioning device 414e are separately provided.
  • the two motors 416a and 416b are attached to the two wheels 411b and 411c, respectively, and rotate each wheel. That is, the two wheels 411b and 411c are drive wheels, respectively.
  • the motor 416a and the motor 416b are motors that drive the right wheel and the left wheel of the AGV 400, respectively.
  • the drive device 417 has motor drive circuits 417a and 417b for adjusting the voltage applied to each of the two motors 416a and 416b.
  • Each of the motor drive circuits 417a and 417b is a so-called inverter circuit; the current flowing through each motor is switched on and off by the PWM signal transmitted from the microcomputer 414a, thereby adjusting the voltage applied to the motor.
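The PWM-based voltage adjustment described above reduces to a simple duty-cycle relation: switching the supply on for a fraction D of each period yields an average motor voltage of D times the supply voltage. The class below is a minimal sketch under that assumption; the 24 V supply and the method names are illustrative, not taken from the disclosure.

```python
# Hypothetical model of an inverter-type motor drive circuit (417a/417b)
# whose PWM duty cycle sets the average voltage applied to the motor.

class MotorDriveCircuit:
    """Models an inverter circuit chopping a DC supply with a PWM signal."""

    def __init__(self, supply_voltage: float):
        self.supply_voltage = supply_voltage
        self.duty = 0.0  # fraction of each PWM period the switch is on

    def set_pwm_duty(self, duty: float) -> None:
        # Clamp to the physically meaningful range [0, 1].
        self.duty = max(0.0, min(1.0, duty))

    def average_motor_voltage(self) -> float:
        # On for `duty` of each period -> average voltage = duty * supply.
        return self.duty * self.supply_voltage


right = MotorDriveCircuit(supply_voltage=24.0)  # motor 416a (right wheel)
left = MotorDriveCircuit(supply_voltage=24.0)   # motor 416b (left wheel)
right.set_pwm_duty(0.75)
left.set_pwm_duty(0.50)
print(right.average_motor_voltage())  # 18.0
print(left.average_motor_voltage())   # 12.0
```

Driving the two duty cycles independently is what lets the differential-drive AGV steer by speed difference between the right and left wheels.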
  • the electronic device 300 estimates the arrival direction of the signal wave radiated from the beacon 10.
  • the AGV 400 may not be able to travel along a straight route toward the beacon. This is because, even when an obstacle exists between the AGV 400's own position and the beacon 10, the signal wave radiated from the beacon 10 can pass through the obstacle and reach the electronic device 300, so the estimated direction may point through the obstacle. Examples of obstacles are walls, pillars, and shelves.
  • FIG. 24 shows the relationship between the AGV 400 and the direction P in which the beacon 10 exists.
  • the beacon 10 exists on the back side of the shelf 510 in the drawing.
  • a shelf 510 exists on the direction P as viewed from the AGV 400.
  • the microcomputer 414a uses the map data M to determine whether or not there is an obstacle in the direction P of the beacon 10 viewed from its own position. When there is an obstacle, the microcomputer 414a uses the map data M to calculate a route to reach the target position while avoiding the obstacle. In the example of FIG. 24, the AGV 400 does not travel along the route D1, but travels along the route D2.
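The obstacle test described above can be sketched as a simple ray march over an occupancy grid standing in for the map data M: step along the direction P of the beacon and report whether a blocked cell lies on the straight route. The grid contents, step size, and function name are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch: is there an obstacle along direction P on an occupancy grid?
import math

def obstacle_on_direction(grid, start, direction_rad, max_range, step=0.1):
    """Return True if any occupied cell lies along `direction_rad`
    from `start` within `max_range` (a simple ray march)."""
    x, y = start
    dx, dy = math.cos(direction_rad), math.sin(direction_rad)
    dist = 0.0
    while dist <= max_range:
        cx, cy = int(round(x)), int(round(y))
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]) and grid[cy][cx]:
            return True
        x += dx * step
        y += dy * step
        dist += step
    return False

# 0 = free, 1 = occupied (e.g. the shelf 510 between the AGV and the beacon).
grid = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],  # shelf at cell (2, 1)
    [0, 0, 0, 0, 0],
]
print(obstacle_on_direction(grid, (0, 1), 0.0, 4.0))  # True: shelf blocks P
print(obstacle_on_direction(grid, (0, 0), 0.0, 4.0))  # False: row 0 is free
```

When the test returns True, a planner (e.g. A* over the same grid) would compute a detour such as route D2 instead of the blocked straight route D1.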
  • FIG. 25 shows the relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists.
  • the direction P in which the beacon 10 exists changes.
  • the AGV 400 uses the map data M to travel along a route D3 that avoids the shelf 510. It can thereby reach the position of the beacon 10 while avoiding the shelf 510.
  • the microcomputer 414a may additionally determine the presence or absence of an obstacle from the image data received from the electronic device 300. If a stereo camera is employed as the imaging device 50, the position of the obstacle can be recognized with higher accuracy.
  • FIG. 26 shows a configuration example of the search system 1200.
  • Search system 1200 includes beacon 10, multicopter 600, PC 702 and display device 704 in search and rescue center facility 700.
  • the beacon 10 is carried by, for example, a person enjoying leisure activities in a mountainous area, at sea, or on a river.
  • the multicopter 600 is equipped with the electronic device 300.
  • the multicopter 600 detects a signal wave radiated from the beacon 10 of a person requiring rescue while flying over, for example, a mountainous area, the sea, or a river, and estimates its direction of arrival.
  • the imaging device 50 of the electronic device 300 starts outputting image data.
  • the signal processing circuit 30 of the electronic device 300 transmits a video signal obtained by adding information indicating the arrival direction to the image data via the communication circuit 40.
  • the multicopter 600 has a GPS module and acquires position information of the multicopter 600.
  • the signal processing circuit 30 also transmits the position information of the multicopter 600 via the communication circuit 40.
  • the video signal and the position information are transmitted to the search and rescue center facility 700.
  • the staff uses the PC 702 to reproduce the video signal on the display device 704.
  • using the position information, the staff can identify the position of the multicopter 600 at that time and the direction, as seen from that position, in which the person requiring rescue is present. Since the estimated position of the beacon 10 is displayed on the display device 704, the position is easy to grasp, and rescue activities such as organizing and dispatching a search team can be started quickly.
  • FIG. 27 is an external perspective view of an exemplary multicopter 600 according to the present disclosure.
  • FIG. 28 is a side view of the multicopter 600.
  • An electronic device 300 is attached to the lower part of the central housing 602 of the multicopter 600 via a driving device 302.
  • the driving device 302 can rotate the array antenna 20 and the imaging device 50 together about an axis parallel to the vertical direction of the multicopter 600 (yaw axis), an axis parallel to the lateral direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
  • since the general structure of the multicopter 600 is well known, its description is omitted.
  • the electronic apparatus 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B by using the detection value of the motion sensor 80.
  • display position correction processing using changes in the image data may also be performed. This is because, as with camera shake, display misalignment may occur while the multicopter 600 is flying. By correcting the display position, the staff of the search/rescue center facility 700 can find the person requiring rescue more easily and reliably.
  • the mobile body of the present disclosure includes an apparatus, a device, or an article including a beacon, or an electronic device that guides a person at or near the position where the beacon is arranged.
  • the mobile body can be a mobile robot, an automated guided vehicle, a drone, an automobile, or the like.
  • 10 beacon (tag)
  • 20 array antenna, 30 signal processing circuit
  • 32 storage device (memory)
  • 40 communication circuit, 50 imaging device, 52 lens, 54 image sensor, 80 motion sensor, 82 tactile device, 82a vibration motor, 82b motor drive circuit
  • 100, 110, 120, 130 mobile device, 200 vehicle, 300 electronic device, 302 drive device, 310 display device, 400 AGV, 600 multicopter, 1000 dispatch system, 1100 transport system, 1200 search system

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

This electronic device (300) includes: an imaging device (50) that outputs image data; an array antenna (20) having multiple antenna elements (22) that receive a signal wave emitted periodically or intermittently from a beacon (10); and a signal processing circuit (30) that estimates the direction of arrival of the signal wave on the basis of a signal output from the array antenna (20) and determines coordinates defining that direction of arrival. The signal processing circuit (30) outputs a video signal in which information indicating the direction of arrival has been added to the image data.

Description

Mobile body provided with an array antenna, and vehicle dispatch system

The present application relates to a mobile body provided with a movable array antenna, and to a vehicle dispatch system.

Indoor positioning systems, which estimate the position of a mobile terminal or similar device in environments such as indoors where satellite radio waves cannot be received, are under active development. For example, when a beacon built into a mobile terminal radiates a signal wave (an electromagnetic wave such as a microwave or millimeter wave), the position of the mobile terminal can be estimated by receiving this signal wave with a plurality of array antennas fixed in the environment.

A single array antenna can estimate the direction of the beacon radiating the electromagnetic wave, that is, the direction of arrival of the signal wave. However, it cannot determine the exact distance from the array antenna to the beacon. Therefore, to estimate the position of the beacon accurately, it is necessary to use a plurality of array antennas placed at different positions and to perform a geometric calculation from the directions of arrival of the signal wave as seen from each array antenna.

Japanese Patent Application Laid-Open No. 2007-19828 discloses a technique for estimating the direction of an electromagnetic wave radiation source with a single array antenna and displaying the estimated position within an image acquired by a camera. With such a technique, the direction or position of the radio wave source can be estimated with reference to the arrangement of buildings and other objects appearing in the image acquired by the camera.

Japanese Patent Laid-Open No. 2007-19828

All of these conventional techniques presuppose that the array antenna is substantially fixed with respect to the environment.

Embodiments of the present disclosure provide a mobile body provided with an array antenna that can receive a signal wave radiated from a beacon while, for example, the user freely moves the position of the array antenna, and estimate the direction of arrival of the signal wave, as well as a vehicle dispatch system having a beacon and such a mobile body.

In an exemplary, non-limiting embodiment, a mobile body of the present disclosure includes: an imaging device that outputs image data; an array antenna having a plurality of antenna elements that receive a signal wave radiated periodically or intermittently from a beacon; and a signal processing circuit that estimates the direction of arrival of the signal wave based on a signal output from the array antenna and determines coordinates defining the direction of arrival. The signal processing circuit outputs a video signal in which information indicating the direction of arrival has been added to the image data.

In an exemplary, non-limiting embodiment, a vehicle dispatch system of the present disclosure includes a plurality of beacons and a plurality of vehicles. Each vehicle includes: an imaging device that outputs image data; an array antenna having a plurality of antenna elements that receive a signal wave radiated periodically or intermittently from any of the plurality of beacons, the signal wave including additional information having identification information on the beacon or on a person carrying the beacon; a signal processing circuit that estimates the direction of arrival of the signal wave based on a signal output from the array antenna and determines coordinates defining the direction of arrival; and a communication circuit that acquires the additional information from the signal wave. The signal processing circuit outputs a video signal in which information indicating the direction of arrival has been added to the image data. The vehicle dispatch system acquires position information of the beacon or of the person carrying the beacon and transmits it to the vehicle.

According to embodiments of the mobile body of the present disclosure, a signal wave radiated periodically or intermittently from a beacon can be received while the user freely moves the position of the array antenna, and the direction of arrival of the signal wave can be estimated. The mobile body has an imaging device that outputs image data, and it can output a video signal in which information indicating the direction of arrival has been added to the image data.

FIG. 1 is a front view showing a basic configuration example of a mobile device according to an embodiment of the present disclosure.
FIG. 2 is a side view showing a basic configuration example of the mobile device according to the embodiment of the present disclosure.
FIG. 3 is a perspective view showing a basic configuration example of the mobile device according to the embodiment of the present disclosure.
FIG. 4 is a hardware block diagram of the mobile device.
FIG. 5 is a diagram illustrating a coordinate system based on the imaging device.
FIG. 6 is a diagram illustrating the two-dimensional coordinates uv spanned on the image plane SI of the imaging device.
FIG. 7 is a diagram schematically showing the image plane SA of the array antenna in the coordinate system based on the imaging device.
FIG. 8 is a diagram illustrating the angles φ and θ that define the estimated direction of arrival of the signal wave.
FIG. 9 is a diagram schematically showing a luggage rack in a warehouse and an image of the luggage rack displayed on the display device.
FIG. 10A is a diagram illustrating an example of a change in the attitude of the mobile device.
FIG. 10B is a diagram illustrating an example of a change in the attitude of the mobile device.
FIG. 11A is a diagram illustrating a dot and identification information indicating the position of a beacon displayed on the display device of the mobile device.
FIG. 11B is a diagram showing the dot and identification information with corrected position coordinates.
FIG. 11C is a diagram illustrating an example of display misalignment when correction processing is not performed.
FIG. 12 is a flowchart illustrating an example of a beacon position display process.
FIG. 13 is a hardware block diagram of a mobile device according to a second example.
FIG. 14 is a hardware block diagram of a mobile device according to a third example.
FIG. 15 is a hardware block diagram of a mobile device according to a fourth example.
FIG. 16 is a schematic diagram for explaining a vehicle dispatch system 1000 including a plurality of beacons 10 and a plurality of vehicles 200.
FIG. 17 is an external view of the vehicle 200.
FIG. 18 is a diagram illustrating the internal configuration of the electronic device 300 connected to a display device 310.
FIG. 19 is a schematic diagram showing the smartphone 240 of a passenger 230 and the electronic device 300 after a connection has been established.
FIG. 20 is a diagram showing a display example of the display device 310.
FIG. 21 is a diagram showing a transport system 1100 having a plurality of AGVs 400, each carrying the electronic device 300 (FIG. 18).
FIG. 22 is an external view of an exemplary AGV 400.
FIG. 23 is a diagram illustrating the hardware configuration of the AGV 400.
FIG. 24 is a diagram illustrating the relationship between the AGV 400 and the direction P in which the beacon 10 exists.
FIG. 25 is a diagram illustrating the relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists.
FIG. 26 is a diagram illustrating a configuration example of the search system 1200.
FIG. 27 is an external perspective view of an exemplary multicopter 600.
FIG. 28 is a side view of the multicopter 600.

Embodiments of the present disclosure will be described below. Descriptions that are more detailed than necessary may be omitted; for example, detailed descriptions of matters that are already well known and redundant descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy in the following description and to facilitate understanding by those skilled in the art. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure; they are not intended to limit the subject matter recited in the claims.

<Example of basic configuration>
First, reference is made to FIGS. 1 to 3. FIGS. 1 and 2 are, respectively, a front view and a side view showing a basic configuration example of a mobile device according to an embodiment of the present disclosure, and FIG. 3 is a perspective view.

The mobile device 100 in this embodiment is a device provided with a portable array antenna 20 and has a shape and size that make it easy to carry. The array antenna 20 has a plurality of antenna elements 22 that receive a signal wave radiated periodically or intermittently from the beacon 10 shown in FIGS. 2 and 3. The beacon 10 is also called a tag. In this embodiment, the beacon 10 radiates a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard. However, the beacon 10 may be a device that operates according to another standard. The frequency of the signal wave is, for example, in the microwave or millimeter-wave band. The beacon 10 radiates a signal wave in the 2.4 GHz band at time intervals of, for example, 10 milliseconds to 200 milliseconds, typically 100 milliseconds. The frequency of the signal wave need not be constant as long as it can be received by the array antenna 20, and may hop among a plurality of frequencies.

In FIG. 2, the signal wave radiated from the beacon 10 is depicted schematically, but no particular directivity is given to the actual radiation of the signal wave from the beacon 10. The radiation of the signal wave by the beacon 10 is desirably isotropic, but it may have anisotropy depending on the antenna of the beacon 10.

The signal wave radiated by the beacon 10 may include additional information having identification information on the beacon 10 or on the person carrying the beacon 10. Examples of the additional information are a beacon ID and/or the ID of the owner of the beacon 10. The beacon 10 may incorporate various sensors such as a heart rate monitor, a thermometer, an altimeter, and/or an acceleration sensor. The beacon 10 may also be electrically connected to various external sensors. In such cases, the beacon 10 can include in the radiated signal wave the various measurement values acquired by these sensors. A typical example of the beacon 10 incorporates an antenna for radiating the signal wave, a high-frequency circuit, a battery for driving the high-frequency circuit, and a processor that controls their operation. The beacon 10 may be carried directly by a person or built into an electronic device such as a smartphone. The beacon 10 may also be used attached to a distributed article, case, or the like, like a general IC tag.
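As a purely hypothetical illustration of such additional information, the sketch below packs a beacon ID, an owner ID, and one sensor reading (a temperature) into a compact byte payload of the kind a beacon advertisement might carry. The field layout, sizes, and scaling are assumptions made for illustration, not a format defined by the disclosure or by the Bluetooth standard.

```python
# Hypothetical payload layout for the beacon's additional information.
import struct

FMT = "<HHh"  # beacon ID (uint16), owner ID (uint16), temperature x100 (int16)

def pack_payload(beacon_id, owner_id, temperature_c):
    """Pack the fields little-endian into a 6-byte payload."""
    return struct.pack(FMT, beacon_id, owner_id, int(round(temperature_c * 100)))

def unpack_payload(payload):
    """Inverse of pack_payload; restores the temperature in degrees C."""
    beacon_id, owner_id, temp_x100 = struct.unpack(FMT, payload)
    return beacon_id, owner_id, temp_x100 / 100.0

data = pack_payload(beacon_id=42, owner_id=7, temperature_c=36.5)
print(len(data))             # 6 bytes
print(unpack_payload(data))  # (42, 7, 36.5)
```

A real BLE advertisement would wrap such bytes in the standard AD-structure framing (length, AD type, data); the point here is only that identification and sensor values fit naturally in a few bytes per broadcast.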

In this embodiment, the diameter of the array antenna 20 is, for example, about 20 centimeters, and the antenna includes seven antenna elements 22 arranged two-dimensionally in a plane. The weight of the array antenna 20 is, for example, about 500 grams.

The configuration and size of the array antenna 20 are not limited to this example as long as a person can carry it with one or both hands. The external shape of the array antenna 20 viewed from the front also need not be circular; it may be an ellipse, a rectangle, a polygon, a star, or another shape. The number of antenna elements 22 may be eight or more, or may be in the range of three to six. The antenna elements 22 in this example are arranged in a plane extending in both the horizontal direction and the vertical direction in the drawing. Specifically, six antenna elements 22 are arranged concentrically at equal intervals around one antenna element 22 located at the center of the array antenna 20. This arrangement is merely an example. The antenna elements 22 may, for example, be arranged in a straight line along the horizontal direction. When a plurality of antenna elements 22 are arranged linearly along one direction, the direction of arrival of the signal wave cannot be estimated with respect to directions crossing that direction. In the illustrated example, the direction of arrival can be estimated in both the horizontal and vertical directions.

The array antenna 20 may incorporate a high-frequency circuit, such as a monolithic microwave integrated circuit (not shown), and an AD conversion circuit. Instead of being provided in the array antenna 20, such circuits may be connected between the array antenna 20 and the signal processing circuit 30 described later.

The mobile device 100 in this embodiment includes a signal processing circuit 30 that estimates the direction of arrival of the signal wave based on a signal output from the array antenna 20 and determines coordinates defining the direction of arrival of the signal wave. The signal processing circuit 30 is configured to execute an array signal processing algorithm to estimate the direction of arrival of the signal wave. The array signal processing algorithm is arbitrary. The direction of arrival of the signal wave may be referred to as DOA (Direction Of Arrival) or AOA (Angle Of Arrival).
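Since the text leaves the array signal processing algorithm open, the sketch below shows one common illustrative choice: delay-and-sum (Bartlett) beamforming for a single noiseless wave on a 7-element uniform linear array with half-wavelength spacing. The linear geometry here is a simplification (the embodiment above uses a concentric layout), and all values are synthetic.

```python
# Illustrative DOA estimation by delay-and-sum beamforming on a 7-element
# uniform linear array with half-wavelength element spacing.
import cmath, math

def steering_vector(n_elements, angle_rad):
    # Phase progression across elements spaced d = lambda/2 apart:
    # phi_k = pi * k * sin(angle).
    return [cmath.exp(1j * math.pi * k * math.sin(angle_rad))
            for k in range(n_elements)]

def estimate_doa(snapshot, angle_grid):
    """Return the candidate angle whose steering vector best correlates
    with one received snapshot (list of complex element outputs)."""
    def power(angle):
        a = steering_vector(len(snapshot), angle)
        return abs(sum(x * v.conjugate() for x, v in zip(snapshot, a)))
    return max(angle_grid, key=power)

true_angle = math.radians(25)
snapshot = steering_vector(7, true_angle)  # noiseless wave arriving from 25 deg
angle_grid = [math.radians(a) for a in range(-90, 91)]
print(round(math.degrees(estimate_doa(snapshot, angle_grid))))  # 25
```

With noisy multi-snapshot data, subspace methods such as MUSIC or ESPRIT are often used instead, and a planar (2-D) array, as in the embodiment, yields both azimuth and elevation.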

In the illustrated example, the signal processing circuit 30 is disposed inside the housing of a base 72. The base 72 is coupled to one end of a columnar grip 70. The grip 70 supports the array antenna 20 via a fixture 74. The grip 70 has a shape and size (length and diameter) suitable for being gripped by a human hand. When the grip 70 is not being held by a person, in other words, when the mobile device 100 is not being carried by the user, the mobile device 100 may be placed on a fixed object such as a table or floor with the base 72 in contact with the object. The mobile device 100 can also be used mounted on or attached to a mobile body such as a mobile robot, automated guided vehicle, drone, or automobile.

The mobile device 100 of this embodiment includes a communication circuit 40 that acquires the above-described additional information from the signal wave of the beacon 10. Therefore, the mobile device 100 can also operate as a "handy scanner" that wirelessly reads, without contact, the signal or data emitted by the beacon 10. In the illustrated example, the communication circuit 40 is disposed inside the housing of the base 72. The communication circuit 40 may transmit and receive other signal waves through an antenna (not shown) other than the array antenna 20, and may be connected to a telephone line or the Internet.

The signal processing circuit 30 and the communication circuit 40 are realized by one or more semiconductor integrated circuits. The signal processing circuit 30 may be referred to as a CPU (Central Processing Unit) or a computer. The signal processing circuit 30 can be realized by a circuit including a computer, such as a general-purpose microcontroller or a digital signal processor, and a memory storing a computer program that causes the computer to execute various instructions. The signal processing circuit 30 may include registers, a cache memory, and/or buffers (not shown).

<Imaging device>
The mobile device 100 in this embodiment includes an imaging device 50 that outputs image data. As shown in FIGS. 1 and 2, the imaging device 50 includes a lens 52 and an image sensor 54. A typical example of the imaging device 50 is a digital camera or a digital movie camera. As shown in FIG. 2, the relative arrangement of the imaging device 50 and the array antenna 20 is fixed. In a preferred example, the optical axis (camera axis) of the imaging device 50, that is, the Z axis, is parallel to the Za axis of the array antenna 20. "Parallel" in this specification need not be mathematically strictly parallel; some misalignment is allowed.

When the user of the mobile device 100 holds the grip 70 and changes the position or attitude of the array antenna 20, the position or attitude of the imaging device 50 changes together with the array antenna 20. The user can point the array antenna 20 in an arbitrary direction.

According to the mobile device 100 of this embodiment, the signal processing circuit 30 outputs a video signal in which information indicating the direction of arrival has been added to the image data.
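How "information indicating the direction of arrival" can be placed in the image is sketched below with a standard pinhole-camera projection: the estimated direction (azimuth φ and elevation θ about the shared Z axis) is converted to the pixel (u, v) at which the mark would be drawn. The focal lengths and principal point are assumed values, not parameters given in the disclosure.

```python
# Minimal pinhole-camera sketch: map the estimated arrival direction
# (azimuth phi, elevation theta; camera axis = +Z) to pixel coordinates.
import math

def direction_to_pixel(phi_rad, theta_rad, fx, fy, cx, cy):
    """Project the unit direction given by azimuth/elevation onto the
    image plane of a pinhole camera looking along +Z."""
    x = math.cos(theta_rad) * math.sin(phi_rad)
    y = math.sin(theta_rad)
    z = math.cos(theta_rad) * math.cos(phi_rad)
    u = cx + fx * x / z
    v = cy - fy * y / z  # image v grows downward
    return u, v

# A beacon straight ahead (phi = theta = 0) maps to the image centre.
print(direction_to_pixel(0.0, 0.0, fx=800, fy=800, cx=640, cy=360))
# (640.0, 360.0)
```

The mark (dot, circle, cross, etc.) would then be overlaid at (u, v) when compositing the video signal; directions behind the camera (z ≤ 0) fall outside the field of view and would simply not be drawn.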

<Display device>
The mobile device 100 of this embodiment includes a display device 60 that displays the image data. As shown in FIG. 2, the display device 60 in this embodiment is supported by the grip 70 via an angle adjustment device 76. Based on the video signal output from the signal processing circuit 30, the display device 60 can display the image data together with the information indicating the direction of arrival. The information indicating the direction of arrival includes a mark, such as a point or a line, displayed at the estimated position coordinates of the direction of arrival within the image defined by the image data. Typical examples of such marks are figures such as dots, circles, crosses, and arrows, as well as symbols, letters, numbers, or combinations thereof. The signal processing circuit 30 may cause the display device 60 to display some or all of the above-described additional information.

The display device 60 may be a liquid crystal display, an OLED display, or any of various flexible displays. The display device 60 may also be a projector that projects an image onto the surface of an object such as a wall or a desktop. The display device 60 incorporates a driver circuit, a memory circuit, an input/output interface circuit, and the like (not shown). All or part of the signal processing circuit 30, the storage device 32, and the communication circuit 40 may be mounted on the same printed circuit board as the driver circuit of the display device 60 and other components.

The mobile device 100 may take the form of a smartphone, a tablet terminal, or a laptop computer provided with the array antenna 20. The mobile device of the present disclosure may also take the form of a wearable device such as a wristwatch or a head-mounted display.

Reference is now made to FIG. 4, which is a hardware block diagram of the mobile device.

As shown in FIG. 4, the mobile device 100 according to this embodiment includes various components. The components are connected by an internal bus 34, and each component can exchange data with the other components.

The mobile device 100 includes a storage device 32. The storage device 32 includes a random access memory (RAM) 32a, a read-only memory (ROM) 32b, and a storage 32c. The RAM 32a is a volatile memory that can be used as a work memory when the signal processing circuit 30 performs operations. The ROM 32b is a non-volatile memory that stores, for example, a computer program. The storage 32c is a non-volatile memory that holds information acquired by the mobile device 100. An example of such information is registration information, described later, that includes identification information of the beacon 10 or of a user who carries the beacon 10. In the present disclosure, the storage device 32 may be referred to simply as a "memory".

The signal processing circuit 30 reads the computer program stored in the ROM 32b, loads it into the RAM 32a, and reads and executes the instructions constituting the computer program from the RAM 32a. This allows the signal processing circuit 30 to realize the operations of the mobile device 100 described later.

The signal processing circuit 30 performs processing for displaying video on the display device 60. The signal processing circuit 30 receives video (moving images) output from the imaging device 50 described later, adjusts, for example, the luminance of the video or reduces its noise, and outputs the video to the display device 60. The signal processing circuit 30 also displays an image indicating the position of the beacon 10, for example an icon, at the corresponding position coordinates on the display device 60. Note that the processing of displaying an image on the display device 60 may be performed by a circuit other than the signal processing circuit 30, for example an image processing circuit.

The mobile device 100 in this embodiment includes a motion sensor 80. An example of the motion sensor 80 is a gyro sensor. In the present disclosure, a three-axis gyro sensor is employed as the motion sensor 80. The motion sensor 80 detects angular velocities about three mutually orthogonal axes. When the mobile device 100 is placed on a horizontal plane, the three axes are typically an axis parallel to the vertical direction (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis). The motion sensor 80 outputs detected angular velocity values about each axis, for example, every 1 millisecond. The mobile device 100 may also include an acceleration sensor.

The signal processing circuit 30 can obtain the rotation angle about each axis by time-integrating the detected angular velocity values about each axis output from the three-axis gyro sensor.
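As a rough illustration of this time integration for one axis, the following sketch uses rectangular integration of the periodic angular-velocity samples; the function and variable names are hypothetical and not taken from the patent:

```python
def integrate_angular_velocity(samples_rad_s, dt_s):
    """Approximate the rotation angle (radians) about one axis by
    rectangular integration of angular-velocity samples taken every
    dt_s seconds, as the signal processing circuit 30 is described
    as doing with the three-axis gyro sensor output."""
    return sum(w * dt_s for w in samples_rad_s)

# 100 samples of 0.2 rad/s at 1 ms intervals give a rotation of 0.02 rad:
angle = integrate_angular_velocity([0.2] * 100, 0.001)
```

A real implementation would integrate continuously per axis and reset the accumulated angle whenever a fresh direction-of-arrival estimate arrives, as described later in the correction section.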

<Image composition>
With reference to FIGS. 5 to 8, an example of a method for combining the image data acquired and output by the imaging device 50 with the information indicating the direction of arrival estimated using the array antenna 20 will be described.

FIG. 5 shows a coordinate system based on the imaging device 50. The XYZ coordinates shown in the figure are so-called camera coordinates (right-handed system). The origin O of these coordinates is the optical center (principal point) of the imaging device 50, and the Z-axis is the camera's optical axis. A point P (X, Y, Z) is assumed to indicate the position of the beacon 10.

FIG. 5 also shows the image plane SC of the imaging device 50. The image plane SC is separated from the origin O by the focal length f in the Z-axis direction. The two-dimensional coordinates xy spanned on the image plane SC are the coordinates of the camera image. The point on the image plane SC corresponding to the point P (X, Y, Z) is determined by a perspective projection transformation based on the pinhole camera model. When the beacon 10 is visible from the imaging device 50, the beacon 10 is observed at the position of the point Mc (x, y) on the image plane SC.

(x, y) can be obtained by transforming (X, Y, Z) by perspective projection. Specifically, x = (f/Z)·X and y = (f/Z)·Y.
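The projection x = (f/Z)·X, y = (f/Z)·Y can be sketched in code as follows; this is illustrative only, and the example values are invented:

```python
def project_to_image_plane(X, Y, Z, f):
    """Perspective projection of a camera-frame point (X, Y, Z) onto the
    image plane SC at focal length f: x = (f/Z)*X, y = (f/Z)*Y."""
    if Z <= 0:
        raise ValueError("the point must lie in front of the camera (Z > 0)")
    return (f / Z) * X, (f / Z) * Y

# A beacon 2 m ahead and 0.5 m to the right, with f = 0.05 m (a 50 mm lens):
x, y = project_to_image_plane(0.5, 0.0, 2.0, 0.05)  # x = 0.0125, y = 0.0
```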

FIG. 6 shows two-dimensional coordinates uv spanned on the image plane SI of the imaging device 50. The image plane SI corresponds to the pixel region of the image sensor 54. The two-dimensional coordinates uv are in units of pixels. The coordinates (u0, v0) are the intersection of the Z-axis with the image plane SI. The point Mc (u, v) can then be obtained by transforming (X, Y, Z) by perspective projection. A basic example of the matrix defining this perspective transformation, in the standard pinhole-camera form, is shown as Equation 1 below.

  s·(u, v, 1)ᵀ = [[α, 0, u0], [0, β, v0], [0, 0, 1]]·(X, Y, Z)ᵀ  (Equation 1)

so that u = α·X/Z + u0 and v = β·Y/Z + v0.

In Equation 1, α and β are internal parameters of the imaging device 50; specifically, they are determined by the focal length f of the lens, the pixel size of the image sensor 54, and so on.
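A sketch of the pixel-coordinate projection, assuming the standard pinhole form u = α·X/Z + u0, v = β·Y/Z + v0 for Equation 1; the parameter values below are made up for illustration:

```python
def project_to_pixels(X, Y, Z, alpha, beta, u0, v0):
    """Map a camera-frame point (X, Y, Z) to pixel coordinates (u, v) on
    the image plane SI, using the internal parameters alpha and beta and
    the principal point (u0, v0)."""
    return alpha * X / Z + u0, beta * Y / Z + v0

# Example with alpha = beta = 800 px and principal point (320, 240):
u, v = project_to_pixels(0.5, 0.25, 2.0, 800.0, 800.0, 320.0, 240.0)
# u = 800*0.25 + 320 = 520, v = 800*0.125 + 240 = 340
```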

FIG. 7 is a diagram schematically showing the image plane SA of the array antenna 20 in the coordinate system based on the imaging device 50. The point oa is the center of the array antenna 20, and the Za-axis is the central axis of the array antenna 20. In this example, the point oa is at a position shifted by a distance d in the positive direction of the Y-axis. The distance d is, for example, about 10 centimeters, but may be 10 centimeters or less.

The distance from the center oa of the array antenna 20 to the position P of the beacon 10 is unknown. By using various known array signal processing techniques, the direction from the center oa of the array antenna 20 toward the position P of the beacon 10 (the direction of arrival of the signal wave) can be estimated. The estimated direction of arrival is defined, for example, by the angles φ and θ shown in FIG. 8. The angle parameters defining the direction of arrival of the signal wave are not limited to the angles φ and θ. For example, the direction of arrival can also be expressed using a tilt angle θ1 from the Za-axis toward the X-axis direction and a tilt angle θ2 from the Za-axis toward the Y-axis direction. In some applications, only the tilt angle θ1 from the Za-axis toward the X-axis direction may need to be estimated as the direction of arrival of the signal wave. In such a case, since the direction of arrival is defined only by θ1, the antenna elements 22 constituting the array antenna 20 can be arranged on a straight line parallel to the X-axis. When an embodiment of the mobile device according to the present disclosure is realized by a miniaturized device such as a smartphone, for example, three or four antenna elements 22 may be arranged on a straight line or a curve along a direction parallel to the long side of the housing.

The image plane SA of the array antenna 20 can be virtually set at a position shifted from the point oa in the positive direction of the Za-axis by, for example, 1 meter. In this sense, the image plane SA is a virtual screen. The image plane SA is, for example, a rectangle 4 meters high and 4 meters wide. A point Ma (xa, ya) on this image plane SA is determined by the angles φ and θ. The coordinates on this virtual screen can be associated with coordinates on the image plane SC or SI of the imaging device 50 by scaling using the Za-coordinate value of the screen.

The point Ma (xa, ya) on the image plane SA corresponds to the X and Y components of the coordinates (X, Y, Z) assuming that the beacon 10 on the line segment oa-P is located 1 meter in front of the array antenna 20. However, there is a difference of the distance d between the center of the image plane SA and the center of the image plane SC. By correcting for this difference, the position of the beacon 10 in camera coordinates, and hence its position coordinates on the image plane SC and the image plane SI, can be calculated.
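The chain described above — direction of arrival, to a point on the virtual screen 1 meter ahead, to camera coordinates with the offset d corrected, to pixels — might be sketched as follows. The tilt-angle convention (θ1 toward X, θ2 toward Y), the sign of the d correction, and all names are assumptions for illustration only:

```python
import math

def beacon_pixel_from_doa(theta1, theta2, d, alpha, beta, u0, v0):
    """From tilt angles theta1 (toward X) and theta2 (toward Y), both in
    radians from the Za-axis, compute the beacon's pixel coordinates:
    1) place the beacon on the virtual screen SA 1 m ahead of the antenna,
    2) shift by the antenna-to-camera offset d (metres, along Y),
    3) apply the pinhole projection to pixel coordinates."""
    xa = math.tan(theta1)          # point Ma on the virtual screen (Za = 1 m)
    ya = math.tan(theta2)
    X, Y, Z = xa, ya + d, 1.0      # the same point in camera coordinates
    return alpha * X / Z + u0, beta * Y / Z + v0

# A beacon on the antenna axis appears offset from the camera axis by d:
u, v = beacon_pixel_from_doa(0.0, 0.0, 0.05, 800.0, 800.0, 320.0, 240.0)
# u = 320.0, v = 240 + 800*0.05 = 280.0
```

For beacons much farther away than 1 meter, the parallax term d becomes negligible relative to the scene, which is the approximation discussed in the following paragraph.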

When the beacon 10 is actually located farther away than the distance (1 meter) of the image plane SA, which is a virtual screen, the line segment O-P and the line segment oa-P may be approximated as parallel. In such a case, the position of the beacon 10 on the image plane SC calculated by the above method almost coincides with the position of the point Mc. However, the closer the actual position P of the beacon 10 is to the mobile device 100, the more the angle between the line segment O-P and the line segment oa-P deviates from parallel; thus, the position of the beacon 10 on the image plane SC calculated from the coordinates of the point Ma on the image plane SA shifts in the vertical direction from the position of the point Mc. The accuracy of the direction of the beacon 10 is preferably higher in the horizontal direction than in the vertical direction. This is because the movable range of the beacon 10, or of a person carrying the beacon 10, tends to be constrained to a plane substantially parallel to the horizontal plane. For this reason, the imaging device 50 and the array antenna 20 are preferably arranged in a vertical relationship, as in this embodiment.

Note that the center of the imaging device 50 and the center of the array antenna 20 may coincide with each other. In other words, the Z-axis and the Za-axis may coincide. If such an arrangement is realized, the estimated direction of the beacon 10 can be superimposed on the image plane SC of the imaging device 50 with simpler calculations.

FIG. 9 is a diagram schematically showing a luggage rack 200 in a warehouse and an image of the luggage rack 200 displayed on the display device 60. On the display device 60, a dot 90 indicating the position of the beacon 10 and identification information 92 acquired from the additional information included in the signal wave emitted by the beacon 10 are displayed. Based on such an image, the package carrying the beacon 10 can be identified. For simplicity, a case with a single beacon 10 is illustrated, but each of a plurality of packages may have its own beacon 10. Since the identification information 92 is unique to each beacon 10, a plurality of packages can be appropriately distinguished based on the identification information 92. Additional information other than the identification information may be selectively displayed in the image or may be hidden. Alternatively, only beacons 10 having specific identification information 92 may be displayed on the display device 60.

Members that transmit electromagnetic waves are substantially transparent to the signal wave radiated from the beacon 10. Walls and the like formed mainly of insulating materials transmit electromagnetic waves. For example, when searching for a person carrying a specific beacon 10 using the mobile device 100, the signal wave emitted by the beacon 10 can be received and the direction of arrival of the signal wave can be known even if the person is located behind a wall. When searching for a person carrying the beacon 10 inside a building, it suffices to find an orientation in which the incoming wave can be detected while changing the orientation of the mobile device 100. Once the incoming wave has been detected and its direction of arrival estimated, moving while keeping the mobile device 100 pointed in the direction of arrival ultimately makes it possible to reach the person carrying the beacon 10. Even when many people or objects each have a beacon 10, the intended person or object can be found from the identification information included in the signal wave emitted by each beacon 10.

The use of the mobile device according to the present disclosure is not limited to indoor settings; it may also be used outdoors. If a person carrying the beacon 10 is in distress, the mobile device according to the present disclosure can be used to promptly rescue the victim.

<Correction of display position 1 (correction of misalignment due to camera shake, etc.)>
As described above, a signal wave is intermittently emitted from the beacon 10. For this reason, the estimated value of the direction of arrival based on the signal wave is also calculated intermittently. When signal waves are radiated from the beacon 10 at intervals of 100 milliseconds, the estimated value of the arrival direction is updated at intervals of 100 milliseconds. At least one of the position and posture of the mobile device 100 may change due to camera shake or the like. At this time, since the image data of each frame is acquired at an interval shorter than 100 milliseconds (for example, an interval of about 8 to 16 milliseconds), it can be updated at a higher frequency than the estimated value of the arrival direction. Due to such a difference in data update, a “display position shift” may occur.

When the latest signal wave radiated from the beacon 10 is received, the signal processing circuit 30 determines the position coordinates of the information indicating the direction of arrival to be displayed on the display device 60, from the direction of arrival estimated based on the signals output from the array antenna 20. In parallel with this position-coordinate determination processing, when the signal processing circuit 30 detects camera shake or the like, it corrects the signals so as to compensate for the influence. Examples of such signal correction are described below.

The first correction process uses the output of the motion sensor 80.

FIGS. 10A and 10B show an example of a change in the attitude of the mobile device 100 caused by an intentional motion of the user or by camera shake. Assume that the attitude of the mobile device 100 changes from the state shown in FIG. 10A to the state shown in FIG. 10B within a period shorter than about 100 milliseconds. For simplicity of explanation, assume that the mobile device 100 has rotated by an angle R clockwise about the illustrated Y-axis (yaw axis) due to camera shake or the like. In this example, the time interval at which the beacon 10 radiates signal waves is assumed to be about 100 milliseconds.

First, an example of the “display position shift” will be described.

When the attitude of the mobile device 100 changes, the field of view of the imaging device 50 also changes. The imaging device 50 outputs image data (a group of frames) that follows the change in the field of view. When frames are updated at intervals of about 8 to 16 milliseconds, the imaging device 50 outputs 6 to 12 frames during the roughly 100-millisecond period from when the mobile device 100 starts rotating until it stops. Therefore, the video from the imaging device 50 (the background video) displayed on the display device 60 follows the rotation relatively quickly. More specifically, the background video flows from right to left, following the rotation.

On the other hand, the position of the beacon 10 is updated only once during the rotation period of about 100 milliseconds. Until that update is performed, the image indicating the position of the beacon 10 continues to be displayed at the same position on the display device 60. Although the background video flows from right to left on the display device 60, the image indicating the position of the beacon 10 remains fixed at a certain point on the display device 60. This is the “display position shift”. For the user, it is preferable that the position of the beacon 10 is always displayed on the display device 60 with a certain degree of accuracy.

To suppress the “display position shift”, one conceivable approach is to increase the frequency at which the beacon 10 radiates signal waves. However, this would increase the power consumption of the beacon 10 and shorten the life of its internal battery (not shown). Therefore, the present inventors decided to have the mobile device 100 perform display-position correction processing to increase the accuracy of the position of the beacon 10 displayed on the display device 60.

The signal processing circuit 30 detects that the position and/or attitude of the mobile device 100 has changed based on the detected values output from the motion sensor 80. The signal processing circuit 30 further calculates the rotation angle R by time-integrating the detected values. In the exemplary embodiment, the image plane SA of the array antenna 20 is set at a position 1 meter ahead of the position of the mobile device 100. The position of the beacon 10, as viewed from the mobile device 100, can therefore be approximated as having moved to the left by 1 (m) · R (rad) = R (m). By calculating in advance the change in pixel position per meter in the horizontal direction and storing it in the storage device 32, the signal processing circuit 30 calculates the position coordinates obtained by shifting the position coordinates of the beacon 10 shown in FIG. 10A to the left by R meters.
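A minimal sketch of this yaw correction; the function name and the pixels-per-metre value are illustrative assumptions, not taken from the patent:

```python
def corrected_dot_x(x_px, rotation_rad, px_per_metre):
    """Shift the beacon dot's horizontal pixel coordinate to compensate for
    a clockwise yaw rotation of the device by rotation_rad radians: on the
    virtual screen 1 m ahead, the beacon appears to move left by
    (1 m) * rotation_rad = R metres, i.e. by R * px_per_metre pixels."""
    return x_px - rotation_rad * px_per_metre

# With 400 px per metre on screen, a 0.05 rad yaw shifts the dot 20 px left:
new_x = corrected_dot_x(320.0, 0.05, 400.0)  # new_x ≈ 300.0
```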

FIG. 11A shows the dot 90 indicating the position of the beacon 10 and the identification information 92 displayed on the display device 60 of the mobile device 100 in the attitude shown in FIG. 10A. FIG. 11B, on the other hand, shows a dot 90a and identification information 92a whose position coordinates have been corrected.

By using the output of the motion sensor 80, even if the attitude of the mobile device 100 changes, the “display position shift” on the display device 60 can be corrected at time intervals shorter than the interval at which signal waves are radiated from the beacon 10. Note that the position coordinates continue to be updated in parallel based on the signal waves intermittently radiated from the beacon 10. Therefore, correction errors in the position coordinates based on the output of the motion sensor 80 do not accumulate; the correction error is reset each time an update is performed.

For reference, FIG. 11C shows an example of the “display position shift” when no correction processing is performed. Without correction processing, the position coordinates of the dot 90a indicating the position of the beacon 10 on the display device 60 are updated based only on the signal waves intermittently radiated from the beacon 10. Even if the attitude of the mobile device 100 changes due to camera shake before the next update, the position coordinates of the dot 90a on the display device 60 are maintained without being updated. Since a shift arises between the background video, which changes with the camera shake, and the position coordinates of the dot 90a indicating the position of the beacon 10, the display lacks smoothness and has inferior visibility compared with the case where the correction processing is performed.

The above example described rotation about the yaw axis, but similar processing may be performed for each of the pitch axis and the roll axis. By using the detected angular velocity values about each axis output from the motion sensor 80, the signal processing circuit 30 can correct the position coordinates of the dot 90 and the like indicating the position of the beacon 10 for rotation about each of the three axes.

Note that the signal processing circuit 30 may instead estimate a change in the position and/or attitude of the mobile device 100 based on the image data, and correct the direction of arrival or the position coordinates of the dot 90 according to the change.

An example of this correction method will be described. The signal processing circuit 30 acquires image data of two frames, at times t and (t + Δt), from the imaging device 50. The signal processing circuit 30 determines an arbitrary pattern (referred to as a “feature pattern”) that is included in both frames. The signal processing circuit 30 acquires information on the relative position between the feature pattern in the frame at time t and the dot 90, superimposed on that frame, indicating the position of the beacon 10. The relative-position information is, for example, the difference values in the X-axis and Y-axis directions. The signal processing circuit 30 determines the position coordinates of the dot 90 using the position of the feature pattern in the frame at time (t + Δt) and the acquired relative-position information, and displays the dot on the display device 60. Since consecutive frames include the influence of camera shake, correcting the position of the dot 90 by the above method corrects the position coordinates of the dot 90 with the influence of camera shake taken into account.
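One way to code the feature-pattern-based carry-over of the dot is sketched below, under the simplifying assumption that the feature pattern is a single tracked point; all names are hypothetical:

```python
def carry_dot_between_frames(feature_at_t, dot_at_t, feature_at_t_dt):
    """Move the dot marking the beacon from the frame at time t to the
    frame at t + Δt by preserving its offset (the X/Y difference values)
    relative to a feature pattern found in both frames."""
    dx = dot_at_t[0] - feature_at_t[0]
    dy = dot_at_t[1] - feature_at_t[1]
    return feature_at_t_dt[0] + dx, feature_at_t_dt[1] + dy

# The feature moved 12 px left and 3 px up between frames; the dot follows:
new_dot = carry_dot_between_frames((100, 80), (140, 95), (88, 77))
# new_dot = (128, 92)
```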

<Display position correction 2 (distortion correction by lens)>
In the image data generated by the imaging device 50, distortion due to the lens 52 of the imaging device 50 may occur. Such distortion becomes prominent when a wide-angle lens is employed. The distortion causes a positional shift of the beacon 10 within the image.

The distortion can be corrected by calculation or by a table if the internal parameters unique to the imaging device 50 are known. For example, a table that associates each position on a virtual image plane, set 1 meter ahead of the position of the imaging device 50, with position coordinates in the image data output by the imaging device 50 can be prepared in advance and stored in the storage device 32. The signal processing circuit 30 determines the position coordinates on the virtual image plane from the direction of arrival of the signal wave estimated based on the signals output from the array antenna 20, and then determines the position coordinates on the display device 60 by referring to the above table.
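Such a table lookup could take the following shape; the grid step, the nearest-neighbour rounding, and the demonstration values are assumptions for illustration, not the patent's implementation:

```python
def display_coords_from_table(xa, ya, table, step=0.1):
    """Return distortion-corrected display coordinates for a point
    (xa, ya), in metres, on the virtual image plane 1 m ahead, using a
    precomputed table indexed by grid cell (nearest-neighbour lookup)."""
    return table[(round(xa / step), round(ya / step))]

# A tiny demonstration table (made-up values): cell (0, 0) maps to the
# screen centre, cell (1, 0) to a point displaced by the lens distortion.
demo_table = {(0, 0): (320, 240), (1, 0): (396, 240)}
pos = display_coords_from_table(0.02, -0.01, demo_table)  # -> (320, 240)
```

A production table would be built once per camera from its calibrated internal parameters, which is why the heavy computation can be done in advance.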

In this embodiment, instead of correcting the distortion of the image, only the position of the beacon 10 may be corrected and superimposed on the image data. If the accuracy of the position or direction of the beacon 10 within the image is high, there is no particular problem even if the image itself is distorted at its periphery. Since only the processing for correcting the position of the beacon 10 is performed, without processing to correct the distortion of each frame, the computational load is greatly reduced.

FIG. 12 is a flowchart showing an example of processing for displaying the position of the beacon 10. The processing shown in FIG. 12 includes the display-position correction processing described above.

In step S1, the signal processing circuit 30 receives data of the signal wave radiated from the beacon 10 and received by the array antenna 20. In step S2, the signal processing circuit 30 estimates the direction of arrival of the signal wave based on the signal-wave data. In step S3, the signal processing circuit 30 displays the information indicating the direction of arrival and the image data output from the imaging device 50 on the display device 60. In step S4, the signal processing circuit 30 corrects the display-position shift caused by shaking of the mobile device 100. The processing then returns to step S1.
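The S1 to S4 loop of FIG. 12 can be outlined as follows, with each stage injected as a callable; all names here are hypothetical and the real stages are hardware-backed:

```python
def display_step(receive_wave, estimate_doa, render_overlay, correct_shake):
    """One pass through the FIG. 12 flowchart; the caller repeats it."""
    data = receive_wave()              # S1: signal-wave data via the array antenna
    direction = estimate_doa(data)     # S2: estimate the direction of arrival
    frame = render_overlay(direction)  # S3: show the mark over the camera image
    return correct_shake(frame)        # S4: correct the display-position shift

# Stub stages, just to show the data flow through one iteration:
result = display_step(lambda: "wave", lambda d: "doa",
                      lambda a: "frame", lambda f: f)
```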

<Other examples of mobile devices>
Next, another example of the mobile device will be described. Of the components of the mobile device described below, components having functions equivalent to those of the components of the mobile device 100 are denoted by the same reference numerals, and description thereof is omitted.

 図13は、第2の例によるモバイル機器110のハードウェアブロック図である。モバイル機器110が触覚デバイス82を有する点において、モバイル機器110とモバイル機器100とは相違する。 FIG. 13 is a hardware block diagram of the mobile device 110 according to the second example. The mobile device 110 is different from the mobile device 100 in that the mobile device 110 includes a haptic device 82.

 触覚デバイス82は、信号処理回路30から振動パターンを示す指令を受け取り、指令に従ってユーザに与える刺激を生成する。指令の一例はPWM信号である。 The tactile device 82 receives a command indicating the vibration pattern from the signal processing circuit 30, and generates a stimulus to be given to the user according to the command. An example of a command is a PWM signal.

 触覚デバイス82は、振動モータ82aと、当該振動モータ82aに接続されたモータ駆動回路82bとを有する。振動モータ82aは、例えば水平リニアアクチュエータである。モータ駆動回路82bは指令PWM信号に従って振動モータ82aに電流を流し、振動モータ82aを所定の振動パターンで動作させる。振動パターンは、例えば立ち上がり速度、振動の振幅、与える電流または電圧の周波数、および/または振幅の周波数によって決定され得る。 The tactile device 82 includes a vibration motor 82a and a motor drive circuit 82b connected to the vibration motor 82a. The vibration motor 82a is, for example, a horizontal linear actuator. The motor drive circuit 82b supplies current to the vibration motor 82a in accordance with the command PWM signal, causing the vibration motor 82a to operate with a predetermined vibration pattern. The vibration pattern can be determined by, for example, the rise speed, the vibration amplitude, the frequency of the applied current or voltage, and/or the frequency of the amplitude variation.

 信号処理回路30は、ビーコン10の信号波の到来方向を推定し、推定した方向に基づいて触覚デバイス82の振動モータ82aを駆動する。振動モータ82aを用いてユーザの皮膚に所定の振動パターンを与えると、ユーザにラテラルフォースフィールド現象を引き起こすこと、つまりあたかも引っ張られているような錯覚を与えることができる。つまり触覚デバイス82は、到来方向を示す情報を、ユーザの触覚に与えることができる。これによりモバイル機器110は、ユーザをビーコン10の位置に誘導することができる。同時に、信号処理回路30がディスプレイ装置60上にドット90を表示すれば、ビーコン10の方向を視角的にユーザに提示することができる。なお、引っ張られているような錯覚を与えるための振動パターンは、特開2012-143054号公報、特開2010-210010号公報等に開示されているように公知である。 The signal processing circuit 30 estimates the arrival direction of the signal wave from the beacon 10 and drives the vibration motor 82a of the tactile device 82 based on the estimated direction. By applying a predetermined vibration pattern to the user's skin with the vibration motor 82a, a lateral-force-field phenomenon can be induced in the user, that is, the illusion of being pulled can be created. In other words, the tactile device 82 can convey information indicating the arrival direction through the user's sense of touch. The mobile device 110 can thereby guide the user toward the position of the beacon 10. At the same time, if the signal processing circuit 30 displays the dot 90 on the display device 60, the direction of the beacon 10 can also be presented to the user visually. Vibration patterns for creating the illusion of being pulled are known, as disclosed in, for example, JP2012-143054A and JP2010-210010A.
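One way to picture the command the signal processing circuit could send to the motor drive circuit is as a sequence of PWM duty cycles for one vibration period. The asymmetric ramp below (fast rise, slow fall), often used to evoke the pulling illusion, is an assumption modelled loosely on the cited literature; real waveforms are device-specific.

```python
# Hedged sketch: turning an estimated arrival direction into a PWM duty
# pattern for the vibration motor. The step count and ramp shape are
# illustrative assumptions, not values from this specification.

def pull_pattern(azimuth_deg, steps=8):
    """Return PWM duty cycles (0-100) for one vibration period.

    The waveform itself does not depend on the angle here; a real device
    would route the pattern to (or phase it for) the actuator matching
    the sign of `azimuth_deg`.
    """
    rise = [int(100 * (i + 1) / 2) for i in range(2)]                    # fast rise
    fall = [int(100 * (steps - 2 - i) / (steps - 2)) for i in range(steps - 2)]  # slow fall
    return rise + fall

duties = pull_pattern(30.0)
print(duties)  # -> [50, 100, 100, 83, 66, 50, 33, 16]
```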

 図14は、第3の例によるモバイル機器120のハードウェアブロック図である。 FIG. 14 is a hardware block diagram of the mobile device 120 according to the third example.

 モバイル機器120は、モバイル機器100から、撮像装置50、ディスプレイ装置60およびモーションセンサ80を省略して構成されている。モバイル機器120は、ビーコン10が発する信号波を受信して、ビーコン10の位置または方向を示すデータおよび付加情報を、通信回路40を介して外部機器に送信することができる。モバイル機器120は、ビーコン10の位置を探索し、ビーコン10の付加情報を取得するための「ワイヤレスハンディスキャナ」として動作し得る。ビーコン10の位置または方向を示すデータは、記憶装置32に記憶されてもよい。 The mobile device 120 is configured by omitting the imaging device 50, the display device 60, and the motion sensor 80 from the mobile device 100. The mobile device 120 can receive the signal wave emitted from the beacon 10 and transmit data indicating the position or direction of the beacon 10, together with the additional information, to an external device via the communication circuit 40. The mobile device 120 can operate as a "wireless handy scanner" that searches for the position of the beacon 10 and acquires the additional information of the beacon 10. The data indicating the position or direction of the beacon 10 may be stored in the storage device 32.

 外部機器は、スマートフォン、タブレット端末、またはノートパソコンであり得る。外部機器は、モバイル機器120からビーコン10の位置または方向を示すデータを受信して、外部機器が有する表示ディスプレイにビーコン10からの信号波の到来方向を示す情報を表示することができる。または、外部機器は触覚デバイスを有する小型の電子機器であり得る。当該触覚デバイスとして、モバイル機器110(図13)に内蔵された触覚デバイス82を採用することができる。 The external device can be a smartphone, a tablet terminal, or a laptop computer. The external device can receive the data indicating the position or direction of the beacon 10 from the mobile device 120 and display information indicating the arrival direction of the signal wave from the beacon 10 on its own display. Alternatively, the external device may be a small electronic device having a tactile device. As the tactile device, the tactile device 82 built into the mobile device 110 (FIG. 13) can be employed.

 図15は、第4の例によるモバイル機器130のハードウェアブロック図である。モバイル機器130は、触覚デバイス82が追加されたモバイル機器120(図14)である。モバイル機器130は、ビーコン10が発する信号波を受信して信号波の到来方向を推定し、推定した方向に引っ張られるような錯覚をユーザに与えることができる。 FIG. 15 is a hardware block diagram of the mobile device 130 according to the fourth example. The mobile device 130 is the mobile device 120 (FIG. 14) with the tactile device 82 added. The mobile device 130 can receive the signal wave emitted by the beacon 10, estimate its arrival direction, and give the user the illusion of being pulled in the estimated direction.

<移動体>
 これまでに説明したモバイル機器は、人が持ち運び可能であることを前提とした。しかしながら、人が持ち運び可能であることは必須ではない。
<Moving object>
The mobile devices described so far are based on the assumption that people can carry them. However, it is not essential that a person is portable.

 撮像装置、アレーアンテナおよび信号処理回路が、例えば移動体に取り付けられ、移動体と共に移動してもよい。移動体の例は、タクシー、無人搬送車、マルチコプター等の飛行体である。 The imaging device, the array antenna, and the signal processing circuit may be attached to, for example, a mobile body and move together with the mobile body. Examples of the mobile body include taxis, automated guided vehicles, and flying bodies such as multicopters.

 以下では、説明の便宜上、撮像装置、アレーアンテナおよび信号処理回路を備えた電子機器が、移動体に取り付けられるとする。電子機器は、移動体に搭載できる程度の大きさで製造され、販売等され得る。 Hereinafter, for convenience of explanation, it is assumed that an electronic device including an imaging device, an array antenna, and a signal processing circuit is attached to a moving body. Electronic devices can be manufactured, sold, etc. in a size that can be mounted on a mobile object.

<配車システム>
 次に、モバイル機器の構成を有する電子機器を搭載した移動体を説明する。本開示の実施形態では、移動体として、自動車等の車両、無人搬送車、および、マルチコプターを例示する。以下で説明する電子機器は、上述したモバイル機器100(図4)およびモバイル機器120(図14)と同等の構成要素を有している。
<Vehicle allocation system>
Next, a mobile object equipped with an electronic device having the configuration of a mobile device will be described. In the embodiment of the present disclosure, examples of the moving body include a vehicle such as an automobile, an automatic guided vehicle, and a multicopter. The electronic device described below has the same components as the mobile device 100 (FIG. 4) and the mobile device 120 (FIG. 14) described above.

 図16は、複数のビーコン10および複数の車両200を含む配車システム1000を説明するための模式図である。配車システム1000は、配車依頼を行った乗客に車両200であるタクシーを向かわせ、乗客を搭乗させるために利用される。なお、図16では、実線は道路を示している。 FIG. 16 is a schematic diagram for explaining a vehicle dispatch system 1000 including a plurality of beacons 10 and a plurality of vehicles 200. The dispatch system 1000 is used to send a taxi, here the vehicle 200, to a passenger who has made a dispatch request and to pick up the passenger. In FIG. 16, the solid lines indicate roads.

 本実施形態では、ビーコン10はスマートフォンに内蔵されているとする。スマートフォンには、配車システム1000の運営業者が提供するアプリケーションプログラムがインストールされている。アプリケーションプログラムはスマートフォンに内蔵された通信回路を制御して、ブルートゥース(登録商標)・ロー・エナジー規格に従った信号波を放射させる。つまりビーコン10は、スマートフォンの通信回路とアプリケーションプログラムとによって実現される。 In this embodiment, it is assumed that the beacon 10 is built in the smartphone. An application program provided by an operator of the vehicle allocation system 1000 is installed on the smartphone. The application program controls a communication circuit built in the smartphone, and radiates a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard. That is, the beacon 10 is realized by a smartphone communication circuit and an application program.

 ビーコン10が放射する信号波は、スマートフォンを携帯する人に関する識別情報を有する付加情報を含む。付加情報の例はスマートフォンの所有者IDである。所有者IDは、例えばアプリケーションプログラムによってユーザごとに発行される固有の値であり得る。 The signal wave emitted by the beacon 10 includes additional information having identification information regarding the person carrying the smartphone. An example of the additional information is a smartphone owner ID. The owner ID may be a unique value issued for each user by an application program, for example.

 図16に示す配車システム1000では、乗客は移動体通信網を利用して配車を依頼する。図16の左上に示す破線円内のビーコン10を有する乗客が配車依頼を行う例を説明する。乗客は、アプリケーションプログラムを起動して、配車依頼を行う。アプリケーションプログラムの制御により、スマートフォンは、基地局220aを介して配車システム1000の運営業者の拠点210に設置されたサーバ212に配車リクエストを送信する。 In the dispatch system 1000 shown in FIG. 16, a passenger requests a vehicle via the mobile communication network. An example will be described in which the passenger carrying the beacon 10 inside the broken-line circle at the upper left of FIG. 16 makes a dispatch request. The passenger activates the application program and makes the dispatch request. Under the control of the application program, the smartphone transmits the dispatch request to the server 212 installed at the base 210 of the operator of the dispatch system 1000 via the base station 220a.

 スマートフォンは、配車リクエストとともに乗客の位置を示す位置情報を送信する。位置情報は、スマートフォンが搭載するGPSモジュールによって測位され取得され得る。または、スマートフォンは、予め位置が分かっている、公開されたWi-Fi(登録商標)スポットへのアクセスを利用して位置情報を取得してもよいし、位置が固定された基地局220a、220b等との通信を利用して位置情報を取得することもできる。 The smartphone transmits location information indicating the passenger's position together with the dispatch request. The location information can be obtained by positioning with the GPS module mounted on the smartphone. Alternatively, the smartphone may obtain the location information by using access to public Wi-Fi (registered trademark) spots whose locations are known in advance, or by using communication with base stations 220a, 220b, etc., whose locations are fixed.

 サーバ212は、配車リクエストの受信に応答して、位置情報が示す乗客の位置に最も近い空車を決定する。図16に示す例では、図面中央付近に示す破線円内の車両200がサーバ212によって決定されたとする。サーバ212は、移動体通信網の基地局220bを介して、位置情報が示す位置に向かうよう当該車両200に指示を送信する。 In response to receiving the dispatch request, the server 212 determines the empty vehicle closest to the passenger position indicated by the location information. In the example shown in FIG. 16, assume that the server 212 has selected the vehicle 200 inside the broken-line circle near the center of the drawing. The server 212 transmits an instruction to the vehicle 200, via the base station 220b of the mobile communication network, to go to the position indicated by the location information.
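The server-side choice of the nearest empty vehicle can be sketched as follows. The vehicle records, coordinates, and use of straight-line (haversine) distance are all assumptions for illustration; a production dispatcher would typically rank by road distance or estimated travel time.

```python
import math

EARTH_R_M = 6371000.0  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_R_M * math.asin(math.sqrt(a))

def nearest_empty_vehicle(passenger, vehicles):
    """Pick the empty vehicle with the smallest distance to the passenger."""
    empties = [v for v in vehicles if v["empty"]]
    return min(empties, key=lambda v: haversine_m(passenger[0], passenger[1],
                                                  v["lat"], v["lon"]))

# Toy fleet (made-up coordinates): car-3 is closest but occupied.
fleet = [
    {"id": "car-1", "lat": 35.010, "lon": 135.760, "empty": True},
    {"id": "car-2", "lat": 35.001, "lon": 135.759, "empty": True},
    {"id": "car-3", "lat": 35.000, "lon": 135.758, "empty": False},
]
print(nearest_empty_vehicle((35.000, 135.758), fleet)["id"])  # -> car-2
```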

 電子機器300は、サーバ212から、乗客のビーコン10または当該乗客の識別情報を含む登録情報を受け取る。電子機器300の記憶装置32は、受け取った登録情報を記憶する。 The electronic device 300 receives, from the server 212, registration information including the identification information of the passenger's beacon 10 or of the passenger. The storage device 32 of the electronic device 300 stores the received registration information.

 図17は、車両200の外観図である。車両200の天井には電子機器300が設置されている。また、車内にはディスプレイ装置310が設置されている。電子機器300にはアレーアンテナ20および撮像装置50が搭載されている。 FIG. 17 is an external view of the vehicle 200. The electronic device 300 is installed on the roof of the vehicle 200, and a display device 310 is installed inside the vehicle. The electronic device 300 carries the array antenna 20 and the imaging device 50.

 図18は、ディスプレイ装置310と接続された電子機器300の内部構成を示している。電子機器300の内部構成は、モバイル機器100(図4)の内部構成と概ね同じである。モバイル機器100から切り離されたディスプレイ装置60が、ディスプレイ装置310に相当する。構造および/または機能が同じ構成要素には同じ参照符号を付し、その説明は省略する。 FIG. 18 shows an internal configuration of the electronic device 300 connected to the display device 310. The internal configuration of the electronic device 300 is substantially the same as the internal configuration of the mobile device 100 (FIG. 4). The display device 60 separated from the mobile device 100 corresponds to the display device 310. Components having the same structure and / or function are denoted by the same reference numerals, and the description thereof is omitted.

 モバイル機器100と電子機器300との構成上の相違点は、電子機器300が駆動装置302を有していることである。駆動装置302は、不図示のモータを内蔵しており、移動体に対するアレーアンテナ20および撮像装置50の姿勢を変化させる。本実施形態ではアレーアンテナ20および撮像装置50は1つの筐体内に一体的に設けられている。駆動装置302は、アレーアンテナ20および撮像装置50を同時に、車両の鉛直方向に平行な軸(ヨー軸)、水平方向に平行な軸(ピッチング軸)、および、前後方向に平行な軸(ロール軸)の周りに回転させることができる。 The configurational difference between the mobile device 100 and the electronic device 300 is that the electronic device 300 has a drive device 302. The drive device 302 incorporates a motor (not shown) and changes the attitude of the array antenna 20 and the imaging device 50 with respect to the mobile body. In this embodiment, the array antenna 20 and the imaging device 50 are integrally provided in a single housing. The drive device 302 can simultaneously rotate the array antenna 20 and the imaging device 50 about an axis parallel to the vertical direction of the vehicle (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).

 なお、電子機器300の通信回路40は、ブルートゥース(登録商標)・ロー・エナジー規格に従った通信の他、移動体通信網を利用した通信を行うことが可能であるとする。しかしながら、移動体通信網を利用した通信を行う通信モジュールを、通信回路40とは独立して設けてもよい。 It is assumed that the communication circuit 40 of the electronic device 300 can perform communication using the mobile communication network in addition to communication in accordance with the Bluetooth (registered trademark) Low Energy standard. However, a communication module that performs communication using the mobile communication network may be provided independently of the communication circuit 40.

 車両200が、位置情報に基づいて特定される乗客の位置から所定の範囲内、例えば300m以内、に入ると、電子機器300は、駆動装置302を動作させて、アレーアンテナ20および撮像装置50の姿勢を変化させて周囲をスキャンさせ、配車を依頼した乗客の識別情報を発するビーコン10を探索させる。やがて、ビーコン10を発見すると、乗客のスマートフォンと電子機器300との間には、ブルートゥース(登録商標)・ロー・エナジー規格に従った通信が確立される。これにより、スマートフォンと電子機器300との間で上記規格に従った通信が可能になる。 When the vehicle 200 comes within a predetermined range, for example within 300 m, of the passenger position specified based on the location information, the electronic device 300 operates the drive device 302 to change the attitude of the array antenna 20 and the imaging device 50 and scan the surroundings, searching for the beacon 10 that transmits the identification information of the passenger who requested the vehicle. When the beacon 10 is found, communication in accordance with the Bluetooth (registered trademark) Low Energy standard is established between the passenger's smartphone and the electronic device 300, enabling communication between them in accordance with that standard.
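The sweep-and-match search described above can be sketched as a loop over antenna headings. The `read_advertisement` stand-in for the BLE receive path, the toy environment, and the 10-degree step are all assumptions; the point of the sketch is only that the scan stops once an advertisement carries the requested passenger ID.

```python
# Hypothetical sketch of the attitude scan that searches for a specific
# beacon ID. The receive path is stubbed out with a toy environment.

def read_advertisement(heading_deg):
    """Stub: a beacon with ID "user-42" is detectable near 140 degrees."""
    if 130 <= heading_deg <= 150:
        return {"owner_id": "user-42", "heading_deg": heading_deg}
    return None

def scan_for_beacon(target_id, step_deg=10):
    """Sweep the antenna heading; return the heading where the target was found."""
    for heading in range(0, 360, step_deg):
        adv = read_advertisement(heading)
        if adv is not None and adv["owner_id"] == target_id:
            return heading
    return None  # target beacon not in range

print(scan_for_beacon("user-42"))  # -> 130 (first heading inside the detectable arc)
```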

 図19は、接続が確立された、乗客230のスマートフォン240と電子機器300とを示す模式図である。なお、図19では、スマートフォン240は電子機器300の方向にのみ信号波を放射しているかのように記載されているが、信号波は実質的には等方的に放射されている。 FIG. 19 is a schematic diagram showing the smartphone 240 of the passenger 230 and the electronic device 300 after the connection has been established. In FIG. 19, the smartphone 240 is drawn as if it radiated a signal wave only in the direction of the electronic device 300, but the signal wave is actually radiated substantially isotropically.

 図20は、ディスプレイ装置310の表示例を示している。電子機器300の信号処理回路30は、ディスプレイ装置310にビーコン10の位置(ビーコン10からの信号波の到来方向)を示すマーク、例えば画像250、を表示する。処理の詳細は、例えば図1から図9を参照しながら詳述した通りである。これにより、車両200の運転手は、乗客230の周囲に他の人が存在していたとしても乗客230を正確に発見することができる。 FIG. 20 shows a display example of the display device 310. The signal processing circuit 30 of the electronic device 300 displays on the display device 310 a mark indicating the position of the beacon 10 (the arrival direction of the signal wave from the beacon 10), for example, the image 250. Details of the processing are as described in detail with reference to FIGS. 1 to 9, for example. As a result, the driver of the vehicle 200 can accurately find the passenger 230 even if another person exists around the passenger 230.

 信号処理回路30は識別情報252をディスプレイ装置310に追加的に表示してもよい。識別情報252には、スマートフォンの所有者IDまたは配車システム1000のユーザID、および、乗客名が含まれ得る。図20に示されるように、乗客230の識別情報252であることを容易に把握できるよう、乗客230の画像と識別情報252とを接続する引出線を設けてもよい。上述の追加的な表示により、配車を依頼した乗客230をより正確に特定することができる。 The signal processing circuit 30 may additionally display the identification information 252 on the display device 310. The identification information 252 may include a smartphone owner ID or a user ID of the vehicle allocation system 1000 and a passenger name. As shown in FIG. 20, a leader line for connecting the image of the passenger 230 and the identification information 252 may be provided so that the identification information 252 of the passenger 230 can be easily grasped. With the above-described additional display, it is possible to more accurately identify the passenger 230 who requested the dispatch.

 電子機器300は、モーションセンサ80の検出値を利用して、図10A~図11Bを参照しながら説明した表示位置の補正処理を行ってもよい。または、画像データの変化を利用した表示位置の補正処理を行ってもよい。手振れの場合と同様、車両200が走行することにより、表示の位置ずれが発生し得るためである。表示位置を補正することにより、車両200の運転手は、より容易に、かつ確実に、乗客230を発見することができる。 The electronic device 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B using the detection value of the motion sensor 80. Alternatively, display position correction processing using changes in image data may be performed. This is because display misalignment may occur when the vehicle 200 travels, as in the case of camera shake. By correcting the display position, the driver of the vehicle 200 can find the passenger 230 more easily and reliably.

 乗客230から送信された位置情報の精度が粗い場合があり得る。そのような場合でも、電子機器300がビーコン10を受信し、信号波の到来方向を推定することができれば、乗客230が待つ大体の位置をディスプレイ装置310に表示することができる。これにより、車両200の運転手は、乗客230が道路の右側で待っているのか左側で待っているのか、あるいは、右折または左折した先で待っているのかを判断できる。 The location information transmitted from the passenger 230 may be coarse. Even in such a case, if the electronic device 300 can receive the signal wave from the beacon 10 and estimate its arrival direction, the approximate position where the passenger 230 is waiting can be displayed on the display device 310. The driver of the vehicle 200 can thus judge whether the passenger 230 is waiting on the right or left side of the road, or beyond a right or left turn.

 拠点210のサーバ212は、スマートフォンの所有者IDまたは配車システム1000のユーザIDと各乗客の顔写真とを紐付けて登録してもよい。車両200の電子機器300は、サーバ212から乗客230の位置に向かうよう指示を受けた時、登録されている乗客230の顔写真の画像データも受信する。電子機器300の信号処理回路30は、顔写真のデータから、乗客230の顔の特徴を示す特徴パターンを抽出し、顔写真の画像と、撮像装置50から出力された画像データによって規定される画像内の人物とを照合する。照合の結果、一致した場合には、信号処理回路30は、図20に示す画像250のような、乗客230を示すマークを表示する。 The server 212 of the base 210 may register the smartphone owner ID or the user ID of the dispatch system 1000 in association with a facial photograph of each passenger. When the electronic device 300 of the vehicle 200 receives the instruction from the server 212 to go to the position of the passenger 230, it also receives the image data of the registered facial photograph of the passenger 230. The signal processing circuit 30 of the electronic device 300 extracts a feature pattern representing the facial features of the passenger 230 from the facial photograph data, and matches the facial photograph against the persons in the image defined by the image data output from the imaging device 50. If the matching succeeds, the signal processing circuit 30 displays a mark indicating the passenger 230, such as the image 250 shown in FIG. 20.
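The matching step can be sketched abstractly. Assume some face model (not specified here) has already reduced the registered photograph and each face detected in the camera image to fixed-length feature vectors; matching is then a similarity comparison. The vectors, the cosine metric, and the 0.9 threshold are illustrative assumptions, not values from this specification.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def find_registered_passenger(photo_vec, detected_faces, threshold=0.9):
    """Return the index of the detected face matching the registered photo, or None."""
    best_i, best_s = None, threshold
    for i, vec in enumerate(detected_faces):
        s = cosine(photo_vec, vec)
        if s > best_s:
            best_i, best_s = i, s
    return best_i

registered = [0.9, 0.1, 0.4]            # hypothetical feature vector of the photo
faces_in_frame = [[0.1, 0.9, 0.2],      # bystander: dissimilar vector
                  [0.88, 0.12, 0.41]]   # close to the registered vector
print(find_registered_passenger(registered, faces_in_frame))  # -> 1
```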

 アレーアンテナ20および撮像装置50の相対的な位置関係は固定であってもよいし可動であってもよい。図17の例では両者は固定されている。電子機器300は車両200から取り外され、車両の運転手によって持ち運ばれてもよい。この場合、電子機器300は、ディスプレイ装置310に代えて、例えば運転手のスマートフォンに映像を送信してもよい。 The relative positional relationship between the array antenna 20 and the imaging device 50 may be fixed or movable. In the example of FIG. 17, both are fixed. Electronic device 300 may be removed from vehicle 200 and carried by the driver of the vehicle. In this case, the electronic device 300 may transmit an image to, for example, a driver's smartphone instead of the display device 310.

<無人搬送車>
 次に、移動体の第2の例である、アレーアンテナおよび撮像装置を搭載した無人搬送車を説明する。無人搬送車は「AGV」(Automatic Guided Vehicle)と呼ばれることがある。以下では無人搬送車を「AGV」と記述する。
<Automated guided vehicle>
Next, an automatic guided vehicle equipped with an array antenna and an imaging device, which is a second example of the moving body, will be described. The automated guided vehicle is sometimes called “AGV” (Automatic Guided Vehicle). Hereinafter, the automatic guided vehicle is described as “AGV”.

 図21は、各々が電子機器300(図18)を搭載する複数のAGV400を有する搬送システム1100を示している。搬送システム1100は工場500に導入され得る。工場500には、複数の棚510が設けられている。AGV400は、目的とする荷物に設けられたビーコン10の識別情報を利用して目的のビーコン10を探索し、それにより荷物を発見する。 FIG. 21 shows a transport system 1100 having a plurality of AGVs 400 each mounting the electronic device 300 (FIG. 18). The transport system 1100 can be installed in the factory 500. The factory 500 is provided with a plurality of shelves 510. The AGV 400 searches for the target beacon 10 using the identification information of the beacon 10 provided in the target package, and thereby discovers the package.

 工場500の壁には、方位を示すビーコン10N、10S、10W、10Eが設けられている。AGV400の電子機器300は、各ビーコン10N、10S、10W、10Eからの信号波も識別可能に受信し、かつ、各ビーコンの位置を推定することができる。AGV400は、推定された各ビーコンの位置を利用して、自機の現在位置および自機が向いている方向、すなわち自己位置および姿勢、を認識することができる。なお、4つのビーコン10N、10S、10W、10Eは一例である。2つのビーコンを設けても、自己位置および姿勢を認識することが可能である。また、5以上のビーコン10を用いることにより、自己位置および姿勢の精度をより向上させることができる。 Beacons 10N, 10S, 10W, and 10E indicating directions are provided on the walls of the factory 500. The electronic device 300 of the AGV 400 can also receive the signal waves from the beacons 10N, 10S, 10W, and 10E in an identifiable manner, and can estimate the position of each beacon. Using the estimated beacon positions, the AGV 400 can recognize its current position and the direction it is facing, that is, its own position and attitude. The four beacons 10N, 10S, 10W, and 10E are merely an example. Even with two beacons, the position and attitude can be recognized, and using five or more beacons 10 further improves their accuracy.
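One way such a pose estimate could work is triangulation from the bearings of wall-mounted beacons at known positions: for each candidate position, the heading that best explains all measured bearings follows analytically (a circular mean), and a coarse grid search picks the position with the smallest residual. This is a hedged sketch under assumptions (a 10 m x 10 m floor, four beacons at made-up positions, noise-free bearings), not the method claimed in this specification.

```python
import math

# Assumed beacon positions on the factory walls (metres).
BEACONS = [(0.0, 5.0), (10.0, 5.0), (5.0, 0.0), (5.0, 10.0)]

def fit_theta(x, y, measured):
    """Best heading for a candidate position: circular mean of per-beacon estimates."""
    s = c = 0.0
    for (bx, by), m in zip(BEACONS, measured):
        t = math.atan2(by - y, bx - x) - m  # heading implied by this beacon
        s += math.sin(t)
        c += math.cos(t)
    return math.atan2(s, c)

def residual(x, y, theta, measured):
    """Angular misfit of all bearings for a candidate pose."""
    r = 0.0
    for (bx, by), m in zip(BEACONS, measured):
        t = math.atan2(by - y, bx - x) - m
        r += 1.0 - math.cos(t - theta)
    return r

def estimate_pose(measured, step=0.25):
    """Coarse grid search over position; heading recovered per cell."""
    best = None
    for ix in range(1, 40):
        for iy in range(1, 40):
            x, y = ix * step, iy * step
            th = fit_theta(x, y, measured)
            r = residual(x, y, th, measured)
            if best is None or r < best[0]:
                best = (r, x, y, th)
    return best[1], best[2], best[3]

# Simulate noise-free bearings seen from the true pose (3.0 m, 4.0 m, 30 deg).
true_x, true_y, true_th = 3.0, 4.0, math.radians(30)
meas = [math.atan2(by - true_y, bx - true_x) - true_th for bx, by in BEACONS]
x, y, th = estimate_pose(meas)
print(round(x, 2), round(y, 2), round(math.degrees(th), 1))  # -> 3.0 4.0 30.0
```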

 さらにAGV400は、撮像装置から出力される画像データを利用して通路を認識し、経路を決定することができる。例えば床面に特定の色を与え、その一方、床以外の棚510、荷物、壁等には当該色とは異なる色を与える。AGV400は、撮像装置50から出力された画像データから、床面の特定の色を認識することができる。これにより、走行可能な通路であることを認識して、ビーコン10の位置に向けて走行することができる。 Furthermore, the AGV 400 can recognize the passages and determine a route using the image data output from the imaging device. For example, a specific color is given to the floor surface, while shelves 510, packages, walls, and other non-floor objects are given colors different from it. The AGV 400 can recognize the specific floor color in the image data output from the imaging device 50. It can thereby recognize the drivable passages and travel toward the position of the beacon 10.
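The floor-colour test can be sketched with a simple per-channel threshold. The floor colour, the tolerance, and the use of raw RGB are all assumptions for illustration; a real system would more likely classify in a lighting-tolerant colour space such as HSV.

```python
# Hedged sketch: classifying pixels as "floor" by colour distance.

FLOOR_RGB = (40, 180, 60)   # assumed floor paint colour
TOL = 40                    # assumed per-channel tolerance

def is_floor(pixel):
    """True if every RGB channel is within TOL of the floor colour."""
    return all(abs(p - f) <= TOL for p, f in zip(pixel, FLOOR_RGB))

def drivable_ratio(row):
    """Fraction of a pixel row classified as drivable floor."""
    hits = sum(1 for px in row if is_floor(px))
    return hits / len(row)

# Toy row of pixels: 70 % floor paint, 30 % grey shelf.
row = [(42, 175, 58)] * 7 + [(120, 120, 120)] * 3
print(drivable_ratio(row))  # -> 0.7
```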

 上述のように、AGV400は自己位置の情報を取得するために従来行われていた処理を行う必要がなくなる。例えばAGV400は、走行する工場500内の地図データおよびレーザ・レンジ・ファインダを有する必要はない。レーザ・レンジ・ファインダから出力されたセンサデータと地図データとを照合して現在位置を推定する必要がなくなるため、演算回路の処理の負荷も大幅に小さくなる。演算回路は必ずしも高性能でなくてもよいため、コストも抑えられる。また、AGV400はオドメトリを利用した現在位置の推定・補間を行う必要もない。 As described above, the AGV 400 does not need to perform processing that has been conventionally performed in order to acquire information on the self-location. For example, the AGV 400 does not need to have map data and a laser range finder in the factory 500 where it travels. Since it is not necessary to estimate the current position by comparing the sensor data output from the laser range finder with the map data, the processing load of the arithmetic circuit is greatly reduced. Since the arithmetic circuit does not necessarily have high performance, the cost can be reduced. Further, the AGV 400 does not need to perform estimation / interpolation of the current position using odometry.

 勿論、レーザ・レンジ・ファインダを利用すれば、自己位置の推定精度は向上する。また、電子機器300が有する撮像装置50が出力する画像データを利用することにより、障害物の回避を行うことが可能になる。 Of course, if the laser range finder is used, the self-position estimation accuracy can be improved. In addition, obstacles can be avoided by using image data output from the imaging device 50 included in the electronic device 300.

 図22は、本実施形態にかかる例示的なAGV400の外観図である。AGV400は、電子機器300と、4つの車輪411a~411dと、フレーム412と、搬送テーブル413と、走行制御装置414と、レーザ・レンジ・ファインダ415とを有する。図22の例において、電子機器300およびレーザ・レンジ・ファインダ415は、AGV400の進行方向側に設置されている。なお、AGV400は複数のモータも有するが図22には示されていない。また、図22には、前輪411a、後輪411bおよび後輪411cが示されているが、前輪411dはフレーム412の蔭に隠れているため明示されていない。 FIG. 22 is an external view of an exemplary AGV 400 according to the present embodiment. The AGV 400 includes the electronic device 300, four wheels 411a to 411d, a frame 412, a transfer table 413, a travel control device 414, and a laser range finder 415. In the example of FIG. 22, the electronic device 300 and the laser range finder 415 are installed on the traveling-direction side of the AGV 400. The AGV 400 also has a plurality of motors, which are not shown in FIG. 22. FIG. 22 shows the front wheel 411a and the rear wheels 411b and 411c, but the front wheel 411d is not shown because it is hidden behind the frame 412.

 走行制御装置414は、AGV400の動作を制御する装置であり、主としてマイコン(後述)を含む集積回路、電子部品およびそれらが搭載された基板を含む。 The traveling control device 414 is a device that controls the operation of the AGV 400, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.

 レーザ・レンジ・ファインダ415は、たとえば赤外のレーザ光415aを目標物に照射し、当該レーザ光415aの反射光を検出することにより、目標物までの距離を測定する光学機器である。本実施形態では、AGV400のレーザ・レンジ・ファインダ415は、たとえばAGV400の正面を基準として左右135度(合計270度)の範囲の空間に、0.25度ごとに方向を変化させながらパルス状のレーザ光415aを放射し、各レーザ光415aの反射光を検出する。これにより、0.25度ごと、合計1080ステップ分の角度で決まる方向における反射点までの距離のデータを得ることができる。 The laser range finder 415 is an optical device that measures the distance to a target by irradiating the target with, for example, infrared laser light 415a and detecting the reflected light of the laser light 415a. In this embodiment, the laser range finder 415 of the AGV 400 emits pulsed laser beams 415a while changing their direction in steps of 0.25 degrees over a range of 135 degrees to the left and right of the front of the AGV 400 (270 degrees in total), and detects the reflected light of each beam. This yields distance data to the reflection point in each direction determined at 0.25-degree intervals, for a total of 1080 steps.

 AGV400の位置および姿勢と、レーザ・レンジ・ファインダ415のスキャン結果とにより、AGV400は、工場500の地図を作成することができる。地図には、AGVの周囲の壁、柱等の構造物、床の上に載置された棚等の物体の配置が反映され得る。地図のデータは、AGV400内に設けられた記憶装置に格納される。 Based on the position and attitude of the AGV 400 and the scan result of the laser range finder 415, the AGV 400 can create a map of the factory 500. The map can reflect the arrangement of walls and structures such as pillars around the AGV, and of objects such as shelves placed on the floor. The map data is stored in a storage device provided in the AGV 400.

 一般に、移動体の位置および姿勢は、ポーズ(pose)と呼ばれる。2次元面内における移動体の位置および姿勢は、XY直交座標系における位置座標(x, y)と、X軸に対する角度θによって表現される。AGV400の位置および姿勢、すなわちポーズ(x, y, θ)を、以下、単に「位置」と呼ぶことがある。 In general, the position and attitude of a mobile body are called its pose. The position and attitude of a mobile body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis. The position and attitude of the AGV 400, that is, its pose (x, y, θ), may hereinafter simply be called its "position".

 なお、レーザ光415aの放射位置から見た反射点の位置は、角度および距離によって決定される極座標を用いて表現され得る。本実施形態では、レーザ・レンジ・ファインダ415は極座標で表現されたセンサデータを出力する。ただし、レーザ・レンジ・ファインダ415は、極座標で表現された位置を直交座標に変換して出力してもよい。 In addition, the position of the reflection point seen from the radiation position of the laser beam 415a can be expressed using polar coordinates determined by the angle and the distance. In this embodiment, the laser range finder 415 outputs sensor data expressed in polar coordinates. However, the laser range finder 415 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
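The polar-to-orthogonal conversion mentioned above can be sketched directly from the scan geometry stated in this embodiment (1080 samples at 0.25-degree steps over 270 degrees). The sample ranges below are made up; only the angular layout follows the text.

```python
import math

NUM_STEPS = 1080    # 270 degrees / 0.25 degrees, as stated in the text
STEP_DEG = 0.25
START_DEG = -135.0  # scan spans -135 .. +135 degrees about the sensor's front

def scan_to_points(ranges_m):
    """Convert one polar scan (list of ranges in metres) to (x, y) points."""
    points = []
    for i, r in enumerate(ranges_m):
        ang = math.radians(START_DEG + i * STEP_DEG)
        points.append((r * math.cos(ang), r * math.sin(ang)))
    return points

ranges = [2.0] * NUM_STEPS        # pretend every echo came from 2 m away
pts = scan_to_points(ranges)
print(len(pts), round(pts[540][0], 2))  # sample 540 looks straight ahead -> 1080 2.0
```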

 レーザ・レンジ・ファインダの構造および動作原理は公知であるため、本明細書ではこれ以上の詳細な説明は省略する。なお、レーザ・レンジ・ファインダ415によって検出され得る物体の例は、人、荷物、棚、壁である。 Since the structure and principle of operation of the laser range finder are known, further detailed description is omitted in this specification. Examples of objects that can be detected by the laser range finder 415 are people, luggage, shelves, and walls.

 レーザ・レンジ・ファインダ415は、周囲の空間をセンシングしてセンサデータを取得するための外界センサの一例である。そのような外界センサの他の例としては、イメージセンサおよび超音波センサが考えられる。 The laser range finder 415 is an example of an external sensor for sensing the surrounding space and acquiring sensor data. Other examples of such external sensors include image sensors and ultrasonic sensors.

 走行制御装置414は、レーザ・レンジ・ファインダ415の測定結果と、自身が保持する地図データとを比較して、自身の現在位置を推定することができる。地図データは、SLAM(Simultaneous Localization and Mapping)技術を用いて、AGV400自身によって取得されてもよい。 The traveling control device 414 can estimate its current position by comparing the measurement result of the laser range finder 415 with the map data held by itself. The map data may be acquired by the AGV 400 itself using SLAM (Simultaneous Localization and Mapping) technology.

 図23は、AGV400のハードウェアの構成を示している。また図23は、走行制御装置414の具体的な構成も示している。 FIG. 23 shows the hardware configuration of the AGV 400. FIG. 23 also shows the specific configuration of the travel control device 414.

 AGV400は、走行制御装置414と、レーザ・レンジ・ファインダ415と、2台のモータ416aおよび416bと、駆動装置417とを備えている。 The AGV 400 includes a travel control device 414, a laser range finder 415, two motors 416a and 416b, and a drive device 417.

 走行制御装置414は、マイコン414aと、メモリ414bと、記憶装置414cと、通信回路414dと、測位装置414eとを有している。マイコン414a、メモリ414b、記憶装置414c、通信回路414dおよび測位装置414eは通信バス414fで接続されており、相互にデータを授受することが可能である。レーザ・レンジ・ファインダ415もまた通信インタフェース(図示せず)を介して通信バス414fに接続されており、計測結果である計測データを、マイコン414a、測位装置414eおよび/またはメモリ414bに送信する。 The traveling control device 414 includes a microcomputer 414a, a memory 414b, a storage device 414c, a communication circuit 414d, and a positioning device 414e. The microcomputer 414a, the memory 414b, the storage device 414c, the communication circuit 414d, and the positioning device 414e are connected by a communication bus 414f and can exchange data with each other. The laser range finder 415 is also connected to the communication bus 414f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 414a, the positioning device 414e, and / or the memory 414b.

 電子機器300は通信バス414fに接続されている。通信バス414fを介して、電子機器300はマイコン414aに、目標位置としてビーコン10の位置を示す情報を送信し、また撮像装置50から出力された画像データを出力する。 The electronic device 300 is connected to the communication bus 414f. The electronic device 300 transmits information indicating the position of the beacon 10 as a target position to the microcomputer 414a via the communication bus 414f, and outputs the image data output from the imaging device 50.

 マイコン414aは、走行制御装置414を含むAGV400の全体を制御するための演算を行うプロセッサまたは制御回路(コンピュータ)である。典型的にはマイコン414aは半導体集積回路である。マイコン414aは、制御信号であるPWM(Pulse Width Modulation)信号を駆動装置417に送信して駆動装置417を制御し、モータに印加する電圧を調整させる。これによりモータ416aおよび416bの各々が所望の回転速度で回転する。 The microcomputer 414a is a processor or a control circuit (computer) that performs an operation for controlling the entire AGV 400 including the travel control device 414. Typically, the microcomputer 414a is a semiconductor integrated circuit. The microcomputer 414a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the driving device 417 to control the driving device 417 and adjust the voltage applied to the motor. Thereby, each of motors 416a and 416b rotates at a desired rotation speed.

 メモリ414bは、マイコン414aが実行するコンピュータプログラムを記憶する揮発性の記憶装置である。メモリ414bは、マイコン414aおよび測位装置414eが演算を行う際のワークメモリとしても利用され得る。 The memory 414b is a volatile storage device that stores a computer program executed by the microcomputer 414a. The memory 414b can also be used as a work memory when the microcomputer 414a and the positioning device 414e perform calculations.

 記憶装置414cは、不揮発性の半導体メモリ装置である。ただし、記憶装置414cは、ハードディスクに代表される磁気記録媒体、または、光ディスクに代表される光学式記録媒体であってもよい。さらに、記憶装置414cは、いずれかの記録媒体にデータを書き込みおよび/または読み出すためのヘッド装置および当該ヘッド装置の制御装置を含んでもよい。 The storage device 414c is a nonvolatile semiconductor memory device. However, the storage device 414c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk. Furthermore, the storage device 414c may include a head device for writing and / or reading data on any recording medium and a control device for the head device.

 記憶装置414cは、走行する工場500の地図データMを記憶する。地図データMは、予め作成され記憶装置414cに記憶される。 The storage device 414c stores map data M of the factory 500 in which the AGV travels. The map data M is created in advance and stored in the storage device 414c.

 AGV400は、作成された地図と走行中に取得されたレーザ・レンジ・ファインダ415が出力したセンサデータとを利用して自己位置を推定しながら、目的のビーコン10の位置に向かって走行することができる。 The AGV 400 can travel toward the position of the target beacon 10 while estimating its own position using the created map and the sensor data output by the laser range finder 415 during travel.

 測位装置414eは、レーザ・レンジ・ファインダ415からセンサデータを受け取り、また、記憶装置414cに記憶された地図データMを読み出す。レーザ・レンジ・ファインダ415のスキャン結果から作成された局所的地図データを、より広範囲の地図データMと照合(マッチング)することにより、地図データM上における自己位置(x, y, θ)を同定する。 The positioning device 414e receives sensor data from the laser range finder 415 and reads the map data M stored in the storage device 414c. By matching the local map data created from the scan results of the laser range finder 415 against the wider-area map data M, it identifies the self-position (x, y, θ) on the map data M.
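The document states only that a local map is matched against the wider map to identify the pose (x, y, θ); it does not specify the matching algorithm. As a hedged, deliberately minimal sketch (a brute-force candidate search over an occupancy set, with invented names), the idea can be illustrated like this:

```python
# Minimal sketch of map matching, not the publication's algorithm. The global
# map is modeled as a set of occupied integer cells; names are illustrative.
import math

def score(global_map, local_scan, x, y, theta):
    """Count how many scan points (given in the robot frame) land on occupied
    cells of the global map when the robot is placed at pose (x, y, theta)."""
    hits = 0
    for px, py in local_scan:
        gx = x + px * math.cos(theta) - py * math.sin(theta)
        gy = y + px * math.sin(theta) + py * math.cos(theta)
        if (round(gx), round(gy)) in global_map:
            hits += 1
    return hits

def localize(global_map, local_scan, candidate_poses):
    """Pick the candidate pose whose transformed scan best matches the map."""
    return max(candidate_poses, key=lambda p: score(global_map, local_scan, *p))
```

Real systems use far more efficient scan matching, but the principle — choose the pose that maximizes agreement between the local scan and the stored map M — is the same.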

 本実施形態では、マイコン414aと測位装置414eとは別個の構成要素であるとしているが、これは一例である。マイコン414aおよび測位装置414eの各動作を独立して行うことが可能な1つのチップ回路または半導体集積回路であってもよい。図23には、マイコン414aおよび測位装置414eを包括するチップ回路414gが示されている。以下では、マイコン414aおよび測位装置414eが別個独立に設けられている例で説明する。 In the present embodiment, the microcomputer 414a and the positioning device 414e are separate components, but this is an example. It may be a single chip circuit or a semiconductor integrated circuit capable of independently performing the operations of the microcomputer 414a and the positioning device 414e. FIG. 23 shows a chip circuit 414g including the microcomputer 414a and the positioning device 414e. Hereinafter, an example in which the microcomputer 414a and the positioning device 414e are separately provided will be described.

 2台のモータ416aおよび416bは、それぞれ2つの車輪411bおよび411cに取り付けられ、各車輪を回転させる。つまり、2つの車輪411bおよび411cはそれぞれ駆動輪である。本明細書では、モータ416aおよびモータ416bは、それぞれAGV400の右輪および左輪を駆動するモータであるとして説明する。 The two motors 416a and 416b are attached to the two wheels 411b and 411c, respectively, and rotate the respective wheels. That is, the two wheels 411b and 411c are both drive wheels. In this specification, the motor 416a and the motor 416b will be described as motors that drive the right wheel and the left wheel of the AGV 400, respectively.

 駆動装置417は、2台のモータ416aおよび416bの各々に印加される電圧を調整するためのモータ駆動回路417aおよび417bを有する。モータ駆動回路417aおよび417bの各々はいわゆるインバータ回路であり、マイコン414aから送信されたPWM信号によって各モータに流れる電流をオンまたはオフし、それによりモータに印加される電圧を調整する。 The drive device 417 has motor drive circuits 417a and 417b for adjusting the voltage applied to each of the two motors 416a and 416b. Each of the motor drive circuits 417a and 417b is a so-called inverter circuit, which switches the current flowing through each motor on and off in accordance with the PWM signal transmitted from the microcomputer 414a, thereby adjusting the voltage applied to the motor.

 電子機器300は、ビーコン10から放射された信号波の到来方向を推定する。AGV400がその到来方向に向けて進むよう電子機器300から指示された場合、AGV400は直線的な経路で走行できない場合がある。その理由は、AGV400の自己位置とビーコン10との間に障害物が存在する場合でも、ビーコン10から放射された信号波は、障害物を透過して電子機器300に到達し得るからである。障害物の一例は、壁、柱、棚である。 The electronic device 300 estimates the arrival direction of the signal wave radiated from the beacon 10. When the AGV 400 is instructed by the electronic device 300 to travel in the direction of arrival, the AGV 400 may not be able to travel on a straight route. The reason is that even when an obstacle exists between the self-position of the AGV 400 and the beacon 10, the signal wave radiated from the beacon 10 can pass through the obstacle and reach the electronic device 300. Examples of obstacles are walls, pillars, and shelves.

 図24は、AGV400とビーコン10が存在する方向Pとの関係を示している。ここでは、棚510の紙面奥側にビーコン10が存在すると仮定している。AGV400からみて方向P上には、棚510が存在している。 FIG. 24 shows the relationship between the AGV 400 and the direction P in which the beacon 10 exists. Here, it is assumed that the beacon 10 exists on the back side of the shelf 510 in the drawing. A shelf 510 exists on the direction P as viewed from the AGV 400.

 マイコン414aは、地図データMを利用して、自己位置から見たビーコン10の方向Pに障害物があるか否かを判定する。障害物が存在する場合、マイコン414aは、地図データMを利用して、障害物を回避しながら目標位置に到達する経路を算出する。図24の例では、AGV400は、経路D1に沿って走行するのではなく、経路D2に沿って走行する。 The microcomputer 414a uses the map data M to determine whether or not there is an obstacle in the direction P of the beacon 10 viewed from its own position. When there is an obstacle, the microcomputer 414a uses the map data M to calculate a route to reach the target position while avoiding the obstacle. In the example of FIG. 24, the AGV 400 does not travel along the route D1, but travels along the route D2.
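The publication does not name the route-calculation method used by the microcomputer 414a. As one hedged illustration of the behavior in FIG. 24 — reaching the beacon while detouring around a mapped obstacle such as the shelf 510 — a breadth-first search over a grid map can be sketched as follows (grid size, cell coding, and names are assumptions, not taken from the document):

```python
# Sketch of obstacle-avoiding route calculation on a grid map; illustrative
# only. BFS returns a shortest path in number of moves, or None if blocked.
from collections import deque

def plan_path(start, goal, obstacles, size=10):
    """Shortest obstacle-free path from start to goal as a list of cells."""
    frontier = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while frontier:
        cur = frontier.popleft()
        if cur == goal:                # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in obstacles and nxt not in came_from):
                came_from[nxt] = cur
                frontier.append(nxt)
    return None
```

With the obstacle between the start and goal, the returned path corresponds to a detour like route D2 rather than the blocked straight route D1.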

 図25は、走行後のAGV400とビーコン10が存在する方向Pとの関係を示している。AGV400が走行することにより、ビーコン10が存在する方向Pは変化する。AGV400は地図データMを利用して、棚510を回避する経路D3を走行する。これにより、棚510を回避しながら、ビーコン10の位置に到達することができる。 FIG. 25 shows the relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists. As the AGV 400 travels, the direction P in which the beacon 10 exists changes. The AGV 400 uses the map data M to travel on a route D3 that avoids the shelf 510. Thereby, the position of the beacon 10 can be reached while avoiding the shelf 510.

 なお、地図データMが最新の状態を反映していない場合には、走行経路上に地図データM上には存在しない障害物が載置されている場合がある。そこでマイコン414aは、電子機器300から受け取った画像データをさらに利用して、画像データから障害物の有無を判定してもよい。撮像装置50としてステレオカメラを採用すれば、より高い精度で障害物の位置を認識することができる。 If the map data M does not reflect the latest state, an obstacle not present in the map data M may have been placed on the travel route. The microcomputer 414a may therefore additionally use the image data received from the electronic device 300 to determine the presence or absence of an obstacle. If a stereo camera is employed as the imaging device 50, the position of an obstacle can be recognized with higher accuracy.
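The higher accuracy available from a stereo camera comes from recovering depth per pixel. As a hedged aside (the pinhole stereo model below is standard textbook material, not something specified in the publication), depth follows from disparity as Z = f·B/d:

```python
# Standard pinhole stereo relation, shown for illustration; the publication
# does not specify how the stereo data is processed.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z [m] from focal length f [px], baseline B [m], disparity d [px].
    A pixel's depth places a map-absent obstacle in front of the AGV."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```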

<マルチコプター>
 次に、移動体の第3の例である、アレーアンテナおよび撮像装置を搭載したマルチコプターを説明する。
<Multicopter>
Next, a third example of the moving body, a multicopter equipped with an array antenna and an imaging device will be described.

 図26は、探索システム1200の構成例を示している。探索システム1200は、ビーコン10と、マルチコプター600と、探索救助センター施設700内のPC702およびディスプレイ装置704とを含む。ビーコン10は、たとえば山岳地帯、海または河川でのレジャー客が所有している。 FIG. 26 shows a configuration example of the search system 1200. The search system 1200 includes the beacon 10, the multicopter 600, and a PC 702 and a display device 704 in a search and rescue center facility 700. The beacon 10 is carried by, for example, a leisure visitor in a mountainous area, at sea, or on a river.

 本実施形態では、マルチコプター600は電子機器300を搭載している。 In the present embodiment, the multicopter 600 is equipped with the electronic device 300.

 探索救助センター施設700が救助の要請を受けると、マルチコプター600は、たとえば山岳地帯、海または河川の上空を飛行しながら要救助者のビーコン10から放射される信号波を検出し、信号波の到来方向を推定する。信号波が検出されると、電子機器300の撮像装置50は画像データの出力を開始する。電子機器300の信号処理回路30は、到来方向を示す情報を画像データに付加した映像信号を、通信回路40を介して送信する。マルチコプター600はGPSモジュールを有しており、マルチコプター600の位置情報を取得する。信号処理回路30は、マルチコプター600の位置情報も通信回路40を介して送信する。映像信号および位置情報は、探索救助センター施設700に送信される。 When the search and rescue center facility 700 receives a rescue request, the multicopter 600 detects a signal wave radiated from the beacon 10 of the person requiring rescue while flying over, for example, a mountainous area, the sea, or a river, and estimates the direction of arrival of the signal wave. When the signal wave is detected, the imaging device 50 of the electronic device 300 starts outputting image data. The signal processing circuit 30 of the electronic device 300 transmits, via the communication circuit 40, a video signal in which information indicating the direction of arrival is added to the image data. The multicopter 600 has a GPS module and acquires position information of the multicopter 600. The signal processing circuit 30 also transmits the position information of the multicopter 600 via the communication circuit 40. The video signal and the position information are transmitted to the search and rescue center facility 700.
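The publication leaves the direction-of-arrival estimation method unspecified. One classical approach an array antenna permits — shown here purely as a hedged illustration, with invented parameter names — is two-element phase interferometry, where the inter-element phase difference gives the arrival angle:

```python
# Two-element phase-interferometry sketch; this is a textbook DoA method,
# not necessarily the one used by the signal processing circuit 30.
import math

def doa_from_phase(phase_diff, wavelength, spacing):
    """Arrival angle [rad, measured from broadside] from the phase difference
    between two antenna elements a distance `spacing` apart.
    sin(theta) = phase_diff * wavelength / (2 * pi * spacing)."""
    s = phase_diff * wavelength / (2.0 * math.pi * spacing)
    return math.asin(max(-1.0, min(1.0, s)))
```

With half-wavelength spacing there is no phase ambiguity over ±90°, which is why that spacing is common in practice.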

 探索救助センター施設700では、職員がPC702を利用して映像信号をディスプレイ装置704上で再生する。職員は、位置情報を用いて、その時点でのマルチコプター600の位置と、当該位置から見た要救助者が存在する方向を特定することができる。ディスプレイ装置704上に、ビーコン10の推定位置が表示されるため、位置の把握が容易であり、探索隊の編成、派遣等の救助活動を迅速に開始することができる。 At the search and rescue center facility 700, a staff member uses the PC 702 to reproduce the video signal on the display device 704. Using the position information, the staff member can identify the position of the multicopter 600 at that time and the direction, as seen from that position, in which the person requiring rescue is present. Since the estimated position of the beacon 10 is displayed on the display device 704, the position is easy to grasp, and rescue activities such as organizing and dispatching a search party can be started quickly.
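The publication does not say how the displayed estimated position of the beacon 10 is computed from the multicopter's position and the direction of arrival. One simple possibility, sketched here under an explicit flat-ground assumption and with invented names and a local metric coordinate frame, is to project the DoA ray from the drone's GPS position and altitude down to ground level:

```python
# Hedged sketch: project the DoA ray to the ground to estimate the beacon
# position. Flat ground and local metric coordinates are assumptions.
import math

def beacon_ground_estimate(drone_x, drone_y, altitude, azimuth, elevation):
    """Ground-plane intersection of a ray from (drone_x, drone_y, altitude)
    pointing at `azimuth` [rad] and depressed by `elevation` [rad]."""
    if elevation <= 0:
        return None  # ray never reaches the ground
    horiz = altitude / math.tan(elevation)
    return (drone_x + horiz * math.cos(azimuth),
            drone_y + horiz * math.sin(azimuth))
```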

 図27は、本開示による例示的なマルチコプター600の外観斜視図である。また、図28は、マルチコプター600の側面図である。マルチコプター600の中央筐体602の下部には、駆動装置302を介して電子機器300が取り付けられている。図17で説明した例と同様、駆動装置302は、アレーアンテナ20および撮像装置50を同時に、マルチコプター600の鉛直方向に平行な軸(ヨー軸)、水平方向に平行な軸(ピッチング軸)、および、前後方向に平行な軸(ロール軸)の周りに回転させることができる。なお、マルチコプター600の一般的な構成は公知であるため説明は省略する。 FIG. 27 is an external perspective view of an exemplary multicopter 600 according to the present disclosure. FIG. 28 is a side view of the multicopter 600. The electronic device 300 is attached to the lower part of the central housing 602 of the multicopter 600 via a driving device 302. As in the example described with reference to FIG. 17, the driving device 302 can rotate the array antenna 20 and the imaging device 50 together about an axis parallel to the vertical direction of the multicopter 600 (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis). Since the general configuration of the multicopter 600 is well known, its description is omitted.

 本実施形態においても、電子機器300は、モーションセンサ80の検出値を利用して、図10A~図11Bを参照しながら説明した表示位置の補正処理を行ってもよい。または、画像データの変化を利用した表示位置の補正処理を行ってもよい。手振れの場合と同様、マルチコプター600が飛行することにより、表示の位置ずれが発生し得るためである。表示位置を補正することにより、探索救助センター施設700の職員は、より容易に、かつ確実に、要救助者を発見することができる。 Also in the present embodiment, the electronic apparatus 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B by using the detection value of the motion sensor 80. Alternatively, display position correction processing using changes in image data may be performed. This is because, as in the case of camera shake, display misalignment may occur when the multicopter 600 flies. By correcting the display position, the staff of the search / rescue center facility 700 can more easily and reliably find a rescuer.
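The correction process referenced above (FIGS. 10A to 11B) is described elsewhere in the document; as a hedged one-dimensional illustration of the idea — invented names, an assumed horizontal field of view, and a linear angle-to-pixel model not taken from the publication — the on-screen marker can be shifted to compensate for the yaw that occurs between the DoA estimate and the current video frame:

```python
# Illustrative 1-D display-position correction; the real processing in the
# electronic device 300 is not specified at this level of detail.

def corrected_pixel_x(doa_deg, yaw_change_deg, image_width=1920, hfov_deg=90.0):
    """Horizontal pixel for the direction marker after compensating for the
    platform's yaw change (from the motion sensor 80) since the DoA estimate."""
    rel = doa_deg - yaw_change_deg          # direction relative to new heading
    return image_width / 2 + rel / hfov_deg * image_width
```

If the platform yaws by exactly the estimated arrival angle, the marker returns to the image center, i.e. the camera now faces the beacon.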

 本開示の移動体は、ビーコンを備える装置、器具、もしくは物品、またはビーコンが配置された位置もしくはその近傍に人を案内する電子機器を搭載する。そのような移動体は、移動ロボット、無人搬送車、ドローン、自動車などであり得る。 A mobile body of the present disclosure carries an electronic device that guides a person to a device, instrument, or article provided with a beacon, or to the position where a beacon is placed or its vicinity. Such a mobile body may be a mobile robot, an automated guided vehicle, a drone, an automobile, or the like.

 10 ビーコン(タグ)、20 アレーアンテナ、30 信号処理回路、 32 記憶装置(メモリ)、40 通信回路、50 撮像装置、52 レンズ、54 イメージセンサ、80 モーションセンサ、82 触覚デバイス、82a 振動モータ、82b モータ駆動回路、100、110、120、130 モバイル機器、200 車両、300 電子機器、302 駆動装置、310 ディスプレイ装置、400 AGV、600 マルチコプター、1000 配車システム、1100 搬送システム、1200 探索システム 10 beacons (tags), 20 array antennas, 30 signal processing circuits, 32 storage devices (memory), 40 communication circuits, 50 imaging devices, 52 lenses, 54 image sensors, 80 motion sensors, 82 tactile devices, 82a vibration motors, 82b Motor drive circuit, 100, 110, 120, 130 mobile device, 200 vehicle, 300 electronic device, 302 drive device, 310 display device, 400 AGV, 600 multicopter, 1000 dispatch system, 1100 transport system, 1200 search system

Claims (14)

 画像データを出力する撮像装置と、
 ビーコンから周期的または断続的に放射された信号波を受信する複数のアンテナ素子を有するアレーアンテナと、
 前記アレーアンテナから出力された信号に基づいて前記信号波の到来方向を推定し、前記到来方向を規定する座標を決定する信号処理回路と、
 を備え、
 前記信号処理回路は、前記到来方向を示す情報を前記画像データに付加した映像信号を出力する、移動体。
An imaging device that outputs image data;
An array antenna having a plurality of antenna elements for receiving a signal wave periodically or intermittently radiated from a beacon;
A signal processing circuit that estimates an arrival direction of the signal wave based on a signal output from the array antenna, and determines coordinates defining the arrival direction;
With
A moving body, wherein the signal processing circuit outputs a video signal in which the information indicating the direction of arrival is added to the image data.
 前記信号波は、前記ビーコンまたは前記ビーコンを携帯する人に関する識別情報を有する付加情報を含み、
 前記信号波から前記付加情報を取得する通信回路を更に備える、請求項1に記載の移動体。
The signal wave includes additional information having identification information regarding the beacon or a person carrying the beacon,
The mobile body according to claim 1, further comprising a communication circuit that acquires the additional information from the signal wave.
 前記画像データを表示するディスプレイ装置を備え、
 前記ディスプレイ装置は、前記映像信号に基づいて、前記到来方向を示す情報および前記画像データを表示する、請求項2に記載の移動体。
A display device for displaying the image data;
The moving body according to claim 2, wherein the display device displays information indicating the arrival direction and the image data based on the video signal.
 前記信号処理回路は、前記付加情報の選択された一部または全部を前記ディスプレイ装置上に表示させる、請求項3に記載の移動体。 The moving body according to claim 3, wherein the signal processing circuit causes a selected part or all of the additional information to be displayed on the display device.

 前記移動体に対する前記撮像装置および前記アレーアンテナの姿勢を変化させる駆動装置を備えている、請求項1から4のいずれかに記載の移動体。 The moving body according to any one of claims 1 to 4, further comprising a driving device that changes the attitude of the imaging device and the array antenna with respect to the moving body.

 前記移動体は、道路上を走行する車両である、請求項1から5のいずれかに記載の移動体。 The moving body according to any one of claims 1 to 5, wherein the moving body is a vehicle traveling on a road.

 モーションセンサを備え、
 前記信号処理回路は、前記モーションセンサからの出力に基づいて前記移動体の位置および/または姿勢の変化を推定し、前記変化に応じて前記到来方向を補正する、請求項1から6のいずれかに記載の移動体。
The moving body according to any one of claims 1 to 6, further comprising a motion sensor, wherein the signal processing circuit estimates a change in the position and/or attitude of the moving body based on an output from the motion sensor and corrects the direction of arrival according to the change.
 前記信号処理回路は、前記画像データに基づいて前記移動体の位置および/または姿勢の変化を推定し、前記変化に応じて前記到来方向を補正する、請求項1から6のいずれかに記載の移動体。 The moving body according to any one of claims 1 to 6, wherein the signal processing circuit estimates a change in the position and/or attitude of the moving body based on the image data and corrects the direction of arrival according to the change.

 複数のビーコンおよび複数の車両を含む配車システムであって、
 前記車両は、
 画像データを出力する撮像装置と、
 前記複数のビーコンのいずれかから周期的または断続的に放射された信号波であって、前記ビーコンまたは前記ビーコンを携帯する人に関する識別情報を有する付加情報を含む、信号波を受信する複数のアンテナ素子を有するアレーアンテナと、
 前記アレーアンテナから出力された信号に基づいて前記信号波の到来方向を推定し、前記到来方向を規定する座標を決定する信号処理回路と、
 前記信号波から前記付加情報を取得する通信回路と、
を備え、
 前記信号処理回路は、前記到来方向を示す情報を前記画像データに付加した映像信号を出力し、
 前記配車システムは、前記ビーコンまたは前記ビーコンを携帯する人の位置情報を取得して、前記車両に伝達する、配車システム。
A vehicle allocation system including a plurality of beacons and a plurality of vehicles,
The vehicle is
An imaging device that outputs image data;
An array antenna having a plurality of antenna elements configured to receive a signal wave radiated periodically or intermittently from any of the plurality of beacons, the signal wave including additional information having identification information regarding the beacon or a person carrying the beacon;
A signal processing circuit that estimates an arrival direction of the signal wave based on a signal output from the array antenna, and determines coordinates defining the arrival direction;
A communication circuit for acquiring the additional information from the signal wave;
With
The signal processing circuit outputs a video signal in which information indicating the arrival direction is added to the image data,
The vehicle allocation system acquires the position information of the beacon or a person carrying the beacon and transmits it to the vehicle.
 前記ビーコンまたは前記ビーコンを携帯する前記人の前記識別情報を含む登録情報を記憶するメモリを備え、
 前記信号処理回路は、前記登録情報と、前記通信回路が取得した前記付加情報に含まれる識別情報を照合する、請求項9に記載の配車システム。
A memory for storing registration information including the identification information of the beacon or the person carrying the beacon;
The vehicle allocation system according to claim 9, wherein the signal processing circuit checks the registration information against identification information included in the additional information acquired by the communication circuit.
 前記ビーコンを携帯する前記人の画像を記憶するメモリを備え、
 前記信号処理回路は、前記人の画像と、前記撮像装置が出力した前記画像データによって規定される画像内の人物とを照合する、請求項9または10に記載の配車システム。
A memory for storing an image of the person carrying the beacon;
The vehicle allocation system according to claim 9 or 10, wherein the signal processing circuit collates the image of the person with a person in an image defined by the image data output from the imaging device.
 前記撮像装置および前記アレーアンテナは、相対的位置関係が固定され、かつ、前記車両の運転手によって移動可能である、請求項9から11のいずれかに記載の配車システム。 The vehicle allocation system according to any one of claims 9 to 11, wherein the imaging device and the array antenna have a fixed relative positional relationship and are movable by a driver of the vehicle.

 前記車両はモーションセンサを備え、
 前記信号処理回路は、前記モーションセンサからの出力に基づいて前記車両の位置および/または姿勢の変化を推定し、前記変化に応じて前記到来方向を補正する、請求項9から12のいずれかに記載の配車システム。
The vehicle includes a motion sensor;
The vehicle allocation system according to any one of claims 9 to 12, wherein the signal processing circuit estimates a change in the position and/or attitude of the vehicle based on an output from the motion sensor and corrects the direction of arrival according to the change.
 前記信号処理回路は、前記画像データに基づいて前記車両の位置および/または姿勢の変化を推定し、前記変化に応じて前記到来方向を補正する、請求項9から13のいずれかに記載の配車システム。 The vehicle allocation system according to any one of claims 9 to 13, wherein the signal processing circuit estimates a change in the position and/or attitude of the vehicle based on the image data and corrects the direction of arrival according to the change.
PCT/JP2018/018724 2017-05-31 2018-05-15 Mobile body provided with radio antenna, and vehicle dispatch system Ceased WO2018221204A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017107952 2017-05-31
JP2017-107952 2017-05-31

Publications (1)

Publication Number Publication Date
WO2018221204A1 true WO2018221204A1 (en) 2018-12-06

Family

ID=64454559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018724 Ceased WO2018221204A1 (en) 2017-05-31 2018-05-15 Mobile body provided with radio antenna, and vehicle dispatch system

Country Status (1)

Country Link
WO (1) WO2018221204A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110415534A (en) * 2019-08-03 2019-11-05 唐伟 A kind of solar energy intelligence zebra stripes traffic control robot
JP2023022898A (en) * 2021-08-04 2023-02-16 ローム株式会社 Transmission element imaging apparatus and transmission element imaging method
CN116010433A (en) * 2022-12-28 2023-04-25 广东嘉腾机器人自动化有限公司 Data update method, storage medium and electronic device based on differential data

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005229449A (en) * 2004-02-16 2005-08-25 Toyama Prefecture Mountain disaster casualties search system
JP2006242871A (en) * 2005-03-04 2006-09-14 Victor Co Of Japan Ltd Beacon receiver and viewer system
JP2015158802A (en) * 2014-02-24 2015-09-03 国立研究開発法人宇宙航空研究開発機構 Method for preventing misperception caused by parallax by correcting viewpoint position of camera image and system for implementing the same
JP2015191641A (en) * 2014-03-31 2015-11-02 Necエンベデッドプロダクツ株式会社 Monitoring device, monitoring system, monitoring method, and program
JP2016040151A (en) * 2014-08-12 2016-03-24 エイディシーテクノロジー株式会社 Communication system
JP2016181156A (en) * 2015-03-24 2016-10-13 株式会社Nttドコモ Vehicle allocation device, vehicle allocation system, vehicle allocation method, and program
JP2017027220A (en) * 2015-07-17 2017-02-02 日立オートモティブシステムズ株式会社 In-vehicle environment recognition device




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18808861

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18808861

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP