WO2019061466A1 - Flight control method, remote control device and remote control system - Google Patents
Flight control method, remote control device and remote control system
- Publication number
- WO2019061466A1 (PCT/CN2017/104911)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target object
- information
- remote control
- feature
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Definitions
- the present invention relates to the field of electronic technologies, and in particular, to a flight control method, a remote control device, and a remote control system.
- Aircraft such as drones, remote-controlled aircraft, and other aerial vehicles can carry cameras, spraying devices, and the like to perform aerial photography, spraying, and other tasks.
- In one control scheme, the operator holds the remote control device with both hands and moves a rocker up and down as needed to control the drone; a control command is generated according to the rocking direction and rocking amplitude of the rocker to control the flight of the aircraft.
- the embodiment of the invention discloses a flight control method, a remote control device and a remote control system, which can enrich the control modes of the aircraft.
- an embodiment of the present invention discloses a flight control method, which is applied to a remote control device, where the remote control device is used to remotely control aircraft flight, and the method includes:
- the control command is transmitted to the aircraft over a wireless link to control the aircraft to fly.
- an embodiment of the present invention discloses a remote control device for remotely controlling aircraft flight, including: a memory and a processor;
- the memory is configured to store program instructions
- the processor is configured to execute program instructions stored in the memory, when program instructions are executed,
- the processor is used to:
- the control command is transmitted to the aircraft over a wireless link to control the aircraft to fly.
- an embodiment of the present invention discloses a remote control system, including:
- at least one camera device and/or at least one sensor, the camera device comprising a red, green and blue (RGB) camera device;
- a remote control device as in the second aspect.
- the remote control device can detect the movement trajectory of the feature position area on the target object in space, determine the hover control action, generate a control instruction according to the hover control action, and finally send the control instruction to the aircraft over the wireless link to control the flight of the aircraft. The flight of the aircraft can thus be controlled accurately without the operator touching the remote control device, enriching the control modes of the aircraft and improving the intelligence of the remote control device.
- FIG. 1 is a schematic diagram of a scenario for flight control according to an embodiment of the present invention
- FIG. 2 is a schematic diagram of another scenario for flight control according to an embodiment of the present invention.
- FIG. 3 is a schematic flow chart of a flight control method according to an embodiment of the present invention.
- FIG. 4 is a schematic flow chart of another flight control method according to an embodiment of the present invention.
- FIG. 5 is a schematic structural diagram of a remote control device according to an embodiment of the present invention.
- FIG. 6 is a schematic structural diagram of a remote control system according to an embodiment of the present invention.
- UAVs have a very wide range of uses. They are currently widely used in aerial photography, agriculture, express transportation, disaster relief, surveying and mapping, news reporting, power inspection, and so on, making UAV technology a popular emerging technology. Within UAV technology, achieving precise control of the drone is an important research direction.
- the first way requires the operator to manipulate a remote control device such as a joystick with both hands to achieve relatively precise control of the drone.
- the above method requires physical contact with the remote control device before the remote control device can recognize the operator's input, so the control modes of the aircraft are limited.
- the second way is that the operator performs a wave motion in front of the aircraft; the aircraft then uses a camera device provided on the aircraft (such as a camera or a close-range binocular stereo module) to collect depth information of the palm, uses the depth information to identify the operator's wave action, and controls the aircraft to fly in accordance with the flight instruction represented by the wave action.
- the second method described above can control the flight of the aircraft without a remote control device.
- however, the aircraft can only be controlled by the operator's waving action, so the control mode is relatively simple; moreover, the operator's hand needs to be very close to the aircraft and in the same horizontal plane so that the aircraft can capture the waving action, and the flight of the aircraft cannot be controlled remotely.
- the present invention provides a flight control method, a remote control device, and a remote control system.
- FIG. 1 and FIG. 2 are schematic diagrams of scenarios for flight control according to an embodiment of the present invention.
- the hand is the target object and the knuckle point is the feature position area of the target object; however, it should be understood that in the embodiment of the present invention, the hand and the knuckle point are merely examples of the target object and the feature location area of the target object. In other embodiments, the target object and the feature location area of the target object may also be other objects, which is not limited in this embodiment of the present invention.
- the execution body of steps 101-106 is a remote control device, and the remote control device can be used to remotely control the flight of the aircraft.
- the remote control device may be a wearable device, an augmented reality device, or the like.
- the wearable device may be, for example, a smart watch, a smart bracelet, smart glasses, etc.
- the augmented reality device may be, for example, a head mounted display or the like.
- the remote control device is exemplified by a smart watch, but it should be understood that in other embodiments, the remote control device may be any of the above-mentioned remote control devices.
- the remote control device can acquire the depth information of the detection area through a depth information collection device (for example, a binocular stereo vision module, a 3D time-of-flight (ToF) module, a depth sensor, etc.) disposed on the remote control device.
- an area between two broken lines may be the detection area, and the remote control device may acquire depth information in the detection area.
- the depth information may be determined from a depth map of the current frame.
- the remote control device can use the depth information to detect whether a hand (i.e., a target object) is present in the detection area; if present, the hand can be roughly positioned, and specifically the palm can be roughly positioned.
- the remote control device may use the surface of the current dial as a reference coordinate system, project the depth information of the current frame onto the reference coordinate system to generate a point cloud image, and then detect from the point cloud image whether the palm is present; if present, the palm is roughly positioned and steps 103-106 are performed.
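The projection and coarse-detection steps above can be sketched as follows. This is a minimal illustration under stated assumptions: the helper names, the simple orthographic projection, and the point-count threshold are hypothetical, not the patent's actual implementation.

```python
import numpy as np

def depth_to_point_cloud(depth_map, origin=(0.0, 0.0, 0.0)):
    # Project each valid depth pixel onto a reference coordinate system
    # centered on the remote control device (e.g. the watch dial).
    h, w = depth_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel() - origin[0],
                    ys.ravel() - origin[1],
                    depth_map.ravel() - origin[2]], axis=1)
    return pts[depth_map.ravel() > 0]  # drop pixels with no depth reading

def palm_present(points, min_points=50):
    # Coarse positioning check: assume a palm is present when enough
    # points fall inside the detection volume (threshold is illustrative).
    return len(points) >= min_points
```

A real system would also cluster the points and fit a palm model before proceeding to fine knuckle-point positioning.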
- otherwise, the remote control device can enter a sleep mode, adjusting the frequency of detecting the palm from the first frequency to a second frequency (the second frequency being lower than the first frequency). If the remote control device receives a wake-up command (triggered, for example, by shaking the remote control device), the sleep mode is released and the second frequency is adjusted back to the first frequency.
- the knuckle point can be accurately positioned after the palm is roughly positioned.
- a black dot on the tip of the finger can be used to indicate a feature position area (ie, a knuckle point) of the target object, and the remote control device can accurately position the knuckle point.
- the remote control device may extract a region of interest (ROI) in the point cloud image and perform palm fitting according to the point cloud data provided by the point cloud image within the ROI range, thereby precisely positioning each knuckle point.
- the remote control device can fit a movement trajectory based on the positional movement of the knuckle points.
- the remote control device may further remove the positional deviation of the movement trajectory caused by shaking of the remote control device, according to a preset filter function and the posture information of the remote control device, thereby implementing smoothing filtering and obtaining the movement track of the knuckle points.
- the remote control device can determine a hover control action according to the movement trajectory and generate a corresponding control command. For example, if the remote control device fits a horizontal rightward movement trajectory, the hover control action is to control the aircraft to fly horizontally to the right, and the control command is an instruction to control the aircraft to fly horizontally to the right.
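As a hedged sketch, the mapping from a fitted trajectory to a hover control action might look like the following; the action names and the dominant-axis rule are assumptions chosen for illustration, not the patent's classifier.

```python
def classify_hover_action(trajectory):
    # trajectory: list of (x, y) positions of the knuckle point over time.
    # The net displacement's dominant axis selects the hover control action.
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) >= abs(dy):
        return "fly_right" if dx > 0 else "fly_left"
    return "fly_up" if dy > 0 else "fly_down"
```

For a trajectory that moves mostly rightward, this returns the action corresponding to horizontal rightward flight.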
- the remote control device can transmit the control command to the drone via a wireless link (e.g., a cellular mobile data network, Bluetooth, infrared, etc.) to cause the drone to fly in accordance with the indication of the control command.
- the operator can thus control the flight of the aircraft without touching the remote control device, which breaks the limitation of requiring physical contact with the remote control device and realizes precise control of the aircraft. The form of the remote control device is not limited to the traditional rocker; it can also be a wearable device, an augmented reality device, etc., which enriches the control modes of the aircraft and improves the intelligence of the remote control device.
- FIG. 3 is a schematic flowchart of a flight control method according to an embodiment of the present invention.
- the method shown in FIG. 3 may include:
- execution body of the embodiment of the present invention may be a remote control device for remotely controlling aircraft flight.
- the remote control device is a wearable device or an augmented reality device.
- the wearable device is any one or more of a smart watch, smart glasses, and a smart bracelet; the augmented reality device is a head mounted display.
- the remote control device detecting a movement trajectory of the feature location area on the target object in space specifically includes: acquiring image feature information of the detection area; and determining, from the image feature information of the detection area, the movement trajectory of the feature location area on the target object in space.
- the detection area may be an area that can be detected by the depth information collection device on the remote control device.
- image feature information can be used to represent feature information in the scene image within the range of the detection region.
- for example, the remote control device may be a smart watch, the detection area may be the area above the dial of the smart watch and between the two broken lines, and the remote control device may collect the image feature information of the detection area.
- the remote control device may acquire images in real time and extract image feature information from them, or acquire image feature information according to a preset schedule. The image feature information provided by each frame image may correspond to its capture time; for example, image feature information provided by an image captured at 12:30:10 corresponds to the time 12:30:10, and the current time corresponds to the current frame image.
- the image feature information may be the image feature information in the current frame image, or may be the image feature information in the continuous multi-frame image, and the like, which is not limited in this embodiment of the present invention.
- the remote control device may acquire a current frame image of the detection area, extract image feature information in the current frame image, and detect a movement trajectory of the feature position area on the target object in the space according to the image feature information.
- before determining the movement trajectory of the feature location area on the target object from the image feature information of the detection area, the remote control device further performs: detecting whether a target object exists in the detection area; and, if a target object exists, adjusting the acquisition frequency used when acquiring the image feature information of the detection area to the first frequency.
- the target object is a hand
- the feature location area on the target object is a knuckle point on the hand.
- the target object may also be another biometric part, such as an eye, a mouth, or a head, and the feature location area on the target object may correspondingly be the pupil of the eye, the lips of the mouth, the hair of the head, etc.; the present invention does not impose any limitation on this.
- the remote control device may determine, according to the image feature information, whether the target object exists after extracting the image feature information in the current frame image; if present, the acquisition frequency used when acquiring the image feature information of the detection area is adjusted to the first frequency.
- the first frequency may be, for example, a high acquisition frequency such as 50 Hz, 100 Hz, or the like. That is to say, if the remote control device detects the target object, the acquisition frequency of the remote control device at this time may be the first frequency, which may facilitate fitting the movement trajectory of the target object.
- if no target object exists, the acquisition frequency used when acquiring the image feature information of the detection area is adjusted to the second frequency.
- the second frequency may be, for example, a lower acquisition frequency such as 5 Hz, 10 Hz, or the like. That is to say, if the remote control device detects that the target object does not exist, the acquisition frequency of the remote control device may be the second frequency, which may reduce the power consumption of the remote control device and increase the usage time of the remote control device.
- when the remote control device is in a sleep mode at the second frequency, the method further comprises: if a wake-up command is received, adjusting the acquisition frequency of the image feature information of the detection region from the second frequency to the first frequency, wherein the first frequency is higher than the second frequency.
- the wake-up command may be, for example, an operation recognized by an inertial measurement unit, such as a wrist-raising action.
- if the remote control device is a smart wristband or a smart watch, it may be determined that the wake-up command is received by recognizing the action of raising the hand with the face of the dial facing up.
- after receiving the wake-up command, the remote control device may determine that it needs to enter the working mode (i.e., the mode of detecting the target object); the acquisition frequency corresponding to the working mode may be the first frequency, so the remote control device adjusts the acquisition frequency from the second frequency to the first frequency.
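The first/second frequency policy and the wake-up behaviour described above can be sketched as a small state machine; the frequency values are illustrative placeholders, not values taken from the patent.

```python
class DetectionScheduler:
    def __init__(self, first_hz=100, second_hz=10):
        # first_hz: acquisition frequency while a target object is present
        # second_hz: reduced frequency used in sleep mode (second < first)
        self.first_hz, self.second_hz = first_hz, second_hz
        self.rate_hz, self.sleeping = first_hz, False

    def on_detection(self, target_found):
        # Drop to the second frequency when no target object is detected,
        # reducing power consumption; restore it when a target appears.
        if target_found:
            self.rate_hz, self.sleeping = self.first_hz, False
        else:
            self.rate_hz, self.sleeping = self.second_hz, True

    def on_wake_command(self):
        # A wake-up command (e.g. a recognized wrist raise) restores
        # the working-mode frequency.
        if self.sleeping:
            self.rate_hz, self.sleeping = self.first_hz, False
```

The lower sleep-mode rate trades detection latency for battery life on the wearable device.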
- determining the movement trajectory of the feature location area on the target object from the image feature information of the detection area comprises: determining the feature location area on the target object according to the image feature information of the detection area; and determining the movement trajectory of the feature location area in space according to at least two frames of image feature information of the detection area.
- the image feature information provided by the at least two frames of the detection area may be image feature information provided by two consecutive frames of images.
- specifically, the remote control device determines the feature location area on the target object by using the image feature information provided by the current frame image; the remote control device can then determine the movement trajectory of the feature location area in space according to the image feature information provided by the current frame image and the image feature information provided by the next frame image.
- the moving track may refer to a position change track of the feature position area in space.
- the image feature information of the detection area is depth information of the detection area.
- the remote control device may first acquire a depth image of the detection area, then determine depth information from the depth image, and determine the movement trajectory of the feature location area on the target object in space according to parameters such as the depth change indicated by the depth information and the direction and magnitude of the contour movement of the target object.
- the remote control device may acquire image feature information in the detection area by using a binocular stereo vision module, a 3D TOF module, a depth sensor, or the like, and further detect whether the target object exists in the detection area.
- the determining the feature location area on the target object according to the depth information of the detection area comprises: performing projection processing on the reference coordinate system according to the depth information of the detection area, and generating on the reference coordinate system a point cloud image, the reference coordinate system is determined according to the remote control device; determining contour information of the target object according to the point cloud image; determining position information of the feature position region according to the contour information of the target object, A feature location area on the target object is determined.
- the reference coordinate system may be a coordinate system centered on the remote control device.
- the center point or another point of the surface of the remote control device may be taken as the origin of the reference coordinate system, and the horizontal axis and the vertical axis of the reference coordinate system are then established on the surface of the remote control device (for example, the dial of the smart watch).
- the depth information may be determined from an image in which the distance (depth) value from each point in the detection area to the depth information collection device is taken as a pixel value.
- the reflected light carries information such as the orientation and distance of the target object. If the light beam is scanned according to a preset trajectory, the reflection information is recorded during scanning; when the scan is very fine, a large number of light spots can be obtained, so that a point cloud image can be formed. That is to say, the point cloud image can indicate information such as the orientation and distance of the target object.
- the remote control device may first determine the depth information represented by the depth map of the detection area, then project the depth information into a reference coordinate system and generate a point cloud image in the reference coordinate system. The point cloud image can represent information such as the orientation and distance of the target object, and the contour information of the target object can be determined according to this information.
- a position where the point cloud data within the contour is dense can be regarded as the feature position area on the target object.
- the determining of the location information of the feature location area according to the contour information of the target object comprises: determining initial location information of the feature location area according to the contour information of the target object; and smoothing the initial location information according to a preset filter function and the posture information of the remote control device to obtain the location information of the feature location area.
- location information of the feature location area may be used to indicate the location change of the feature location area itself.
- the initial location information of the feature location area may include information of the location movement of the feature location area due to the movement of the remote control device itself.
- specifically, the remote control device can remove the positional movement of the feature location area caused by the movement of the remote control device itself, by using a preset filter function and the attitude information of the remote control device measured by an inertial measurement unit, to obtain the location information of the feature location area.
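A minimal sketch of this smoothing step, assuming the IMU supplies a per-frame displacement of the device itself and substituting a simple exponential filter for the patent's unspecified preset filter function:

```python
def smooth_positions(raw_positions, device_offsets, alpha=0.5):
    # Subtract the displacement the IMU attributes to the device's own
    # shaking, then exponentially smooth the corrected positions.
    smoothed, prev = [], None
    for pos, off in zip(raw_positions, device_offsets):
        corrected = tuple(p - o for p, o in zip(pos, off))
        prev = corrected if prev is None else tuple(
            alpha * c + (1 - alpha) * q for c, q in zip(corrected, prev))
        smoothed.append(prev)
    return smoothed
```

With the device shake removed, the remaining positions reflect only the knuckle point's own movement.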
- the detecting of the movement trajectory of the feature location area on the target object in space comprises: acquiring motion sensing data of the feature location area on the target object by using a sensor disposed on the feature location area; and obtaining the movement trajectory of the feature location area on the target object in space according to the motion sensing data.
- the sensor may be pre-set in the feature location area, and the sensor may be, for example, a temperature sensor, an infrared sensor, or the like, which is not limited in the embodiment of the present invention.
- the sensor can be wirelessly connected to the remote control device, and the sensor transmits the motion sensing data to the remote control device by wireless transmission.
- for example, the operator can wear gloves with a sensor provided at each knuckle point of the glove. The operator can issue control actions, and the sensors convert these control actions into motion sensing data and send the data to the remote control device; the remote control device can then perform data fitting according to the motion sensing data to obtain the movement track of the operator's knuckle points in space.
- the image feature information of the detection area is color information or infrared information of the detection area.
- the color information may be, for example, color information in an image captured in the detection area, for example, the image is an RGB (Red Green Blue) image captured by the imaging device.
- the infrared information may be, for example, information represented by an image taken by the infrared camera.
- the feature location area on the target object may be determined from the image feature information of the detection area based on deep learning. After the feature location area on the target object is located, the movement track of the feature location area in space may be obtained according to the position change of the feature location area on the target object.
- the hover control action can be determined according to the movement trajectory.
- the hover control action may refer to a control action triggered by the operator without touching the remote control device.
- the palm (target object) shown in FIG. 2 performs a control action in a space area above the smart watch (remote control device).
- control command can be used to control the flight of the aircraft.
- the hover control action and the control command may have a corresponding relationship, for example a correspondence between a direction and an angle.
- for example, if the hover control action is a horizontal leftward motion, the control command may be an instruction to control the aircraft to fly horizontally to the left.
- the generating of the control instruction according to the hover control action comprises: obtaining the type of the hover control action and the motion vector of the feature location area according to the determined contour information; and generating the control instruction according to the type of the hover control action and the motion vector of the feature location area.
- the motion vector of the feature location area includes a motion direction and a motion amplitude of the feature location area determined according to at least two frames of depth information.
- the moving direction of the feature location area may be any direction, such as up, down, left, right, upper left, lower right, lower left, upper right, and the like, which is not limited in this embodiment of the present invention.
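A straightforward sketch of computing such a motion vector (a unit direction plus a magnitude) from the feature location area's positions in two successive frames; the 2D representation is an assumption for illustration:

```python
import math

def motion_vector(pos_prev, pos_curr):
    # Returns (direction, magnitude) for the feature location area's
    # movement between two frames; direction is a unit vector.
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    magnitude = math.hypot(dx, dy)
    if magnitude == 0.0:
        return (0.0, 0.0), 0.0
    return (dx / magnitude, dy / magnitude), magnitude
```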
- the type of the hover control action can be used to indicate the direction of the hover control action, for example a type that controls upward flight, a type that controls downward flight, a type that controls leftward flight, a type that controls rightward flight, and the like.
- the target object may be a palm
- the feature location area of the target object is a knuckle point
- the remote control device determines the hover control action after positioning the knuckle points. The type of the hover control action may be determined by a classifier (for example, a type that controls downward flight), and the motion vector of each knuckle point may be calculated according to at least two frames of depth information; the motion vector may include parameters such as the motion direction and the motion magnitude. The remote control device can then determine, in accordance with the parameters in the motion vector, a control command that can be used to control the aircraft to fly downward.
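Combining the classifier's action type with the motion-vector parameters into a control command could look like the following; the field names and the linear speed scaling are assumptions for illustration, not the patent's command format.

```python
def build_control_command(action_type, direction, magnitude, speed_gain=0.1):
    # The motion amplitude scales the commanded speed; the direction is
    # passed through so the aircraft mirrors the knuckle movement.
    return {
        "action": action_type,   # e.g. "fly_down" from the classifier
        "direction": direction,  # unit vector of knuckle motion
        "speed": speed_gain * magnitude,
    }
```

The resulting dictionary stands in for whatever serialized message the wireless link would actually carry.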
- the wireless link may be, for example, a cellular mobile data network, wireless fidelity (WiFi), infrared, Bluetooth, etc., and the present invention does not impose any limitation on this.
- the remote control device can transmit a control command to the aircraft via a wireless link, and after receiving the control command, the aircraft can fly according to the instruction of the control command.
- for example, if the control command indicates flying horizontally to the right, the aircraft can fly horizontally to the right according to the control command.
- the remote control device can detect the movement trajectory of the feature location area on the target object in space, determine the corresponding hover control action, generate a control instruction according to the hover control action, and finally send the control instruction to the aircraft over the wireless link. The flight of the aircraft can thus be controlled without the operator touching the remote control device, which breaks the limitation of requiring physical contact with the remote control device, realizes precise control of the aircraft, enriches the control modes of the aircraft, and improves the intelligence of the remote control device.
- FIG. 4 is a schematic flowchart diagram of another flight control method according to an embodiment of the present invention.
- the method as shown in FIG. 4 may include:
- the remote control device may acquire depth information of the detection area before detecting the movement trajectory of the feature position area on the target object in space; according to the depth information of the detection area, it is detected whether a target object exists; if a target object exists, the detection of the movement trajectory of the feature position area on the target object in space is performed.
- a virtual image is superimposed on the detection area.
- the virtual image may be a non-realistic image generated by a virtual reality device, an augmented reality device, or the like.
- S402. Generate and display the virtual image in the detection area according to location information of the target object.
- the virtual image may be superimposed and displayed in the detection area by the remote control device.
- the remote control device can display the virtual image in any orientation, such as directly below or directly above the target object; the present invention does not impose any limitation on this.
- the remote control device may be an augmented reality device a
- the detection area may be an area directly in front of the line of sight of the augmented reality device a
- the augmented reality device a may detect the target object (eg, a palm) within the detection area.
- the virtual image is displayed directly below the palm, and the virtual image can be, for example, a virtual airplane.
- the virtual image may also be superimposed and displayed in the detection area by other devices.
- for example, if the remote control device is a smart watch, the operator can wear an augmented reality device b; the augmented reality device b can establish a connection with the smart watch, and when the remote control device detects the target object, the augmented reality device b can project a virtual image, such as a virtual aircraft, onto the detection area of the smart watch.
- the movement trajectory of the feature position area of the target object can be used as the movement trajectory of the virtual image.
- the target object may be a palm, and the feature location area of the target object may be a joint point. If the movement track fitted by the remote control device indicates that the joint point moves horizontally to the right, the virtual image can likewise be moved horizontally to the right following that movement trajectory.
- for example, the virtual image may be a virtual aircraft: the operator may "pinch" the virtual aircraft in the detection area, and as the operator's knuckle point moves horizontally to the right, the virtual aircraft follows it horizontally to the right.
- the remote control device can also derive a touchless control action from the movement trajectory of the knuckle point, determine the corresponding control instruction, send the control instruction to the aircraft, and thereby control the aircraft to fly according to that instruction.
- in this way, the remote control device can display the virtual image in the detection area and control its movement according to the movement trajectory of the feature position area on the target object in space, which makes remote control of the aircraft more engaging and also improves the intelligence of the remote control device.
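The coupling between the fitted joint-point trajectory and the virtual image can be sketched as applying the trajectory's net displacement to the image's position. The function and argument names below are illustrative, not taken from the embodiment:

```python
def move_virtual_image(image_pos, trajectory):
    """Apply the displacement of a fitted joint-point trajectory to the
    virtual image so it follows the operator's hand.
    trajectory: list of (x, y, z) joint positions over time (illustrative)."""
    (x0, y0, z0), (x1, y1, z1) = trajectory[0], trajectory[-1]
    dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
    x, y, z = image_pos
    return (x + dx, y + dy, z + dz)
```

A horizontal move of the joint point thus produces an equal horizontal move of the virtual aircraft.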
- FIG. 5 is a schematic structural diagram of a remote control device according to an embodiment of the present invention.
- the remote control device described in this embodiment includes:
- the memory 501 is configured to store program instructions
- the processor 502 is configured to execute the program instructions stored in the memory. When the program instructions are executed, the processor is configured to: detect a movement trajectory of the feature position area on the target object in space; determine a touchless control action based on the movement trajectory; generate a control instruction according to the touchless control action; and transmit the control instruction to the aircraft over a wireless link to control the aircraft to fly.
- the processor 502 is configured to: acquire depth information of the detection area before detecting a movement trajectory of the feature location area on the target object in space; detect, according to the depth information of the detection area, whether a target object exists; and, if a target object exists, detect the movement trajectory of the feature position area on the target object in space.
- when the processor 502 detects a movement trajectory of a feature location area on the target object in space, the processor 502 is specifically configured to: determine the movement trajectory of the feature position area on the target object in space from the image feature information of the detection area.
- the image feature information of the detection area is depth information of the detection area.
- the processor 502 is further configured to: detect whether the target object exists in the detection area; and, if the target object exists, adjust the acquisition frequency used when acquiring image feature information of the detection area to a first frequency.
- the processor 502 is further configured to: if the target object does not exist, adjust the acquisition frequency used when acquiring image feature information of the detection area to a second frequency.
- the remote control device is in a sleep mode at the second frequency. The processor 502 is further configured to: when a wake-up instruction is received, adjust the acquisition frequency used when acquiring image feature information of the detection area from the second frequency to the first frequency, where the first frequency is higher than the second frequency.
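The first/second-frequency behaviour above amounts to a small state machine: sample fast while a target is visible, drop to a low-power rate when it is not, and restore the fast rate on a wake-up instruction. The concrete frequency values below are illustrative assumptions:

```python
FIRST_FREQ_HZ = 30   # active acquisition rate (illustrative value)
SECOND_FREQ_HZ = 1   # low-power (sleep-mode) rate (illustrative value)

class AcquisitionController:
    """Sketch of the acquisition-frequency logic described for processor 502."""

    def __init__(self):
        # Start in the low-power state until a target is seen.
        self.freq_hz = SECOND_FREQ_HZ

    def on_detection(self, target_exists):
        # First frequency while the target is present, second otherwise.
        self.freq_hz = FIRST_FREQ_HZ if target_exists else SECOND_FREQ_HZ

    def on_wake_up(self):
        # A wake-up instruction restores the first (higher) frequency.
        self.freq_hz = FIRST_FREQ_HZ

    @property
    def sleeping(self):
        return self.freq_hz == SECOND_FREQ_HZ
```

The invariant that the first frequency is higher than the second is what makes the second state a sleep mode.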
- when the processor 502 determines the movement trajectory of the feature location area on the target object in space from the image feature information of the detection area, the processor 502 is specifically configured to: determine the feature location area on the target object according to the image feature information of the detection area; and determine the movement trajectory of the feature location area in space according to the image feature information of at least two frames of the detection area.
- the image feature information of the detection area is depth information of the detection area.
- when the processor 502 determines the feature location area on the target object according to the image feature information of the detection area, the processor 502 is specifically configured to: project the depth information of the detection area onto a reference coordinate system to generate a point cloud image in that reference coordinate system, the reference coordinate system being determined according to the remote control device; determine contour information of the target object according to the point cloud image; and determine position information of the feature location area according to the contour information of the target object, so as to determine the feature location area on the target object.
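The projection step above can be sketched as a standard pinhole back-projection of the depth map into the device's reference frame. The embodiment does not specify the camera model, so the intrinsics (fx, fy, cx, cy) below are assumed to come from calibration:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map into the remote control device's reference
    coordinate system using a pinhole model (an assumed camera model)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels
```

Contour extraction and feature-area localisation would then operate on this point cloud.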
- when the processor 502 determines the position information of the feature location area according to the contour information of the target object, it is specifically configured to: determine initial position information of the feature location area according to the contour information of the target object; and smooth the initial position information according to a preset filter function and the attitude information of the remote control device to obtain the position information of the feature location area.
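The embodiment leaves the "preset filter function" unspecified; a minimal stand-in is a one-step exponential moving average over the initial positions. The smoothing weight `alpha` is an assumption; a real implementation might derive it from the device's attitude jitter:

```python
def smooth_position(prev_smoothed, initial_pos, alpha=0.6):
    """One exponential-moving-average step as a placeholder for the
    unspecified preset filter function. alpha is an illustrative weight."""
    if prev_smoothed is None:
        return initial_pos  # first frame: nothing to smooth against
    return tuple(alpha * p + (1 - alpha) * q
                 for p, q in zip(initial_pos, prev_smoothed))
```

Feeding each frame's initial position through this filter suppresses jitter in the reported feature location.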
- when the processor 502 generates a control instruction according to the touchless control action, it is specifically configured to: obtain the type of the touchless control action and the motion vector of the feature location area according to the determined contour information; and generate the control instruction according to the type of the touchless control action and the motion vector of the feature location area.
- the motion vector of the feature location area includes the motion direction and motion amplitude of the feature location area, determined according to at least two frames of depth information.
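Computing the motion direction and amplitude from two frames of feature positions, and packing them with the action type, can be sketched as follows. The instruction encoding is hypothetical, since the embodiment does not define a wire format:

```python
import math

def motion_vector(pos_prev, pos_curr):
    """Motion direction (unit vector) and amplitude of the feature location
    area, from two frames of depth-derived positions."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    amplitude = math.hypot(dx, dy)
    direction = (dx / amplitude, dy / amplitude) if amplitude else (0.0, 0.0)
    return direction, amplitude

def make_control_instruction(action_type, direction, amplitude):
    """Hypothetical control-instruction encoding; a plain dict stands in
    for whatever format the wireless link actually carries."""
    return {"action": action_type, "direction": direction, "speed": amplitude}
```

The resulting instruction would then be sent to the aircraft over the wireless link.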
- when the processor 502 detects the movement trajectory of the feature location area on the target object in space, it is configured to: acquire motion sensing data of the feature location area on the target object by means of a sensor disposed on the feature location area; and obtain the movement trajectory of the feature location area on the target object in space according to the motion sensing data.
- the image feature information of the detection area is color information or infrared information of the detection area. When the processor 502 determines the feature position on the target object from the image feature information of the detection area, it is specifically configured to: determine, based on deep learning, the feature location area on the target object from the image feature information of the detection area; and acquire the movement trajectory of the feature location area of the target object in space.
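One common way such a deep-learning detector exposes the feature location is as a per-keypoint heatmap over the colour or infrared image, whose argmax gives the knuckle position. The network itself is out of scope here; only the assumed post-processing step is sketched:

```python
import numpy as np

def keypoint_from_heatmap(heatmap):
    """Take the argmax of a predicted keypoint heatmap as the feature
    (knuckle) location. Assumes a model that outputs one heatmap per
    keypoint; the embodiment does not name a specific architecture."""
    v, u = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    return int(u), int(v)  # (column, row) pixel coordinates
```

Tracking the argmax across frames then yields the movement trajectory of the feature location area.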
- a virtual image is superimposed and displayed in the detection area.
- the processor 502 is further configured to: generate and display the virtual image in the detection area according to location information of the target object.
- the processor 502 is further configured to: control the virtual image movement according to the movement trajectory.
- the remote control device is a wearable device or an augmented reality device.
- the wearable device is any one or more of a smart watch, smart glasses, and a smart bracelet; the augmented reality device is a head mounted display.
- the target object is a hand
- the feature location area on the target object is a knuckle point on the hand.
- FIG. 6 is a schematic structural diagram of a remote control system according to an embodiment of the present invention.
- the remote control system includes: at least one camera device and/or at least one sensor, where the camera device includes an RGB (red, green, and blue) camera device; an aircraft; and a remote control device.
- the remote control device 601 is the remote control device disclosed in the foregoing embodiment of the present invention, and the principles and implementations are similar to the foregoing embodiments, and details are not described herein again.
- the camera device 603 can be disposed on the remote control device for capturing depth information of the detection area.
- the camera device 603 may include an RGB camera device, which can acquire a color map of the feature location area of the target object; the remote control device 601 can use the color map to position and track the feature location area of the target object, thereby obtaining a movement trajectory.
- the sensor 604 can be disposed at the feature location area of the target object and may be, for example, a temperature sensor, a distance sensor, or an infrared sensor.
- the sensor 604 can also be disposed on the remote control device 601 for acquiring depth information of the detection area.
- the sensor 604 can be, for example, a depth sensor or the like.
- the remote control system can be applied to remotely controlled aircraft devices.
- the remote control device 601 can be configured to detect the movement trajectory of the feature position area on the target object in space, determine a touchless control action based on the movement trajectory, generate a control instruction according to the touchless control action, and send the control instruction to the aircraft over a wireless link to control the flight of the aircraft.
- the remote control device 601 can be used to perform the flight control method shown in the foregoing method embodiments; for the specific implementation process, refer to the method embodiments, and details are not described herein again.
- the program can be stored in a computer-readable storage medium; the storage medium can include a flash disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Selective Calling Equipment (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An embodiment of the present invention relates to a flight control method, a remote control device, and a remote control system. The method comprises: detecting a movement trajectory of a feature position area of a target object in space, and determining a touchless control action based on the movement trajectory; generating a control instruction according to the touchless control action; and transmitting the control instruction to an aerial vehicle over a wireless link so as to control the flight of the aerial vehicle. The invention provides a new way of controlling an aerial vehicle and enables a higher level of intelligence in the remote control device.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/104911 WO2019061466A1 (fr) | 2017-09-30 | 2017-09-30 | Procédé de commande de vol, dispositif de commande à distance et système de commande à distance |
| CN201780007166.8A CN108700885B (zh) | 2017-09-30 | 2017-09-30 | 一种飞行控制方法、遥控装置、遥控系统 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/104911 WO2019061466A1 (fr) | 2017-09-30 | 2017-09-30 | Procédé de commande de vol, dispositif de commande à distance et système de commande à distance |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019061466A1 true WO2019061466A1 (fr) | 2019-04-04 |
Family
ID=63844086
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/104911 Ceased WO2019061466A1 (fr) | 2017-09-30 | 2017-09-30 | Procédé de commande de vol, dispositif de commande à distance et système de commande à distance |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN108700885B (fr) |
| WO (1) | WO2019061466A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190095247A1 (en) * | 2017-09-26 | 2019-03-28 | Omron Corporation | Control device |
| CN115620182A (zh) * | 2022-12-20 | 2023-01-17 | 成都鹰谷米特科技有限公司 | 一种信号处理方法、装置、终端及存储介质 |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109521784B (zh) * | 2018-12-13 | 2021-05-11 | 华南农业大学 | 一种触觉感知式可穿戴上肢外骨骼无人机控制系统及方法 |
| CN110096066A (zh) * | 2019-04-18 | 2019-08-06 | 华南农业大学 | 一种力触觉再生外骨骼结构及无人机飞行姿态控制方法 |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105607740A (zh) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | 一种基于计算机视觉的无人飞行器控制方法及装置 |
| CN105807926A (zh) * | 2016-03-08 | 2016-07-27 | 中山大学 | 一种基于三维连续动态手势识别的无人机人机交互方法 |
| CN106200657A (zh) * | 2016-07-09 | 2016-12-07 | 东莞市华睿电子科技有限公司 | 一种无人机控制方法 |
| CN106227341A (zh) * | 2016-07-20 | 2016-12-14 | 南京邮电大学 | 基于深度学习的无人机手势交互方法及系统 |
| CN106327854A (zh) * | 2016-09-22 | 2017-01-11 | 北京奇虎科技有限公司 | 无人机系统及无人机用红外遥控设备 |
| CN106339079A (zh) * | 2016-08-08 | 2017-01-18 | 清华大学深圳研究生院 | 一种基于计算机视觉的利用无人飞行器实现虚拟现实的方法及装置 |
| CN106569508A (zh) * | 2016-10-28 | 2017-04-19 | 深圳市元征软件开发有限公司 | 一种无人机控制方法和装置 |
| CN107066935A (zh) * | 2017-01-25 | 2017-08-18 | 网易(杭州)网络有限公司 | 基于深度学习的手部姿态估计方法及装置 |
| US20170269588A1 (en) * | 2015-12-22 | 2017-09-21 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009042392A2 (fr) * | 2007-09-24 | 2009-04-02 | Apple Inc. | Systèmes d'authentification incorporés dans un dispositif électronique |
| CN101446812A (zh) * | 2008-12-22 | 2009-06-03 | 深圳华为通信技术有限公司 | 设备的状态控制方法、控制装置及设备 |
| CN101458560B (zh) * | 2008-12-25 | 2011-06-29 | 南京壹进制信息技术有限公司 | 一种计算机智能节能方法 |
| CN102982557B (zh) * | 2012-11-06 | 2015-03-25 | 桂林电子科技大学 | 基于深度相机的空间手势姿态指令处理方法 |
| WO2015200209A1 (fr) * | 2014-06-23 | 2015-12-30 | Nixie Labs, Inc. | Véhicules aériens sans pilote portatifs, véhicules aériens sans pilote à lancement commandé, et systèmes et procédés associés |
| CN105223959B (zh) * | 2015-09-28 | 2018-07-13 | 佛山市南海区广工大数控装备协同创新研究院 | 一种无人机手套控制系统及控制方法 |
| WO2017060782A1 (fr) * | 2015-10-07 | 2017-04-13 | Lee Hoi Hung Herbert | Appareil volant à capteurs multiples et commande à base de gestes |
| CN105739525B (zh) * | 2016-02-14 | 2019-09-03 | 普宙飞行器科技(深圳)有限公司 | 一种配合体感操作实现虚拟飞行的系统 |
| CN106094868A (zh) * | 2016-08-01 | 2016-11-09 | 杨珊珊 | 无人飞行器的悬停控制装置及其悬停控制方法 |
-
2017
- 2017-09-30 WO PCT/CN2017/104911 patent/WO2019061466A1/fr not_active Ceased
- 2017-09-30 CN CN201780007166.8A patent/CN108700885B/zh not_active Expired - Fee Related
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170269588A1 (en) * | 2015-12-22 | 2017-09-21 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
| CN105607740A (zh) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | 一种基于计算机视觉的无人飞行器控制方法及装置 |
| CN105807926A (zh) * | 2016-03-08 | 2016-07-27 | 中山大学 | 一种基于三维连续动态手势识别的无人机人机交互方法 |
| CN106200657A (zh) * | 2016-07-09 | 2016-12-07 | 东莞市华睿电子科技有限公司 | 一种无人机控制方法 |
| CN106227341A (zh) * | 2016-07-20 | 2016-12-14 | 南京邮电大学 | 基于深度学习的无人机手势交互方法及系统 |
| CN106339079A (zh) * | 2016-08-08 | 2017-01-18 | 清华大学深圳研究生院 | 一种基于计算机视觉的利用无人飞行器实现虚拟现实的方法及装置 |
| CN106327854A (zh) * | 2016-09-22 | 2017-01-11 | 北京奇虎科技有限公司 | 无人机系统及无人机用红外遥控设备 |
| CN106569508A (zh) * | 2016-10-28 | 2017-04-19 | 深圳市元征软件开发有限公司 | 一种无人机控制方法和装置 |
| CN107066935A (zh) * | 2017-01-25 | 2017-08-18 | 网易(杭州)网络有限公司 | 基于深度学习的手部姿态估计方法及装置 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190095247A1 (en) * | 2017-09-26 | 2019-03-28 | Omron Corporation | Control device |
| US10761884B2 (en) * | 2017-09-26 | 2020-09-01 | Omron Corporation | Control device for operating multiple types of programs in different execution formats |
| CN115620182A (zh) * | 2022-12-20 | 2023-01-17 | 成都鹰谷米特科技有限公司 | 一种信号处理方法、装置、终端及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN108700885A (zh) | 2018-10-23 |
| CN108700885B (zh) | 2022-03-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230093612A1 (en) | Touchless photo capture in response to detected hand gestures | |
| US11914370B2 (en) | System and method for providing easy-to-use release and auto-positioning for drone applications | |
| US11796309B2 (en) | Information processing apparatus, information processing method, and recording medium | |
| US20200346750A1 (en) | Method for generating flight path, control device, and unmanned aerial vehicle | |
| US11443540B2 (en) | Information processing apparatus and information processing method | |
| CN106662917B (zh) | 眼睛跟踪校准系统和方法 | |
| WO2018214078A1 (fr) | Procédé et dispositif de commande de photographie | |
| CN108200334B (zh) | 图像拍摄方法、装置、存储介质及电子设备 | |
| US11763420B2 (en) | Creating shockwaves in three-dimensional depth videos and images | |
| WO2014071254A4 (fr) | Dispositif informatique et de commande de type montre sans fil et procédé pour imagerie en 3d, cartographie, réseau social et interfaçage | |
| CN110471526A (zh) | 一种人体姿态估计与手势识别结合的无人机控制方法 | |
| CN112462802A (zh) | 用于提供自主摄影及摄像的系统和方法 | |
| CN113359807A (zh) | 一种无人机第一视角飞行的控制方法及系统、智能眼镜 | |
| JP2018160228A (ja) | 経路生成装置、経路制御システム、及び経路生成方法 | |
| CN108628337A (zh) | 路径生成装置、路径控制系统以及路径生成方法 | |
| CN108700885B (zh) | 一种飞行控制方法、遥控装置、遥控系统 | |
| WO2020073245A1 (fr) | Procédé de reconnaissance de geste, procédé de commande d'angle de vue vr et système vr | |
| CN106292799A (zh) | 无人机、遥控装置及其控制方法 | |
| WO2020164003A1 (fr) | Procédé et système de visualisation pour fauteuil roulant intelligent | |
| CN110825333A (zh) | 显示方法、装置、终端设备及存储介质 | |
| US11589001B2 (en) | Information processing apparatus, information processing method, and program | |
| US12422846B2 (en) | Information processing device and information processing method | |
| WO2018146922A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme | |
| WO2016185634A1 (fr) | Dispositif de traitement d'informations | |
| US12499633B1 (en) | Body-mounted sensor for improved pose forecasting |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17927557 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17927557 Country of ref document: EP Kind code of ref document: A1 |