
WO2019095681A1 - Positioning method, system, and applicable robot - Google Patents

Positioning method, system, and applicable robot

Info

Publication number
WO2019095681A1
WO2019095681A1 (PCT/CN2018/090653; CN2018090653W)
Authority
WO
WIPO (PCT)
Prior art keywords
standard
robot
positioning
physical
positioning information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2018/090653
Other languages
English (en)
French (fr)
Inventor
陈建军
李磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ankobot Shanghai Smart Technologies Co Ltd
Original Assignee
Ankobot Shanghai Smart Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ankobot Shanghai Smart Technologies Co Ltd filed Critical Ankobot Shanghai Smart Technologies Co Ltd
Priority to US16/764,513 priority Critical patent/US11099577B2/en
Priority to EP18878085.2A priority patent/EP3712853A4/en
Publication of WO2019095681A1 publication Critical patent/WO2019095681A1/zh
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Definitions

  • the present application relates to the field of indoor positioning technology, and in particular, to a positioning method, a system, and an applicable robot.
  • a mobile robot is a machine that performs work automatically. It can accept human commands, run pre-programmed procedures, or act according to principles based on artificial-intelligence techniques. Mobile robots can be used indoors or outdoors, in industry or in the home; they can replace people in security inspections and floor cleaning, and can also serve as family companions or office assistants. Because different mobile robots are applied in different fields, they move in different ways; for example, a mobile robot may use wheeled movement, walking movement, or tracked movement.
  • SLAM: Simultaneous Localization and Mapping.
  • VSLAM: Visual Simultaneous Localization and Mapping, i.e., SLAM based on visual (image) sensors.
  • the error of a map built with image-sensor-based VSLAM technology also grows over time, so a map constructed with VSLAM technology in this field may differ greatly from a map of the actual physical space.
  • the purpose of the present application is to provide a positioning method, a system, and a suitable robot for solving the problem of inaccurate positioning of the robot by using the data provided by the sensor in the prior art.
  • a first aspect of the present application provides a method for positioning a robot, comprising: capturing an image in a navigation operation environment of the robot; identifying a graphic of a physical object in the image, and acquiring a standard physical feature of a standard component when at least one identified graphic is a standard graphic corresponding to the standard component; and determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical feature of the standard component.
  • a second aspect of the present application further provides a positioning system for a robot, comprising: an imaging device for capturing an image in a navigation operation environment of the robot; a storage device for pre-storing a positioning and map construction application, a behavior control application, and a standard physical feature of at least one standard component; and a positioning processing device, connected to the imaging device and the storage device, for invoking the positioning and map construction application to perform: identifying a graphic of a physical object in the image, acquiring the standard physical feature of the standard component when at least one identified graphic is a standard graphic corresponding to the standard component, and determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical feature of the standard component.
  • a third aspect of the present application further provides a robot, comprising: a driving device for driving the robot to perform displacement and/or posture adjustment; an imaging device for capturing an image in a navigation operation environment of the robot; a storage device for pre-storing a positioning and map construction application, a behavior control application, and a standard physical feature of at least one standard component; and a positioning processing device, connected to the driving device, the imaging device, and the storage device, for invoking the pre-stored positioning and map construction application to: identify a graphic of a physical object in the image, acquire the standard physical feature of the standard component when at least one identified graphic is a standard graphic corresponding to the standard component, and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical feature of the standard component.
  • a fourth aspect of the present application further provides a storage medium for an electronic device, storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement the positioning method of any of the preceding aspects.
  • the positioning method, system, and applicable robot of the present application have the following beneficial effects: by identifying the graphics of physical objects in a captured image, matching them against the graphic of a standard component, and determining the positioning information of the robot in the current physical space based on the standard physical features of the standard component, the technical solution solves the prior-art problem of inaccurate robot positioning based on the data provided by sensors alone.
  • FIG. 1 shows a flow chart of a positioning method of a robot of the present application in an embodiment.
  • FIG. 2 is a flow chart showing another embodiment of the positioning method of the robot of the present application.
  • FIG. 3 shows a schematic diagram of compensating for the positioning errors of the positions A1 and A2 in the map data for the positioning information B determined by the positioning method described in the present application.
  • FIG. 4 is a flow chart showing still another embodiment of the positioning method of the robot of the present application.
  • FIG. 5 is a schematic structural view of a positioning system of the robot of the present application in an embodiment.
  • FIG. 6 is a schematic structural view showing a positioning system of the robot of the present application in another embodiment.
  • FIG. 7 shows a schematic structural view of a robot of the present application in an embodiment.
  • FIG. 8 shows a schematic structural view of a robot of the present application in another embodiment.
  • the mobile robot can, on the one hand, construct map data of the site where it is located and, on the other hand, provide route planning, route adjustment, and navigation services based on the constructed map data. This makes the movement of the mobile robot more efficient.
  • the indoor sweeping robot can combine the built indoor map and positioning technology to predict the distance between the current position and the obstacles marked on the indoor map, and facilitate the timely adjustment of the cleaning strategy.
  • the obstacle may be described by a single mark, or may be marked as a wall, a table, a sofa, a closet or the like based on recognition of shape, size, and the like.
  • the indoor sweeping robot can accumulate the positioned positions and orientations based on the positioning technique and construct an indoor map based on the accumulated position and orientation changes.
  • patrol robots are usually used in factories, industrial parks, etc.
  • the patrol robots can combine the constructed plant maps and positioning technologies to predict the distances from the current position to corners, intersections, and the charging pile, which makes it convenient to control the movement of the robot's mobile device in a timely manner according to other acquired monitoring data.
  • FIG. 1 is a flow chart showing a positioning method of a robot of the present application in an embodiment. As shown in the figure, the positioning method of the robot of the present application includes: step S110, step S120, and step S130.
  • step S110 an image is taken in the navigation operation environment of the robot.
  • the image pickup device can be used to take an image in the navigation operation environment of the robot.
  • the camera device includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the robot, and the camera device starts capturing images during the power-on movement of the robot.
  • the image pickup device may be provided on a main body of the robot.
  • the camera device may be disposed at the middle or the edge of the top cover of the sweeping robot, or the camera device may be disposed below the plane of the top surface of the sweeping robot, near the geometric center of the main body or near the edge of the main body.
  • the optical axis of the imaging device may be at an angle of ⁇ 30° with respect to the vertical, or the optical axis of the imaging device may be at an angle of 0-180° with respect to the horizontal.
  • the navigation operation environment refers to an environment in which a robot designs a navigation route based on the constructed map data or moves based on a randomly designed navigation route and performs corresponding operations.
  • the navigation operating environment refers to an environment in which the sweeping robot moves according to the navigation route and performs a cleaning operation.
  • step S120 a graphic of the physical object in the image is identified, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the positioning processing device may be utilized to identify the graphic of the physical object in the image, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the positioning processing device may include one or more processors.
  • the processor may adopt an image recognition method based on a neural network, an image recognition method based on a wavelet moment, and the like, and process, analyze, and understand the captured image to identify targets and objects of various modes.
  • the processor can seek similar image targets by analyzing the correspondence, similarity and consistency of image content, features, structures, relationships, textures, and gray levels.
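The similarity analysis described above can be illustrated with a minimal, brute-force template-matching sketch (the helper `ncc_match` is a hypothetical name; a real recognizer would use the neural-network or wavelet-moment methods the text names):

```python
def ncc_match(image, template):
    """Slide `template` over `image` (lists of lists of floats) and return the
    (row, col) of the highest normalized cross-correlation score.

    A brute-force sketch of seeking a similar image target by comparing
    content and gray levels; not the application's actual recognizer.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    t_mean = sum(map(sum, template)) / (th * tw)
    t = [[v - t_mean for v in row] for row in template]
    t_norm = sum(v * v for row in t for v in row)
    best_score, best_pos = float("-inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            p_mean = sum(map(sum, patch)) / (th * tw)
            p = [[v - p_mean for v in row] for row in patch]
            p_norm = sum(v * v for row in p for v in row)
            denom = (p_norm * t_norm) ** 0.5
            score = 0.0 if denom == 0 else sum(
                p[i][j] * t[i][j] for i in range(th) for j in range(tw)) / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# A toy 2x2 "socket" pattern embedded in an 8x8 image is recovered exactly.
img = [[0.0] * 8 for _ in range(8)]
img[3][4], img[4][5] = 1.0, 1.0
pos, score = ncc_match(img, [[1.0, 0.0], [0.0, 1.0]])
```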
  • the above processors may be shared or independently configurable.
  • the standard component may be one designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  • industry standards such as mechanical industry standard JB, building materials industry standard JC, etc.; national standards such as China GB standard, German DIN standard, British BS standard, etc.; international standards such as international ISO standards; custom standards will be detailed later.
  • the standard physical features may include contour dimensions, standard structural relationships, and the like. For example, the standard physical features of a standard component include its actual physical length, width, and height, and the actual physical dimensions of other standardized structures on it, such as the spacing between the two holes on a power socket, or the length and width of the power socket.
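Such pre-stored standard physical features might be organized along these lines (a minimal sketch; the 86 mm plate size and 19 mm hole spacing below are illustrative values only, not taken from any specific standard sheet):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StandardFeature:
    """Pre-stored physical dimensions of one standard component (all in mm)."""
    name: str
    width_mm: float
    height_mm: float
    hole_spacing_mm: float  # e.g. spacing between the two holes of a socket

# Illustrative entries; a real system would load dimensions taken from the
# relevant GB / DIN / ISO standard sheet or from user-entered parameters.
STANDARD_LIBRARY = {
    "two_hole_socket": StandardFeature("two_hole_socket", 86.0, 86.0, 19.0),
}

def lookup(name: str) -> StandardFeature:
    """Read a preset standard physical feature, as from the storage device."""
    return STANDARD_LIBRARY[name]

feature = lookup("two_hole_socket")
```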
  • the sweeping robot is taken as an example. Since the sweeping robot usually performs indoor cleaning work, the physical objects in the images taken by the camera device generally include, for example, walls, tables, sofas, wardrobes, televisions, power sockets, network cable sockets, and the like.
  • the image pickup device supplies the image to the positioning processing device after the image is taken in the navigation operation environment of the robot, and the positioning processing device recognizes the graphic of the physical object in the captured image by image recognition.
  • the graphic of the physical object can be characterized by features such as grayscale of the real object, contour of the physical object, and the like.
  • the graphic of the physical object is not limited to the external geometric figure of the physical object, and may include other graphics presented on the physical object, such as a two-hole socket on the power socket, a five-hole socket, a square socket on the network cable socket, and the like.
  • the five-hole jack pattern of the power socket and the square jack pattern of the network cable socket can be used to distinguish between the two.
  • an image captured indoors by the camera device of the sweeping robot may include a power socket or a network cable socket; since power sockets and network cable sockets are designed according to the GB standard, they do not differ depending on the environment in which they are installed.
  • for example, the standard component is a power socket.
  • the standard physical characteristics of the standard parts may include the length, width, and height of the power socket, and the structural relationship of the five-hole socket on the power socket.
  • the graphics of the standard and the standard physical features of the standard may be preset and pre-stored using the storage device of the robot.
  • the manner in which the standard physical features of the standard are obtained includes reading the preset standard physical features from the storage device of the robot.
  • the positioning processing device determines, through analysis of the correspondence, similarity, and consistency of image content, features, structure, relationships, texture, gray levels, and the like, whether the identified at least one graphic corresponds to the stored graphic of the power socket, and acquires the standard physical features of the power socket when it does.
  • the at least one graphic corresponding to the stored graphic of the power socket is referred to as a standard graphic.
  • the standard graphic is a graphic of the power socket taken.
  • step S130 the positioning information of the robot in the current physical space is determined based on the standard physical features of the standard graphics and the standard components.
  • the positioning processing means can be utilized to determine the positioning information of the robot in the current physical space based on standard physical features of the standard graphics and standard components.
  • the positioning processing device includes a processor.
  • the processor uses the correspondence between a preset unit pixel interval and a unit length in the actual physical space, together with the pixel size of the identified standard graphic and the corresponding physical size in the standard physical features, to calculate the distance and declination of the robot from the standard component in the current physical space, that is, to obtain the positioning information of the robot relative to the standard component.
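Under a pinhole-camera assumption, the pixel-to-physical correspondence described above reduces to a simple proportion (the function name and the focal-length parameter are assumptions for illustration, not the application's actual computation):

```python
import math

def locate_relative_to_standard(pixel_width: float,
                                physical_width_mm: float,
                                focal_length_px: float,
                                pixel_offset_x: float):
    """Estimate distance and declination to a standard component.

    Pinhole-camera sketch: the ratio between the component's pixel width and
    its known physical width gives the depth; the horizontal pixel offset of
    the component from the image centre gives the declination angle.
    """
    distance_mm = focal_length_px * physical_width_mm / pixel_width
    declination_rad = math.atan2(pixel_offset_x, focal_length_px)
    return distance_mm, declination_rad

# A socket 86 mm wide spanning 43 px, with a 500 px focal length, seen
# dead ahead (zero pixel offset).
d, a = locate_relative_to_standard(43.0, 86.0, 500.0, 0.0)
```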
  • when the processor recognizes the socket and the boundary line between the wall and the floor, or recognizes the socket and assumes by default that the socket is mounted on a wall, then according to the above correspondence the processor can obtain not only the distance and declination between the robot and the socket, but also, using the spatial positional relationship among the wall, the robot, and the socket, the straight-line distance between the robot and the wall, thus obtaining the positioning information of the robot relative to the wall.
  • the processor in this step may be shared with the processor in the above steps or may be independently set.
  • the positioning method of the robot of the present application identifies the graphics of physical objects in the captured image, matches them against the graphic of a standard component, and determines the positioning information of the robot in the current physical space based on the standard physical features of the standard component; this technical solution solves the prior-art problem of inaccurate robot positioning based on the data provided by sensors alone.
  • the manner in which the standard physical characteristics of the standard component are obtained in the step S120 includes obtaining standard physical features from the remote server through the network.
  • the positioning processing device includes a processor; the processor provides a retrieval request, namely the standard graphic, to a remote server through the network, and the remote server performs CBIR (Content-Based Image Retrieval) on the request to determine the retrieval result, that is, the matching standard component graphic.
  • the remote server outputs the standard physical features of the standard component based on the retrieved standard component graphic and provides them to the processor for subsequent processing.
  • in addition to standard components designed based on at least one of industry standards, national standards, and international standards, the standard components may include those designed based on custom standards.
  • the standard parts of the custom standard may be standard parts customized by the robot manufacturer, for example, standard parts designed and manufactured by the robot manufacturer and can be used in the robot working environment and used in conjunction with the robot.
  • the standard component of the custom standard may also be one whose standard physical features are generated from physical parameters input by the user.
  • the sweeping robot is taken as an example.
  • the objects in the images taken by the sweeping robot through the camera device while working in, for example, the living room generally include household appliances such as a television, so the user can customize the television as a standard component.
  • the user can obtain the physical parameters of the television set by reading the manual of the television or querying the product information, and input the physical parameters via the input device, such as the robot application APP, to generate standard components having standard physical features.
  • the user can also select other physical objects as standard parts according to the indoor environment.
  • the positioning method of the robot of the present application allows the user to select standard parts according to the indoor environment through the user-defined standard parts, which helps the accurate positioning of the robot.
  • determining the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard component in step S130 includes: performing deflection-angle correction processing on the standard graphic based on the standard physical features to obtain the deflection angle of the robot relative to the plane of the standard component; and performing distance measurement on the corrected graphic based on the standard physical features to obtain the distance between the robot and the standard component and the plane in which it lies.
  • for example, based on the standard structural relationship of the two holes of the power socket, the positioning processing device can determine the deflection angle between the imaging plane of the camera device and the plane in which the two-hole power socket lies, and can use the obtained deflection angle to derive the deflection angle of the robot relative to the plane of the standard component.
  • the positioning processing device performs rectification processing on the standard graphic using the obtained deflection angle to obtain the standard graphic as it would appear in an imaging plane parallel to the plane of the standard component.
  • the distance measurement is performed using the proportional relationship between the pixel size of the standard graphic in the image and the actual physical size, so as to obtain the distance between the robot and the standard component and the plane in which it lies.
  • the current position and orientation of the robot can be determined, that is, the positioning information of the robot in the current physical space is determined.
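The two-step process above can be sketched under the assumptions of a pure yaw rotation between the camera plane and the plane of the standard component and a pinhole camera (both simplifications of the general case; the function name is hypothetical):

```python
import math

def deflection_and_distance(obs_w_px, obs_h_px, true_w_mm, true_h_mm, focal_px):
    """Step 1: recover the yaw between the imaging plane and the standard
    component's plane from the foreshortened aspect ratio (under pure yaw,
    the observed width shrinks by cos(yaw) while the height is unchanged).
    Step 2: rectify the width and measure distance from the pixel scale.
    """
    ratio = (obs_w_px / obs_h_px) / (true_w_mm / true_h_mm)
    yaw_rad = math.acos(min(1.0, ratio))           # foreshortening: w' = w*cos(yaw)
    rectified_w_px = obs_w_px / math.cos(yaw_rad)  # undo the foreshortening
    distance_mm = focal_px * true_w_mm / rectified_w_px
    return yaw_rad, distance_mm

# A square 86x86 mm socket observed as 43x86 px with a 500 px focal length.
yaw, dist = deflection_and_distance(43.0, 86.0, 86.0, 86.0, 500.0)
```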
  • the obtained positioning information can be used in robot navigation, map creation, and map data correction.
  • the positioning information obtained by the positioning method described in the present application can correct the map data in time, so that indoor map data can be constructed and used as accurately as possible even when accurate indoor map data is not available in advance.
  • FIG. 2 is a flow chart showing another embodiment of the positioning method of the robot of the present application.
  • the positioning method of the robot of the present application includes: step S210, step S220, step S230, and step S240.
  • step S210 an image is taken in the navigation operation environment of the robot.
  • the image pickup device can be used to take an image in the navigation operation environment of the robot.
  • the step S210 is the same as or similar to the step S110 in the foregoing example, and will not be described in detail herein.
  • step S220 a graphic of the physical object in the image is identified, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the positioning processing device may be utilized to identify the graphic of the physical object in the image, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the step S220 is the same as or similar to the step S120 in the foregoing example, and will not be described in detail herein.
  • step S230 the positioning information of the robot in the current physical space is determined based on the standard physical features of the standard graphics and the standard components.
  • the positioning processing device can be utilized to determine the positioning information of the robot in the current physical space based on standard physical features of the standard graphics and standard components.
  • the step S230 is the same as or similar to the step S130 in the foregoing example, and will not be described in detail herein.
  • step S240 the map data in the robot is adjusted based on the positioning information of the robot in the current physical space.
  • the control device of the robot can be used to adjust the map data in the robot based on the positioning information of the robot in the current physical space.
  • the control device of the robot may include one or more processors (CPUs) or micro control units (MCUs) dedicated to controlling the robot.
  • the processor in the control device may be shared with the processor in the above-described positioning processing device or may be independently set.
  • the control device functions as a slave processing device, the processor in the positioning processing device functions as a master device, and the control device performs the adjustment based on the positioning result of the positioning processing device.
  • alternatively, the control device shares a processor with the positioning processing device.
  • the map data of the sweeping robot is generally constructed based on movement data provided by a plurality of motion sensors provided on the robot, such as a speed sensor, an odometer sensor, a ranging sensor, and a cliff sensor, together with the image data provided by the camera device; since there is a cumulative error in constructing the map from the motion sensors and the camera device, when the robot moves to a certain position, the positioning information of the robot in the current physical space determined by the positioning method of the present application may deviate from the map data, and the map data therefore needs to be adjusted.
  • the positioning processing device provides the positioning information to the control device after determining the positioning information of the robot in the current physical space, and the control device adjusts the map data in the robot based on the positioning information.
  • the map data may include, but is not limited to, position and angle data drawn within a preset grid or coordinate space based on movement data and image data provided by the motion sensor, based on the identified physical features in a preset grid or Landmark information marked in the coordinate space, etc.
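Map data of the kind just described, a pose trace plus landmark marks in a preset grid, might be organized along these lines (a hypothetical minimal structure, not the application's actual format):

```python
from dataclasses import dataclass, field

@dataclass
class Landmark:
    """Landmark information marked in the grid space."""
    features: list      # e.g. edge/corner descriptors (placeholders here)
    cell: tuple         # grid cell from which the landmark was observed
    yaw_deg: float      # observation angle

@dataclass
class GridMap:
    """Position/angle data drawn within a preset grid, plus landmarks."""
    resolution_mm: float                       # physical size of one grid cell
    poses: list = field(default_factory=list)  # (cell, yaw_deg) along the route
    landmarks: dict = field(default_factory=dict)

    def add_pose(self, cell, yaw_deg):
        self.poses.append((cell, yaw_deg))

m = GridMap(resolution_mm=100.0)
m.add_pose((0, 0), 0.0)
m.landmarks["socket_1"] = Landmark(["corner"], (0, 0), 0.0)
```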
  • the map data can be used for map construction and navigation.
  • the positioning method of the robot of the present application identifies the graphics of physical objects in the captured image, matches them against the graphic of a standard component, determines the positioning information of the robot in the current physical space based on the standard physical features of the standard component, and adjusts the map data in the robot based on that positioning information; this technical solution enables the positioning error in the map data to be compensated according to the positioning information, thereby achieving accurate positioning.
  • adjusting the map data in the robot based on the positioning information of the robot in the current physical space in step S240 comprises: compensating a positioning error of the corresponding position in the map data based on the positioning information; and/or compensating a positioning error in the landmark information related to the standard component in the map data based on the positioning information.
  • the control device may compare whether the positioning information determined from the standard component and the positioning information based on the movement data deviate in position and angle; if so, it replaces the movement-data-based positioning information with the positioning information determined from the standard component, and adjusts other map data already constructed according to the position and angle deviation between the two. For example, please refer to FIG. 3.
  • the positions A1 and A2 are positioning information determined based on the movement data.
  • the positioning processing device identifies the standard component at position A2 and determines the current positioning information B based on the identified standard component; position A2 is replaced with the positioning information B, which is located one grid to the left of A2.
  • the positioning information of A1 is also adjusted to the left by a grid to obtain A1' indicated by a broken line to compensate for the positioning error of the moving data at position A1.
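The FIG. 3 scenario, replacing A2 with B and shifting A1 to A1' by the same offset, can be sketched as a uniform correction applied to the recorded trace (grid coordinates and the helper name are illustrative assumptions):

```python
def compensate_trace(positions, index, corrected):
    """Shift every recorded position by the offset between the dead-reckoned
    position at `index` and the visually `corrected` one, so earlier poses
    (such as A1) inherit the same compensation as the re-localized pose (A2)."""
    ox = corrected[0] - positions[index][0]
    oy = corrected[1] - positions[index][1]
    return [(x + ox, y + oy) for x, y in positions]

# A2 = (5, 3) is re-localized one grid to the left at B = (4, 3);
# A1 = (2, 3) is shifted accordingly to A1' = (1, 3).
trace = compensate_trace([(2, 3), (5, 3)], 1, (4, 3))
```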
  • landmark data is included in the map data.
  • the landmark information is used to help the robot use the image for positioning.
  • the landmark information includes, but is not limited to, features of standard and non-standard components (such as edges, corners, contours, brightness features, etc.), and can capture various position and angle information of these features.
  • a piece of landmark information usually includes features of both standard and non-standard components, or features of non-standard components only.
  • Adjacent landmark information usually contains repeated and non-repeating features.
  • the non-standard parts refer to objects that are not previously defined as standard parts, such as tables, chairs, walls, and the like.
  • for example, the constructed map data includes the positioning information of positions C1 and C2, where position C1 includes landmark information t1 and position C2 includes landmark information t2; t1 includes the features of the standard component and of non-standard components, together with the captured positions and declinations of those features, and t2 includes the non-standard features overlapping with t1, together with the positions and declinations of the corresponding features.
  • the positioning processing device identifies the standard component at position C1 and determines the current positioning information D based on the identified standard component; the positioning information at position C1 is replaced with D to compensate for the positioning error in the original map data, and the positions and deflection angles in the landmark information t1 are adjusted based on D. By means of the non-standard features shared by landmark information t1 and t2, this step can also adjust the positioning errors in the later position C2 and its landmark information t2.
  • the above manner of error compensation for the positioning information in the map data is only an example; since the errors of the movement data and the image data accumulate differently, the positioning errors at different positions are not necessarily identical, so the positioning error in the map data may also be compensated by a weighted compensation method.
  • the positioning information obtained by the positioning method of the present application, together with the positioning-related information that the map data can provide, such as positioning information, landmark information, etc., can be used to adjust the positioning information of one or more locations in the map data, as well as one or more positioning errors contained at the same location.
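One possible form of the weighted compensation mentioned above is a weighted blend of the two pose estimates; the weight value below is an illustrative assumption, not one specified by the embodiment:

```python
def weighted_pose(pose_motion, pose_standard, w_standard=0.7):
    """Blend a pose estimated from movement data with a pose estimated
    from an identified standard component. Because the accumulated
    errors of the two sources differ, the standard-component estimate
    is given the larger (assumed) weight rather than replacing the
    movement estimate outright."""
    w_motion = 1.0 - w_standard
    return tuple(w_motion * m + w_standard * s
                 for m, s in zip(pose_motion, pose_standard))

# (x, y, yaw) poses; values are illustrative.
blended = weighted_pose((2.0, 1.0, 0.10), (2.4, 1.0, 0.02))
```

A weight of 1.0 reduces this to the outright replacement described earlier; intermediate weights trade off the two error sources.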
  • FIG. 4 is a flow chart showing a positioning method of the robot of the present application in still another embodiment. As shown in the figure, the positioning method of the robot of the present application includes: step S410, step S420, step S430, and step S440.
  • step S410 an image is taken in the navigation operation environment of the robot.
  • the image pickup device can be used to take an image in the navigation operation environment of the robot.
  • the step S410 is the same as or similar to the step S110 in the foregoing example, and will not be described in detail herein.
  • step S420 a graphic of the physical object in the image is identified, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the positioning processing device may be utilized to identify the graphic of the physical object in the image, and when the identified at least one graphic is a standard graphic corresponding to the standard component, the standard physical feature of the standard component is acquired.
  • the step S420 is the same as or similar to the step S120 in the foregoing example, and will not be described in detail herein.
  • step S430 the positioning information of the robot in the current physical space is determined based on the standard physical features of the standard graphics and the standard components.
  • the positioning processing means can be utilized to determine the positioning information of the robot in the current physical space based on standard physical features of the standard graphics and standard components.
  • the step S430 is the same as or similar to the step S130 in the foregoing example, and will not be described in detail herein.
  • step S440 the navigation route of the robot is adjusted based on the positioning information of the robot in the current physical space.
  • the robot's navigation device can be used to adjust the navigation route of the robot based on the positioning information of the robot in the current physical space.
  • the control device of the robot may include one or more processors (CPUs) or micro control units (MCUs) dedicated to controlling the robot.
  • the processor in the control device may be shared with the processor in the above-described positioning processing device or may be independently provided.
  • the control device functions as a slave processing device, the processor in the positioning processing device functions as a master device, and the control device performs the adjustment based on the positioning determined by the positioning processing device.
  • alternatively, the control device shares a processor with the positioning processing device.
  • the map data of the sweeping robot is generally constructed based on moving data provided by a plurality of moving sensors provided on the robot, such as a speed sensor, an odometer sensor, and the like, and image data provided by the photographing device. Since there is a cumulative error in the process of constructing the map by using the motion sensor and the photographing device, when the robot moves to a certain position, the positioning information of the robot in the current physical space determined according to the positioning method of the robot of the present application deviates from the map data.
  • the positioning processing device provides the positioning information to the control device after determining the positioning information of the robot in the current physical space, and the control device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space.
  • the control device may further adjust the distance that the robot continues to move according to the adjusted navigation route or adjust the direction of the robot movement to move the robot according to the adjusted navigation route.
  • the positioning method of the robot of the present application identifies the graphic of the physical object in the captured image, matches it against the graphic of the standard component, and determines the positioning information of the robot in the current physical space based on the standard physical features of the standard component; the technical solution of adjusting the navigation route of the robot based on this positioning information achieves the purpose of correcting the navigation route under accurate positioning.
  • the adjusting of the navigation route of the robot based on the positioning information of the robot in the current physical space in step S440 includes: re-determining the position and orientation of the robot in the preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
  • when the robot performs a navigation operation according to the constructed map and moves to a certain position, the position and orientation of the robot on the constructed map are known. At this time, the positioning processing device determines the positioning information of the robot in the current physical space, that is, the current actual position and orientation of the robot, according to the positioning method of the present application. If the current actual position and orientation deviate from the position and orientation of the robot on the constructed map, the control device corrects the position and orientation of the robot on the constructed map based on the current actual position and orientation, that is, re-determines the position and orientation of the robot in the constructed map; it then re-determines the distance of the robot from obstacles, whether the robot has yawed, and, based on the re-determined position and orientation and the constructed map, the distance the robot needs to move and the direction it needs to deflect if it continues along the route, thereby adjusting the navigation route.
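The re-determination of position and orientation and the resulting route adjustment can be sketched as follows; the pose and waypoint values are illustrative assumptions:

```python
import math

def adjust_navigation(current_pose, waypoint):
    """Given the corrected pose (x, y, heading in radians) and the next
    waypoint on the constructed map, recompute the remaining distance
    and the deflection the robot must turn to stay on route."""
    x, y, heading = current_pose
    wx, wy = waypoint
    distance = math.hypot(wx - x, wy - y)
    bearing = math.atan2(wy - y, wx - x)
    # Normalize the required turn into [-pi, pi).
    turn = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return distance, turn

# After re-localization the robot is at (1.0, 1.0) facing +x; the next
# waypoint is (1.0, 2.0), so it must turn 90 degrees left and travel 1 m.
dist, turn = adjust_navigation((1.0, 1.0, 0.0), (1.0, 2.0))
```

The corrected pose replaces the stale map pose as `current_pose`; the returned deflection and distance are what the control device issues to the driving device.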
  • FIG. 5 is a schematic structural diagram of a positioning system of the robot of the present application in an embodiment. As shown, the positioning system includes an imaging device 11, a storage device 12, and a positioning processing device 13.
  • the camera device 11 is for taking an image in a navigation operation environment of the robot.
  • the image pickup apparatus includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the robot, and the camera device starts capturing images during the power-on movement of the robot. Further, the image pickup device may be provided on a main body of the robot.
  • the camera device may be disposed at the middle or the edge of the top cover of the sweeping robot, or the camera device may be disposed below the plane of the top surface of the sweeping robot, near the geometric center of the body or near the edge of the body.
  • the optical axis of the imaging device may be at an angle of ⁇ 30° with respect to the vertical, or the optical axis of the imaging device may be at an angle of 0-180° with respect to the horizontal.
  • the navigation operation environment refers to an environment in which a robot designs a navigation route based on the constructed map data or moves based on a randomly designed navigation route and performs corresponding operations.
  • the navigation operating environment refers to an environment in which the sweeping robot moves according to the navigation route and performs a cleaning operation.
  • the storage device 12 is configured to pre-store a positioning and map construction application, a behavior control application, and the standard physical features of at least one standard component.
  • the positioning and map construction application is a basic application in the field of intelligent robots. The problem it addresses can be described as: when the robot is placed in an unknown environment, is there a way for it to gradually draw a complete map of the environment while deciding which direction it should travel? That is, achieving intelligence requires three tasks to be completed: the first is localization, the second is mapping, and the third is the subsequent path planning (navigation).
  • the behavior control application in the present application refers to controlling robot movement, posture adjustment, and the like according to the set information or instructions.
  • the storage device also pre-stores standard physical features of each standard component.
  • the standard component may include a standard component designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  • industry standards such as mechanical industry standard JB, building materials industry standard JC, etc.; national standards such as China GB standard, German DIN standard, British BS standard, etc.; international standards such as international ISO standards; custom standards will be detailed later.
  • the standard physical features may include an outline size, a standard structural relationship, and the like; for example, the standard physical features of a standard component include the actual physical length, width, and height of the standard component, other actual physical size data of the corresponding features in the standard component, and the like.
  • for example, the spacing between the two holes on a power outlet; another example is the length and width of the power outlet.
  • the storage device 12 includes, but is not limited to, high-speed random access memory and non-volatile memory, for example, one or more disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • storage device 12 may also include memory remote from the one or more processors, such as network-attached storage accessed via RF circuitry or an external port and a communication network (not shown), wherein the communication network may be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • the memory controller can control access to the storage device by other components of the robot, such as the CPU and peripheral interfaces.
  • the positioning processing device 13 is connected to the imaging device 11 and the storage device 12.
  • the positioning processing device 13 can include one or more processors. Positioning processing device 13 is operatively coupled to volatile memory and/or non-volatile memory in storage device 12.
  • the positioning processing device 13 may execute instructions stored in the memory and/or non-volatile storage device to perform operations in the robot, such as identifying graphics of physical objects in the image and performing positioning in the map based on the identified standard graphics and the standard physical features of the standard components.
  • the processor may include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field programmable gate arrays (FPGAs), or any combination thereof.
  • the positioning processing device is also operatively coupled to I/O ports that enable the robot to interact with various other electronic devices, and to input structures that enable a user to interact with the computing device.
  • the input structure can include buttons, keyboards, mice, trackpads, and the like.
  • the other electronic device may be a mobile motor in the mobile device in the robot, or a slave processor in the robot dedicated to controlling the mobile device and the cleaning device, such as an MCU (Microcontroller Unit, MCU for short).
  • the positioning processing device 13 is connected to the storage device 12 and the camera device 11 via data lines, respectively.
  • the positioning processing device 13 interacts with the storage device 12 by means of a data read/write technology, and the positioning processing device 13 interacts with the camera device 11 via an interface protocol.
  • the data reading and writing technology includes but is not limited to: a high speed/low speed data interface protocol, a database read and write operation, and the like.
  • the interface protocols include, but are not limited to, an HDMI interface protocol, a serial interface protocol, and the like.
  • the positioning processing device 13 is configured to invoke the positioning and map construction application to perform: identifying a graphic of a physical object in the image, and acquiring the standard physical features of the standard component when the identified at least one graphic is a standard graphic corresponding to a standard component; and determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard component.
  • the positioning processing device 13 may adopt an image recognition method based on a neural network, an image recognition method based on wavelet moments, and the like, to process, analyze, and understand the captured image so as to identify target objects of various modes.
  • the positioning processing device can also seek similar image targets by analyzing the correspondence, similarity and consistency of image content, features, structures, relationships, textures, and gradations.
  • the sweeping robot is taken as an example. Since the sweeping robot usually performs indoor cleaning work, the physical objects in the image taken by the camera device generally include, for example, walls, tables, sofas, wardrobes, televisions, power sockets, network cable sockets, and the like.
  • the image pickup device supplies the image to the positioning processing device after the image is taken in the navigation operation environment of the robot, and the positioning processing device recognizes the graphic of the physical object in the captured image by image recognition.
  • the graphic of the physical object can be characterized by features such as grayscale of the real object, contour of the physical object, and the like.
  • the graphic of the physical object is not limited to the external geometric figure of the physical object, and may include other graphics presented on the physical object, such as a two-hole socket on the power socket, a five-hole socket, a square socket on the network cable socket, and the like.
  • for example, the five-hole jack of the power socket and the square jack of the network cable socket can be used to distinguish the two.
  • the images captured indoors by the camera device of the cleaning robot may include a power socket or a network cable socket; since power sockets and network cable sockets are designed according to the GB standard, they do not differ with the environment in which they are installed.
  • take a power outlet as the standard component.
  • the standard physical characteristics of the standard parts may include the length, width, and height of the power socket, and the structural relationship of the five-hole socket on the power socket.
  • the graphics of the standard and the standard physical features of the standard may be preset and pre-stored using the storage device of the robot.
  • the manner in which the standard physical features of the standard are obtained includes reading the preset standard physical features from the storage device of the robot.
  • the positioning processing device 13 analyzes the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray scale, and the like, to determine whether the identified at least one graphic corresponds to the stored graphic of the power outlet, and when the identified at least one graphic corresponds to the stored graphic of the power outlet, the standard physical features of the power outlet are acquired.
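A minimal stand-in for the similarity analysis described above is a sum-of-squared-differences template search over a grayscale image; the toy arrays below are illustrative assumptions, not actual outlet imagery:

```python
import numpy as np

def best_match(image, template):
    """Slide the stored standard-component template over the image and
    return the top-left offset with the smallest sum of squared
    differences -- a minimal stand-in for the similarity analysis of
    content, features, and gray scale described above."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = float("inf"), None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            score = float(np.sum((patch - template) ** 2))
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Toy grayscale image with the "outlet" pattern embedded at (1, 2).
template = np.array([[9.0, 9.0], [9.0, 9.0]])
image = np.zeros((4, 5))
image[1:3, 2:4] = 9.0
pos, score = best_match(image, template)   # pos == (1, 2), score == 0.0
```

A production recognizer would use learned features or keypoint matching rather than raw intensities; this sketch only shows where a match threshold would gate the acquisition of the standard physical features.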
  • the at least one graphic corresponding to the graphic of the stored power outlet is referred to as a standard graphic.
  • the standard graphic is a graphic of the power socket taken.
  • the positioning processing device 13 calculates the distance and deflection angle of the robot relative to the standard component in the current physical space by using the correspondence between a preset unit pixel interval and a unit length in the actual physical space, together with the size of the identified standard graphic and the physical size in the corresponding standard physical features, thereby obtaining the positioning information of the robot relative to the standard component.
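The correspondence between pixel size and physical size can be sketched with a pinhole-camera model; the focal length and outlet width below are assumed calibration values, not figures from the embodiment:

```python
def distance_from_standard(physical_width_m, pixel_width, focal_length_px):
    """Estimate the robot-to-standard-component distance from the known
    physical width of the standard component and its width in pixels,
    using a pinhole-camera model. The focal length in pixels is an
    assumed calibration value, not something the embodiment specifies."""
    return focal_length_px * physical_width_m / pixel_width

# A power outlet assumed to be 0.086 m wide appears 100 px wide in an
# image taken with an (assumed) 500 px focal length.
d = distance_from_standard(0.086, 100.0, 500.0)  # 0.43 m
```

The smaller the standard graphic appears in pixels, the farther the robot is from the standard component, which is exactly the pixel-to-physical proportionality the embodiment relies on.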
  • when the positioning processing device recognizes the socket and the boundary line between the wall and the floor, or recognizes the socket and assumes by default that the socket is mounted on a wall, then according to the above correspondence the positioning processing device can obtain not only the distance and deflection angle of the robot relative to the socket, but also, by using the spatial positional relationship among the wall, the robot, and the socket, the linear distance between the robot and the wall, thus obtaining the positioning information of the robot relative to the wall.
  • the positioning system of the present application recognizes the graphic of the physical object in the image taken by the imaging device 11 by using the positioning processing device 13 and matches it against the graphic of the standard component stored in the storage device 12; the technical solution of determining the positioning information of the robot in the current physical space based on the standard physical features of the standard component solves the prior-art problem that positioning the robot using only the data provided by sensors is inaccurate.
  • FIG. 6 is a schematic structural diagram of a positioning system of the robot of the present application in another embodiment.
  • the positioning system also includes a network access device 14.
  • the network access device 14 is connected to the location processing device 13, and the network access device 14 is configured to acquire corresponding standard physical features from the remote server.
  • the positioning processing device provides the retrieval request, that is, the standard graphic, to a remote server through the network; the remote server performs retrieval using a CBIR (Content-Based Image Retrieval) method as required, determines the matching standard-component graphic, and then outputs the standard physical features of the standard component based on the retrieved graphic and provides them to the positioning processing device through the network for subsequent processing.
  • in addition to standard components designed based on at least one of industry standards, national standards, and international standards, the standard components may include standard components designed based on custom standards.
  • the standard parts of the custom standard may be standard parts customized by the robot manufacturer, for example, standard parts designed and manufactured by the robot manufacturer and can be used in the robot working environment and used in conjunction with the robot.
  • the standard component of the custom standard may also be a standard component whose standard physical features are generated from physical parameters input by the user.
  • the sweeping robot is taken as an example.
  • the physical objects in the image taken by the sweeping robot through the camera device while working in, for example, a living room generally include household appliances such as a television, so the user can customize the television as a standard component.
  • the user can obtain the physical parameters of the television set by reading the manual of the television or querying the product information, and input the physical parameters through the input device, such as the robot application APP, to generate standard components having standard physical features.
  • the user can also select other physical objects as standard parts according to the indoor environment.
  • the positioning system of the robot of the present application allows the user to select standard parts according to the indoor environment through the user-defined standard parts, which helps the accurate positioning of the robot.
  • the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard component, including: performing deflection-angle correction processing on the standard graphic based on the standard physical features to obtain the deflection angle of the robot relative to the plane in which the standard component is located; and performing distance measurement on the corrected graphic based on the standard physical features to obtain the distance between the robot and the standard component and between the robot and the plane of the standard component.
  • for example, from the imaging of the two holes of the power outlet, the positioning processing device can determine the deflection angle between the imaging plane of the photographing device and the plane in which the power outlet is located, and the positioning processing device can take the obtained deflection angle as the deflection angle of the robot relative to the plane of the standard component.
  • the positioning processing device then performs rectification processing on the standard graphic by using the obtained deflection angle, so as to obtain the standard graphic as it would appear in an imaging plane parallel to the plane of the standard component.
  • distance measurement is then performed by using the proportional relationship between the pixel size of the standard graphic in the image and the actual physical size, so that the distance between the robot and the standard component and between the robot and the plane of the standard component is obtained.
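The two-step procedure above (deflection-angle estimation followed by rectified ranging) can be sketched as follows, assuming pure yaw, a square standard component, and an assumed focal length in pixels:

```python
import math

def deflection_and_distance(pixel_w, pixel_h, physical_w, physical_h,
                            focal_length_px):
    """Sketch of the two-step procedure: (1) estimate the yaw of the
    camera relative to the plane of the standard component from the
    foreshortening of its known aspect ratio (horizontal extent shrinks
    by cos(yaw) while vertical extent does not); (2) rectify the width
    and range with a pinhole model. Assumes pure yaw and a calibrated
    focal length -- both illustrative simplifications."""
    observed_ratio = (pixel_w / pixel_h) / (physical_w / physical_h)
    yaw = math.acos(max(-1.0, min(1.0, observed_ratio)))
    cos_yaw = math.cos(yaw)
    corrected_pixel_w = pixel_w / cos_yaw if cos_yaw else pixel_w
    distance = focal_length_px * physical_w / corrected_pixel_w
    return yaw, distance

# An outlet assumed 0.086 m wide and 0.086 m tall is seen 50 px wide and
# 100 px tall at an assumed 500 px focal length: the width is
# foreshortened to half, so yaw = 60 degrees.
yaw, dist = deflection_and_distance(50.0, 100.0, 0.086, 0.086, 500.0)
```

A full implementation would recover the pose from multiple feature points (e.g. the socket holes) rather than from a single aspect ratio, but the order of operations, angle first, then rectified ranging, matches the description above.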
  • the current position and orientation of the robot can be determined, that is, the positioning information of the robot in the current physical space is determined.
  • the obtained positioning information can be used in robot navigation, map creation, and map data correction.
  • the positioning system described in the present application can correct the map data in time according to the obtained positioning information, so that indoor map data can be constructed and used as accurately as possible without accurate indoor map data having to be obtained in advance.
  • the positioning processing apparatus is further configured to perform the step of adjusting map data in the robot based on positioning information of the robot in a current physical space.
  • the map data of the sweeping robot is generally constructed based on movement data provided by a plurality of movement sensors provided on the robot, such as a speed sensor, an odometer sensor, a ranging sensor, a cliff sensor, etc., and image data provided by the photographing device; since a cumulative error arises in the process of constructing the map with the movement sensors and the photographing device, when the robot moves to a certain position, the positioning information in the current physical space determined by the positioning system of the robot of the present application deviates from the map data, so the map data needs to be adjusted.
  • the positioning processing device adjusts the map data in the robot based on the positioning information after determining the positioning information of the robot in the current physical space.
  • the map data may include, but is not limited to, position and angle data drawn within a preset grid or coordinate space based on the movement data and image data, and landmark information marked in the preset grid or coordinate space based on the identified physical features, etc.
  • the map data can be used for map construction and navigation.
  • the positioning system of the robot of the present application recognizes the graphic of the physical object in the image taken by the imaging device using the positioning processing device and matches it against the graphic of the standard component stored in the storage device; determining the positioning information of the robot in the current physical space based on the standard physical features of the standard component and adjusting the map data in the robot based on that positioning information enables the positioning error in the map data to be compensated, so as to achieve accurate positioning.
  • the manner in which the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space comprises: compensating for a positioning error of the corresponding position in the map data based on the positioning information. And/or compensating for a positioning error in the landmark information associated with the standard component in the map data based on the positioning information.
  • when the robot does not recognize a standard component, it constructs the map data using the movement data provided by the movement sensors; when the robot recognizes a standard component and determines positioning information based on the identified standard graphic and the standard physical features of the corresponding standard component, the positioning processing device may compare whether that positioning information and the positioning information based on the movement data deviate in position and angle, and if so, replace the movement-data positioning information with the positioning information derived from the standard component.
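The deviation check and replacement described above can be sketched as follows; the tolerance value is an illustrative assumption:

```python
def choose_pose(pose_motion, pose_standard, tol=1e-2):
    """If the pose from the identified standard component deviates from
    the pose accumulated from movement data by more than a tolerance,
    adopt the standard-component pose; otherwise keep the movement
    pose. The tolerance is an illustrative assumption."""
    deviation = max(abs(m - s) for m, s in zip(pose_motion, pose_standard))
    return pose_standard if deviation > tol else pose_motion

# With a 0.3 m deviation, the standard-component pose is adopted.
pose = choose_pose((2.0, 1.0), (2.3, 1.0))
```

The tolerance prevents constant small corrections from thrashing the map; only deviations attributable to accumulated sensor error trigger replacement.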
  • referring to FIG. 3, positions A1 and A2 are positioning information determined based on the movement data.
  • the positioning processing device identifies the standard component at position A2 and determines the current positioning information B based on the identified standard component; position A2 is then replaced with positioning information B, which is located one grid cell to the left of A2.
  • the positioning information of A1 is likewise adjusted one grid cell to the left to obtain A1', indicated by a broken line, so as to compensate for the positioning error of the movement data at position A1.
  • landmark data is included in the map data.
  • the landmark information is used to help the robot use the image for positioning.
  • the landmark information includes, but is not limited to, features of standard components and non-standard components (such as edges, corners, contours, brightness features, etc.), together with the position and angle information at which these features were captured.
  • a piece of landmark information usually includes features of both standard components and non-standard components, or features of non-standard components only.
  • adjacent pieces of landmark information usually contain both overlapping and non-overlapping features.
  • the non-standard parts refer to objects that are not previously defined as standard parts, such as tables, chairs, walls, and the like.
  • the positioning information of locations C1 and C2 is included in the constructed map data, wherein location C1 includes landmark information t1 and location C2 includes landmark information t2; t1 includes the features of the standard component and of the non-standard components, together with the position and deflection angle at which each feature was captured, and t2 includes the non-standard features that overlap with t1, together with the position and deflection angle of each corresponding feature.
  • when the positioning processing device identifies the standard component at position C1 and determines the current positioning information D based on the identified standard component, the positioning information at position C1 is replaced with D to compensate for the positioning error in the original map data, and the position and yaw-angle errors in landmark information t1 are adjusted based on D. By means of the overlapping non-standard features in landmark information t1 and t2, this step can also adjust the positioning error at the later position C2 and in landmark information t2.
  • the above manner of error compensation for the positioning information in the map data is only an example; since the errors of the movement data and the image data accumulate differently, the positioning errors at different positions are not necessarily identical, so the positioning error in the map data may also be compensated by a weighted compensation method.
  • this step can adjust the positioning information of one or more locations in the map data, as well as one or more positioning errors contained at the same location.
  • the positioning processing apparatus is further configured to adjust a navigation route of the robot based on positioning information of the robot in a current physical space.
  • the map data of the sweeping robot is generally constructed based on moving data provided by a plurality of moving sensors provided on the robot, such as a speed sensor, an odometer sensor, and the like, and image data provided by the photographing device. Since there is a cumulative error in the process of constructing the map by using the motion sensor and the photographing device, when the robot moves to a certain position, the positioning information determined by the positioning system of the robot of the present application in the current physical space deviates from the map data. Therefore, in the case where the navigation operation is performed according to the constructed map, the positioning processing device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space after determining the positioning information of the robot in the current physical space. In addition, the positioning processing device may further adjust the distance that the robot continues to move according to the adjusted navigation route or adjust the direction of the robot movement to move the robot according to the adjusted navigation route.
  • the positioning system of the robot of the present application identifies the graphic of the physical object in the captured image, matches it against the graphic of the standard component, and determines the positioning information of the robot in the current physical space based on the standard physical features of the standard component; the technical solution of adjusting the navigation route of the robot based on this positioning information achieves the purpose of correcting the navigation route under accurate positioning.
  • the manner in which the positioning processing device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space includes: re-determining the robot in the preset map data based on the positioning information. Position and orientation, and adjust the navigation route based on the redefined position and orientation.
  • the positioning processing device in the positioning system of the robot of the present application determines the positioning information of the robot in the current physical space, that is, the current actual position and orientation of the robot.
  • the current actual position and orientation of the robot deviate from the position and orientation of the robot on the constructed map. The positioning processing device corrects the position and orientation of the robot on the constructed map based on the current actual position and orientation, that is, it re-determines the position and orientation of the robot in the constructed map; it then re-determines, based on the re-determined position and orientation and the constructed map, the distance of the robot from obstacles, whether the robot has yawed, and the distance and deflection direction the robot must move if it continues along the route in the constructed map, thereby adjusting the navigation route.
  • FIG. 7 is a schematic structural view of the robot of the present application in an embodiment.
  • the robot includes a driving device 215, an imaging device 211, a storage device 212, and a positioning processing device 213.
  • the driving device 215 is used to drive the robot to perform displacement and/or posture adjustment.
  • the driving device may be, for example, a driving motor that drives the rollers of the cleaning robot to move and to deflect.
  • the camera device 211 is configured to capture an image in a navigation operation environment of the robot.
  • the image pickup apparatus includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the robot, and the camera device starts capturing images during the power-on movement of the robot. Further, the image pickup device may be provided on a main body of the robot.
  • the camera device may be disposed at the middle or at the edge of the top cover of the sweeping robot, or below the plane of the top surface of the sweeping robot, near the geometric center of the body or near the edge of the body.
  • the optical axis of the imaging device may be at an angle of ⁇ 30° with respect to the vertical, or the optical axis of the imaging device may be at an angle of 0-180° with respect to the horizontal.
  • the navigation operation environment refers to an environment in which a robot designs a navigation route based on the constructed map data or moves based on a randomly designed navigation route and performs corresponding operations.
  • the navigation operating environment refers to an environment in which the sweeping robot moves according to the navigation route and performs a cleaning operation.
  • the storage device 212 is configured to pre-store a positioning and map construction application and a behavior control application, and standard physical features of at least one standard.
  • the positioning and map construction application is a basic application in the field of intelligent robots. The problem it addresses can be described as follows: when a robot is placed in an unknown environment, is there a way for it to gradually draw a complete map of that environment while deciding which direction it should travel? That is, achieving autonomy requires three tasks to be completed: the first is localization, the second is mapping, and the third is subsequent path planning (navigation).
  • the behavior control application in the present application refers to controlling robot movement, posture adjustment, and the like according to the set information or instructions.
  • the storage device also pre-stores standard physical features of each standard component.
  • the standard component may include a standard component designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  • industry standards such as mechanical industry standard JB, building materials industry standard JC, etc.; national standards such as China GB standard, German DIN standard, British BS standard, etc.; international standards such as international ISO standards; custom standards will be detailed later.
  • the standard physical features may include a profile size, a standard structural relationship, etc., for example, standard physical features of the standard component include actual physical length, width, and height of the standard component, actual physical other size data of the corresponding standard in the standard component, and the like. .
  • for example, the spacing between the two holes of a power socket; another example is the length and width of a power socket.
  • the storage device 212 includes, but is not limited to, high-speed random access memory and non-volatile memory, for example, one or more disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • storage device 212 may also include memory remote from the one or more processors, such as network-attached storage accessed via RF circuitry or external ports and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • the memory controller can control access to the storage device by other components of the robot, such as the CPU and peripheral interfaces.
  • the positioning processing device 213 is connected to the driving device 215, the imaging device 211, and the storage device 212.
  • the positioning processing device 213 can include one or more processors.
  • Positioning processing device 213 is operatively coupled to volatile memory and/or non-volatile memory in storage device 212.
  • the location processing device 213 can execute instructions stored in the memory and/or non-volatile storage device to perform operations in the robot, such as identifying the graphics of objects in the image and performing positioning on the map based on the identified standard graphics and the standard physical features of the standard components.
  • the processor may include one or more general purpose microprocessors, one or more application specific processors (ASICs), one or more digital signal processors (DSPs), one or more field programmable logic arrays (FPGAs) , or any combination of them.
  • the positioning processing device is also operatively coupled to an I/O port and an input structure that enables the robot to interact with various other electronic devices that enable the user to interact with the computing device .
  • the input structure can include buttons, keyboards, mice, trackpads, and the like.
  • the other electronic device may be a mobile motor in the mobile device in the robot, or a slave processor in the robot dedicated to controlling the mobile device and the cleaning device, such as an MCU (Microcontroller Unit, MCU for short).
  • the positioning processing device 213 is connected to the storage device 212, the camera device 211, and the driving device 215 via data lines, respectively.
  • the positioning processing device 213 interacts with the storage device 212 by a data reading and writing technology, and the positioning processing device 213 interacts with the camera device 211 and the driving device 215 through an interface protocol.
  • the data reading and writing technology includes but is not limited to: a high speed/low speed data interface protocol, a database read and write operation, and the like.
  • the interface protocols include, but are not limited to, an HDMI interface protocol, a serial interface protocol, and the like.
  • the positioning processing device 213 is configured to invoke the pre-stored positioning and map construction application to perform: identifying a graphic of a physical object in the image, and acquiring the standard component when the identified at least one graphic is a standard graphic corresponding to a standard component Standard physical characteristics; determining positioning information of the robot in the current physical space based on the standard graphics and standard physical characteristics of the standard component.
  • the positioning processing device 213 can adopt an image recognition method based on a neural network, an image recognition method based on wavelet moments, and the like to process, analyze, and understand the captured image so as to identify target objects of various modes.
  • the positioning processing device can also seek similar image targets by analyzing the correspondence, similarity and consistency of image content, features, structures, relationships, textures, and gradations.
  • the sweeping robot is taken as an example. Since the sweeping robot usually performs indoor cleaning work, the objects in the image taken by the camera device generally include, for example, a wall, a table, a sofa, a wardrobe, a television, and a power socket. , network cable outlets, etc.
  • the image pickup device supplies the image to the positioning processing device after the image is taken in the navigation operation environment of the robot, and the positioning processing device recognizes the graphic of the physical object in the captured image by image recognition.
  • the graphic of the physical object can be characterized by features such as grayscale of the real object, contour of the physical object, and the like.
  • the graphic of the physical object is not limited to the external geometric figure of the physical object, and may include other graphics presented on the physical object, such as a two-hole socket on the power socket, a five-hole socket, a square socket on the network cable socket, and the like.
  • the five-hole jack of a power socket and the square jack of a network cable socket can be used to distinguish between the two.
  • an image captured indoors by the camera device of the cleaning robot may include a power socket or a network cable socket; since power sockets and network cable sockets are designed according to the GB standard, they do not differ depending on the environment in which they are installed.
  • in this example, a power socket is used as the standard component.
  • the standard physical characteristics of the standard parts may include the length, width, and height of the power socket, and the structural relationship of the five-hole socket on the power socket.
  • the graphics of the standard and the standard physical features of the standard may be preset and pre-stored using the storage device of the robot.
  • the manner in which the standard physical features of the standard are obtained includes reading the preset standard physical features from the storage device of the robot.
  • the positioning processing device 213 determines, through analysis of the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray scale, and the like, whether the identified at least one graphic corresponds to the stored graphic of the power socket; when it does, the standard physical features of the power socket are obtained.
  • the at least one graphic corresponding to the graphic of the stored power outlet is referred to as a standard graphic.
  • the standard graphic is a graphic of the power socket taken.
  • the positioning processing device 213 calculates the distance and deflection angle of the robot relative to the standard component in the current physical space by using the preset correspondence between unit pixel spacing and unit length in the actual physical space, together with the size of the identified standard graphic and the physical size in the corresponding standard physical features; that is, the positioning information of the robot relative to the standard component is obtained.
  • when the positioning processing device recognizes the socket and the boundary line between the wall and the floor, or recognizes the socket and assumes by default that the socket is mounted on a wall, then according to the above correspondence the positioning processing device can obtain not only the distance and deflection angle of the robot from the socket, but also, by using the spatial positional relationship among the wall, the robot, and the socket, the straight-line distance between the robot and the wall, thus obtaining the positioning information of the robot relative to the wall.
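The scale-based ranging described above can be sketched with a pinhole-camera model. This is a minimal illustration under assumed conditions (a calibrated focal length in pixels, a standard component of known physical width); the function name and parameters are hypothetical, not taken from the patent.

```python
import math

def locate_relative_to_marker(pixel_width, pixel_offset_x, real_width_m, focal_px):
    """Estimate range and bearing to a standard component (e.g. a power
    socket of known physical width) from its apparent size in the image.
    pixel_width: apparent width of the component in pixels
    pixel_offset_x: horizontal offset of the component from the image centre
    real_width_m: true physical width of the component in metres
    focal_px: camera focal length expressed in pixels (assumed calibrated)
    """
    # Similar triangles: distance = f * W / w
    distance_m = focal_px * real_width_m / pixel_width
    # Bearing of the component relative to the optical axis
    bearing_rad = math.atan2(pixel_offset_x, focal_px)
    return distance_m, bearing_rad

# Example: an 86 mm-wide socket imaged 86 px wide, centred, focal length 600 px
d, b = locate_relative_to_marker(86, 0, 0.086, 600)
# d ≈ 0.6 m, b == 0.0 rad
```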
  • the robot of the present application uses the positioning processing device 213 to identify the graphics of physical objects in the image captured by the imaging device 211, match them against the graphics of the standard components stored in the storage device 212, and determine the positioning information of the robot in the current physical space based on the standard physical features of the standard components; this technical solution solves the prior-art problem that positioning a robot using data provided by sensors alone is inaccurate.
  • FIG. 8 is a schematic structural diagram of the robot of the present application in another embodiment.
  • the robot also includes a network access device 214.
  • the network access device 214 is connected to the location processing device 213, and the network access device 214 is configured to acquire corresponding standard physical features from the remote server.
  • the positioning processing device provides the search request, that is, the standard graphic, to a remote server through the network; the remote server performs retrieval using a CBIR (Content-Based Image Retrieval) method and, when the search result indicates that the standard component graphic has been retrieved, outputs the standard physical features of the standard component based on the retrieved graphic and provides them to the positioning processing device through the network for subsequent processing.
  • the standard may include standard components designed based on custom criteria in addition to standard components that may be designed based on at least one of industry standards, national standards, and international standards.
  • the standard parts of the custom standard may be standard parts customized by the robot manufacturer, for example, standard parts designed and manufactured by the robot manufacturer and can be used in the robot working environment and used in conjunction with the robot.
  • the standard component of the custom standard may also be a standard component that generates standard physical features by physical parameters input by the user.
  • the sweeping robot is taken as an example.
  • the object in the image taken by the sweeping robot through the camera device during the working of the living room for example, generally includes a household appliance such as a television, so that the user can customize the television as a standard component.
  • the user can obtain the physical parameters of the television set by reading the manual of the television or querying the product information, and input the physical parameters via the input device, such as the robot application APP, to generate standard components having standard physical features.
  • the user can also select other physical objects as standard parts according to the indoor environment.
  • the robot of the present application allows the user to select standard parts according to the indoor environment through the user-defined standard parts, which helps the accurate positioning of the robot.
  • the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard component by: performing deflection-angle correction processing on the standard graphic based on the standard physical features, to obtain the deflection angle of the robot relative to the plane in which the standard component is located; and performing distance measurement on the corrected graphic based on the standard physical features, to obtain the distance between the robot and the standard component and between the robot and the plane of the standard component.
  • for example, using the imaged spacing of the two holes of the power socket, the positioning processing device can determine the deflection angle between the imaging plane of the camera device and the plane in which the power socket is located, and can take this as the deflection angle of the robot relative to the plane of the standard component.
  • the positioning processing device performs the rectification processing on the standard pattern by using the obtained deflection angle to obtain a standard pattern in the imaging plane parallel to the plane of the standard component.
  • the distance measurement is performed by using the proportional relationship between the pixel size of the standard graphic in the image and the actual physical size, so that the distance between the robot and the plane of the standard component and the standard component is obtained.
  • the current position and orientation of the robot can be determined, that is, the positioning information of the robot in the current physical space is determined.
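The two-step procedure above (deflection-angle correction, then ranging on the rectified graphic) can be illustrated with a simplified model in which the foreshortening of a rectangle of known aspect ratio encodes the yaw. This is a hedged sketch under idealized assumptions (pure yaw, pinhole camera, no lens distortion); the names and values are illustrative only.

```python
import math

def deflection_and_distance(obs_w_px, obs_h_px, real_w_m, real_h_m, focal_px):
    """1) The horizontal foreshortening of a rectangle of known aspect
       ratio (e.g. a socket face) yields the yaw of the camera relative
       to the wall plane: the observed width shrinks by cos(yaw).
    2) The width is then rectified and used for similar-triangle ranging.
    """
    ratio_obs = obs_w_px / obs_h_px
    ratio_true = real_w_m / real_h_m
    cos_yaw = min(1.0, ratio_obs / ratio_true)
    yaw_rad = math.acos(cos_yaw)
    rect_w_px = obs_w_px / cos_yaw        # rectified (fronto-parallel) width
    distance_m = focal_px * real_w_m / rect_w_px
    return yaw_rad, distance_m

# Example: a square 86 mm socket face imaged 43 px wide and 86 px tall
yaw, dist = deflection_and_distance(43, 86, 0.086, 0.086, 600)
# yaw ≈ 60°, dist ≈ 0.6 m
```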
  • the obtained positioning information can be used in robot navigation, map creation, and map data correction.
  • the robot described in the present application can correct the map data in time according to the obtained positioning information, so that indoor map data can be constructed and used as accurately as possible without obtaining accurate indoor map data in advance.
  • the positioning processing device invokes the positioning and map construction application to perform a step of adjusting map data in the robot based on positioning information of the robot in a current physical space.
  • the map data of the sweeping robot is generally constructed based on movement data provided by a plurality of motion sensors provided on the robot, such as a speed sensor, an odometer sensor, a ranging sensor, a cliff sensor, etc., together with image data provided by the camera device. A cumulative error arises in the process of constructing the map using the motion sensors and the camera device, so when the robot moves to a certain position, the positioning information the robot determines in the current physical space deviates from the map data, and the map data therefore needs to be adjusted.
  • the positioning processing device adjusts the map data in the robot based on the positioning information after determining the positioning information of the robot in the current physical space.
  • the map data may include, but is not limited to, position and angle data drawn within a preset grid or coordinate space based on movement data and image data provided by the motion sensor, based on the identified physical features in a preset grid or Landmark information marked in the coordinate space, etc.
  • the map data can be used for map construction and navigation.
  • the robot of the present application identifies a figure of a physical object in an image taken by the image pickup device by using a positioning processing device and matches a pattern of a standard piece stored in the storage device, and determines a robot based on standard physical characteristics of the standard piece
  • the positioning information in the current physical space and the technical solution for adjusting the map data in the robot based on the positioning information enable the positioning error in the map data to be compensated according to the positioning information, thereby achieving the purpose of accurate positioning.
  • the manner in which the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space comprises: compensating for a positioning error of the corresponding position in the map data based on the positioning information. And/or compensating for a positioning error in the landmark information associated with the standard component in the map data based on the positioning information.
  • when the robot does not recognize a standard component, it constructs the map data using the movement data provided by the motion sensors; when the robot recognizes a standard component and determines positioning information based on the identified standard graphic and the standard physical features of the corresponding standard component, the positioning processing device may compare whether this positioning information and the positioning information based on the movement data deviate in position and angle, and if so, replace the movement-data-based positioning information with the positioning information determined from the standard component.
  • as shown in FIG. 3:
  • the positions A1 and A2 are positioning information determined based on the movement data.
  • when the positioning processing device identifies the standard component at position A2 and determines current positioning information B based on it, position A2 is replaced with positioning information B; positioning information B is located one grid cell to the left of A2. The positioning information of A1 is likewise adjusted one grid cell to the left to obtain A1', indicated by a broken line, so as to compensate for the positioning error of the movement data at position A1.
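The grid example above (replacing A2 with B and shifting A1 by the same offset) can be sketched as follows; this is a minimal illustration with hypothetical names, not the patent's actual implementation.

```python
def compensate_track(track, observed_pose):
    """Replace the last odometry-derived pose with the marker-based fix
    and apply the same offset to the earlier poses on the track.
    Poses are (x, y) grid cells, as in the A1/A2 -> B example."""
    last_x, last_y = track[-1]
    dx = observed_pose[0] - last_x
    dy = observed_pose[1] - last_y
    return [(x + dx, y + dy) for (x, y) in track]

# A2=(5, 3) is corrected to B=(4, 3); A1=(2, 3) shifts one cell left to A1'
corrected = compensate_track([(2, 3), (5, 3)], (4, 3))
# corrected == [(1, 3), (4, 3)]
```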
  • landmark data is included in the map data.
  • the landmark information is used to help the robot use the image for positioning.
  • the landmark information includes, but is not limited to, features of standard and non-standard components (such as edges, corners, contours, brightness features, etc.), and can capture various position and angle information of these features.
  • a piece of landmark information usually includes features of both standard and non-standard components, or of non-standard components only.
  • Adjacent landmark information usually contains repeated and non-repeating features.
  • the non-standard parts refer to objects that are not previously defined as standard parts, such as tables, chairs, walls, and the like.
  • the constructed map data includes positioning information for locations C1 and C2, where location C1 includes landmark information t1 and location C2 includes landmark information t2; t1 includes features of the standard component and of non-standard components, together with the captured position and deflection angle of each feature, and t2 includes non-standard-component features overlapping with those in t1, together with the position and deflection angle of each feature.
  • when the positioning processing device identifies the standard component at position C1 and determines current positioning information D based on it, the positioning information at position C1 is replaced with positioning information D to compensate for the positioning error in the original map data, and the position and deflection-angle errors in landmark information t1 are adjusted based on positioning information D. By means of the overlapping non-standard-component features in landmark information t1 and t2, this step can also adjust the positioning errors in the later position C2 and in landmark information t2.
  • the above manner of compensating for positioning errors in the map data is only an example. Since the errors of the movement data and the image data accumulate, the positioning errors at different positions are not necessarily identical; therefore, the positioning errors in the map data can also be compensated using a weighted compensation method.
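A weighted compensation of the kind just mentioned can be sketched as a simple blend between the odometry estimate and the marker-based fix. The weight value is an assumption for illustration; setting it to 1.0 reduces to the full-replacement scheme described earlier.

```python
def weighted_fix(odometry_pose, marker_pose, weight=0.7):
    """Blend the marker-based positioning fix with the odometry estimate
    instead of replacing it outright. `weight` is the (assumed) confidence
    placed on the marker observation."""
    return tuple(weight * m + (1.0 - weight) * o
                 for o, m in zip(odometry_pose, marker_pose))

# With equal confidence in both sources, the result is the midpoint
p = weighted_fix((5.0, 3.0), (4.0, 3.0), weight=0.5)
# p == (4.5, 3.0)
```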
  • this step can adjust the positioning information of one or more locations in the map data, as well as one or more positioning errors contained at the same location.
  • the positioning processing device is further configured to invoke the behavior control application to perform a step of controlling the driving device to adjust a displacement and/or a posture of the robot according to the positioning information.
  • depending on the relationship between the positioning information of the robot in the current physical space and the positioning information in the constructed map, the positioning processing device may drive the robot to perform only displacement adjustment, only posture adjustment, or both displacement and posture adjustment. For example, the positioning processing device controls the steering and the rotational speed of the driving motors in the driving device according to the positioning information.
  • for example, the robot includes two driving motors, one corresponding to each set of rollers, and the positioning processing device controls each driving motor to drive its rollers at a different rotational speed according to the positioning information, so that the robot changes its traveling direction.
  • the manner in which the positioning processing device drives the robot to perform displacement and/or posture adjustment according to the positioning information comprises: adjusting navigation of the robot based on positioning information of the robot in a current physical space. a route; controlling the drive device to perform displacement and/or attitude adjustment according to a navigation route.
  • the positioning processing device adjusts a navigation route of the robot based on positioning information of the robot in a current physical space.
  • the map data of the sweeping robot is generally constructed based on movement data provided by a plurality of motion sensors provided on the robot, such as a speed sensor, an odometer sensor, and the like, together with image data provided by the camera device. Since a cumulative error arises in constructing the map using the motion sensors and the camera device, when the robot moves to a certain position, the positioning information of the robot determined in the current physical space deviates from the map data. Therefore, when a navigation operation is performed according to the constructed map, the positioning processing device, after determining the positioning information of the robot in the current physical space, adjusts the navigation route of the robot based on that positioning information.
  • the positioning processing device controls the driving device to perform displacement and/or attitude adjustment according to the navigation route. That is, the positioning processing device adjusts the distance that the robot continues to move or adjusts the direction in which the robot moves to control the driving device to move according to the adjusted navigation route.
  • the robot of the present application determines the positioning information of the robot in the current physical space based on the standard physical characteristics of the standard component by using the graphic of the physical object in the captured image and matching with the graphic of the standard component, based on the positioning information.
  • the technical solution of adjusting the displacement and/or posture adjustment of the robot enables the navigation route to be corrected and the displacement and posture of the robot to be adjusted in the case of accurate positioning so that the robot can move according to the new navigation route.
  • the manner in which the positioning processing device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space includes: re-determining the robot in the preset map data based on the positioning information. Position and orientation, and adjust the navigation route based on the redefined position and orientation.
  • when the robot performs a navigation operation according to the constructed map and moves to a certain position, the position and orientation of the robot on the constructed map are known.
  • the positioning processing device in the robot determines the positioning information of the robot in the current physical space, that is, the current actual position and orientation of the robot.
  • the current actual position and orientation of the robot deviate from the position and orientation of the robot on the constructed map. The positioning processing device corrects the position and orientation of the robot on the constructed map based on the current actual position and orientation, that is, it re-determines the position and orientation of the robot in the constructed map; it then re-determines, based on the re-determined position and orientation and the constructed map, the distance of the robot from obstacles, whether the robot has yawed, and the distance and deflection direction the robot must move if it continues along the route in the constructed map, thereby adjusting the navigation route.
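Re-determining the pose on the map and deriving the remaining travel distance and deflection direction can be sketched as below. This is a minimal illustration with hypothetical names, assuming a single next waypoint on the planned route.

```python
import math

def replan_step(pose, heading_rad, waypoint):
    """After the robot's true pose on the map has been re-determined,
    recompute the remaining distance to the next waypoint and the signed
    heading correction needed to stay on the navigation route."""
    dx = waypoint[0] - pose[0]
    dy = waypoint[1] - pose[1]
    distance = math.hypot(dx, dy)
    yaw_err = math.atan2(dy, dx) - heading_rad
    # Wrap the correction into (-pi, pi]
    yaw_err = math.atan2(math.sin(yaw_err), math.cos(yaw_err))
    return distance, yaw_err

# Robot at the origin, facing +x; next waypoint at (3, 4)
dist, err = replan_step((0.0, 0.0), 0.0, (3.0, 4.0))
# dist == 5.0; err ≈ atan2(4, 3) ≈ 0.927 rad (turn left)
```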
  • the present application also provides a storage medium for an electronic device, the storage medium storing one or more programs, when the one or more computer programs are executed by one or more processors, The one or more processors implement the positioning method of any of the foregoing.
  • portions of the technical solution of the present application that in essence contribute over the prior art may be embodied in the form of a software product, which may include one or more machine-readable media storing machine-executable instructions thereon; when executed by one or more machines such as a computer, computer network, or other electronic device, the instructions cause the one or more machines to perform operations in accordance with the embodiments of the present application, for example, the steps of the positioning method of the robot.
  • the machine-readable medium can include, but is not limited to, a floppy disk, an optical disk, a CD-ROM (Compact Disc Read-Only Memory), a magneto-optical disk, a ROM (Read-Only Memory), a RAM (Random Access Memory), an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing machine-executable instructions.
  • the storage medium may be located in the robot or in a third-party server, for example, in a server of an application store; the specific application store is not restricted, and examples include the Huawei application store, the Apple App Store, etc.
  • This application can be used in a variety of general purpose or special purpose computing system environments or configurations.
  • the application can be described in the general context of computer-executable instructions executed by a computer, such as a program module.
  • program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types.
  • the present application can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are connected through a communication network.
  • program modules can be located in both local and remote computer storage media including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

A positioning method, a positioning system, and a robot using the same. The positioning method comprises: capturing an image in a navigation operating environment of a robot (S110); identifying graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquiring standard physical features of the standard part (S120); and determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part (S130). By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, and determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, the positioning method solves the prior-art problem that robot positioning based on sensor-provided data is inaccurate.

Description

Positioning Method and System, and Robot Using the Same

Technical Field
The present application relates to the technical field of indoor positioning, and in particular to a positioning method, a positioning system, and a robot using the same.
Background
A mobile robot is a machine that performs work automatically. It can accept human command, run pre-programmed routines, or act according to principles formulated with artificial-intelligence techniques. Such mobile robots can be used indoors or outdoors, in industry or at home; they can replace security patrols or manual floor cleaning, and can also serve for family companionship, office assistance, and so on. Mobile robots applied in different fields move in different ways; for example, a mobile robot may move on wheels, walk, or move on tracks. As robot mobility technology evolves, SLAM (Simultaneous Localization and Mapping) and VSLAM (Visual SLAM), based on movement information provided by sensors and image sensors, give mobile robots more accurate navigation capability and enable more effective autonomous movement. However, taking a cleaning robot as an example, the distance its rollers actually travel differs on floors of different materials, and the map error of image-sensor-based VSLAM also grows over time, so the map built by SLAM or VSLAM in this field may deviate considerably from the map of the actual physical space.
Summary
In view of the above shortcomings of the prior art, the purpose of the present application is to provide a positioning method, a positioning system, and a robot using the same, so as to solve the prior-art problem that robot positioning based on sensor-provided data is inaccurate.
To achieve the above and other related purposes, a first aspect of the present application provides a positioning method for a robot, comprising: capturing an image in a navigation operating environment of the robot; identifying graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquiring standard physical features of the standard part; and determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
A second aspect of the present application further provides a positioning system for a robot, comprising: a camera device for capturing images in a navigation operating environment of the robot; a storage device pre-storing a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part; and a positioning processing device, connected to the camera device and the storage device, configured to invoke the localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
A third aspect of the present application further provides a robot, comprising: a driving device for driving the robot to perform displacement and/or attitude adjustment; a camera device for capturing images in a navigation operating environment of the robot; a storage device pre-storing a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part; and a positioning processing device, connected to the driving device, the camera device, and the storage device, configured to invoke the pre-stored localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
A fourth aspect of the present application further provides a storage medium for an electronic device, storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement any of the foregoing positioning methods.
As described above, the positioning method and system and the applicable robot of the present application have the following beneficial effect: by identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, and determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, the present application solves the prior-art problem that robot positioning based on sensor-provided data is inaccurate.
Brief Description of the Drawings
FIG. 1 is a flowchart of the robot positioning method of the present application in one embodiment.
FIG. 2 is a flowchart of the robot positioning method of the present application in another embodiment.
FIG. 3 is a schematic diagram of using positioning information B, determined by the positioning method of the present application, to compensate the positioning errors of positions A1 and A2 in the map data.
FIG. 4 is a flowchart of the robot positioning method of the present application in yet another embodiment.
FIG. 5 is a schematic structural diagram of the robot positioning system of the present application in one embodiment.
FIG. 6 is a schematic structural diagram of the robot positioning system of the present application in another embodiment.
FIG. 7 is a schematic structural diagram of the robot of the present application in one embodiment.
FIG. 8 is a schematic structural diagram of the robot of the present application in another embodiment.
Detailed Description
The following specific embodiments illustrate the implementation of the present application; those skilled in the art can easily understand other advantages and effects of the present application from the content disclosed in this specification.
In the following description, reference is made to the accompanying drawings, which describe several embodiments of the present application. It should be understood that other embodiments may also be used, and that mechanical, structural, electrical, and operational changes may be made without departing from the spirit and scope of the present application. The following detailed description should not be considered limiting, and the scope of the embodiments of the present application is defined only by the claims of the granted patent.
Furthermore, as used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It should be further understood that the terms "comprise" and "include" indicate the presence of the stated features, steps, operations, elements, components, items, categories, and/or groups, but do not exclude the presence, occurrence, or addition of one or more other features, steps, operations, elements, components, items, categories, and/or groups. The terms "or" and "and/or" used herein are interpreted as inclusive, meaning any one or any combination. Therefore, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.
Based on the accumulation of continuous positioning combined with other preset or acquired movement-related information, a mobile robot can, on the one hand, build map data of the site where it is located and, on the other hand, provide route planning, route-planning adjustment, and navigation services based on the map data already built, which makes the robot's movement more efficient. Taking a cleaning robot as an example, an indoor cleaning robot can combine a built indoor map with positioning technology to predict the distance between its current position and obstacles marked on the indoor map, so as to adjust its cleaning strategy in time. An obstacle may be described by a single mark, or may be labeled as a wall, table, sofa, or wardrobe based on recognition of its shape, size, and the like. As another example, an indoor cleaning robot can accumulate the positions and orientations it has determined based on positioning technology, and build an indoor map according to the changes of the accumulated positions and orientations. Taking a patrol robot as an example, patrol robots are usually applied in scenarios such as factories and industrial parks; a patrol robot can combine a built factory map with positioning technology to predict the distance from its current position to a corner, an intersection, a charging pile, and the like, so as to control its moving device in time according to other acquired monitoring data.
Extending from the above examples to mobile robots used in other application scenarios, in order to improve positioning accuracy and reduce accumulated sensor error, the present application provides a positioning method for a robot. The positioning method can be used in a cleaning robot. Please refer to FIG. 1, which is a flowchart of the robot positioning method of the present application in one embodiment. As shown, the positioning method of the present application comprises step S110, step S120, and step S130.
In step S110, an image is captured in the navigation operating environment of the robot.
Here, a camera device may be used to capture images in the navigation operating environment of the robot. The camera device includes, but is not limited to: a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like. The power supply of the camera device may be controlled by the power supply system of the robot, and the camera device starts capturing images while the robot is powered on and moving. In addition, the camera device may be arranged on the main body of the robot. Taking a cleaning robot as an example, the camera device may be arranged in the middle or at the edge of the top cover of the cleaning robot, or below the plane of the top surface, on a recessed structure near the geometric center of the main body or near the edge of the main body. The optical axis of the camera device may form an angle of ±30° with the vertical, or an angle of 0-180° with the horizontal. The navigation operating environment refers to the environment in which the robot moves and performs corresponding operations according to a navigation route designed from built map data, or according to a randomly designed navigation route. For a cleaning robot, the navigation operating environment is the environment in which it moves along a navigation route and performs cleaning operations.
In step S120, graphics of physical objects in the image are identified and, when at least one identified graphic is a standard graphic corresponding to a standard part, standard physical features of the standard part are acquired.
Here, a positioning processing device may be used to identify the graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part. The positioning processing device may comprise one or more processors. A processor may process, analyze, and understand the captured image using image-recognition methods such as neural-network-based or wavelet-moment-based image recognition, so as to identify targets and objects of various patterns. In addition, the processor may search for similar image targets by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like. The above processors may be shared or set independently.
In addition, the standard part may include a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard. For example, industry standards include the machinery industry standard JB and the building-materials industry standard JC; national standards include the Chinese GB standard, the German DIN standard, and the British BS standard; international standards include ISO standards; custom standards are detailed later. The standard physical features may include outline dimensions, standard structural relationships, and the like; for example, the standard physical features of a standard part include its actual physical length, width, and height, and other actual physical dimensional data corresponding to the standard, such as the spacing between the two holes of a power socket, or the length and width of a power socket.
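As a concrete illustration of how such known dimensions could be held in software (a sketch only — the class name, field names, and the 86 mm face-plate and hole-spacing values are assumptions for the example, not values taken from the application), a standard part's physical features might be represented as:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StandardPart:
    """Known real-world dimensions of a standard part (illustrative fields)."""
    name: str
    width_mm: float         # actual physical width of the part
    height_mm: float        # actual physical height of the part
    hole_spacing_mm: float  # e.g. spacing between two socket holes

# Hypothetical entry: a two-hole wall socket on an 86 mm x 86 mm face plate.
SOCKET = StandardPart(name="two_hole_socket", width_mm=86.0,
                      height_mm=86.0, hole_spacing_mm=17.5)
```

Such a record is what the robot would read back from its storage device, or receive from a remote server, once a captured graphic has been matched to the part.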
In one embodiment, taking a cleaning robot as an example, since a cleaning robot usually performs indoor cleaning work, the physical objects in the images captured by the camera device generally include, for example, walls, tables, sofas, wardrobes, television sets, power sockets, network-cable sockets, and the like. In this example, the camera device first captures an image in the navigation operating environment of the robot and provides it to the positioning processing device, which identifies the graphics of the physical objects in the captured image through image recognition. The graphic of a physical object may be characterized by features such as its gray levels and outline. Meanwhile, the graphic of a physical object is not limited to its external geometric figure, and may also include other graphics presented on the object, such as the two-hole or five-hole sockets on a power socket, or the square jack on a network-cable socket. In view of this, a power socket and a network-cable socket with similar external geometry can, for example, be distinguished by the five-hole sockets of the power socket versus the square jack of the network socket. Moreover, where the physical objects in the indoor images captured by the cleaning robot's camera device include power sockets and network-cable sockets, since these sockets are designed according to the GB standard and therefore do not vary with their environment, they can serve as standard parts. In this example, a power socket is selected as the standard part; its standard physical features may include the length, width, and height of the power socket and the structural relationship of its five-hole socket. In some embodiments, the graphic of the standard part and its standard physical features may be preset and pre-stored in the storage device of the robot; accordingly, acquiring the standard physical features of the standard part includes reading the preset standard physical features from the storage device of the robot.
Then, for the identified graphics of the physical objects in the image and the stored graphic of the standard part, i.e. the power socket, the positioning processing device determines whether at least one identified graphic corresponds to the stored graphic of the power socket by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like; when at least one identified graphic corresponds to the stored graphic of the power socket, the standard physical features of the power socket are acquired. The at least one graphic corresponding to the stored graphic of the power socket is called the standard graphic; in this example, the standard graphic is the captured graphic of the power socket.
In step S130, positioning information of the robot in the current physical space is determined based on the standard graphic and the standard physical features of the standard part.
Here, the positioning processing device may determine the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part. The positioning processing device includes a processor. In this example, the processor uses a preset correspondence between unit pixel intervals and unit lengths in the actual physical space, together with the size of the identified standard graphic and the physical size in the corresponding standard physical features, to calculate the distance and deflection angle of the robot from the standard part in the current physical space, thereby obtaining positioning information of the robot relative to the standard part. Taking a socket mounted on a wall as an example, when the processor identifies the socket and the boundary line between the wall and the floor, or identifies the socket and assumes by default that the socket is mounted on a wall, the processor can, according to the above correspondence, obtain not only the distance and deflection of the robot from the socket, but also, using the spatial relationship among the wall, the robot, and the socket, the straight-line distance between the robot and the wall, thereby obtaining positioning information of the robot relative to the wall. The processor in this step may be shared with the processor in the preceding steps or set independently.
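The pixel-to-physical correspondence described above can be sketched with a pinhole-camera model. In this sketch, `focal_px` (the focal length expressed in pixels) and the function name are assumptions for illustration; the application itself only specifies that a preset correspondence between pixel intervals and physical lengths is used:

```python
import math

def locate_relative_to_part(real_width_mm, pixel_width, u_px, cx_px, focal_px):
    """Estimate range and bearing to a standard part from one image.

    real_width_mm: known physical width of the standard part
    pixel_width:   width of the part's graphic in the image, in pixels
    u_px:          horizontal pixel coordinate of the part's center
    cx_px:         horizontal pixel coordinate of the image center
    focal_px:      camera focal length expressed in pixels
    """
    # Pinhole model: the graphic's pixel size shrinks linearly with distance.
    distance_mm = focal_px * real_width_mm / pixel_width
    # Bearing of the part relative to the camera's optical axis.
    bearing_rad = math.atan2(u_px - cx_px, focal_px)
    return distance_mm, bearing_rad

# A part 86 mm wide spanning 43 px at a 600 px focal length, seen dead ahead:
d, b = locate_relative_to_part(86.0, 43.0, 320.0, 320.0, 600.0)
```

With these illustrative numbers the part is estimated to be 1200 mm away at a bearing of 0 rad, which is the kind of distance-and-deflection pair the processor turns into positioning information.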
By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, and determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, the robot positioning method of the present application solves the prior-art problem that robot positioning based on sensor-provided data is inaccurate.
In some embodiments, in step S120 the standard physical features of the standard part are acquired from a remote server over a network. In one embodiment, the positioning processing device includes a processor, which provides a retrieval request, i.e. the standard graphic, to the remote server over the network; the remote server performs retrieval using a CBIR (Content-Based Image Retrieval) method according to the request and determines a retrieval result, i.e. a retrieved standard-part graphic; the remote server then outputs the standard physical features of the standard part based on the obtained standard-part graphic and provides them to the processor over the network for subsequent processing.
In some embodiments, in addition to standard parts designed based on at least one of industry, national, and international standards, the standard part may also be designed based on a custom standard. A custom-standard part may be one defined by the robot manufacturer, for example a part designed and manufactured by the manufacturer to be placed in the robot's working environment and used together with the robot. A custom-standard part may also be a part whose standard physical features are generated from physical parameters input by the user. In one embodiment, taking a cleaning robot as an example, since the physical objects in the images captured while the robot works in, say, a living room generally include household appliances such as a television set, the user may define the television set as a standard part. Specifically, the user may obtain the physical parameters of the television set by reading its manual or looking up product information, and input these physical parameters through an input device, such as a robot application (APP), to generate a standard part with standard physical features. The user may also select other physical objects as standard parts according to the indoor environment. By allowing user-defined standard parts, the positioning method of the present application enables the user to choose standard parts according to the indoor environment, which facilitates accurate positioning of the robot.
In some embodiments, in step S130 the positioning information of the robot in the current physical space is determined based on the standard graphic and the standard physical features of the standard part in the following manner: performing deflection-angle correction on the standard graphic based on the standard physical features to obtain the deflection angle of the robot relative to the plane where the standard part is located; and performing distance measurement on the corrected graphic based on the standard physical features to obtain the distances of the robot from the standard part and from the plane where the standard part is located.
Taking a two-hole power socket identified by the positioning processing device as an example: according to the size and angle features of each hole and the actual physical spacing and angle between the two holes in the standard physical features, together with the pixel size and pixel deflection of each hole and the pixel spacing and pixel deflection between the two holes in the standard graphic identified in the image, the positioning processing device can determine the deflection angle between the imaging plane of the camera device and the plane of the two-hole power socket, from which it obtains the deflection angle of the robot relative to the plane of the standard part.
To facilitate calculating the distance of the robot from the standard part, the positioning processing device uses the obtained deflection angle to rectify the standard graphic, obtaining a standard graphic in an imaging plane parallel to the plane of the standard part. Distance measurement is then performed using the ratio between the pixel size of the standard graphic in the image and the actual physical size, thereby obtaining the distances of the robot from the standard part and from the plane where the standard part is located.
After obtaining the deflection angle of the robot relative to the plane of the standard part and the distances of the robot from the standard part and from that plane, the current position and orientation of the robot can be determined, i.e., the positioning information of the robot in the current physical space. The obtained positioning information can be used for robot navigation, map creation, map-data correction, and the like. Especially in indoor localization-and-mapping technology, the positioning information obtained by the positioning method of the present application can correct the map data in time, so that indoor map data can be built and used as accurately as possible even when accurate indoor map data cannot be obtained in advance.
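The step from the measured range and deflection angle to a distance from the wall can be sketched with elementary trigonometry (the function name and the 30° / 2 m figures are illustrative assumptions, not values from the application):

```python
import math

def wall_distance(range_mm, angle_to_normal_rad):
    """Perpendicular distance from the robot to the wall holding the part.

    range_mm:            measured straight-line distance to the standard part
    angle_to_normal_rad: angle between the line of sight and the wall normal
    """
    # Project the line-of-sight range onto the wall's normal direction.
    return range_mm * math.cos(angle_to_normal_rad)

# Viewing the socket 30 degrees off the wall normal from 2 m away:
d_wall = wall_distance(2000.0, math.radians(30.0))
```

Under these assumed numbers the robot is about 1732 mm from the wall itself, which together with the bearing fixes the robot's position and orientation relative to the wall.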
Please refer to FIG. 2, which is a flowchart of the robot positioning method of the present application in another embodiment. As shown, the positioning method comprises step S210, step S220, step S230, and step S240.
In step S210, an image is captured in the navigation operating environment of the robot.
Here, a camera device may be used to capture images in the navigation operating environment of the robot. Step S210 is the same as or similar to step S110 in the foregoing example and is not detailed again here.
In step S220, graphics of physical objects in the image are identified and, when at least one identified graphic is a standard graphic corresponding to a standard part, standard physical features of the standard part are acquired.
Here, a positioning processing device may be used to identify the graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part. Step S220 is the same as or similar to step S120 in the foregoing example and is not detailed again here.
In step S230, positioning information of the robot in the current physical space is determined based on the standard graphic and the standard physical features of the standard part.
Here, the positioning processing device may determine the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part. Step S230 is the same as or similar to step S130 in the foregoing example and is not detailed again here.
In step S240, the map data in the robot is adjusted based on the positioning information of the robot in the current physical space.
Here, a control device of the robot may adjust the map data in the robot based on the positioning information of the robot in the current physical space. The control device may comprise one or more processors (CPUs) or micro control units (MCUs) dedicated to controlling the robot. The processor in the control device and the processor in the positioning processing device may be shared or set independently; for example, the control device may serve as a slave processing device with the processor of the positioning processing device as the master device, the control device performing adjustment based on the positioning of the positioning processing device; or the control device may share a processor with the positioning device.
Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors arranged on the robot, such as speed sensors, odometer sensors, ranging sensors, and cliff sensors, together with the image data provided by the camera device. Since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the positioning method of the present application deviates from the map data, so the map data needs to be adjusted. Here, after determining the positioning information of the robot in the current physical space, the positioning processing device provides it to the control device, which adjusts the map data of the robot accordingly. The map data may include, but is not limited to, position and angle data drawn in a preset grid or coordinate space based on the movement data and image data provided by the movement sensors, and landmark information marked in the preset grid or coordinate space based on the identified features of physical objects. The map data can be used for map building and navigation.
By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjusting the map data in the robot based on the positioning information, the robot positioning method of the present application can compensate positioning errors in the map data according to the positioning information and thereby achieve accurate positioning.
In some embodiments, in step S240 the map data in the robot is adjusted based on the positioning information of the robot in the current physical space by: compensating, based on the positioning information, the positioning error of the corresponding position in the map data; and/or compensating, based on the positioning information, the positioning error in the landmark information related to the standard part in the map data.
In a specific example, when the robot has not identified a standard part, it builds map data using the movement data provided by the movement sensors; when the robot identifies a standard part and determines its positioning information in the current physical space based on the identified standard graphic and the standard physical features of the corresponding standard part, the control device may compare this positioning information with the positioning information obtained from the movement data to check for position and angle deviations. If there is a deviation, the positioning information determined from the standard part may replace that determined from the movement data, and other built map data may be revised accordingly based on the position and angle deviation between the two. For example, please refer to FIG. 3, which is a schematic diagram of using positioning information B, determined by the positioning method of the present application, to compensate the positioning errors of positions A1 and A2 in the map data; positions A1 and A2 are both positioning information determined from movement data. When the positioning processing device identifies a standard part at position A2 and determines the current positioning information B based on it, positioning information B replaces position A2; since B lies one grid cell to the left of A2, the positioning information of A1 is also shifted one grid cell to the left, yielding A1' shown by the dashed line, thereby compensating the positioning error of the movement data at position A1.
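The FIG. 3 correction can be sketched as a simple shift of the dead-reckoned track; the pose list, indices, and one-cell offset below are illustrative assumptions matching the figure's example:

```python
def compensate_track(poses, index, corrected):
    """Replace poses[index] with the visually corrected pose and shift the
    earlier dead-reckoned poses by the same (dx, dy) offset (cf. FIG. 3)."""
    dx = corrected[0] - poses[index][0]
    dy = corrected[1] - poses[index][1]
    # Apply the same offset to the pose being fixed and to all earlier poses.
    fixed = [(x + dx, y + dy) for (x, y) in poses[:index + 1]]
    return fixed + poses[index + 1:]

# A1=(3,0), A2=(5,0); the standard part fixes A2 one cell left: B=(4,0).
track = compensate_track([(3, 0), (5, 0)], 1, (4, 0))
```

Here A2 becomes B=(4, 0) and A1 becomes the dashed A1'=(2, 0), mirroring the one-grid-cell leftward compensation shown in FIG. 3.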
In another specific example, when map data is built using image positioning, the map data contains landmark information, which helps the robot use images for positioning. The landmark information includes, but is not limited to: features of standard parts and non-standard parts (such as edges, corners, outlines, and brightness features), and the positions and angles from which these features can be captured. To improve positioning accuracy, one piece of landmark information usually contains multiple features, for example more than ten; therefore, one piece of landmark information usually contains features of both standard parts and non-standard parts, or entirely non-standard-part features. Adjacent landmark information usually contains both repeated and non-repeated features. Here, a non-standard part refers to an object not pre-defined as a standard part, such as a table, chair, or wall. When positioning information containing standard-part features is obtained by the method of the present application, the positioning error in the landmark information related to the standard part in the map data can be compensated based on the obtained positioning information.
For example, the built map data contains positioning information for positions C1 and C2, where C1 contains landmark information t1 and C2 contains landmark information t2; t1 contains features of standard parts and non-standard parts together with the positions and deflection angles from which these features were captured, and t2 contains non-standard-part features overlapping those of t1 together with the positions and deflection angles from which they were captured. When the positioning processing device identifies a standard part at position C1 and determines the current positioning information D based on it, positioning information D replaces the positioning information at position C1 to compensate the positioning error in the original map data, and the positioning errors in the positions and deflection angles in landmark information t1 are adjusted based on positioning information D. By means of the non-standard-part features shared between t1 and t2, this step can also adjust the positioning errors in position C2 and landmark information t2.
It should be noted that the above ways of compensating positioning errors in the map data are merely examples. Since the errors of the movement data and of the image data are accumulated, their positioning errors are not necessarily identical; therefore, positioning compensation may also use weighted compensation to compensate the positioning errors in the various map data.
In addition, using the positioning information obtained by the positioning method of the present application and the positioning-related information that the map data can provide, such as positioning information and landmark information, this step can adjust the positioning information at one or more positions in the map data, as well as one or more kinds of positioning errors contained at the same position.
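The weighted compensation mentioned above could, as one assumed design (the weighting scheme here is an illustration, not a scheme specified by the application), scale the correction applied to each pose by how much drift that pose is believed to have accumulated:

```python
def weighted_compensate(poses, correction, weights):
    """Apply a fraction of the (dx, dy) correction to each pose.

    weights: per-pose factors in [0, 1]; poses assumed to carry more
    accumulated drift (e.g. travelled further since the last visual fix)
    receive larger fractions of the correction.
    """
    dx, dy = correction
    return [(x + w * dx, y + w * dy) for (x, y), w in zip(poses, weights)]

# The newest pose takes the full correction, older ones progressively less.
out = weighted_compensate([(0, 0), (2, 0), (4, 0)], (-1.0, 0.0),
                          [0.0, 0.5, 1.0])
```

This avoids assuming that the error at every earlier position equals the error observed at the current one.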
Please refer to FIG. 4, which is a flowchart of the robot positioning method of the present application in yet another embodiment. As shown, the positioning method comprises step S410, step S420, step S430, and step S440.
In step S410, an image is captured in the navigation operating environment of the robot.
Here, a camera device may be used to capture images in the navigation operating environment of the robot. Step S410 is the same as or similar to step S110 in the foregoing example and is not detailed again here.
In step S420, graphics of physical objects in the image are identified and, when at least one identified graphic is a standard graphic corresponding to a standard part, standard physical features of the standard part are acquired.
Here, a positioning processing device may be used to identify the graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part. Step S420 is the same as or similar to step S120 in the foregoing example and is not detailed again here.
In step S430, positioning information of the robot in the current physical space is determined based on the standard graphic and the standard physical features of the standard part.
Here, the positioning processing device may determine the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part. Step S430 is the same as or similar to step S130 in the foregoing example and is not detailed again here.
In step S440, the navigation route of the robot is adjusted based on the positioning information of the robot in the current physical space.
Here, a control device of the robot may adjust the navigation route of the robot based on the positioning information of the robot in the current physical space. The control device may comprise one or more processors (CPUs) or micro control units (MCUs) dedicated to controlling the robot. The processor in the control device and the processor in the positioning processing device may be shared or set independently; for example, the control device may serve as a slave processing device with the processor of the positioning processing device as the master device, the control device performing adjustment based on the positioning of the positioning processing device; or the control device may share a processor with the positioning device.
Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors arranged on the robot, such as speed sensors and odometer sensors, together with the image data provided by the camera device. Since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the positioning method of the present application deviates from the map data. Therefore, in the case of navigation based on the built map, after determining the positioning information of the robot in the current physical space, the positioning processing device provides it to the control device, which adjusts the navigation route of the robot based on this positioning information. In addition, the control device may adjust the distance the robot continues to move, or the direction of its movement, according to the adjusted navigation route, so that the robot moves along the adjusted route.
By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjusting the robot's navigation route based on the positioning information, the robot positioning method of the present application can correct the navigation route under accurate positioning.
In some embodiments, in step S440 the navigation route of the robot is adjusted based on its positioning information in the current physical space by: re-determining the position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
Taking a cleaning robot as an example, when the robot navigates according to a built map and moves to a certain position, its position and orientation on the built map are known; at this point, the positioning processing device has determined, by the positioning method of the present application, the robot's positioning information in the current physical space, i.e., its actual current position and orientation. Since the robot's actual current position and orientation deviate from its position and orientation on the built map, the control device corrects the latter based on the former, i.e., re-determines the robot's position and orientation in the built map, and then, according to the re-determined position and orientation together with the built map, re-determines the robot's distance from obstacles, whether the robot has deviated from course, and, if the robot continues along the route in the built map, the distance it needs to move and the direction in which it needs to turn, thereby adjusting the navigation route.
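The re-planning step above can be sketched under the assumption that the route is a list of waypoints in map coordinates (the function name and the coordinates are illustrative, not from the application): from the corrected pose, the control device recomputes the distance and heading to the next unvisited waypoint.

```python
import math

def heading_to_next(pose_xy, waypoint_xy):
    """Distance and heading the robot must take from its corrected
    position to rejoin the planned route at the next waypoint."""
    dx = waypoint_xy[0] - pose_xy[0]
    dy = waypoint_xy[1] - pose_xy[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

# After relocalization the robot is at (1, 1); the next waypoint is (4, 5):
dist, heading = heading_to_next((1.0, 1.0), (4.0, 5.0))
```

With these assumed coordinates the robot must travel 5 units at a heading of atan2(4, 3) radians; comparing this heading with the robot's corrected orientation tells the control device how far to turn.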
The present application further provides a positioning system for a robot, which can be configured in a cleaning robot. Please refer to FIG. 5, which is a schematic structural diagram of the robot positioning system of the present application in one embodiment. As shown, the positioning system comprises a camera device 11, a storage device 12, and a positioning processing device 13.
The camera device 11 is used to capture images in the navigation operating environment of the robot. The camera device includes, but is not limited to: a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like. The power supply of the camera device may be controlled by the power supply system of the robot, and the camera device starts capturing images while the robot is powered on and moving. In addition, the camera device may be arranged on the main body of the robot. Taking a cleaning robot as an example, the camera device may be arranged in the middle or at the edge of the top cover of the cleaning robot, or below the plane of the top surface, on a recessed structure near the geometric center of the main body or near the edge of the main body. The optical axis of the camera device may form an angle of ±30° with the vertical, or an angle of 0-180° with the horizontal. The navigation operating environment refers to the environment in which the robot moves and performs corresponding operations according to a navigation route designed from built map data, or according to a randomly designed navigation route. For a cleaning robot, the navigation operating environment is the environment in which it moves along a navigation route and performs cleaning operations.
The storage device 12 pre-stores a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part. The localization-and-mapping application, i.e., the SLAM application, is a fundamental application in the field of intelligent robots. The problem can be described as: when a robot is in an unknown environment, is there a way for the robot to gradually draw a complete map of the environment while simultaneously deciding in which direction it should travel. In other words, achieving intelligence requires accomplishing three tasks: the first is localization, the second is mapping, and the third is the subsequent path planning (navigation). The behavior-control application in the present application refers to controlling the robot to move, adjust its attitude, and so on according to set information or instructions.
In addition, the storage device also pre-stores the standard physical features of each standard part. The standard part may include a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard. For example, industry standards include the machinery industry standard JB and the building-materials industry standard JC; national standards include the Chinese GB standard, the German DIN standard, and the British BS standard; international standards include ISO standards; custom standards are detailed later. The standard physical features may include outline dimensions, standard structural relationships, and the like; for example, the actual physical length, width, and height of the standard part and other actual physical dimensional data corresponding to the standard, such as the spacing between the two holes of a power socket, or the length and width of a power socket.
The storage device 12 includes, but is not limited to, high-speed random-access memory and non-volatile memory, for example one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. In some embodiments, the storage device 12 may also include memory remote from the one or more processors, such as network-attached storage accessed via an RF circuit or an external port and a communication network (not shown), where the communication network may be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or the like, or a suitable combination thereof. A memory controller may control access to the storage device by other components of the robot, such as the CPU and peripheral interfaces.
The positioning processing device 13 is connected to the camera device 11 and the storage device 12. The positioning processing device 13 may comprise one or more processors, operatively coupled to the volatile and/or non-volatile memory in the storage device 12. It may execute instructions stored in the memory and/or non-volatile storage to perform operations in the robot, such as identifying graphics of physical objects in images and performing positioning on the map based on the identified standard graphics and the standard physical features of standard parts. The processor may thus include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof. The positioning processing device is also operatively coupled to an I/O port, which enables the robot to interact with various other electronic devices, and to an input structure, which enables a user to interact with the computing device; the input structure may therefore include buttons, a keyboard, a mouse, a touchpad, and the like. The other electronic devices may be a moving motor in the moving device of the robot, or a slave processor in the robot dedicated to controlling the moving device and the cleaning device, such as an MCU (Microcontroller Unit).
In one example, the positioning processing device 13 is connected to the storage device 12 and the camera device 11 through data cables. The positioning processing device 13 interacts with the storage device 12 through data read/write technology and with the camera device 11 through an interface protocol. The data read/write technology includes, but is not limited to, high-speed/low-speed data interface protocols, database read/write operations, and the like; the interface protocol includes, but is not limited to, the HDMI interface protocol, serial interface protocols, and the like.
The positioning processing device 13 is configured to invoke the localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
First, the positioning processing device 13 may process, analyze, and understand the captured image using image-recognition methods such as neural-network-based or wavelet-moment-based image recognition, so as to identify targets and objects of various patterns. In addition, the positioning processing device may search for similar image targets by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like.
In one embodiment, taking a cleaning robot as an example, since a cleaning robot usually performs indoor cleaning work, the physical objects in the images captured by the camera device generally include, for example, walls, tables, sofas, wardrobes, television sets, power sockets, network-cable sockets, and the like. In this example, the camera device first captures an image in the navigation operating environment of the robot and provides it to the positioning processing device, which identifies the graphics of the physical objects in the captured image through image recognition. The graphic of a physical object may be characterized by features such as its gray levels and outline. Meanwhile, the graphic of a physical object is not limited to its external geometric figure, and may also include other graphics presented on the object, such as the two-hole or five-hole sockets on a power socket, or the square jack on a network-cable socket. In view of this, a power socket and a network-cable socket with similar external geometry can, for example, be distinguished by the five-hole sockets of the power socket versus the square jack of the network socket. Moreover, where the physical objects in the indoor images captured by the cleaning robot's camera device include power sockets and network-cable sockets, since these sockets are designed according to the GB standard and therefore do not vary with their environment, they can serve as standard parts. In this example, a power socket is selected as the standard part; its standard physical features may include the length, width, and height of the power socket and the structural relationship of its five-hole socket. In some embodiments, the graphic of the standard part and its standard physical features may be preset and pre-stored in the storage device of the robot; accordingly, acquiring the standard physical features of the standard part includes reading the preset standard physical features from the storage device of the robot.
In addition, for the identified graphics of the physical objects in the image and the stored graphic of the standard part, i.e. the power socket, the positioning processing device 13 determines whether at least one identified graphic corresponds to the stored graphic of the power socket by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like; when at least one identified graphic corresponds to the stored graphic of the power socket, the standard physical features of the power socket are acquired. The at least one graphic corresponding to the stored graphic of the power socket is called the standard graphic; in this example, the standard graphic is the captured graphic of the power socket.
Then, the positioning processing device 13 uses a preset correspondence between unit pixel intervals and unit lengths in the actual physical space, together with the size of the identified standard graphic and the physical size in the corresponding standard physical features, to calculate the distance and deflection angle of the robot from the standard part in the current physical space, thereby obtaining positioning information of the robot relative to the standard part. Taking a socket mounted on a wall as an example, when the positioning processing device identifies the socket and the boundary line between the wall and the floor, or identifies the socket and assumes by default that it is mounted on a wall, it can, according to the above correspondence, obtain not only the distance and deflection of the robot from the socket, but also, using the spatial relationship among the wall, the robot, and the socket, the straight-line distance between the robot and the wall, thereby obtaining positioning information of the robot relative to the wall.
By using the positioning processing device 13 to identify the graphics of physical objects in the images captured by the camera device 11, match them against the graphics of standard parts stored in the storage device 12, and determine the robot's positioning information in the current physical space based on the standard physical features of the standard part, the robot positioning system of the present application solves the prior-art problem that robot positioning based on sensor-provided data is inaccurate.
Please refer to FIG. 6, which is a schematic structural diagram of the robot positioning system of the present application in another embodiment. As shown, the positioning system further comprises a network access device 14, connected to the positioning processing device 13 and used to acquire the corresponding standard physical features from a remote server.
In one embodiment, the positioning processing device provides a retrieval request, i.e. the standard graphic, to the remote server over the network; the remote server performs retrieval using a CBIR (Content-Based Image Retrieval) method according to the request and determines a retrieval result, i.e. a retrieved standard-part graphic; the remote server then outputs the standard physical features of the standard part based on the obtained standard-part graphic and provides them to the positioning processing device over the network for subsequent processing.
In some embodiments, in addition to standard parts designed based on at least one of industry, national, and international standards, the standard part may also be designed based on a custom standard. A custom-standard part may be one defined by the robot manufacturer, for example a part designed and manufactured by the manufacturer to be placed in the robot's working environment and used together with the robot. A custom-standard part may also be a part whose standard physical features are generated from physical parameters input by the user. In one embodiment, taking a cleaning robot as an example, since the physical objects in the images captured while the robot works in, say, a living room generally include household appliances such as a television set, the user may define the television set as a standard part. Specifically, the user may obtain the physical parameters of the television set by reading its manual or looking up product information, and input these physical parameters through an input device, such as a robot application (APP), to generate a standard part with standard physical features. The user may also select other physical objects as standard parts according to the indoor environment. By allowing user-defined standard parts, the positioning system of the present application enables the user to choose standard parts according to the indoor environment, which facilitates accurate positioning of the robot.
In some embodiments, the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part in the following manner: performing deflection-angle correction on the standard graphic based on the standard physical features to obtain the deflection angle of the robot relative to the plane where the standard part is located; and performing distance measurement on the corrected graphic based on the standard physical features to obtain the distances of the robot from the standard part and from the plane where the standard part is located.
Taking a two-hole power socket identified by the positioning processing device as an example: according to the size and angle features of each hole and the actual physical spacing and angle between the two holes in the standard physical features, together with the pixel size and pixel deflection of each hole and the pixel spacing and pixel deflection between the two holes in the standard graphic identified in the image, the positioning processing device can determine the deflection angle between the imaging plane of the camera device and the plane of the two-hole power socket, from which it obtains the deflection angle of the robot relative to the plane of the standard part.
To facilitate calculating the distance of the robot from the standard part, the positioning processing device uses the obtained deflection angle to rectify the standard graphic, obtaining a standard graphic in an imaging plane parallel to the plane of the standard part. Distance measurement is then performed using the ratio between the pixel size of the standard graphic in the image and the actual physical size, thereby obtaining the distances of the robot from the standard part and from the plane where the standard part is located.
After obtaining the deflection angle of the robot relative to the plane of the standard part and the distances of the robot from the standard part and from that plane, the current position and orientation of the robot can be determined, i.e., the positioning information of the robot in the current physical space. The obtained positioning information can be used for robot navigation, map creation, map-data correction, and the like. Especially in indoor localization-and-mapping technology, the positioning system of the present application can correct the map data in time according to the obtained positioning information, so that indoor map data can be built and used as accurately as possible even when accurate indoor map data cannot be obtained in advance.
In some embodiments, the positioning processing device is further configured to perform the step of adjusting the map data in the robot based on the positioning information of the robot in the current physical space.
Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors arranged on the robot, such as speed sensors, odometer sensors, ranging sensors, and cliff sensors, together with the image data provided by the camera device. Since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the positioning system of the present application deviates from the map data, so the map data needs to be adjusted. Here, after determining the positioning information of the robot in the current physical space, the positioning processing device adjusts the map data in the robot based on this positioning information. The map data may include, but is not limited to, position and angle data drawn in a preset grid or coordinate space based on the movement data and image data provided by the movement sensors, and landmark information marked in the preset grid or coordinate space based on the identified features of physical objects. The map data can be used for map building and navigation.
By using the positioning processing device to identify the graphics of physical objects in the images captured by the camera device, match them against the graphics of standard parts stored in the storage device, determine the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjust the map data in the robot based on the positioning information, the robot positioning system of the present application can compensate positioning errors in the map data according to the positioning information and thereby achieve accurate positioning.
In some embodiments, the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space by: compensating, based on the positioning information, the positioning error of the corresponding position in the map data; and/or compensating, based on the positioning information, the positioning error in the landmark information related to the standard part in the map data.
In a specific example, when the robot has not identified a standard part, it builds map data using the movement data provided by the movement sensors; when the robot identifies a standard part and determines its positioning information in the current physical space based on the identified standard graphic and the standard physical features of the corresponding standard part, the positioning processing device may compare this positioning information with the positioning information obtained from the movement data to check for position and angle deviations. If there is a deviation, the positioning information determined from the standard part may replace that determined from the movement data, and other built map data may be revised accordingly based on the position and angle deviation between the two. For example, please refer to FIG. 3, which is a schematic diagram of using positioning information B, determined by the positioning method of the present application, to compensate the positioning errors of positions A1 and A2 in the map data; positions A1 and A2 are both positioning information determined from movement data. When the positioning processing device identifies a standard part at position A2 and determines the current positioning information B based on it, positioning information B replaces position A2; since B lies one grid cell to the left of A2, the positioning information of A1 is also shifted one grid cell to the left, yielding A1' shown by the dashed line, thereby compensating the positioning error of the movement data at position A1.
In another specific example, when map data is built using image positioning, the map data contains landmark information, which helps the robot use images for positioning. The landmark information includes, but is not limited to: features of standard parts and non-standard parts (such as edges, corners, outlines, and brightness features), and the positions and angles from which these features can be captured. To improve positioning accuracy, one piece of landmark information usually contains multiple features, for example more than ten; therefore, one piece of landmark information usually contains features of both standard parts and non-standard parts, or entirely non-standard-part features. Adjacent landmark information usually contains both repeated and non-repeated features. Here, a non-standard part refers to an object not pre-defined as a standard part, such as a table, chair, or wall. When the positioning system of the present application obtains positioning information containing standard-part features, the positioning error in the landmark information related to the standard part in the map data can be compensated based on the obtained positioning information.
For example, the built map data contains positioning information for positions C1 and C2, where C1 contains landmark information t1 and C2 contains landmark information t2; t1 contains features of standard parts and non-standard parts together with the positions and deflection angles from which these features were captured, and t2 contains non-standard-part features overlapping those of t1 together with the positions and deflection angles from which they were captured. When the positioning processing device identifies a standard part at position C1 and determines the current positioning information D based on it, positioning information D replaces the positioning information at position C1 to compensate the positioning error in the original map data, and the positioning errors in the positions and deflection angles in landmark information t1 are adjusted based on positioning information D. By means of the non-standard-part features shared between t1 and t2, this step can also adjust the positioning errors in position C2 and landmark information t2.
It should be noted that the above ways of compensating positioning errors in the map data are merely examples. Since the errors of the movement data and of the image data are accumulated, their positioning errors are not necessarily identical; therefore, positioning compensation may also use weighted compensation to compensate the positioning errors in the various map data.
In addition, using the positioning information obtained by the positioning system of the present application and the positioning-related information that the map data can provide, such as positioning information and landmark information, this step can adjust the positioning information at one or more positions in the map data, as well as one or more kinds of positioning errors contained at the same position.
In some embodiments, the positioning processing device is further configured to adjust the navigation route of the robot based on the positioning information of the robot in the current physical space.
Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors such as speed sensors and odometer sensors, together with the image data provided by the camera device. Since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the positioning system of the present application deviates from the map data. Therefore, in the case of navigation based on the built map, after determining the positioning information of the robot in the current physical space, the positioning processing device adjusts the navigation route of the robot based on this positioning information. In addition, the positioning processing device may adjust the distance the robot continues to move, or the direction of its movement, according to the adjusted navigation route, so that the robot moves along the adjusted route.
By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjusting the robot's navigation route based on the positioning information, the robot positioning system of the present application can correct the navigation route under accurate positioning.
In some embodiments, the positioning processing device adjusts the navigation route of the robot based on its positioning information in the current physical space by: re-determining the position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
In one embodiment, taking a cleaning robot as an example, when the robot navigates according to a built map and moves to a certain position, its position and orientation on the built map are known; at this point, the positioning processing device of the positioning system of the present application has determined the robot's positioning information in the current physical space, i.e., its actual current position and orientation. Since the robot's actual current position and orientation deviate from its position and orientation on the built map, the positioning processing device corrects the latter based on the former, i.e., re-determines the robot's position and orientation in the built map, and then, according to the re-determined position and orientation together with the built map, re-determines the robot's distance from obstacles, whether the robot has deviated from course, and, if the robot continues along the route in the built map, the distance it needs to move and the direction in which it needs to turn, thereby adjusting the navigation route.
The present application further provides a robot. Please refer to FIG. 7, which is a schematic structural diagram of the robot of the present application in one embodiment. As shown, the robot comprises a driving device 215, a camera device 211, a storage device 212, and a positioning processing device 213.
The driving device 215 is used to drive the robot to perform displacement and/or attitude adjustment. Taking a cleaning robot as an example, the driving device may be a driving motor that drives the rollers of the cleaning robot to move and turn.
The camera device 211 is used to capture images in the navigation operating environment of the robot. The camera device includes, but is not limited to: a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like. The power supply of the camera device may be controlled by the power supply system of the robot, and the camera device starts capturing images while the robot is powered on and moving. In addition, the camera device may be arranged on the main body of the robot. Taking a cleaning robot as an example, the camera device may be arranged in the middle or at the edge of the top cover of the cleaning robot, or below the plane of the top surface, on a recessed structure near the geometric center of the main body or near the edge of the main body. The optical axis of the camera device may form an angle of ±30° with the vertical, or an angle of 0-180° with the horizontal. The navigation operating environment refers to the environment in which the robot moves and performs corresponding operations according to a navigation route designed from built map data, or according to a randomly designed navigation route. For a cleaning robot, the navigation operating environment is the environment in which it moves along a navigation route and performs cleaning operations.
The storage device 212 pre-stores a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part. The localization-and-mapping application, i.e., the SLAM application, is a fundamental application in the field of intelligent robots. The problem can be described as: when a robot is in an unknown environment, is there a way for the robot to gradually draw a complete map of the environment while simultaneously deciding in which direction it should travel. In other words, achieving intelligence requires accomplishing three tasks: the first is localization, the second is mapping, and the third is the subsequent path planning (navigation). The behavior-control application in the present application refers to controlling the robot to move, adjust its attitude, and so on according to set information or instructions.
In addition, the storage device also pre-stores the standard physical features of each standard part. The standard part may include a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard. For example, industry standards include the machinery industry standard JB and the building-materials industry standard JC; national standards include the Chinese GB standard, the German DIN standard, and the British BS standard; international standards include ISO standards; custom standards are detailed later. The standard physical features may include outline dimensions, standard structural relationships, and the like; for example, the actual physical length, width, and height of the standard part and other actual physical dimensional data corresponding to the standard, such as the spacing between the two holes of a power socket, or the length and width of a power socket.
The storage device 212 includes, but is not limited to, high-speed random-access memory and non-volatile memory, for example one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. In some embodiments, the storage device 212 may also include memory remote from the one or more processors, such as network-attached storage accessed via an RF circuit or an external port and a communication network (not shown), where the communication network may be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or the like, or a suitable combination thereof. A memory controller may control access to the storage device by other components of the robot, such as the CPU and peripheral interfaces.
The positioning processing device 213 is connected to the driving device 215, the camera device 211, and the storage device 212. The positioning processing device 213 may comprise one or more processors, operatively coupled to the volatile and/or non-volatile memory in the storage device 212. It may execute instructions stored in the memory and/or non-volatile storage to perform operations in the robot, such as identifying graphics of physical objects in images and performing positioning on the map based on the identified standard graphics and the standard physical features of standard parts. The processor may thus include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof. The positioning processing device is also operatively coupled to an I/O port, which enables the robot to interact with various other electronic devices, and to an input structure, which enables a user to interact with the computing device; the input structure may therefore include buttons, a keyboard, a mouse, a touchpad, and the like. The other electronic devices may be a moving motor in the moving device of the robot, or a slave processor in the robot dedicated to controlling the moving device and the cleaning device, such as an MCU (Microcontroller Unit).
In one example, the positioning processing device 213 is connected to the storage device 212, the camera device 211, and the driving device 215 through data cables. The positioning processing device 213 interacts with the storage device 212 through data read/write technology and with the camera device 211 and the driving device 215 through interface protocols. The data read/write technology includes, but is not limited to, high-speed/low-speed data interface protocols, database read/write operations, and the like; the interface protocol includes, but is not limited to, the HDMI interface protocol, serial interface protocols, and the like.
The positioning processing device 213 is configured to invoke the pre-stored localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
First, the positioning processing device 213 may process, analyze, and understand the captured image using image-recognition methods such as neural-network-based or wavelet-moment-based image recognition, so as to identify targets and objects of various patterns. In addition, the positioning processing device may search for similar image targets by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like.
In one embodiment, taking a cleaning robot as an example, since a cleaning robot usually performs indoor cleaning work, the physical objects in the images captured by the camera device generally include, for example, walls, tables, sofas, wardrobes, television sets, power sockets, network-cable sockets, and the like. In this example, the camera device first captures an image in the navigation operating environment of the robot and provides it to the positioning processing device, which identifies the graphics of the physical objects in the captured image through image recognition. The graphic of a physical object may be characterized by features such as its gray levels and outline. Meanwhile, the graphic of a physical object is not limited to its external geometric figure, and may also include other graphics presented on the object, such as the two-hole or five-hole sockets on a power socket, or the square jack on a network-cable socket. In view of this, a power socket and a network-cable socket with similar external geometry can, for example, be distinguished by the five-hole sockets of the power socket versus the square jack of the network socket. Moreover, where the physical objects in the indoor images captured by the cleaning robot's camera device include power sockets and network-cable sockets, since these sockets are designed according to the GB standard and therefore do not vary with their environment, they can serve as standard parts. In this example, a power socket is selected as the standard part; its standard physical features may include the length, width, and height of the power socket and the structural relationship of its five-hole socket. In some embodiments, the graphic of the standard part and its standard physical features may be preset and pre-stored in the storage device of the robot; accordingly, acquiring the standard physical features of the standard part includes reading the preset standard physical features from the storage device of the robot.
In addition, for the identified graphics of the physical objects in the image and the stored graphic of the standard part, i.e. the power socket, the positioning processing device 213 determines whether at least one identified graphic corresponds to the stored graphic of the power socket by analyzing the correspondence, similarity, and consistency of image content, features, structures, relationships, textures, gray levels, and the like; when at least one identified graphic corresponds to the stored graphic of the power socket, the standard physical features of the power socket are acquired. The at least one graphic corresponding to the stored graphic of the power socket is called the standard graphic; in this example, the standard graphic is the captured graphic of the power socket.
Then, the positioning processing device 213 uses a preset correspondence between unit pixel intervals and unit lengths in the actual physical space, together with the size of the identified standard graphic and the physical size in the corresponding standard physical features, to calculate the distance and deflection angle of the robot from the standard part in the current physical space, thereby obtaining positioning information of the robot relative to the standard part. Taking a socket mounted on a wall as an example, when the positioning processing device identifies the socket and the boundary line between the wall and the floor, or identifies the socket and assumes by default that it is mounted on a wall, it can, according to the above correspondence, obtain not only the distance and deflection of the robot from the socket, but also, using the spatial relationship among the wall, the robot, and the socket, the straight-line distance between the robot and the wall, thereby obtaining positioning information of the robot relative to the wall.
By using the positioning processing device 213 to identify the graphics of physical objects in the images captured by the camera device 211, match them against the graphics of standard parts stored in the storage device 212, and determine the robot's positioning information in the current physical space based on the standard physical features of the standard part, the robot of the present application solves the prior-art problem that robot positioning based on sensor-provided data is inaccurate.
Please refer to FIG. 8, which is a schematic structural diagram of the robot of the present application in another embodiment. As shown, the robot further comprises a network access device 214, connected to the positioning processing device 213 and used to acquire the corresponding standard physical features from a remote server.
In one embodiment, the positioning processing device provides a retrieval request, i.e. the standard graphic, to the remote server over the network; the remote server performs retrieval using a CBIR (Content-Based Image Retrieval) method according to the request and determines a retrieval result, i.e. a retrieved standard-part graphic; the remote server then outputs the standard physical features of the standard part based on the obtained standard-part graphic and provides them to the positioning processing device over the network for subsequent processing.
In some embodiments, in addition to standard parts designed based on at least one of industry, national, and international standards, the standard part may also be designed based on a custom standard. A custom-standard part may be one defined by the robot manufacturer, for example a part designed and manufactured by the manufacturer to be placed in the robot's working environment and used together with the robot. A custom-standard part may also be a part whose standard physical features are generated from physical parameters input by the user. In one embodiment, taking a cleaning robot as an example, since the physical objects in the images captured while the robot works in, say, a living room generally include household appliances such as a television set, the user may define the television set as a standard part. Specifically, the user may obtain the physical parameters of the television set by reading its manual or looking up product information, and input these physical parameters through an input device, such as a robot application (APP), to generate a standard part with standard physical features. The user may also select other physical objects as standard parts according to the indoor environment. By allowing user-defined standard parts, the robot of the present application enables the user to choose standard parts according to the indoor environment, which facilitates accurate positioning of the robot.
In some embodiments, the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part in the following manner: performing deflection-angle correction on the standard graphic based on the standard physical features to obtain the deflection angle of the robot relative to the plane where the standard part is located; and performing distance measurement on the corrected graphic based on the standard physical features to obtain the distances of the robot from the standard part and from the plane where the standard part is located.
Taking a two-hole power socket identified by the positioning processing device as an example: according to the size and angle features of each hole and the actual physical spacing and angle between the two holes in the standard physical features, together with the pixel size and pixel deflection of each hole and the pixel spacing and pixel deflection between the two holes in the standard graphic identified in the image, the positioning processing device can determine the deflection angle between the imaging plane of the camera device and the plane of the two-hole power socket, from which it obtains the deflection angle of the robot relative to the plane of the standard part.
To facilitate calculating the distance of the robot from the standard part, the positioning processing device uses the obtained deflection angle to rectify the standard graphic, obtaining a standard graphic in an imaging plane parallel to the plane of the standard part. Distance measurement is then performed using the ratio between the pixel size of the standard graphic in the image and the actual physical size, thereby obtaining the distances of the robot from the standard part and from the plane where the standard part is located.
After obtaining the deflection angle of the robot relative to the plane of the standard part and the distances of the robot from the standard part and from that plane, the current position and orientation of the robot can be determined, i.e., the positioning information of the robot in the current physical space. The obtained positioning information can be used for robot navigation, map creation, map-data correction, and the like. Especially in indoor localization-and-mapping technology, the robot of the present application can correct the map data in time according to the obtained positioning information, so that indoor map data can be built and used as accurately as possible even when accurate indoor map data cannot be obtained in advance.
In some embodiments, the positioning processing device invokes the localization-and-mapping application to further perform the step of adjusting the map data in the robot based on the positioning information of the robot in the current physical space.
Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors arranged on the robot, such as speed sensors, odometer sensors, ranging sensors, and cliff sensors, together with the image data provided by the camera device. Since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the robot of the present application deviates from the map data, so the map data needs to be adjusted. Here, after determining the positioning information of the robot in the current physical space, the positioning processing device adjusts the map data in the robot based on this positioning information. The map data may include, but is not limited to, position and angle data drawn in a preset grid or coordinate space based on the movement data and image data provided by the movement sensors, and landmark information marked in the preset grid or coordinate space based on the identified features of physical objects. The map data can be used for map building and navigation.
By using the positioning processing device to identify the graphics of physical objects in the images captured by the camera device, match them against the graphics of standard parts stored in the storage device, determine the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjust the map data in the robot based on the positioning information, the robot of the present application can compensate positioning errors in the map data according to the positioning information and thereby achieve accurate positioning.
In some embodiments, the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space by: compensating, based on the positioning information, the positioning error of the corresponding position in the map data; and/or compensating, based on the positioning information, the positioning error in the landmark information related to the standard part in the map data.
In a specific example, when the robot has not identified a standard part, it builds map data using the movement data provided by the movement sensors; when the robot identifies a standard part and determines its positioning information in the current physical space based on the identified standard graphic and the standard physical features of the corresponding standard part, the positioning processing device may compare this positioning information with the positioning information obtained from the movement data to check for position and angle deviations. If there is a deviation, the positioning information determined from the standard part may replace that determined from the movement data, and other built map data may be revised accordingly based on the position and angle deviation between the two. For example, please refer to FIG. 3, which is a schematic diagram of using positioning information B, determined by the positioning method of the present application, to compensate the positioning errors of positions A1 and A2 in the map data; positions A1 and A2 are both positioning information determined from movement data. When the positioning processing device identifies a standard part at position A2 and determines the current positioning information B based on it, positioning information B replaces position A2; since B lies one grid cell to the left of A2, the positioning information of A1 is also shifted one grid cell to the left, yielding A1' shown by the dashed line, thereby compensating the positioning error of the movement data at position A1.
In another specific example, when map data is built using image positioning, the map data contains landmark information, which helps the robot use images for positioning. The landmark information includes, but is not limited to: features of standard parts and non-standard parts (such as edges, corners, outlines, and brightness features), and the positions and angles from which these features can be captured. To improve positioning accuracy, one piece of landmark information usually contains multiple features, for example more than ten; therefore, one piece of landmark information usually contains features of both standard parts and non-standard parts, or entirely non-standard-part features. Adjacent landmark information usually contains both repeated and non-repeated features. Here, a non-standard part refers to an object not pre-defined as a standard part, such as a table, chair, or wall. When the robot of the present application obtains positioning information containing standard-part features, the positioning error in the landmark information related to the standard part in the map data can be compensated based on the obtained positioning information.
For example, the built map data contains positioning information for positions C1 and C2, where C1 contains landmark information t1 and C2 contains landmark information t2; t1 contains features of standard parts and non-standard parts together with the positions and deflection angles from which these features were captured, and t2 contains non-standard-part features overlapping those of t1 together with the positions and deflection angles from which they were captured. When the positioning processing device identifies a standard part at position C1 and determines the current positioning information D based on it, positioning information D replaces the positioning information at position C1 to compensate the positioning error in the original map data, and the positioning errors in the positions and deflection angles in landmark information t1 are adjusted based on positioning information D. By means of the non-standard-part features shared between t1 and t2, this step can also adjust the positioning errors in position C2 and landmark information t2.
It should be noted that the above ways of compensating positioning errors in the map data are merely examples. Since the errors of the movement data and of the image data are accumulated, their positioning errors are not necessarily identical; therefore, positioning compensation may also use weighted compensation to compensate the positioning errors in the various map data.
In addition, using the positioning information obtained by the robot of the present application and the positioning-related information that the map data can provide, such as positioning information and landmark information, this step can adjust the positioning information at one or more positions in the map data, as well as one or more kinds of positioning errors contained at the same position.
In some embodiments, the positioning processing device is further configured to invoke the behavior-control application to perform the step of controlling the driving device to adjust the displacement and/or attitude of the robot according to the positioning information. The positioning processing device may drive the robot to adjust only its displacement, only its attitude, or both, depending on the relationship between the robot's positioning information in the current physical space and its positioning information in the built map. For example, the positioning processing device controls the rotation direction and speed of the driving motors in the driving device according to the positioning information. As another example, the robot contains two sets of driving motors, one for each set of rollers; according to the positioning information, the positioning processing device controls each driving motor to drive its rollers in the same rotation direction but at different speeds, so that the robot changes its heading.
In some embodiments, the positioning processing device drives the robot to perform displacement and/or attitude adjustment according to the positioning information by: adjusting the navigation route of the robot based on its positioning information in the current physical space; and controlling the driving device to perform displacement and/or attitude adjustment according to the navigation route.
In one embodiment, the positioning processing device first adjusts the navigation route of the robot based on its positioning information in the current physical space. Taking a cleaning robot as an example, its map data is generally built from movement data provided by multiple movement sensors such as speed sensors and odometer sensors, together with the image data provided by the camera device; since errors accumulate while building the map with movement sensors and the camera device, when the robot moves to a certain position, the positioning information determined by the robot of the present application deviates from the map data. Therefore, in the case of navigation based on the built map, after determining the positioning information of the robot in the current physical space, the positioning processing device adjusts the navigation route of the robot based on this positioning information.
Then, the positioning processing device controls the driving device to perform displacement and/or attitude adjustment according to the navigation route; that is, it adjusts the distance the robot continues to move, or the direction of its movement, so as to control the driving device to move along the adjusted navigation route.
By identifying the graphics of physical objects in the captured image, matching them against the graphic of a standard part, determining the robot's positioning information in the current physical space based on the standard physical features of the standard part, and adjusting the robot's displacement and/or attitude based on the positioning information, the robot of the present application can correct the navigation route under accurate positioning and adjust its displacement and attitude so that it moves along the new navigation route.
In some embodiments, the positioning processing device adjusts the navigation route of the robot based on its positioning information in the current physical space by: re-determining the position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
Taking a cleaning robot as an example, when the robot navigates according to a built map and moves to a certain position, its position and orientation on the built map are known; at this point, the positioning processing device of the robot of the present application has determined the robot's positioning information in the current physical space, i.e., its actual current position and orientation. Since the robot's actual current position and orientation deviate from its position and orientation on the built map, the positioning processing device corrects the latter based on the former, i.e., re-determines the robot's position and orientation in the built map, and then, according to the re-determined position and orientation together with the built map, re-determines the robot's distance from obstacles, whether the robot has deviated from course, and, if the robot continues along the route in the built map, the distance it needs to move and the direction in which it needs to turn, thereby adjusting the navigation route.
It should also be noted that, from the description of the above embodiments, those skilled in the art can clearly understand that part or all of the present application can be implemented by software combined with a necessary general-purpose hardware platform. Based on this understanding, the present application further provides a storage medium for an electronic device, storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement any of the foregoing positioning methods.
Based on this understanding, the portions of the technical solution of the present application that contribute in essence to the prior art may be embodied in the form of a software product. The computer software product may include one or more machine-readable media on which machine-executable instructions are stored; when executed by one or more machines such as a computer, a computer network, or another electronic device, these instructions cause the one or more machines to perform operations in accordance with embodiments of the present application, for example the steps of the robot positioning method. The machine-readable media may include, but are not limited to, floppy disks, optical disks, CD-ROMs (compact disc read-only memory), magneto-optical disks, ROM (read-only memory), RAM (random-access memory), EPROM (erasable programmable read-only memory), EEPROM (electrically erasable programmable read-only memory), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing machine-executable instructions. The storage medium may be located in the robot or in a third-party server, for example in a server providing an application store; there is no restriction on the specific application store, such as the Xiaomi, Huawei, or Apple app stores.
The present application can be used in numerous general-purpose or special-purpose computing-system environments or configurations, for example: personal computers, server computers, handheld or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments including any of the above systems or devices.
The present application can be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The present application can also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network; in a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.
The above embodiments merely illustrate the principles and effects of the present application and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present application. Therefore, all equivalent modifications or changes completed by those with ordinary knowledge in the technical field without departing from the spirit and technical ideas disclosed in the present application shall still be covered by the claims of the present application.

Claims (31)

  1. A positioning method for a robot, characterized by comprising:
    capturing an image in a navigation operating environment of the robot;
    identifying graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquiring standard physical features of the standard part;
    determining positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
  2. The positioning method for a robot according to claim 1, characterized in that the manner of acquiring the standard physical features of the standard part comprises:
    reading preset standard physical features from a storage device of the robot; or
    acquiring the standard physical features from a remote server over a network.
  3. The positioning method for a robot according to claim 1, characterized in that the standard part comprises: a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  4. The positioning method for a robot according to claim 3, characterized in that the standard part of the custom standard is a standard part whose standard physical features are generated from physical parameters input by a user.
  5. The positioning method for a robot according to claim 1, characterized in that the manner of determining the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part comprises:
    performing deflection-angle correction on the standard graphic based on the standard physical features to obtain a deflection angle of the robot relative to the plane where the standard part is located;
    performing distance measurement on the corrected graphic based on the standard physical features to obtain distances of the robot from the standard part and from the plane where the standard part is located.
  6. The positioning method for a robot according to claim 1, characterized by further comprising: adjusting map data in the robot based on the positioning information of the robot in the current physical space.
  7. The positioning method for a robot according to claim 6, characterized in that the manner of adjusting the map data in the robot based on the positioning information of the robot in the current physical space comprises:
    compensating, based on the positioning information, a positioning error of a corresponding position in the map data; and/or
    compensating, based on the positioning information, a positioning error in landmark information related to the standard part in the map data.
  8. The positioning method for a robot according to claim 1, characterized by further comprising: adjusting a navigation route of the robot based on the positioning information of the robot in the current physical space.
  9. The positioning method for a robot according to claim 8, characterized in that the manner of adjusting the navigation route of the robot based on the positioning information of the robot in the current physical space comprises: re-determining a position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
  10. A positioning system for a robot, characterized by comprising:
    a camera device for capturing images in a navigation operating environment of the robot;
    a storage device pre-storing a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part;
    a positioning processing device, connected to the camera device and the storage device, configured to invoke the localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
  11. The positioning system for a robot according to claim 10, characterized in that the storage device further pre-stores standard physical features of each standard part.
  12. The positioning system for a robot according to claim 10, characterized by further comprising a network access device, connected to the positioning processing device, for acquiring corresponding standard physical features from a remote server.
  13. The positioning system for a robot according to claim 10, characterized in that the standard part comprises: a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  14. The positioning system for a robot according to claim 13, characterized in that the standard part of the custom standard is a standard part whose standard physical features are generated from physical parameters input by a user.
  15. The positioning system for a robot according to claim 10, characterized in that the manner in which the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part comprises:
    performing deflection-angle correction on the standard graphic based on the standard physical features to obtain a deflection angle of the robot relative to the plane where the standard part is located;
    performing distance measurement on the corrected graphic based on the standard physical features to obtain distances of the robot from the standard part and from the plane where the standard part is located.
  16. The positioning system for a robot according to claim 10, characterized in that the positioning processing device is further configured to perform the step of adjusting map data in the robot based on the positioning information of the robot in the current physical space.
  17. The positioning system for a robot according to claim 16, characterized in that the manner in which the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space comprises:
    compensating, based on the positioning information, a positioning error of a corresponding position in the map data; and/or
    compensating, based on the positioning information, a positioning error in landmark information related to the standard part in the map data.
  18. The positioning system for a robot according to claim 10, characterized in that the positioning processing device is further configured to adjust a navigation route of the robot based on the positioning information of the robot in the current physical space.
  19. The positioning system for a robot according to claim 18, characterized in that the manner in which the positioning processing device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space comprises:
    re-determining a position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
  20. A robot, characterized by comprising:
    a driving device for driving the robot to perform displacement and/or attitude adjustment;
    a camera device for capturing images in a navigation operating environment of the robot;
    a storage device pre-storing a localization-and-mapping application, a behavior-control application, and standard physical features of at least one standard part;
    a positioning processing device, connected to the driving device, the camera device, and the storage device, configured to invoke the pre-stored localization-and-mapping application to: identify graphics of physical objects in the image and, when at least one identified graphic is a standard graphic corresponding to a standard part, acquire the standard physical features of the standard part; and determine positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part.
  21. The robot according to claim 20, characterized in that the storage device further pre-stores standard physical features of each standard part.
  22. The robot according to claim 20, characterized by further comprising a network access device, connected to the positioning processing device, for acquiring corresponding standard physical features from a remote server.
  23. The robot according to claim 20, characterized in that the standard part comprises: a standard part designed based on at least one of an industry standard, a national standard, an international standard, and a custom standard.
  24. The robot according to claim 23, characterized in that the standard part of the custom standard is a standard part whose standard physical features are generated from physical parameters input by a user.
  25. The robot according to claim 20, characterized in that the manner in which the positioning processing device determines the positioning information of the robot in the current physical space based on the standard graphic and the standard physical features of the standard part comprises:
    performing deflection-angle correction on the standard graphic based on the standard physical features to obtain a deflection angle of the robot relative to the plane where the standard part is located;
    performing distance measurement on the corrected graphic based on the standard physical features to obtain distances of the robot from the standard part and from the plane where the standard part is located.
  26. The robot according to claim 20, characterized in that the positioning processing device invokes the localization-and-mapping application to further perform the step of adjusting map data in the robot based on the positioning information of the robot in the current physical space.
  27. The robot according to claim 26, characterized in that the manner in which the positioning processing device adjusts the map data in the robot based on the positioning information of the robot in the current physical space comprises:
    compensating, based on the positioning information, a positioning error of a corresponding position in the map data; and/or
    compensating, based on the positioning information, a positioning error in landmark information related to the standard part in the map data.
  28. The robot according to claim 20, characterized in that the positioning processing device is further configured to invoke the behavior-control application to control, according to the positioning information, the driving device to adjust the displacement and/or attitude of the robot.
  29. The robot according to claim 28, characterized in that the manner in which the positioning processing device controls, according to the positioning information, the driving device to adjust the displacement and/or attitude of the robot comprises:
    adjusting a navigation route of the robot based on the positioning information of the robot in the current physical space;
    controlling the driving device to perform displacement and/or attitude adjustment according to the navigation route.
  30. The robot according to claim 29, characterized in that the manner in which the positioning processing device adjusts the navigation route of the robot based on the positioning information of the robot in the current physical space comprises:
    re-determining a position and orientation of the robot in preset map data based on the positioning information, and adjusting the navigation route based on the re-determined position and orientation.
  31. A storage medium for an electronic device, characterized in that it stores one or more programs which, when executed by one or more processors, cause the one or more processors to implement the positioning method according to any one of claims 1 to 9.
PCT/CN2018/090653 2017-11-16 2018-06-11 定位方法、系统及所适用的机器人 Ceased WO2019095681A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/764,513 US11099577B2 (en) 2017-11-16 2018-06-11 Localization method and system, and robot using the same
EP18878085.2A EP3712853A4 (en) 2017-11-16 2018-06-11 POSITIONING PROCESS AND SYSTEM, AND APPROPRIATE ROBOT

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711136700.3A CN107680135B (zh) 2017-11-16 2017-11-16 Localization method and system, and robot using the same
CN201711136700.3 2017-11-16

Publications (1)

Publication Number Publication Date
WO2019095681A1 true WO2019095681A1 (zh) 2019-05-23

Family

ID=61149589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090653 Ceased WO2019095681A1 (zh) 2017-11-16 2018-06-11 Localization method and system, and robot using the same

Country Status (4)

Country Link
US (1) US11099577B2 (zh)
EP (1) EP3712853A4 (zh)
CN (1) CN107680135B (zh)
WO (1) WO2019095681A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110824525A (zh) * 2019-11-15 2020-02-21 中冶华天工程技术有限公司 Self-localization method of robot
CN112754912A (zh) * 2021-01-12 2021-05-07 深圳市第二人民医院(深圳市转化医学研究院) Intelligent suspension system for moxibustion box and rope positioning and anchoring method thereof

Families Citing this family (20)

Publication number Priority date Publication date Assignee Title
CN107680135B (zh) 2017-11-16 2019-07-23 珊口(上海)智能科技有限公司 Localization method and system, and robot using the same
TW201937452A (zh) * 2018-03-01 2019-09-16 緯創資通股份有限公司 Positioning system and method, and computer-readable storage medium
CN109074757B (zh) * 2018-07-03 2021-11-09 达闼机器人有限公司 Map building method, terminal, and computer-readable storage medium
CN109643127B (zh) * 2018-11-19 2022-05-03 深圳阿科伯特机器人有限公司 Mapping, localization, navigation and control methods and system, and mobile robot
US20220163971A1 (en) * 2019-01-28 2022-05-26 VEKTOR Dynamics A/S Robotic vehicle with safety measures
CN110919644B (zh) * 2019-06-11 2022-02-08 远形时空科技(北京)有限公司 Method and system for localization interaction using a camera device and a robot
IL269715B2 (en) * 2019-09-26 2025-02-01 Seamless Vision 2017 Ltd Vehicle navigation indication
KR20190121275A (ko) * 2019-10-07 2019-10-25 엘지전자 주식회사 Indoor positioning system, apparatus and method
CN110850872A (zh) * 2019-10-31 2020-02-28 深圳市优必选科技股份有限公司 Robot inspection method and apparatus, computer-readable storage medium, and robot
US12399012B2 (en) * 2020-02-21 2025-08-26 Canon Kabushiki Kaisha Information processing device, information processing method, and storage medium
CN111739092A (zh) * 2020-06-12 2020-10-02 广东博智林机器人有限公司 Hanging basket, inspection robot, inspection control system, and inspection method
CN112000103B (zh) * 2020-08-27 2023-04-11 西安达升科技股份有限公司 Method and system for AGV robot localization, mapping and navigation
WO2022097765A1 (ko) * 2020-11-04 2022-05-12 주식회사 다비오 Terminal device, service server, and method for indoor positioning based on object recognition
CN112927269B (zh) * 2021-03-26 2024-07-16 深圳市无限动力发展有限公司 Map construction method and apparatus based on environment semantics, and computer device
CN114001738B (zh) * 2021-09-28 2024-08-30 浙江大华技术股份有限公司 Visual line-patrol localization method and system, and computer-readable storage medium
CN116965745A (zh) * 2022-04-22 2023-10-31 追觅创新科技(苏州)有限公司 Coordinate relocalization method and system, and cleaning robot
CN114980309A (zh) * 2022-05-12 2022-08-30 深圳依时货拉拉科技有限公司 Localization method and apparatus, computer device, and computer-readable storage medium
CN115113632B (zh) * 2022-08-31 2022-11-22 深圳市米塔机器人有限公司 Robot control method, robot, and electronic device
CN115496399B (zh) * 2022-10-12 2023-07-25 杭州余杭建筑设计院有限公司 UAV-based method and system for instant update and assignment of foundation pit survey tasks
CN115790694B (zh) * 2022-12-13 2025-12-19 南京新路阳光电有限公司 Automatic calibration device for movable ultraviolet germicidal lamp cart

Citations (6)

Publication number Priority date Publication date Assignee Title
CN101114337A (zh) * 2007-08-08 2008-01-30 华中科技大学 Ground building recognition and localization method
US20100152945A1 (en) * 2008-12-17 2010-06-17 Samsung Electronics Co., Ltd. Apparatus and method of localization of mobile robot
CN104062973A (zh) * 2014-06-23 2014-09-24 西北工业大学 Mobile robot SLAM method based on image marker recognition
CN105352508A (zh) * 2015-10-22 2016-02-24 深圳创想未来机器人有限公司 Robot localization and navigation method and apparatus
CN105865451A (zh) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device for indoor localization of mobile robot
CN107680135A (zh) * 2017-11-16 2018-02-09 珊口(上海)智能科技有限公司 Localization method and system, and robot using the same

Family Cites Families (18)

Publication number Priority date Publication date Assignee Title
KR20040052491A (ko) * 2001-03-16 2004-06-23 비젼 로보틱스 코포레이션 System and method for effectively increasing the effective dynamic range of an image sensor
WO2005098476A1 (en) * 2004-03-29 2005-10-20 Evolution Robotics, Inc. Method and apparatus for position estimation using reflected light sources
KR101750340B1 (ko) * 2010-11-03 2017-06-26 엘지전자 주식회사 Robot cleaner and control method thereof
CN102135429B (zh) * 2010-12-29 2012-06-13 东南大学 Vision-based indoor localization and navigation method for robot
WO2013071190A1 (en) * 2011-11-11 2013-05-16 Evolution Robotics, Inc. Scaling vector field slam to large environments
US8818723B2 (en) * 2012-08-27 2014-08-26 Massachusetts Institute Of Technology Localization and tracking system for mobile robots
US9427874B1 (en) * 2014-08-25 2016-08-30 Google Inc. Methods and systems for providing landmarks to facilitate robot localization and visual odometry
CN105806337B (zh) * 2014-12-30 2019-07-19 TCL集团股份有限公司 Localization method applied to indoor robot, and indoor robot
US9630319B2 (en) * 2015-03-18 2017-04-25 Irobot Corporation Localization and mapping using physical features
US9840003B2 (en) * 2015-06-24 2017-12-12 Brain Corporation Apparatus and methods for safe navigation of robotic devices
CN105486311B (zh) * 2015-12-24 2019-08-16 青岛海通机器人系统有限公司 Indoor robot localization and navigation method and apparatus
CN106643801B (zh) * 2016-12-27 2019-11-19 纳恩博(北京)科技有限公司 Method for detecting localization accuracy, and electronic device
CN106959691B (zh) * 2017-03-24 2020-07-24 联想(北京)有限公司 Movable electronic device and simultaneous localization and mapping method
KR102348041B1 (ko) * 2017-03-28 2022-01-05 엘지전자 주식회사 Control method of robot system including a plurality of mobile robots
US10895971B2 (en) * 2017-05-12 2021-01-19 Irobot Corporation Methods, systems, and devices for mapping, controlling, and displaying device status
CN107328420B (zh) * 2017-08-18 2021-03-02 上海智蕙林医疗科技有限公司 Localization method and apparatus
US10698413B2 (en) * 2017-12-28 2020-06-30 Savioke Inc. Apparatus, system, and method for mobile robot relocalization
US10878294B2 (en) * 2018-01-05 2020-12-29 Irobot Corporation Mobile cleaning robot artificial intelligence for situational awareness


Non-Patent Citations (1)

Title
See also references of EP3712853A4 *


Also Published As

Publication number Publication date
CN107680135B (zh) 2019-07-23
EP3712853A1 (en) 2020-09-23
CN107680135A (zh) 2018-02-09
US11099577B2 (en) 2021-08-24
EP3712853A4 (en) 2020-12-23
US20210011483A1 (en) 2021-01-14

Similar Documents

Publication Publication Date Title
WO2019095681A1 (zh) Localization method and system, and robot using the same
CN109074083B (zh) Movement control method, mobile robot, and computer storage medium
US10518414B1 (en) Navigation method, navigation system, movement control system and mobile robot
WO2019090833A1 (zh) Positioning system and method, and robot using the same
US10436590B2 (en) Localization system and method, and robot using the same
CN109643127B (zh) Mapping, localization, navigation and control methods and system, and mobile robot
Folkesson et al. Vision SLAM in the measurement subspace
US20200306989A1 (en) Magnetometer for robot navigation
CN106813672B (zh) Navigation method of mobile robot, and mobile robot
WO2019114219A1 (zh) Mobile robot, and control method and control system thereof
WO2021146862A1 (zh) Indoor localization method for mobile device, mobile device, and control system
CN109506652B (zh) Optical flow data fusion method based on carpet offset, and cleaning robot
WO2019232804A1 (zh) Software update method and system, mobile robot, and server
Sim et al. Autonomous vision-based robotic exploration and mapping using hybrid maps and particle filters
WO2019113859A1 (zh) Machine-vision-based virtual wall construction method and apparatus, map construction method, and movable electronic device
CN109416251B (zh) Virtual wall construction method and apparatus based on color block tags, map construction method, and movable electronic device
Bok et al. Accurate motion estimation and high-precision 3d reconstruction by sensor fusion
CN115668293B (zh) Carpet detection method, motion control method, and mobile machine using the same
US12051263B2 (en) Human lying posture detection method and mobile machine using the same
Holzmann et al. Direct stereo visual odometry based on lines
CN117562443A (zh) Robot control method, robot, cleaning system, and storage medium
CN116385489B (zh) Target following method and system based on RGBD camera
Wang et al. Geometric constraints for robot navigation using omnidirectional camera
Wang et al. Real-time visual odometry estimation based on principal direction detection on ceiling vision
Zang et al. Camera localization by CAD model matching

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18878085

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018878085

Country of ref document: EP

Effective date: 20200616