
US20110082585A1 - Method and apparatus for simultaneous localization and mapping of mobile robot environment - Google Patents

Method and apparatus for simultaneous localization and mapping of mobile robot environment

Info

Publication number
US20110082585A1
US20110082585A1 US12/873,018 US87301810A US2011082585A1 US 20110082585 A1 US20110082585 A1 US 20110082585A1 US 87301810 A US87301810 A US 87301810A US 2011082585 A1 US2011082585 A1 US 2011082585A1
Authority
US
United States
Prior art keywords
robot
physical environment
particles
map
data acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/873,018
Other languages
English (en)
Inventor
Boris Sofman
Vladimir Ermakov
Mark Emmerich
Steven Alexander
Nathaniel David MONSON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neato Robotics Inc
Original Assignee
Neato Robotics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neato Robotics Inc filed Critical Neato Robotics Inc
Priority to US12/873,018 priority Critical patent/US20110082585A1/en
Assigned to NEATO ROBOTICS, INC. reassignment NEATO ROBOTICS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ERMAKOV, VLADIMIR, SOFMAN, BORIS, MONSON, NATHANIEL DAVID, ALEXANDER, STEVEN, EMMERICH, MARK
Publication of US20110082585A1 publication Critical patent/US20110082585A1/en
Priority to US14/067,705 priority patent/US8903589B2/en
Assigned to SQUARE 1 BANK reassignment SQUARE 1 BANK SECURITY AGREEMENT Assignors: NEATO ROBOTICS, INC.
Priority to US14/543,508 priority patent/US9678509B2/en
Assigned to NEATO ROBOTICS, INC. reassignment NEATO ROBOTICS, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: SQUARE 1 BANK
Priority to US15/602,012 priority patent/US20170255203A1/en
Abandoned legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00Control of position or direction
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • B25J11/0085Cleaning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0003Home robots, i.e. small robots for domestic use
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S901/00Robots
    • Y10S901/01Mobile robot
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S901/00Robots
    • Y10S901/46Sensing device
    • Y10S901/47Optical

Definitions

  • aspects of the present invention relate to mobile robots, and more particularly to the mapping of environments in which mobile robots operate, to facilitate movement of mobile robots within those environments.
  • As a system that enables a mobile robot to map its environment and maintain working data of its position within that map, simultaneous localization and mapping (SLAM) is both accurate and versatile. Its reliability and suitability for a variety of applications make it a useful element for imparting a robot with some level of autonomy.
  • SLAM techniques tend to be computationally intensive and thus their efficient execution often requires a level of processing power and memory capacity that may not be cost effective for some consumer product applications.
  • FIG. 1 depicts a block diagram showing some features according to the invention.
  • FIG. 2 depicts a flow chart showing some features according to the invention, corresponding to certain aspects of FIG. 1 .
  • FIG. 3 depicts another flow chart showing some other features according to the invention, corresponding to certain aspects of FIG. 1 .
  • FIG. 4 depicts a block diagram showing some other features according to the invention.
  • FIG. 5 depicts an example of particle weight distribution for a localization iteration process.
  • FIG. 6 depicts a further example of particle weight distribution for a localization iteration process.
  • FIG. 7 depicts yet a further example of particle weight distribution for a localization iteration process.
  • FIG. 8 depicts localized and delocalized states based on verified particle distribution data.
  • FIG. 9 depicts a block diagram showing some other features according to the invention.
  • FIG. 10 depicts a flow chart showing some other features of the invention, corresponding to certain aspects of FIG. 6 .
  • FIG. 11 depicts an example of orientation of a mobile robot in its physical environment.
  • FIG. 12 depicts a further example of orientation of a mobile robot in its physical environment.
  • FIG. 13 depicts a flow chart showing some other features of the invention, corresponding to certain aspects of FIG. 1 .
  • FIG. 14 depicts one scenario of movement and orientation of a mobile robot in its physical environment.
  • FIG. 15 depicts a further scenario of movement and orientation of a mobile robot in its physical environment.
  • FIG. 16 depicts yet a further scenario of movement and orientation of a mobile robot in its physical environment.
  • Localization requires regularly updating a robot's pose (position and angle) within its environment. The frequency with which this is done can affect overall system performance, depending on how often data must be processed as a result of an update operation. Minimizing computational load is essential to providing a SLAM system that can function effectively in a low-cost hardware environment.
  • computational load may be reduced by eliminating robot position updates when it appears that the robot has become delocalized, in which case the updates likely would be erroneous anyway.
  • FIG. 1 is a diagram depicting aspects of the just-mentioned feature in a mobile robotic system 100 .
  • data acquisition system 110 generates data regarding the environment of mobile robot 120 .
  • This data becomes input data to processing apparatus 130 .
  • From this data, processing apparatus 130 generates a map or model of the mobile robot's environment (block 132 ).
  • Processing apparatus 130 also may contain a separate function (block 134 ) that monitors the generation or updating of the map for any shift in map elements beyond a threshold limit. If such an occurrence is detected, the processing apparatus (block 136 ) responds by executing instructions to suspend or modify the use of data from data acquisition system 110 .
  • a sensing unit 140 also may monitor the data acquisition system 110 for a loss in preferred orientation of the data acquisition system 110 for data generation. If sensing unit 140 detects a loss in orientation, processing apparatus 130 will respond by executing instructions to suspend or modify use of data generated by the data acquisition system 110 .
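  • As an illustration of the logic in blocks 134 and 136, the following sketch (in Python) shows how a map-shift check and an orientation check could gate the use of newly acquired data. The function name, units, and the 0.5 m threshold are illustrative assumptions; the patent does not specify an implementation.

```python
def should_suspend_mapping(map_shift_m: float,
                           sensor_orientation_ok: bool,
                           shift_threshold_m: float = 0.5) -> bool:
    """Return True if use of data from the data acquisition system should be
    suspended or modified (hypothetical helper, not the patent's code)."""
    if not sensor_orientation_ok:        # sensing unit reports a loss of preferred orientation
        return True
    if map_shift_m > shift_threshold_m:  # a map element has shifted beyond the threshold limit
        return True
    return False

# Example: a wall boundary appears to jump 0.8 m between updates -> suspend.
assert should_suspend_mapping(0.8, True)
assert not should_suspend_mapping(0.1, True)
```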
  • Mobile robot 120 may be connected to processing apparatus 130 .
  • the sensing unit 140 , if present, may be attached to the mobile robot 120 .
  • Data acquisition system 110 may be attached to the mobile robot 120 as well, or alternatively may be separate.
  • FIG. 2 shows a flow of operation of the system depicted in FIG. 1 .
  • the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 202 .
  • the orientation of the data acquisition system is monitored to see whether the data acquisition system is maintaining its preferred orientation with respect to the robot's physical environment (e.g., whether the data acquisition system is tilting, has tipped over, or otherwise displays an orientation other than one in which the robot can function within its physical environment).
  • the generated data is used to generate or update the map of the robot's physical environment.
  • FIG. 3 is a diagram showing other features of the invention.
  • map generation apparatus 310 provides a map of a mobile device's environment for localization of the mobile device within that environment.
  • a delocalization detection apparatus 320 uses the map information to determine the position of the device.
  • Particle generation apparatus 322 generates particles representing potential poses of the mobile device.
  • Particle weight assignment apparatus 324 assigns to each particle a weight representing its likelihood of accuracy relative to other particles.
  • an erroneous particle generation apparatus 326 generates particles such that their corresponding weights as generated by particle weight assignment apparatus 324 will be low, representing a low probability of correctly indicating the mobile device's position.
  • a particle weight comparison apparatus 328 compares the weights of the erroneous particles with the weights of the particles generated by the particle generation apparatus 322 and confirms that the device is accurately localized or determines whether delocalization has occurred.
  • the method may operate as follows:
  • FIG. 4 depicts a flow of operation of the system depicted in FIG. 3 .
  • the existing map may be used or updated as appropriate.
  • particles are generated, either anew or iteratively, the iteratively generated particles being added to the existing particle set.
  • weights are assigned to each particle.
  • erroneous particles are generated, and at block 405 , the erroneous particles have weights assigned to them.
  • the weights of the erroneous particles are compared to those of the original particle set to determine whether delocalization has occurred.
  • a check for delocalization is made. If delocalization has not occurred, then similarly to block 205 in FIG. 2 , map generation and updating continues. If delocalization has occurred, then similarly to block 206 in FIG. 2 , map generation is suspended or modified.
  • a typical approach to localization under a SLAM scheme might include the following steps:
  • a typical localization iteration based on the above process might yield the particle weight distribution illustrated in FIG. 5 .
  • the distribution of particles, sorted by weight, appears as a curve, indicating a mix of particles of low, middle and high weights.
  • a particle's index number may indicate its relative position with respect to other particles regarding its probability of accurately representing the robot's pose (position and angle).
  • particle 1 has the highest probability of accuracy and all subsequent particles (i.e., particles 2, 3, 4, etc.) have sequentially lower probabilities of accuracy in their pose.
  • the weight scale (the vertical axis in the graph) may be highly dependent on environmental conditions such as distance from walls, number of valid distance readings from a spatial sensor such as a laser rangefinder, etc.
  • An approach to determining delocalization via the introduction of erroneous particles generally should be independent of environmental conditions.
  • the goal of introducing erroneous particles is to identify when the particles with higher probability of representing the robot's pose are not much better than particles with the lowest probability of representing the robot's pose. In such a circumstance, the implication is that most or all potential poses are bad, and therefore the robot has little or no reliable information regarding its actual whereabouts within its environment. By definition, the robot is delocalized.
  • the process of assessing the state of localization involves introducing additional test particles whose pose is deliberately erroneous in order to set a baseline weight for comparison to better particles.
  • the particles representing candidate location angles with the highest weights are fairly close to an ideal motion model. Recognizing this, a generally effective approach to delocalization detection is to introduce erroneous particles at the center of the ideal motion model with large offsets to the angle (e.g., ±30°, 40°, 50°, 60°, etc.).
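  • A minimal sketch of how such verification particles might be generated, assuming a pose is represented as (x, y, θ) and using the angular offsets mentioned above; the function name and particle representation are illustrative assumptions, not the patent's.

```python
import math

def make_verification_particles(center_x, center_y, center_theta,
                                offsets_deg=(30, 40, 50, 60)):
    """Deliberately erroneous 'verification' particles: placed at the center of
    the ideal motion model but with large angular offsets, so a correctly
    localized filter should assign them the lowest weights."""
    particles = []
    for offset in offsets_deg:
        for sign in (+1, -1):            # e.g., +/-30 deg, +/-40 deg, ...
            theta = center_theta + sign * math.radians(offset)
            particles.append((center_x, center_y, theta))
    return particles
```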
  • the erroneous particles will reside relatively close together at the end of the sorted distribution curve that contains the lowest weighted particles, as shown in FIG. 6 .
  • the erroneous particles, referred to here as verification particles because of their purpose, are clustered together on the lower right end of the curve, each having a weight that is closer to zero than the particles comprising the rest of the sorted distribution.
  • some erroneous (verification) particles reside at the far right side of the distribution, but other erroneous particles are scattered through the rest of the particle set. As more particles known to be erroneous have weights that exceed those of other, non-verification particles, it becomes increasingly likely that the robot has delocalized.
  • detection of delocalization can be done in any of a variety of ways, including by examining the mean index value of the erroneous (verification) particles. In a localized condition, most or all of the erroneous particles will reside relatively close together at the low-weight end of the index, since they generally will have the lowest weights. Averaging the indices of the erroneous particles in a localized case will therefore yield a large number relative to the size of the total set of particles, including both erroneous and non-erroneous particles.
  • an average of verification particle indices that remains constant and high in value with respect to total particle set size reflects a localized condition.
  • An average that falls in value or begins to fluctuate in value may indicate a delocalized condition.
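  • A sketch of this mean-index test, assuming the particle set is kept sorted by descending weight and that verification particles can be identified by a caller-supplied predicate; the names and the handling of an empty set are illustrative assumptions.

```python
def mean_verification_index(sorted_particles, is_verification):
    """sorted_particles is ordered by descending weight (index 0 = best pose).
    While the robot is localized, the verification particles cluster near the
    end of the list, so their mean index stays high relative to the total
    particle count; a falling or fluctuating mean suggests delocalization."""
    indices = [i for i, p in enumerate(sorted_particles) if is_verification(p)]
    if not indices:
        return 0.0                       # no verification particles were injected
    return sum(indices) / len(indices)

# Hypothetical use: flag delocalization when the mean index falls below a
# threshold expressed as a fraction of the total particle count.
# delocalized = mean_verification_index(particles, is_verif) < 0.7 * len(particles)
```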
  • both of these states, localized and delocalized, are depicted in the plots of the averaged verification particle data in FIG. 8 .
  • the plotted data are the averaged verification particle indices.
  • the averaged data are high and relatively constant, which is consistent with a localized state.
  • the average value drops significantly and then recovers; in this particular data set, this drop corresponds to an engineer picking the robot up from the floor and moving it to a different location.
  • the return of the average to a high, stable number indicates that the robot likely recovered from the event.
  • Determining that the robot has delocalized relies on comparing the averaged erroneous particle index to a threshold number.
  • the threshold number can be decided a priori during coding, but it is typically beneficial to include some hysteresis in the evaluation of whether a robot is localized. For example, looking at the latter portion of the data set illustrated in FIG. 8 , the averaged verification particle index reaches a high value several times, but, in each instance, it drops again after only a few iterations.
  • a proper evaluation of whether a robot has recovered from a delocalization event should not look only at instantaneous values, but also should evaluate whether the averaged index returns to a high value and remains stable at a high value for a period of time sufficient to demonstrate that the robot likely has successfully re-localized.
  • the necessary minimum duration can also be defined in the code.
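  • One way to realize the hysteresis and minimum-duration requirement is a small state holder that declares re-localization only after the averaged index stays above a high threshold for a fixed number of consecutive iterations; the sketch below uses hypothetical names, and the threshold and duration values would be chosen in code as described above.

```python
class RelocalizationMonitor:
    """Declare the robot re-localized only after the averaged verification
    particle index remains at or above high_threshold for min_stable
    consecutive iterations (a sketch, not the patent's implementation)."""

    def __init__(self, high_threshold: float, min_stable: int):
        self.high_threshold = high_threshold
        self.min_stable = min_stable
        self.stable_count = 0

    def update(self, avg_verification_index: float) -> bool:
        if avg_verification_index >= self.high_threshold:
            self.stable_count += 1       # index is high this iteration
        else:
            self.stable_count = 0        # any dip resets the stability count
        return self.stable_count >= self.min_stable
```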
  • Newly encountered, unmapped space may contain a mix of dynamic and static elements. Making a distinction between the robot's identification of potentially dynamic areas of the map and those that are static is essential for building useful and accurate maps for the robot to use.
  • the issue of distinguishing between static (permanent) elements of the robot's surroundings and dynamic (transient) elements may be addressed in the following way:
  • FIG. 9 is a diagram of a system containing other features of the invention.
  • a data acquisition system 910 generates data regarding the physical environment of a mobile device such as a robot.
  • the data generated by the data acquisition system provides input to a map/model processing apparatus 920 .
  • the map/model processing apparatus 920 generates and maintains a map in a cell-based grid form (block 922 ) and assigns a probability of occupancy to each cell (block 924 ) based on the data received from the data acquisition system.
  • the map/model processing unit monitors individual cells (block 926 ) for changes in their probability of occupancy. Based on the detection of such changes, the processing unit determines if any cells are dynamic. If cells are determined to be dynamic, they are marked accordingly (block 928 ). Mapping or updating of such cells is suspended for the period that they are in a dynamic state.
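  • The cell-based bookkeeping of blocks 922 - 928 might look like the following sketch; the probability-change test and the 0.4 threshold are assumptions for illustration, since the patent describes the behavior rather than a specific update rule.

```python
class OccupancyCell:
    def __init__(self):
        self.p_occupied = 0.5            # unknown occupancy
        self.dynamic = False             # True => mapping of this cell is suspended

class GridMap:
    """Cell-based grid map with per-cell occupancy probabilities. Cells whose
    probability changes abruptly are marked dynamic and excluded from updates
    while in that state (clearing the flag once a cell settles is omitted)."""

    def __init__(self, width, height, change_threshold=0.4):
        self.cells = [[OccupancyCell() for _ in range(width)]
                      for _ in range(height)]
        self.change_threshold = change_threshold

    def update_cell(self, row, col, p_new):
        cell = self.cells[row][col]
        if abs(p_new - cell.p_occupied) > self.change_threshold:
            cell.dynamic = True          # abrupt change: mark dynamic, skip the update
            return
        if not cell.dynamic:
            cell.p_occupied = p_new      # normal occupancy update
```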
  • FIG. 10 depicts a flow of operation of the embodiment shown in FIG. 9 .
  • the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 1002 .
  • the generated data is used to generate or update the map of the robot's physical environment.
  • probabilities of occupancy for each cell in the grid map are assigned or updated.
  • Because floors may have areas of uneven surface or surface discontinuities, or because objects resting on the floor may introduce non-uniformities in a robot's available travel surface, it is possible that a sensor collecting spatial data may not maintain consistent orientation with the presiding surfaces of the surrounding geometry, which can lead to erroneous delineation of the robot's surroundings.
  • FIGS. 11-12 illustrate the potential problem encountered by a robot collecting spatial data without an ability to detect when its sensor has lost parallel orientation with the floor.
  • In the upper illustration, the robot is traveling away from a physical boundary at A and toward a physical boundary at B.
  • a sensor mounted on the robot in this example is collecting spatial data in a horizontal plane indicated by the thin line positioned at a height near the top of the robot.
  • In the lower illustration, the robot begins traversing an obstacle which tilts the robot backward. If the robot does not recognize that it is no longer collecting data in a plane that is accordant with the surrounding geometry, then the spatial construction developed from the sensor data will not match the actual geometry defined by the robot's surroundings.
  • the data collection plane's forward incline will distort the previously determined position of the wall at B to one further out, at B′.
  • the backward decline of the data collection plane results in its intersection with the floor, creating the impression that a boundary exists behind the robot at A′ rather than at the further position of A.
  • wheel slip accompanies tilt when a robot traverses a substantive irregularity in a floor surface. This can be particularly problematic if it occurs when the robot is collecting its first data on a new area (e.g., when the robot has turned a corner into an unmapped space) since the distorted image may be incorporated into the map.
  • erroneous data generated during a tilt event can propagate into mapping or localization algorithms.
  • the potential results may include some degree of mapping corruption, which frequently can lead to delocalization.
  • People, pets, or objects moved or in use by a person will present a dynamic area to mark, one that usually is limited in its footprint.
  • If the dynamic area is spread across a relatively wide area, then this may represent a different scenario. For example, if a map boundary area shifts suddenly or moves in a way that causes many, possibly contiguous, cells to be tagged as active, then it may be likely that the robot has tilted. In such a case, the spatial sensor's detection plane may be angled such that a portion of the floor near the robot is read as a boundary, as indicated in the example described earlier.
  • If the robot identifies that a dynamic area involves an area larger than would be created by people, pets or moving objects, then the updating of the map may be suspended.
  • FIG. 13 depicts a flow of operation of a system as depicted in FIG. 1 , with the variant that tilt of the robot is detected and addressed in software.
  • the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 1302 .
  • the generated data is used to generate or update the map of the robot's physical environment.
  • a check is made to see if any elements of the map (e.g., a map boundary area) have shifted beyond a threshold limit. If not, then at block 1305 , map generation or update continues. However, if at block 1304 there has been a shift beyond the threshold limit, then at block 1306 , the map generation is suspended, or the map is modified.
  • the instruction to suspend or modify is generated within the processing apparatus, and does not originate from the sensing unit.
  • flow returns to data generation, so that further checks can be made to see whether the map elements have returned to within threshold limits.
  • Detection of motion may rely on spatial scanning done by, for example, a laser rangefinder, which may continuously scan a robot's surroundings.
  • the spatial distance represented by an aggregate distance, or by a distance differential, may be compared to a pre-defined threshold value. If the difference between the first and the last distance measurement is larger than the threshold, it may be concluded that the robot is tilted.
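  • A sketch of this first-to-last comparison over a run of range readings; the 1.0 m threshold is an assumed value.

```python
def tilt_suspected(range_run_m, threshold_m=1.0):
    """range_run_m: consecutive range readings (meters) covering one boundary
    segment of the scan. A large jump between the first and last readings
    suggests the sensing plane has tilted into the floor or off a wall."""
    if len(range_run_m) < 2:
        return False
    return abs(range_run_m[-1] - range_run_m[0]) > threshold_m
```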
  • FIG. 14 provides an example of such a scenario. Consider the robot at location A moving through a room and passing a doorway into an adjoining room. Assume that the robot employs a planar spatial sensor enabling it to delineate the physical limits of its surroundings.
  • Such a sensor likely would detect, through the open doorway, some portion of the wall of the adjoining room, which, in the example case, may yield the detected length of wall segment B. If one side of the advancing robot encounters an obstacle such as, for example, a thick rug, that results in the robot straddling the object (e.g., the left wheel(s) may be raised by the rug while the right wheel(s) continues to roll on the floor), then the robot's sensing plane likely will tilt toward its right side. Depending on room geometry and degree of tilt, it is possible that the portion of the sensing plane that had been detecting the wall of the adjoining room at B, now would intersect the floor of the adjoining room at the much closer location of B′.
  • the data may show the wall boundary shift suddenly from B to B′ while other boundaries might show little or no variation in position.
  • the determination that a tilt event has occurred may be based on a comparison between the physical length represented by the consecutive, newly-“occupied” cells and a pre-defined threshold. If the represented distance, or distance differential, meets or exceeds the threshold, it may be concluded that the robot has tilted and map updating may be suspended.
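  • In map terms, the same test can be phrased over the run of consecutive cells that suddenly become occupied; the cell size and the 0.8 m threshold below are illustrative assumptions.

```python
def boundary_shift_exceeds_threshold(num_new_occupied_cells, cell_size_m,
                                     threshold_m=0.8):
    """Physical length represented by a run of consecutive, newly 'occupied'
    cells; if it meets or exceeds the threshold, treat it as a tilt event and
    suspend map updating."""
    return num_new_occupied_cells * cell_size_m >= threshold_m
```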
  • Detection of tilt in hardware may involve the use of an accelerometer or similar component that may detect changes in the orientation of the component's mounting surface.
  • data generated by the spatial scanner may be supplemented by data regarding changes in orientation, with this latter data set providing contextual verification for the spatial sensor's data.
  • information collected while the tilt-detecting component indicates that the spatial sensor has lost its preferred orientation could be discarded.
  • this data may be discarded before it is processed by any localization or mapping software.
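  • A sketch of this gating, assuming each scan point can be paired with a flag from the tilt-detecting component indicating whether the sensor held its preferred orientation when that point was captured; the pairing and names are assumptions for illustration.

```python
def filter_scan_by_orientation(scan_points, orientation_ok_flags):
    """Drop scan points collected while the accelerometer (or similar component)
    reported that the spatial sensor had lost its preferred orientation, before
    the data reaches any localization or mapping software."""
    return [pt for pt, ok in zip(scan_points, orientation_ok_flags) if ok]
```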
  • a robot uses a sensor generating 2D spatial information in a horizontal plane from the robot's surroundings.
  • the dotted line indicates the sensing perimeter, created by the spatial sensing plane intersecting objects surrounding the robot. This perimeter informs the robot of nearby obstacles and the boundaries presented by walls and doors.
  • As depicted in FIG. 16 , if the robot traverses a low obstacle, such as the door frame shown in FIG. 16 , or an uneven surface, then the robot may lose its parallel disposition with respect to the floor. As a result, a sensor fixed to the robot collecting spatial information regarding the robot's surroundings may collect data at an angle away from horizontal.
  • the dotted line in FIG. 16 shows the intersection of the spatial sensor's plane of detection with object surfaces surrounding the robot. With the robot tilted, the generated spatial data becomes erroneous. The calculated distance to the wall in front of the robot becomes distorted as the detection plane at B′ intersects the wall at a higher point, but, more critically, the detection plane's intersection with the floor behind the robot would incorrectly report a linear boundary at A′.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
US12/873,018 2009-08-31 2010-08-31 Method and apparatus for simultaneous localization and mapping of mobile robot environment Abandoned US20110082585A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/873,018 US20110082585A1 (en) 2009-08-31 2010-08-31 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US14/067,705 US8903589B2 (en) 2009-08-31 2013-10-30 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US14/543,508 US9678509B2 (en) 2009-08-31 2014-11-17 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US15/602,012 US20170255203A1 (en) 2009-08-31 2017-05-22 Method and apparatus for simultaneous localization and mapping of mobile robot environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US23859709P 2009-08-31 2009-08-31
US12/873,018 US20110082585A1 (en) 2009-08-31 2010-08-31 Method and apparatus for simultaneous localization and mapping of mobile robot environment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/067,705 Division US8903589B2 (en) 2009-08-31 2013-10-30 Method and apparatus for simultaneous localization and mapping of mobile robot environment

Publications (1)

Publication Number Publication Date
US20110082585A1 true US20110082585A1 (en) 2011-04-07

Family

ID=42941963

Family Applications (4)

Application Number Title Priority Date Filing Date
US12/873,018 Abandoned US20110082585A1 (en) 2009-08-31 2010-08-31 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US14/067,705 Active US8903589B2 (en) 2009-08-31 2013-10-30 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US14/543,508 Active US9678509B2 (en) 2009-08-31 2014-11-17 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US15/602,012 Abandoned US20170255203A1 (en) 2009-08-31 2017-05-22 Method and apparatus for simultaneous localization and mapping of mobile robot environment

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/067,705 Active US8903589B2 (en) 2009-08-31 2013-10-30 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US14/543,508 Active US9678509B2 (en) 2009-08-31 2014-11-17 Method and apparatus for simultaneous localization and mapping of mobile robot environment
US15/602,012 Abandoned US20170255203A1 (en) 2009-08-31 2017-05-22 Method and apparatus for simultaneous localization and mapping of mobile robot environment

Country Status (10)

Country Link
US (4) US20110082585A1 (fr)
EP (1) EP2473890B1 (fr)
JP (2) JP2013503404A (fr)
KR (1) KR101362961B1 (fr)
CN (2) CN102576228A (fr)
AU (1) AU2010286429B2 (fr)
CA (2) CA2859112C (fr)
HK (1) HK1211352A1 (fr)
NZ (1) NZ598500A (fr)
WO (1) WO2011026119A2 (fr)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8209144B1 (en) * 2009-09-15 2012-06-26 Google Inc. Accurate alignment of multiple laser scans using a template surface
US20120195491A1 (en) * 2010-07-21 2012-08-02 Palo Alto Research Center Incorporated System And Method For Real-Time Mapping Of An Indoor Environment Using Mobile Robots With Limited Sensing
WO2014110204A1 (fr) * 2013-01-10 2014-07-17 Intel Corporation Positionnement et cartographie basés sur des points de repère virtuels
US20140350839A1 (en) * 2013-05-23 2014-11-27 Irobot Corporation Simultaneous Localization And Mapping For A Mobile Robot
US20150261223A1 (en) * 2011-09-30 2015-09-17 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9304001B2 (en) 2013-07-03 2016-04-05 Samsung Electronics Co., Ltd Position recognition methods of autonomous mobile robots
US9361591B2 (en) 2012-10-29 2016-06-07 Electronics And Telecommunications Research Institute Apparatus and method for building map of probability distribution based on properties of object and system
US9566706B2 (en) 2014-01-14 2017-02-14 Samsung Electronics Co., Ltd. Robot and control method thereof
WO2017116492A1 (fr) * 2015-12-31 2017-07-06 Olney Guy Procédé permettant d'intégrer des flux parallèles de données de capteurs associées générant des réponses d'essai sans connaissance préalable de la signification des données ou de l'environnement détecté
DE102016203547A1 (de) 2016-03-03 2017-09-07 Kuka Roboter Gmbh Verfahren zum Aktualisieren einer Belegungskarte und autonomes Fahrzeug
US9827994B2 (en) * 2015-06-25 2017-11-28 Hyundai Motor Company System and method for writing occupancy grid map of sensor centered coordinate system using laser scanner
US9864377B2 (en) 2016-04-01 2018-01-09 Locus Robotics Corporation Navigation using planned robot travel paths
WO2018094272A1 (fr) * 2016-11-18 2018-05-24 Robert Bosch Start-Up Platform North America, LLC, Series 1 Créature robotique et procédé de fonctionnement
US10310511B2 (en) 2016-04-20 2019-06-04 Toyota Jidosha Kabushiki Kaisha Automatic driving control system of mobile object
CN109900267A (zh) * 2019-04-12 2019-06-18 哈尔滨理工大学 一种基于slam的移动机器人地图创建与自主探索系统
US10365656B2 (en) 2017-11-22 2019-07-30 Locus Robotics Corp. Robot charger docking localization
US10386851B2 (en) 2017-09-22 2019-08-20 Locus Robotics Corp. Multi-resolution scan matching with exclusion zones
CN110174894A (zh) * 2019-05-27 2019-08-27 小狗电器互联网科技(北京)股份有限公司 机器人及其重定位方法
US10429847B2 (en) 2017-09-22 2019-10-01 Locus Robotics Corp. Dynamic window approach using optimal reciprocal collision avoidance cost-critic
US20190329407A1 (en) * 2018-04-30 2019-10-31 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for multimodal mapping and localization
US10549430B2 (en) * 2015-08-28 2020-02-04 Panasonic Intellectual Property Corporation Of America Mapping method, localization method, robot system, and robot
US20200050205A1 (en) * 2018-08-07 2020-02-13 Cnh Industrial America Llc System and method for updating a mapped area
US20200265621A1 (en) * 2019-02-14 2020-08-20 Faro Technologies, Inc. System and method of scanning two dimensional floorplans using multiple scanners concurrently
US10761539B2 (en) 2017-11-22 2020-09-01 Locus Robotics Corp. Robot charger docking control
TWI736960B (zh) * 2019-08-28 2021-08-21 財團法人車輛研究測試中心 同步定位與建圖優化方法
CN113478480A (zh) * 2021-06-22 2021-10-08 中建三局集团有限公司 一种横折臂布料机的轨迹规划方法
US11199853B1 (en) 2018-07-11 2021-12-14 AI Incorporated Versatile mobile platform
US11254002B1 (en) 2018-03-19 2022-02-22 AI Incorporated Autonomous robotic device
US11320828B1 (en) 2018-03-08 2022-05-03 AI Incorporated Robotic cleaner
US11340079B1 (en) 2018-05-21 2022-05-24 AI Incorporated Simultaneous collaboration, localization, and mapping
US20220269273A1 (en) * 2021-02-23 2022-08-25 Hyundai Motor Company Apparatus for estimating position of target, robot system having the same, and method thereof
US11454981B1 (en) 2018-04-20 2022-09-27 AI Incorporated Versatile mobile robotic device
US11548159B1 (en) * 2018-05-31 2023-01-10 AI Incorporated Modular robot
US11625870B2 (en) 2017-07-31 2023-04-11 Oxford University Innovation Limited Method of constructing a model of the motion of a mobile device and related systems
US11944876B2 (en) * 2022-05-30 2024-04-02 Tennibot Inc. Autonomous tennis assistant systems
US12039674B2 (en) 2020-09-18 2024-07-16 Apple Inc. Inertial data management for extended reality for moving platforms
DE102023204536A1 (de) 2023-05-15 2024-11-21 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Lokalisierung eines mobilen Geräts
US12406442B2 (en) 2020-09-18 2025-09-02 Apple Inc. Extended reality for moving platforms
US12535822B2 (en) * 2020-11-06 2026-01-27 Ubtech Robotics Corp Ltd Mapping method, computer-readable storage medium, and robot

Families Citing this family (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5429986B2 (ja) * 2009-12-25 2014-02-26 株式会社Ihiエアロスペース 移動ロボットの遠方環境認識装置及び方法
JP5452442B2 (ja) * 2010-10-25 2014-03-26 株式会社日立製作所 ロボットシステム及び地図更新方法
WO2014004929A2 (fr) 2012-06-27 2014-01-03 Pentair Water Pool And Spa, Inc. Nettoyeur de piscine avec système de télémètre laser et procédé s'y rapportant
US9020637B2 (en) * 2012-11-02 2015-04-28 Irobot Corporation Simultaneous localization and mapping for a mobile robot
CN103903253B (zh) * 2012-12-28 2017-06-27 联想(北京)有限公司 一种可移动终端定位方法及系统
CN103901774B (zh) * 2012-12-28 2017-02-08 联想(北京)有限公司 高效鲁棒的基于多传感器的slam协调方法及系统
JP2014203145A (ja) * 2013-04-02 2014-10-27 パナソニック株式会社 自律移動装置
CN103412565B (zh) * 2013-05-17 2016-01-27 浙江中控研究院有限公司 一种具有全局位置快速估计能力的机器人定位方法
KR101505129B1 (ko) 2013-08-19 2015-03-23 부경대학교 산학협력단 레이저 스캐너를 이용한 위치인식 및 맵핑 시스템을 이용한 위치인식 방법
US9886036B2 (en) * 2014-02-10 2018-02-06 John Bean Technologies Corporation Routing of automated guided vehicles
CN103984037B (zh) * 2014-04-30 2017-07-28 深圳市墨克瑞光电子研究院 基于视觉的移动机器人障碍物检测方法和装置
US9259838B1 (en) * 2014-07-24 2016-02-16 Google Inc. Systems and methods for ground plane estimation
US10127667B2 (en) * 2014-08-01 2018-11-13 Locuslabs, Inc. Image-based object location system and process
FR3025325B1 (fr) * 2014-09-01 2016-12-30 Valeo Schalter & Sensoren Gmbh Dispositif et procede de localisation et de cartographie
US10660496B2 (en) * 2014-09-24 2020-05-26 Samsung Electronics Co., Ltd. Cleaning robot and method of controlling the cleaning robot
CN104597900A (zh) * 2014-12-02 2015-05-06 华东交通大学 一种基于类电磁机制优化的FastSLAM方法
EP3971672A1 (fr) 2014-12-17 2022-03-23 Husqvarna AB Véhicule robotisé autonome à capteurs multiples ayant une capacité cartographique
US10444760B2 (en) 2014-12-17 2019-10-15 Husqvarna Ab Robotic vehicle learning site boundary
TWI548891B (zh) * 2015-01-12 2016-09-11 金寶電子工業股份有限公司 掃地機定位系統及其定位方法
CN104858871B (zh) * 2015-05-15 2016-09-07 珠海市一微半导体有限公司 机器人系统及其自建地图和导航的方法
CN106325266A (zh) * 2015-06-15 2017-01-11 联想(北京)有限公司 一种空间分布图的构建方法及电子设备
DE102015111613A1 (de) 2015-07-17 2017-01-19 Still Gmbh Verfahren zur Hinderniserfassung bei einem Flurförderzeug
CN106584451B (zh) * 2015-10-14 2019-12-10 国网智能科技股份有限公司 一种基于视觉导航的变电站自动构图机器人及方法
US10093021B2 (en) * 2015-12-02 2018-10-09 Qualcomm Incorporated Simultaneous mapping and planning by a robot
CN105892461B (zh) * 2016-04-13 2018-12-04 上海物景智能科技有限公司 一种机器人所处环境与地图的匹配识别方法及系统
US9996944B2 (en) * 2016-07-06 2018-06-12 Qualcomm Incorporated Systems and methods for mapping an environment
US10274325B2 (en) * 2016-11-01 2019-04-30 Brain Corporation Systems and methods for robotic mapping
CN108107882B (zh) * 2016-11-24 2021-07-06 中国科学技术大学 基于光学运动跟踪的服务机器人自动标定与检测系统
AU2018244578A1 (en) * 2017-03-30 2019-10-03 Crown Equipment Corporation Warehouse mapping tools
US10394246B2 (en) 2017-03-31 2019-08-27 Neato Robotics, Inc. Robot with automatic styles
CN106919174A (zh) * 2017-04-10 2017-07-04 江苏东方金钰智能机器人有限公司 一种智能引导机器人的引导方法
US10761541B2 (en) * 2017-04-21 2020-09-01 X Development Llc Localization with negative mapping
US10551843B2 (en) 2017-07-11 2020-02-04 Neato Robotics, Inc. Surface type detection for robotic cleaning device
US10918252B2 (en) 2017-07-27 2021-02-16 Neato Robotics, Inc. Dirt detection layer and laser backscatter dirt detection
JP7051192B2 (ja) 2017-08-16 2022-04-11 シャークニンジャ オペレーティング エルエルシー ロボット掃除機
US11339580B2 (en) 2017-08-22 2022-05-24 Pentair Water Pool And Spa, Inc. Algorithm for a pool cleaner
GB2567944A (en) 2017-08-31 2019-05-01 Neato Robotics Inc Robotic virtual boundaries
US10583561B2 (en) 2017-08-31 2020-03-10 Neato Robotics, Inc. Robotic virtual boundaries
JP2019047848A (ja) * 2017-09-07 2019-03-28 パナソニックIpマネジメント株式会社 自律走行掃除機、および、累積床面確率更新方法
CN107728616B (zh) * 2017-09-27 2019-07-02 广东宝乐机器人股份有限公司 移动机器人的地图创建方法及移动机器人
WO2019069524A1 (fr) * 2017-10-02 2019-04-11 ソニー株式会社 Appareil de mise à jour d'informations environnementales, procédé de mise à jour d'informations environnementales, et programme
DE112017008137T5 (de) * 2017-10-03 2020-07-02 Intel Corporation Kartieren einer rasterbelegung unter verwendung einer fehlerbereichsverteilung
EP3692336B1 (fr) 2017-10-06 2022-05-18 Qualcomm Incorporated Relocalisation et réinitialisation simultanées de vslam
US10638906B2 (en) * 2017-12-15 2020-05-05 Neato Robotics, Inc. Conversion of cleaning robot camera images to floorplan for user interaction
US11457788B2 (en) * 2018-05-11 2022-10-04 Samsung Electronics Co., Ltd. Method and apparatus for executing cleaning operation
US11243540B2 (en) 2018-05-17 2022-02-08 University Of Connecticut System and method for complete coverage of unknown environments
US11157016B2 (en) 2018-07-10 2021-10-26 Neato Robotics, Inc. Automatic recognition of multiple floorplans by cleaning robot
US11194335B2 (en) 2018-07-10 2021-12-07 Neato Robotics, Inc. Performance-based cleaning robot charging method and apparatus
DE102018121365A1 (de) 2018-08-31 2020-04-23 RobArt GmbH Exploration eines robotereinsatzgebietes durch einen autonomen mobilen roboter
US11272823B2 (en) 2018-08-31 2022-03-15 Neato Robotics, Inc. Zone cleaning apparatus and method
CN118014912A (zh) * 2018-10-15 2024-05-10 科沃斯机器人股份有限公司 环境地图的校正方法、设备及存储介质
CN109191027A (zh) * 2018-11-09 2019-01-11 浙江国自机器人技术有限公司 一种机器人召唤方法、系统、设备及计算机可读存储介质
CN109531592B (zh) * 2018-11-30 2022-02-15 佛山科学技术学院 一种基于视觉slam的图书盘点机器人
EP3731130B1 (fr) * 2019-04-23 2024-06-05 Continental Autonomous Mobility Germany GmbH Appareil pour déterminer une carte d'occupation
EP3764179B1 (fr) 2019-07-08 2024-11-13 ABB Schweiz AG Évaluation des conditions d'équipement et processus industriels
US11250576B2 (en) * 2019-08-19 2022-02-15 Toyota Research Institute, Inc. Systems and methods for estimating dynamics of objects using temporal changes encoded in a difference map
US11327483B2 (en) * 2019-09-30 2022-05-10 Irobot Corporation Image capture devices for autonomous mobile robots and related systems and methods
CN110852211A (zh) * 2019-10-29 2020-02-28 北京影谱科技股份有限公司 基于神经网络的slam中障碍物过滤方法和装置
WO2021125510A1 (fr) * 2019-12-20 2021-06-24 Samsung Electronics Co., Ltd. Procédé et dispositif de navigation dans un environnement dynamique
US11880209B2 (en) 2020-05-15 2024-01-23 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
CN111928866B (zh) * 2020-09-27 2021-02-12 上海思岚科技有限公司 一种机器人地图差异更新的方法及设备
KR102556355B1 (ko) * 2020-11-12 2023-07-17 주식회사 유진로봇 동적 환경에 강인한 로봇 위치 추정 장치 및 방법
EP4245474A4 (fr) 2020-11-12 2025-04-23 Yujin Robot Co., Ltd. Système de robot
CN112698345B (zh) * 2020-12-04 2024-01-30 江苏科技大学 一种激光雷达的机器人同时定位与建图优化方法
CN112581613B (zh) * 2020-12-08 2024-11-01 纵目科技(上海)股份有限公司 一种栅格地图的生成方法、系统、电子设备及存储介质
JP7595744B2 (ja) * 2021-03-11 2024-12-06 株式会社Fuji 移動システム及び管理装置
US12092476B2 (en) * 2021-03-15 2024-09-17 Omron Corporation Method and apparatus for updating an environment map used by robots for self-localization
CN112985417B (zh) * 2021-04-19 2021-07-27 长沙万为机器人有限公司 移动机器人粒子滤波定位的位姿校正方法及移动机器人
KR20230017060A (ko) * 2021-07-27 2023-02-03 삼성전자주식회사 로봇 및 그 제어 방법
KR102444672B1 (ko) * 2021-08-27 2022-09-20 (주)뷰런테크놀로지 라이다 센서를 이용하여 동적 맵을 생성하고, 생성된 동적 맵을 이용하여 객체를 판단하는 방법 및 상기 방법을 수행하는 장치
US12336675B2 (en) 2022-04-11 2025-06-24 Vorwerk & Co. Interholding Gmb Obstacle avoidance using fused depth and intensity from nnt training
WO2024232795A1 (fr) * 2023-05-11 2024-11-14 Telefonaktiebolaget Lm Ericsson (Publ) Arrêt de génération d'une carte d'un environnement
KR102831650B1 (ko) 2023-06-13 2025-07-07 한국로봇융합연구원 로봇 이동성능을 검증하는 장치 및 방법

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5202661A (en) * 1991-04-18 1993-04-13 The United States Of America As Represented By The Secretary Of The Navy Method and system for fusing data from fixed and mobile security sensors
US5363305A (en) * 1990-07-02 1994-11-08 Nec Research Institute, Inc. Navigation system for a mobile robot
US5793934A (en) * 1994-06-22 1998-08-11 Siemens Aktiengesellschaft Method for the orientation, route planning and control of an autonomous mobile unit
US5957984A (en) * 1994-09-06 1999-09-28 Siemens Aktiengesellschaft Method of determining the position of a landmark in the environment map of a self-propelled unit, the distance of the landmark from the unit being determined dynamically by the latter
US20040076324A1 (en) * 2002-08-16 2004-04-22 Burl Michael Christopher Systems and methods for the automated sensing of motion in a mobile robot using visual data
US20040167669A1 (en) * 2002-12-17 2004-08-26 Karlsson L. Niklas Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
US20050171637A1 (en) * 2004-01-30 2005-08-04 Funai Electric Co., Ltd. Self-running cleaner with collision obviation capability
US20060235585A1 (en) * 2005-04-18 2006-10-19 Funai Electric Co., Ltd. Self-guided cleaning robot
US20070061043A1 (en) * 2005-09-02 2007-03-15 Vladimir Ermakov Localization and mapping system and method for a robotic device
US20080027591A1 (en) * 2006-07-14 2008-01-31 Scott Lenser Method and system for controlling a remote vehicle
US20080109126A1 (en) * 2006-03-17 2008-05-08 Irobot Corporation Lawn Care Robot
US20100049391A1 (en) * 2008-08-25 2010-02-25 Murata Machinery, Ltd. Autonomous moving apparatus

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0126497D0 (en) * 2001-11-03 2002-01-02 Dyson Ltd An autonomous machine
JP2006239844A (ja) * 2005-03-04 2006-09-14 Sony Corp 障害物回避装置、障害物回避方法及び障害物回避プログラム並びに移動型ロボット装置
JP2007041657A (ja) * 2005-07-29 2007-02-15 Sony Corp 移動体制御方法および移動体
KR100843085B1 (ko) * 2006-06-20 2008-07-02 삼성전자주식회사 이동 로봇의 격자지도 작성 방법 및 장치와 이를 이용한영역 분리 방법 및 장치
US7801644B2 (en) * 2006-07-05 2010-09-21 Battelle Energy Alliance, Llc Generic robot architecture
US7587260B2 (en) * 2006-07-05 2009-09-08 Battelle Energy Alliance, Llc Autonomous navigation system and method
US8996172B2 (en) * 2006-09-01 2015-03-31 Neato Robotics, Inc. Distance sensor system and method
CN100449444C (zh) * 2006-09-29 2009-01-07 浙江大学 移动机器人在未知环境中同时定位与地图构建的方法
US7613673B2 (en) * 2006-10-18 2009-11-03 The Boeing Company Iterative particle reduction methods and systems for localization and pattern recognition
KR100809352B1 (ko) * 2006-11-16 2008-03-05 삼성전자주식회사 파티클 필터 기반의 이동 로봇의 자세 추정 방법 및 장치
JP2009169845A (ja) * 2008-01-18 2009-07-30 Toyota Motor Corp 自律移動ロボット及び地図更新方法
KR101538775B1 (ko) * 2008-09-12 2015-07-30 삼성전자 주식회사 전방 영상을 이용한 위치 인식 장치 및 방법
KR101503903B1 (ko) * 2008-09-16 2015-03-19 삼성전자 주식회사 이동 로봇의 지도 구성 장치 및 방법

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5363305A (en) * 1990-07-02 1994-11-08 Nec Research Institute, Inc. Navigation system for a mobile robot
US5202661A (en) * 1991-04-18 1993-04-13 The United States Of America As Represented By The Secretary Of The Navy Method and system for fusing data from fixed and mobile security sensors
US5793934A (en) * 1994-06-22 1998-08-11 Siemens Aktiengesellschaft Method for the orientation, route planning and control of an autonomous mobile unit
US5957984A (en) * 1994-09-06 1999-09-28 Siemens Aktiengesellschaft Method of determining the position of a landmark in the environment map of a self-propelled unit, the distance of the landmark from the unit being determined dynamically by the latter
US20040076324A1 (en) * 2002-08-16 2004-04-22 Burl Michael Christopher Systems and methods for the automated sensing of motion in a mobile robot using visual data
US20040167669A1 (en) * 2002-12-17 2004-08-26 Karlsson L. Niklas Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
US20050171637A1 (en) * 2004-01-30 2005-08-04 Funai Electric Co., Ltd. Self-running cleaner with collision obviation capability
US20060235585A1 (en) * 2005-04-18 2006-10-19 Funai Electric Co., Ltd. Self-guided cleaning robot
US20070061043A1 (en) * 2005-09-02 2007-03-15 Vladimir Ermakov Localization and mapping system and method for a robotic device
US20080109126A1 (en) * 2006-03-17 2008-05-08 Irobot Corporation Lawn Care Robot
US20080027591A1 (en) * 2006-07-14 2008-01-31 Scott Lenser Method and system for controlling a remote vehicle
US20100049391A1 (en) * 2008-08-25 2010-02-25 Murata Machinery, Ltd. Autonomous moving apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Sheng Fu et al, "SLAM for Mobile Robots Using Laser Range Finder and Monocular Vision", IEEE, 2007 *

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8209144B1 (en) * 2009-09-15 2012-06-26 Google Inc. Accurate alignment of multiple laser scans using a template surface
US8209143B1 (en) * 2009-09-15 2012-06-26 Google Inc. Accurate alignment of multiple laser scans using a template surface
US20120195491A1 (en) * 2010-07-21 2012-08-02 Palo Alto Research Center Incorporated System And Method For Real-Time Mapping Of An Indoor Environment Using Mobile Robots With Limited Sensing
US9952053B2 (en) * 2011-09-30 2018-04-24 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US20150261223A1 (en) * 2011-09-30 2015-09-17 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9218003B2 (en) * 2011-09-30 2015-12-22 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US20160069691A1 (en) * 2011-09-30 2016-03-10 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9404756B2 (en) * 2011-09-30 2016-08-02 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US10962376B2 (en) 2011-09-30 2021-03-30 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US20170052033A1 (en) * 2011-09-30 2017-02-23 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
KR101807484B1 (ko) 2012-10-29 2017-12-11 한국전자통신연구원 객체 및 시스템 특성에 기반한 확률 분포 지도 작성 장치 및 그 방법
US9361591B2 (en) 2012-10-29 2016-06-07 Electronics And Telecommunications Research Institute Apparatus and method for building map of probability distribution based on properties of object and system
US9677890B2 (en) 2013-01-10 2017-06-13 Intel Corporation Positioning and mapping based on virtual landmarks
WO2014110204A1 (fr) * 2013-01-10 2014-07-17 Intel Corporation Positionnement et cartographie basés sur des points de repère virtuels
US20140350839A1 (en) * 2013-05-23 2014-11-27 Irobot Corporation Simultaneous Localization And Mapping For A Mobile Robot
US9037396B2 (en) * 2013-05-23 2015-05-19 Irobot Corporation Simultaneous localization and mapping for a mobile robot
US9304001B2 (en) 2013-07-03 2016-04-05 Samsung Electronics Co., Ltd Position recognition methods of autonomous mobile robots
US9566706B2 (en) 2014-01-14 2017-02-14 Samsung Electronics Co., Ltd. Robot and control method thereof
US9827994B2 (en) * 2015-06-25 2017-11-28 Hyundai Motor Company System and method for writing occupancy grid map of sensor centered coordinate system using laser scanner
US10549430B2 (en) * 2015-08-28 2020-02-04 Panasonic Intellectual Property Corporation Of America Mapping method, localization method, robot system, and robot
WO2017116492A1 (fr) * 2015-12-31 2017-07-06 Olney Guy Procédé permettant d'intégrer des flux parallèles de données de capteurs associées générant des réponses d'essai sans connaissance préalable de la signification des données ou de l'environnement détecté
DE102016203547A1 (de) 2016-03-03 2017-09-07 Kuka Roboter Gmbh Verfahren zum Aktualisieren einer Belegungskarte und autonomes Fahrzeug
WO2017148730A1 (fr) 2016-03-03 2017-09-08 Kuka Roboter Gmbh Procédé de mise à jour d'une carte d'occupation et véhicule autonome
US9864377B2 (en) 2016-04-01 2018-01-09 Locus Robotics Corporation Navigation using planned robot travel paths
US10310511B2 (en) 2016-04-20 2019-06-04 Toyota Jidosha Kabushiki Kaisha Automatic driving control system of mobile object
WO2018094272A1 (fr) * 2016-11-18 2018-05-24 Robert Bosch Start-Up Platform North America, LLC, Series 1 Créature robotique et procédé de fonctionnement
US11625870B2 (en) 2017-07-31 2023-04-11 Oxford University Innovation Limited Method of constructing a model of the motion of a mobile device and related systems
US10386851B2 (en) 2017-09-22 2019-08-20 Locus Robotics Corp. Multi-resolution scan matching with exclusion zones
US10429847B2 (en) 2017-09-22 2019-10-01 Locus Robotics Corp. Dynamic window approach using optimal reciprocal collision avoidance cost-critic
US10365656B2 (en) 2017-11-22 2019-07-30 Locus Robotics Corp. Robot charger docking localization
US10761539B2 (en) 2017-11-22 2020-09-01 Locus Robotics Corp. Robot charger docking control
US11320828B1 (en) 2018-03-08 2022-05-03 AI Incorporated Robotic cleaner
US11254002B1 (en) 2018-03-19 2022-02-22 AI Incorporated Autonomous robotic device
US11454981B1 (en) 2018-04-20 2022-09-27 AI Incorporated Versatile mobile robotic device
US20190329407A1 (en) * 2018-04-30 2019-10-31 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for multimodal mapping and localization
US10807236B2 (en) * 2018-04-30 2020-10-20 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for multimodal mapping and localization
US11340079B1 (en) 2018-05-21 2022-05-24 AI Incorporated Simultaneous collaboration, localization, and mapping
US11548159B1 (en) * 2018-05-31 2023-01-10 AI Incorporated Modular robot
US12468304B1 (en) 2018-07-11 2025-11-11 AI Incorporated Versatile mobile platform
US11199853B1 (en) 2018-07-11 2021-12-14 AI Incorporated Versatile mobile platform
US20200050205A1 (en) * 2018-08-07 2020-02-13 Cnh Industrial America Llc System and method for updating a mapped area
US10891769B2 (en) * 2019-02-14 2021-01-12 Faro Technologies, Inc System and method of scanning two dimensional floorplans using multiple scanners concurrently
US20200265621A1 (en) * 2019-02-14 2020-08-20 Faro Technologies, Inc. System and method of scanning two dimensional floorplans using multiple scanners concurrently
CN109900267A (zh) * 2019-04-12 2019-06-18 哈尔滨理工大学 一种基于slam的移动机器人地图创建与自主探索系统
CN110174894A (zh) * 2019-05-27 2019-08-27 小狗电器互联网科技(北京)股份有限公司 机器人及其重定位方法
TWI736960B (zh) * 2019-08-28 2021-08-21 財團法人車輛研究測試中心 同步定位與建圖優化方法
US12039674B2 (en) 2020-09-18 2024-07-16 Apple Inc. Inertial data management for extended reality for moving platforms
US12406442B2 (en) 2020-09-18 2025-09-02 Apple Inc. Extended reality for moving platforms
US12535822B2 (en) * 2020-11-06 2026-01-27 Ubtech Robotics Corp Ltd Mapping method, computer-readable storage medium, and robot
US20220269273A1 (en) * 2021-02-23 2022-08-25 Hyundai Motor Company Apparatus for estimating position of target, robot system having the same, and method thereof
US12314053B2 (en) * 2021-02-23 2025-05-27 Hyundai Motor Company Apparatus for estimating position of target, robot system having the same, and method thereof
CN113478480A (zh) * 2021-06-22 2021-10-08 中建三局集团有限公司 一种横折臂布料机的轨迹规划方法
US11944876B2 (en) * 2022-05-30 2024-04-02 Tennibot Inc. Autonomous tennis assistant systems
DE102023204536A1 (de) 2023-05-15 2024-11-21 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Lokalisierung eines mobilen Geräts

Also Published As

Publication number Publication date
CN102576228A (zh) 2012-07-11
JP5837553B2 (ja) 2015-12-24
AU2010286429A1 (en) 2012-04-05
NZ598500A (en) 2013-11-29
JP2013503404A (ja) 2013-01-31
CA2772636A1 (fr) 2011-03-03
EP2473890B1 (fr) 2014-03-12
EP2473890A2 (fr) 2012-07-11
US9678509B2 (en) 2017-06-13
US20150105964A1 (en) 2015-04-16
KR101362961B1 (ko) 2014-02-12
CN104699099B (zh) 2018-03-23
HK1211352A1 (en) 2016-05-20
CA2859112C (fr) 2017-08-15
WO2011026119A2 (fr) 2011-03-03
JP2014078254A (ja) 2014-05-01
US20170255203A1 (en) 2017-09-07
WO2011026119A3 (fr) 2011-06-16
US8903589B2 (en) 2014-12-02
CA2859112A1 (fr) 2011-03-03
CN104699099A (zh) 2015-06-10
AU2010286429B2 (en) 2013-11-28
KR20120043096A (ko) 2012-05-03
US20140058610A1 (en) 2014-02-27

Similar Documents

Publication Publication Date Title
US8903589B2 (en) Method and apparatus for simultaneous localization and mapping of mobile robot environment
JP2014078254A5 (fr)
CN111708047B (zh) 机器人定位评估方法、机器人及计算机存储介质
CN105247431B (zh) 自主移动体
JP2018526748A (ja) 自律移動ロボットと自律移動ロボットの基地局とを有するシステム、自律移動ロボットの基地局、自律移動ロボットのための方法、自律移動ロボットの基地局への自動ドッキング方法
KR20220000328A (ko) 레이저 반사 강도를 이용한 공간 구조 정보 기반 이동 로봇의 위치 인식 장치 및 방법
EP3865910B1 (fr) Système et procédé de correction d'erreurs d'orientation
CN112415532B (zh) 灰尘检测方法、距离检测装置以及电子设备
CN111671360B (zh) 一种扫地机器人位置的计算方法、装置及扫地机器人
Batavia et al. Obstacle detection in smooth high curvature terrain
CN112344966B (zh) 一种定位失效检测方法、装置、存储介质及电子设备
US20250044453A1 (en) 3d sub-grid map-based robot pose estimation method and robot using the same
HK1172961A (en) Method and apparatus for simultaneous localization and mapping of mobile robot environment
JP2024158196A (ja) 点検経路設定システム及び点検経路設定方法
CN115164882B (zh) 一种激光畸变去除方法、装置、系统及可读存储介质
CN118960713A (zh) 用于确定移动设备的位置和定向的方法
CN118962666A (zh) 用于定位移动设备的方法
CN121482140A (zh) 针对清洁机器人的吸扒脱落检测方法、电子设备、介质
CN119579930A (zh) 位姿丢失检测方法、存储介质、电子设备及程序产品
JP2023128778A (ja) 情報処理装置、制御方法、プログラム及び記憶媒体
CN121207289A (zh) 吸奶器的奶量检测方法、吸奶器、介质以及产品
CN117539234A (zh) 一种基于路标的定位成功判断方法、芯片及机器人

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEATO ROBOTICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SOFMAN, BORIS;ERMAKOV, VLADIMIR;EMMERICH, MARK;AND OTHERS;SIGNING DATES FROM 20101120 TO 20101214;REEL/FRAME:025696/0116

AS Assignment

Owner name: SQUARE 1 BANK, NORTH CAROLINA

Free format text: SECURITY AGREEMENT;ASSIGNOR:NEATO ROBOTICS, INC.;REEL/FRAME:032382/0669

Effective date: 20120824

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: NEATO ROBOTICS, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:SQUARE 1 BANK;REEL/FRAME:034905/0429

Effective date: 20150206