US20130338831A1 - Robot cleaner and controlling method of the same - Google Patents
- Publication number
- US20130338831A1 (Application No. US 13/920,612)
- Authority
- US
- United States
- Prior art keywords
- pattern
- robot cleaner
- obstacle
- image
- optical
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Description
- This application claims priority under 35 U.S.C. §119 to Korean Application No. 10-2012-0065153 filed on Jun. 18, 2012, whose entire disclosure is hereby incorporated by reference.
- This relates to a robot cleaner, and particularly, to a robot cleaner capable of detecting an obstacle.
- Robots have been developed for industrial use, and may manage some parts of factory automation. Robots may be applied to various fields, such as medical robots, space robots, home robots, and others.
- A home robot may perform a cleaning operation by sucking up dust or foreign materials while moving within a predetermined region.
- Such a robot cleaner may include a chargeable battery, and an obstacle sensor for avoiding obstacles while moving to perform a cleaning operation autonomously.
- FIG. 1 is a partial perspective view of a robot cleaner according to embodiments as broadly described herein;
- FIG. 2 is a block diagram of a robot cleaner according to an embodiment as broadly described herein;
- FIGS. 3A to 3C illustrate operation of a robot cleaner based on an obstacle detection result;
- FIG. 4 is a block diagram of a robot cleaner according to another embodiment as broadly described herein;
- FIG. 5 is a perspective view of a robot cleaner according to an embodiment as broadly described herein;
- FIG. 6 is a block diagram of the robot cleaner shown in FIG. 5 ;
- FIGS. 7 and 8 are flowcharts of a method for controlling a robot cleaner according to embodiments as broadly described herein.
- Referring to FIGS. 1 and 2, a robot cleaner 1 according to an embodiment may include a body 10 which forms an external appearance of the robot cleaner, an optical pattern sensor 100, and a controller 200.
- the optical pattern sensor 100 is provided on a front surface of the body 10 , and is configured to irradiate an optical pattern and to output a pattern image.
- the optical pattern sensor 100 may include an optical source module 110 configured to irradiate one or more cross-shaped optical patterns forward from a front side of the body 10 , and a camera module 120 configured to capture a pattern image on the optical pattern-irradiated region.
- The optical source module 110 comprises a laser diode (LD), a light emitting diode (LED), or the like.
- The optical pattern sensor 100 may further include additional lighting in addition to or instead of the optical source module 110.
- the camera module 120 may be provided at the optical pattern sensor 100 and include one, two or more cameras.
- the camera module 120 may be a structured light camera, and the optical pattern sensor 100 may include a laser vision sensor.
- the optical pattern sensor 100 may further include a filter connected to a front end of the optical source module, and configured to pass only a prescribed frequency therethrough among optical patterns irradiated from the optical source module.
- The optical pattern may be implemented as a cross-shaped optical pattern, or a combination of a plurality of optical patterns.
- the optical pattern is implemented as an asymmetrical cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- the optical pattern sensor 100 is configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- the optical pattern sensor 100 may be configured to irradiate a cross-shaped optical pattern in which a horizontal length is the same as a vertical length, or a cross-shaped optical pattern in which a horizontal length is shorter than a vertical length.
- the horizontal pattern may be set so that the robot cleaner can scan an obstacle within a wide range.
- the vertical pattern may be set to have a length, based on a value equal to or larger than the height of the robot cleaner. Accordingly, the vertical pattern may be shorter than the horizontal pattern.
- the vertical pattern and the horizontal pattern may be combined with each other in various manners, and a plurality of vertical patterns may be coupled to a single horizontal pattern.
- In a case where an optical pattern is irradiated in a conical shape, a longest part of the optical pattern, e.g., a diameter of a circle or a long axis of an oval (ellipse), serves as the length of a horizontal pattern or a vertical pattern.
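- As an illustration of how a captured pattern image might be separated into the horizontal and vertical components discussed above, the following sketch relies on simple intensity thresholding; the threshold value, stroke half-thickness, and function name are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

def split_cross_pattern(frame, threshold=200):
    """Split a cross-shaped laser pattern into horizontal and vertical strokes.

    Assumes `frame` is a single-channel image in which the irradiated pattern is
    markedly brighter than the background (e.g., after a band-pass filter has
    suppressed ambient light). Returns two boolean masks, one per stroke.
    """
    bright = frame >= threshold                  # candidate pattern pixels
    rows, cols = np.nonzero(bright)
    if rows.size == 0:
        return np.zeros_like(bright), np.zeros_like(bright)

    # The horizontal stroke clusters around one image row, the vertical stroke
    # around one image column; use the medians as the stroke centerlines.
    row_c, col_c = np.median(rows), np.median(cols)
    band = 5                                     # stroke half-thickness in pixels (tuning assumption)

    on_h = np.abs(rows - row_c) <= band
    on_v = np.abs(cols - col_c) <= band
    horizontal = np.zeros_like(bright)
    vertical = np.zeros_like(bright)
    horizontal[rows[on_h], cols[on_h]] = True
    vertical[rows[on_v], cols[on_v]] = True
    return horizontal, vertical
```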
- the controller 200 may include an obstacle recognition module 210 configured to recognize an obstacle by processing a pattern image.
- The obstacle recognition module 210 may recognize whether an obstacle exists or not, or may recognize a width of an obstacle, using a horizontal pattern of a pattern image. For instance, the obstacle recognition module 210 may recognize a width of an obstacle after the robot cleaner consecutively irradiates optical patterns while moving. Here, the obstacle recognition module 210 may recognize the width according to a bending degree of the horizontal pattern, or according to an angle formed by two lines extending from the obstacle recognition module 210 to the two side edges of the horizontal pattern. Alternatively, the obstacle recognition module 210 may recognize a height of an obstacle using a horizontal pattern of a pattern image. For instance, the obstacle recognition module 210 may check a position of the horizontal pattern in a pattern image having no obstacle, and then calculate a moving distance of the horizontal pattern when an obstacle exists, thereby recognizing the height of the obstacle.
- When using only a horizontal pattern, there may be some limitations in recognizing a height of an obstacle, and the obstacle may be erroneously recognized. Accordingly, the obstacle recognition module 210 may be adapted to precisely recognize a height of an obstacle using a vertical pattern, or using both a vertical pattern and a horizontal pattern.
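- A minimal sketch of how the displacement of the pattern in the image can be turned into distance and height, assuming a pinhole camera model and a pre-calibrated light-plane equation for the stroke being analyzed; all parameter names and conventions below are illustrative, not taken from the disclosure.

```python
import numpy as np

def pattern_pixels_to_points(pixels, fx, fy, cx, cy, plane_n, plane_d):
    """Back-project lit pattern pixels onto the calibrated laser light plane.

    pixels           : (N, 2) array of (u, v) image coordinates of pattern pixels
    fx, fy, cx, cy   : pinhole intrinsics of the camera module
    plane_n, plane_d : light plane in camera coordinates, n . X = d
    Returns (N, 3) points in the camera frame (X right, Y down, Z forward).
    """
    u = pixels[:, 0].astype(float)
    v = pixels[:, 1].astype(float)
    rays = np.stack([(u - cx) / fx, (v - cy) / fy, np.ones_like(u)], axis=1)
    t = plane_d / (rays @ plane_n)       # ray parameter at the plane intersection
    return rays * t[:, None]

def obstacle_from_points(points, camera_height):
    """Estimate distance to and height of the nearest lit surface.

    camera_height: mounting height of the camera above the floor, in meters.
    Height above the floor follows the convention that +Y points down.
    """
    distance = points[:, 2].min()                   # nearest depth along the optical axis
    height = (camera_height - points[:, 1]).max()   # highest lit point above the floor
    return distance, height
```

- With such a reconstruction, pattern points that rise above the floor plane indicate an obstacle; their horizontal spread gives its width and their maximum elevation its height, matching the width and height recognition described above.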
- FIG. 3A illustrates a case in which the obstacle is implemented as a chair having legs of a height greater than a prescribed value.
- The robot cleaner may move while avoiding the legs, and may pass beneath the chair because the legs have at least a prescribed height.
- FIG. 3B illustrates a case in which the obstacle is a low threshold.
- the robot cleaner irradiates optical patterns forward to recognize a threshold. Then the robot cleaner passes through the threshold if it determines that the threshold is passable.
- FIG. 3C illustrates a case in which the obstacle is a bed frame.
- the robot cleaner recognizes a bed frame based on a pattern image obtained after irradiating optical patterns. If it is determined that the bed frame is too low to pass under, the robot cleaner detours around the bed frame. Accordingly, the robot cleaner may avoid being caught (trapped) in an obstacle such as a bed, furniture, or an electronic product, each having too small a gap for the robot cleaner to pass through.
- the robot cleaner may further include an image detector configured to capture a peripheral image and to output image information.
- the controller 200 further includes a position recognition module configured to recognize a position of the robot cleaner using image information output from the image detector.
- the controller 200 may further comprise a map creation module configured to create a peripheral map using a recognized position of the robot cleaner.
- FIGS. 5 and 6 are views illustrating a robot cleaner according to an embodiment as broadly described herein.
- the configuration of the robot cleaner may be applied to another robot cleaner.
- the robot cleaner further includes an image detector 400 configured to capture a peripheral image and to output image information, besides the optical pattern sensor 100 and the controller 200 .
- the image detector 400 is provided with a camera sensor installed toward the upper side or the front side. If the image detector 400 is provided with a plurality of camera sensors, the camera sensors may be formed on an upper surface or a side surface of the robot cleaner, at constant intervals or at constant angles. Referring to FIG. 5 , a single camera sensor is installed toward the front side.
- The image detector 400 may further include a lens connected to the camera sensor for focusing the camera on a subject, a camera controller configured to control the camera sensor, and a lens controller configured to control the lens.
- A lens having a wide view angle is preferably used, so that all peripheral regions, e.g., all regions of the ceiling, may be captured from a predetermined position.
- a position recognition module 220 may be configured to extract feature points from image information captured by the image detector 400 , and to recognize a position of the robot cleaner based on the feature points.
- A map creation module 230 may be configured to create a map with respect to a cleaning region using the position of the robot cleaner recognized by the position recognition module 220. The map creation module 230 may update or compensate for a created peripheral map by reflecting an obstacle recognized by the optical pattern sensor in the peripheral map.
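- The position recognition and map update steps might be sketched as follows; the rigid 2D fit over already-matched feature points and the occupancy-grid convention (cell size, origin at a grid corner) are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np

def estimate_pose_delta(prev_pts, curr_pts):
    """Estimate the 2D rotation and translation that maps feature points from
    the previous peripheral image onto the current one (least-squares rigid
    fit), as a proxy for the robot's motion between the two images."""
    p0 = prev_pts - prev_pts.mean(axis=0)
    p1 = curr_pts - curr_pts.mean(axis=0)
    h = p0.T @ p1
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:             # guard against a reflection solution
        vt[1, :] *= -1
        r = vt.T @ u.T
    t = curr_pts.mean(axis=0) - r @ prev_pts.mean(axis=0)
    theta = np.arctan2(r[1, 0], r[0, 0])
    return theta, t

def mark_obstacles(grid, robot_xy, robot_theta, obstacle_points, cell=0.05):
    """Reflect obstacle points (robot frame, meters) into a peripheral grid map.
    Assumes world coordinates are non-negative with the origin at grid[0, 0]."""
    c, s = np.cos(robot_theta), np.sin(robot_theta)
    rot = np.array([[c, -s], [s, c]])
    world = obstacle_points @ rot.T + robot_xy
    ij = np.floor(world / cell).astype(int)
    ij = ij[(ij >= 0).all(axis=1) & (ij[:, 0] < grid.shape[0]) & (ij[:, 1] < grid.shape[1])]
    grid[ij[:, 0], ij[:, 1]] = 1          # mark cells as occupied
    return grid
```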
- the robot cleaner irradiates optical patterns while moving or in a stopped state (S 110 ).
- the optical pattern may be implemented as an asymmetric cross-shaped optical pattern.
- the robot cleaner captures a pattern image with respect to the optical pattern-irradiated region, thereby acquiring the pattern image (S 120 ).
- the robot cleaner recognizes an obstacle using the pattern image (S 130 ).
- The robot cleaner recognizes an obstacle by processing the pattern image using a controller such as a microcomputer.
- the optical pattern sensor itself may be configured to recognize an obstacle. For instance, the robot cleaner may recognize whether an obstacle exists or not, a width of an obstacle, a height of an obstacle, etc. based on a horizontal pattern.
- the robot cleaner may detect a precise height of an obstacle based on a vertical pattern.
- the robot cleaner may capture a peripheral image (S 140 ). Then the robot cleaner may extract feature points from the peripheral image, thereby recognizing its position based on the feature points (S 150 ). Further, the robot cleaner may create a peripheral map based on the recognized position (S 160 ).
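- The flow of FIG. 7 can be summarized as one control step; the collaborating objects and method names below are placeholders for the modules described above, not part of the disclosure.

```python
def cleaning_step(sensor, detector, recognizer, localizer, mapper):
    """One pass of the control flow of FIG. 7 (S110 to S160)."""
    sensor.irradiate_pattern()                            # S110: irradiate the cross-shaped pattern
    pattern_image = sensor.capture_pattern()              # S120: acquire the pattern image
    obstacle = recognizer.recognize(pattern_image)        # S130: recognize an obstacle

    peripheral_image = detector.capture()                 # S140: capture a peripheral image
    pose = localizer.recognize_position(peripheral_image) # S150: recognize the position from feature points
    mapper.update(pose, obstacle)                         # S160: create or update the peripheral map
    return obstacle, pose
```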
- a robot cleaner may include a body 10 which forms an appearance of the robot cleaner, a driver 300 , an optical pattern sensor 100 , and a controller 200 .
- the optical pattern sensor 100 is provided on a front surface of the body 10 , and is configured to irradiate one or more cross-shaped optical patterns forward from the front side of the body 10 to thus output a pattern image.
- the optical pattern sensor 100 includes an optical source module 110 configured to irradiate one or more cross-shaped optical patterns forward from a front side of the body, and a camera module 120 configured to capture the pattern image on the optical pattern-irradiated region.
- The optical pattern sensor 100 may further include an image processing module 130 configured to detect an obstacle by processing a pattern image. That is, obstacle detection may be performed either by the obstacle recognition module 210 included in the controller 200, or by the image processing module 130 included in the optical pattern sensor 100.
- the optical pattern may be implemented as a cross-shaped optical pattern, or combination of a plurality of optical patterns.
- the optical pattern is implemented as an asymmetrical cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- the optical pattern sensor 100 is configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- the image processing module 130 detects an obstacle by processing an image acquired by one or more camera modules.
- the image processing module 130 may detect an obstacle using a shape, an area, a change, etc. of an irradiated optical pattern, from an image.
- The image processing module 130 detects a size, a width, a height, etc. of an obstacle from a pattern image including a horizontal pattern and a vertical pattern.
- As another example, the image processing module 130 may extract one pattern component in a prescribed direction (e.g., the X-direction) from an image captured by the camera module 120, convert (e.g., rotate) the captured image to another direction, and extract another pattern component in the prescribed direction (e.g., the X-direction), thereby detecting an obstacle.
- When using two camera modules, the image processing module 130 may extract only a vertical component from an image captured by one camera module, and may extract only a horizontal component from an image captured by another camera module. Then the image processing module 130 may create a three-dimensional (3D) pattern, and may detect an obstacle based on the 3D pattern, thereby outputting obstacle information, such as a size and a shape of the obstacle, to the controller 200.
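- A sketch of the "extract one component, convert the image to another direction, extract again" approach described above, using a per-column brightest-pixel search; the threshold and helper names are illustrative assumptions.

```python
import numpy as np

def extract_line(image, threshold=200):
    """For each column, return the row of the brightest pixel if it exceeds the
    threshold, else -1; this traces a roughly horizontal pattern component."""
    rows = image.argmax(axis=0)
    peaks = image[rows, np.arange(image.shape[1])]
    return np.where(peaks >= threshold, rows, -1)

def extract_cross_components(image):
    """Extract the horizontal component directly and the vertical component by
    rotating the image 90 degrees and reusing the same extractor."""
    horizontal = extract_line(image)
    vertical = extract_line(np.rot90(image))
    return horizontal, vertical
```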
- the optical pattern sensor may further comprise a filter connected to a front end of the optical source module 110 , and configured to pass only a prescribed frequency therethrough among optical patterns irradiated from the optical source module 110 .
- the controller 200 recognizes an obstacle from a pattern image, and generates a driving signal based on a recognition result.
- the controller 200 may further include an obstacle recognition module 210 configured to recognize an obstacle by processing a pattern image. Alternatively, the controller 200 may generate a driving signal by receiving information about an obstacle detected by the image processing module 130 .
- the robot cleaner further includes an image detector configured to output image information by capturing a peripheral image.
- the controller 200 further includes a position recognition module configured to recognize a position of the robot cleaner based on image information output from the image detector.
- the controller 200 may further include a map creation module configured to create a peripheral map using the recognized position of the robot cleaner.
- The image detector of FIG. 5 or FIG. 6 will not be explained again, as it was described in the previous embodiment.
- the driver 300 is provided with a wheel motor for driving one or more wheels installed below the body 10 , and is configured to move the body according to a driving signal.
- The robot cleaner is provided with right and left main wheels 310 at two lower portions thereof. A handgrip may be installed at two side surfaces of the wheels, to facilitate a user's grasp.
- The wheel motors are respectively connected to the main wheels 310 to rotate the main wheels 310, and the main wheels 310 can be rotated independently in two directions.
- the robot cleaner is provided with one or more auxiliary wheels on the rear surface thereof.
- the auxiliary wheels serve to support the body of the robot cleaner, to minimize friction between a lower surface of the body and the floor, and to allow the robot cleaner to smoothly move.
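- A driving signal expressed as a forward speed and a turn rate could be mapped to the two independently driven main wheels as sketched below; the wheel radius and spacing are illustrative values, not taken from the disclosure.

```python
def wheel_speeds(linear_v, angular_w, wheel_radius=0.035, wheel_base=0.23):
    """Convert a driving signal (forward speed in m/s, turn rate in rad/s) into
    left/right wheel angular speeds (rad/s) for the two main wheels, using
    standard differential-drive kinematics."""
    v_left = linear_v - angular_w * wheel_base / 2.0
    v_right = linear_v + angular_w * wheel_base / 2.0
    return v_left / wheel_radius, v_right / wheel_radius
```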
- As shown in FIGS. 3A to 3C, the controller 200 generates a driving signal based on a recognition result with respect to an obstacle, and the driver 300 moves the body according to the driving signal.
- FIG. 3A illustrates a case where the obstacle is a chair having legs of a height greater than a prescribed value. The robot cleaner can move while avoiding the legs, and can pass beneath the chair because the legs have a prescribed height or greater.
- FIG. 3B illustrates a case where the obstacle is a low threshold. The robot cleaner irradiates optical patterns forward from the front side, to thus recognize the threshold. Then the robot cleaner passes through the threshold if it determines that the threshold is passable.
- FIG. 3C illustrates a case where the obstacle is a bed frame.
- The robot cleaner recognizes the bed frame based on a pattern image obtained after irradiating optical patterns. If it is determined that the bed frame is too low to pass under, the robot cleaner detours around the bed frame. Accordingly, the robot cleaner may avoid being caught (trapped) by an obstacle such as a bed, furniture, or an electronic product, each having a gap too small for the robot cleaner to pass through.
- FIGS. 5 and 6 are views illustrating a robot cleaner according to an embodiment as broadly described herein.
- the configuration of the robot cleaner may be applied to another robot cleaner.
- the robot cleaner may further include an obstacle detector 700 configured to detect a peripheral obstacle, in addition to the optical pattern sensor.
- the obstacle detector 700 includes first sensors 710 installed on an outer circumferential surface of the robot cleaner at constant intervals.
- the obstacle detector 700 may also include second sensors protruding outward from the body. Positions and types of the first sensors and the second sensors may be variable according to a type of the robot cleaner, and the obstacle detector may include various types of sensors.
- the first sensors 710 are configured to detect an object which exists in a moving direction of the robot cleaner, i.e. an obstacle, and then transmit obstacle information to the controller 200 . That is, the first sensors detect protrusions, appliances, furniture, wall surfaces, wall corners, etc. which exist on a moving path of the robot cleaner, and then transmit obstacle information to the controller 200 .
- the first sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, etc.
- The second sensors detect an obstacle which exists at the front or lateral side, and transmit obstacle information to the controller 200. That is, the second sensors detect protrusions, appliances, furniture, wall surfaces, wall corners, etc. which exist on a moving path of the robot cleaner, and then transmit obstacle information to the controller 200.
- the second sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, an RF sensor, a position sensitive device (PSD) sensor, etc.
- the obstacle detector 700 may further include a cliff sensor installed at the bottom surface of the body, and configured to detect an obstacle which is on the floor, e.g., a cliff.
- the cliff sensor is configured to obtain stable measurement values regardless of reflectivity of the floor and a color difference.
- the cliff sensor may be implemented in the form of an infrared ray module.
- the obstacle detector 700 may further include a charging signal sensor for receiving a guide signal transmitted from a charging station.
- The robot cleaner receives a guide signal transmitted from the charging station using the charging signal sensor, thereby checking a position and a direction of the charging station.
- the charging station creates a guide signal indicating a direction and a distance so that the robot cleaner can return to the charging station.
- the robot cleaner receives the guide signal transmitted from the charging station, and determines the current position and sets a moving direction. Then the robot cleaner returns to the charging station.
- The charging signal sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, a radio frequency (RF) sensor, etc., and is generally implemented as an infrared ray sensor.
- the robot cleaner may further include wheel sensors connected to the right and left main wheels 310 , and sensing RPMs of the right and left main wheels 310 .
- the wheel sensors may be implemented as a rotary encoder. When the robot cleaner moves in a running mode or a cleaning mode, the rotary encoder senses RPMs of the right and left main wheels 310 , and outputs the sensed RPMs.
- the controller 200 may calculate rotation speeds of the right and left main wheels 310 based on the sensed RPMs.
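- From the RPMs reported by the wheel sensors, the controller could integrate the pose of the robot cleaner as sketched below (standard differential-drive odometry); the wheel geometry values are illustrative assumptions.

```python
import math

def update_odometry(x, y, theta, rpm_left, rpm_right, dt,
                    wheel_radius=0.035, wheel_base=0.23):
    """Integrate the robot pose from the RPMs sensed by the wheel sensors
    (rotary encoders) over a time step dt (seconds)."""
    v_left = rpm_left * 2.0 * math.pi / 60.0 * wheel_radius    # left wheel speed, m/s
    v_right = rpm_right * 2.0 * math.pi / 60.0 * wheel_radius  # right wheel speed, m/s
    v = (v_left + v_right) / 2.0                               # body forward speed
    w = (v_right - v_left) / wheel_base                        # body turn rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta
```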
- the position recognition module 220 may recognize a position of the robot cleaner based on information about an obstacle detected by the obstacle detector 700 . Also, the position recognition module 220 may compensate for a recognized position of the robot cleaner, based on an obstacle detected using image information and the optical pattern sensor.
- The map creation module 230 may create a map using information about an obstacle detected by the obstacle detector 700, or may compensate for a created peripheral map.
- the robot cleaner further includes a storage device 500 configured to store therein image information, obstacle information, position information, a peripheral map, etc.
- the storage device 500 is configured to further store therein a cleaning map, a cleaning region, etc.
- the storage device 500 stores therein a control program for controlling the robot cleaner, and data associated with the control program.
- the storage device 500 may further store therein a cleaning type and a running type.
- As the storage device 500, a non-volatile memory (NVM, NVRAM) is mainly used.
- the NVM indicates a storage device capable of maintaining stored information even if power is not supplied thereto.
- the NVM includes a ROM, a flash memory, a magnetic computer memory device (e.g., a hard disk, a diskette drive, and a magnetic tape), an optical disk drive, a magnetic RAM, a PRAM, etc.
- the robot cleaner further includes a cleaning device 600 configured to draw dust or foreign materials into the robot cleaner.
- the cleaning device 600 includes a dust box configured to store therein collected dust particles, a suction fan configured to provide a driving power to suck dust within a cleaning region, and a suction motor configured to suck air by rotating the suction fan.
- The cleaning device 600 further includes an agitator rotatably mounted to a lower part of the body of the robot cleaner, and a side brush configured to clean a corner or an edge of a wall, etc. while rotating about a vertical shaft of the body.
- The agitator moves dust particles on the floor or a carpet into the air while rotating about a horizontal shaft of the body of the robot cleaner.
- a plurality of blades are provided on an outer circumferential surface of the agitator in a spiral form.
- a brush may be provided between the blades.
- the robot cleaner may further include an input device 810 , an output device 820 and a power device 830 .
- the robot cleaner may further include an input device 810 through which a user directly inputs a control command to the robot cleaner.
- The user may input, through the input device 810, a command instructing output of one or more types of information stored in the storage device 500.
- the input device 810 may be implemented as one or more buttons.
- the input device 810 may include an OK button and a set button.
- the OK button is used to input a command for checking obstacle information, position information, image information, a cleaning region, a cleaning map, etc.
- the set button is used to input a command for setting such information.
- the input device 810 may be provided with a reset button for inputting a command for resetting such information, a deletion button, a cleaning start button, a stop button, etc.
- the input device 810 may be provided with a button for setting reservation information, or a button for deleting reservation information.
- the input device 810 may be further provided with a button for setting a cleaning mode, or a button for changing a cleaning mode.
- the input device 810 may be further provided with a button for inputting a command instructing the robot cleaner to return to a charging station.
- the input device 810 may be installed at an upper part of the robot cleaner, in the form of hard or soft keys, a touch pad, etc.
- the input device 810 may be implemented in the form of a touch screen together with the output device 820 .
- the output device 820 is installed at an upper part of the robot cleaner.
- an installation position or an installation type may be variable.
- the output device 820 outputs, to a screen, reservation information, a battery state, intensive cleaning, space extension, a cleaning or running operation in a zigzag form, a cleaning operation with respect to a designated region, etc.
- the output device 820 may output the current cleaning state of the robot cleaner, and the current state of each unit of the robot cleaner.
- the output device 820 may display, on the screen, obstacle information, position information, image information, a cleaning map, a cleaning region, a designated region, etc.
- the output device 820 may be implemented as one of a light emitting diode (LED), a liquid crystal display (LCD), a plasma display panel (PDP), and an organic light emitting diode (OLED).
- the power device 830 is provided with a chargeable battery to supply power into the robot cleaner.
- the power device 830 supplies, to each component as appropriate, a driving power and an operation power required when the robot cleaner moves or when the robot cleaner performs a cleaning operation.
- the robot cleaner moves to a charging station to be supplied with a charging current.
- the robot cleaner further comprises a battery sensor configured to sense a charged state of a battery, and to transmit detection information to the controller 200 . As the battery is connected to the battery sensor, the remaining amount and a charged state of the battery are transmitted to the controller 200 .
- the remaining amount of battery power may be displayed on the screen of the output device 820 .
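- A minimal sketch of how the controller might act on the charged state reported by the battery sensor, returning to the charging station when the charge is low; the threshold and the returned action names are illustrative assumptions.

```python
def check_battery(remaining_ratio, cleaning_active, low_threshold=0.2):
    """Decide what the controller should do with the charged state reported by
    the battery sensor; remaining_ratio is the remaining charge from 0.0 to 1.0."""
    if remaining_ratio <= low_threshold:
        return "return_to_charging_station"   # follow the station's guide signal
    return "continue_cleaning" if cleaning_active else "standby"
```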
- the robot cleaner irradiates optical patterns while moving or in a stopped state (S 210 ).
- the optical pattern is preferably implemented as an asymmetric cross-shaped optical pattern.
- the robot cleaner captures a pattern image with respect to the optical pattern-irradiated region, thereby acquiring the pattern image (S 220 ).
- the robot cleaner recognizes an obstacle using the pattern image (S 230 ).
- The robot cleaner recognizes an obstacle by processing the pattern image using a controller such as a microcomputer.
- the optical pattern sensor itself may be configured to recognize an obstacle. For instance, the robot cleaner may recognize whether an obstacle exists or not, a width of an obstacle, a height of an obstacle, etc. based on a horizontal pattern.
- the robot cleaner may detect a precise height of an obstacle based on a vertical pattern.
- The robot cleaner determines whether to pass through an obstacle or not, based on a recognition result with respect to the obstacle (S 240).
- If it is determined that the obstacle is passable, e.g., a low threshold, the robot cleaner passes the obstacle while moving forward (S 241).
- Otherwise, the robot cleaner determines whether to detour around the obstacle or not (S 250).
- As shown in FIG. 3C, in case of an obstacle having a small gap, the robot cleaner detours around the obstacle (S 261). If the robot cleaner can neither move forward nor detour, the robot cleaner may stop or move back (S 263).
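- The decision flow of S 240 to S 263 might be expressed as follows; the obstacle fields and the robot dimensions are illustrative assumptions, not values from the disclosure.

```python
def decide_motion(obstacle, robot_height=0.09, robot_width=0.35, climbable=0.015):
    """Choose among the motions of FIG. 8 (S240 to S263) from a recognized
    obstacle described by the gap height underneath it, its own height, and the
    free width beside it (all in meters, stored in a simple dict)."""
    if obstacle is None:
        return "move_forward"
    if obstacle.get("gap_height", 0.0) > robot_height:   # e.g., chair legs (FIG. 3A)
        return "pass_under"                               # S241
    if obstacle.get("height", 1.0) <= climbable:          # e.g., a low threshold (FIG. 3B)
        return "pass_over"                                 # S241
    if obstacle.get("free_width", 0.0) > robot_width:     # room to go around (FIG. 3C)
        return "detour"                                    # S261
    return "stop_or_move_back"                             # S263
```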
- Such an algorithm may vary according to a user setting or programming, and a learning function may be added according to the specifications of the robot cleaner.
- the robot cleaner may capture a peripheral image (S 270 ), and may extract feature points from the peripheral image to thus recognize its position based on the feature points (S 280 ). Then the robot cleaner may create a peripheral map based on the recognized position (S 290 ).
- an asymmetric cross-shaped optical pattern may be irradiated, and a pattern image with respect to the optical pattern-irradiated region may be analyzed.
- whether an obstacle exists or not may be checked, and a width or a height of an obstacle may be detected.
- a robot cleaner as embodied and broadly described herein may perform operations such as a forward motion, a backward motion, a stopping motion and a detour motion, based on a detection result with respect to an obstacle.
- a robot cleaner is provided that is capable of precisely detecting a peripheral obstacle using a peculiar optical pattern, and a method for controlling the same.
- a robot cleaner is provided that is capable of detecting a width or a height of an obstacle by irradiating an asymmetric cross-shaped optical pattern, and by analyzing a pattern image with respect to the optical pattern-irradiated region, and capable of moving according to a detection result, and a method for controlling the same.
- a robot cleaner as embodied and broadly described herein may include a body which forms an appearance; a cleaning unit including a dust box for storing collected dust, a suction fan for providing a driving force to suck dust inside a cleaning region, and a suction motor for sucking air by rotating the suction fan; an optical pattern sensor provided on a front surface of the body, and configured to irradiate an optical pattern and to output a pattern image; and a control unit configured to recognize an obstacle based on the pattern image, wherein the optical pattern sensor comprises: an optical source module configured to irradiate one or more cross-shaped optical patterns toward a front side of the body; and a camera module configured to capture the pattern image on the optical pattern-irradiated region.
- the optical pattern sensor may be configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- the control unit may include an obstacle recognition module configured to recognize an obstacle by processing the pattern image.
- a robot cleaner may include a body which forms an appearance; a cleaning unit including a dust box for storing collected dust, a suction fan for providing a driving force to suck dust inside a cleaning region, and a suction motor for sucking air by rotating the suction fan; a driving unit provided with a wheel motor for driving one or more wheels installed below the body, and the driving unit configured to move the body according to a driving signal; an optical pattern sensor provided on a front surface of the body, configured to irradiate one or more cross-shaped optical patterns toward a front side of the body, and configured to output a pattern image; and a control unit configured to recognize an obstacle based on the pattern image, and configured to generate the driving signal based on a recognition result.
- a robot cleaner as embodied and broadly described herein may precisely detect a peripheral obstacle using a peculiar optical pattern.
- an asymmetric cross-shaped optical pattern can be irradiated, and a pattern image with respect to the optical pattern-irradiated region can be analyzed. Under such configuration, whether an obstacle exists or not can be checked, and a width or a height of an obstacle can be detected.
- a robot cleaner as embodied and broadly described herein may perform operations such as a forward motion, a backward motion, a stopping motion and a detour motion, based on a detection result with respect to an obstacle. This can enhance stability of the robot cleaner and a user's convenience, and improve a driving efficiency and a cleaning efficiency.
- any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc. means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention.
- the appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Optics & Photonics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Electric Vacuum Cleaner (AREA)
Abstract
Description
- This application claims priority under 35 U.S.C. §119 to Korean Application No. 10-2012-0065153 filed on Jun. 18, 2012, whose entire disclosure is hereby incorporated by reference.
- 1. Field
- This relates to a robot cleaner, and particularly, to a robot cleaner capable of detecting an obstacle.
- 2. Background
- Robots have been developed for industrial use, and may manage some parts of factory automation. Robots may be applied to various fields, such as medical robots, space robots, home robots, and others. A home robot may perform a cleaning operation by sucking dust or foreign materials while moving within a predetermined region. Such a robot cleaner may include a chargeable battery, and an obstacle sensor for avoiding obstacles while moving to perform a cleaning operation autonomously.
- The embodiments will be described in detail with reference to the following drawings in which like reference numerals refer to like elements wherein:
-
FIG. 1 is a partial perspective view of a mobile terminal according to embodiments as broadly described herein; -
FIG. 2 is a block diagram of a robot cleaner according to an embodiment as broadly described herein; -
FIGS. 3A to 3C illustrate operation of a robot cleaner based on an obstacle detection result; -
FIG. 4 is a block diagram of a robot cleaner according to another embodiment as broadly described herein; -
FIG. 5 is a perspective view of a robot cleaner according to an embodiment as broadly described herein; -
FIG. 6 is a block diagram of the robot cleaner shown inFIG. 5 ; and -
FIGS. 7 and 8 are flowcharts of a method for controlling a robot cleaner according to embodiments as broadly described herein. - Description will now be given in detail of the exemplary embodiments, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components will be provided with the same reference numbers, and description thereof will not be repeated.
- Referring to
FIGS. 1 and 2 , arobot cleaner 1 according to an embodiment as broadly described herein may include abody 10 which forms an external appearance of the robot cleaner, anoptical pattern sensor 100, and acontroller 200. - The
optical pattern sensor 100 is provided on a front surface of thebody 10, and is configured to irradiate an optical pattern and to output a pattern image. As shown inFIG. 2 , theoptical pattern sensor 100 may include anoptical source module 110 configured to irradiate one or more cross-shaped optical patterns forward from a front side of thebody 10, and acamera module 120 configured to capture a pattern image on the optical pattern-irradiated region. Theoptical source module 110 comprises a laser diode (LD), a light emitting diode (LED), etc. Theoptical pattern sensor 100 may further include an additional lighting in addition to or instead of theoptical source module 110. Thecamera module 120 may be provided at theoptical pattern sensor 100 and include one, two or more cameras. Thecamera module 120 may be a structured light camera, and theoptical pattern sensor 100 may include a laser vision sensor. Theoptical pattern sensor 100 may further include a filter connected to a front end of the optical source module, and configured to pass only a prescribed frequency therethrough among optical patterns irradiated from the optical source module. - The optical pattern may be implemented as a cross-shaped optical pattern, or combination of a plurality of optical patterns. In certain embodiments, the optical pattern is implemented as an asymmetrical cross-shaped optical pattern in which a horizontal length is longer than a vertical length. More specifically, the
optical pattern sensor 100 is configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length. Alternatively, theoptical pattern sensor 100 may be configured to irradiate a cross-shaped optical pattern in which a horizontal length is the same as a vertical length, or a cross-shaped optical pattern in which a horizontal length is shorter than a vertical length. Here, the horizontal pattern may be set so that the robot cleaner can scan an obstacle within a wide range. The vertical pattern may be set to have a length, based on a value equal to or larger than the height of the robot cleaner. Accordingly, the vertical pattern may be shorter than the horizontal pattern. The vertical pattern and the horizontal pattern may be combined with each other in various manners, and a plurality of vertical patterns may be coupled to a single horizontal pattern. In a case where an optical pattern is irradiated in a conical shape, a longest part of the optical pattern, e.g., a diameter of a circle or a long axis of an oval (ellipse) serves as a length of a horizontal pattern or a vertical pattern. - Referring to
FIG. 2 , thecontroller 200 may include anobstacle recognition module 210 configured to recognize an obstacle by processing a pattern image. - The
obstacle recognition module 210 may recognize whether an obstacle exists or not, or may recognize a width of an obstacle using a horizontal pattern of a pattern image. For instance, theobstacle recognition module 210 may recognize a width of an obstacle after the robot cleaner consecutively irradiates optical patterns while moving. Here, theobstacle recognition module 210 may recognize a width of an obstacle according to a bending degree of a horizontal pattern, or according to an angle formed by two lines and theobstacle recognition module 210, the lines extending from two side edges of a horizontal pattern up to the horizontal pattern. Alternatively, theobstacle recognition module 210 may recognize a height of an obstacle using a horizontal pattern of a pattern image. For instance, theobstacle recognition module 210 may check a position of a horizontal pattern from a pattern image having no obstacle, and then calculate a moving distance of the horizontal pattern when an obstacle exists, thereby recognizing a height of the obstacle. - When using only a horizontal pattern, there may be some limitations in recognizing a height of an obstacle. In this case, the obstacle may be erroneously recognized. Accordingly, the
obstacle recognition module 210 may adapted to precisely recognize a height of an obstacle using a vertical pattern, or using both a vertical pattern and a horizontal pattern. -
FIG. 3A illustrates a case in which the obstacle is implemented as a chair having legs of a height greater than a prescribed value. The robot cleaner may move while avoiding the legs, and may pass through the chair because the legs have a prescribed height.FIG. 3B illustrates a case in which the obstacle is a low threshold. The robot cleaner irradiates optical patterns forward to recognize a threshold. Then the robot cleaner passes through the threshold if it determines that the threshold is passable.FIG. 3C illustrates a case in which the obstacle is a bed frame. The robot cleaner recognizes a bed frame based on a pattern image obtained after irradiating optical patterns. If it is determined that the bed frame is too low to pass under, the robot cleaner detours around the bed frame. Accordingly, the robot cleaner may avoid being caught (trapped) in an obstacle such as a bed, furniture, or an electronic product, each having too small a gap for the robot cleaner to pass through. - The robot cleaner may further include an image detector configured to capture a peripheral image and to output image information. The
controller 200 further includes a position recognition module configured to recognize a position of the robot cleaner using image information output from the image detector. Thecontroller 200 may further comprise a map creation module configured to create a peripheral map using a recognized position of the robot cleaner. -
FIGS. 5 and 6 are views illustrating a robot cleaner according to an embodiment as broadly described herein. The configuration of the robot cleaner may be applied to another robot cleaner. Referring toFIG. 5 orFIG. 6 , the robot cleaner further includes animage detector 400 configured to capture a peripheral image and to output image information, besides theoptical pattern sensor 100 and thecontroller 200. Theimage detector 400 is provided with a camera sensor installed toward the upper side or the front side. If theimage detector 400 is provided with a plurality of camera sensors, the camera sensors may be formed on an upper surface or a side surface of the robot cleaner, at constant intervals or at constant angles. Referring toFIG. 5 , a single camera sensor is installed toward the front side. Theimage detector 400 may further include a lens connected to the camera sensor and focusing a camera on a subject, a camera controller configured to control the camera sensor, and a lens controller configured to control the lens. As the lens, preferably used is a lens having a wide view angle so that all the peripheral regions, e.g., all the regions on the ceiling may be captured at a predetermined position. Aposition recognition module 220 may be configured to extract feature points from image information captured by theimage detector 400, and to recognize a position of the robot cleaner based on the feature points. Amap creation module 230 may be configured to create a map with respect to a cleaning region using the position of the robot cleaner recognized by theposition recognition module 220. Themap creation module 230 may update or compensate for a created peripheral map by reflecting an obstacle recognized by the optical pattern sensor, to the peripheral map. - Referring to
FIG. 7 , the robot cleaner irradiates optical patterns while moving or in a stopped state (S110). As aforementioned, the optical pattern may be implemented as an asymmetric cross-shaped optical pattern. Then the robot cleaner captures a pattern image with respect to the optical pattern-irradiated region, thereby acquiring the pattern image (S120). Then the robot cleaner recognizes an obstacle using the pattern image (S130). Here, the robot cleaner recognizes an obstacle by processing the pattern image using a controller such as a micro computer. Alternatively, the optical pattern sensor itself may be configured to recognize an obstacle. For instance, the robot cleaner may recognize whether an obstacle exists or not, a width of an obstacle, a height of an obstacle, etc. based on a horizontal pattern. Further, the robot cleaner may detect a precise height of an obstacle based on a vertical pattern. The robot cleaner may capture a peripheral image (S140). Then the robot cleaner may extract feature points from the peripheral image, thereby recognizing its position based on the feature points (S150). Further, the robot cleaner may create a peripheral map based on the recognized position (S160). - Referring to
FIG. 4 , a robot cleaner according to another embodiment may include abody 10 which forms an appearance of the robot cleaner, adriver 300, anoptical pattern sensor 100, and acontroller 200. - The
optical pattern sensor 100 is provided on a front surface of thebody 10, and is configured to irradiate one or more cross-shaped optical patterns forward from the front side of thebody 10 to thus output a pattern image. Referring toFIG. 4 , theoptical pattern sensor 100 includes anoptical source module 110 configured to irradiate one or more cross-shaped optical patterns forward from a front side of the body, and acamera module 120 configured to capture the pattern image on the optical pattern-irradiated region. Theoptical pattern sensor 100 further includes animage processing module 130 configured to detect an obstacle by processing a pattern image. That is, theobstacle recognition module 210 may be included in thecontroller 200. Alternatively, theimage processing module 130 may be included in theoptical pattern sensor 100. - The optical pattern may be implemented as a cross-shaped optical pattern, or combination of a plurality of optical patterns. In certain embodiments, the optical pattern is implemented as an asymmetrical cross-shaped optical pattern in which a horizontal length is longer than a vertical length. More specifically, the
optical pattern sensor 100 is configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length. - The
optical source module 110 and thecamera module 120 will not be explained, because they were aforementioned in a previous embodiment. Theimage processing module 130 detects an obstacle by processing an image acquired by one or more camera modules. Theimage processing module 130 may detect an obstacle using a shape, an area, a change, etc. of an irradiated optical pattern, from an image. Theimage processing module 130 detects a size, a width, a height, etc. of an obstacle from a patter image including a horizontal pattern and a vertical pattern. As another example, theimage processing module 130 may extract one pattern component in a prescribed direction (e.g., X-direction) from an image captured by thecamera module 120, then convert the captured image into another direction, and extract another pattern component in the prescribed direction(e.g., X-direction), thereby detecting an obstacle. When using two camera modules, theimage processing module 130 may extract only a vertical component from an image captured by one camera module, and may extract only a horizontal component from an image captured by another camera module. Then theimage processing module 130 may create a three-dimensional (3D) pattern, and may detect an obstacle based on the 3D pattern, thereby outputting obstacle information such as a size and a shape of an obstacle, to thecontroller 200. The optical pattern sensor may further comprise a filter connected to a front end of theoptical source module 110, and configured to pass only a prescribed frequency therethrough among optical patterns irradiated from theoptical source module 110. - The
controller 200 recognizes an obstacle from a pattern image, and generates a driving signal based on a recognition result. Thecontroller 200 may further include anobstacle recognition module 210 configured to recognize an obstacle by processing a pattern image. Alternatively, thecontroller 200 may generate a driving signal by receiving information about an obstacle detected by theimage processing module 130. The robot cleaner further includes an image detector configured to output image information by capturing a peripheral image. Thecontroller 200 further includes a position recognition module configured to recognize a position of the robot cleaner based on image information output from the image detector. Thecontroller 200 may further include a map creation module configured to create a peripheral map using the recognized position of the robot cleaner. The image detector ofFIG. 5 orFIG. 6 will not be explained, because it was aforementioned in a previous embodiment. - The
driver 300 is provided with a wheel motor for driving one or more wheels installed below thebody 10, and is configured to move the body according to a driving signal. The robot cleaner is provided with right and leftmain wheels 310 at two lower portions thereof. A handgrip may be installed at two side surfaces of the wheels, for facilitation of a user's grasp. The wheel motors are respectively connected to themain wheels 310 to thus rotate themain wheels 310, and can be rotated in two directions in an independent manner. And the robot cleaner is provided with one or more auxiliary wheels on the rear surface thereof. The auxiliary wheels serve to support the body of the robot cleaner, to minimize friction between a lower surface of the body and the floor, and to allow the robot cleaner to smoothly move. - As shown in
FIGS. 3A to 3C , thecontroller 200 generates a driving signal based on a recognition result with respect to an obstacle, and thedriver 300 moves the body according to the driving signal.FIG. 3A illustrates a case where the obstacle is a chair having legs of a height more than a prescribed value. The robot cleaner can move while avoiding the legs, and can pass through the chair because the legs have a prescribed height or greater.FIG. 3B illustrates a case where the obstacle is a low threshold. The robot cleaner irradiates optical patterns forward from the front side, to thus recognize the threshold. Then the robot cleaner passes through the threshold if it determines that the threshold is passable.FIG. 3C illustrates a case where the obstacle is a bed frame. The robot cleaner recognizes the bed frame based on a pattern image obtained after irradiating optical patterns. If it is determined that the bed frame is too low to pass through/under, the robot cleaner detours around the bed frame. Accordingly, the robot cleaner may avoid being caught (trapped) in an obstacle such as a bed, furniture, or an electronic product, each having a too small to pass through gap. -
FIGS. 5 and 6 are views illustrating a robot cleaner according to an embodiment as broadly described herein. The configuration of the robot cleaner may be applied to another robot cleaner. Referring toFIG. 5 orFIG. 6 , the robot cleaner may further include anobstacle detector 700 configured to detect a peripheral obstacle, in addition to the optical pattern sensor. - As shown in
FIG. 5 , theobstacle detector 700 includesfirst sensors 710 installed on an outer circumferential surface of the robot cleaner at constant intervals. Theobstacle detector 700 may also include second sensors protruding outward from the body. Positions and types of the first sensors and the second sensors may be variable according to a type of the robot cleaner, and the obstacle detector may include various types of sensors. Thefirst sensors 710 are configured to detect an object which exists in a moving direction of the robot cleaner, i.e. an obstacle, and then transmit obstacle information to thecontroller 200. That is, the first sensors detect protrusions, appliances, furniture, wall surfaces, wall corners, etc. which exist on a moving path of the robot cleaner, and then transmit obstacle information to thecontroller 200. The first sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, etc. The second sensors detect an obstacle which exists at the front or lateral side, and transmits obstacle information to thecontroller 200. That is, the second sensors detect protrusions, appliances, furniture, wall surfaces, wall corners, etc. which exist on a moving path of the robot cleaner, and then transmit obstacle information to thecontroller 200. The second sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, an RF sensor, a position sensitive device (PSD) sensor, etc. - The
obstacle detector 700 may further include a cliff sensor installed at the bottom surface of the body, and configured to detect an obstacle which is on the floor, e.g., a cliff. The cliff sensor is configured to obtain stable measurement values regardless of reflectivity of the floor and a color difference. Like the PSD sensor, the cliff sensor may be implemented in the form of an infrared ray module. - The
obstacle detector 700 may further include a charging signal sensor for receiving a guide signal transmitted from a charging station. The robot cleaner receives a guide signal transmitted from the charging station using the charging signals sensor, thereby checking a position and a direction of the charging station. The charging station creates a guide signal indicating a direction and a distance so that the robot cleaner can return to the charging station. The robot cleaner receives the guide signal transmitted from the charging station, and determines the current position and sets a moving direction. Then the robot cleaner returns to the charging station. The charging signal sensor may be implemented as an infrared ray sensor, an ultrasonic sensor, a radio frequency (RF) sensor, etc, and may be generally used as an infrared ray sensor. - The robot cleaner may further include wheel sensors connected to the right and left
main wheels 310, and sensing RPMs of the right and leftmain wheels 310. The wheel sensors may be implemented as a rotary encoder. When the robot cleaner moves in a running mode or a cleaning mode, the rotary encoder senses RPMs of the right and leftmain wheels 310, and outputs the sensed RPMs. Thecontroller 200 may calculate rotation speeds of the right and leftmain wheels 310 based on the sensed RPMs. - The
- The position recognition module 220 may recognize a position of the robot cleaner based on information about an obstacle detected by the obstacle detector 700. Also, the position recognition module 220 may compensate for a recognized position of the robot cleaner based on an obstacle detected using image information and the optical pattern sensor. The map creation module 230 may create a map using information about an obstacle detected by the obstacle detector 700, or may compensate for a created peripheral map.
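One simple way to fold obstacle detections into a peripheral map is an occupancy grid, sketched below under an assumed grid size and resolution; the patent does not commit to this particular map representation.

```python
import math

class GridMap:
    """Minimal occupancy-grid sketch: cells hit by obstacle detections are marked."""
    def __init__(self, size: int = 200, resolution_m: float = 0.05):
        self.size, self.res = size, resolution_m
        self.cells = [[0] * size for _ in range(size)]   # 0 = unknown/free, 1 = obstacle

    def mark_obstacle(self, pose, distance_m: float, bearing_deg: float) -> None:
        """Project a range/bearing detection from the robot pose (x, y, theta) into the grid."""
        x, y, theta = pose
        a = theta + math.radians(bearing_deg)
        ox, oy = x + distance_m * math.cos(a), y + distance_m * math.sin(a)
        i, j = int(ox / self.res) + self.size // 2, int(oy / self.res) + self.size // 2
        if 0 <= i < self.size and 0 <= j < self.size:
            self.cells[j][i] = 1

# e.g. a detection 0.4 m ahead while the robot sits at the map origin facing +x
m = GridMap()
m.mark_obstacle((0.0, 0.0, 0.0), 0.4, 0.0)
```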
- The robot cleaner further includes a storage device 500 configured to store therein image information, obstacle information, position information, a peripheral map, etc. Referring to FIG. 5 or FIG. 6, the storage device 500 is configured to further store therein a cleaning map, a cleaning region, etc. The storage device 500 stores therein a control program for controlling the robot cleaner, and data associated with the control program. The storage device 500 may further store therein a cleaning type and a running type. A non-volatile memory (NVM, NVRAM) is mainly used as the storage device 500. An NVM is a storage device capable of maintaining stored information even when power is not supplied. NVMs include a ROM, a flash memory, a magnetic computer memory device (e.g., a hard disk, a diskette drive, or a magnetic tape), an optical disk drive, a magnetic RAM, a PRAM, etc.
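To illustrate the non-volatile storage role, the snippet below persists the map and related state to a file with an atomic replace; the JSON layout and field names are assumptions for illustration only, not a format from the patent.

```python
import json
import os

def save_state(path: str, peripheral_map, position, obstacles) -> None:
    """Persist the peripheral map and related cleaning state so it survives power-off,
    mirroring the non-volatile storage role described above (file layout is illustrative)."""
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"map": peripheral_map, "position": position, "obstacles": obstacles}, f)
    os.replace(tmp, path)   # atomic swap so an interrupted write never corrupts stored data

def load_state(path: str) -> dict:
    with open(path) as f:
        return json.load(f)
```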
- Referring to FIG. 6, the robot cleaner further includes a cleaning device 600 configured to draw dust or foreign materials into the robot cleaner. The cleaning device 600 includes a dust box configured to store collected dust particles, a suction fan configured to provide a driving force to suck dust within a cleaning region, and a suction motor configured to suck air by rotating the suction fan. The cleaning device 600 further includes an agitator rotatably mounted to a lower part of the body of the robot cleaner, and a side brush configured to clean a corner or an edge of a wall, etc. while rotating about a vertical shaft of the body. The agitator lifts dust particles from the floor or a carpet into the air while rotating about a horizontal shaft of the body of the robot cleaner. A plurality of blades are provided on an outer circumferential surface of the agitator in a spiral form, and a brush may be provided between the blades.
- Referring to FIG. 6, the robot cleaner may further include an input device 810, an output device 820 and a power device 830.
- The robot cleaner may further include an input device 810 through which a user directly inputs a control command to the robot cleaner. The user may also input, through the input device 810, a command instructing output of one or more types of information stored in the storage device 500. The input device 810 may be implemented as one or more buttons. For instance, the input device 810 may include an OK button and a set button. The OK button is used to input a command for checking obstacle information, position information, image information, a cleaning region, a cleaning map, etc., and the set button is used to input a command for setting such information. The input device 810 may be provided with a reset button for resetting such information, a delete button, a cleaning start button, a stop button, etc. As another example, the input device 810 may be provided with a button for setting reservation information, or a button for deleting reservation information. The input device 810 may be further provided with a button for setting or changing a cleaning mode, and with a button for inputting a command instructing the robot cleaner to return to the charging station. As shown in FIG. 5, the input device 810 may be installed at an upper part of the robot cleaner, in the form of hard or soft keys, a touch pad, etc. The input device 810 may be implemented in the form of a touch screen together with the output device 820.
- Referring to FIG. 5, the output device 820 is installed at an upper part of the robot cleaner, although the installation position and type may vary. For instance, the output device 820 outputs, to a screen, reservation information, a battery state, intensive cleaning, space extension, a cleaning or running operation in a zigzag form, a cleaning operation with respect to a designated region, etc. The output device 820 may output the current cleaning state of the robot cleaner and the current state of each unit of the robot cleaner. The output device 820 may also display, on the screen, obstacle information, position information, image information, a cleaning map, a cleaning region, a designated region, etc. The output device 820 may be implemented as a light emitting diode (LED), a liquid crystal display (LCD), a plasma display panel (PDP), or an organic light emitting diode (OLED).
- The power device 830 is provided with a rechargeable battery to supply power to the robot cleaner. The power device 830 supplies, to each component as appropriate, the driving power and operation power required when the robot cleaner moves or performs a cleaning operation. When the remaining amount of power is insufficient, the robot cleaner moves to the charging station to receive a charging current. The robot cleaner further includes a battery sensor configured to sense the charged state of the battery and to transmit detection information to the controller 200. As the battery is connected to the battery sensor, the remaining amount and charged state of the battery are transmitted to the controller 200. The remaining amount of battery power may be displayed on the screen of the output device 820.
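The low-battery behavior could be expressed as a small decision helper like the one below; the 15% threshold and the action names are illustrative, not values from the disclosure.

```python
def power_action(battery_pct: float, charging: bool,
                 low_threshold_pct: float = 15.0) -> str:
    """Decide whether to keep cleaning or head back to the charging station
    (the 15% threshold is illustrative, not a value from the patent)."""
    if charging:
        return "charge"
    if battery_pct <= low_threshold_pct:
        return "return_to_station"
    return "continue_cleaning"

# e.g. 12% remaining while cleaning -> "return_to_station"
print(power_action(12.0, charging=False))
```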
- Referring to FIG. 8, the robot cleaner irradiates optical patterns while moving or in a stopped state (S210). As aforementioned, the optical pattern is preferably implemented as an asymmetric cross-shaped optical pattern. The robot cleaner then captures an image of the optical pattern-irradiated region, thereby acquiring a pattern image (S220), and recognizes an obstacle using the pattern image (S230). Here, the robot cleaner recognizes an obstacle by processing the pattern image using a controller such as a microcomputer; alternatively, the optical pattern sensor itself may be configured to recognize an obstacle. For instance, the robot cleaner may recognize whether an obstacle exists, a width of an obstacle, a height of an obstacle, etc. based on the horizontal pattern, and may detect a precise height of an obstacle based on the vertical pattern. The robot cleaner determines whether to pass through an obstacle, based on the recognition result with respect to the obstacle (S240). As shown in FIG. 3A, in case of a passable obstacle, the robot cleaner passes through the obstacle with its body moving forward (S241). As shown in FIG. 3B, in case of a passable threshold, the robot cleaner passes over the threshold with its body moving forward. If the robot cleaner cannot move forward, it determines whether to detour around the obstacle (S250). As shown in FIG. 3C, in case of an obstacle having a small gap, the robot cleaner detours around the obstacle (S261). If the robot cleaner can neither move forward nor detour, it may stop or move back (S263). Such an algorithm may vary according to user settings or programming, and a learning function may be added according to the specification of the robot cleaner. - The robot cleaner may capture a peripheral image (S270), and may extract feature points from the peripheral image to thus recognize its position based on the feature points (S280). Then the robot cleaner may create a peripheral map based on the recognized position (S290).
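A compact sketch of the pass/detour/back decision in steps S240 through S263 might look as follows; the climbable-height threshold and the detour check are assumptions, since the patent only states that the decision is based on the recognized width and height of the obstacle.

```python
from enum import Enum

class Action(Enum):
    PASS_OVER = "move forward over the obstacle"   # S241 (e.g. a low threshold)
    DETOUR = "detour around the obstacle"          # S261
    BACK_OR_STOP = "stop or move backward"         # S263

def decide(obstacle_height_m: float, detour_possible: bool,
           climbable_m: float = 0.015) -> Action:
    """Illustrative decision for steps S240-S263: the 1.5 cm climbable height and
    the detour check are assumptions, not values taken from the patent."""
    if obstacle_height_m <= climbable_m:   # low enough to drive over (S240 -> S241)
        return Action.PASS_OVER
    if detour_possible:                    # cannot pass, but a path around exists (S250 -> S261)
        return Action.DETOUR
    return Action.BACK_OR_STOP             # neither forward nor detour (S263)

# e.g. a 5 cm obstacle with free space beside it -> detour
print(decide(0.05, detour_possible=True))
```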
- A robot cleaner as embodied and broadly described herein may precisely detect a peripheral obstacle using a unique optical pattern. In particular, an asymmetric cross-shaped optical pattern may be irradiated, and a pattern image of the optical pattern-irradiated region may be analyzed. When so configured, whether an obstacle exists may be checked, and a width or a height of the obstacle may be detected.
- A robot cleaner as embodied and broadly described herein may perform operations such as a forward motion, a backward motion, a stopping motion and a detour motion, based on a detection result with respect to an obstacle.
- A robot cleaner is provided that is capable of precisely detecting a peripheral obstacle using a unique optical pattern, and a method for controlling the same.
- A robot cleaner is provided that is capable of detecting a width or a height of an obstacle by irradiating an asymmetric cross-shaped optical pattern, and by analyzing a pattern image with respect to the optical pattern-irradiated region, and capable of moving according to a detection result, and a method for controlling the same.
- A robot cleaner as embodied and broadly described herein may include a body which forms an appearance; a cleaning unit including a dust box for storing collected dust, a suction fan for providing a driving force to suck dust inside a cleaning region, and a suction motor for sucking air by rotating the suction fan; an optical pattern sensor provided on a front surface of the body, and configured to irradiate an optical pattern and to output a pattern image; and a control unit configured to recognize an obstacle based on the pattern image, wherein the optical pattern sensor comprises: an optical source module configured to irradiate one or more cross-shaped optical patterns toward a front side of the body; and a camera module configured to capture the pattern image on the optical pattern-irradiated region.
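Read as software components, the claimed arrangement of an optical source module, a camera module, and a control unit could be organized roughly as below; the emit_cross_pattern and grab_frame calls stand in for unspecified hardware interfaces, and the recognition and driving callbacks are placeholders rather than the patented method itself.

```python
class OpticalPatternSensor:
    """Sketch of the claimed composition: an optical source module projecting a
    cross-shaped pattern ahead of the body and a camera module capturing the image."""
    def __init__(self, source, camera):
        self.source, self.camera = source, camera   # hypothetical hardware wrappers

    def capture_pattern_image(self):
        self.source.emit_cross_pattern()            # assumed source-module call
        return self.camera.grab_frame()             # assumed camera-module call

class ControlUnit:
    """Recognizes an obstacle from the pattern image and issues a driving signal."""
    def __init__(self, sensor: OpticalPatternSensor, recognize, drive):
        self.sensor, self.recognize, self.drive = sensor, recognize, drive

    def step(self) -> None:
        image = self.sensor.capture_pattern_image()
        obstacle = self.recognize(image)            # e.g. deformation of the cross lines
        self.drive(obstacle)                        # forward / detour / stop / back
```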
- The optical pattern sensor may be configured to irradiate an asymmetric cross-shaped optical pattern in which a horizontal length is longer than a vertical length.
- The control unit may include an obstacle recognition module configured to recognize an obstacle by processing the pattern image.
- A robot cleaner according to another embodiment may include a body which forms an appearance; a cleaning unit including a dust box for storing collected dust, a suction fan for providing a driving force to suck dust inside a cleaning region, and a suction motor for sucking air by rotating the suction fan; a driving unit provided with a wheel motor for driving one or more wheels installed below the body, and the driving unit configured to move the body according to a driving signal; an optical pattern sensor provided on a front surface of the body, configured to irradiate one or more cross-shaped optical patterns toward a front side of the body, and configured to output a pattern image; and a control unit configured to recognize an obstacle based on the pattern image, and configured to generate the driving signal based on a recognition result.
- A robot cleaner as embodied and broadly described herein may precisely detect a peripheral obstacle using a unique optical pattern. In particular, an asymmetric cross-shaped optical pattern can be irradiated, and a pattern image of the optical pattern-irradiated region can be analyzed. Under such a configuration, whether an obstacle exists can be checked, and a width or a height of the obstacle can be detected.
- A robot cleaner as embodied and broadly described herein may perform operations such as a forward motion, a backward motion, a stopping motion and a detour motion, based on a detection result with respect to an obstacle. This can enhance the stability of the robot cleaner and the user's convenience, and improve driving efficiency and cleaning efficiency.
- Any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature, structure, or characteristic in connection with other ones of the embodiments.
- Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.
Claims (19)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020120065153A KR101949277B1 (en) | 2012-06-18 | 2012-06-18 | Autonomous mobile robot |
| KR10-2012-0065153 | 2012-06-18 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20130338831A1 true US20130338831A1 (en) | 2013-12-19 |
| US9511494B2 US9511494B2 (en) | 2016-12-06 |
Family
ID=48625873
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/920,612 Expired - Fee Related US9511494B2 (en) | 2012-06-18 | 2013-06-18 | Robot cleaner and controlling method of the same |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US9511494B2 (en) |
| EP (1) | EP2677386B1 (en) |
| KR (1) | KR101949277B1 (en) |
Cited By (67)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104238566A (en) * | 2014-09-27 | 2014-12-24 | 江阴润玛电子材料股份有限公司 | Image-recognition-based line patrolling robot control system for electronic circuit |
| US20150055339A1 (en) * | 2013-08-22 | 2015-02-26 | George Allen Carr, JR. | Systems and Methods for Illuminating an Object |
| US20150120056A1 (en) * | 2013-10-31 | 2015-04-30 | Lg Electronics Inc. | Mobile robot |
| US20150313438A1 (en) * | 2014-05-03 | 2015-11-05 | Bobsweep Inc. | Auxiliary Oval Wheel for Robotic Devices |
| WO2016034104A1 (en) * | 2014-09-05 | 2016-03-10 | 科沃斯机器人有限公司 | Self-moving surface walking robot and image processing method therefor |
| US20160104044A1 (en) * | 2014-10-14 | 2016-04-14 | Lg Electronics Inc. | Robot cleaner and method for controlling the same |
| CN105559694A (en) * | 2014-10-10 | 2016-05-11 | 莱克电气股份有限公司 | Robot dust collector |
| US20160278599A1 (en) * | 2015-03-23 | 2016-09-29 | Lg Electronics Inc. | Robot cleaner, robot cleaning system having the same, and method for operating a robot cleaner |
| US20160313741A1 (en) * | 2013-12-19 | 2016-10-27 | Aktiebolaget Electrolux | Prioritizing cleaning areas |
| WO2017018848A1 (en) * | 2015-07-29 | 2017-02-02 | Lg Electronics Inc. | Mobile robot and control method thereof |
| KR20170065564A (en) * | 2015-10-08 | 2017-06-13 | 도시바 라이프스타일 가부시키가이샤 | Electrical vacuum cleaner |
| US9811089B2 (en) | 2013-12-19 | 2017-11-07 | Aktiebolaget Electrolux | Robotic cleaning device with perimeter recording function |
| US20170336798A1 (en) * | 2016-05-20 | 2017-11-23 | Lg Electronics Inc. | Autonomous cleaner |
| US9851720B2 (en) * | 2014-05-15 | 2017-12-26 | Lg Electronics Inc. | Method of controlling a cleaner |
| US9939529B2 (en) | 2012-08-27 | 2018-04-10 | Aktiebolaget Electrolux | Robot positioning system |
| WO2018065376A1 (en) * | 2016-10-05 | 2018-04-12 | Alfred Kärcher Gmbh & Co. Kg | Self-propelled and self-steering ground working machine and method for working a ground area |
| US20180120852A1 (en) * | 2016-09-20 | 2018-05-03 | Shenzhen Silver Star Intelligent Technology Co., Ltd. | Mobile robot and navigating method for mobile robot |
| US10045675B2 (en) | 2013-12-19 | 2018-08-14 | Aktiebolaget Electrolux | Robotic vacuum cleaner with side brush moving in spiral pattern |
| US10149589B2 (en) | 2013-12-19 | 2018-12-11 | Aktiebolaget Electrolux | Sensing climb of obstacle of a robotic cleaning device |
| WO2019001001A1 (en) * | 2017-06-28 | 2019-01-03 | 杭州海康机器人技术有限公司 | Obstacle information acquisition apparatus and method |
| US10209080B2 (en) | 2013-12-19 | 2019-02-19 | Aktiebolaget Electrolux | Robotic cleaning device |
| US10219665B2 (en) | 2013-04-15 | 2019-03-05 | Aktiebolaget Electrolux | Robotic vacuum cleaner with protruding sidebrush |
| US10231591B2 (en) | 2013-12-20 | 2019-03-19 | Aktiebolaget Electrolux | Dust container |
| CN109890575A (en) * | 2016-08-25 | 2019-06-14 | Lg电子株式会社 | Mobile robot and its control method |
| US10342405B2 (en) | 2016-05-20 | 2019-07-09 | Lg Electronics Inc. | Autonomous cleaner |
| US10342400B2 (en) | 2016-05-20 | 2019-07-09 | Lg Electronics Inc. | Autonomous cleaner |
| US10362916B2 (en) | 2016-05-20 | 2019-07-30 | Lg Electronics Inc. | Autonomous cleaner |
| US10398276B2 (en) | 2016-05-20 | 2019-09-03 | Lg Electronics Inc. | Autonomous cleaner |
| CN110216674A (en) * | 2019-06-20 | 2019-09-10 | 北京科技大学 | A kind of redundant degree of freedom mechanical arm visual servo obstacle avoidance system |
| US10420448B2 (en) | 2016-05-20 | 2019-09-24 | Lg Electronics Inc. | Autonomous cleaner |
| US10433697B2 (en) | 2013-12-19 | 2019-10-08 | Aktiebolaget Electrolux | Adaptive speed control of rotating side brush |
| US10441128B2 (en) | 2016-05-20 | 2019-10-15 | Lg Electronics Inc. | Autonomous cleaner |
| US10448794B2 (en) | 2013-04-15 | 2019-10-22 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| US10463221B2 (en) | 2016-05-20 | 2019-11-05 | Lg Electronics Inc. | Autonomous cleaner |
| US10463212B2 (en) | 2016-05-20 | 2019-11-05 | Lg Electronics Inc. | Autonomous cleaner |
| US10466711B2 (en) * | 2016-08-22 | 2019-11-05 | Lg Electronics Inc. | Moving robot and controlling method thereof |
| US10499778B2 (en) | 2014-09-08 | 2019-12-10 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| CN110554696A (en) * | 2019-08-14 | 2019-12-10 | 深圳市银星智能科技股份有限公司 | Robot system, robot and robot navigation method based on laser radar |
| AU2017266810B2 (en) * | 2016-05-20 | 2019-12-19 | Lg Electronics Inc. | Robot cleaner |
| US10518416B2 (en) | 2014-07-10 | 2019-12-31 | Aktiebolaget Electrolux | Method for detecting a measurement error in a robotic cleaning device |
| US10524628B2 (en) | 2016-05-20 | 2020-01-07 | Lg Electronics Inc. | Autonomous cleaner |
| US10534367B2 (en) | 2014-12-16 | 2020-01-14 | Aktiebolaget Electrolux | Experience-based roadmap for a robotic cleaning device |
| CN110687903A (en) * | 2018-06-19 | 2020-01-14 | 速感科技(北京)有限公司 | Mobile robot trapped judging method and device and motion control method and device |
| US10542859B2 (en) * | 2015-02-13 | 2020-01-28 | Samsung Electronics Co., Ltd. | Cleaning robot and controlling method thereof |
| US20200081451A1 (en) * | 2017-06-02 | 2020-03-12 | Aktiebolaget Electrolux | Method of detecting a difference in level of a surface in front of a robotic cleaning device |
| US10617271B2 (en) | 2013-12-19 | 2020-04-14 | Aktiebolaget Electrolux | Robotic cleaning device and method for landmark recognition |
| US10678251B2 (en) | 2014-12-16 | 2020-06-09 | Aktiebolaget Electrolux | Cleaning method for a robotic cleaning device |
| US10729297B2 (en) | 2014-09-08 | 2020-08-04 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| CN111557618A (en) * | 2019-02-13 | 2020-08-21 | 三星电子株式会社 | Robot cleaner and control method thereof |
| CN111890352A (en) * | 2020-06-24 | 2020-11-06 | 中国北方车辆研究所 | Mobile robot touch teleoperation control method based on panoramic navigation |
| US10874271B2 (en) | 2014-12-12 | 2020-12-29 | Aktiebolaget Electrolux | Side brush and robotic cleaner |
| US10877484B2 (en) | 2014-12-10 | 2020-12-29 | Aktiebolaget Electrolux | Using laser sensor for floor type detection |
| US10874274B2 (en) | 2015-09-03 | 2020-12-29 | Aktiebolaget Electrolux | System of robotic cleaning devices |
| US11054837B2 (en) * | 2017-03-27 | 2021-07-06 | Casio Computer Co., Ltd. | Autonomous mobile apparatus adaptable to change in height, autonomous movement method and non-transitory computer-readable recording medium |
| US11099554B2 (en) | 2015-04-17 | 2021-08-24 | Aktiebolaget Electrolux | Robotic cleaning device and a method of controlling the robotic cleaning device |
| US11122953B2 (en) | 2016-05-11 | 2021-09-21 | Aktiebolaget Electrolux | Robotic cleaning device |
| US11169533B2 (en) | 2016-03-15 | 2021-11-09 | Aktiebolaget Electrolux | Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection |
| CN113907644A (en) * | 2020-07-08 | 2022-01-11 | 原相科技股份有限公司 | Automatic sweeper, automatic sweeper control method and sweeping robot |
| US11325260B2 (en) * | 2018-06-14 | 2022-05-10 | Lg Electronics Inc. | Method for operating moving robot |
| CN115113616A (en) * | 2021-03-08 | 2022-09-27 | 广东博智林机器人有限公司 | Path planning method |
| JP2023502406A (en) * | 2019-11-18 | 2023-01-24 | 北京石頭世紀科技股▲ふん▼有限公司 | Camera device and cleaning robot |
| US20230091839A1 (en) * | 2020-02-28 | 2023-03-23 | Lg Electronics Inc. | Moving robot and control method thereof |
| CN115890676A (en) * | 2022-11-28 | 2023-04-04 | 深圳优地科技有限公司 | Robot control method, robot and storage medium |
| US11726490B1 (en) * | 2016-02-19 | 2023-08-15 | AI Incorporated | System and method for guiding heading of a mobile robotic device |
| US20240019869A1 (en) * | 2016-08-23 | 2024-01-18 | Beijing Xiaomi Mobile Software Co., Ltd. | Cleaning robot and control method therefor |
| US11921517B2 (en) | 2017-09-26 | 2024-03-05 | Aktiebolaget Electrolux | Controlling movement of a robotic cleaning device |
| US20240383148A1 (en) * | 2021-09-10 | 2024-11-21 | Lg Electronics Inc. | Robot and method for controlling same |
Families Citing this family (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101578864B1 (en) * | 2014-04-25 | 2015-12-29 | 에브리봇 주식회사 | Distance sensor, robot cleaner and control method thereof |
| KR101620428B1 (en) | 2014-10-10 | 2016-05-12 | 엘지전자 주식회사 | Robot clener and control method thereof |
| WO2016073699A1 (en) | 2014-11-05 | 2016-05-12 | Trw Automotive U.S. Llc | Augmented object detection using structured light |
| TWI653964B (en) | 2016-05-17 | 2019-03-21 | Lg電子股份有限公司 | Mobile robot and its control method |
| TWI639021B (en) | 2016-05-17 | 2018-10-21 | 南韓商Lg電子股份有限公司 | Mobile robot and method of controlling the same |
| CN207979622U (en) | 2016-05-17 | 2018-10-19 | Lg电子株式会社 | Robot cleaner |
| KR102167898B1 (en) * | 2016-10-27 | 2020-10-20 | 엘지전자 주식회사 | Moving Robot and controlling method |
| DE102017105724A1 (en) * | 2017-03-16 | 2018-09-20 | Vorwerk & Co. Interholding Gmbh | Method for operating a self-propelled soil tillage implement |
| CN108015764B (en) * | 2017-11-20 | 2020-07-14 | 中国运载火箭技术研究院 | A spatial zero-prior target capture method based on multi-source visual information fusion |
| CN108153264A (en) * | 2017-12-26 | 2018-06-12 | 佛山市道静科技有限公司 | A kind of intelligentized Furniture control system |
| US11009882B2 (en) * | 2018-01-12 | 2021-05-18 | Pixart Imaging Inc. | Method, system for obstacle detection and a sensor subsystem |
| KR102048364B1 (en) * | 2018-04-13 | 2019-11-25 | 엘지전자 주식회사 | Robot cleaner |
| KR20200036678A (en) | 2018-09-20 | 2020-04-07 | 삼성전자주식회사 | Cleaning robot and Method of performing task thereof |
| US12140954B2 (en) | 2018-09-20 | 2024-11-12 | Samsung Electronics Co., Ltd. | Cleaning robot and method for performing task thereof |
| CN111813103B (en) * | 2020-06-08 | 2021-07-16 | 珊口(深圳)智能科技有限公司 | Control method, control system and storage medium for mobile robot |
| CN111938517B (en) * | 2020-08-17 | 2021-09-03 | 美智纵横科技有限责任公司 | Obstacle crossing method and device, sweeping robot and storage medium |
| KR102648850B1 (en) * | 2021-08-12 | 2024-03-19 | 네이버랩스 주식회사 | Pattern lighting apparatus for generating robot-recognizable pattern and building including the same |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4954962A (en) * | 1988-09-06 | 1990-09-04 | Transitions Research Corporation | Visual navigation and obstacle avoidance structured light system |
| US6327090B1 (en) * | 1997-07-03 | 2001-12-04 | Levelite Technology, Inc. | Multiple laser beam generation |
| US6728608B2 (en) * | 2002-08-23 | 2004-04-27 | Applied Perception, Inc. | System and method for the creation of a terrain density model |
| US20050166354A1 (en) * | 2004-01-30 | 2005-08-04 | Funai Electric Co., Ltd. | Autonomous vacuum cleaner |
| US20070100498A1 (en) * | 2005-10-27 | 2007-05-03 | Kosei Matsumoto | Mobile robot |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004095071A2 (en) * | 2003-04-17 | 2004-11-04 | Kenneth Sinclair | Object detection system |
| WO2007051972A1 (en) * | 2005-10-31 | 2007-05-10 | Qinetiq Limited | Navigation system |
| KR100735565B1 (en) * | 2006-05-17 | 2007-07-04 | 삼성전자주식회사 | Object detection method using structured light and robot using same |
- 2012
  - 2012-06-18 KR KR1020120065153A patent/KR101949277B1/en not_active Expired - Fee Related
- 2013
  - 2013-06-14 EP EP13172023.7A patent/EP2677386B1/en not_active Not-in-force
  - 2013-06-18 US US13/920,612 patent/US9511494B2/en not_active Expired - Fee Related
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4954962A (en) * | 1988-09-06 | 1990-09-04 | Transitions Research Corporation | Visual navigation and obstacle avoidance structured light system |
| US6327090B1 (en) * | 1997-07-03 | 2001-12-04 | Levelite Technology, Inc. | Multiple laser beam generation |
| US6728608B2 (en) * | 2002-08-23 | 2004-04-27 | Applied Perception, Inc. | System and method for the creation of a terrain density model |
| US20050166354A1 (en) * | 2004-01-30 | 2005-08-04 | Funai Electric Co., Ltd. | Autonomous vacuum cleaner |
| US20070100498A1 (en) * | 2005-10-27 | 2007-05-03 | Kosei Matsumoto | Mobile robot |
Non-Patent Citations (3)
| Title |
|---|
| point - A place or locality.pdf (The Free Dictionary, point - definition of point by the Free Dictionary, 4/10/2015, http://www.thefreedictionary.com/point, pages 1-33) * |
| Point - ParticularPlace.pdf (Free Merriam-Webster Dictionary, Point - Definition and More from the Free Meriam-Webster Dictionary, 4/10/2015, http://www.merriam-webster.com/dictionary/point, pages 1-10) * |
| point_ A particular spot_place_or_position.pdf (Oxford Dictionary, point: definition of point in Oxford dictionary (American English) (US), 4/10/2015, http://www.oxforddictionaries.com/us/definition/american_english/point, pages 1-33) * |
Cited By (95)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9939529B2 (en) | 2012-08-27 | 2018-04-10 | Aktiebolaget Electrolux | Robot positioning system |
| US10448794B2 (en) | 2013-04-15 | 2019-10-22 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| US10219665B2 (en) | 2013-04-15 | 2019-03-05 | Aktiebolaget Electrolux | Robotic vacuum cleaner with protruding sidebrush |
| US20150055339A1 (en) * | 2013-08-22 | 2015-02-26 | George Allen Carr, JR. | Systems and Methods for Illuminating an Object |
| US9657936B2 (en) * | 2013-08-22 | 2017-05-23 | George Allen Carr, Jr. | Systems and methods for illuminating an object |
| US9440355B2 (en) * | 2013-10-31 | 2016-09-13 | Lg Electronics Inc. | Mobile robot |
| US20150120056A1 (en) * | 2013-10-31 | 2015-04-30 | Lg Electronics Inc. | Mobile robot |
| US10149589B2 (en) | 2013-12-19 | 2018-12-11 | Aktiebolaget Electrolux | Sensing climb of obstacle of a robotic cleaning device |
| US10209080B2 (en) | 2013-12-19 | 2019-02-19 | Aktiebolaget Electrolux | Robotic cleaning device |
| US10617271B2 (en) | 2013-12-19 | 2020-04-14 | Aktiebolaget Electrolux | Robotic cleaning device and method for landmark recognition |
| US20160313741A1 (en) * | 2013-12-19 | 2016-10-27 | Aktiebolaget Electrolux | Prioritizing cleaning areas |
| US10045675B2 (en) | 2013-12-19 | 2018-08-14 | Aktiebolaget Electrolux | Robotic vacuum cleaner with side brush moving in spiral pattern |
| US10433697B2 (en) | 2013-12-19 | 2019-10-08 | Aktiebolaget Electrolux | Adaptive speed control of rotating side brush |
| US9811089B2 (en) | 2013-12-19 | 2017-11-07 | Aktiebolaget Electrolux | Robotic cleaning device with perimeter recording function |
| US9946263B2 (en) * | 2013-12-19 | 2018-04-17 | Aktiebolaget Electrolux | Prioritizing cleaning areas |
| US10231591B2 (en) | 2013-12-20 | 2019-03-19 | Aktiebolaget Electrolux | Dust container |
| US20150313438A1 (en) * | 2014-05-03 | 2015-11-05 | Bobsweep Inc. | Auxiliary Oval Wheel for Robotic Devices |
| US9851720B2 (en) * | 2014-05-15 | 2017-12-26 | Lg Electronics Inc. | Method of controlling a cleaner |
| US10518416B2 (en) | 2014-07-10 | 2019-12-31 | Aktiebolaget Electrolux | Method for detecting a measurement error in a robotic cleaning device |
| CN105467985A (en) * | 2014-09-05 | 2016-04-06 | 科沃斯机器人有限公司 | Autonomous mobile surface walking robot and image processing method thereof |
| WO2016034104A1 (en) * | 2014-09-05 | 2016-03-10 | 科沃斯机器人有限公司 | Self-moving surface walking robot and image processing method therefor |
| US10729297B2 (en) | 2014-09-08 | 2020-08-04 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| US10499778B2 (en) | 2014-09-08 | 2019-12-10 | Aktiebolaget Electrolux | Robotic vacuum cleaner |
| CN104238566A (en) * | 2014-09-27 | 2014-12-24 | 江阴润玛电子材料股份有限公司 | Image-recognition-based line patrolling robot control system for electronic circuit |
| CN105559694A (en) * | 2014-10-10 | 2016-05-11 | 莱克电气股份有限公司 | Robot dust collector |
| US10133930B2 (en) * | 2014-10-14 | 2018-11-20 | Lg Electronics Inc. | Robot cleaner and method for controlling the same |
| US20160104044A1 (en) * | 2014-10-14 | 2016-04-14 | Lg Electronics Inc. | Robot cleaner and method for controlling the same |
| US10255501B2 (en) * | 2014-10-14 | 2019-04-09 | Lg Electronics Inc. | Robot cleaner and method for controlling the same |
| US10877484B2 (en) | 2014-12-10 | 2020-12-29 | Aktiebolaget Electrolux | Using laser sensor for floor type detection |
| US10874271B2 (en) | 2014-12-12 | 2020-12-29 | Aktiebolaget Electrolux | Side brush and robotic cleaner |
| US10678251B2 (en) | 2014-12-16 | 2020-06-09 | Aktiebolaget Electrolux | Cleaning method for a robotic cleaning device |
| US10534367B2 (en) | 2014-12-16 | 2020-01-14 | Aktiebolaget Electrolux | Experience-based roadmap for a robotic cleaning device |
| US10542859B2 (en) * | 2015-02-13 | 2020-01-28 | Samsung Electronics Co., Ltd. | Cleaning robot and controlling method thereof |
| US20160278599A1 (en) * | 2015-03-23 | 2016-09-29 | Lg Electronics Inc. | Robot cleaner, robot cleaning system having the same, and method for operating a robot cleaner |
| US9962054B2 (en) * | 2015-03-23 | 2018-05-08 | Lg Electronics Inc. | Robot cleaner, robot cleaning system having the same, and method for operating a robot cleaner |
| EP3679849A1 (en) * | 2015-03-23 | 2020-07-15 | LG Electronics Inc. -1- | Robot cleaner and robot cleaning system having the same |
| US11099554B2 (en) | 2015-04-17 | 2021-08-24 | Aktiebolaget Electrolux | Robotic cleaning device and a method of controlling the robotic cleaning device |
| US11029700B2 (en) | 2015-07-29 | 2021-06-08 | Lg Electronics Inc. | Mobile robot and control method thereof |
| WO2017018848A1 (en) * | 2015-07-29 | 2017-02-02 | Lg Electronics Inc. | Mobile robot and control method thereof |
| US10874274B2 (en) | 2015-09-03 | 2020-12-29 | Aktiebolaget Electrolux | System of robotic cleaning devices |
| US11712142B2 (en) | 2015-09-03 | 2023-08-01 | Aktiebolaget Electrolux | System of robotic cleaning devices |
| KR20170065564A (en) * | 2015-10-08 | 2017-06-13 | 도시바 라이프스타일 가부시키가이샤 | Electrical vacuum cleaner |
| KR102003787B1 (en) | 2015-10-08 | 2019-07-25 | 도시바 라이프스타일 가부시키가이샤 | Electrical vacuum cleaner |
| EP3360454A4 (en) * | 2015-10-08 | 2019-04-24 | Toshiba Lifestyle Products & Services Corporation | ELECTRICAL VACUUM |
| US12135563B1 (en) * | 2016-02-19 | 2024-11-05 | AI Incorporated | System and method for guiding heading of a mobile robotic device |
| US11726490B1 (en) * | 2016-02-19 | 2023-08-15 | AI Incorporated | System and method for guiding heading of a mobile robotic device |
| US11169533B2 (en) | 2016-03-15 | 2021-11-09 | Aktiebolaget Electrolux | Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection |
| US11122953B2 (en) | 2016-05-11 | 2021-09-21 | Aktiebolaget Electrolux | Robotic cleaning device |
| US10481611B2 (en) * | 2016-05-20 | 2019-11-19 | Lg Electronics Inc. | Autonomous cleaner |
| US10362916B2 (en) | 2016-05-20 | 2019-07-30 | Lg Electronics Inc. | Autonomous cleaner |
| US10524628B2 (en) | 2016-05-20 | 2020-01-07 | Lg Electronics Inc. | Autonomous cleaner |
| US11846937B2 (en) | 2016-05-20 | 2023-12-19 | Lg Electronics Inc. | Autonomous cleaner |
| US11547263B2 (en) | 2016-05-20 | 2023-01-10 | Lg Electronics Inc. | Autonomous cleaner |
| US20170336798A1 (en) * | 2016-05-20 | 2017-11-23 | Lg Electronics Inc. | Autonomous cleaner |
| US10342405B2 (en) | 2016-05-20 | 2019-07-09 | Lg Electronics Inc. | Autonomous cleaner |
| US10463212B2 (en) | 2016-05-20 | 2019-11-05 | Lg Electronics Inc. | Autonomous cleaner |
| US10463221B2 (en) | 2016-05-20 | 2019-11-05 | Lg Electronics Inc. | Autonomous cleaner |
| US10441128B2 (en) | 2016-05-20 | 2019-10-15 | Lg Electronics Inc. | Autonomous cleaner |
| US10420448B2 (en) | 2016-05-20 | 2019-09-24 | Lg Electronics Inc. | Autonomous cleaner |
| US10342400B2 (en) | 2016-05-20 | 2019-07-09 | Lg Electronics Inc. | Autonomous cleaner |
| AU2017266810B2 (en) * | 2016-05-20 | 2019-12-19 | Lg Electronics Inc. | Robot cleaner |
| US10827895B2 (en) | 2016-05-20 | 2020-11-10 | Lg Electronics Inc. | Autonomous cleaner |
| US10827896B2 (en) | 2016-05-20 | 2020-11-10 | Lg Electronics Inc. | Autonomous cleaner |
| US10835095B2 (en) | 2016-05-20 | 2020-11-17 | Lg Electronics Inc. | Autonomous cleaner |
| US10856714B2 (en) | 2016-05-20 | 2020-12-08 | Lg Electronics Inc. | Autonomous cleaner |
| US10939792B2 (en) | 2016-05-20 | 2021-03-09 | Lg Electronics Inc. | Autonomous cleaner |
| US10398276B2 (en) | 2016-05-20 | 2019-09-03 | Lg Electronics Inc. | Autonomous cleaner |
| US10466711B2 (en) * | 2016-08-22 | 2019-11-05 | Lg Electronics Inc. | Moving robot and controlling method thereof |
| US20240019869A1 (en) * | 2016-08-23 | 2024-01-18 | Beijing Xiaomi Mobile Software Co., Ltd. | Cleaning robot and control method therefor |
| CN109890575A (en) * | 2016-08-25 | 2019-06-14 | Lg电子株式会社 | Mobile robot and its control method |
| US20180120852A1 (en) * | 2016-09-20 | 2018-05-03 | Shenzhen Silver Star Intelligent Technology Co., Ltd. | Mobile robot and navigating method for mobile robot |
| WO2018065376A1 (en) * | 2016-10-05 | 2018-04-12 | Alfred Kärcher Gmbh & Co. Kg | Self-propelled and self-steering ground working machine and method for working a ground area |
| US11054837B2 (en) * | 2017-03-27 | 2021-07-06 | Casio Computer Co., Ltd. | Autonomous mobile apparatus adaptable to change in height, autonomous movement method and non-transitory computer-readable recording medium |
| US11474533B2 (en) * | 2017-06-02 | 2022-10-18 | Aktiebolaget Electrolux | Method of detecting a difference in level of a surface in front of a robotic cleaning device |
| US20200081451A1 (en) * | 2017-06-02 | 2020-03-12 | Aktiebolaget Electrolux | Method of detecting a difference in level of a surface in front of a robotic cleaning device |
| WO2019001001A1 (en) * | 2017-06-28 | 2019-01-03 | 杭州海康机器人技术有限公司 | Obstacle information acquisition apparatus and method |
| US11921517B2 (en) | 2017-09-26 | 2024-03-05 | Aktiebolaget Electrolux | Controlling movement of a robotic cleaning device |
| US20220258357A1 (en) * | 2018-06-14 | 2022-08-18 | Lg Electronics Inc. | Method for operating moving robot |
| US11325260B2 (en) * | 2018-06-14 | 2022-05-10 | Lg Electronics Inc. | Method for operating moving robot |
| US11787061B2 (en) * | 2018-06-14 | 2023-10-17 | Lg Electronics Inc. | Method for operating moving robot |
| CN110687903A (en) * | 2018-06-19 | 2020-01-14 | 速感科技(北京)有限公司 | Mobile robot trapped judging method and device and motion control method and device |
| CN111557618A (en) * | 2019-02-13 | 2020-08-21 | 三星电子株式会社 | Robot cleaner and control method thereof |
| CN110216674A (en) * | 2019-06-20 | 2019-09-10 | 北京科技大学 | A kind of redundant degree of freedom mechanical arm visual servo obstacle avoidance system |
| CN110554696A (en) * | 2019-08-14 | 2019-12-10 | 深圳市银星智能科技股份有限公司 | Robot system, robot and robot navigation method based on laser radar |
| JP2023502406A (en) * | 2019-11-18 | 2023-01-24 | 北京石頭世紀科技股▲ふん▼有限公司 | Camera device and cleaning robot |
| JP7433430B2 (en) | 2019-11-18 | 2024-02-19 | 北京石頭世紀科技股▲ふん▼有限公司 | Camera equipment and cleaning robot |
| US20230091839A1 (en) * | 2020-02-28 | 2023-03-23 | Lg Electronics Inc. | Moving robot and control method thereof |
| US12329335B2 (en) * | 2020-02-28 | 2025-06-17 | Lg Electronics Inc. | Moving robot and control method thereof |
| CN111890352A (en) * | 2020-06-24 | 2020-11-06 | 中国北方车辆研究所 | Mobile robot touch teleoperation control method based on panoramic navigation |
| US11690490B2 (en) | 2020-07-08 | 2023-07-04 | Pixart Imaging Inc. | Auto clean machine and auto clean machine control method |
| US12096908B2 (en) | 2020-07-08 | 2024-09-24 | Pixart Imaging Inc. | Robot cleaner and robot cleaner control method |
| CN113907644A (en) * | 2020-07-08 | 2022-01-11 | 原相科技股份有限公司 | Automatic sweeper, automatic sweeper control method and sweeping robot |
| CN115113616A (en) * | 2021-03-08 | 2022-09-27 | 广东博智林机器人有限公司 | Path planning method |
| US20240383148A1 (en) * | 2021-09-10 | 2024-11-21 | Lg Electronics Inc. | Robot and method for controlling same |
| CN115890676A (en) * | 2022-11-28 | 2023-04-04 | 深圳优地科技有限公司 | Robot control method, robot and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2677386B1 (en) | 2019-04-03 |
| KR101949277B1 (en) | 2019-04-25 |
| US9511494B2 (en) | 2016-12-06 |
| KR20130141979A (en) | 2013-12-27 |
| EP2677386A1 (en) | 2013-12-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9511494B2 (en) | Robot cleaner and controlling method of the same | |
| JP6655614B2 (en) | Robot cleaner and control method thereof | |
| TWI603699B (en) | Mobile robot and its control method | |
| US9339163B2 (en) | Mobile robot and operating method thereof | |
| US9423797B2 (en) | Robot cleaner and remote monitoring system and method of the same | |
| TWI653964B (en) | Mobile robot and its control method | |
| KR102527645B1 (en) | Cleaning robot and controlling method thereof | |
| US9785148B2 (en) | Moving robot and controlling method thereof | |
| US9474427B2 (en) | Robot cleaner and method for controlling the same | |
| US8983661B2 (en) | Robot cleaner, controlling method of the same, and robot cleaning system | |
| US11537135B2 (en) | Moving robot and controlling method for the moving robot | |
| KR20140011216A (en) | Robot cleaner and controlling method of the same | |
| KR101938703B1 (en) | Robot cleaner and control method for the same | |
| KR20180082264A (en) | Moving Robot and controlling method | |
| KR102070066B1 (en) | Robot cleaner and method for controlling the same | |
| EP3986679B1 (en) | Moving robot and method of controlling the same | |
| US12059115B2 (en) | Cleaner and method for controlling same | |
| KR20110085499A (en) | Robot cleaner and its control method | |
| KR20150136872A (en) | Cleaning robot and controlling method thereof | |
| KR101223480B1 (en) | Mobile robot and controlling method of the same | |
| KR20250159833A (en) | Moving Robot and controlling method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: LG ELECTRONICS INC., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NOH, DONGKI;BAEK, SEUNGMIN;YOON, JEONGSUK;REEL/FRAME:030635/0828 Effective date: 20130613 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
| FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
| FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20241206 |