
SE546338C2 - Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects - Google Patents

Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects

Info

Publication number
SE546338C2
Authority
SE
Sweden
Prior art keywords
work tool
robotic work
robotic
movement information
dynamic object
Prior art date
Application number
SE2151256A
Other languages
Swedish (sv)
Other versions
SE2151256A1 (en)
Inventor
Adam Tengblad
Herman Jonsson
Jakob Malm
Kamilla Kowalska
Marcus Homelius
Maria Strahl
Micael Hafström
Mikaela Ahlen
Sebastian Bergström
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Priority to SE2151256A priority Critical patent/SE546338C2/en
Priority to DE102022126506.4A priority patent/DE102022126506A1/en
Publication of SE2151256A1 publication Critical patent/SE2151256A1/en
Publication of SE546338C2 publication Critical patent/SE546338C2/en

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/006 Control or measuring arrangements
    • A01D34/008 Control or measuring arrangements for automated or remotely controlled operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291 Fleet control
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/617 Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/69 Coordinated control of the position or course of two or more vehicles
    • G05D1/693 Coordinated control of the position or course of two or more vehicles for avoiding collisions between vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method for use in a robotic work tool (100:1) arranged to operate in an operational area (205), the method comprising detecting a dynamic object (100:2); receiving movement information for the dynamic object (100:2); and determining a path to be travelled for the robotic work tool (100:1) based on movement information for the dynamic object (100:2) so as to avoid collision with the dynamic object (100:2), wherein the method further comprises receiving the movement information via a visual sensor (190), the movement information being signaled by the dynamic object (100:2) through visual indicators (195), wherein the movement information comprises a current speed and/or heading and wherein the dynamic object (100:2) is a second robotic work tool (100:2).

Description

TECHNICAL FIELD This application relates to a robotic work tool and in particular to a system and a method for providing an improved navigation for robotic work tools, such as lawnmowers, in such a system.
BACKGROUND Automated or robotic work tools such as robotic lawnmowers are becoming increasingly popular, and so is the use of more than one robotic work tool in the same work area. The risk of collision between different robots has thus increased.
The patent application published as WO2021115901A1 discloses a work tool system comprising a work tool and a server, the server comprising a controller and a communication interface and the work tool comprising a controller and a communication interface, wherein the server is configured to: receive movement indications for a user through the communication interface; determine a movement pattern based on the movement indications; determine a Do Not Disturb area suitable for the movement pattern; and transmit information on the Do Not Disturb area to the work tool through the communication interface; and wherein the work tool is configured to: receive information on the Do Not Disturb area; and control the work tool so that the Do Not Disturb area is not violated.
The patent application published as US5819008A discloses a sensor system which can effect local communications suitable for exchanging information to avoid collisions between mobile robots. The sensor system also prevents collisions between the mobile robots and obstacles. The system is well-suited for a multi-robot environment where multiple mobile robots operate. The system includes infrared signal transmitters installed in each of the multiple mobile robots for sending transmission data via infrared signals. The system further includes infrared signal receivers installed in each of the multiple mobile robots for receiving the transmission data sent by the infrared signal transmitters. Each mobile robot includes a control unit. The control unit prepares transmission information which includes mobile robot identification information unique to the mobile robot. The transmission information is included in the transmission data sent from the infrared signal transmitter. The control unit also extracts transmission information which is received by the infrared signal receiver.
Thus, there is a need for an improved manner of avoiding collisions.
SUMMARY It is therefore an object of the teachings of this application to overcome or at least reduce those problems by providing a robotic work tool arranged to operate in an operational area, the robotic work tool comprising a controller, wherein the controller is configured to: detect a dynamic object; receive movement information for the dynamic object; and determine a path to be travelled for the robotic work tool based on the movement information for the dynamic object so as to avoid collision with the dynamic object, as per the appended claims.
It is also an object of the teachings of this application to overcome the problems by providing a method for use in a robotic work tool arranged to operate in an operational area, the method comprising: detecting a dynamic object; receiving movement information for the dynamic object; and determining a path to be travelled for the robotic work tool based on the movement information for the dynamic object so as to avoid collision with the dynamic object.
It is also an object of the teachings of this application to overcome the problems by providing a robotic work tool arranged to operate in an operational area, the robotic work tool comprising a controller, wherein the controller is configured to determine movement information for the robotic work tool and to communicate the movement information to another robotic work tool.
It is also an object of the teachings of this application to overcome the problems by providing a method for use in a robotic work tool, the method comprising determining movement information for the robotic work tool and communicating the movement information to another robotic work tool, as per the appended claims.
It is also an object of the teachings of this application to overcome the problems by providing a robotic work tool arranged to operate in an operational area, the robotic work tool being configured to determine an action for the robotic work tool and signal the action. It is also an object of the teachings of this application to overcome the problems by providing a method for use in a robotic work tool arranged to operate in an operational area, the method comprising determining an action for the robotic work tool and signalling the action.
In some embodiments the action is a preventive action.
In some embodiments the robotic work tool is a robotic lawnmower. Further embodiments and aspects are as in the attached patent claims and as discussed in the detailed description.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings. Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS The invention will be described in further detail under reference to the accompanying drawings, in which:
Figure 1A shows an example of a robotic lawnmower according to some embodiments of the teachings herein;
Figure 1B shows a schematic view of the components of an example of a robotic work tool being a robotic lawnmower according to some example embodiments of the teachings herein;
Figure 2A shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 2B shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 2C shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 2D shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 3A shows a corresponding flowchart for a method according to some example embodiments of the teachings herein;
Figure 3B shows a corresponding flowchart for a method according to some example embodiments of the teachings herein;
Figure 4A shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 4B shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 4C shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 5A shows a corresponding flowchart for a method according to some example embodiments of the teachings herein; and
Figure 5B shows a corresponding flowchart for a method according to some example embodiments of the teachings herein.
DETAILED DESCRIPTION The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like reference numbers refer to like elements throughout.
It should be noted that even though the description given herein will be focused on robotic lawnmowers, the teachings herein may also be applied to robotic ball collectors, robotic mine sweepers, robotic farming equipment, or other robotic work tools where a work tool is to be safeguarded from accidentally extending beyond or too close to the edge of the robotic work tool.
Figure 1A shows a perspective view of a robotic work tool 100, here exemplified by a robotic lawnmower 100, having a body 140 and a plurality of wheels 130 (only one side is shown). The robotic work tool 100 may be a multi-chassis type or a mono-chassis type (as in figure 1A). A multi-chassis type comprises more than one main body part, the parts being movable with respect to one another. A mono-chassis type comprises only one main body part.
It should be noted that robotic lawnmowers may be of different sizes, where the size ranges from merely a few decimetres for small garden robots to even more than 1 metre for large robots arranged to service, for example, airfields.
It should be noted that even though the description herein is focussed on the example of a robotic lawnmower, the teachings may equally be applied to other types of robotic work tools, such as robotic watering tools, robotic golfball collectors, and robotic mulchers to mention a few examples.
It should also be noted that the robotic work tool is a self-propelled robotic work tool, capable of autonomous navigation within a work area, where the robotic work tool propels itself across or around the work area in a pattern (random or predetermined).
Figure 1B shows a schematic overview of the robotic work tool 100, also exemplified here by a robotic lawnmower 100. In this example embodiment the robotic lawnmower 100 is of a mono-chassis type, having a main body part 140. The main body part 140 substantially houses all components of the robotic lawnmower 100. The robotic lawnmower 100 has a plurality of wheels 130. In the exemplary embodiment of figure 1B the robotic lawnmower 100 has four wheels 130, two front wheels and two rear wheels. At least some of the wheels 130 are drivably connected to at least one electric motor 155. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used, possibly in combination with an electric motor. In the example of figure 1B, each of the wheels 130 is connected to a common or to a respective electric motor 155 for driving the wheels 130 to navigate the robotic lawnmower 100 in different manners. The wheels, the motor 155 and possibly the battery 150 are thus examples of components making up a propulsion device. By controlling the motors 155, the propulsion device may be controlled to propel the robotic lawnmower 100 in a desired manner, and the propulsion device will therefore be seen as synonymous with the motor(s). It should be noted that wheels 130 driven by electric motors are only one example of a propulsion system and other variants are possible, such as caterpillar tracks.
The robotic lawnmower 100 also comprises a controller 110 and a computer readable storage medium or memory 120. The controller 110 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on the memory 120 to be executed by such a processor. The controller 110 is configured to read instructions from the memory 120 and execute these instructions to control the operation of the robotic lawnmower 100 including, but not being limited to, the propulsion and navigation of the robotic lawnmower.
The controller 110 in combination with the electric motor 155 and the wheels 130 forms the base of a navigation system (possibly comprising further components) for the robotic lawnmower, enabling it to be self-propelled as discussed under figure 1A. The controller 110 may be implemented using any suitable, available processor or Programmable Logic Circuit (PLC). The memory 120 may be implemented using any commonly known technology for computer-readable memories such as ROM, FLASH, DDR, or some other memory technology.
The robotic lawnmower 100 is further arranged with a wireless communication interface 115 for communicating with other devices, such as a server, a personal computer, a smartphone, the charging station, and/or other robotic work tools. Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), Global System for Mobile communications (GSM) and LTE (Long Term Evolution), to name a few. The robotic lawnmower 100 may be arranged to communicate with a user equipment 250 as discussed in relation to figure 2 below for providing information regarding status, location, and progress of operation to the user equipment 250 as well as receiving commands or settings from the user equipment 250. Alternatively or additionally, the robotic lawnmower 100 may be arranged to communicate with a server (referenced 240 in figure 2A) for providing information regarding status, location, and progress of operation as well as receiving commands or settings.
The robotic lawnmower 100 also comprises a grass cutting device 160, such as a rotating blade 160 driven by a cutter motor 165, the grass cutting device being an example of a work tool 160 for a robotic work tool. The robotic lawnmower 100 may further comprise at least one navigation sensor, such as an optical navigation sensor, an ultrasound sensor, a beacon navigation sensor and/or a satellite navigation sensor 185. The optical navigation sensor may be a camera-based sensor and/or a laser-based sensor. The beacon navigation sensor may be a Radio Frequency receiver, such as an Ultra Wide Band (UWB) receiver or sensor, configured to receive signals from a Radio Frequency beacon, such as a UWB beacon. Alternatively or additionally, the beacon navigation sensor may be an optical receiver configured to receive signals from an optical beacon. The satellite navigation sensor may be a GPS (Global Positioning System) device or other Global Navigation Satellite System (GNSS) device. In embodiments where the robotic lawnmower 100 is arranged with a navigation sensor, the magnetic sensors 170, as will be discussed below, are optional. In embodiments relying (at least partially) on a navigation sensor, the work area may be specified as a virtual work area in a map application stored in the memory 120 of the robotic lawnmower 100. The virtual work area may be defined by a virtual boundary.
The robotic lawnmower 100 may also or alternatively comprise deduced reckoning sensors 180. The deduced reckoning sensors may be odometers, accelerometers or other deduced reckoning sensors. In some embodiments, the deduced reckoning sensors are comprised in the propulsion device, wherein a deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time the current is supplied, which will give an indication of the speed and thereby the distance for the corresponding wheel.
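The current-based deduced reckoning described above can be sketched as follows. The proportionality constant relating motor current to wheel speed is a hypothetical placeholder, not a value from this disclosure; in practice it would be calibrated per motor and load:

```python
def estimate_wheel_travel(current_amps, duration_s, k_speed_per_amp=0.05):
    """Deduced-reckoning sketch: wheel speed is assumed proportional to the
    current supplied to its motor (k_speed_per_amp is a made-up constant)."""
    speed_mps = k_speed_per_amp * current_amps   # estimated wheel speed in m/s
    distance_m = speed_mps * duration_s          # distance while current was supplied
    return speed_mps, distance_m
```

The sketch only illustrates that knowing the supplied current and its duration yields a speed estimate and thereby a distance estimate for the corresponding wheel.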
For enabling the robotic lawnmower 100 to navigate with reference to a boundary wire emitting a magnetic field caused by a control signal transmitted through the boundary wire, the robotic lawnmower 100 is, in some embodiments, further configured to have at least one magnetic field sensor 170 arranged to detect the magnetic field, for detecting the boundary wire and/or for receiving (and possibly also sending) information to/from a signal generator (to be discussed with reference to figure 1). In some embodiments, the sensors 170 may be connected to the controller 110, possibly via filters and an amplifier, and the controller 110 may be configured to process and evaluate any signals received from the sensors 170. The sensor signals are caused by the magnetic field being generated by the control signal being transmitted through the boundary wire. This enables the controller 110 to determine whether the robotic lawnmower 100 is close to or crossing the boundary wire, or inside or outside an area enclosed by the boundary wire.
As mentioned above, the robotic lawnmower 100 is in some embodiments arranged to operate according to a map application representing one or more work areas (and possibly the surroundings of the work area(s)) stored in the memory 120 of the robotic lawnmower 100. The map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the work area 205. In some embodiments, the map application includes one or more start regions and one or more goal regions for each work area. In some embodiments, the map application also includes one or more transport areas.
As discussed in the above, the map application is in some embodiments stored in the memory 120 of the robotic working tool(s) 100. In some embodiments the map application is stored in the server (referenced 240 in figure 2A). In some embodiments maps are stored both in the memory 120 of the robotic working tool(s) 100 and in the server, wherein the maps may be the same maps or show subsets of features of the area.
The robotic working tool 100 may also comprise additional sensors 190 for enabling operation of the robotic working tool 100, such as visual sensors (for example a camera), ranging sensors for enabling SLAM-based navigation (Simultaneous Localization and Mapping), moisture sensors, collision sensors, and wheel load sensors, to mention a few. In particular, in some embodiments, the robotic work tool 100 comprises at least one visual sensor for receiving visual indications that may be interpreted to correspond to movement information.
In some embodiments, the robotic work tool 100 may also comprise visual indicators 195 such as lamps, light ramps or panels, Light Emitting Diodes or sirens for communicating visual indications of movement information. More details on this will be discussed further below.
In some embodiments the visual indicators 195 are arranged at various sides of the robotic work tool 100 so that the visual indicator is visually associated with a direction of the robotic work tool. In such embodiments the visual indicator at one side may be used to signal a turn in the direction of that side. Colors may be used to indicate the order of the turn or action, or perhaps the speed at which it will be performed, to mention a few examples of possibilities.
Figure 2A shows a robotic work tool system 200 in some embodiments. The schematic view is not to scale. The robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein. It should be noted that the operational area 205 shown in figure 2A is simplified for illustrative purposes. The robotic work tool system comprises a boundary 220 that may be virtual and/or electromagnetic, such as a magnetic field generated by a control signal being transmitted through a boundary wire, which magnetic field is sensed by a sensor in the robotic work tool. The robotic work tool system 200 further comprises a station 210, possibly at a station location. A station location may alternatively or additionally indicate a service station, a parking area, a charging station or a safe area where the robotic work tool may remain for a time period between or during operation sessions.
As with figures 1A and 1B, the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within a work area.
The one or more robotic working tools 100 of the robotic work tool system 200 are arranged to operate in an operational area 205, which in this example comprises a first work area 205A and a second work area 205B connected by a transport area TA. However, it should be noted that an operational area may comprise a single work area or one or more work areas, possibly arranged adjacent for easy transition between the work areas, or connected by one or more transport paths or areas, also referred to as corridors. In the following, work areas and operational areas will be referred to interchangeably, unless specifically indicated.
The operational area 205 is in this application exemplified as a garden, but can also be other work areas as would be understood, such as an airfield. As discussed above, the garden may contain a number of obstacles, for example a number of trees, stones, slopes and houses or other structures.
In some embodiments the robotic work tool is arranged or configured to traverse and operate in work areas that are not essentially flat, but contain terrain that is of varying altitude, such as undulating terrain comprising hills or slopes. The ground of such terrain is not flat and it is not straightforward to determine an angle between a sensor mounted on the robotic work tool and the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are not easily discerned from the ground. Examples of such are grass- or moss-covered rocks, roots or other obstacles that are close to the ground and of a similar colour or texture as the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are overhanging, i.e. obstacles that may not be detectable from the ground up, such as low hanging branches of trees or bushes. Such a garden is thus not simply a flat lawn to be mowed or similar, but a work area of unpredictable structure and characteristics. The work area 205 exemplified with reference to figure 2A may thus be such a non-uniform work area, as disclosed in this paragraph, that the robotic work tool is arranged to traverse and/or operate in.
As shown in figure 2A, the robotic working tool(s) 100 is arranged to navigate in one or more work areas 205A, 205B, possibly connected by a transport area TA. The robotic working tool system 200 may alternatively or additionally comprise or be arranged to be connected to a server 240, such as a cloud service, a cloud server application or a dedicated server 240. The connection to the server 240 may be direct from the robotic working tool 100, direct from a user equipment 250, indirect from the robotic working tool 100 via the service station 210, and/or indirect from the robotic working tool 100 via the user equipment 250. As a skilled person would understand, a server, a cloud server or a cloud service may be implemented in a number of ways utilizing one or more controllers 240A and one or more memories 240B that may be grouped in the same server or over a plurality of servers.
In the below several embodiments of how the robotic work tool may be adapted will be disclosed. It should be noted that all embodiments may be combined in any combination providing a combined adaptation of the robotic work tool.
The inventors have realized that estimating a path of a dynamic obstacle, such as a human, an animal or a robotic work tool, is a complex task, as there are many variables, and the movement of all these objects is prone to change without warning.
The inventors have also realized that a centralized control of multiple robots requires a constant connection between a server and the robotic work tool(s) to be able to react to changes in the navigation of one or more robotic work tools.
In order to avoid errors when estimating a movement, and to avoid the requirement of a constant connection to and loading of a server, the inventors have realized a simple solution that only loads (computationally) those robotic work tools that are affected by one another, i.e. which are in close proximity (such as within the same work area, or in the same region or portion of the same work area) of one another, i.e. in such proximity that there may be a risk of collisions and/or blocking.
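The proximity criterion above can be sketched as a simple pairwise distance check, so that only nearby robots need to exchange and process movement information. The identifiers, positions and the radius below are illustrative assumptions:

```python
import math

def robots_in_proximity(robots, radius_m):
    """Return pairs of robots close enough that collisions/blocking are possible.
    `robots` maps a robot id to an (x, y) position in metres (illustrative)."""
    pairs = []
    ids = sorted(robots)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if math.dist(robots[a], robots[b]) <= radius_m:
                pairs.append((a, b))
    return pairs
```

Only the pairs returned by such a check would need to consider each other's movement information; all other robots impose no computational load on one another.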
Figure 2B shows a simplified view of a robotic work tool system 200 as in figure 2A. Figure 2B shows how a first robotic work tool 100:1 operates in a work area 205. A second robotic work tool 100:2 is also operating in the same work area 205. The second robotic work tool 100:2 is an example of another object, being a dynamic object. As a skilled person would understand, the teachings herein also apply to other objects apart from robotic work tools, such as rider lawnmowers, manual lawnmowers or other gardening equipment, which may also be adapted to communicate movement information in any manner as will be discussed below.
The second robotic work tool 100:2 has a current movement, which is indicated by the arrow extending from the second robotic work tool 100:2, representing a heading and/or a speed in that heading. The second robotic work tool 100:2 is configured to transmit or communicate information regarding its movement. Correspondingly, the first robotic work tool 100:1 is configured to receive the movement information from the second robotic work tool 100:2. By the second robotic work tool 100:2 actively communicating information regarding its movement to other robotic work tools in the vicinity, the receiving robotic work tools do not have to estimate the movement, as the information is received from the robotic work tool itself. Furthermore, as the second robotic work tool communicates the information, there is no immediate need for a server, which reduces the required computational load of any server used.
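A minimal sketch of such a broadcast message is given below, assuming a JSON payload; the disclosure does not specify a wire format, and the field names are invented for illustration:

```python
import json

def make_movement_message(robot_id, heading_deg, speed_mps):
    """Sketch of the broadcast described above: the moving robot announces its
    own heading and speed so that neighbours need not estimate them."""
    return json.dumps(
        {"id": robot_id, "heading_deg": heading_deg, "speed_mps": speed_mps}
    )

def read_movement_message(payload):
    """Counterpart on the receiving robot: recover the announced movement."""
    return json.loads(payload)
```

The same fields could equally be carried over an RF beacon or encoded visually; only the content (who is moving, where, and how fast) matters for the receiving robot.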
Figure 2B shows how the second robotic work tool 100:2 communicates the movement information, as is indicated by the circle pattern extending from the second robotic work tool 100:2. The first robotic work tool 100:1 detects the second robotic work tool 100:2 and receives movement information for the second robotic work tool 100:2. In some embodiments, the first robotic work tool is configured to detect the second robotic work tool 100:2 by receiving the movement information. In some such embodiments, the movement information is transmitted as an RF beacon, for example a BLE beacon.
In some embodiments, the first robotic work tool is configured to detect the second robotic work tool 100:2 by detecting the presence of the second robotic work tool 100:2, for example through visual detection or through sensing communications being transmitted by the second robotic work tool 100:2, such as RF beacons (for example BLE beacons). In some such embodiments, the first robotic work tool 100 is configured to request movement information from the second robotic work tool in response to detecting the second robotic work tool.
In some embodiments, the first robotic work tool is configured to receive the movement information through the visual sensor 190, whereby the second robotic work tool is configured to communicate the movement information through visual indicators. In such embodiments, different colors, different numbers of visual indicators and/or different patterns may be used to signal the movement information. The visual signaling is thus received by the first robotic work tool, which is configured to translate the visual indications to movement information. This will be discussed in further detail in relation to figures 4A, 4B, 4C, 5A and 5B.
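As a sketch of how such visual signaling could be encoded and decoded, the mapping below ties a navigation action to a color, a number of active indicators and a blink frequency. All concrete colors, counts and frequencies are illustrative assumptions; the description only states that different colors, numbers of visual indicators and/or patterns may be used.

```python
# Hypothetical visual code: navigation action -> (color, indicator count, blink rate).
# None of these concrete values come from the patent; they only illustrate that a
# receiving robotic work tool can translate visual indications back to movement
# information.
VISUAL_CODE = {
    "turn-left":        {"color": "yellow", "leds": 1, "blink_hz": 2.0},
    "turn-right":       {"color": "yellow", "leds": 2, "blink_hz": 2.0},
    "change-speed":     {"color": "blue",   "leds": 1, "blink_hz": 1.0},
    "maintain-heading": {"color": "green",  "leds": 1, "blink_hz": 0.0},
}

def decode(color: str, leds: int, blink_hz: float) -> str:
    """Translate an observed visual indication back to a navigation action."""
    for action, code in VISUAL_CODE.items():
        if (code["color"], code["leds"], code["blink_hz"]) == (color, leds, blink_hz):
            return action
    return "unknown"
```

A real system would additionally have to tolerate noisy observations (misread colors, missed blinks), which this lookup-table sketch ignores.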
In some embodiments the movement information comprises a current speed and/or heading. In some embodiments the movement information comprises a movement pattern. Such a movement pattern may be a random pattern, a pattern of parallel lines to be traversed, or a more advanced pattern. In some embodiments the movement information comprises an area to be covered. Such an area may be a current work area or a portion of such a work area. The area is thus, in some embodiments, an area to be avoided by the first robotic work tool. In some embodiments the movement information comprises a next navigation action. Such a navigation action may be to change heading, change speed, maintain speed, maintain heading, and/or turn. In embodiments where the movement information is communicated visually, the second robotic work tool may thus visually indicate a next navigation action by activating the visual indicators 195 in a manner corresponding to the next navigation action.
It should be noted that the second robotic work tool may be configured to communicate the movement information by RF communication, by visual indicators, alone or in combination. A combination may be to transmit the movement information both as RF communication and as visual indicators. Another combination may be to communicate some of the movement information via RF communication and some (possibly overlapping) via visual indicators. This enables a variety of robotic work tools to receive the needed or necessary movement information regardless of the first robotic work tool's capabilities. It also enables other occupants of the area to be made aware of the most prominent movement information through the visual indicators, while more precise or more comprehensive information is communicated through the RF communication.
In some embodiments the movement information comprises a time frame for said movement information. In such embodiments, the movement information comprises a time frame or period during which the movement information will be valid. For example, if the movement information is a heading, the time frame may indicate for how long the robotic work tool will travel in that heading.
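The movement-information fields enumerated in the last few paragraphs could be collected into a single record, sketched below; the field names and units are illustrative assumptions rather than anything prescribed by the description.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class MovementInfo:
    """Movement information a robotic work tool may communicate.

    Every field is optional: an implementation may send only a heading,
    only a pattern, and so on, per the embodiments described above."""
    speed_mps: Optional[float] = None                  # current speed
    heading_deg: Optional[float] = None                # current heading
    pattern: Optional[str] = None                      # e.g. "random" or "parallel-lines"
    area: Optional[List[Tuple[float, float]]] = None   # polygon to be covered / avoided
    next_action: Optional[str] = None                  # e.g. "turn", "change-speed"
    valid_for_s: Optional[float] = None                # time frame the information is valid
```

Such a record could be serialized over RF, or only its most prominent fields (e.g. `next_action`) signaled visually, matching the combinations discussed below.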
As the first robotic work tool has received the movement information, the first robotic work tool 100:1 is configured to determine a path to be travelled for the robotic work tool based on the movement information for the second robotic work tool 100:2 so as to avoid collision with the second robotic work tool.
For embodiments where the movement information includes a heading, the controller is further configured to adapt a navigation of the robotic work tool (100:1) to steer in a heading not crossing a heading of the dynamic object. This is shown in figure 2C, showing a simplified view of the system of figure 2B. In some such embodiments, the controller is further configured to adapt the navigation of the robotic work tool (100:1), but maintain a planned speed of the robotic work tool. By maintaining the planned speed, the efficiency of the robotic work tool is maintained.
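One way to realize "steer in a heading not crossing a heading of the dynamic object" is to project both straight-line paths forward and reject any own heading whose projected path comes too close to the projected path of the other tool. The look-ahead horizon, sampling step, clearance radius and candidate heading offsets below are all illustrative assumptions, not values from the description.

```python
import math

def rays_cross(p1, h1, p2, h2, horizon=20.0, step=0.5, radius=1.0):
    """True if two straight paths (start position, heading in degrees) come
    within `radius` metres of each other inside the look-ahead horizon,
    assuming both move at the same unit speed (a simplifying assumption)."""
    d1 = (math.cos(math.radians(h1)), math.sin(math.radians(h1)))
    d2 = (math.cos(math.radians(h2)), math.sin(math.radians(h2)))
    t = 0.0
    while t <= horizon:
        a = (p1[0] + d1[0] * t, p1[1] + d1[1] * t)
        b = (p2[0] + d2[0] * t, p2[1] + d2[1] * t)
        if math.dist(a, b) < radius:
            return True
        t += step
    return False

def pick_heading(own_pos, own_heading, other_pos, other_heading):
    """Keep the planned heading if safe; otherwise steer away in 15-degree
    steps, preserving the planned speed as discussed above."""
    for offset in (0, 15, -15, 30, -30, 45, -45, 90, -90):
        candidate = (own_heading + offset) % 360
        if not rays_cross(own_pos, candidate, other_pos, other_heading):
            return candidate
    return (own_heading + 180) % 360  # fall back to reversing
```

For two tools approaching head-on, the smallest deviation that clears the projected conflict is chosen, so the planned speed never needs to change.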
For embodiments where the movement information includes a movement of the second robotic work tool, the controller is further configured to adapt a navigation of the first robotic work tool 100:1 to follow and/or align the navigation of the robotic work tool with the movement of the second robotic work tool. This is also shown in figure 2C.
For embodiments where the movement information includes an area (or at least a portion of an area) of the second robotic work tool 100:2, the controller is further configured to adapt a navigation of the robotic work tool (100:1) to stay out of such an area. The area may be a work area or a portion of a work area. In some such embodiments, the first robotic work tool operates in a separate work area. In some embodiments, the first robotic work tool operates in a different portion of the work area of the second robotic work tool 100:2. This is shown in figure 2D, showing a simplified view of the system of figure 2B.
In embodiments where the first robotic work tool is in connection with a server 240, the movement information may be received partially or in full via the server 240, the server thus acting as a proxy or intermediate for the first robotic work tool 100:1. In some such embodiments, the server is configured to at least partially pre-process the movement information into a format that is easy to process for the robotic work tool. For example, the server may translate sensor input to an actual heading and/or speed.
In some embodiments the second robotic work tool 100:2 is a robotic lawnmower. In some embodiments the first robotic work tool is also a robotic lawnmower.
As will be understood by a skilled person, the operation above, although aimed at the first robotic work tool, also applies to the corresponding behavior of the second robotic work tool. It should also be noted that the first robotic work tool may be configured to perform the operation of the second robotic work tool and vice versa.
In some embodiments, the second robotic work tool 100:2 is thus configured to determine movement information for the robotic work tool 100:2 and to communicate the movement information to another robotic work tool 100:1.
In some embodiments, the second robotic work tool 100:2 is further configured to communicate the movement information through a radio frequency communication interface. In some embodiments, the radio frequency communication interface is configured to transmit a beacon for communicating the movement information.
In some embodiments, the second robotic work tool 100:2 is further configured to receive a request from the another robotic work tool 100:1 and in response thereto communicate the movement information.
In some embodiments, the second robotic work tool 100:2 is further configured to communicate the movement information directly to the another robotic work tool 100:1.
In some embodiments, the second robotic work tool 100:2 is further configured to communicate the movement information to the another robotic work tool (100:1) via a server.
Figure 3A shows a flowchart for a general method according to herein. The method is for use in a robotic work tool as in figures 1A and 1B. The method comprises detecting 310 a dynamic object and receiving 320 movement information for the dynamic object. The method further comprises determining 330 a path to be travelled for the robotic work tool based on the movement information for the dynamic object so as to avoid collision with the dynamic object.
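The three steps of figure 3A can be sketched as one pass of a control routine. The `robot` interface below (detect/receive/plan methods) is entirely hypothetical; the description prescribes no particular API.

```python
def avoid_dynamic_objects(robot):
    """One pass of the method of figure 3A: detect (310), receive (320),
    determine a path (330). Falls back to the planned path when no dynamic
    object is detected."""
    obj = robot.detect_dynamic_object()          # step 310
    if obj is None:
        return robot.planned_path()
    info = robot.receive_movement_info(obj)      # step 320
    return robot.plan_path_avoiding(obj, info)   # step 330
```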
Figure 3B shows a flowchart for a general method according to herein. The method is for use in a robotic work tool as in figures 1A and 1B. The method comprises determining 315 movement information for the robotic work tool 100:2 and communicating 325 the movement information to another robotic work tool 100:1.

Returning to figure 2B, it is discussed that a robotic work tool is in some embodiments arranged to indicate movement information visually. In such embodiments, different colors, different numbers of visual indicators and/or different patterns may be used to signal the movement information. The visual signaling is thus emitted or provided by the robotic work tool to communicate its intended movement pattern, or current movement status.
Such information is in some embodiments intended for living entities in or in the proximity of the operational area, whereby a living entity, for example a human or an animal, can be made aware of the movement of the robotic work tool.
Figure 4A is a schematic view showing how a first robotic work tool 100:1 is intending to navigate in one manner (such as in a series of navigational actions (turning, changing speed, changing operational status, stopping, starting, and so on) or in patterns) and a second robotic work tool 100:2 is intending to operate in a second manner, as indicated by the difference in the dotted arrows indicating the paths to be taken. In both instances, the robotic work tools 100 provide visual information regarding the intended actions in different manners, as indicated by the difference in visual provisions PV.
In some embodiments, the robotic work tool is thus configured to provide visual information regarding an intended or current navigation, including navigational actions and/or operational states, such as active, idle, pausing, waiting, or error, to mention a few examples.
The inventors have further realized that such movement information may be of interest not only to other robotic work tools, but also to living entities (such as humans or animals) being in the operational area, and thereby possibly being affected by the robotic work tool's movement. Furthermore, the inventors have realized that such movement information may also be of interest to humans in the vicinity of the work area who are not directly affected by the movements of the robotic work tools 100, such as a parent or guardian of a child or animal being in the vicinity of the robotic work tool 100. The guardian or parent is hereafter referred to as an observer, as also other situations are of interest and applicable to the teachings herein. This enables the observer to assess whether there is a risk of collision or other engagement between any of the robotic work tools and the living entity.
In order to facilitate such determinations, by the observer and/or by the entity in the vicinity of the robotic work tool(s) 100, the inventors propose a robotic work tool that is configured to detect an object, identify the object as a living entity, and adapt its operation (possibly including navigation) accordingly by taking preventive actions, and also to signal this change visually to enable the living entity or observer to realize that the robotic work tool has recognized the entity and is taking preventive actions.
Examples of preventive actions are to adapt navigation so that a collision is avoided, halt to allow the living entity to move away from the robotic work tool, disable the working tool 160 so as to lower the risk of damage, cancel operation, move to a different work area, move to a safe zone, keep a distance from the living entity, and return to a service station, to mention a few examples.
Figure 4B is a schematic view showing how a robotic work tool 100 is configured to detect a living entity LE and identify it. The detection may be made in a number of manners that are known in the related art, but for the examples presented herein it will be assumed that the detection is made based on visual recognition utilizing the visual sensor.
In the example of figure 4B, the robotic work tool 100 thus detects the living entity, and in response thereto the robotic work tool 100 classifies the living entity as a living entity, and possibly what kind of living entity (animal, child, adult) in some embodiments. In such embodiments, the preventive action may be selected based on the classified living entity. For example, for a human a first operating pattern or preventive action may be chosen that is based on an assumption that a human is basically predictable in its movements, whereas for an animal a second operating pattern or preventive action may be chosen that is based on the assumption that an animal is basically unpredictable in its movements. In some cases, the classification may be seen as part of the detection.

In some embodiments, the actual living entity may also be identified, such as for example a type of animal, an individual animal or an individual human. In such embodiments, the preventive action may be selected based on the identified living entity. For example, for an identified human a preventive action may be chosen that is based on stored knowledge about that human. In some cases, the identification may be seen as part of the detection.
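A minimal sketch of classification-dependent selection, under the predictable-human / unpredictable-animal assumption above; the concrete action names and categories are invented for illustration.

```python
# Hypothetical table from classification result to preventive action. The
# mapping encodes the assumption above: adults are treated as broadly
# predictable, animals as unpredictable and therefore given a wider berth.
PREVENTIVE_ACTION = {
    "adult":  "adapt-navigation",
    "child":  "halt",
    "animal": "disable-work-tool-and-keep-distance",
}

def select_preventive_action(classification: str) -> str:
    """Fall back to halting when the classification is unknown, as a
    conservative default."""
    return PREVENTIVE_ACTION.get(classification, "halt")
```

An identification stage could refine this further, e.g. overriding the table with stored knowledge about a specific, recognized individual.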
The robotic work tool is thus in some embodiments configured to detect a living entity and take preventive actions accordingly.

Furthermore, the robotic work tool is also in some embodiments configured to classify a living entity and take preventive actions accordingly. Still further, the robotic work tool is also in some embodiments configured to identify a living entity and take preventive actions accordingly.
The preventive action may also include a signaling, such as a loud sound, a bright and/or flashing light, a recorded or synthesized voice or other audio message, to mention a few examples.
In some such embodiments, the robotic work tool is thus enabled to select a different signaling based on the type or identity of the detected object, whereby one signaling may be selected for a dog, one for a cat and one for a human, and more particularly, one signaling may be selected for a first human and a second for a second human.
In some embodiments the robotic work tool 100 is further configured to determine whether there is a risk of collision prior to taking preventive action.
In some such embodiments, the robotic work tool 100 is configured to determine that there is a risk of collision if the robotic work tool is within a predetermined distance from the living entity. In some such embodiments the predetermined distance is based on the type or identity of the living entity. In some embodiments the predetermined distance is based on the current location of the robotic work tool, such as which work area or which transport area. The predetermined distance may be in the range 0-5 m, 0-10 m, or 0-20 m, within the same work area, within the same transport area or within the same operational area.

In some such embodiments, the robotic work tool 100 is configured to determine that there is a risk of collision by determining a current heading of the living entity and determining whether the current heading (as indicated by the dotted arrow) is on a collision course with an intended operating pattern.
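The two risk criteria above — already within the predetermined distance, or current headings on a collision course — could be sketched as follows. The 5 m default matches the low end of the 0-5/0-10/0-20 m ranges mentioned; the speeds, look-ahead window and sampling step are illustrative assumptions.

```python
import math

def collision_risk(own_pos, own_heading_deg, own_speed,
                   entity_pos, entity_heading_deg, entity_speed,
                   safe_distance=5.0, horizon_s=10.0, step_s=0.5):
    """True if (1) the living entity is already within the predetermined
    distance, or (2) projecting both current headings forward as straight
    lines brings the two within that distance inside the look-ahead window."""
    if math.dist(own_pos, entity_pos) <= safe_distance:
        return True
    own_dir = (math.cos(math.radians(own_heading_deg)),
               math.sin(math.radians(own_heading_deg)))
    ent_dir = (math.cos(math.radians(entity_heading_deg)),
               math.sin(math.radians(entity_heading_deg)))
    t = 0.0
    while t <= horizon_s:
        a = (own_pos[0] + own_dir[0] * own_speed * t,
             own_pos[1] + own_dir[1] * own_speed * t)
        b = (entity_pos[0] + ent_dir[0] * entity_speed * t,
             entity_pos[1] + ent_dir[1] * entity_speed * t)
        if math.dist(a, b) <= safe_distance:
            return True
        t += step_s
    return False
```

In practice the predetermined distance would be swapped per entity type or per area, as the embodiments above describe, rather than fixed at one value.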
As the robotic work tool has taken or is about to take the preventive action, the robotic work tool is also configured to signal this. The signaling may be visual, possibly supplemented by audio signaling.
Figure 4C is a schematic view showing how a robotic work tool 100 is configured to take a preventive action (as indicated by the changed dotted arrow emanating from the robotic work tool 100) and signal that preventive action is to be taken, possibly by signaling the preventive action.
This enables the living entity and/or an observer to realize that the robotic work tool 100 has detected the living entity and taken preventive actions, which enables the living entity or observer to go on with its current task, thereby increasing the living entity's or the observer's productivity.
To enable the robotic work tool to provide visual signaling, the robotic work tool is in some embodiments arranged with signaling actuators 195, including visual indicators (one or more light emitting diodes, strobes, lamps, sirens, projectors, to mention a few examples). In some embodiments the signaling actuators 195 include audible indicators, such as horns, or speakers for emitting audio processed by the controller, such as audio recordings and/or synthesized words.
Figure 5A shows a general flowchart according to a method of the teachings herein, where a robotic work tool 100 is enabled to generate signaling 510 indicating an action. In one embodiment, the action is an intended operating pattern or path as discussed above. This is an alternative functionality and may in some embodiments only be used when an object (another robotic work tool or a living entity) is in the vicinity. It is thus optional, as indicated by the dotted box.
The robotic work tool is further configured to detect 520 a living entity (possibly including classifying 522 it and also possibly identifying 524 it), and select 530 a preventive action. The robotic work tool is also configured to signal 540 the selected preventive action.
It should be noted that the actions being signaled (initial as well as preventive) include, in some embodiments, the movement information discussed in relation to figures 2A, 2B, 2C, 2D, 3A and 3B.
It should also be noted that, as the option of initially signaling the movement information is optional, the action to be signaled may be seen as the preventive action, whereby signaling the action 510 and signaling the preventive action 540 are the same signaling.
Figure 5B shows a general flowchart according to a method of the teachings herein, where a robotic work tool 100 is enabled to generate signaling 510 indicating an action in further detail, and as can be seen in figure 5B, the function of signaling an action may be performed independently of detecting living objects. The robotic work tool is configured for determining 505 movement information, as discussed for example in relation to figure 2B above, and signaling 515 the movement information, in one embodiment through RF communication through the communication interface and in some embodiments (possibly combined) through the visual indicators 195.

A robotic work tool may thus in some embodiments be configured to perform the method according to figure 5A, with or without performing the signaling of an action 510, or possibly signaling the action 510 as signaling the preventive action. The robotic work tool may also (or alternatively) in some embodiments be configured to perform the method according to figure 5B, independently as discussed above for example in relation to figures 2A, 2B, 2C and 2D, or in combination with performing the method according to figure 5A.

Claims (10)

1. A robotic work tool (100:1) arranged to operate in an operational area, the robotic work tool comprising a visual sensor (190) and a controller (110), wherein the controller (110) is configured to: detect a dynamic object (100:2); receive movement information for the dynamic object (100:2); and determine a path to be travelled for the robotic work tool (100:1) based on movement information for the dynamic object (100:2) so as to avoid collision with the dynamic object (100:2), wherein the controller (110) is further configured to receive the movement information via the visual sensor (190), the movement information being signaled by the dynamic object (100:2) through visual indicators (195), wherein the movement information comprises a current speed and/or heading and wherein the dynamic object is a second robotic work tool (100:2), wherein the movement information comprises a next navigation action.
2. The robotic work tool (100:1) according to claim 1, wherein the movement information comprises a movement pattern.

3. The robotic work tool (100:1) according to any preceding claim, wherein the movement information comprises an area to be covered.

4. The robotic work tool (100:1) according to any preceding claim, wherein the movement information comprises a time frame for said movement information.

5. The robotic work tool (100:1) according to any preceding claim, wherein the controller is further configured to adapt a navigation of the robotic work tool (100:1) to steer in a heading not crossing a heading of the dynamic object.

6. The robotic work tool (100:1) according to claim 5, wherein the controller is further configured to adapt the navigation of the robotic work tool (100:1), but maintain a planned speed of the robotic work tool.

7. The robotic work tool (100:1) according to any preceding claim, wherein the controller is further configured to adapt a navigation of the robotic work tool (100:1) to follow and/or align the navigation of the robotic work tool with the movement of the dynamic object.

8. The robotic work tool (100:1) according to claim 7, wherein the second robotic work tool (100:2) is a robotic lawnmower.

9. The robotic work tool (100:1) according to any preceding claim, wherein the robotic work tool is a robotic lawnmower.
10. A method for use in a robotic work tool (100:1) arranged to operate in an operational area (205), the method comprising: detecting a dynamic object (100:2); receiving movement information for the dynamic object (100:2); and determining a path to be travelled for the robotic work tool (100:1) based on movement information for the dynamic object (100:2) so as to avoid collision with the dynamic object (100:2), wherein the method further comprises receiving the movement information via a visual sensor (190), the movement information being signaled by the dynamic object (100:2) through visual indicators (195), wherein the movement information comprises a current speed and/or heading and wherein the dynamic object (100:2) is a second robotic work tool (100:2), wherein the movement information comprises a next navigation action.
SE2151256A 2021-10-13 2021-10-13 Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects SE546338C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2151256A SE546338C2 (en) 2021-10-13 2021-10-13 Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects
DE102022126506.4A DE102022126506A1 (en) 2021-10-13 2022-10-12 IMPROVED NAVIGATION FOR A ROBOTIC WORK EQUIPMENT SYSTEM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2151256A SE546338C2 (en) 2021-10-13 2021-10-13 Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects

Publications (2)

Publication Number Publication Date
SE2151256A1 SE2151256A1 (en) 2023-04-14
SE546338C2 true SE546338C2 (en) 2024-10-08

Family

ID=85705627

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2151256A SE546338C2 (en) 2021-10-13 2021-10-13 Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects

Country Status (2)

Country Link
DE (1) DE102022126506A1 (en)
SE (1) SE546338C2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5819008A (en) * 1995-10-18 1998-10-06 Rikagaku Kenkyusho Mobile robot sensor system
US6408226B1 (en) * 2001-04-24 2002-06-18 Sandia Corporation Cooperative system and method using mobile robots for testing a cooperative search controller
US20090043440A1 (en) * 2007-04-12 2009-02-12 Yoshihiko Matsukawa Autonomous mobile device, and control device and program product for the autonomous mobile device
US20160016315A1 (en) * 2014-07-16 2016-01-21 Google Inc. Virtual safety cages for robotic devices
US20160132059A1 (en) * 2014-11-11 2016-05-12 Google Inc. Position-Controlled Robotic Fleet With Visual Handshakes
US20190337155A1 (en) * 2018-05-04 2019-11-07 Lg Electronics Inc. Plurality of robot cleaner and a controlling method for the same
WO2021115901A1 (en) * 2019-12-13 2021-06-17 Husqvarna Ab Improved scheduling for a robotic work tool


Also Published As

Publication number Publication date
SE2151256A1 (en) 2023-04-14
DE102022126506A1 (en) 2023-04-13

Similar Documents

Publication Publication Date Title
US9851718B2 (en) Intelligent control apparatus, system, and method of use
EP4118509B1 (en) System and method for improved navigation of a robotic work tool
EP4017248A1 (en) Improved operation for a robotic work tool
JP2022099084A (en) Agricultural machine, and system and method controlling the same
WO2022203562A1 (en) Improved navigation for a robotic work tool
EP4419976B1 (en) Improved navigation for a robotic lawnmower
KR102788982B1 (en) Agricultural vehicles, work vehicle collision warning systems and work vehicles
SE546338C2 (en) Robotic work tool and method for use in a robotic work tool so as to avoid collisions with dynamic objects
WO2023146451A1 (en) Improved operation for a robotic work tool system
US12487604B2 (en) Navigation for a robotic work tool system
US20240182074A1 (en) Operation for a robotic work tool
EP4085745B1 (en) Improved navigation for a robotic work tool
US20260000012A1 (en) Navigation for a robotic lawnmower with regards to woody plants
JP7744795B2 (en) Automatic driving system for agricultural work cart and agricultural work cart
EP4371390A1 (en) System and method for controlling at least one robotic mower
EP4546074A1 (en) Improved navigation for a robotic work tool system
US20230086392A1 (en) Navigation for a robotic work tool system
SE2250557A1 (en) Navigation for a robotic work tool system
SE545376C2 (en) Navigation for a robotic work tool system
SE2151613A1 (en) Improved navigation for a robotic work tool system
SE2250247A1 (en) Improved navigation for a robotic work tool system
JP2026005863A (en) Driving control system, work vehicle, and driving control method
WO2023121528A1 (en) Improved navigation for a robotic work tool system
SE2350697A1 (en) Improved navigation for a robotic work tool system
WO2023121535A1 (en) Improved navigation for a robotic work tool system