
GB2638007A - Subsystem control mode selection method and apparatus - Google Patents

Subsystem control mode selection method and apparatus

Info

Publication number
GB2638007A
GB2638007A GB2401888.9A GB202401888A
Authority
GB
United Kingdom
Prior art keywords
image classification
control mode
terrain
subsystem control
probability
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2401888.9A
Other versions
GB202401888D0 (en)
Inventor
Walker Stuart
Frampton Thomas
Silk James
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB2401888.9A priority Critical patent/GB2638007A/en
Publication of GB202401888D0 publication Critical patent/GB202401888D0/en
Priority to PCT/EP2025/053295 priority patent/WO2025172190A1/en
Priority to GBGB2501866.4A priority patent/GB202501866D0/en
Publication of GB2638007A publication Critical patent/GB2638007A/en

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/182Selecting between different operative modes, e.g. comfort and performance modes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/05Type of road, e.g. motorways, local streets, paved or unpaved roads

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Transportation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Vehicle Body Suspensions (AREA)

Abstract

A control system (51) for selecting a subsystem control mode of a vehicle (5) to facilitate traversal of a section of terrain, including processors (55) configured to receive image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, based on image data captured by imaging sensors (21), as each of a plurality of terrain classes. A first weighting matrix is applied to modify the image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode. A second weighting matrix is applied to modify the image classification probabilities to generate one or more second modified image classification probabilities, the second weighting matrix being associated with a second subsystem control mode. First and second subsystem control mode probabilities are determined based on the respective first and second modified image classification probabilities, providing an indication of a suitability of the first and second subsystem control modes for traversal of the section of terrain. The control system selects one of the first and second subsystem control modes corresponding to the higher of the first and second subsystem control mode probabilities.

Description

SUBSYSTEM CONTROL MODE SELECTION METHOD AND APPARATUS
TECHNICAL FIELD
The present disclosure relates to a subsystem control mode selection method and apparatus. Aspects of the invention relate to a control system for selecting a subsystem control mode, a system, a vehicle, a method of selecting a subsystem control mode and computer readable instructions.
BACKGROUND
It is known to provide a control system in a vehicle to select a subsystem control mode in dependence on a class (or type) of terrain being traversed by the vehicle. The control system may, for example, receive state indicators providing an indication of an operating state of the vehicle. The state indicators are typically received from sensors provided on-board the vehicle to monitor operating parameter(s) of the vehicle. By analysing the state indicators, the control system can identify a subsystem control mode which is suitable for traversing the prevailing terrain. A potential limitation of this approach is that the control system is re-active and can only use vehicle inputs to infer the appropriate subsystem control mode.
It is an aim of the present invention to address one or more of the disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system for selecting a subsystem control mode, a system, a vehicle, a method of selecting a subsystem control mode and computer readable instructions as claimed in the appended claims.
According to an aspect of the present invention there is provided a control system for selecting a subsystem control mode of a vehicle to facilitate traversal of a section of terrain, wherein the control system comprises one or more processors collectively configured to: receive a plurality of image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, in dependence on image data captured by one or more imaging sensor, as each of a plurality of terrain classes; apply a first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode; apply a second weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more second modified image classification probabilities, the second weighting matrix being associated with a second subsystem control mode; determine a first subsystem control mode probability in dependence on the one or more first modified image classification probabilities, the first subsystem control mode probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determine a second subsystem control mode probability in dependence on the one or more second modified image classification probabilities, the second subsystem control mode probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; and select one of the first and second subsystem control modes corresponding to the higher of the first and second subsystem control mode probabilities.
The control system facilitates classification of the section of terrain as being one of a plurality of terrain classes. The terrain classes are predefined, for example comprising one or more of the following: a dirt road, grass, mud and ruts, (paved/metalled) road, rock, sand and snow. The image data is captured by one or more imaging sensors and represents an image comprising the section of terrain to be classified. The image data is processed to generate a plurality of image classification probabilities indicating a likelihood that the section of terrain is each of the plurality of terrain classes. At least in certain embodiments, an artificial neural network may process the image data to generate the image classification probabilities. The control system described herein modifies the image classification probabilities derived from the analysis of the image data. The first and second matrixes modify the relative weightings of the image classification probabilities for each of the plurality of terrain classes. The first and second matrixes are defined in respect of the first and second subsystem control modes. The first and second matrixes are different from each other and modify the relative weightings of the image classification probabilities, thereby refining the image classification probabilities for the respective subsystem control modes. The modified image classification probabilities are used to determine first and second subsystem control mode image classification probabilities associated with the respective first and second subsystem control modes. The control system may increase the relative weighting(s) of the or each image classification probability relating to terrain classifications comprising strong visual indicators (such as snow-covered roads); and/or may decrease the relative weighting(s) of the or each image classification probability relating to terrain classifications comprising limited (or absent) visual indicators (such as deep snow). 
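By way of illustration, the weighting-and-selection scheme described above can be sketched in code. This is a minimal sketch under stated assumptions, not the claimed implementation: the class ordering, the weight values and the function names are all introduced here for the example only.

```python
import numpy as np

# Seven illustrative terrain classes, matching those listed above.
TERRAIN_CLASSES = ["road", "grass", "snow", "dirt_road", "mud_ruts", "sand", "rock"]

# One weighting row per subsystem control mode (values are assumptions).
# Larger weights emphasise classes with strong visual indicators;
# smaller weights de-emphasise classes with weak visual indicators.
WEIGHTS = {
    "road_mode": np.array([1.0, 0.5, 0.2, 0.3, 0.1, 0.1, 0.1]),
    "snow_mode": np.array([0.1, 0.2, 1.0, 0.2, 0.3, 0.1, 0.2]),
}

def select_mode(image_probs: np.ndarray) -> str:
    """Modify the image classification probabilities with each mode's
    weighting row, sum the modified probabilities to obtain a per-mode
    probability, and select the mode with the higher value."""
    mode_probs = {
        mode: float(np.sum(weights * image_probs))  # modify, then sum
        for mode, weights in WEIGHTS.items()
    }
    return max(mode_probs, key=mode_probs.get)

# Example: the classifier is fairly confident the terrain is snow.
probs = np.array([0.05, 0.05, 0.70, 0.05, 0.05, 0.05, 0.05])
selected = select_mode(probs)
```

Note that the number of terrain classes (seven here) need not match the number of subsystem control modes (two here), as the text observes.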
The first and second weighting matrixes may be defined to represent different terrain and/or environmental conditions.
As described herein, terrain classification probabilities may be derived from other (non-imaging) sensors provided onboard the vehicle. At least in certain embodiments, the first and second weighting matrixes facilitate integration of the image classification probabilities with the terrain classification probabilities. The control system may decrease the weightings of image terrain probabilities relating to terrain classifications comprising limited (or absent) visual indicators. The reliance on the terrain classification probabilities derived from other (non-imaging) sensors may be increased for those terrain classifications comprising limited (or absent) visual indicators. By way of example, determination of a rolling resistance of the vehicle may provide a stronger indication that the terrain comprises deep snow than an image classification probability determined in dependence on an image of the terrain. The relative weightings of each of the terrain classification probabilities may be adjusted by the first and second weighting matrixes to reflect the availability of visual indicators of the terrain classification.
At least in certain embodiments, the control system may enable the image classification probabilities to be combined with a different number of vehicle subsystem control modes. For example, the control system may receive more image classification probabilities than there are vehicle subsystem control modes.
The selection of one of the first and second subsystem control modes is based, at least partially, on the analysis of the image data. At least in certain embodiments, this enables the subsystem control mode to be selected pre-emptively. The image classification probabilities may predict the classification of the section of terrain before the vehicle begins to traverse the terrain. The drive mode may be selected pro-actively, rather than re-actively.
The control system comprises one or more controllers collectively comprising at least one electronic processor having an electrical input for receiving an input signal; and at least one memory device electrically coupled to the at least one electronic processor and having instructions stored therein; and wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to: apply the first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities; apply the second weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more second modified image classification probabilities; determine the first subsystem control mode image classification probability in dependence on the one or more first modified image classification probabilities; determine the second subsystem control mode image classification probability in dependence on the one or more second modified image classification probabilities; and select one of the first and second subsystem control modes corresponding to the higher of the first and second subsystem control mode image classification probabilities.
The one or more processors collectively may be configured to output a signal indicative of the selected subsystem control mode. The signal may be output to other vehicle systems, for example to implement the selected subsystem control mode.
The one or more processors collectively may be configured to receive one or more state indicators representing an operating state of the vehicle. At least one of the first weighting matrix and the second weighting matrix may be generated in dependence on the one or more state indicators. For example, the first weighting matrix and/or the second weighting matrix may be generated dynamically in dependence on the one or more state indicators. One or more first variables in the first weighting matrix may be defined in dependence on the one or more state indicators. One or more second variables in the second weighting matrix may be defined in dependence on the one or more state indicators. The first variable may be the same as the second variable. Alternatively, the first and second variables may be different from each other.
The one or more state indicators may comprise a first state indicator representing a first operating state of the vehicle. The first weighting matrix may be generated in dependence on the first state indicator. For example, a first variable in the first weighting matrix may be defined in dependence on the first state indicator.
The one or more state indicators may comprise a second state indicator representing a second operating state of the vehicle. The second weighting matrix may be generated in dependence on the second state indicator. For example, a second variable in the second weighting matrix may be defined in dependence on the second state indicator.
The at least one of the first weighting matrix and the second weighting matrix may comprise a conditional operator for modifying one or more of the plurality of image classification probabilities in dependence on the one or more state indicators. The conditional operator may be configured to apply a first operator in dependence on a determination that the one or more state indicators is greater than a threshold value; and/or a second operator in dependence on a determination that the one or more state indicators is less than the threshold value.
The first weighting matrix may comprise at least one first conditional operator for modifying one of the plurality of image classification probabilities in dependence on the one or more state indicators. The first weighting matrix may comprise at least one first conditional operator for modifying one of the plurality of image classification probabilities in dependence on a first state indicator. The first operator may be applied in dependence on a determination that the first state indicator is greater than a first threshold value; and/or the second operator may be applied in dependence on a determination that the first state indicator is less than the first threshold value.
The second weighting matrix may comprise at least one second conditional operator for modifying one of the plurality of image classification probabilities in dependence on the one or more state indicators. The second weighting matrix may comprise at least one second conditional operator for modifying one of the plurality of image classification probabilities in dependence on a second state indicator. The first operator may be applied in dependence on a determination that the second state indicator is greater than a second threshold value; and/or the second operator may be applied in dependence on a determination that the second state indicator is less than the second threshold value.
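The conditional operator described above reduces to a threshold switch between two weights. The sketch below is an illustration only: the indicator (vehicle speed), the threshold and the weight values are assumptions, not values taken from the disclosure.

```python
def conditional_weight(state_value: float, threshold: float,
                       first_operator: float, second_operator: float) -> float:
    """Conditional operator for a weighting matrix entry: apply the
    first operator (weight) when the state indicator exceeds the
    threshold, otherwise apply the second operator."""
    return first_operator if state_value > threshold else second_operator

# Example: at low vehicle speed, reduce the weighting applied to a
# visually derived probability (all values here are assumptions).
speed_kph = 12.0
weight = conditional_weight(state_value=speed_kph, threshold=20.0,
                            first_operator=1.0, second_operator=0.4)
```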
At least in certain embodiments, the one or more processors are collectively configured to: determine a first state classification probability in dependence on the one or more state indicators, the first state classification probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determine a second state classification probability in dependence on the one or more state indicators, the second state classification probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; wherein the first subsystem control mode probability is determined in dependence on the first state classification probability; and the second subsystem control mode probability is determined in dependence on the second state classification probability.
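The disclosure leaves open exactly how the image-derived and state-derived probabilities are combined into a single subsystem control mode probability. One plausible fusion, offered purely as an assumption, is a convex combination per mode, with the image channel trusted less for classes with weak visual indicators (such as deep snow):

```python
def fused_mode_probability(image_mode_prob: float,
                           state_mode_prob: float,
                           image_trust: float = 0.6) -> float:
    """Hypothetical fusion (not the claimed method): a convex
    combination of the image-derived and state-derived probabilities
    for the same subsystem control mode."""
    return image_trust * image_mode_prob + (1.0 - image_trust) * state_mode_prob

# Deep snow: weak visual indicators, so trust the state channel more.
p = fused_mode_probability(0.3, 0.8, image_trust=0.25)
```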
The first weighting matrix may comprise a first factor. The one or more first modified image classification probabilities may be generated by applying the first factor to one or more of the image classification probabilities.
The second weighting matrix may comprise a second factor. The one or more second modified image classification probabilities may be generated by applying the second factor to one or more of the image classification probabilities.
The one or more processors may be configured collectively to receive image data captured by one or more imaging sensor, the image data representing an image comprising or consisting of the section of terrain. The image data may be referred to as raw image data. The one or more processors may be configured collectively to process the image data to generate the plurality of image classification probabilities indicating the probability that the section of terrain is classified as each of the plurality of terrain classes; and to output the plurality of image classification probabilities. The one or more processors may, for example, implement a segmentation model to segment the image.
Determining the first subsystem control mode image classification probability may comprise summing each of the one or more first modified image classification probabilities. Determining the second subsystem control mode image classification probability may comprise summing each of the one or more second modified image classification probabilities.
According to a further aspect of the present invention there is provided a control system for selecting a subsystem control mode of a vehicle to facilitate traversal of a section of terrain, wherein the control system comprises one or more processors collectively configured to: receive a plurality of image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, in dependence on image data captured by one or more imaging sensor, as each of a plurality of terrain classes; and apply a first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode.
According to a further aspect of the present invention there is provided a system comprising one or more imaging sensors configured to capture image data representing an image comprising or consisting of the section of terrain; and the control system described herein.
According to a further aspect of the present invention there is provided a vehicle comprising the control system described herein in combination with the system described herein. The vehicle may comprise one or more imaging sensors for capturing the image data representing the image.
According to a further aspect of the present invention there is provided a method of selecting a subsystem control mode of a vehicle to facilitate traversal of a section of terrain, wherein the method comprises: determining a plurality of image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, in dependence on image data captured by one or more imaging sensors, as each of a plurality of terrain classes; applying a first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode; applying a second weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more second modified image classification probabilities, the second weighting matrix being associated with a second subsystem control mode; determining a first subsystem control mode image classification probability in dependence on the one or more first modified image classification probabilities, the first subsystem control mode image classification probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determining a second subsystem control mode image classification probability in dependence on the one or more second modified image classification probabilities, the second subsystem control mode image classification probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; and selecting one of the first and second subsystem control modes in dependence on the first and second subsystem control mode image classification probabilities.
The method may comprise receiving one or more state indicators representing an operating state of the vehicle. The method may comprise generating at least one of the first weighting matrix and the second weighting matrix in dependence on the one or more state indicators.
The at least one of the first weighting matrix and the second weighting matrix may comprise a conditional operator for modifying one or more of the plurality of image classification probabilities in dependence on the one or more state indicators.
The method may comprise receiving one or more state indicators representing an operating state of the vehicle. At least one of the first weighting matrix and the second weighting matrix may be generated in dependence on the one or more state indicators.
The at least one of the first weighting matrix and the second weighting matrix may comprise a conditional operator for modifying one or more of the plurality of image classification probabilities in dependence on the one or more state indicators. The conditional operator may be configured to apply: a first operator in dependence on a determination that the one or more state indicators is greater than a threshold value; and/or a second operator in dependence on a determination that the one or more state indicators is less than the threshold value.
The method may comprise determining a first state classification probability in dependence on the one or more state indicators, the first state classification probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain. The first subsystem control mode probability may be determined in dependence on the first state classification probability. Alternatively, or in addition, the method may comprise determining a second state classification probability in dependence on the one or more state indicators, the second state classification probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain. The second subsystem control mode probability may be determined in dependence on the second state classification probability.
According to a further aspect of the present invention there is provided computer readable instructions which, when executed by one or more processors, cause the one or more processors to perform the method described herein.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 shows a schematic representation of a vehicle incorporating a terrain classification system in accordance with an embodiment of the present invention; Figure 2 shows a schematic representation of a terrain classification system configured to process image data to generate a plurality of image classification probabilities; Figure 3 shows a schematic representation of a control system for controlling operation of subsystems on the vehicle shown in Figure 1; Figure 4 shows a schematic representation of a state classification system configured to process sensor input signals to generate a plurality of state classification probabilities; Figure 5 represents a set of weighting matrixes defined for subsystem control modes to modify the image classification probabilities generated by the terrain classification system illustrated in Figure 2; and Figure 6 is a block diagram representing a method of classifying terrain in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
A system 1 and method 100 for classifying a section of terrain (denoted generally by the reference numeral ORT) in accordance with an embodiment of the present invention are described herein with reference to the accompanying Figures.
The system 1 is configured to be used in a vehicle 5. The vehicle 5 is a wheeled vehicle, such as an automobile, a utility vehicle or a sports utility vehicle. As shown in Figure 1, the vehicle 5 comprises four (4) wheels W1-W4. The vehicle 5 in the present embodiment is suitable for use both on-road and off-road. Each of the wheels W1-W4 may be selectively driven to propel the vehicle 5. The vehicle 5 comprises at least one electric drive unit 7 configured to drive the wheels W1-W4. The electric drive unit 7 is supplied with electrical energy stored in a traction battery 9. One or more inverter 11 is provided for converting the direct current (DC) from the traction battery 9 into alternating current (AC) which is supplied to the at least one electric drive unit 7. The vehicle 5 is a battery electric vehicle (BEV) in the present embodiment. It will be understood that the system 1 and the method 100 described herein are applicable to other types of vehicle 5, such as a hybrid electric vehicle (HEV), a plug-in hybrid electric vehicle (PHEV) or an internal combustion engine (ICE) vehicle.
The system 1 is configured to classify the terrain ORT on which the vehicle 5 is operating as one of a plurality of terrain classes TYP(n) (the suffix n is used herein to differentiate between the different terrain classes). As described herein, the terrain classes TYP(n) are predefined and different from each other. The terrain classes TYP(n) represent a range of types of terrain which the vehicle 5 may encounter during normal operation. The terrain classes TYP(n) in the present embodiment comprise the following: a dirt road TYP(4), grass TYP(2), mud and ruts TYP(5), road TYP(1), rock TYP(7), sand TYP(6) and snow TYP(3). It will be understood that different terrain classes TYP(n) may be defined. In the present embodiment, seven (7) different terrain classes TYP(n) are defined, but it will be understood that fewer than or more than seven (7) terrain classes TYP(n) may be defined.
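The index-to-class mapping enumerated above can be collected into a simple lookup, mirroring the TYP(n) numbering given in the text (only the helper function name is an addition):

```python
# Terrain classes TYP(1)..TYP(7) exactly as enumerated in the text.
TERRAIN_CLASSES = {
    1: "road",
    2: "grass",
    3: "snow",
    4: "dirt road",
    5: "mud and ruts",
    6: "sand",
    7: "rock",
}

def class_name(n: int) -> str:
    """Return the terrain class name for index n, i.e. TYP(n)."""
    return TERRAIN_CLASSES[n]
```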
The system 1 is configured to monitor a region of terrain proximal to the vehicle 5, typically in front of the vehicle 5. The vehicle 5 comprises one or more imaging sensors 21 configured to capture image data IMD(n) representing an image IMG(n). The or each imaging sensor 21 comprises or consists of an optical imaging sensor 21. The or each optical imaging sensor 21 is configured to detect light in the portion of the electromagnetic spectrum that is visible to the human eye. The image IMG(n) captured by the optical imaging sensor may be referred to as a visible (optical) image IMG(n). The vehicle 5 in the present embodiment comprises one or more first imaging sensors 21 configured to capture first image data IMD(1) representing a first image IMG(1). The first imaging sensor 21 has a direct line-of-sight to the terrain ORT, preferably reducing or avoiding reflections (for example in a side mirror or rear-view mirror). The system 1 described herein may have a dedicated first imaging sensor 21 to capture the first image data IMD(1). It will be understood that the first imaging sensor 21 may also be shared with other vehicle systems. The first imaging sensor 21 in the present embodiment is mounted in an elevated position, for example at the top of a front windshield of the vehicle 5. The first imaging sensor 21 may be mounted in other locations on the vehicle 5. The first imaging sensor 21 may have different orientations/directions to the arrangements illustrated herein. The first image IMG(1) is a dynamic image which changes with respect to time. The vehicle 5 is described herein as comprising one said first imaging sensor 21, although it will be appreciated that this is merely illustrative. The first imaging sensor 21 in the present embodiment is an optical camera configured to detect visible light in the portion of the electromagnetic spectrum that is visible to the human eye. The first image IMG(1) is a visible (optical) image IMG(1).
As illustrated in Figure 1, the first imaging sensor 21 has a first field of view FOV1. The first field of view FOV1 extends in front of the vehicle 5 such that the first image IMG(1) represents a scene to a front of the vehicle 5. The first imaging sensor 21 is a mono-camera. In a variant, the imaging sensor 21 may comprise a stereo camera for capturing stereo images, for example to facilitate determination of a distance (range) from the vehicle 5 to features represented in the first image IMG(1).
The system 1 comprises a terrain classification system 29 for classifying the terrain ORT. The terrain classification system 29 is configured to process at least a portion of the first image data IMD(1) received from the first imaging sensor 21 to classify the terrain ORT. The first image data IMD(1) represents the first image IMG(1) captured by the first imaging sensor 21. The terrain classification system 29 may classify the terrain ORT by analysing the first image data IMD(1) representing at least substantially all of the first image IMG(1). Alternatively, the terrain classification system 29 may classify the terrain ORT by processing a sub-set of the first image data IMD(1), for example corresponding to a segment of the first image IMG(1).
The terrain classification system 29 in the present embodiment implements an artificial neural network ANN to classify the terrain class TYP(n). At least in certain embodiments, the artificial neural network is trained using a supervised learning technique. In the present embodiment, the artificial neural network is a convolutional neural network (CNN). The convolutional neural network (CNN) may be trained using image data representing off-road terrain as a primary source. Alternatively, transfer learning may be used to refine a deep convolutional neural network (DCNN) for use in the system 1 to classify the terrain ORT. The deep convolutional neural network (DCNN) may be pre-trained on general image data which is not specific to terrain classification. A classifier is then applied to the deep convolutional neural network (DCNN). The classifier is a convolutional neural network (CNN) trained using image data representing off-road terrain as a primary source. It has been determined that the combination of the deep convolutional neural network (DCNN) and a classifier is particularly effective. The artificial neural network ANN in the present embodiment comprises a deep convolutional neural network (DCNN) and a dedicated classifier. Other techniques may be used to train the artificial neural network. For example, the artificial neural network may be trained using unsupervised learning techniques, such as competitive learning.
The artificial neural network ANN is configured to classify the terrain ORT represented in the first image IMG(1) as being one of the plurality of predefined terrain classes TYP(n). The artificial neural network ANN calculates an image classification probability imp(n) for each of the plurality of terrain classes TYP(n). The deep convolutional neural network (DCNN) processes the first image IMG(1) to identify features therein. The deep convolutional neural network (DCNN) calculates the image classification probability imp(n) for each of the plurality of terrain classes TYP(n) in dependence on the identified features. Each image classification probability imp(n) is calculated in the range negative one (-1) to positive one (+1) in the present embodiment. Other ranges may be used to define the image classification probabilities imp(n). Each of the plurality of image classification probabilities imp(n) indicates the likelihood that the terrain ORT comprises (or is predominantly composed of) a respective one of the plurality of predefined terrain classes TYP(n). The terrain ORT is classified as the terrain class TYP(n) having the highest image classification probability imp(n).
In the present embodiment, the artificial neural network ANN calculates the image classification probability imp(n) that the terrain ORT represented in the first image IMG(1) is each of the predefined terrain classes TYP(n). The classification probabilities imp(n) are derived from the analysis of the first image IMG(1) and are referred to herein as image classification probabilities imp(n). The artificial neural network ANN calculates each of the following image classification probabilities imp(n) in respect of the terrain ORT shown in the first image IMG(1): 1. A first image classification probability imp(1) indicates a probability that the terrain ORT comprises or consists of road TYP(1), for example, having a paved or metalled road surface.
2. A second image classification probability imp(2) indicates a probability that the terrain ORT comprises or consists of grass TYP(2).
3. A third image classification probability imp(3) indicates a probability that the terrain ORT comprises or consists of snow TYP(3).
4. A fourth image classification probability imp(4) indicates a probability that the terrain ORT comprises or consists of a dirt road TYP(4).
5. A fifth image classification probability imp(5) indicates a probability that the terrain ORT comprises or consists of mud and ruts TYP(5).
6. A sixth image classification probability imp(6) indicates a probability that the terrain ORT comprises or consists of sand TYP(6).
7. A seventh image classification probability imp(7) indicates a probability that the terrain ORT comprises or consists of rock TYP(7).
The artificial neural network ANN updates each of the image classification probabilities imp (n) with respect to time. Thus, the classification probabilities imp(n) are updated dynamically as the terrain ORT represented by the image IMG(1) changes, for example during a journey. The classification of the terrain ORT may change dynamically to reflect changes in the terrain ORT. The artificial neural network ANN is trained using a machine learning algorithm to process a plurality of training data sets comprising annotated image data representing different terrain classes (i.e., different types of terrain).
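The classification step described above — selecting the terrain class TYP(n) with the highest image classification probability imp(n) — can be sketched as follows. This is an illustrative sketch only: the function name `classify_terrain`, the list `TERRAIN_CLASSES` and the example probability values are hypothetical and do not form part of the described system.

```python
# Illustrative sketch of terrain class selection. The ordering of
# TERRAIN_CLASSES mirrors TYP(1)..TYP(7) as defined above; the names
# are hypothetical and the artificial neural network producing the
# probabilities imp(n) is not shown.
TERRAIN_CLASSES = ["road", "grass", "snow", "dirt road", "mud and ruts", "sand", "rock"]

def classify_terrain(imp):
    """Return the terrain class TYP(n) with the highest image
    classification probability imp(n), together with that probability."""
    assert len(imp) == len(TERRAIN_CLASSES)
    best = max(range(len(imp)), key=lambda n: imp[n])
    return TERRAIN_CLASSES[best], imp[best]
```

For example, `classify_terrain([0.05, 0.02, 0.01, 0.10, 0.02, 0.75, 0.05])` would classify the terrain as sand TYP(6), since the sixth probability is the largest.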
As shown in Figure 2, the terrain classification system 29 comprises one controller 33, although it will be appreciated that this is merely illustrative.
The controller 33 comprises processing means 35 and memory means 37. The processing means 35 may be one or more electronic processing devices 35 which operably execute computer-readable instructions. The memory means 37 may be one or more memory devices 37. The memory means 37 is electrically coupled to the processing means 35. The memory means 37 is configured to store instructions, and the processing means 35 is configured to access the memory means 37 and execute the instructions stored thereon. When executed, the instructions cause the controller 33 to perform the method(s) described herein. The controller 33 comprises an input means 39 and an output means 41. The input means 39 comprises an electrical input 39 of the controller 33. The input means 39 is configured to receive the first image data IMD(1) representing the first image IMG(1). The input means 39 may optionally be configured to receive second image data IMD(2) representing a second image IMG(2), for example captured by a second imaging sensor (not shown) provided on the vehicle 5. The second imaging sensor may be the same type of sensor as the first sensor 21 (for example an optical camera); or the second imaging sensor may be a different type of sensor, for example a LIDAR sensor or a radar sensor.
The output means 41 may comprise an electrical output 41. The output 41 is arranged to output an image classification signal IMS(n). The image classification signal IMS(n) is an electrical signal indicating the image classification probabilities imp(n) for each of the predefined terrain classes TYP(n). The image classification signal IMS(n) comprises discrete values indicating each of the plurality of classification probabilities imp(n) calculated by the artificial neural network ANN. For example, a first image classification signal IMS(1) may be output indicating a first image classification probability imp(1) for a predefined first terrain class TYP(1).
The vehicle 5 in the present embodiment comprises a control system 51 for controlling operation of a plurality of vehicle subsystems VSS(n). The vehicle subsystems VSS(n) include, but are not limited to, a propulsion (or engine) management system VSS(1), a transmission system VSS(2), a steering system VSS(3), a brakes system VSS(4), a suspension system VSS(5) and a differential system VSS(6). The propulsion (or engine) management system VSS(1) is configured to control operation of the at least one electric drive unit 7 (or an internal combustion engine). The transmission system VSS(2) comprises a transmission which is operable selectively to engage one of a plurality of drive ratios. The drive ratios may comprise a plurality of forward drive ratios, a reverse drive ratio and a neutral (or disengaged) drive ratio. The transmission is usually provided in combination with an internal combustion engine but may be omitted if the vehicle 5 comprises one or more electric drive units 7. The transmission system VSS(2) may optionally include a transfer case operable selectively to engage a low drive ratio. The transfer case may be omitted. The steering system VSS(3) is configured to control operation of a power assisted steering system provided on the vehicle 5. The anti-lock braking system VSS(4) is configured to control operation of one or more braking systems provided on the vehicle 5. The suspension control system VSS(5) in the present embodiment comprises an adjustable-height suspension. The suspension control system VSS(5) may, for example, be an air suspension or a mechanically adjustable suspension. The suspension control system VSS(5) comprises a plurality of adjustable-height suspension units, the adjustable-height suspension units being associated with the respective wheels W1-W4 of the vehicle 5.
The height of the suspension units may be controlled, for example by controllably inflating and deflating one or more air bladders, to raise or lower the vehicle body. The differential control system VSS(6) comprises one or more lockable differentials, for example one or more of a centre differential, a rear differential and a front differential. The differential control system VSS(6) in the present embodiment comprises a centre differential. Although six vehicle subsystems VSS(n) are illustrated as being under the control of the control system 51, in practice a greater number of vehicle subsystems may be included on the vehicle 5 and may be under the control of the control system 51. At least some of the vehicle subsystems VSS(n) may communicate with the control system 51 to feedback information on a current (instantaneous) operating status or condition.
The vehicle subsystems VSS(n) are configurable to adjust the dynamic operation of the vehicle 5. The control system 51 is configured to control the vehicle subsystems VSS(n) in dependence on a selected one of a plurality of subsystem control modes SSM(n). The subsystem control modes SSM(n) are selected automatically or semi-automatically by the control system 51. One of the predefined subsystem control modes SSM(n) is selected to provide appropriate control of the vehicle subsystems VSS(n). The subsystem control modes SSM(n) in the present embodiment include the following: 1. A first subsystem control mode SSM(1) in the form of a comfort subsystem control mode suitable for traversing terrain comprising a paved (metalled) road, motorway or regular roadway.
2. A second subsystem control mode SSM(2) in the form of a grass/gravel/snow subsystem control mode (GGS mode) suitable for traversing terrain comprising or consisting of grass, gravel or snow terrain; 3. A third subsystem control mode SSM(3) in the form of a mud/ruts subsystem control mode (MR mode) for traversing terrain comprising or consisting of mud and/or rutted terrain; 4. A fourth subsystem control mode SSM(4) in the form of a sand subsystem control mode suitable for traversing terrain comprising or consisting of sand (or deep, soft snow); 5. A fifth subsystem control mode SSM(5) in the form of a rock subsystem control mode suitable for traversing terrain comprising or consisting of rocky terrain such as a boulder field.
The control system 51 comprises a subsystem controller 53 for selecting one of the plurality of subsystem control modes SSM(n). The subsystem controller 53 is configured to output one or more control signals to control operation of the or each vehicle subsystem VSS(n) in a manner appropriate to the driving conditions, such as the terrain, on which the vehicle 5 is operating (referred to as the terrain condition). The selection of the subsystem control mode SSM(n) is dependent on the image classification probabilities imp(n) derived from the analysis of the first image IMG(1). As described in more detail herein, the selection of the subsystem control mode SSM(n) is also dependent on a plurality of state classification probabilities stp(n) derived from the analysis of one or more sensor input signals SIN(n) received from a plurality of vehicle sensors VSN(n). A state classification probability stp(n) is calculated for each of the subsystem control modes SSM(n). The state classification probabilities stp(n) indicate a likelihood that the associated subsystem control mode SSM(n) is appropriate for the current or prevailing driving/terrain condition(s) based on the one or more sensor input signals SIN(n). The state classification probabilities stp(n) are typically derived from analysis of the one or more sensor input signals SIN(n) received from non-image-based vehicle sensors VSN(n).
As shown in Figure 3, the control system 51 comprises one controller 53, although it will be appreciated that this is merely illustrative. The controller 53 comprises processing means 55 and memory means 57. The processing means 55 may be one or more electronic processing devices 55 which operably execute computer-readable instructions. The memory means 57 may be one or more memory devices 57. The memory means 57 is electrically coupled to the processing means 55. The memory means 57 is configured to store instructions, and the processing means 55 is configured to access the memory means 57 and execute the instructions stored thereon. When executed, the instructions cause the controller 53 to perform the method(s) described herein. The controller 53 comprises an input means 59 and an output means 61. The input means 59 comprises an electrical input 59 of the controller 53. The output means 61 may comprise an electrical output 61. The output 61 is arranged to output a subsystem control signal SG2. The subsystem control signal SG2 is an electrical signal indicating the selected subsystem control mode SSM(n).
The control system 51 is configured to select one of the plurality of subsystem control modes SSM(n) in dependence on a combination of the image classification probabilities imp(n) and the state classification probabilities stp(n). The control system 51 is configured to apply one or more weighting matrixes MTX(n) to the image classification probabilities imp(n). Alternatively, or in addition, the control system 51 may apply the one or more weighting matrixes MTX(n) to the state classification probabilities stp(n). Each of the weighting matrixes MTX(n) corresponds to a respective one of the subsystem control modes SSM(n). The weighting matrixes MTX(n) are applied to the image classification probabilities imp(n) to generate modified image classification probabilities mimp(n). The subsystem control mode SSM(n) is selected in dependence on the modified image classification probabilities mimp(n) and optionally also the state classification probabilities stp(n). In the present embodiment, the modified image classification probabilities mimp(n) are combined with the state classification probabilities stp(n) to generate a plurality of subsystem control mode probabilities cop(n). The subsystem control mode SSM(n) is selected in dependence on the subsystem control mode probabilities cop(n). Specifically, the control system 51 is configured to select the subsystem control mode SSM(n) having the largest subsystem control mode probability cop(n).
Each of the weighting matrixes MTX(n) corresponds to a respective one of the subsystem control modes SSM(n). The weighting matrixes MTX(n) are applied to the image classification probabilities imp(n) to generate modified image classification probabilities mimp(n). The modified image classification probabilities mimp(n) are combined with the state classification probabilities stp(n) to generate a plurality of subsystem control mode probabilities cop(n). The subsystem control mode SSM(n) is selected in dependence on the subsystem control mode probabilities cop(n). Thus, the subsystem control mode SSM(n) is selected in dependence on a combination of the modified image classification probabilities mimp(n) and the state classification probabilities stp(n). The control system 51 is configured to select the subsystem control mode SSM(n) having the largest subsystem control mode probability cop(n). The application of the weighting matrixes MTX(n) is described in more detail herein.
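The combination step described above can be sketched as follows. This is a simplified illustration which assumes that the modified image classification probabilities mimp(n) for each mode are already available as a row of values; the function name `select_subsystem_control_mode` and the data layout are assumptions, not part of the described system.

```python
def select_subsystem_control_mode(mimp_rows, stp):
    """Sketch of subsystem control mode selection. mimp_rows[m] holds the
    modified image classification probabilities mimp(n) for subsystem
    control mode SSM(m+1); stp[m] is the corresponding state
    classification probability stp(m+1). The mode probability cop(m+1)
    is the sum of stp and the row of modified image probabilities, and
    the mode with the largest cop is selected."""
    cop = [stp[m] + sum(row) for m, row in enumerate(mimp_rows)]
    selected = max(range(len(cop)), key=lambda m: cop[m])
    return selected, cop
```

The return value pairs the zero-based index of the selected mode with the full list of mode probabilities, so a caller can also inspect how close the competing modes were.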
The determination of the state classification probabilities stp(n) will now be described with reference to Figure 4. A state classification system 71 is configured to calculate the state classification probabilities stp(n) in dependence on the one or more sensor input signals SIN(n) received from the plurality of vehicle sensors VSN(n). The state classification system 71 comprises a classification controller 73 for calculating the state classification probabilities stp(n). As shown in Figure 4, the classification system 71 comprises one controller 73, although it will be appreciated that this is merely illustrative. The controller 73 comprises processing means 75 and memory means 77. The processing means 75 may be one or more electronic processing devices 75 which operably execute computer-readable instructions. The memory means 77 may be one or more memory devices 77.
The memory means 77 is electrically coupled to the processing means 75. The memory means 77 is configured to store instructions, and the processing means 75 is configured to access the memory means 77 and execute the instructions stored thereon. When executed, the instructions cause the controller 73 to perform the method(s) described herein. The controller 73 comprises an input means 79 and an output means 81. The input means 79 comprises an electrical input 79 of the controller 73. The output means 81 may comprise an electrical output 81. The output 81 is arranged to output a state classification signal STS(n). The state classification signal STS(n) is an electrical signal indicating the state classification probabilities stp(n) for each of the subsystem control modes SSM(n).
The input 79 is arranged to receive the plurality of sensor input signals SIN(n) from the vehicle sensors VSN(n). The plurality of signals SIN(n) are electrical sensor input signals SIN(n). The sensor input signals SIN(n) indicate a real-time operating state of the vehicle 5. The sensor input signals SIN(n) provide, or are used to calculate, a plurality of state indicators IND(n) which are indicative of the class (type) and/or condition of the terrain in which the vehicle 5 is operating.
The vehicle sensors VSN(n) are provided on-board the vehicle 5 and include, but are not limited to, sensors which provide continuous sensor outputs to the controller 73, including wheel speed sensors VSN(1), an ambient temperature sensor VSN(2), an atmospheric pressure sensor VSN(3), tire pressure sensors VSN(4), sensors VSN(5), such as gyroscopic sensors, for measuring yaw, roll and pitch of the vehicle, a vehicle speed sensor VSN(6), a longitudinal acceleration sensor VSN(7), an engine torque sensor (or engine torque estimator) VSN(8), a steering angle sensor VSN(9), a steering wheel speed sensor VSN(10), a gradient sensor (or gradient estimator) VSN(11), a lateral acceleration sensor VSN(12) on the stability control system (SCS), a brake pedal position sensor VSN(13), an accelerator pedal position sensor VSN(14) and longitudinal, lateral and vertical motion sensors VSN(15). The vehicle speed sensor comprises an inertial measurement unit, for example including one or more accelerometers. The vehicle speed sensor in the present embodiment comprises an accelerometer configured to measure longitudinal acceleration of the vehicle 5. The vehicle speed sensor is configured to output a vehicle speed signal indicating the vehicle speed. In the present embodiment the vehicle speed signal indicates the longitudinal speed of the vehicle 5. The vehicle speed signal may indicate a magnitude of the longitudinal speed of the vehicle. The controller 73 also receives a signal from the electronic power assisted steering unit (ePAS unit) of the vehicle 5 to indicate the steering force that is applied to the wheels (steering force applied by the driver combined with steering force applied by the ePAS system).
The vehicle sensors VSN(n) comprise a plurality of sensors which provide discrete sensor outputs to the controller 73, including a cruise control status signal (ON/OFF), a transfer box status signal (whether the gear ratio is set to a HIGH range or a LOW range), a Hill Descent Control (HDC) status signal (ON/OFF), a trailer connect status signal (ON/OFF), a signal to indicate that the Stability Control System (SCS) has been activated (ON/OFF), a windscreen wiper signal (ON/OFF), air suspension status (Raised/High, Normal, or Low), and a Dynamic Stability Control (DSC) signal (ON/OFF).
The electronic processing device 75 is configured to implement an estimator module 83, as shown schematically in Figure 4. The estimator module 83 comprises one or more estimator modules for estimating a state of the vehicle 5 and/or the vehicle sub-systems VSS(n). The estimator module 83 calculates the one or more state indicators IND(n). In the illustrated example, the estimator module 83 comprises the following modules: wheel acceleration IND(1); wheel inertia torque estimator IND(2); vehicle longitudinal force IND(3); aerodynamic drag estimator IND(4); wheel longitudinal force estimator IND(5); wheel slip detection IND(6); lateral acceleration estimator IND(7); vehicle yaw estimator IND(8); wheel speed variation and corrugation detection IND(9); surface rolling resistance IND(10); wheel longitudinal slip or 'breakaway torque' IND(11); surface friction or 'mu' plausibility check IND(12); lateral surface friction or 'mu' estimation/rut detection IND(13); steering force estimator IND(14); and corrugation detection estimation IND(15).
In a first stage of the estimator module 83, one or more of the sensor input signals SIN(n) is used to derive the one or more state indicators IND(n). For example, a vehicle speed is derived from the wheel speed sensors, wheel acceleration is derived from the wheel speed sensors, the longitudinal force on the wheels is derived from the vehicle longitudinal acceleration sensor, and the torque at which wheel slip occurs (if wheel slip occurs) is derived from the motion sensors which detect yaw, pitch and roll. Other calculations performed within the first stage of the estimator module 83 include the wheel inertia torque (the torque associated with accelerating or decelerating the rotating mass of the wheels), "continuity of progress" (the assessment of whether the vehicle is starting and stopping, for example as may be the case when the vehicle is travelling over rocky terrain), aerodynamic drag, yaw, and lateral vehicle acceleration.
The estimator module 83 also includes a second stage in which the following state indicators IND(n) are calculated: the surface rolling resistance IND(10) (based on one or more of the wheel inertia torque, the longitudinal force on the vehicle, aerodynamic drag, and the longitudinal force on the wheels), the steering force IND(14) on the steering wheel (based on the lateral acceleration and the output from the steering wheel sensor), the wheel longitudinal slip IND(11) (based on the longitudinal force on the wheels, the wheel acceleration, a Stability Control Systems (SCS) activity and a signal indicative of whether wheel slip has occurred), lateral friction (calculated from the measured lateral acceleration and the yaw versus the predicted lateral acceleration and yaw), and corrugation detection IND(15) (high frequency, low amplitude wheel height excitement indicative of a washboard type surface).
The SCS activity signal is derived from several outputs from a Stability Control Systems (SCS) ECU (not shown), which contains the DSC (Dynamic Stability Control) function, the TC (Traction Control) function, and the ABS (anti-lock braking system) and HDC (hill descent control) algorithms, indicating DSC activity, TC activity, ABS activity, brake interventions on individual wheels, and engine torque reduction requests from the SCS ECU to the engine. All of these indicate that a slip event has occurred and that the SCS ECU has taken action to control it. The estimator module 83 also uses the outputs from the wheel speed sensors to determine a wheel speed variation and corrugation detection signal.
The estimator module 83 also calculates a state indicator IND(16) representing the terrain roughness/corrugation based on the air suspension sensors (the ride height sensors) and the wheel accelerometers. A state indicator IND(16) in the form of a road roughness output is output from the estimator module. Additionally, or alternatively, wheel articulation data may be provided to the road roughness module IND(16) by appropriate sensing means, such as suspension stroke transducers, such as continuously variable damping (CVD) sensors. The estimator module 83 also determines a state indicator IND(17) representing the ambient temperature in dependence on a temperature measurement. One or more other measurements may be provided as a state indicator IND(n).
Calculations for wheel speed variation and corrugation output; the surface rolling resistance estimation; the wheel longitudinal slip and the corrugation detection, together with the friction plausibility check, are output from the estimator module 83 and provide state indicator IND(n) signals, indicative of the nature of the terrain in which the vehicle is travelling, for further processing within the controller 73.
The estimator module 83 executes a probability algorithm to determine the state classification probabilities stp(n) associated with each of the subsystem control modes SSM(n). The probability algorithm calculates each state classification probability stp(n) in dependence on one or more of the state indicators IND(n). The state classification probabilities stp(n) provide an indication of the likelihood that each subsystem control mode SSM(n) is appropriate for the current or prevailing driving condition or terrain class TYP(n). The estimator module 83 outputs the state classification probabilities stp(n) to the control system 51.
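The text does not specify the internal form of the probability algorithm, so the following is only a hedged sketch of one plausible shape: each state classification probability stp(n) is computed as a normalised weighted sum of the state indicators IND(n). The function name and every weight value here are assumptions for illustration, not the described implementation.

```python
def state_classification_probabilities(indicators, weights):
    """Hypothetical sketch of the probability algorithm: combine state
    indicators IND(n) into one state classification probability stp(m)
    per subsystem control mode SSM(m) as a normalised weighted sum.
    weights[m][k] is an assumed (illustrative) weight relating
    indicator k to mode m; the real algorithm is not specified here."""
    raw = [sum(w * x for w, x in zip(wrow, indicators)) for wrow in weights]
    total = sum(raw) or 1.0  # avoid division by zero when all terms vanish
    return [r / total for r in raw]
```

Normalising the weighted sums keeps the stp(n) values directly comparable across modes, which matches their later use as additive terms in the mode probabilities cop(n).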
The control system 51 is configured to determine the subsystem control mode probabilities cop(n) in dependence on the state classification probabilities stp(n) and the modified image classification probabilities mimp(n). The application of the weighting matrixes MTX(n) to determine the modified image classification probabilities mimp(n) will now be described with reference to Figure 5. A weighting matrix MTX(n) is defined for each of the subsystem control modes SSM(n). The control system 51 applies the following weighting matrixes MTX(n): 1. A first weighting matrix MTX(1) associated with the first subsystem control mode SSM(1); 2. A second weighting matrix MTX(2) associated with the second subsystem control mode SSM(2); 3. A third weighting matrix MTX(3) associated with the third subsystem control mode SSM(3); 4. A fourth weighting matrix MTX(4) associated with the fourth subsystem control mode SSM(4); and 5. A fifth weighting matrix MTX(5) associated with the fifth subsystem control mode SSM(5). The weighting matrixes MTX(n) correspond to the respective rows of the combined matrix shown in Figure 5. Each weighting matrix MTX(n) defines a plurality of expressions to be applied to the image classification probabilities imp(n) derived from the analysis of the first image IMG(1). A separate expression is defined for each of the image classification probabilities imp(n) calculated by the artificial neural network ANN. A different weighting matrix MTX(n) is defined for each of the subsystem control modes SSM(n). The expressions may comprise a conditional operator which is dependent on one or more of the state indicators IND(n). The conditional operator may comprise applying a first modifier if a first condition is satisfied; and applying a second modifier if a second condition is satisfied. Alternatively, or in addition, the expressions may define a predetermined value to be allocated to one or more of the image classification probabilities imp(n).
The predetermined value may be zero or non-zero. Alternatively, or in addition, the expressions may define a weighting factor (w) which is applied to one or more of the image classification probabilities imp(n). The weighting factor (w) is defined in the range zero (0) to one (1) (0 ≤ w ≤ 1).
The expressions are applied to each of the image classification probabilities imp(n) to generate the modified image classification probabilities mimp(n). The unmodified (source) image classification probabilities imp(n) output by the artificial neural network ANN are referred to herein as the raw image classification probabilities imp(n). The expressions defined in each weighting matrix MTX(n) are applied to the raw image classification probabilities imp(n) to determine the modified image classification probabilities mimp(n). The control system 51 is configured to determine a subsystem control mode probability cop(n) for each subsystem control mode SSM(n) in dependence on the state classification probabilities stp(n) and the modified image classification probabilities mimp(n). In the present embodiment, the subsystem control mode probability cop(n) for each subsystem control mode SSM(n) is a sum of the state classification probability stp(n) and the modified image classification probabilities mimp(n) for that subsystem control mode SSM(n). This corresponds to the sum of the modified image classification probabilities mimp(n) in each row of the weighting matrix MTX(n) shown in Figure 5 and the corresponding state classification probability stp(n). It will be understood that the one or more state indicators IND(n) may be updated dynamically, for example in dependence on changes in the terrain ORT. The subsystem control mode probabilities cop(n) are updated in dependence on any such changes.
The first weighting matrix MTX(1) is applied in respect of the first subsystem control mode SSM(1). The first weighting matrix MTX(1) is configured to increase the significance (weighting) of the first image classification probability imp(1) for selection of the first subsystem control mode SSM(1). The first weighting matrix MTX(1) defines expressions with reference to a state indicator IND(n) representing the road roughness output 89. The application of the first weighting matrix MTX(1) to each of the raw image classification probabilities imp(n) is as follows: 1. In respect of the first raw image classification probability imp(1) representing terrain comprising a road TYP(1), the first weighting matrix MTX(1) calculates a first modified image classification probability mimp(1) equal to (a) the larger of 0.5 plus the fourth raw image classification probability imp(4) and 0.4 if the road roughness output 89 is greater than a surface roughness threshold (indicated as 10 in the present example), or (b) the larger of 0.5 plus the first raw image classification probability imp(1) and 0.5 if the road roughness output is less than or equal to the surface roughness threshold.
2. The first weighting matrix MTX(1) defines the second modified image classification probability imp(2) representing terrain comprising grass TYP(2) as being equal to zero (0).
3. The first weighting matrix MTX(1) defines the third modified image classification probability imp(3) representing terrain comprising snow TYP(3) as being equal to zero (0).
4. In respect of the fourth raw image classification probability imp(4) representing terrain comprising a dirt road TYP(4), the first weighting matrix MTX(1) generates a fourth modified image classification probability imp(4) equal to (a) zero (0) if the road roughness output 89 is greater than a surface roughness threshold (indicated as 10 in the present example), or (b) the larger of zero (0) and the product of the fourth raw image classification probability imp(4) and the weighting factor (w) if the road roughness output 89 is less than or equal to the surface roughness threshold.
5. The first weighting matrix MTX(1) defines the fifth modified image classification probability imp(5) representing terrain comprising mud and ruts TYP(5) as being equal to zero (0).
6. The first weighting matrix MTX(1) defines the sixth modified image classification probability imp(6) representing terrain comprising sand TYP(6) as being equal to zero (0).
7. The first weighting matrix MTX(1) defines the seventh modified image classification probability imp(7) representing terrain comprising rock TYP(7) as being equal to zero (0).
The control system 51 determines a first subsystem control mode probability cop(1) for the first subsystem control mode SSM(1). In the present embodiment, the first subsystem control mode probability cop(1) is the sum of the first state classification probability stp(1) and each of the modified image classification probabilities mimp(n) associated with the first subsystem control mode SSM(1).
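The expressions for the first weighting matrix may be sketched as follows (the names, the weighting factor w, and the default w=1.0 are illustrative assumptions; the threshold of 10 and the max() expressions follow the description and Table 1):

```python
def apply_mtx1(raw, rough_road, w=1.0):
    # raw maps terrain class index 1..7 to the raw probability imp(n);
    # classes 2, 3, 5, 6 and 7 are zeroed by MTX(1)
    mod = {n: 0.0 for n in range(1, 8)}
    if rough_road > 10:                      # surface roughness threshold
        mod[1] = max(0.5 + raw[1], 0.4)      # road
        mod[4] = 0.0                         # dirt road suppressed
    else:
        mod[1] = max(0.5 + raw[1], 0.5)
        mod[4] = max(0.0, raw[4] * w)
    return mod
```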
The second weighting matrix MTX(2) is applied in respect of the second subsystem control mode SSM(2). The second weighting matrix MTX(2) is configured to increase the significance (weighting) of the second and third image classification probabilities imp(2), imp(3) and to decrease the significance (weighting) of the sixth and seventh image classification probabilities imp(6), imp(7) for selection of the second subsystem control mode SSM(2). The second weighting matrix MTX(2) defines expressions with reference to the state indicator IND(17) representing the ambient temperature (K). The application of the second weighting matrix MTX(2) to each of the raw image classification probabilities imp(n) is as follows: 1. The second weighting matrix MTX(2) defines the first modified image classification probability imp(1) representing terrain comprising road TYP(1) as being equal to 0.5.
2. In respect of the second raw image classification probability imp(2) representing terrain comprising grass TYP(2), the second weighting matrix MTX(2) calculates a second modified image classification probability imp(2) equal to the larger of zero (0) and the product of the second raw image classification probability imp(2) and the weighting factor (w).
3. In respect of the third raw image classification probability imp(3) representing terrain comprising snow TYP(3), the second weighting matrix MTX(2) calculates a third modified image classification probability imp(3) equal to (a) the larger of zero (0) and the third raw image classification probability imp(3) if the ambient temperature IND(17) is less than a temperature threshold (indicated as 278K in the present example), or (b) the larger of zero (0) and the product of the third raw image classification probability imp(3) and the weighting factor (w).
4. In respect of the fourth raw image classification probability imp(4) representing terrain comprising a dirt road TYP(4), the second weighting matrix MTX(2) calculates a fourth modified image classification probability imp(4) equal to the larger of zero (0) and the product of the fourth raw image classification probability imp(4) and the weighting factor (w).
5. The second weighting matrix MTX(2) defines the fifth modified image classification probability imp(5) representing terrain comprising mud and ruts TYP(5) as being equal to 0.
6. In respect of the sixth raw image classification probability imp(6) representing terrain comprising sand TYP(6), the second weighting matrix MTX(2) calculates a sixth modified image classification probability imp(6) which is the negative of the larger of zero (0) and the product of the sixth raw image classification probability imp(6) and the weighting factor (w).
7. In respect of the seventh raw image classification probability imp(7) representing terrain comprising rock TYP(7), the second weighting matrix MTX(2) calculates a seventh modified image classification probability imp(7) which is the negative of the larger of zero (0) and the product of the seventh raw image classification probability imp(7) and the weighting factor (w).
The control system 51 determines a second subsystem control mode probability cop(2) for the second subsystem control mode SSM(2). In the present embodiment, the second subsystem control mode probability cop(2) is the sum of a second state classification probability stp(2) and each of the modified image classification probabilities mimp(n) associated with the second subsystem control mode SSM(2).
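The second weighting matrix may be sketched as follows (names and the example weighting factor w=0.5 are illustrative assumptions; the 278 K threshold and the signs of the sand and rock terms follow the description and Table 1):

```python
def apply_mtx2(raw, amb_temp, w=0.5):
    # snow weighting switches at the ambient temperature threshold (278 K)
    snow = max(0.0, raw[3]) if amb_temp < 278 else max(0.0, raw[3] * w)
    return {1: 0.5,                       # road fixed at 0.5
            2: max(0.0, raw[2] * w),      # grass
            3: snow,                      # snow
            4: max(0.0, raw[4] * w),      # dirt road
            5: 0.0,                       # mud and ruts
            6: -max(0.0, raw[6] * w),     # sand de-weighted
            7: -max(0.0, raw[7] * w)}     # rock de-weighted
```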
The third weighting matrix MTX(3) is applied in respect of the third subsystem control mode SSM(3). The third weighting matrix MTX(3) is configured to increase the significance (weighting) of the fourth and fifth image classification probabilities imp(4), imp(5) for selection of the third subsystem control mode SSM(3). The third weighting matrix MTX(3) defines expressions with reference to the state indicator IND(16) representing the road roughness. The application of the third weighting matrix MTX(3) to each of the raw image classification probabilities imp(n) is as follows: 1. The third weighting matrix MTX(3) defines the first modified image classification probability imp(1) representing terrain comprising road TYP(1) as being equal to 0.5.
2. The third weighting matrix MTX(3) defines the second modified image classification probability imp(2) representing terrain comprising grass TYP(2) as being equal to zero (0).
3. The third weighting matrix MTX(3) defines the third modified image classification probability imp(3) representing terrain comprising snow TYP(3) as being equal to zero (0).
4. In respect of the fourth raw image classification probability imp(4) representing terrain comprising a dirt road TYP(4), the third weighting matrix MTX(3) calculates a fourth modified image classification probability imp(4) equal to (a) the larger of 0.05 and the product of the fourth raw image classification probability imp(4) and the weighting factor (w) if the road roughness IND(16) is greater than a road roughness threshold (indicated as 30 in the present example), or (b) zero (0) if the road roughness IND(16) is less than or equal to the road roughness threshold.
5. The third weighting matrix MTX(3) defines the fifth modified image classification probability imp(5) representing terrain comprising mud and ruts TYP(5) as being equal to the fifth raw image classification probability imp(5).
6. The third weighting matrix MTX(3) defines the sixth modified image classification probability imp(6) representing terrain comprising sand TYP(6) as being equal to zero (0).
7. The third weighting matrix MTX(3) defines the seventh modified image classification probability imp(7) representing terrain comprising rock TYP(7) as being equal to zero (0).
The control system 51 determines a third subsystem control mode probability cop(3) for the third subsystem control mode SSM(3). In the present embodiment, the third subsystem control mode probability cop(3) is the sum of a third state classification probability stp(3) and each of the modified image classification probabilities mimp(n) associated with the third subsystem control mode SSM(3).
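The third weighting matrix may be sketched as follows (names and the default w=1.0 are illustrative assumptions; the threshold of 30 and the pass-through of the mud-and-ruts term follow the description and Table 1):

```python
def apply_mtx3(raw, rough_road, w=1.0):
    # dirt road retained only above the road roughness threshold (30)
    dirt = max(0.05, raw[4] * w) if rough_road > 30 else 0.0
    return {1: 0.5, 2: 0.0, 3: 0.0,
            4: dirt,
            5: raw[5],   # mud and ruts passed through unmodified
            6: 0.0, 7: 0.0}
```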
The fourth weighting matrix MTX(4) is applied in respect of the fourth subsystem control mode SSM(4). The fourth weighting matrix MTX(4) is configured to increase the significance (weighting) of the sixth image classification probability imp(6) and, dependent on the state indicator IND(10) representing the rolling resistance, to decrease the significance (weighting) of the third image classification probability imp(3) for selection of the fourth subsystem control mode SSM(4). The fourth weighting matrix MTX(4) defines expressions with reference to the state indicator IND(10) representing the rolling resistance. The application of the fourth weighting matrix MTX(4) to each of the raw image classification probabilities imp(n) is as follows: 1. The fourth weighting matrix MTX(4) defines the first modified image classification probability imp(1) representing terrain comprising road TYP(1) as being equal to 0.5.
2. The fourth weighting matrix MTX(4) defines the second modified image classification probability imp(2) representing terrain comprising grass TYP(2) as being equal to zero (0).
3. In respect of the third raw image classification probability imp(3) representing terrain comprising snow TYP(3), the fourth weighting matrix MTX(4) calculates a third modified image classification probability imp(3) equal to (a) the larger of zero (0) and the product of the third raw image classification probability imp(3) and the weighting factor (w) if the rolling resistance IND(10) is greater than a rolling resistance threshold (indicated as 3000 in the present example), or (b) the negative of the larger of zero (0) and the product of the third raw image classification probability imp(3) and the weighting factor (w) if the rolling resistance IND(10) is less than or equal to the rolling resistance threshold (indicated as 3000 in the present example).
4. In respect of the fourth raw image classification probability imp(4) representing terrain comprising a dirt road TYP(4), the fourth weighting matrix MTX(4) calculates a fourth modified image classification probability imp(4) equal to (a) the larger of zero (0) and the product of the fourth raw image classification probability imp(4) and the weighting factor (w) if the rolling resistance IND(10) is greater than a rolling resistance threshold (indicated as 3000 in the present example), or (b) zero (0) if the rolling resistance IND(10) is less than or equal to the rolling resistance threshold (indicated as 3000 in the present example).
5. The fourth weighting matrix MTX(4) defines the fifth modified image classification probability imp(5) representing terrain comprising mud and ruts TYP(5) as being equal to zero (0).
6. The fourth weighting matrix MTX(4) defines the sixth modified image classification probability imp(6) representing terrain comprising sand TYP(6) as being equal to the sixth raw image classification probability imp(6).
7. The fourth weighting matrix MTX(4) defines the seventh modified image classification probability imp(7) representing terrain comprising rock TYP(7) as being equal to zero (0).
The control system 51 determines a fourth subsystem control mode probability cop(4) for the fourth subsystem control mode SSM(4). In the present embodiment, the fourth subsystem control mode probability cop(4) is the sum of a fourth state classification probability stp(4) and each of the modified image classification probabilities mimp(n) associated with the fourth subsystem control mode SSM(4).
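The fourth weighting matrix may be sketched as follows (names and the default w=1.0 are illustrative assumptions; the rolling resistance threshold of 3000 and the sign change on the snow term follow the description and Table 1):

```python
def apply_mtx4(raw, roll_res, w=1.0):
    # rolling resistance threshold (3000) gates the snow and dirt-road terms
    if roll_res > 3000:
        snow = max(0.0, raw[3] * w)
        dirt = max(0.0, raw[4] * w)
    else:
        snow = -max(0.0, raw[3] * w)   # snow de-weighted at low rolling resistance
        dirt = 0.0
    return {1: 0.5, 2: 0.0, 3: snow, 4: dirt,
            5: 0.0,
            6: raw[6],   # sand passed through unmodified
            7: 0.0}
```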
The fifth weighting matrix MTX(5) is applied in respect of the fifth subsystem control mode SSM(5). The fifth weighting matrix MTX(5) defines expressions with reference to the state indicator IND(16) representing the road roughness. The fifth weighting matrix MTX(5) is configured to increase the significance (weighting) of the seventh image classification probability imp(7) and, dependent on the road roughness IND(16), to decrease the significance (weighting) of the fifth image classification probability imp(5) for selection of the fifth subsystem control mode SSM(5). The application of the fifth weighting matrix MTX(5) to each of the raw image classification probabilities imp(n) is as follows: 1. The fifth weighting matrix MTX(5) defines the first modified image classification probability imp(1) representing terrain comprising road TYP(1) as being equal to 0.5.
2. The fifth weighting matrix MTX(5) defines the second modified image classification probability imp(2) representing terrain comprising grass TYP(2) as being equal to zero (0).
3. The fifth weighting matrix MTX(5) defines the third modified image classification probability imp(3) representing terrain comprising snow TYP(3) as being equal to zero (0).
4. The fifth weighting matrix MTX(5) defines the fourth modified image classification probability imp(4) representing terrain comprising dirt road TYP(4) as being equal to zero (0).
5. In respect of the fifth raw image classification probability imp(5) representing terrain comprising mud and ruts TYP(5), the fifth weighting matrix MTX(5) calculates a fifth modified image classification probability imp(5) equal to (a) the product of the fifth raw image classification probability imp(5) and the weighting factor (w) if the road roughness IND(16) is less than a road roughness threshold (indicated as 30 in the present example), or (b) the larger of zero (0) and the product of the fifth raw image classification probability imp(5) and the weighting factor (w).
6. The fifth weighting matrix MTX(5) defines the sixth modified image classification probability imp(6) representing terrain comprising sand TYP(6) as being equal to zero (0).
7. The fifth weighting matrix MTX(5) defines the seventh modified image classification probability imp(7) representing terrain comprising rock TYP(7) as being equal to the seventh raw image classification probability imp(7).
The control system 51 determines a fifth subsystem control mode probability cop(5) for the fifth subsystem control mode SSM(5). In the present embodiment, the fifth subsystem control mode probability cop(5) is the sum of a fifth state classification probability stp(5) and each of the modified image classification probabilities mimp(n) associated with the fifth subsystem control mode SSM(5).
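The fifth weighting matrix may be sketched as follows (names and the default w=1.0 are illustrative assumptions; the road roughness threshold of 30 and the pass-through of the rock term follow the description and Table 1):

```python
def apply_mtx5(raw, rough_road, w=1.0):
    # mud-and-ruts term switches form at the road roughness threshold (30)
    mud = raw[5] * w if rough_road < 30 else max(0.0, raw[5] * w)
    return {1: 0.5, 2: 0.0, 3: 0.0, 4: 0.0,
            5: mud,
            6: 0.0,
            7: raw[7]}   # rock passed through unmodified
```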
Figure 6 illustrates a method 100 according to an embodiment of the invention. The method 100 is a method of selecting one of a plurality of subsystem control modes SSM(n) in dependence on the image classification probabilities imp(n) and the state classification probabilities stp(n). The method 100 may be performed by the system 1 described herein. In particular, the memory 37 may comprise computer-readable instructions which, when executed by the processor 35, perform the method 100 according to an embodiment of the invention.
The method 100 will be described with reference to the vehicle 5 situated in a section of terrain ORT. The method 100 is initiated (BLOCK 105). The method 100 comprises receiving first image data IMD(1) representing a first image IMG(1) of the terrain ORT (BLOCK 110). The first image data IMD(1) is captured by the first imaging sensor 21 provided on the vehicle 5 in the present embodiment. The first image IMG(1) comprises a scene in front of the vehicle 5. The first image IMG(1) is processed by the artificial neural network ANN (BLOCK 115). The artificial neural network ANN processes the first image IMG(1) to calculate an image classification probability imp(n) for each of the plurality of predefined terrain classes TYP(n).
The image classification probabilities imp(n) indicate a probability that the terrain ORT represented in the first image IMG(1) is a respective one of the plurality of terrain classes TYP(n). The method comprises determining a plurality of state indicators IND(n) which represent the operating condition of the vehicle 5 (BLOCK 120). The (raw) image classification probabilities imp(n) and the state indicators IND(n) are supplied to an integration matrix (BLOCK 125). The integration matrix comprises a plurality of weighting matrixes MTX(n). The weighting matrixes MTX(n) are associated with respective subsystem control modes SSM(n). The application of the weighting matrixes MTX(n) generates modified image classification probabilities mimp(n) in respect of each of the subsystem control modes SSM(n) (BLOCK 130). A plurality of state classification probabilities stp(n) is generated in dependence on a plurality of sensor input signals SIN(n) (BLOCK 135). As described herein, the state classification probabilities stp(n) may be determined in dependence on one or more of the state indicators IND(n). The modified image classification probabilities mimp(n) and the state classification probabilities stp(n) are combined to generate subsystem control mode probabilities cop(n) (BLOCK 140). The control system 51 is configured to select one of the plurality of subsystem control modes SSM(n) in dependence on the subsystem control mode probabilities cop(n) (BLOCK 145). The control system 51 outputs a subsystem control signal SG2 to control operation of one or more of the vehicle subsystems VSS(n) (BLOCK 150). In the present embodiment, the subsystem control signal SG2 indicates the selected one of the plurality of subsystem control modes SSM(n). The method 100 continues to update dynamically the image classification probabilities imp(n) and the state classification probabilities stp(n) to reflect changes in the operating conditions.
The selected vehicle subsystem control mode SSM(n) may be updated in dependence on changes in the operating conditions. The method 100 ends when the vehicle 5 is switched off, for example ignition OFF (BLOCK 155).
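The selection steps of the method (BLOCKS 125 to 145) can be sketched as follows (names are illustrative assumptions; each weighting matrix is represented here as a callable that returns the modified probabilities for its mode):

```python
def select_mode(raw_imp, stp, matrices, indicators):
    # apply each weighting matrix, combine with the state classification
    # probability, and select the mode with the highest cop(n)
    cop = {}
    for n, mtx in matrices.items():
        mimp = mtx(raw_imp, indicators)          # modified probabilities
        cop[n] = stp[n] + sum(mimp.values())     # cop(n) = stp(n) + sum(mimp)
    selected = max(cop, key=cop.get)
    return selected, cop
```

A usage sketch: `matrices` would map each subsystem control mode index to a function such as the `apply_mtx1` style sketches above, closed over the relevant state indicators.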
It will be appreciated that various changes and modifications can be made to the present invention without departing from the scope of the present application. Text in the following tables may be used in combination with numerals shown in Figure 5 and Figure 6.

Table 1

L11: if rough_road > 10: max(0.5 + raw_road, 0.4) else: max(0.5 + raw_road, 0.5)
L41: if rough_road > 10: 0 else: max(0, raw_dirt*w)
L22: max(0, raw_grass*w)
L32: if amb_temp < 278: max(0, raw_snow) else: max(0, raw_snow*w)
L42: max(0, raw_dirt*w)
L62: -max(0, raw_sand*w)
L72: -max(0, raw_rock*w)
L43: if rough_road > 30: max(0.05, raw_dirt*w) else: 0
L53: raw_mud
L34: if roll_res > 3000: max(0, raw_snow*w) else: -max(0, raw_snow*w)
L44: if roll_res > 3000: max(0, raw_dirt*w) else: 0
L64: raw_sand
L55: if rough_road < 30: raw_mud*w else: max(0, raw_mud*w)
L75: raw_rock
Table 2
115a Road Probability
115b Grass Probability
115c Snow Probability
115d Dirt Road Probability
115e Mud Ruts Probability
115f Sand Probability
115g Rock Probability
120a Surface Roughness
120b Rolling Resistance
120c Ambient Temperature
120d Vehicle Speed
Integration matrix
130a Comfort Probability
130b GGS Probability
130c M&R Probability
130d Sand Probability
130e Rock Probability
AutoTR Probabilities

Claims (15)

  1. A control system for selecting a subsystem control mode of a vehicle to facilitate traversal of a section of terrain, wherein the control system comprises one or more processors collectively configured to: receive a plurality of image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, in dependence on image data captured by one or more imaging sensors, as each of a plurality of terrain classes; apply a first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode; apply a second weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more second modified image classification probabilities, the second weighting matrix being associated with a second subsystem control mode; determine a first subsystem control mode probability in dependence on the one or more first modified image classification probabilities, the first subsystem control mode probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determine a second subsystem control mode probability in dependence on the one or more second modified image classification probabilities, the second subsystem control mode probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; and select one of the first and second subsystem control modes corresponding to the higher of the first and second subsystem control mode probabilities.
  2. The control system as claimed in claim 1, wherein the one or more processors is collectively configured to: output a signal indicative of the selected subsystem control mode.
  3. A control system as claimed in claim 1 or claim 2, wherein the one or more processors is collectively configured to: receive one or more state indicators representing an operating state of the vehicle; wherein at least one of the first weighting matrix and the second weighting matrix is generated in dependence on the one or more state indicators.
  4. A control system as claimed in claim 3, wherein the at least one of the first weighting matrix and the second weighting matrix comprises a conditional operator for modifying one or more of the plurality of image classification probabilities in dependence on the one or more state indicators.
  5. A control system as claimed in claim 4, wherein the conditional operator is configured to apply: a first operator in dependence on a determination that the one or more state indicators is greater than a threshold value; and/or a second operator in dependence on a determination that the one or more state indicators is less than the threshold value.
  6. A control system as claimed in any one of claims 3 to 5, wherein the one or more processors is collectively configured to: determine a first state classification probability in dependence on the one or more state indicators, the first state classification probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determine a second state classification probability in dependence on the one or more state indicators, the second state classification probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; wherein the first subsystem control mode probability is determined in dependence on the first state classification probability; and the second subsystem control mode probability is determined in dependence on the second state classification probability.
  7. A control system as claimed in any one of the preceding claims, wherein the first weighting matrix comprises a first factor; the one or more first modified image classification probabilities being generated by applying the first factor to one or more of the image classification probabilities.
  8. A control system as claimed in any one of the preceding claims, wherein the second weighting matrix comprises a second factor; the one or more second modified image classification probabilities being generated by applying the second factor to one or more of the image classification probabilities.
  9. A control system as claimed in any one of the preceding claims, wherein the one or more processors is collectively configured to: receive image data captured by one or more imaging sensors, the image data representing an image comprising or consisting of the section of terrain; process the image data to generate the plurality of image classification probabilities indicating the probability that the section of terrain is classified as each of the plurality of terrain classes; and output the plurality of image classification probabilities.
  10. A control system as claimed in any one of the preceding claims, wherein determining the first subsystem control mode probability comprises summing each of the one or more first modified image classification probabilities; and determining the second subsystem control mode probability comprises summing each of the one or more second modified image classification probabilities.
  11. A system comprising: one or more imaging sensors configured to capture image data representing an image comprising or consisting of the section of terrain; and the control system claimed directly or indirectly in claim 9.
  12. A vehicle comprising the control system as claimed in any one of the preceding claims or the system of claim 11.
  13. A method of selecting a subsystem control mode of a vehicle to facilitate traversal of a section of terrain, wherein the method comprises: determining a plurality of image classification probabilities, each image classification probability indicating a probability that the section of terrain is classified, in dependence on image data captured by one or more imaging sensors, as each of a plurality of terrain classes; applying a first weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more first modified image classification probabilities, the first weighting matrix being associated with a first subsystem control mode; applying a second weighting matrix to modify one or more of the plurality of image classification probabilities to generate one or more second modified image classification probabilities, the second weighting matrix being associated with a second subsystem control mode; determining a first subsystem control mode probability in dependence on the one or more first modified image classification probabilities, the first subsystem control mode probability providing an indication of a suitability of the first subsystem control mode for traversal of the section of terrain; determining a second subsystem control mode probability in dependence on the one or more second modified image classification probabilities, the second subsystem control mode probability providing an indication of a suitability of the second subsystem control mode for traversal of the section of terrain; and selecting one of the first and second subsystem control modes in dependence on the first and second subsystem control mode probabilities.
  14. A method as claimed in claim 13 comprising: receiving one or more state indicators representing an operating state of the vehicle; and generating at least one of the first weighting matrix and the second weighting matrix in dependence on the one or more state indicators.
  15. A method as claimed in claim 14, wherein the at least one of the first weighting matrix and the second weighting matrix comprises a conditional operator for modifying one or more of the plurality of image classification probabilities in dependence on the one or more state indicators.
GB2401888.9A 2024-02-12 2024-02-12 Subsystem control mode selection method and apparatus Pending GB2638007A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB2401888.9A GB2638007A (en) 2024-02-12 2024-02-12 Subsystem control mode selection method and apparatus
PCT/EP2025/053295 WO2025172190A1 (en) 2024-02-12 2025-02-07 Subsystem control mode selection method and apparatus
GBGB2501866.4A GB202501866D0 (en) 2024-02-12 2025-02-07 Subsystem control mode selection method and apparatus technical

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2401888.9A GB2638007A (en) 2024-02-12 2024-02-12 Subsystem control mode selection method and apparatus

Publications (2)

Publication Number Publication Date
GB202401888D0 GB202401888D0 (en) 2024-03-27
GB2638007A true GB2638007A (en) 2025-08-13

Family

ID=90354721

Family Applications (2)

Application Number Title Priority Date Filing Date
GB2401888.9A Pending GB2638007A (en) 2024-02-12 2024-02-12 Subsystem control mode selection method and apparatus
GBGB2501866.4A Pending GB202501866D0 (en) 2024-02-12 2025-02-07 Subsystem control mode selection method and apparatus technical

Family Applications After (1)

Application Number Title Priority Date Filing Date
GBGB2501866.4A Pending GB202501866D0 (en) 2024-02-12 2025-02-07 Subsystem control mode selection method and apparatus technical

Country Status (2)

Country Link
GB (2) GB2638007A (en)
WO (1) WO2025172190A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140188350A1 (en) * 2011-07-13 2014-07-03 Jaguar Land Rover Limited Vehicle control system and method
CN116946098A (en) * 2022-04-19 2023-10-27 北京罗克维尔斯科技有限公司 Vehicle four-wheel drive control method and device, vehicle and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140188350A1 (en) * 2011-07-13 2014-07-03 Jaguar Land Rover Limited Vehicle control system and method
CN116946098A (en) * 2022-04-19 2023-10-27 北京罗克维尔斯科技有限公司 Vehicle four-wheel drive control method and device, vehicle and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NEURAL NETWORKS (IJCNN), 2010, ABOU-NASR M. A, "Terrain identification in grayscale images with recurrent neural networks", pages 1-5 *

Also Published As

Publication number Publication date
GB202501866D0 (en) 2025-03-26
WO2025172190A1 (en) 2025-08-21
GB202401888D0 (en) 2024-03-27

Similar Documents

Publication Publication Date Title
US9573595B2 (en) System and method for controlling vehicle speed
EP2045155B1 (en) A control system for a vehicle and trailer combination
CN103648880B (en) Vehicle control system and method for controlling a vehicle
JP6377162B2 (en) Vehicle control system and method
US10611375B2 (en) Vehicle speed control
US20150217766A1 (en) Speed control system and method of operating the same
US20220176825A1 (en) Control system and method for controlling an electric motor
US11021160B2 (en) Slope detection system for a vehicle
WO2018007079A1 (en) Improvements in vehicle speed control
CN104106013A (en) Driver advice system for a vehicle
US11505073B1 (en) Method for controlling driving force of vehicle
US20190039591A1 (en) Improvements in vehicle speed control
WO2019166142A1 (en) Methods and apparatus for acquisition and tracking, object classification and terrain inference
GB2576265A (en) Improvements in vehicle speed control
JP2019206312A (en) Control device of towing vehicle
JP2004224262A (en) Automatic brake control device
WO2017178189A1 (en) Improvements in vehicle speed control
GB2638007A (en) Subsystem control mode selection method and apparatus
US20240369592A1 (en) Method for estimating the reference speed of a vehicle for limit conditions
RU2702476C1 (en) Vehicle overturn prevention method
WO2024235983A1 (en) System and method for monitoring vehicle dynamics
CN114454683B (en) Control method, device, medium and vehicle for vehicle suspension damping
WO2023210534A1 (en) Control device for vehicle
GB2642078A (en) System, vehicle and method
CN119058653A (en) Vehicle control method, device, vehicle and storage medium based on escape condition