WO2024069690A1 - Driving assistance method and driving assistance device
- Publication number
- WO2024069690A1 (PCT/JP2022/035674)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- edge
- driving
- host vehicle
- host
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4041—Position
Definitions
- The present invention relates to a driving assistance method and a driving assistance device.
- There is known a surround view system for vehicles that generates a three-dimensional model of the vehicle's surroundings from data on those surroundings and maps visual data onto each part of the three-dimensional model, thereby reducing distortion in the generated virtual surround view (Patent Document 1).
- However, mapping the visual data onto a three-dimensional model places a large load on the processing device.
- As a result, the conventional technology above cannot adequately reduce distortion in the virtual surround view in driving scenes where the conditions around the vehicle are constantly changing, and cannot accurately recognize the driving conditions of other vehicles around the vehicle.
- Consequently, unnecessary evasive action may be taken based on the misrecognized driving conditions of other vehicles, disrupting the behavior of the vehicle.
- The problem this invention aims to solve is to provide a driving assistance method and driving assistance device that can reduce the impact of erroneous recognition of the driving conditions of other vehicles on the driving conditions of the vehicle itself.
- The present invention solves the above problem as follows: when another vehicle is detected traveling beside the host vehicle and it is determined that the other vehicle is located at the edge of the detection range of the imaging device, or will enter that edge within a predetermined time, the driving of the host vehicle is autonomously controlled so that the position of the other vehicle is not included in the edge of the detection range of the imaging device.
- The present invention can reduce the impact of erroneous recognition of the driving conditions of other vehicles on the driving conditions of the vehicle itself.
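The control policy summarized above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the function names, the edge fraction, and the speed-adjustment step `delta` are all assumptions made for the example.

```python
def in_edge(bearing_deg, fov_center_deg, fov_deg, edge_fraction=0.15):
    """Return True if a bearing falls in the outer `edge_fraction` of the field of view.

    `edge_fraction` is an assumed value; the text only says the ends span
    roughly 10-15% of the angle of view.
    """
    half = fov_deg / 2.0
    offset = abs(bearing_deg - fov_center_deg)
    if offset > half:
        return False  # outside the field of view entirely
    return offset >= half * (1.0 - edge_fraction)


def avoid_edge_speed(host_speed, other_in_edge, will_enter_edge, delta=1.0):
    """Pick a small speed adjustment so the other vehicle's relative position
    drifts out of the edge region (sketch only; `delta` is illustrative)."""
    if other_in_edge or will_enter_edge:
        return host_speed - delta  # e.g. ease off to shift the relative position
    return host_speed
```

For example, with a 90-degree field of view centered straight ahead, a bearing of 44 degrees lies in the outer 15% of the view and would trigger a speed adjustment.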
- FIG. 1 is a block diagram showing an example of a driving assistance system including a driving assistance device according to the present invention.
- FIG. 2 is a plan view showing an example of the imaging device of FIG. 1.
- FIG. 3A is a plan view showing an example of a detection result of another vehicle by the imaging device shown in FIG. 2 (part 1).
- FIG. 3B is a plan view showing an example of a detection result of another vehicle by the imaging device shown in FIG. 2 (part 2).
- FIG. 4A is a plan view showing an example of a driving scene in which driving assistance is performed by the driving assistance system shown in FIG. 1 (part 1).
- FIG. 4B is a plan view showing an example of a driving scene in which driving assistance is performed by the driving assistance system shown in FIG. 1 (part 2).
- FIG. 4C is a plan view showing an example of a driving scene in which driving assistance is performed by the driving assistance system shown in FIG. 1 (part 3).
- FIG. 8 is a flowchart showing an example of a subroutine of step S6 in FIG. 7.
- FIG. 9 is a flowchart showing another example of the subroutine of step S6 in FIG. 7.
- FIG. 10 is a flowchart showing still another example of the subroutine of step S6 in FIG. 7.
- FIG. 1 is a block diagram showing a driving assistance system 10 according to the present invention.
- The driving assistance system 10 is an in-vehicle system that drives a vehicle to a destination set by a vehicle occupant (including the driver) by autonomous driving control.
- Autonomous driving control means that the driving operation of the vehicle is autonomously controlled using a driving assistance device described later; the driving operation includes all driving operations such as acceleration, deceleration, starting, stopping, steering to the right or left, lane changing, and pulling over.
- Autonomously controlling the driving operation means that the driving assistance device controls the driving operation using the devices of the vehicle.
- The driving assistance device controls these driving operations within a predetermined range, and the driving operations that are not controlled by the driving assistance device are performed manually by the driver.
- The driving assistance system 10 comprises an imaging device 11, a distance measuring device 12, a vehicle state detection device 13, map information 14, a vehicle position detection device 15, a navigation device 16, a vehicle control device 17, a display device 18, and a driving assistance device 19.
- The devices that make up the driving assistance system 10 are connected by a CAN (Controller Area Network) or other in-vehicle LAN, and can send and receive information to and from each other.
- The imaging device 11 is a device that recognizes objects around the vehicle from images, and is, for example, a camera equipped with an imaging element such as a CCD, an ultrasonic camera, an infrared camera, or the like.
- A single vehicle can be provided with multiple imaging devices 11, placed, for example, in the vehicle's front grille, under the left and right door mirrors, and near the rear bumper. This makes it possible to reduce blind spots when recognizing objects around the vehicle.
- The distance measuring device 12 is a device for calculating the relative distance and relative speed between the vehicle and an object, and is, for example, a laser radar or laser range finder (LRF), a millimeter wave radar, a LiDAR (light detection and ranging) unit, an ultrasonic radar, or a sonar device.
- A single vehicle can be provided with multiple distance measuring devices 12, positioned, for example, at the front, right side, left side, and rear of the vehicle. This makes it possible to accurately calculate the relative distance and relative speed between the vehicle and objects around it.
- Objects detected by the imaging device 11 and distance measuring device 12 include road lane boundaries, center lines, road markings, medians, guardrails, curbs, highway sidewalls, road signs, traffic lights, crosswalks, construction sites, accident sites, and traffic restrictions. Objects also include obstacles that may affect the travel of the vehicle, such as automobiles other than the vehicle itself (other vehicles), motorcycles, bicycles, and pedestrians.
- The detection results of the imaging device 11 and distance measuring device 12 are obtained at predetermined time intervals by the driving assistance device 19 as necessary. The predetermined time intervals can be set to an appropriate value depending on the processing capacity of the driving assistance device 19.
- The detection results of the imaging device 11 and the distance measuring device 12 can be integrated or synthesized by the driving assistance device 19 (so-called sensor fusion), which makes it possible to supplement information that either sensor alone lacks about the detected object.
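A minimal sketch of this sensor-fusion step, under assumed data layouts: each camera detection supplies a class label and a bearing, each radar return supplies a bearing and an accurate range, and the two are paired by nearest bearing so each fills in what the other lacks. The function name, tuple formats, and matching threshold are illustrative assumptions, not the patent's method.

```python
def fuse(camera_dets, radar_dets, max_bearing_diff=5.0):
    """camera_dets: [(label, bearing_deg)]; radar_dets: [(bearing_deg, range_m)].

    Pair each camera detection with the nearest radar return by bearing;
    leave range as None when no radar return is close enough.
    """
    fused = []
    for label, cam_bearing in camera_dets:
        best = min(radar_dets, key=lambda r: abs(r[0] - cam_bearing), default=None)
        if best is not None and abs(best[0] - cam_bearing) <= max_bearing_diff:
            fused.append({"label": label, "bearing": cam_bearing, "range": best[1]})
        else:
            fused.append({"label": label, "bearing": cam_bearing, "range": None})
    return fused
```

A real system would fuse full state estimates (e.g. with a Kalman filter) rather than single frames; nearest-neighbor pairing is only the simplest form of the idea.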
- The driving assistance device 19 can calculate the position information of an object based on the self-position information (the position at which the vehicle is traveling) acquired by the vehicle position detection device 15 and the relative position (distance and direction) between the vehicle and the object.
- The calculated position information of the object is integrated by the driving assistance device 19 with multiple other pieces of information, such as the detection results of the imaging device 11 and the distance measuring device 12 and the map information 14, to form information about the driving environment around the vehicle.
- The detection results of the imaging device 11 and the distance measuring device 12 and the map information 14 can be used to recognize objects around the vehicle and predict their movements.
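The conversion from a relative detection to a map position described above is straightforward trigonometry. The sketch below assumes a planar coordinate frame, headings in degrees, and a bearing measured from the vehicle's heading; these conventions are assumptions for illustration.

```python
import math

def object_position(self_x, self_y, self_heading_deg, rel_dist, rel_bearing_deg):
    """Project a (distance, bearing) detection into map coordinates.

    The object's absolute direction is the host heading plus the relative
    bearing; the offset is that direction scaled by the measured distance.
    """
    angle = math.radians(self_heading_deg + rel_bearing_deg)
    return (self_x + rel_dist * math.cos(angle),
            self_y + rel_dist * math.sin(angle))
```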
- The vehicle state detection device 13 is a device for detecting the vehicle's running state; examples include a vehicle speed sensor, an acceleration sensor, a yaw rate sensor (e.g., a gyro sensor), a steering angle sensor, and an inertial measurement unit. There are no particular limitations on these devices, and any known device can be used. Furthermore, the placement and number of these devices can be set appropriately within a range in which the vehicle's running state can be appropriately detected. The detection results of each device are obtained by the driving assistance device 19 at predetermined time intervals as necessary.
- Map information 14 is information used for generating driving routes, controlling driving operations, and so on, and includes road information, facility information, and their attribute information.
- Road information and road attribute information include information such as road width, road curvature radius, road shoulder structures, road traffic regulations (speed limit, whether lane changes are permitted), road junctions and branching points, and locations where the number of lanes increases or decreases.
- Map information 14 is high-definition map information that allows the movement trajectory of each lane to be grasped, and includes two-dimensional and/or three-dimensional position information at each map coordinate, road/lane boundary information at each map coordinate, road attribute information, lane uphill/downhill information, lane identification information, connecting lane information, etc. Such high-definition maps are also called HD (High-Definition) maps.
- The road/lane boundary information in the high-definition map information indicates the boundary between the lane on which the vehicle travels and other lanes.
- The lane on which the vehicle travels is the road along which the vehicle travels, and the form of the lane is not particularly limited.
- The boundary exists on both the left and right sides of the vehicle's direction of travel, and its form is not particularly limited.
- The boundary is, for example, a road marking or a road structure. Examples of road markings include lane boundary lines and center lines, and examples of road structures include medians, guardrails, curbs, tunnels, and the side walls of expressways. Note that at points where the lane boundary cannot be clearly identified, such as within intersections, a boundary is set for the lane in advance. This boundary is imaginary, and is not an actual road marking or road structure.
- Map information 14 is stored in a readable state on a recording medium provided in the driving assistance device 19, an in-vehicle device, or a server on a network.
- The driving assistance device 19 acquires map information 14 as necessary.
- The vehicle position detection device 15 is a positioning system for detecting the current position of the vehicle; it is not particularly limited, and any known system can be used.
- The vehicle position detection device 15 calculates the current position of the vehicle from radio waves received from Global Positioning System (GPS) satellites, for example.
- The vehicle position detection device 15 may also estimate the current position of the vehicle from vehicle speed information and acceleration information acquired from the vehicle state detection device 13, which includes a vehicle speed sensor, an acceleration sensor, and a gyro sensor, and calculate the current position of the vehicle by comparing the estimated current position with the map information 14.
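The estimation step described above is classic dead reckoning: integrate speed and yaw rate over small time steps to propagate the last known pose, which can then be corrected by map matching. The sketch below is a pure illustration with assumed units (m/s, deg/s) and sample format.

```python
import math

def dead_reckon(x, y, heading_deg, samples, dt=0.1):
    """Propagate a pose through a list of (speed_mps, yaw_rate_dps) readings.

    Each step first updates the heading from the yaw rate, then advances the
    position along the new heading by speed * dt.
    """
    for speed, yaw_rate in samples:
        heading_deg += yaw_rate * dt
        x += speed * math.cos(math.radians(heading_deg)) * dt
        y += speed * math.sin(math.radians(heading_deg)) * dt
    return x, y, heading_deg
```

One second of driving straight east at 10 m/s (ten 0.1 s samples with zero yaw rate) advances the pose by 10 m along x.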
- The navigation device 16 is a device that refers to the map information 14 and calculates a driving route from the current position of the vehicle detected by the vehicle position detection device 15 to a destination set by the occupants (including the driver).
- The navigation device 16 uses road information and facility information in the map information 14 to search for a driving route by which the vehicle can reach the destination from the current position.
- The driving route includes at least information on the road the vehicle is traveling on, the driving lane, and the vehicle's driving direction, and is displayed, for example, as a line. There may be multiple driving routes depending on the search conditions.
- The driving route calculated by the navigation device 16 is output to the driving assistance device 19.
- The vehicle control device 17 is an on-board computer such as an electronic control unit (ECU), and electronically controls the on-board equipment that governs the driving of the vehicle.
- The vehicle control device 17 is equipped with a vehicle speed control device 171 that controls the vehicle speed and a steering control device 172 that controls the steering operation of the vehicle.
- The vehicle speed control device 171 and the steering control device 172 autonomously control the operation of the drive devices and steering devices described below in response to control signals input from the driving assistance device 19. This allows the vehicle to drive autonomously along a set driving route.
- The information required for autonomous control by the vehicle speed control device 171 and the steering control device 172, such as the vehicle speed, acceleration, steering angle, and attitude, is obtained from the vehicle state detection device 13.
- The drive devices controlled by the vehicle speed control device 171 include an electric motor and/or an internal combustion engine serving as the driving source, a power transmission device including a drive shaft and an automatic transmission that transmits the output of the driving source to the drive wheels, and a drive device that controls the power transmission device.
- The braking device controlled by the vehicle speed control device 171 is, for example, a braking device that brakes the wheels.
- A control signal corresponding to the set vehicle speed is input to the vehicle speed control device 171 from the driving assistance device 19.
- The vehicle speed control device 171 generates signals for controlling these drive devices based on the control signals input from the driving assistance device 19, and transmits them to the drive devices, thereby autonomously controlling the vehicle speed.
- The steering device controlled by the steering control device 172 is a device that steers the steered wheels according to the rotation angle of the steering wheel; an example is a steering actuator such as a motor attached to the steering column shaft.
- The steering control device 172 autonomously controls the operation of the steering device so that the vehicle travels while maintaining a predetermined lateral position (the left-right position of the vehicle) with respect to the set travel route. For this control, at least one of the following is used: the detection results of the imaging device 11 and the distance measuring device 12, the vehicle's travel state obtained by the vehicle state detection device 13, the map information 14, and the current-position information obtained by the vehicle position detection device 15.
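Lateral-position keeping of the kind described above is often realized with a simple feedback law; the toy proportional controller below steers against the lateral offset from the target line and the heading error. The gains, sign convention, and saturation limit are illustrative assumptions, not values from the patent.

```python
def steering_command(lateral_offset_m, heading_error_deg,
                     k_offset=2.0, k_heading=0.5, max_deg=30.0):
    """Return a steering angle command (degrees).

    Positive offset/error means the vehicle is to the left of the target
    line, so the command is negative (steer right), clipped to +/- max_deg.
    """
    cmd = -(k_offset * lateral_offset_m + k_heading * heading_error_deg)
    return max(-max_deg, min(max_deg, cmd))
```

Production lane-keeping controllers typically add derivative/preview terms and vehicle-dynamics models; the proportional form only shows the feedback structure.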
- The display device 18 is a device for providing necessary information to vehicle occupants, and is, for example, a liquid crystal display provided on the instrument panel or a projector such as a head-up display (HUD).
- The display device 18 may also be equipped with an input device by which the vehicle occupants input instructions to the driving assistance device 19. Examples of input devices include a touch panel that receives input from a finger or stylus pen, a microphone that receives spoken instructions, and a switch attached to the steering wheel of the vehicle.
- The display device 18 may also be equipped with a speaker as an output device.
- The driving assistance device 19 is a device that drives the vehicle to a set destination by controlling and coordinating the devices that make up the driving assistance system 10.
- The destination is set, for example, by a vehicle occupant.
- The driving assistance device 19 is, for example, a computer, and includes a CPU (Central Processing Unit) 191, which is a processor, a ROM (Read Only Memory) 192 in which programs are stored, and a RAM (Random Access Memory) 193 that functions as an accessible storage device.
- The CPU 191 is an operating circuit that executes the programs stored in the ROM 192 to realize the functions of the driving assistance device 19.
- The driving assistance device 19 has a driving assistance function of driving the vehicle to a set destination by autonomous driving control.
- The driving assistance device 19 has, as driving assistance functions, a route generation function of generating a driving route, an environment recognition function of recognizing the driving environment around the vehicle, a determination function of making the determinations necessary for executing autonomous driving control based on the recognized driving environment, and a driving control function of generating a driving trajectory and driving the vehicle along it.
- The programs stored in the ROM 192 include programs for realizing these functions, and the functions are realized by the CPU 191 executing those programs.
- FIG. 1 shows functional blocks extracted for convenience to represent each function.
- The support unit 20 has a driving support function that drives the vehicle to a set destination by autonomous driving control.
- FIG. 2 is a plan view showing an example of a driving scene in which the driving assistance device 19 autonomously controls the driving of the vehicle by the driving assistance function.
- In FIG. 2, a three-lane road extends in the vertical direction of the drawing, and the vehicle drives on the road from the bottom toward the top of the drawing.
- The lanes are designated as lanes L1, L2, and L3 in order from the left side of the driving direction.
- The host vehicle V1 is driving at position P1 in lane L2, heading straight toward a destination (not shown) ahead that has been set by an occupant of the host vehicle V1.
- The recognition unit 21 has an environment recognition function that recognizes the driving environment around the vehicle.
- The driving assistance device 19 recognizes the driving environment around the vehicle using the imaging device 11 and the distance measuring device 12 through the environment recognition function of the recognition unit 21.
- The driving environment is information for determining whether the vehicle can maintain its current driving state or needs to change it, and includes information such as the type and position of objects, the type and position of any obstacles, road conditions such as the road surface state, and the weather.
- The driving assistance device 19 recognizes the driving environment by applying appropriate processing, such as pattern matching and sensor fusion, to the detection results of the imaging device 11 and the distance measuring device 12.
- The imaging device 11 is composed of multiple imaging devices 11.
- The host vehicle V1 shown in FIG. 2 is equipped with a front camera that detects obstacles within a detection range A1 in front of the host vehicle V1, and a rear camera that detects obstacles within a detection range A2 behind the host vehicle V1.
- The host vehicle V1 is also equipped with a front wide-angle camera that detects obstacles within a detection range B1 in front of the host vehicle V1, a rear wide-angle camera that detects obstacles within a detection range B2 behind it, a left-side wide-angle camera that detects obstacles within a detection range B3 to its left, and a right-side wide-angle camera that detects obstacles within a detection range B4 to its right.
- Wide-angle cameras have a wide-angle lens, and therefore a wider angle of view and a shorter focal length than normal cameras. Therefore, the detection range B1 of the front wide-angle camera is shorter in distance along the lane L2 than the detection range A1 of the front camera, and has a wider angle of view in the width direction of lane L2. Similarly, the detection range B2 of the rear wide-angle camera is shorter in distance along the lane L2 than the detection range A2 of the rear camera, and has a wider angle of view in the width direction of lane L2.
- The angle of view is the range that the imaging device 11 can photograph (i.e., its detection range) expressed in degrees; for example, it indicates the horizontal detection range of the imaging device 11 as an angle centered on the imaging device 11.
- The driving assistance device 19 uses the recognition unit 21 to integrate the detection results from the front, rear, left-side, and right-side wide-angle cameras by sensor fusion, and thereby detects obstacles all around the host vehicle V1.
- These wide-angle cameras are positioned so that portions of the detection ranges of adjacent cameras overlap, to prevent blind spots around the host vehicle V1 in which obstacles cannot be detected.
- The detection ranges of adjacent cameras overlap at the ends of the angle of view of each imaging device 11.
- The ends of the angle of view refer to, for example, a range of about 10 to 15% of the angle of view measured from the horizontal edge of the detection range.
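Since the ends are fixed fractions of a camera's field of view, they can be precomputed as bearing intervals once the camera's mounting is known. The sketch below is an assumed formulation for illustration; the 12.5% edge fraction splits the difference in the 10-15% range quoted above.

```python
def edge_ranges(center_deg, fov_deg, edge_fraction=0.125):
    """Return the two (start, end) bearing intervals forming the FOV ends.

    The ends are the outer `edge_fraction` of the half-angle on each side
    of the camera's optical axis (`center_deg`).
    """
    half = fov_deg / 2.0
    inner = half * (1.0 - edge_fraction)
    return [(center_deg - half, center_deg - inner),
            (center_deg + inner, center_deg + half)]
```

For a camera with a 180-degree field of view centered straight ahead and a 10% edge fraction, the ends are roughly the intervals from 81 to 90 degrees on each side.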
- FIG. 2 shows the ends C1 to C4 of the detection ranges of these cameras.
- Ends C1 and C3 are ends of detection range B1, and ends C2 and C4 are ends of detection range B2.
- Ends C1 and C2 overlap with the ends of detection range B3, and ends C3 and C4 overlap with the ends of detection range B4. It is known that the state of an obstacle cannot be accurately detected at these ends C1, C2, C3, and C4, because the shape of an obstacle photographed at the ends of the angle of view of a wide-angle lens is distorted due to the characteristics of the lens.
- FIG. 3A is a plan view showing the driving scene of FIG. 2 with another vehicle V2 driving in lane L3.
- The other vehicle V2 is driving at position Q1 in lane L3, and its vehicle speed is higher than that of the host vehicle V1.
- The other vehicle V2 travels straight from position Q1 to position Q2 along the driving trajectory U1 and overtakes the host vehicle V1.
- Positions Q1 and Q2 are positions relative to position P1.
- FIGS. 3A-3B show only the ends C1, C2, C3, and C4 of the detection ranges of the imaging devices 11 shown in FIG. 2, but this does not mean that obstacles are not detected elsewhere; obstacles are also detected in the detection ranges not shown. The same applies to FIGS. 4A-4C, 5, and 6A-6B, which will be described later.
- The driving assistance device 19 recognizes that the other vehicle V2 is traveling from position Q1 to position Q2 along the traveling trajectory Ux shown in FIG. 3B. In other words, at the ends C3 and C4 the driving assistance device 19 erroneously recognizes the other vehicle V2, which is traveling straight along the trajectory U1, as meandering along the trajectory Ux and approaching the host vehicle V1.
- As a result, the driving assistance device 19 may mistakenly predict that the other vehicle V2 will approach the host vehicle V1 along the travel trajectory Uy, and may execute evasive action to avoid the other vehicle V2.
- For example, the driving assistance device 19 may decelerate the host vehicle V1 using the vehicle speed control device 171, or may change the host vehicle V1's lane from lane L2 to lane L1 using the steering control device 172.
- Such evasive actions are unnecessary driving actions taken to avoid another vehicle that is in fact traveling straight; they disrupt the behavior of the host vehicle V1 and make its occupants feel uncomfortable.
- The driving assistance device 19 of this embodiment therefore autonomously controls the driving of the host vehicle V1 so that the position of the other vehicle V2 is not included in the ends of the detection range of the imaging device 11, in order to reduce the effect that an erroneously recognized driving state of the other vehicle V2 has on the driving state of the host vehicle V1.
- FIG. 4A is a plan view showing an example of a driving scene in which autonomous driving control is executed by the driving assistance device 19 of this embodiment.
- In the driving scene shown in FIG. 4A, the host vehicle V1 and other vehicles V3 and V4 are driving on the road shown in FIG. 2: the host vehicle V1 at position P2 in lane L2, the other vehicle V3 at position Q3 in lane L3, and the other vehicle V4 at position Q4 in lane L3.
- The host vehicle V1 is driving at a constant speed under lane-keeping control, and the other vehicles V3 and V4 are driving straight at the same vehicle speed as the host vehicle V1.
- The position Q3 of the other vehicle V3 is included in the range of end C3, and the position Q4 of the other vehicle V4 is included in the range of end C4.
- The functions performed by the recognition unit 21, the determination unit 22, and the control unit 23 of this embodiment in the driving scene shown in FIG. 4A are described below.
- The recognition unit 21 has a function of detecting other vehicles traveling to the side of the host vehicle V1 using the imaging device 11.
- The side of the host vehicle V1 refers, for example, to the detection range of the imaging device 11 installed on the left side of the host vehicle V1 and the detection range of the imaging device 11 installed on its right side.
- In this case, the detection ranges B3 and B4 shown in FIG. 2 are the sides of the host vehicle V1.
- Alternatively, the side of the host vehicle V1 may be the range, within the lane adjacent to the host vehicle V1's lane and the lane further adjacent to that lane, in which an obstacle can be detected by the detection devices of the host vehicle V1 (for example, the imaging device 11 and the distance measuring device 12).
- Another vehicle traveling to the side of the host vehicle V1 is, for example, another vehicle traveling in an adjacent lane or in a lane adjacent to that adjacent lane.
- The driving assistance device 19 recognizes the other vehicles V3 and V4 traveling in the adjacent lane L3 using the imaging device 11 and the distance measuring device 12 through the function of the recognition unit 21.
- The positions of the other vehicles V3 and V4 are recognized by combining (sensor fusion) the detection results of the imaging device 11 and the distance measuring device 12.
- Here, lane L2 is the host vehicle's lane, and lanes L1 and L3 are adjacent lanes: lane L1 is the adjacent lane on the left of the driving direction, and lane L3 is the adjacent lane on the right.
- The determination unit 22 has a determination function of determining whether the position of another vehicle is at an end of the detection range of the imaging device 11, or will enter that end within a predetermined time.
- The driving assistance device 19, through the function of the determination unit 22, determines whether the position of the other vehicle is at an end of the detection range of the imaging device 11 based on the driving-environment information around the host vehicle V1 acquired by the function of the recognition unit 21. Alternatively, or in addition, the driving assistance device 19 may determine whether the position of the other vehicle will enter that end within a predetermined time.
- Specifically, the driving assistance device 19 determines whether the detected position of the other vehicle is included in an end of the detection range of the imaging device 11.
- The ends of the detection range are fixed when the imaging device 11 is installed, so their ranges are registered in advance in the ROM 192 of the driving assistance device 19 or the like. The driving assistance device 19 therefore detects the position of the other vehicle and determines whether that position is included in a registered end of the detection range.
- In the driving scene of FIG. 4A, the position Q3 of the other vehicle V3 is included in the range of end C3, and the position Q4 of the other vehicle V4 is included in the range of end C4, so the driving assistance device 19 determines that the positions of the other vehicles V3 and V4 are at ends of the detection range.
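The two-part determination above (currently at an end, or entering one within a predetermined time) can be sketched as a check against the registered end intervals plus a simple extrapolation of the other vehicle's bearing. The interval representation, bearing-rate extrapolation, and 2-second horizon are all assumptions for illustration, not the patent's actual method.

```python
def at_or_entering_edge(bearing_deg, bearing_rate_dps, ends, horizon_s=2.0):
    """ends: list of (start_deg, end_deg) edge intervals registered in advance.

    Returns True if the other vehicle's bearing is inside an end interval
    now, or if a constant-rate extrapolation puts it inside one within
    `horizon_s` seconds (the "predetermined time").
    """
    def inside(b):
        return any(start <= b <= end for start, end in ends)

    if inside(bearing_deg):
        return True
    future = bearing_deg + bearing_rate_dps * horizon_s
    return inside(future)
```

For example, with a registered end interval of 81-90 degrees, a vehicle at 75 degrees drifting outward at 4 degrees per second is flagged, because it reaches 83 degrees within the 2-second horizon.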
- The driving assistance device 19 recognizes the driving states of the host vehicle V1 and the other vehicle from, for example, the driving-environment information acquired by the function of the recognition unit 21.
- The driving state of a vehicle refers to the state of its traveling direction and vehicle speed, and includes states such as traveling straight, steering to the right or left, accelerating or decelerating, and traveling at a constant speed.
- The driving state of a vehicle also includes the state of the driving operations the vehicle is performing, for example whether its turn indicators are flashing or its headlights are on.
- The driving assistance device 19 acquires information such as the vehicle speed, acceleration, yaw rate, steering angle, and steering wheel rotation angle of the host vehicle V1 from the various sensors of the vehicle state detection device 13, and recognizes the current driving state of the host vehicle V1.
- The driving assistance device 19 may also acquire road information from the map information 14, the current position of the host vehicle V1 from the vehicle position detection device 15, and the driving route from the navigation device 16, and recognize the traveling direction and/or vehicle speed of the host vehicle V1 from the shape of the road at the current position and/or the driving route.
- The driving assistance device 19 also acquires the control signals output from the vehicle control device 17 to the drive device and/or steering device, and recognizes how the traveling direction and/or vehicle speed of the host vehicle V1 are to be controlled (changed). Based on these, it predicts how the traveling state of the host vehicle V1 will change after a predetermined time.
- The driving assistance device 19 may also acquire road information from the map information 14, the current position of the host vehicle V1 from the vehicle position detection device 15, and the traveling route from the navigation device 16, and predict the traveling direction and/or vehicle speed of the host vehicle V1 after a predetermined time from the shape of the road ahead of the current position and/or the traveling route.
- The driving assistance device 19 acquires image data from the imaging device 11, for example, extracts and identifies obstacles by pattern matching, and recognizes the type, position, and state of each obstacle. It also acquires information obtained by scanning the area around the host vehicle V1 from the distance measuring device 12, and recognizes the position and direction of obstacles from this information. If it recognizes from the image data that an obstacle is another vehicle, it recognizes from the vehicle's shape how much the vehicle body is tilted (i.e., how much it is being steered). It also acquires the position and relative speed of the other vehicle with respect to the host vehicle V1 from the scan results of the distance measuring device 12. It then recognizes the driving position, traveling direction, and speed of the other vehicle based on these detection results.
- When predicting the driving state of the other vehicle V2 after a predetermined time, the driving assistance device 19 obtains information obtained by scanning the surroundings of the host vehicle V1 from the distance measuring device 12, and recognizes the position and direction of the obstacle from that information. The driving assistance device 19 repeats this recognition process multiple times (e.g., three or more times) at time intervals shorter than the predetermined time, recognizes the tendency of changes in the obstacle's position, and from that tendency predicts the state of the obstacle after the predetermined time (i.e., the driving state of the other vehicle V2).
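The trend-based prediction described above (three or more scans at short intervals, extrapolated to a predetermined time ahead) could be implemented, under a constant-velocity assumption, as a least-squares line fit; the function name and sample format here are hypothetical.

```python
def predict_position(samples, dt_future):
    """Fit a constant-velocity trend to (t, x, y) observations by least
    squares and extrapolate dt_future seconds past the last sample.
    The text suggests three or more observations at intervals shorter
    than the predetermined time."""
    n = len(samples)
    assert n >= 3, "need three or more observations"
    ts = [s[0] for s in samples]
    t_mean = sum(ts) / n

    def fit(idx):
        vals = [s[idx] for s in samples]
        v_mean = sum(vals) / n
        num = sum((t - t_mean) * (v - v_mean) for t, v in zip(ts, vals))
        den = sum((t - t_mean) ** 2 for t in ts)
        slope = num / den  # estimated velocity component [m/s]
        return v_mean + slope * (ts[-1] + dt_future - t_mean)

    return fit(1), fit(2)

# Other vehicle observed drifting laterally toward the edge at 0.5 m/s:
obs = [(0.0, 10.0, 0.0), (0.5, 10.0, 0.25), (1.0, 10.0, 0.5)]
print(predict_position(obs, 2.0))  # -> (10.0, 1.5)
```

The predicted position can then be tested against the registered edge ranges to decide whether the other vehicle will be inside an edge after the predetermined time.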
- the driving assistance device 19 determines whether the position of the other vehicle V2 will enter the edge of the detection range within a predetermined time based on the recognized driving state of the host vehicle V1 and the driving state of the other vehicle V2. The driving assistance device 19 determines whether the other vehicle will enter the edge of the detection range based on the positional relationship between the host vehicle V1 and the other vehicle, the speed difference between them, and the traveling directions of the host vehicle V1 and the other vehicle.
- the predetermined time can be set to an appropriate value within the range in which autonomous driving control can be initiated to prevent the position of the other vehicle from being included in the edge of the detection range between the time the other vehicle is detected and the time its position would actually enter the edge, for example 10 to 20 seconds. If the predetermined time is shorter than this, the start of the autonomous driving control will be delayed and the behavior of the host vehicle V1 will change abruptly. Conversely, if the predetermined time is longer than this, the driving state cannot be predicted accurately, and there is a risk that autonomous driving control for keeping the position of the other vehicle out of the edge of the detection range will be executed in a driving scene in which normal autonomous driving control should be used.
- For example, in a driving scene in which the other vehicle is located behind the host vehicle V1 and is traveling at a position other than the end of the detection range, if the vehicle speed of the other vehicle is faster than that of the host vehicle V1, the other vehicle catches up with the host vehicle V1 within the predetermined time, and the other vehicle's direction of travel is toward the end of the detection range, it is determined that the other vehicle will enter that end within the predetermined time. In contrast, in the same driving scene, if the vehicle speed of the other vehicle is equal to or slower than that of the host vehicle V1, it is determined that the other vehicle will not enter that end within the predetermined time.
- Conversely, in a driving scene in which the other vehicle is located ahead of the host vehicle V1, if the vehicle speed of the host vehicle V1 is faster than that of the other vehicle, the host vehicle V1 catches up with the other vehicle within the predetermined time and the host vehicle V1 (in particular, the end of the detection range) moves toward the other vehicle, so it is determined that the other vehicle will enter that end within the predetermined time.
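The catch-up determination in the scenes above reduces to a simple kinematic check under the assumption of a straight road and constant speeds; the threshold `t_max` (the 10-to-20-second predetermined time, here an illustrative midpoint), the coordinate convention, and the `heading_toward_edge` flag are assumptions for illustration.

```python
def will_enter_edge(rel_x, v_other, v_host, heading_toward_edge, t_max=15.0):
    """rel_x: longitudinal position of the other vehicle relative to the
    host vehicle V1 in metres (negative = behind). Returns True if, at the
    current speed difference, the gap closes within t_max seconds AND the
    other vehicle's direction of travel is toward the end of the
    detection range."""
    if not heading_toward_edge:
        return False
    closing_speed = v_other - v_host  # m/s; > 0 means the other vehicle catches up
    if rel_x < 0:   # other vehicle behind the host
        return closing_speed > 0 and (-rel_x) / closing_speed <= t_max
    else:           # other vehicle ahead of the host
        return closing_speed < 0 and rel_x / (-closing_speed) <= t_max

# 30 m behind, 3 m/s faster, heading toward the edge: catches up in 10 s.
print(will_enter_edge(-30.0, v_other=25.0, v_host=22.0, heading_toward_edge=True))   # -> True
# Same geometry but slower than the host: never catches up.
print(will_enter_edge(-30.0, v_other=20.0, v_host=22.0, heading_toward_edge=True))   # -> False
```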
- the driving assistance device 19 may determine whether the position of the other vehicle will enter the edge of the detection range within a predetermined time based on the predicted driving state of the vehicle V1 and the driving state of the other vehicle. Specifically, it determines whether the position of the other vehicle will enter the edge from the positional relationship between the vehicle V1 and the other vehicle after a predetermined time.
- the path of the other vehicle may be predicted based on the recognized driving state of the other vehicle, and based on the path of the other vehicle, it may determine whether the position of the other vehicle will enter the edge of the detection range within a predetermined time.
- the path of the other vehicle is, for example, the traveling direction of the other vehicle described above, and may be the driving route of the other vehicle obtained from the other vehicle if vehicle-to-vehicle communication is possible between the vehicle V1 and the other vehicle.
- the driving assistance device 19 recognizes the driving state (particularly the traveling direction) of the other vehicle V4.
- the other vehicle V4 is traveling straight along lane L3, so the traveling direction of the other vehicle V4 is toward end C4. Therefore, the driving assistance device 19 determines that the position of the other vehicle V4 will enter end C4 within the predetermined time.
- predicting the course of the other vehicle based on the driving state of the other vehicle and determining whether the position of the other vehicle will enter the end of the detection range within the predetermined time based on the course of the other vehicle are not essential components of the present invention, and may be added or omitted as necessary.
- If it is determined that the other vehicle is at the edge of the detection range of the imaging device 11 or will enter said edge within a predetermined time, the control unit 23 has the function of autonomously controlling the driving of the host vehicle V1 so that the other vehicle is not included in said edge. If the driving assistance device 19 determines, by the function of the determination unit 22, that the other vehicle is not at the edge of the detection range of the imaging device 11 and will not enter said edge within a predetermined time, it executes normal autonomous driving control. In contrast, if it determines that the other vehicle is at said edge or will enter said edge within a predetermined time, it uses the function of the control unit 23 to autonomously control the driving of the host vehicle V1 so that the other vehicle is not included in said edge.
- the autonomous driving control that prevents the other vehicle from being included in the edge of the detection range of the imaging device 11 is also referred to as edge avoidance control.
- the edge avoidance control includes, for example, autonomously controlling the traveling of the host vehicle V1 so that the speed difference between the host vehicle V1 and the other vehicle increases. For example, the vehicle speed of the host vehicle V1 obtained from the host vehicle state detection device 13 is compared with the vehicle speed of the other vehicle obtained from the traveling environment information. If the vehicle speed of the host vehicle V1 is slower than that of the other vehicle, the host vehicle V1 is decelerated via the vehicle speed control device 171; if it is faster, the host vehicle V1 is accelerated via the vehicle speed control device 171; and if the two vehicle speeds are the same, the host vehicle V1 is either accelerated or decelerated via the vehicle speed control device 171.
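The speed-difference rule just described might look like the following sketch. The function name and the `delta` adjustment are hypothetical; a real implementation would issue the resulting target speed to the vehicle speed control device 171 rather than return it.

```python
def edge_avoidance_speed_command(v_host, v_other, delta=1.0):
    """Choose a target speed that increases the speed difference between
    the host vehicle V1 and the other vehicle (edge avoidance control).
    delta is an illustrative speed adjustment [m/s]."""
    if v_host < v_other:
        return v_host - delta   # host is slower: decelerate further
    elif v_host > v_other:
        return v_host + delta   # host is faster: accelerate further
    else:
        return v_host + delta   # equal speeds: either works; accelerate here

# Host at 20 m/s, other vehicle at 22 m/s: decelerate to widen the gap.
print(edge_avoidance_speed_command(20.0, 22.0))  # -> 19.0
```

Either branch moves the other vehicle out of the edge faster; which branch is taken only depends on which direction already has a speed difference to amplify.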
- the edge avoidance control may include setting the path of the host vehicle V1 based on the traveling direction of the other vehicle (or the path of the other vehicle). For example, if the path of the other vehicle is heading toward the edge of the detection range of the imaging device 11, the path (traveling direction) of the host vehicle V1 is changed in a direction away from the other vehicle; for example, the host vehicle V1 changes from the lane in which it is currently traveling to an adjacent lane in a direction away from the other vehicle.
- autonomously controlling the traveling of the host vehicle V1 so that the speed difference between the host vehicle V1 and the other vehicle increases, and setting the path of the host vehicle V1 based on the path of the other vehicle in the same case are not essential components of the present invention and may be added or omitted as necessary.
- When the driving assistance device 19 detects another vehicle traveling to the side of the host vehicle V1, it may determine whether the other vehicle is traveling behind the host vehicle V1. Then, when it is determined that the other vehicle is traveling behind the host vehicle V1 and that the position of the other vehicle is at the end of the detection range or will enter the end within a predetermined time, the driving assistance device 19 autonomously controls the driving of the host vehicle V1 so that the position of the other vehicle is not included in the end of the detection range that exists behind the host vehicle V1. This is because the imaging device 11 that captures the area behind the host vehicle V1 has a shorter focal length than the one that captures the area ahead, so a situation in which the driving state of the other vehicle cannot be accurately recognized is more likely to occur.
- autonomous control of the traveling of the vehicle V1 so that the position of the other vehicle traveling behind the vehicle V1 is not included in the edge behind the vehicle V1 is not an essential configuration of the present invention, and may be added or omitted as necessary.
- the other vehicle V3 is traveling at position Q3 of end C3, and the other vehicle V4 is traveling at position Q4 of end C4, so the driving assistance device 19 compares the vehicle speed of the host vehicle V1 with the vehicle speeds of the other vehicles V3 and V4 as end avoidance control. Then, when the vehicle speed of the host vehicle V1 is faster than the vehicle speeds of the other vehicles V3 and V4, the host vehicle V1 is accelerated as shown in FIG. 4B, and travels from position P2 to position P3 along the travel trajectory T1.
- Conversely, when the vehicle speed of the host vehicle V1 is slower than the vehicle speeds of the other vehicles V3 and V4, the host vehicle V1 is decelerated so that the positions of the other vehicles V3 and V4 move out of the ends C3 and C4 and are not included in them, regardless of whether the other vehicle V3 is traveling in front of the host vehicle V1 or the other vehicle V4 is traveling behind it.
- positions P3 and P4 are relative positions to position P2, and the host vehicle V1 does not move backward in the driving scene shown in FIG. 4C.
- When the vehicle speed of the host vehicle V1 is the same as the vehicle speeds of the other vehicles V3 and V4, the host vehicle V1 is accelerated or decelerated. In other words, when the vehicle speeds are the same, the host vehicle V1 may be either accelerated or decelerated, as long as the positions of the other vehicles V3 and V4 leave the ends C3 and C4 and do not re-enter them.
- The statement that the positions of the other vehicles V3 and V4 are not included in the ends C3 and C4 of the detection range when viewed from above means that the entire body of the other vehicles V3 and V4 is not included in the ends C3 and C4, or that most of the body (e.g., 90% or more) is not included in them. In other words, a part of the body of the other vehicles V3 and V4 (e.g., 10% or less) may be included in the ends C3 and C4.
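The "mostly outside the edge" criterion above (at most ~10% of the body inside the end) can be illustrated with a one-dimensional overlap calculation along the vehicle's length. The 1-D approximation, function names, and the 10% `tolerance` default are assumptions for illustration; a real check would use the 2-D body footprint.

```python
def fraction_inside_edge(body_start, body_end, edge_start, edge_end):
    """Fraction of the other vehicle's body length lying inside the edge
    region (1-D longitudinal approximation)."""
    overlap = max(0.0, min(body_end, edge_end) - max(body_start, edge_start))
    return overlap / (body_end - body_start)

def position_excluded_from_edge(body_start, body_end, edge_start, edge_end,
                                tolerance=0.10):
    """True when at most ~10% of the body is inside the edge, per the text."""
    return fraction_inside_edge(body_start, body_end, edge_start, edge_end) <= tolerance

# A 4.5 m vehicle whose rear 0.3 m still overlaps the edge: ~6.7% inside.
print(position_excluded_from_edge(0.0, 4.5, -5.0, 0.3))  # -> True
```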
- the driving assistance device 19 may maintain the vehicle speed of the host vehicle V1 when the vehicle speed of the host vehicle V1 is faster than the vehicle speed of the other vehicles V3 and V4 in the driving scene shown in FIG. 4A.
- the driving assistance device 19 may determine whether the positions of the other vehicles V3 and V4 are at the ends of the detection range when the vehicle speed of the host vehicle V1 is the same as the vehicle speed of the other vehicles V3 and V4 in the driving scene shown in FIG. 4A. Then, as in the driving scene shown in FIG. 4A, when it is determined that the positions of the other vehicles V3 and V4 are at the ends, the host vehicle V1 is accelerated or decelerated.
- On the other hand, when it is determined that the positions of the other vehicles V3 and V4 are not at the ends, the vehicle speed of the host vehicle V1 is maintained. This is because the end avoidance control is performed only when there is a possibility that the other vehicle V4 will enter a position in front of the host vehicle V1.
- That is, in a driving scene where the vehicle speed of the other vehicle V4 is faster than that of the host vehicle V1, it can be predicted that the other vehicle V4 will overtake the host vehicle V1 and change lanes to a position ahead of it, but in a driving scene where the vehicle speed of the host vehicle V1 is faster than that of the other vehicle V4, it is difficult to predict that the other vehicle will change lanes to a position ahead of the host vehicle V1.
- FIG. 5 is a plan view showing an example of a driving scene in which edge avoidance control is executed.
- the driving scene shown in FIG. 5 is a driving scene in which the host vehicle V1 and other vehicles V5 and V6 are driving on the road shown in FIG. 2, with the host vehicle V1 driving at position P5 in lane L1 and the other vehicle V5 driving ahead of it at position Q5 in lane L1.
- the other vehicle V6 is driving at position Q6 in lane L3.
- the host vehicle V1 drives at a constant speed using lane keeping control, and the other vehicles V5 and V6 drive straight at the same speed as the host vehicle V1.
- The driving assistance device 19 acquires the driving state of the other vehicle from the driving environment information using the function of the recognition unit 21. At the same time, it uses the function of the determination unit 22 to determine, based on the detection result, whether or not another vehicle is present in the adjacent lane and in the next adjacent lane.
- If there is no other vehicle in the adjacent lane but there is another vehicle in the next adjacent lane, the driving assistance device 19 determines, as part of edge avoidance control, whether or not the position of the other vehicle traveling in the next adjacent lane is at the edge of the detection range or will enter the edge within a predetermined time. On the other hand, if there is another vehicle in the adjacent lane, lane change assistance is not performed regardless of whether there is another vehicle in the next adjacent lane.
- the driving assistance device 19 determines whether or not the position Q6 of the other vehicle V6 traveling in lane L3, which is the next adjacent lane, is at the edge of the detection range or will enter said edge within a predetermined time.
- Specifically, the driving assistance device 19 determines whether or not the position Q6 of the other vehicle V6 will enter any of the edges C1 to C4 within a predetermined time when the host vehicle V1 changes lanes from lane L1 to lane L2 and travels along the driving trajectory T3.
- In the driving scene shown in FIG. 5, the host vehicle V1 would change lanes from lane L1 to lane L2 through autonomous driving control in order to overtake the preceding vehicle V5. However, the driving assistance device 19 determines that the position Q6 of the other vehicle V6 traveling in the next adjacent lane L3 will enter the end C4 within a predetermined time, and therefore does not assist the host vehicle V1 in changing lanes through autonomous driving control. In this case, the lane change is performed by manual driving by the driver.
- Note that withholding lane change assistance for the host vehicle V1 is not the only response in the driving scene shown in FIG. 5. For example, when the host vehicle V1 overtakes a vehicle ahead of it, it is determined whether there are other vehicles in the lane adjacent to the lane in which the host vehicle V1 is traveling and in the next adjacent lane. Then, if it is determined that there is no other vehicle in the adjacent lane but there is another vehicle in the next adjacent lane, it is determined whether the other vehicle traveling in the next adjacent lane is at the edge of the detection range or will enter the edge of the detection range within a predetermined time.
- In this case, the host vehicle V1 attempts to overtake the other vehicle V5, which is a preceding vehicle, so the driving assistance device 19 determines whether there are other vehicles in the lane L2 adjacent to the lane L1 in which the host vehicle V1 is traveling, and in the lane L3 adjacent to lane L2.
- Then, since there is no other vehicle in lane L2 but the other vehicle V6 is traveling in lane L3, it is determined whether the position Q6 of the other vehicle V6 is at end C4 or will enter end C4 within a predetermined time.
- Since the position Q6 of the other vehicle V6 will enter end C4 within the predetermined time, the driving assistance device 19 decelerates the host vehicle V1 by autonomous driving control and drives it along the driving trajectory T4 from position P5 to position P7, as shown in FIG. 6A. Then, in conjunction with the deceleration control, as shown in FIG. 6B, the host vehicle V1 changes lanes from lane L1, which is the host vehicle's lane, to lane L2, which is the adjacent lane.
- The host vehicle V1 drives along the driving trajectory T5 from position P7 of lane L1 to position P8 of lane L2; during the lane change, the position Q6 of the other vehicle V6 driving in lane L3, the next adjacent lane, does not enter any of the ends C1 to C4 of the detection range.
- positions P7 and P8 are relative positions to position P5, and in the driving scene shown in FIG. 6A, the host vehicle V1 decelerates but does not move backward.
- the vehicle speed of the host vehicle V1 may be set so that the host vehicle V1 and the other vehicle V6 run side by side, and the lane change may be performed while the host vehicle V1 and the other vehicle V6 are running side by side. This is because as long as the host vehicle V1 and the other vehicle V6 are running side by side, the position Q6 of the other vehicle V6 will not enter the ends C3 and C4.
- the timing of decelerating the host vehicle V1 by the autonomous driving control and the timing of changing lanes from lane L1 to lane L2 may be such that the lane change assistance is performed after the deceleration control is completed as shown in FIGS. 6A-6B, or the two controls may be performed simultaneously.
- the host vehicle V1 may be changed lanes from lane L1 to lane L2 while decelerating.
- edge avoidance control in each driving scene shown in Figures 4A to 4C, 5, and 6A to 6B is merely an example, and edge avoidance control other than the edge avoidance control described above may be executed in each driving scene.
- it is not essential to execute edge avoidance control for all of the driving scenes shown in Figures 4A to 4C, 5, and 6A to 6B, and the driving assistance device 19 may execute edge avoidance control for some of the driving scenes.
- Fig. 7 is an example of a flowchart showing information processing executed in the driving assistance system 10 of this embodiment. The processing described below is executed at predetermined time intervals by the CPU 191, which is the processor of the driving assistance device 19. Note that the flowchart shown in Fig. 7 is premised on a driving scene in which the host vehicle V1 is driving on a road using lane keeping control.
- step S1 of FIG. 7 the recognition unit 21 detects another vehicle using the imaging device 11.
- step S2 it is determined from the detection result whether or not another vehicle V2 is present to the side of the host vehicle V1. If the other vehicle V2 is not present to the side of the host vehicle V1, the process proceeds to step S7, where the control unit 23 executes normal autonomous driving control and proceeds to step S8. On the other hand, if another vehicle is present to the side of the host vehicle V1, the process proceeds to step S3, where the determination unit 22 determines whether or not the other vehicle is located at the edge of the detection range of the imaging device 11. If it is determined that the other vehicle is located at the edge of the detection range, the process proceeds to step S6. On the other hand, if it is determined that the other vehicle is not located at the edge of the detection range, the process proceeds to step S4.
- step S4 the function of the determination unit 22 predicts the driving state of the host vehicle V1 and the other vehicle after a predetermined time, and in the following step S5, it is determined whether the position of the other vehicle will enter the edge of the detection range of the imaging device 11 within the predetermined time. If it is determined that the position of the other vehicle will not enter the edge of the detection range within the predetermined time, the process proceeds to step S7, where normal autonomous driving control is executed, and the process proceeds to step S8.
- step S6 the function of the control unit 23 autonomously controls the driving of the host vehicle V1 so that the position of the other vehicle is not included in the edge of the detection range. Then, the process proceeds to step S8.
- step S8 the support unit 20 determines whether the vehicle V1 has reached the destination. If it is determined that the vehicle V1 has reached the destination, the routine ends and the display device 18 is used to prompt the driver of the vehicle V1 to drive the vehicle manually. In contrast, if it is determined that the vehicle V1 has not reached the destination, the process proceeds to step S1.
- manual driving refers to the driving support device 19 not performing autonomous driving control of the driving operation, but rather controlling the vehicle's driving through the driver's operation.
- step S11 of FIG. 8 the vehicle speed of the host vehicle V1 is compared with the vehicle speed of the other vehicle, and in the subsequent step S12, it is determined whether the vehicle speed of the host vehicle V1 is faster than the vehicle speed of the other vehicle. If the vehicle speed of the host vehicle V1 is faster than the vehicle speed of the other vehicle, the process proceeds to step S13, where the vehicle speed control device 171 is used to accelerate the host vehicle V1 or maintain the vehicle speed of the host vehicle V1.
- step S14 it is determined whether the vehicle speed of the host vehicle V1 is the same as the vehicle speed of the other vehicle. If it is not the same (that is, the host vehicle V1 is slower than the other vehicle), the process proceeds to step S15; if it is the same, the process proceeds to step S16.
- step S15 the host vehicle V1 is decelerated using the vehicle speed control device 171.
- step S16 it is determined whether the position of the other vehicle is at the end of the detection range of the imaging device 11. If it is determined that the position of the other vehicle is at the end of the detection range, the process proceeds to step S17, where the host vehicle V1 is accelerated or decelerated using the vehicle speed control device 171.
- step S18 the vehicle speed of the host vehicle is maintained. Note that steps S16 and S18 are not essential steps and may be provided as necessary.
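Steps S11 to S18 of FIG. 8 reduce to a small decision function; the returned action labels are illustrative, and a real implementation would command the vehicle speed control device 171 instead of returning strings.

```python
def fig8_speed_action(v_host, v_other, other_at_edge):
    """Sketch of the FIG. 8 branch (steps S11-S18): choose the speed
    action for the host vehicle V1."""
    if v_host > v_other:                 # S12 yes -> S13
        return "accelerate_or_maintain"
    if v_host != v_other:                # S14 no (host slower) -> S15
        return "decelerate"
    if other_at_edge:                    # S16 yes -> S17
        return "accelerate_or_decelerate"
    return "maintain"                    # S16 no -> S18

print(fig8_speed_action(18.0, 20.0, other_at_edge=False))  # -> decelerate
```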
- step S21 of FIG. 9 it is determined whether the vehicle V1 will change lanes to the adjacent lane. If it is determined that the vehicle V1 will not change lanes to the adjacent lane, the process proceeds to step S7 of FIG. 7, where normal autonomous driving control is executed. On the other hand, if it is determined that the vehicle V1 will change lanes to the adjacent lane, the process proceeds to step S22, where the recognition unit 21 uses the imaging device 11 to detect other vehicles traveling in the adjacent lane and the adjacent lane next to it.
- step S23 it is determined whether there are other vehicles traveling in the adjacent lane, and if it is determined that there are other vehicles traveling in the adjacent lane, the process proceeds to step S24, where the lane change to the adjacent lane is not executed, and the control unit 23 performs autonomous control to maintain lane keeping control, for example. On the other hand, if it is determined that there are no other vehicles traveling in the adjacent lane, the process proceeds to step S25.
- step S25 it is determined whether or not there is another vehicle traveling in the next adjacent lane. If it is determined that there is no other vehicle traveling in the next adjacent lane, the process proceeds to step S27, where the function of the control unit 23 executes a lane change to the adjacent lane. On the other hand, if it is determined that there is another vehicle traveling in the next adjacent lane, the process proceeds to step S26, where it is determined whether or not the position of that other vehicle will enter the edge of the detection range of the imaging device 11 within a predetermined time.
- If it is determined in step S26 that the position of the other vehicle traveling in the next adjacent lane will enter the edge of the detection range within the predetermined time, the process proceeds to step S24; if it is determined that it will not, the process proceeds to step S27.
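The FIG. 9 branch (steps S21 to S27) can likewise be sketched as one decision function; the boolean inputs and the action labels are assumptions for illustration.

```python
def fig9_lane_change_decision(wants_lane_change, vehicle_in_adjacent,
                              vehicle_in_next_adjacent, next_enters_edge_soon):
    """Sketch of FIG. 9 (steps S21-S27): decide whether the control
    unit 23 executes the lane change, keeps the lane, or falls back to
    normal autonomous control."""
    if not wants_lane_change:            # S21 no -> S7 of FIG. 7
        return "normal_control"
    if vehicle_in_adjacent:              # S23 yes -> S24
        return "keep_lane"
    if not vehicle_in_next_adjacent:     # S25 no -> S27
        return "change_lane"
    if next_enters_edge_soon:            # S26 yes -> S24
        return "keep_lane"
    return "change_lane"                 # S26 no -> S27

# Adjacent lane clear, but the next-adjacent vehicle would enter the edge:
print(fig9_lane_change_decision(True, False, True, True))  # -> keep_lane
```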
- step S31 of FIG. 10 the function of the recognition unit 21 detects a preceding vehicle using the imaging device 11, and in the following step S32, the function of the determination unit 22 determines whether the host vehicle V1 will overtake the preceding vehicle. If it is determined that the host vehicle V1 will not overtake the preceding vehicle, the process proceeds to step S7 of FIG. 7, where normal autonomous driving control is executed. On the other hand, if it is determined that the host vehicle V1 will overtake the preceding vehicle, the process proceeds to step S33, where the function of the recognition unit 21 detects other vehicles traveling in the adjacent lane and the adjacent lane next to it using the imaging device 11.
- step S34 it is determined whether there is another vehicle traveling in the adjacent lane, and if it is determined that there is another vehicle traveling in the adjacent lane, the process proceeds to step S35, where overtaking of the preceding vehicle is not executed, and the function of the control unit 23 is used to make the preceding vehicle follow, for example, by following control. On the other hand, if it is determined that there is no other vehicle traveling in the adjacent lane, the process proceeds to step S36.
- step S36 it is determined whether or not there is another vehicle traveling in the next adjacent lane. If it is determined that there is no other vehicle traveling in the next adjacent lane, the process proceeds to step S39, where the function of the control unit 23 executes overtaking of the preceding vehicle. On the other hand, if it is determined that there is another vehicle traveling in the next adjacent lane, the process proceeds to step S37, where it is determined whether or not the position of that other vehicle will enter the edge of the detection range of the imaging device 11 within a predetermined time.
- step S38 the vehicle speed control device 171 is used to decelerate the host vehicle V1, and then the process proceeds to step S39, where the preceding vehicle is overtaken.
- the driving assistance device 19 and driving assistance method according to the present invention can be used in any of the following cases: autonomous control of only the vehicle's driving speed, autonomous control of only the vehicle's steering operation, and autonomous control of both the vehicle's driving speed and steering operation. Furthermore, the driving assistance device 19 and driving assistance method according to the present invention can be used not only for autonomous driving control, but also to assist the driver in manual driving.
- a driving assistance method in which the processor detects another vehicle traveling beside the vehicle V1 using the imaging device 11, determines whether the position of the other vehicle is at the edge of the detection range of the imaging device 11 or will enter the edge within a predetermined time, and, if it is determined that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, autonomously controls the traveling of the vehicle V1 so that the position of the other vehicle is not included in the edge.
- This embodiment is referred to as embodiment (1). This makes it possible to suppress the influence of erroneous recognition of the traveling state of the other vehicle on the traveling state of the vehicle V1.
- the processor predicts the course of the other vehicle based on the traveling state of the other vehicle, and determines whether the position of the other vehicle will enter the edge within the predetermined time based on the course of the other vehicle. If it is determined that the position of the other vehicle will enter the edge within the predetermined time, the processor may autonomously control the traveling of the host vehicle V1 so that the speed difference between the host vehicle V1 and the other vehicle increases, or may set the course of the host vehicle V1 based on the course of the other vehicle.
- This embodiment is referred to as embodiment (2). Thereby, the edge avoidance control to be executed can be set in advance and executed smoothly.
- When the processor determines that the position of the other vehicle traveling behind the host vehicle V1, among the other vehicles traveling to the side of the host vehicle V1, is at the edge or will enter the edge within the predetermined time, it may autonomously control the traveling of the host vehicle V1 so that the position of that other vehicle is not included in the edge behind the host vehicle V1.
- This embodiment is referred to as embodiment (3). Thereby, edge avoidance control can be executed even when an obstacle present behind the host vehicle V1 is detected by an imaging device 11 with a short focal length.
- when the processor determines that the other vehicle is located at the edge or will enter the edge within the predetermined time, it compares the vehicle speed of the host vehicle V1 with that of the other vehicle; when the host vehicle V1 is faster than the other vehicle, it may accelerate the host vehicle V1; when the host vehicle V1 is slower, it may decelerate the host vehicle V1; and when the two vehicle speeds are the same, it may either accelerate or decelerate the host vehicle V1.
- This embodiment is referred to as embodiment (4). Thereby, edge avoidance control according to the driving scene can be executed.
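The decision rules of embodiment (4), which widen the speed difference so that the other vehicle leaves the edge, can be sketched as a small dispatch function. The names and return values here are hypothetical illustrations, not the publication's interface.

```python
def edge_avoidance_action(host_speed, other_speed):
    """Choose a speed adjustment following the rules of embodiment (4).

    Returns "accelerate", "decelerate", or, for equal speeds (where the
    publication allows either choice), "accelerate_or_decelerate".
    """
    if host_speed > other_speed:
        # Host is already pulling ahead: pull further ahead so the other
        # vehicle drops out of the edge region toward the rear.
        return "accelerate"
    if host_speed < other_speed:
        # Host is falling behind: drop further back so the other vehicle
        # passes out of the edge region toward the front.
        return "decelerate"
    # Equal speeds: either adjustment breaks the standoff.
    return "accelerate_or_decelerate"
```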
- when the processor determines that the other vehicle is at the edge or will enter the edge within the predetermined time, it compares the vehicle speed of the host vehicle V1 with that of the other vehicle; when the host vehicle V1 is faster than the other vehicle, it maintains the vehicle speed of the host vehicle V1; when the host vehicle V1 is slower, it decelerates the host vehicle V1; and when the two vehicle speeds are the same, it determines whether the other vehicle is currently at the edge, accelerating or decelerating the host vehicle V1 if so, and otherwise it may maintain the vehicle speed of the host vehicle V1.
- This embodiment is called embodiment (5). Thereby, edge avoidance control according to the driving scene can be executed.
- the processor determines from the detection result whether another vehicle is present in the lane adjacent to the host vehicle's own lane and in the next adjacent lane beyond it; if it determines that no other vehicle is present in the adjacent lane but another vehicle is present in the next adjacent lane, it determines whether the other vehicle traveling in the next adjacent lane is at the edge or will enter the edge within the predetermined time, and if it determines that this is the case, lane change assistance by autonomous driving control need not be performed.
- This embodiment is referred to as embodiment (6). This makes it possible to avoid a driving scene in which edge avoidance control is performed during lane change.
- the processor determines from the detection result whether another vehicle is present in the lane adjacent to the own lane in which the host vehicle V1 is traveling and in the next adjacent lane beyond it; if it determines that no other vehicle is present in the adjacent lane but another vehicle is present in the next adjacent lane, the processor determines whether the other vehicle traveling in the next adjacent lane is at the edge or will enter the edge within the predetermined time.
- the host vehicle V1 may change from its own lane to the adjacent lane under autonomous driving control in order to overtake a preceding vehicle, and when it is determined that the position of the other vehicle traveling in the next adjacent lane is at the edge or will enter the edge within the predetermined time, the host vehicle V1 may be decelerated and then moved from its own lane to the adjacent lane under autonomous driving control.
- This embodiment is referred to as embodiment (7). This makes it possible to overtake a preceding vehicle while avoiding a driving scene in which edge avoidance control is performed during lane change.
- a driving assistance device 19 includes a recognition unit 21 that detects other vehicles traveling beside the host vehicle V1 using an imaging device 11, a determination unit 22 that determines whether the position of the other vehicle is at the edge of the detection range of the imaging device 11 or will enter the edge within a predetermined time, and a control unit 23 that autonomously controls the traveling of the host vehicle V1 so that the position of the other vehicle is not included in the edge when the determination unit 22 determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time.
- This embodiment is referred to as embodiment (8). This makes it possible to suppress the effect of erroneous recognition of the traveling state of the other vehicle on the traveling state of the host vehicle V1.
- the position of the other vehicle may be taken to be the vehicle as a whole, or a specific position along the overall length (longitudinal direction) of the other vehicle (for example, the center of the vehicle or a specific part).
- the same reference may always be used for the position of the other vehicle, or the reference may be changed for each situation (driving scene).
- specific parts such as headlights, taillights, and the front and/or rear bumpers of the other vehicle may be used as the reference.
- for example, a specific part at the front of the other vehicle (the front of the vehicle body) or a specific part at the rear of the other vehicle (the rear of the vehicle body) may be used as the reference.
- the embodiment (1) may be combined with the embodiment (2), the embodiment (1) may be combined with the embodiment (3), or the embodiment (1) may be combined with the embodiments (2) and (3). Also, according to the driving assistance method of the present embodiment, the embodiment (1) may be combined with the embodiment (4), the embodiment (1) may be combined with the embodiments (2) and (4), the embodiment (1) may be combined with the embodiments (3) and (4), or the embodiment (1) may be combined with the embodiments (2), (3), and (4).
- embodiment (8) may be combined with embodiment (2), embodiment (8) may be combined with embodiment (3), or embodiment (8) may be combined with embodiments (2) and (3).
- embodiment (8) may be combined with embodiment (4), embodiment (8) may be combined with embodiments (2) and (4), embodiment (8) may be combined with embodiments (3) and (4), or embodiment (8) may be combined with embodiments (2), (3), and (4).
- embodiment (1) may be combined with embodiment (5), embodiment (1) may be combined with embodiments (2) and (5), embodiment (1) may be combined with embodiments (3) and (5), or embodiment (1) may be combined with embodiments (2), (3), and (5).
- embodiment (8) may be combined with embodiment (5), embodiment (8) may be combined with embodiments (2) and (5), embodiment (8) may be combined with embodiments (3) and (5), or embodiment (8) may be combined with embodiments (2), (3), and (5).
- embodiment (1) may be combined with embodiment (6), embodiment (1) may be combined with embodiments (2) and (6), embodiment (1) may be combined with embodiments (3) and (6), embodiment (1) may be combined with embodiments (4) and (6), or embodiment (1) may be combined with embodiments (5) and (6).
- embodiment (1) may be combined with embodiments (2), (3), and (6), embodiment (1) may be combined with embodiments (2), (4), and (6), embodiment (1) may be combined with embodiments (2), (5), and (6), embodiment (1) may be combined with embodiments (3), (4), and (6), or embodiment (1) may be combined with embodiments (3), (5), and (6).
- embodiment (1) may be combined with embodiments (2), (3), (4), and (6), embodiment (1) may be combined with embodiments (2), (3), (5), and (6), embodiment (1) may be combined with embodiments (3), (4), (5), and (6), or embodiment (1) may be combined with embodiments (2) to (6).
- embodiment (8) may be combined with embodiment (6), embodiment (8) may be combined with embodiments (2) and (6), embodiment (8) may be combined with embodiments (3) and (6), embodiment (8) may be combined with embodiments (4) and (6), or embodiment (8) may be combined with embodiments (5) and (6).
- embodiment (8) may be combined with embodiments (2), (3), and (6), embodiment (8) may be combined with embodiments (2), (4), and (6), embodiment (8) may be combined with embodiments (2), (5), and (6), embodiment (8) may be combined with embodiments (3), (4), and (6), or embodiment (8) may be combined with embodiments (3), (5), and (6).
- embodiment (8) may be combined with embodiments (2), (3), (4), and (6), embodiment (8) may be combined with embodiments (2), (3), (5), and (6), embodiment (8) may be combined with embodiments (3), (4), (5), and (6), or embodiment (8) may be combined with embodiments (2) to (6).
- embodiment (1) may be combined with embodiment (7), embodiment (1) may be combined with embodiments (2) and (7), embodiment (1) may be combined with embodiments (3) and (7), embodiment (1) may be combined with embodiments (4) and (7), or embodiment (1) may be combined with embodiments (5) and (7).
- embodiment (1) may be combined with embodiments (2), (3), and (7), embodiment (1) may be combined with embodiments (2), (4), and (7), embodiment (1) may be combined with embodiments (2), (5), and (7), embodiment (1) may be combined with embodiments (3), (4), and (7), or embodiment (1) may be combined with embodiments (3), (5), and (7).
- embodiment (1) may be combined with embodiments (2), (3), (4), and (7), embodiment (1) may be combined with embodiments (2), (3), (5), and (7), embodiment (1) may be combined with embodiments (3), (4), (5), and (7), or embodiment (1) may be combined with embodiments (2) to (5) and (7).
- embodiment (8) may be combined with embodiment (7), embodiment (8) may be combined with embodiments (2) and (7), embodiment (8) may be combined with embodiments (3) and (7), embodiment (8) may be combined with embodiments (4) and (7), or embodiment (8) may be combined with embodiments (5) and (7).
- embodiment (8) may be combined with embodiments (2), (3), and (7), embodiment (8) may be combined with embodiments (2), (4), and (7), embodiment (8) may be combined with embodiments (2), (5), and (7), embodiment (8) may be combined with embodiments (3), (4), and (7), or embodiment (8) may be combined with embodiments (3), (5), and (7).
- embodiment (8) may be combined with embodiments (2), (3), (4), and (7), embodiment (8) may be combined with embodiments (2), (3), (5), and (7), embodiment (8) may be combined with embodiments (3), (4), (5), and (7), or embodiment (8) may be combined with embodiments (2) to (5) and (7).
- 10: Driving assistance system; 11: Imaging device; 12: Distance measuring device; 13: Vehicle state detection device; 14: Map information; 15: Vehicle position detection device; 16: Navigation device; 17: Vehicle control device; 171: Vehicle speed control device; 172: Steering control device; 18: Display device; 19: Driving assistance device; 191: CPU (processor); 192: ROM; 193: RAM; 20: Support unit; 21: Recognition unit; 22: Determination unit; 23: Control unit; A1, A2, B1, B2, B3, B4: Detection ranges; C1, C2, C3, C4: Edges; L1, L2, L3: Lanes; P1 to P8: Positions (host vehicle); Q1 to Q6: Positions (other vehicles); T1 to T5: Travel trajectories (host vehicle); U1, Ux, Uy: Travel trajectories (other vehicles); V1: Host vehicle; V2 to V6: Other vehicles
Description
The present invention relates to a driving assistance method and a driving assistance device.
A vehicle surround-view system is known that generates a three-dimensional model of the vehicle's surroundings from surrounding-environment data and maps visual data onto each part of the model, thereby reducing distortion in the resulting virtual surround view (Patent Document 1).
Mapping the visual data onto a three-dimensional model places a heavy load on the processing device. Consequently, in driving scenes where the situation around the host vehicle changes from moment to moment, the conventional technology described above cannot adequately reduce distortion in the virtual surround view and cannot accurately recognize the traveling state of other vehicles around the host vehicle. As a result, unnecessary avoidance maneuvers are performed on the basis of the misrecognized traveling state of another vehicle, disturbing the behavior of the host vehicle.
The problem to be solved by the present invention is to provide a driving assistance method and a driving assistance device that can suppress the influence of erroneous recognition of the traveling state of another vehicle on the traveling state of the host vehicle.
The present invention solves the above problem as follows: when another vehicle traveling beside the host vehicle is detected and it is determined that the other vehicle is at the edge of the detection range of the imaging device or will enter the edge within a predetermined time, the traveling of the host vehicle is autonomously controlled so that the position of the other vehicle is not included in the edge of the detection range.
According to the present invention, the influence of erroneous recognition of the traveling state of another vehicle on the traveling state of the host vehicle can be suppressed.
Embodiments of the present invention are described below with reference to the drawings. The following description assumes that vehicles drive on the left, as in countries with left-hand-traffic laws. In countries with right-hand-traffic laws, vehicles drive on the right, so "right" and "left" in the following description should be read as interchanged.
[Configuration of the driving assistance system]
FIG. 1 is a block diagram showing a driving assistance system 10 according to the present invention. The driving assistance system 10 is an in-vehicle system that drives the vehicle, under autonomous driving control, to a destination set by a vehicle occupant (including the driver). Autonomous driving control means autonomously controlling the vehicle's driving operations using the driving assistance device described later; these operations include every kind of maneuver, such as acceleration, deceleration, starting, stopping, steering to the right or left, lane changing, and pulling over. Controlling driving operations autonomously means that the driving assistance device performs the control using the vehicle's equipment. The driving assistance device controls these driving operations within a predetermined range; operations it does not control are performed manually by the driver.
As shown in FIG. 1, the driving assistance system 10 comprises an imaging device 11, a distance measuring device 12, a vehicle state detection device 13, map information 14, a vehicle position detection device 15, a navigation device 16, a vehicle control device 17, a display device 18, and a driving assistance device 19. The devices constituting the driving assistance system 10 are connected by a CAN (Controller Area Network) or other in-vehicle LAN and can exchange information with one another.
The imaging device 11 is a device that recognizes objects around the vehicle from images, for example a camera with an imaging element such as a CCD, an ultrasonic camera, or an infrared camera. Multiple imaging devices 11 can be installed on a single vehicle, for example in the front grille, under the left and right door mirrors, and near the rear bumper. This reduces blind spots when recognizing objects around the vehicle.
The distance measuring device 12 is a device for computing the relative distance and relative speed between the vehicle and an object, for example a radar device or sonar such as a laser radar, a millimeter-wave radar (an LRF or the like), a LiDAR (light detection and ranging) unit, or an ultrasonic radar. Multiple distance measuring devices 12 can be installed on a single vehicle, for example at the front, right side, left side, and rear. This allows the relative distance and relative speed to surrounding objects to be computed accurately.
Objects detected by the imaging device 11 and the distance measuring device 12 include lane boundary lines, center lines, road surface markings, medians, guardrails, curbs, highway side walls, road signs, traffic lights, crosswalks, construction sites, accident sites, and traffic restrictions. Objects also include obstacles that may affect the travel of the vehicle, such as automobiles other than the host vehicle (other vehicles), motorcycles, bicycles, and pedestrians. The detection results of the imaging device 11 and the distance measuring device 12 are acquired by the driving assistance device 19 at predetermined time intervals as necessary; an appropriate interval can be set according to the processing capacity of the driving assistance device 19.
The detection results of the imaging device 11 and the distance measuring device 12 can be integrated or combined by the driving assistance device 19 (so-called sensor fusion), which makes it possible to supplement missing information about detected objects. For example, the driving assistance device 19 can compute an object's position from the self-position information acquired by the vehicle position detection device 15, that is, the position at which the vehicle is traveling, and the relative position (distance and direction) between the vehicle and the object. The computed object position is then integrated by the driving assistance device 19 with other information, such as the detection results of the imaging device 11 and the distance measuring device 12 and the map information 14, to form driving environment information for the vehicle's surroundings. The detection results and the map information 14 can also be used to recognize objects around the vehicle and predict their movements.
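The position computation described above, combining the host's self-position with a relative distance and direction, amounts to a frame transformation. The following is a minimal planar sketch with hypothetical names and conventions (bearing measured from the vehicle's heading, positive counter-clockwise); the publication does not specify the actual coordinate handling.

```python
import math

def object_global_position(ego_x, ego_y, ego_heading_rad,
                           rel_distance, rel_bearing_rad):
    """Fuse the host vehicle's own position (e.g. from the vehicle
    position detection device) with a relative measurement (distance
    and bearing from a ranging sensor) to obtain the object's position
    in the map frame.
    """
    # Absolute direction from the host to the object.
    angle = ego_heading_rad + rel_bearing_rad
    obj_x = ego_x + rel_distance * math.cos(angle)
    obj_y = ego_y + rel_distance * math.sin(angle)
    return obj_x, obj_y
```

For instance, an object measured 10 m dead ahead of a host heading along the x-axis at the origin is placed at (10, 0) in the map frame.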
The vehicle state detection device 13 is a device for detecting the traveling state of the vehicle; examples include a vehicle speed sensor, an acceleration sensor, a yaw rate sensor (for example a gyro sensor), a steering angle sensor, and an inertial measurement unit. These devices are not particularly limited, and known devices can be used. Their placement and number can be set appropriately as long as the traveling state of the vehicle can be detected properly. The detection results of each device are acquired by the driving assistance device 19 at predetermined time intervals as necessary.
The map information 14 is information used for generating travel routes, controlling driving operations, and so on, and includes road information, facility information, and their attribute information. The road information and road attribute information include the road width, the curvature radius, shoulder structures, road traffic regulations (speed limits, whether lane changes are permitted), merge and branch points, and locations where the number of lanes increases or decreases. The map information 14 is high-definition map information from which the travel path of each lane can be determined, and includes two-dimensional and/or three-dimensional position information at each map coordinate, road and lane boundary information at each map coordinate, road attribute information, lane up/down information, lane identification information, and connecting-lane information. Such a high-precision map is also called an HD (High-Definition) map.
The road and lane boundary information in the high-definition map information indicates the boundary between the path on which the vehicle travels and everything else. The path on which the vehicle travels is the roadway provided for the vehicle, and its form is not particularly limited. Boundaries exist on both the left and right of the vehicle's direction of travel, and their form is likewise not limited. A boundary is, for example, a road surface marking or a road structure: road surface markings include lane boundary lines and center lines, while road structures include medians, guardrails, curbs, tunnels, and highway side walls. At points where the boundary cannot be clearly identified, such as inside an intersection, a boundary is set on the path in advance; this boundary is imaginary and is not an actual road surface marking or road structure.
The map information 14 is stored in a readable state on a recording medium provided in the driving assistance device 19, in an in-vehicle device, or on a server on a network. The driving assistance device 19 acquires the map information 14 as necessary.
The vehicle position detection device 15 is a positioning system for detecting the current position of the vehicle; it is not particularly limited, and a known system can be used. For example, the vehicle position detection device 15 computes the current position of the vehicle from radio waves received from GPS (Global Positioning System) satellites. Alternatively, it may estimate the current position from the vehicle speed and acceleration information acquired from the vehicle speed sensor, acceleration sensor, and gyro sensor of the vehicle state detection device 13, and compute the current position by matching the estimate against the map information 14.
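The speed-and-gyro-based position estimate mentioned above is classic dead reckoning. The sketch below shows one Euler-integration step under assumed units (metres, radians, seconds); the function name and signature are hypothetical, and the result would then be matched against the map information to refine the position, as described.

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt):
    """Advance the host vehicle's pose by one time step using
    vehicle-speed and yaw-rate measurements (simple Euler
    dead reckoning)."""
    # Move along the current heading at the measured speed.
    x += speed_mps * math.cos(heading_rad) * dt
    y += speed_mps * math.sin(heading_rad) * dt
    # Rotate the heading by the measured yaw rate.
    heading_rad += yaw_rate_rps * dt
    return x, y, heading_rad
```

Calling this repeatedly at the sensor sampling interval accumulates a position estimate between GPS fixes; the estimate drifts over time, which is why map matching is used to correct it.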
The navigation device 16 refers to the map information 14 and computes a travel route from the current position of the vehicle, detected by the vehicle position detection device 15, to a destination set by an occupant (including the driver). The navigation device 16 uses the road information and facility information in the map information 14 to search for a travel route by which the vehicle can reach the destination from its current position. The travel route includes at least the road on which the vehicle travels, the travel lane, and the travel direction, and is displayed, for example, as a line. Depending on the search conditions, multiple travel routes may exist. The travel route computed by the navigation device 16 is output to the driving assistance device 19.
The vehicle control device 17 is an on-board computer, such as an electronic control unit (ECU), that electronically controls the on-board equipment governing the travel of the vehicle. The vehicle control device 17 includes a vehicle speed control device 171, which controls the speed of the vehicle, and a steering control device 172, which controls its steering. The vehicle speed control device 171 and the steering control device 172 autonomously control the operation of the drive devices and the steering device in response to control signals input from the driving assistance device 19, allowing the vehicle to travel autonomously along the set travel route. Information required for this autonomous control, such as the vehicle's speed, acceleration, steering angle, and attitude, is acquired from the vehicle state detection device 13.
The drive devices controlled by the vehicle speed control device 171 include the travel drive sources, namely an electric motor and/or an internal combustion engine; a power transmission device, including the drive shaft and automatic transmission, that transmits the output of these drive sources to the drive wheels; and the actuators that control the power transmission device. The braking device controlled by the vehicle speed control device 171 is, for example, a device that brakes the wheels. A control signal corresponding to the set vehicle speed is input to the vehicle speed control device 171 from the driving assistance device 19; based on this signal, the vehicle speed control device 171 generates signals for controlling the drive devices and transmits them, thereby autonomously controlling the vehicle speed.
The steering device controlled by the steering control device 172 controls the steered wheels in accordance with the rotation angle of the steering wheel; an example is a steering actuator, such as a motor attached to the steering column shaft. On the basis of control signals input from the driving assistance device 19, the steering control device 172 autonomously controls the operation of the steering device so that the vehicle travels while maintaining a predetermined lateral position (the vehicle's position in the left-right direction) with respect to the set travel route. This control uses at least one of the detection results of the imaging device 11 and the distance measuring device 12, the traveling state of the vehicle acquired by the vehicle state detection device 13, the map information 14, and the current position of the vehicle acquired by the vehicle position detection device 15.
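One simple way to maintain a lateral position with respect to the travel route is a proportional steering law on the lateral and heading errors. The publication does not specify the control law used by the steering control device 172, so the gains, names, and saturation limit below are purely illustrative assumptions.

```python
def steering_command(lateral_error_m, heading_error_rad,
                     k_lat=0.1, k_head=0.8, max_angle_rad=0.5):
    """Compute a steering angle that drives the vehicle back toward the
    target lateral position on the planned route.

    lateral_error_m   -- signed offset from the target line (positive = left)
    heading_error_rad -- signed heading deviation from the route direction
    """
    # Steer against both the lateral offset and the heading error.
    cmd = -(k_lat * lateral_error_m + k_head * heading_error_rad)
    # Saturate at the actuator's mechanical limit.
    return max(-max_angle_rad, min(max_angle_rad, cmd))
```

With a 1 m offset to the left and no heading error, the command is a small rightward steer; very large errors are clipped to the actuator limit rather than commanded directly.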
The display device 18 is a device for providing necessary information to the vehicle occupants, for example a liquid crystal display provided in the instrument panel or a projector such as a head-up display (HUD). The display device 18 may include an input device through which the occupants enter instructions to the driving assistance device 19, such as a touch panel operated by finger or stylus pen, a microphone that captures spoken instructions, or switches mounted on the steering wheel. The display device 18 may also include a speaker as an output device.
The driving assistance device 19 controls and coordinates the devices constituting the driving assistance system 10 to control the travel of the vehicle and drive it to a set destination; the destination is set, for example, by a vehicle occupant. The driving assistance device 19 is, for example, a computer comprising a CPU (Central Processing Unit) 191 as a processor, a ROM (Read Only Memory) 192 in which programs are stored, and a RAM (Random Access Memory) 193 that functions as an accessible storage device. The CPU 191 is an operating circuit that executes the programs stored in the ROM 192 to realize the functions of the driving assistance device 19.
運転支援装置19は、自律走行制御により、設定された目的地まで車両を走行させる運転支援機能を有する。運転支援装置19は、運転支援機能として、走行経路を生成する経路生成機能と、車両の周囲の走行環境を認識する環境認識機能と、認識した走行環境に基づいて自律走行制御の実行に必要な判定を行う判定機能と、走行軌跡を生成し、走行軌跡に沿って車両を走行させる走行制御機能とを有する。ROM192に格納されたプログラムはこれらの機能を実現するためのプログラムを備え、CPU191がROM192に格納されたプログラムを実行することで、これらの機能が実現される。図1には、各機能を実現する機能ブロックを便宜的に抽出して示す。 The driving assistance device 19 has a driving assistance function of driving the vehicle to a set destination by autonomous driving control. The driving assistance device 19 has, as driving assistance functions, a route generation function of generating a driving route, an environment recognition function of recognizing the driving environment around the vehicle, a determination function of making a determination necessary for executing autonomous driving control based on the recognized driving environment, and a driving control function of generating a driving trajectory and driving the vehicle along the driving trajectory. The programs stored in ROM 192 include programs for realizing these functions, and these functions are realized by CPU 191 executing the programs stored in ROM 192. Figure 1 shows functional blocks that realize each function extracted for convenience.
[各機能ブロックの機能]
以下、図1に示す支援部20、認識部21、判定部22及び制御部23の各機能ブロックが有する機能について説明する。
[Functions of each functional block]
The functions of each of the functional blocks, i.e., the support unit 20, the recognition unit 21, the determination unit 22, and the control unit 23 shown in FIG. 1, will be described below.
支援部20は、自律走行制御により、設定された目的地まで車両を走行させる運転支援機能を有する。図2は、運転支援装置19が運転支援機能により車両の走行を自律制御する走行シーンの一例を示す平面図である。図2に示す走行シーンでは、3車線の道路が図面の上下方向に延在し、車両は当該道路を図面の下側から上側に向かって走行するものとする。図2に示すように、走行方向左側の車線から順に、各車線を車線L1,L2,L3とする。図2に示す走行シーンでは、自車両V1は、車線L2の位置P1を走行しており、自車両V1の乗員により設定された前方の目的地(図示しない)に向かって直進するものとする。 The support unit 20 has a driving support function that drives the vehicle to a set destination by autonomous driving control. FIG. 2 is a plan view showing an example of a driving scene in which the driving support device 19 autonomously controls the driving of the vehicle by the driving support function. In the driving scene shown in FIG. 2, a three-lane road extends in the vertical direction of the drawing, and the vehicle drives on the road from the bottom to the top of the drawing. As shown in FIG. 2, the lanes are designated as lanes L1, L2, and L3 in order from the lane on the left side of the driving direction. In the driving scene shown in FIG. 2, the host vehicle V1 is driving at position P1 on lane L2, and is driving straight toward a destination (not shown) ahead that has been set by the occupant of the host vehicle V1.
認識部21は、車両の周囲の走行環境を認識する環境認識機能を有する。運転支援装置19は、認識部21の環境認識機能により、撮像装置11及び測距装置12を用いて、車両の周囲の走行環境を認識する。走行環境とは、車両が、現在の走行状態を維持できるか、走行状態を変更する必要があるかを判定するための情報であり、たとえば、対象物の種類及び位置、障害物が存在する場合はその種類及び位置、路面状況などの道路状況、天気などの情報が含まれる。運転支援装置19は、撮像装置11及び測距装置12の検出結果に対し、パターンマッチング、センサフュージョンなどの適宜の処理を行い、走行環境を認識する。 The recognition unit 21 has an environmental recognition function that recognizes the driving environment around the vehicle. The driving assistance device 19 recognizes the driving environment around the vehicle using the imaging device 11 and distance measuring device 12 through the environmental recognition function of the recognition unit 21. The driving environment is information for determining whether the vehicle can maintain its current driving state or needs to change its driving state, and includes information such as the type and position of an object, the type and position of an obstacle if one exists, road conditions such as road surface conditions, and weather. The driving assistance device 19 recognizes the driving environment by performing appropriate processing such as pattern matching and sensor fusion on the detection results of the imaging device 11 and distance measuring device 12.
撮像装置11は、複数の撮像装置11から構成される。たとえば、図2に示す自車両V1は、自車両V1の前方の検出範囲A1に存在する障害物を検出する前方カメラと、自車両V1の後方の検出範囲A2に存在する障害物を検出する後方カメラとを備える。これに加え、自車両V1は、自車両V1の前方の検出範囲B1に存在する障害物を検出する前方広角カメラと、自車両V1の後方の検出範囲B2に存在する障害物を検出する後方広角カメラと、自車両V1の左側方の検出範囲B3に存在する障害物を検出する左側方広角カメラと、自車両V1の右側方の検出範囲B4に存在する障害物を検出する右側方広角カメラとを備える。 The imaging device 11 is composed of multiple imaging devices 11. For example, the host vehicle V1 shown in FIG. 2 is equipped with a front camera that detects obstacles present within a detection range A1 in front of the host vehicle V1, and a rear camera that detects obstacles present within a detection range A2 behind the host vehicle V1. In addition, the host vehicle V1 is equipped with a front wide-angle camera that detects obstacles present within a detection range B1 in front of the host vehicle V1, a rear wide-angle camera that detects obstacles present within a detection range B2 behind the host vehicle V1, a left side wide-angle camera that detects obstacles present within a detection range B3 to the left of the host vehicle V1, and a right side wide-angle camera that detects obstacles present within a detection range B4 to the right of the host vehicle V1.
広角カメラは広角レンズを備えるため、通常のカメラより画角が広く、焦点距離が短い。そのため、前方広角カメラの検出範囲B1は、前方カメラの検出範囲A1より車線L2に沿う方向の距離が短く、車線L2の幅方向の画角が広い。同様に、後方広角カメラの検出範囲B2は、後方カメラの検出範囲A2より車線L2に沿う方向の距離が短く、車線L2の幅方向の画角が広い。なお、画角とは、撮像装置11の撮影可能な範囲(つまり検出範囲)を角度で表したものであり、たとえば、撮像装置11を中心とする角度により、当該撮像装置11の水平方向の検出範囲を示す。 Wide-angle cameras have a wide-angle lens, and therefore a wider angle of view and a shorter focal length than normal cameras. Therefore, the detection range B1 of the front wide-angle camera is shorter in distance along the lane L2 than the detection range A1 of the front camera, and has a wider angle of view in the width direction of lane L2. Similarly, the detection range B2 of the rear wide-angle camera is shorter in distance along the lane L2 than the detection range A2 of the rear camera, and has a wider angle of view in the width direction of lane L2. The angle of view is the range that can be photographed by the imaging device 11 (i.e., the detection range) expressed in degrees, and for example, indicates the horizontal detection range of the imaging device 11 by the angle centered on the imaging device 11.
運転支援装置19は、認識部21の機能により、前方広角カメラ、後方広角カメラ、左側方広角カメラ及び右側方広角カメラの検出結果をセンサフュージョンにより統合して処理し、自車両V1の周囲に存在する障害物を隈なく検出する。これらの広角カメラは、自車両V1の周囲に障害物が検出できない死角が生じないよう、隣り合うカメラ同士の検出範囲の一部(たとえば、検出範囲の水平方向端部から画角の10~15%程度の範囲)が重複するように配置されている。 The driving assistance device 19 uses the recognition unit 21 to integrate and process the detection results from the front wide-angle camera, rear wide-angle camera, left side wide-angle camera, and right side wide-angle camera using sensor fusion to thoroughly detect obstacles around the vehicle V1. These wide-angle cameras are positioned so that a portion of the detection ranges of adjacent cameras overlap (for example, a range of about 10 to 15% of the angle of view from the horizontal end of the detection range) to prevent blind spots around the vehicle V1 where obstacles cannot be detected.
隣り合うカメラ同士の検出範囲が重複するのは、検出範囲の中でも撮像装置11の画角の端部である。画角の端部とは、たとえば、検出範囲の水平方向端部から画角の10~15%の範囲のことを言う。検出範囲の端部の一例として、図2には、各カメラの検出範囲の端部C1~C4を示す。端部C1,C3は検出範囲B1の端部であり、端部C2,C4は検出範囲B2の端部である。図2に示す走行シーンでは、端部C1,C2は検出範囲B3の端部と重複しており、端部C3,C4は検出範囲B4の端部と重複しているものとする。これらの端部C1,C2,C3、C4では、障害物の状態を正確に検出できないことが知られている。広角レンズの画角の端部では、レンズの特性により撮影した障害物の形状が歪むためである。 The detection ranges of adjacent cameras overlap at the ends of the angle of view of the imaging device 11 within the detection range. The ends of the angle of view refer to, for example, a range of 10 to 15% of the angle of view from the horizontal end of the detection range. As an example of the ends of the detection range, FIG. 2 shows ends C1 to C4 of the detection range of each camera. Ends C1 and C3 are ends of detection range B1, and ends C2 and C4 are ends of detection range B2. In the driving scene shown in FIG. 2, ends C1 and C2 overlap with the end of detection range B3, and ends C3 and C4 overlap with the end of detection range B4. It is known that the state of an obstacle cannot be accurately detected at these ends C1, C2, C3, and C4. This is because the shape of the obstacle photographed at the ends of the angle of view of a wide-angle lens is distorted due to the characteristics of the lens.
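The edge-band test described above can be sketched as a simple angular check. The following is an illustrative sketch only; the function name, the use of degrees, and the 15% edge fraction (taken as the upper end of the 10-15% range mentioned above) are assumptions, not details fixed by this disclosure.

```python
# Illustrative sketch: decide whether a target's bearing falls in the
# edge band (the outer 10-15% of the angle of view) of a camera's
# detection range. EDGE_FRACTION and the function name are assumptions.
EDGE_FRACTION = 0.15  # upper end of the 10-15% range described above

def in_edge_region(bearing_deg, fov_deg, edge_fraction=EDGE_FRACTION):
    """Return True if a bearing (degrees, measured from the camera's
    optical axis) lies in the edge band of the angle of view."""
    half_fov = fov_deg / 2.0
    if abs(bearing_deg) > half_fov:
        return False  # outside the detection range entirely
    edge_width = fov_deg * edge_fraction
    # margin from the nearest horizontal end of the angle of view
    margin = half_fov - abs(bearing_deg)
    return margin <= edge_width
```

For example, with a 180-degree wide-angle camera, a target at an 85-degree bearing lies within 5 degrees of the end of the angle of view and would therefore be treated as being in the edge band.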
検出範囲の端部における障害物の検出について、図3A~3Bを用いて説明する。図3Aは、図2に示す走行シーンにおいて他車両V2が車線L3を走行している走行シーンを示す平面図である。図3Aに示す走行シーンでは、他車両V2は車線L3の位置Q1を走行しており、他車両V2の車速は自車両V1の車速より速いものとする。つまり、他車両V2は、走行軌跡U1に沿って位置Q1から位置Q2まで直進し、自車両V1を追い抜くものとする。なお、位置Q1及びQ2は、位置P1に対する相対的な位置である。 The detection of an obstacle at the end of the detection range will be explained using Figures 3A and 3B. Figure 3A is a plan view showing a driving scene in which another vehicle V2 is traveling in lane L3 in the driving scene shown in Figure 2. In the driving scene shown in Figure 3A, the other vehicle V2 is traveling at position Q1 in lane L3, and the vehicle speed of the other vehicle V2 is faster than the vehicle speed of the host vehicle V1. In other words, the other vehicle V2 travels straight from position Q1 to position Q2 along the driving trajectory U1, and overtakes the host vehicle V1. Note that positions Q1 and Q2 are positions relative to position P1.
なお、図3A~3Bでは、説明のため、撮像装置11の検出範囲のうち図2に示す端部C1,C2,C3、C4のみを示すが、これは他の検出範囲において障害物の検出をしていないことを意味するものではなく、図3A~3Bに示していない検出範囲においても障害物の検出は行われている。後述する図4A~4C、図5及び図6A~6Bについても同様である。 Note that for the sake of explanation, Figures 3A-3B only show the ends C1, C2, C3, and C4 shown in Figure 2 of the detection range of the imaging device 11, but this does not mean that obstacles are not detected in other detection ranges, and obstacles are also detected in detection ranges not shown in Figures 3A-3B. The same applies to Figures 4A-4C, 5, and 6A-6B, which will be described later.
撮像装置11の検出範囲の端部では、端部以外の部分より障害物が近い位置に存在するように認識されることが知られている。そのため、図3Aに示す走行シーンでは、運転支援装置19は、他車両V2が、図3Bに示す走行軌跡Uxに沿って位置Q1から位置Q2まで走行すると認識する。つまり、走行軌跡U1に沿って直進する他車両V2を、走行軌跡Uxに沿って蛇行し、端部C3,C4において自車両V1に接近すると誤認識する。 It is known that obstacles are recognized as being closer at the ends of the detection range of the imaging device 11 than at other parts. Therefore, in the driving scene shown in FIG. 3A, the driving assistance device 19 recognizes that the other vehicle V2 is traveling from position Q1 to position Q2 along the traveling trajectory Ux shown in FIG. 3B. In other words, the driving assistance device 19 erroneously recognizes the other vehicle V2 traveling straight along the traveling trajectory U1 as meandering along the traveling trajectory Ux and approaching the host vehicle V1 at the ends C3 and C4.
この場合、運転支援装置19は、他車両V2が走行軌跡Uyに沿って走行して自車両V1に接近すると誤って予測し、他車両V2を回避する回避行動を実行するおそれがある。すなわち、直進する他車両V2が後方から接近する場合に、車速制御装置171を用いて自車両V1を減速させたり、操舵制御装置172を用いて自車両V1を車線L2から車線L1に車線変更させたりするおそれがある。このような回避動作は、実際は直進している他車両V2を回避するための不要な走行動作であり、当該走行動作により自車両V1の挙動が乱れるとともに、自車両V1の乗員に違和感を与えることになる。 In this case, the driving assistance device 19 may mistakenly predict that the other vehicle V2 will approach the host vehicle V1 by traveling along the travel trajectory Uy, and may execute evasive action to avoid the other vehicle V2. In other words, when the other vehicle V2 traveling straight approaches from behind, the driving assistance device 19 may decelerate the host vehicle V1 using the vehicle speed control device 171, or may change lanes of the host vehicle V1 from lane L2 to lane L1 using the steering control device 172. Such evasive actions are unnecessary driving actions to avoid the other vehicle V2 traveling straight, and these driving actions disrupt the behavior of the host vehicle V1 and cause the occupants of the host vehicle V1 to feel uncomfortable.
そこで、本実施形態の運転支援装置19は、誤認識した他車両V2の走行状態が自車両V1の走行状態に与える影響を抑制するため、他車両V2の位置が、撮像装置11の検出範囲の端部に含まれないように自車両V1の走行を自律制御する。 The driving assistance device 19 of this embodiment therefore autonomously controls the driving of the vehicle V1 so that the position of the other vehicle V2 is not included in the edge of the detection range of the imaging device 11, in order to reduce the effect that the erroneously recognized driving state of the other vehicle V2 has on the driving state of the vehicle V1.
図4Aは、本実施形態の運転支援装置19により自律走行制御を実行する走行シーンの一例を示す平面図である。図4Aに示す走行シーンは、図2に示す道路を自車両V1と他車両V3,V4が走行している走行シーンであり、自車両V1は車線L2の位置P2を走行し、他車両V3が車線L3の位置Q3を走行し、他車両V4が車線L3の位置Q4を走行している。図4Aに示す走行シーンでは、自車両V1はレーンキープ制御により定速走行し、他車両V3,V4は自車両V1と同じ車速で直進しているものとする。図4Aに示すとおり、他車両V3の位置Q3は端部C3の範囲に含まれ、他車両V4の位置Q4は端部C4の範囲に含まれている。以下、図4Aに示す走行シーンにおいて本実施形態の認識部21、判定部22及び制御部23が果たす機能について説明する。 4A is a plan view showing an example of a driving scene in which autonomous driving control is executed by the driving assistance device 19 of this embodiment. The driving scene shown in FIG. 4A is a driving scene in which the host vehicle V1 and other vehicles V3 and V4 are driving on the road shown in FIG. 2, where the host vehicle V1 is driving at position P2 in lane L2, the other vehicle V3 is driving at position Q3 in lane L3, and the other vehicle V4 is driving at position Q4 in lane L3. In the driving scene shown in FIG. 4A, the host vehicle V1 is driving at a constant speed by lane keeping control, and the other vehicles V3 and V4 are driving straight at the same vehicle speed as the host vehicle V1. As shown in FIG. 4A, the position Q3 of the other vehicle V3 is included in the range of the end C3, and the position Q4 of the other vehicle V4 is included in the range of the end C4. Below, the functions performed by the recognition unit 21, the determination unit 22, and the control unit 23 of this embodiment in the driving scene shown in FIG. 4A will be described.
認識部21は、撮像装置11を用いて、自車両V1の側方を走行する他車両を検出する機能を有する。自車両V1の側方とは、たとえば、自車両V1の左側に設置された撮像装置11の検出範囲と、右側に設置された撮像装置11の検出範囲のことを言う。図4Aに示す走行シーンであれば、図2に示す検出範囲B3,B4が自車両V1の側方である。また、自車両V1が走行する自車線の隣接車線と、当該隣接車線に隣接する隣々接車線のうち、自車両V1の検出装置(たとえば撮像装置11及び測距装置12)を用いて障害物の検出ができる範囲を自車両V1の側方としてもよい。 The recognition unit 21 has a function of detecting other vehicles traveling on the side of the host vehicle V1 using the imaging device 11. The side of the host vehicle V1 refers to, for example, the detection range of the imaging device 11 installed on the left side of the host vehicle V1 and the detection range of the imaging device 11 installed on the right side. In the driving scene shown in FIG. 4A, the detection ranges B3 and B4 shown in FIG. 2 are the sides of the host vehicle V1. Alternatively, the side of the host vehicle V1 may be defined as the range, within the lane adjacent to the host vehicle V1's own lane and the second adjacent lane (the lane adjacent to that adjacent lane), in which obstacles can be detected using the detection devices of the host vehicle V1 (for example, the imaging device 11 and the distance measuring device 12).
自車両V1の側方を走行する他車両とは、たとえば隣接車線又は隣々接車線を走行する他車両である。図4Aに示す走行シーンでは、運転支援装置19は、認識部21の機能により、撮像装置11及び測距装置12を用いて、隣接車線である車線L3を走行する他車両V3,V4を認識する。他車両V3,V4の走行する位置は、撮像装置11と測距装置12の検出結果を組み合わせて(センサフュージョンして)認識する。なお、図4Aに示す走行シーンであれば、自車両V1が車線L2を走行している場合は、車線L2が自車線であり、車線L1,L3が隣接車線である。また、自車両V1が車線L1を走行している場合は、車線L2が隣接車線であり、車線L3が隣々接車線である。 Another vehicle traveling on the side of the host vehicle V1 is, for example, another vehicle traveling in an adjacent lane or in a second adjacent lane (a lane adjacent to the adjacent lane). In the driving scene shown in FIG. 4A, the driving assistance device 19 uses the function of the recognition unit 21 to recognize the other vehicles V3 and V4 traveling in lane L3, an adjacent lane, using the imaging device 11 and the distance measuring device 12. The positions at which the other vehicles V3 and V4 travel are recognized by combining (sensor fusion of) the detection results of the imaging device 11 and the distance measuring device 12. In the driving scene shown in FIG. 4A, when the host vehicle V1 is traveling in lane L2, lane L2 is the host vehicle's own lane, and lanes L1 and L3 are adjacent lanes. When the host vehicle V1 is traveling in lane L1, lane L2 is the adjacent lane, and lane L3 is the second adjacent lane.
判定部22は、判定機能として、他車両の位置が、撮像装置11の検出範囲の端部にある又は所定時間以内に当該端部に進入するか否かを判定する機能を有する。運転支援装置19は、認識部21の機能により取得した、自車両V1の周囲の走行環境情報に基づき、判定部22の機能により、他車両の位置が、撮像装置11の検出範囲の端部にあるか否かを判定する。これに代え又はこれに加え、運転支援装置19は、他車両の位置が所定時間以内に当該端部に進入するか否かを判定してもよい。 The determination unit 22 has, as a determination function, a function of determining whether the position of another vehicle is at the edge of the detection range of the imaging device 11 or will enter that edge within a predetermined time. The driving assistance device 19 uses the function of the determination unit 22 to determine whether the position of the other vehicle is at the edge of the detection range of the imaging device 11, based on the driving environment information around the host vehicle V1 acquired by the function of the recognition unit 21. Alternatively or in addition, the driving assistance device 19 may determine whether the position of the other vehicle will enter that edge within the predetermined time.
他車両の位置が検出範囲の端部にあるか否かを判定する場合、運転支援装置19は、検出した他車両の位置が、撮像装置11の検出範囲の端部に含まれるか否かを判定する。当該検出範囲の端部は、撮像装置11を設置した時点でその範囲が画定されるため、予め運転支援装置19のROM192などに端部の範囲が登録されている。したがって、運転支援装置19は、他車両の位置を検出し、当該他車両の位置が、登録された検出範囲の端部に含まれているか否かを判定する。図4Aに示す走行シーンでは、他車両V3の位置Q3が端部C3の範囲に含まれ、他車両V4の位置Q4が端部C4の範囲に含まれているため、運転支援装置19は、他車両V3,V4の位置が検出範囲の端部にあると判定する。 When determining whether the position of another vehicle is at the end of the detection range, the driving assistance device 19 determines whether the position of the detected other vehicle is included in the end of the detection range of the imaging device 11. The end of the detection range is defined when the imaging device 11 is installed, so the end range is registered in advance in the ROM 192 of the driving assistance device 19, etc. Therefore, the driving assistance device 19 detects the position of the other vehicle and determines whether the position of the other vehicle is included in the end of the registered detection range. In the driving scene shown in FIG. 4A, the position Q3 of the other vehicle V3 is included in the range of the end C3, and the position Q4 of the other vehicle V4 is included in the range of the end C4, so the driving assistance device 19 determines that the positions of the other vehicles V3 and V4 are at the ends of the detection range.
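The look-up against the edge ranges registered in the ROM 192 might be sketched as follows. The table contents, the names, and the polar representation (an angular interval plus a maximum range in the host-vehicle frame) are assumptions made for illustration; the actual registered form is not specified in this disclosure.

```python
import math

# Hypothetical registered edge regions (fixed when the cameras are
# mounted, per the text above): an angular interval [lo, hi] in degrees
# in the host-vehicle frame plus a maximum range in metres.
# The numeric values are illustrative only.
REGISTERED_EDGES = {
    "C3": (20.0, 35.0, 40.0),
    "C4": (145.0, 160.0, 40.0),
}

def edge_containing(dx, dy):
    """Return the name of the registered edge region that contains the
    other vehicle at offset (dx, dy) metres from the host, or None."""
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    dist = math.hypot(dx, dy)
    for name, (lo, hi, max_range) in REGISTERED_EDGES.items():
        if lo <= bearing <= hi and dist <= max_range:
            return name
    return None
```

A non-None result corresponds to the judgment that the other vehicle's position is at the edge of the detection range.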
一方、他車両の位置が所定時間以内に検出範囲の端部に進入するか否かを判定する場合、運転支援装置19は、たとえば、認識部21の機能により取得した走行環境情報から自車両V1と他車両の走行状態を認識する。車両の走行状態とは、車両の進行方向と車速の状態であり、車両が直進している状態、車両が右方向又は左方向に転舵している状態、車両が加速又は減速している状態、車両が定速で走行している状態などの状態が含まれる。また、車両の走行状態には、車両が行う走行動作の状態も含まれる。例として、車両が方向指示器を点滅させている状態、車両が前照灯を点灯している状態などが挙げられる。 On the other hand, when determining whether the position of the other vehicle will enter the end of the detection range within a specified time, the driving assistance device 19 recognizes the driving states of the host vehicle V1 and the other vehicle from driving environment information acquired by the function of the recognition unit 21, for example. The driving state of the vehicle refers to the state of the vehicle's traveling direction and vehicle speed, and includes states such as a state in which the vehicle is traveling straight, a state in which the vehicle is steering to the right or left, a state in which the vehicle is accelerating or decelerating, and a state in which the vehicle is traveling at a constant speed. The driving state of the vehicle also includes the state of the driving operation performed by the vehicle. Examples include a state in which the vehicle's turn indicators are flashing, a state in which the vehicle's headlights are on, etc.
自車両V1の走行状態について、運転支援装置19は、たとえば、自車状態検出装置13の各種センサから自車両V1の車速、加速度、ヨーレート、舵角、ステアリングホイールの回転角度などの情報を取得し、自車両V1の現在の走行状態を認識する。これに代え、又はこれに加え、運転支援装置19は、地図情報14から道路情報を取得し、自車位置検出装置15から自車両V1の現在位置を取得し、ナビゲーション装置16から走行経路を取得し、自車両V1の現在位置における道路の形状及び/又は走行経路から、自車両V1の進行方向及び/又は車速を認識してもよい。 Regarding the driving state of the host vehicle V1, the driving assistance device 19 acquires information such as the vehicle speed, acceleration, yaw rate, steering angle, and steering wheel rotation angle of the host vehicle V1 from various sensors of the host vehicle state detection device 13, and recognizes the current driving state of the host vehicle V1. Alternatively or in addition to this, the driving assistance device 19 may acquire road information from the map information 14, acquire the current position of the host vehicle V1 from the host vehicle position detection device 15, acquire the driving route from the navigation device 16, and recognize the traveling direction and/or vehicle speed of the host vehicle V1 from the shape of the road at the current position of the host vehicle V1 and/or the driving route.
また、運転支援装置19は、車両制御装置17から、駆動装置及び/又は操舵装置に出力された制御信号を取得し、自車両V1の進行方向及び/又は車速をどのように制御するのか(どのように変化させるのか)を認識する。そして、これらに基づき、所定時間後に自車両V1の走行状態がどのように変化するのかを予測する。これに代え、運転支援装置19は、地図情報14から道路情報を取得し、自車位置検出装置15から自車両V1の現在位置を取得し、ナビゲーション装置16から走行経路を取得し、自車両V1の現在位置の前方における道路の形状及び/又は走行経路から、所定時間後の自車両V1の進行方向及び/又は車速を予測してもよい。 The driving assistance device 19 also acquires control signals output from the vehicle control device 17 to the drive device and/or steering device, and recognizes how to control (change) the traveling direction and/or vehicle speed of the host vehicle V1. Then, based on these, it predicts how the traveling state of the host vehicle V1 will change after a predetermined time. Alternatively, the driving assistance device 19 may acquire road information from the map information 14, acquire the current position of the host vehicle V1 from the host vehicle position detection device 15, acquire the traveling route from the navigation device 16, and predict the traveling direction and/or vehicle speed of the host vehicle V1 after a predetermined time from the shape of the road ahead of the current position of the host vehicle V1 and/or the traveling route.
これに対し、他車両V2の走行状態について、運転支援装置19は、たとえば撮像装置11から画像データを取得し、パターンマッチングによる障害物の抽出と特定を行い、障害物の種類、位置及びその状態を認識する。また、測距装置12から自車両V1の周囲をスキャンした情報を取得し、当該情報から障害物の位置とその方向を認識する。撮像装置11から取得した画像データから、障害物が他車両であることを認識した場合は、その形状から車体がどの程度傾いているか(つまりどの程度転舵しているか)を認識する。また、測距装置12のスキャン結果から、自車両V1に対する他車両の位置と相対速度を取得する。そして、これらの検出結果に基づいて他車両の走行位置、進行方向及び車速を認識する。 In response to this, with regard to the driving state of the other vehicle V2, the driving assistance device 19 acquires image data, for example, from the imaging device 11, extracts and identifies obstacles by pattern matching, and recognizes the type, position, and state of the obstacle. It also acquires information obtained by scanning the area around the vehicle V1 from the distance measuring device 12, and recognizes the position and direction of the obstacle from this information. If it recognizes that the obstacle is another vehicle from the image data acquired from the imaging device 11, it recognizes from its shape how much the vehicle body is tilted (i.e. how much it is being steered). It also acquires the position and relative speed of the other vehicle with respect to the vehicle V1 from the scan results of the distance measuring device 12. Then, it recognizes the driving position, direction of travel, and speed of the other vehicle based on these detection results.
所定時間後の他車両V2の走行状態を予測する場合、運転支援装置19は、測距装置12から自車両V1の周囲をスキャンした情報を取得し、当該情報から障害物の位置とその方向を認識する。運転支援装置19は、測距装置12のスキャン結果から障害物の位置とその方向を認識する処理を、所定時間より短い時間間隔で複数回(たとえば3回以上)繰り返し行い、障害物の位置の変化の傾向を認識し、当該傾向から所定時間後の障害物の状態(つまり他車両V2の走行状態)を予測する。 When predicting the driving state of the other vehicle V2 after a predetermined time, the driving assistance device 19 obtains information obtained by scanning the surroundings of the vehicle V1 from the distance measuring device 12, and recognizes the position and direction of the obstacle from that information. The driving assistance device 19 repeats the process of recognizing the position and direction of the obstacle from the scan results of the distance measuring device 12 multiple times (e.g., three or more times) at time intervals shorter than a predetermined time, recognizes the tendency of changes in the obstacle position, and predicts the state of the obstacle after a predetermined time (i.e. the driving state of the other vehicle V2) from that tendency.
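The repeated-scan trend prediction described above can be illustrated with a constant-velocity extrapolation. The linear model and the function name are assumptions; the text only requires that the position be sampled three or more times at intervals shorter than the predetermined time, and that the trend of the change be used.

```python
def predict_position(samples, horizon):
    """samples: list of (t, x, y) observations taken at short intervals
    (three or more, per the text above); horizon: seconds ahead.
    Returns the extrapolated (x, y), assuming a constant-velocity trend
    (an assumed model, not one fixed by this disclosure)."""
    if len(samples) < 3:
        raise ValueError("need at least three observations")
    (t0, x0, y0) = samples[0]
    (tn, xn, yn) = samples[-1]
    dt = tn - t0
    vx, vy = (xn - x0) / dt, (yn - y0) / dt  # average velocity trend
    return (xn + vx * horizon, yn + vy * horizon)
```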
運転支援装置19は、認識した自車両V1の走行状態と他車両V2の走行状態とに基づき、他車両V2の位置が所定時間以内に検出範囲の端部に進入するか否かを判定する。運転支援装置19は、自車両V1と他車両の位置関係、自車両V1と他車両の速度差、及び自車両V1と他車両の進行方向に基づいて、他車両が検出範囲の端部に進入するか否かを判定する。 The driving assistance device 19 determines whether the position of the other vehicle V2 will enter the edge of the detection range within a predetermined time based on the recognized driving state of the subject vehicle V1 and the driving state of the other vehicle V2. The driving assistance device 19 determines whether the other vehicle will enter the edge of the detection range based on the positional relationship between the subject vehicle V1 and the other vehicle, the speed difference between the subject vehicle V1 and the other vehicle, and the traveling direction of the subject vehicle V1 and the other vehicle.
所定時間は、他車両を検出してから、他車両の位置が実際に検出範囲の端部に進入するまでの間に、他車両の位置が当該端部に含まれないようにする自律走行制御が開始できる範囲内で適宜の値を設定でき、たとえば10~20秒である。所定時間がこれより短いと、当該自律走行制御の開始が遅れ、自車両V1の挙動の変化が大きくなる。逆に、所定時間がこれより長いと、走行状態を正確に予測できなくなり、通常の自律走行制御により走行すべき走行シーンにおいて、他車両の位置が検出範囲の端部に含まれないようにする自律走行制御を実行するおそれがある。 The specified time can be set to an appropriate value within the range in which autonomous driving control can be initiated to prevent the position of the other vehicle from being included in the edge of the detection range between the time the other vehicle is detected and the time the other vehicle's position actually enters the edge of the detection range, for example 10 to 20 seconds. If the specified time is shorter than this, the start of the autonomous driving control will be delayed and the behavior of the vehicle V1 will change significantly. Conversely, if the specified time is longer than this, it will be impossible to accurately predict the driving state, and there is a risk that autonomous driving control will be executed to prevent the position of the other vehicle from being included in the edge of the detection range in a driving scene in which normal autonomous driving control should be used.
たとえば、他車両の位置が自車両V1の後方の位置であって、他車両が検出範囲の端部以外の位置を走行している走行シーンにおいて、他車両の車速が自車両V1の車速より速く、他車両が所定時間以内に自車両V1に追いつき、他車両の進行方向が検出範囲の端部に向かうときは、他車両の位置が所定時間以内に当該端部に進入すると判定する。これに対し、同じ走行シーンにおいて、他車両の車速が自車両V1の車速以下であるときは、他車両の位置が所定時間以内に当該端部に進入しないと判定する。 For example, in a driving scene in which the other vehicle is located behind the host vehicle V1 and is traveling at a position other than the end of the detection range, if the other vehicle's speed is faster than the host vehicle V1, the other vehicle catches up with the host vehicle V1 within a specified time, and the other vehicle's direction of travel is toward the end of the detection range, it is determined that the other vehicle will enter that end within the specified time. In contrast, in the same driving scene, if the other vehicle's speed is equal to or slower than the host vehicle V1, it is determined that the other vehicle will not enter that end within the specified time.
また、他車両の位置が自車両V1の前方の位置であって、他車両が検出範囲の端部以外の位置を走行している走行シーンにおいて、他車両の車速が自車両V1の車速以上であるときは、他車両の位置が所定時間以内に検出範囲の端部に進入しないと判定する。これに対し、同じ走行シーンにおいて、他車両の車速が自車両V1の車速より遅く、自車両V1が所定時間以内に他車両に追いつき、自車両V1(特に検出範囲の端部)が他車両に向かうときは、他車両の位置が所定時間以内に当該端部に進入すると判定する。 Furthermore, in a driving scene in which the other vehicle is located in front of the host vehicle V1 and is traveling at a position other than the end of the detection range, if the vehicle speed of the other vehicle is equal to or faster than the vehicle speed of the host vehicle V1, it is determined that the other vehicle will not enter the end of the detection range within a predetermined time. In contrast, in the same driving scene, if the vehicle speed of the other vehicle is slower than the vehicle speed of the host vehicle V1, the host vehicle V1 catches up with the other vehicle within the predetermined time, and the host vehicle V1 (particularly the end of the detection range) moves toward the other vehicle, it is determined that the other vehicle will enter that end within the predetermined time.
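The two decision cases above (other vehicle behind, other vehicle ahead) can be condensed into a one-dimensional sketch. The function name, the sign convention for the gap, and treating the edge as co-located with the host vehicle are simplifying assumptions; the 20-second default corresponds to the upper end of the 10-20 second predetermined time described above.

```python
def will_enter_edge(gap_m, v_host, v_other, heading_toward_edge,
                    t_max=20.0):
    """Decide whether the other vehicle reaches the edge of the
    detection range within t_max seconds (1-D sketch; the edge is
    treated as co-located with the host vehicle).
    gap_m > 0: other vehicle ahead of the host; gap_m < 0: behind."""
    closing = v_other - v_host  # positive: the other vehicle gains
    if gap_m < 0:  # other vehicle behind the host
        if closing <= 0:
            return False  # same speed or slower: cannot catch up
        return heading_toward_edge and (-gap_m) / closing <= t_max
    # other vehicle ahead of the host
    if closing >= 0:
        return False  # same speed or faster: host cannot catch up
    return heading_toward_edge and gap_m / (-closing) <= t_max
```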
またこれに代え、運転支援装置19は、予測した自車両V1の走行状態と他車両の走行状態とに基づき、他車両の位置が所定時間以内に検出範囲の端部に進入するか否かを判定してもよい。具体的には、所定時間後の自車両V1と他車両の位置関係から、他車両の位置が当該端部に進入するか否かを判定する。これに代え又はこれに加え、認識した他車両の走行状態に基づいて他車両の進路を予測し、他車両の進路に基づき、他車両の位置が所定時間以内に検出範囲の端部に進入するか否かを判定してもよい。他車両の進路とは、たとえば上述した他車両の進行方向であり、自車両V1と他車両との間で車車間通信が可能な場合は、他車両から取得した他車両の走行経路であってもよい。 Alternatively, the driving assistance device 19 may determine whether the position of the other vehicle will enter the edge of the detection range within a predetermined time based on the predicted driving state of the vehicle V1 and the driving state of the other vehicle. Specifically, it determines whether the position of the other vehicle will enter the edge from the positional relationship between the vehicle V1 and the other vehicle after a predetermined time. Alternatively or in addition, the path of the other vehicle may be predicted based on the recognized driving state of the other vehicle, and based on the path of the other vehicle, it may determine whether the position of the other vehicle will enter the edge of the detection range within a predetermined time. The path of the other vehicle is, for example, the traveling direction of the other vehicle described above, and may be the driving route of the other vehicle obtained from the other vehicle if vehicle-to-vehicle communication is possible between the vehicle V1 and the other vehicle.
たとえば、図4Aに示す走行シーンにおいて、他車両V4が位置Q4より後方の位置を走行している場合に、他車両V4の車速が自車両V1の車速より速く、他車両V4が所定時間以内に自車両V1に追いつけるときは、運転支援装置19は他車両V4の走行状態(特に進行方向)を認識する。図4Aに示す走行シーンでは、他車両V4は車線L3を直進しているため、他車両V4の進行方向は端部C4に向かっている。したがって、運転支援装置19は、他車両V4の位置が所定時間以内に端部C4に進入すると判定する。なお、他車両の走行状態に基づいて他車両の進路を予測することと、他車両の進路に基づき、他車両の位置が所定時間以内に検出範囲の端部に進入するか否かを判定することとは、本発明に必須の構成でなく、必要に応じて追加してもよく省略してもよい。 For example, in the driving scene shown in FIG. 4A, when the other vehicle V4 is traveling at a position behind position Q4, the vehicle speed of the other vehicle V4 is faster than the vehicle speed of the host vehicle V1, and the other vehicle V4 can catch up with the host vehicle V1 within a predetermined time, the driving assistance device 19 recognizes the driving state (particularly the traveling direction) of the other vehicle V4. In the driving scene shown in FIG. 4A, the other vehicle V4 is traveling straight along lane L3, so the traveling direction of the other vehicle V4 is toward end C4. Therefore, the driving assistance device 19 determines that the position of the other vehicle V4 will enter end C4 within the predetermined time. Note that predicting the course of the other vehicle based on the driving state of the other vehicle and determining whether the position of the other vehicle will enter the end of the detection range within the predetermined time based on the course of the other vehicle are not essential components of the present invention, and may be added or omitted as necessary.
制御部23は、他車両の位置が、撮像装置11の検出範囲の端部にある又は所定時間以内に当該端部に進入すると判定した場合は、他車両の位置が当該端部に含まれないように自車両V1の走行を自律制御する機能を有する。運転支援装置19は、判定部22の機能により、他車両の位置が、撮像装置11の検出範囲の端部になく、且つ、所定時間以内に当該端部に進入しないと判定した場合は、通常の自律走行制御を実行する。これに対し、他車両の位置が当該端部にある又は所定時間以内に当該端部に進入すると判定した場合は、制御部23の機能により、他車両の位置が当該端部に含まれないように自車両V1の走行を自律制御する。以下、他車両の位置が撮像装置11の検出範囲の端部に含まれないようにする自律走行制御を、端部回避制御とも言う。 The control unit 23 has a function of autonomously controlling the traveling of the host vehicle V1 so that the position of the other vehicle is not included in the edge of the detection range of the imaging device 11 when it is determined that the position of the other vehicle is at that edge or will enter that edge within a predetermined time. When the driving assistance device 19 determines, by the function of the determination unit 22, that the position of the other vehicle is not at the edge of the detection range of the imaging device 11 and will not enter that edge within the predetermined time, it executes normal autonomous driving control. In contrast, when it determines that the position of the other vehicle is at that edge or will enter that edge within the predetermined time, it uses the function of the control unit 23 to autonomously control the traveling of the host vehicle V1 so that the position of the other vehicle is not included in that edge. Hereinafter, the autonomous driving control that prevents the position of the other vehicle from being included in the edge of the detection range of the imaging device 11 is also referred to as edge avoidance control.
Edge avoidance control includes, for example, autonomously controlling the driving of the host vehicle V1 so that the speed difference between the host vehicle V1 and the other vehicle increases. For example, the speed of the host vehicle V1 obtained from the host vehicle state detection device 13 is compared with the speed of the other vehicle obtained from the driving environment information. If the host vehicle V1 is slower than the other vehicle, the host vehicle V1 is decelerated via the vehicle speed control device 171; if it is faster, it is accelerated via the vehicle speed control device 171; and if the two speeds are the same, the host vehicle V1 is either accelerated or decelerated via the vehicle speed control device 171.
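The speed-difference logic described above can be sketched as a small decision function. This is an illustrative sketch only; the function and return-value names are hypothetical and do not appear in the embodiment.

```python
def edge_avoidance_speed_command(host_speed_kmh: float,
                                 other_speed_kmh: float) -> str:
    """Pick a speed command that widens the speed difference between the
    host vehicle and the other vehicle, so the other vehicle moves out of
    the edge of the detection range (hypothetical sketch)."""
    if host_speed_kmh < other_speed_kmh:
        # Host is slower: decelerate so the other vehicle passes the edge sooner.
        return "decelerate"
    if host_speed_kmh > other_speed_kmh:
        # Host is faster: accelerate so the other vehicle drops out of the edge.
        return "accelerate"
    # Equal speeds: either command widens the difference; accelerate is chosen here.
    return "accelerate"
```

Either branch of the equal-speed case would satisfy the control goal; the choice between them would depend on surrounding traffic.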
Alternatively or in addition, edge avoidance control may include setting the course of the host vehicle V1 based on the traveling direction (or course) of the other vehicle. For example, if the course of the other vehicle is heading toward an edge of the detection range of the imaging device 11, the course (traveling direction) of the host vehicle V1 is changed to a direction away from the other vehicle, and the host vehicle V1 changes from its current lane to the adjacent lane on the side away from the other vehicle. Note that, when it is determined that the position of the other vehicle will enter an edge of the detection range within the predetermined time, autonomously controlling the driving of the host vehicle V1 so that the speed difference between the two vehicles increases, and, in the same case, setting the course of the host vehicle V1 based on the course of the other vehicle, are not essential components of the present invention and may be added or omitted as necessary.
Furthermore, when the driving assistance device 19 detects another vehicle traveling to the side of the host vehicle V1, it may determine whether that vehicle is traveling behind the host vehicle V1. If it determines that the other vehicle is traveling behind the host vehicle V1 and that its position is at an edge of the detection range or will enter that edge within a predetermined time, the driving assistance device 19 autonomously controls the driving of the host vehicle V1 so that the position of the other vehicle is not included in the edge of the detection range behind the host vehicle V1. This is because the imaging device 11 that captures the area behind the host vehicle V1 has a shorter focal length than the one that captures the area ahead, so situations in which the driving state of the other vehicle cannot be accurately recognized arise more easily.
Note that, when it is determined that the position of another vehicle traveling behind the host vehicle V1 is at an edge of the detection range or will enter that edge within a predetermined time, autonomously controlling the driving of the host vehicle V1 so that the position of that vehicle is not included in the edge behind the host vehicle V1 is not an essential component of the present invention, and may be added or omitted as necessary.
In the driving scene shown in FIG. 4A, the other vehicle V3 is traveling at position Q3 in edge C3 and the other vehicle V4 is traveling at position Q4 in edge C4, so the driving assistance device 19 compares, as edge avoidance control, the speed of the host vehicle V1 with the speeds of the other vehicles V3 and V4. When the host vehicle V1 is faster than the other vehicles V3 and V4, the host vehicle V1 is accelerated as shown in FIG. 4B and travels along the travel trajectory T1 from position P2 to position P3. In other words, when the host vehicle V1 is faster than the other vehicles V3 and V4, the host vehicle V1 is accelerated to move the positions of the other vehicles V3 and V4 out of edges C3 and C4 so that they are no longer included in them, regardless of whether the other vehicle V3 is traveling ahead of the host vehicle V1 or the other vehicle V4 is traveling behind it.
In contrast, when the host vehicle V1 is slower than the other vehicles V3 and V4, the host vehicle V1 is decelerated as shown in FIG. 4C and travels along the travel trajectory T2 from position P2 to position P4. In other words, when the host vehicle V1 is slower than the other vehicles V3 and V4, the host vehicle V1 is decelerated to move the positions of the other vehicles V3 and V4 out of edges C3 and C4 so that they are no longer included in them, regardless of whether the other vehicle V3 is traveling ahead of the host vehicle V1 or the other vehicle V4 is traveling behind it. Note that positions P3 and P4 are positions relative to position P2, and the host vehicle V1 does not move backward in the driving scene shown in FIG. 4C.
When the speed of the host vehicle V1 is the same as that of the other vehicles V3 and V4, the host vehicle V1 is accelerated or decelerated. That is, when the speeds are the same, the host vehicle V1 may be either accelerated or decelerated, as long as the positions of the other vehicles V3 and V4 leave edges C3 and C4 and do not re-enter them. Note that the positions of the other vehicles V3 and V4 being "not included" in edges C3 and C4 of the detection range means, in plan view, either that none of the vehicle body of the other vehicles V3 and V4 is included in edges C3 and C4, or that most of the body (for example, 90% or more) is not included in them. In other words, a part of the body of the other vehicles V3 and V4 (for example, 10% or less) may be included in edges C3 and C4.
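The "not included" criterion above amounts to an overlap-ratio check in plan view. A minimal sketch, where the 10% tolerance is the example value given in the text rather than a fixed requirement, and the function name is hypothetical:

```python
def is_outside_edge(body_overlap_ratio: float, tolerance: float = 0.10) -> bool:
    """Treat the other vehicle as 'not included' in an edge of the detection
    range when, in plan view, at most `tolerance` of its body overlaps the
    edge region (e.g. 10%, per the example in the text)."""
    return body_overlap_ratio <= tolerance
```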
Alternatively, in the driving scene shown in FIG. 4A, the driving assistance device 19 may maintain the speed of the host vehicle V1 when the host vehicle V1 is faster than the other vehicles V3 and V4. Alternatively or in addition, in the same driving scene, when the speed of the host vehicle V1 is the same as that of the other vehicles V3 and V4, the driving assistance device 19 may determine whether the positions of the other vehicles V3 and V4 are at an edge of the detection range. When it determines that they are at such an edge, as in the driving scene shown in FIG. 4A, the host vehicle V1 is accelerated or decelerated; when it determines that they are not, the speed of the host vehicle V1 is maintained. This is so that edge avoidance control is performed only when the other vehicle V4 may move into the position ahead of the host vehicle V1. That is, in a driving scene where the host vehicle V1 is slower than the other vehicle V4, the other vehicle V4 is predicted to overtake the host vehicle V1 and change lanes toward the position ahead of it, whereas in a driving scene where the host vehicle V1 is faster than the other vehicle V4, the other vehicle is unlikely to be predicted to change lanes toward the position ahead of the host vehicle V1.
Next, edge avoidance control in a different driving scene will be described with reference to FIG. 5.
FIG. 5 is a plan view showing an example of a driving scene in which edge avoidance control is executed. In the driving scene shown in FIG. 5, the host vehicle V1 and other vehicles V5 and V6 are traveling on the road shown in FIG. 2: the host vehicle V1 is traveling at position P5 in lane L1, the other vehicle V5 is traveling ahead of it at position Q5 in lane L1, and the other vehicle V6 is traveling at position Q6 in lane L3. In this driving scene, the host vehicle V1 travels at a constant speed under lane keeping control, and the other vehicles V5 and V6 travel straight at the same speed as the host vehicle V1.
In the driving scene shown in FIG. 5, obstacles ahead in lane L1 cannot be detected because they are blocked by the other vehicle V5, so the host vehicle V1 changes lanes from lane L1 to lane L2 to overtake the other vehicle V5. In this case, the driving assistance device 19 acquires, by the function of the recognition unit 21, the driving states of the other vehicles from the driving environment information. At the same time, the driving assistance device 19 determines, by the function of the determination unit 22 and from the detection results, whether other vehicles are present in the adjacent lane and in the second adjacent lane (the lane adjacent to the adjacent lane). If it determines that no other vehicle is present in the adjacent lane but another vehicle is present in the second adjacent lane, it determines, as edge avoidance control, whether the position of the other vehicle traveling in the second adjacent lane is at an edge of the detection range or will enter such an edge within a predetermined time. In contrast, if another vehicle is present in the adjacent lane, lane change assistance is not executed, regardless of whether another vehicle is present in the second adjacent lane.
In the driving scene shown in FIG. 5, the other vehicle V6 traveling at position Q6 in lane L3, the second adjacent lane, is detected, so it is determined that another vehicle V6 is present in the second adjacent lane. In contrast, no other vehicle is present in lane L2, the adjacent lane. The driving assistance device 19 therefore determines whether position Q6 of the other vehicle V6 traveling in the second adjacent lane L3 is at an edge of the detection range or will enter such an edge within a predetermined time. Specifically, it generates a travel trajectory T3 along which the host vehicle V1 changes lanes from lane L1 to lane L2, and determines whether position Q6 of the other vehicle V6 would enter any of edges C1 to C4 within the predetermined time if the host vehicle V1 traveled along trajectory T3.
As shown in FIG. 5, if the host vehicle V1 travels from position P5 to position P6 along the travel trajectory T3, position Q6 of the other vehicle V6 will enter edge C4. In the driving scene shown in FIG. 5, the driving assistance device 19 therefore determines that position Q6 of the other vehicle V6 traveling in the second adjacent lane L3 will enter edge C4 within the predetermined time, and does not assist the lane change of the host vehicle V1 by autonomous driving control. In this case, the lane change is performed by the driver's manual driving. In contrast, if it determines that position Q6 of the other vehicle V6 traveling in the second adjacent lane will not enter an edge of the detection range within the predetermined time, it changes lanes from lane L1 to lane L2 by autonomous driving control in order to overtake the preceding vehicle V5.
However, in this embodiment, lane change assistance for the host vehicle V1 is not always withheld in the driving scene shown in FIG. 5. For example, when the host vehicle V1 overtakes its preceding vehicle, it is determined whether other vehicles are present in the lane adjacent to the lane in which the host vehicle V1 is traveling and in the second adjacent lane. If it is determined that no other vehicle is present in the adjacent lane but another vehicle is present in the second adjacent lane, it is then determined whether the position of the other vehicle traveling in the second adjacent lane is at an edge of the detection range or will enter an edge of the detection range within a predetermined time.
In the driving scene shown in FIG. 5, the host vehicle V1 attempts to overtake the preceding vehicle V5, so it is determined whether other vehicles are present in lane L2, which is adjacent to lane L1 in which the host vehicle V1 is traveling, and in lane L3, which is adjacent to lane L2. In this driving scene, no other vehicle is present in the adjacent lane L2, but the other vehicle V6 is present in the second adjacent lane L3, so it is determined whether position Q6 of the other vehicle V6 is at edge C4 or will enter edge C4 within a predetermined time.
As described above, in the driving scene shown in FIG. 5, position Q6 of the other vehicle V6 enters edge C4 within the predetermined time, so the driving assistance device 19 decelerates the host vehicle V1 by autonomous driving control and drives it along the travel trajectory T4 from position P5 to position P7, as shown in FIG. 6A. Together with this deceleration control, as shown in FIG. 6B, the host vehicle V1 changes lanes from its own lane L1 to the adjacent lane L2. The host vehicle V1 travels along the travel trajectory T5 from position P7 in lane L1 to position P8 in lane L2, and during the lane change, position Q6 of the other vehicle V6 traveling in the second adjacent lane L3 does not enter any of edges C1 to C4 of the detection range. Note that positions P7 and P8 are positions relative to position P5, and in the driving scene shown in FIG. 6A, the host vehicle V1 decelerates but does not move backward.
Note that it is not always necessary to drive the host vehicle V1 to position P7 behind the other vehicle V6 as shown in FIG. 6B. The speed of the host vehicle V1 may instead be set so that the host vehicle V1 and the other vehicle V6 run side by side, and the lane change may be performed in that state, because as long as the two vehicles run side by side, position Q6 of the other vehicle V6 does not enter edges C3 and C4. As for the timing of decelerating the host vehicle V1 by autonomous driving control and the timing of the lane change from lane L1 to lane L2, the lane change assistance may be performed after the deceleration control is completed, as shown in FIGS. 6A and 6B, or the two controls may be performed simultaneously. In other words, the host vehicle V1 may change lanes from lane L1 to lane L2 while decelerating.
Note that the edge avoidance control in each of the driving scenes shown in FIGS. 4A to 4C, 5, and 6A to 6B is merely an example, and edge avoidance control other than that described above may be executed in each driving scene. Furthermore, it is not essential for the driving assistance device 19 of this embodiment to execute edge avoidance control in all of the driving scenes shown in FIGS. 4A to 4C, 5, and 6A to 6B; the driving assistance device 19 may perform edge avoidance control in only some of the driving scenes.
[Processing in the system]
The procedure by which the driving assistance device 19 processes information will be described with reference to FIG. 7. FIG. 7 is an example of a flowchart showing the information processing executed in the driving assistance system 10 of this embodiment. The processing described below is executed at predetermined time intervals by the CPU 191, the processor of the driving assistance device 19. Note that the flowchart shown in FIG. 7 assumes a driving scene in which the host vehicle V1 travels on a road under lane keeping control.
First, in step S1 of FIG. 7, another vehicle is detected using the imaging device 11 by the function of the recognition unit 21. In step S2, it is determined from the detection results whether another vehicle V2 is present to the side of the host vehicle V1. If no other vehicle V2 is present to the side of the host vehicle V1, the process proceeds to step S7, where normal autonomous driving control is executed by the function of the control unit 23, and then proceeds to step S8. In contrast, if another vehicle is present to the side of the host vehicle V1, the process proceeds to step S3, where it is determined, by the function of the determination unit 22, whether the position of the other vehicle is at an edge of the detection range of the imaging device 11. If it is determined that the position of the other vehicle is at an edge of the detection range, the process proceeds to step S6; otherwise, it proceeds to step S4.
In step S4, the driving states of the host vehicle V1 and the other vehicle after a predetermined time are predicted by the function of the determination unit 22, and in the following step S5, it is determined whether the position of the other vehicle will enter an edge of the detection range of the imaging device 11 within the predetermined time. If it is determined that it will not, the process proceeds to step S7, where normal autonomous driving control is executed, and then proceeds to step S8. In contrast, if it is determined that it will, the process proceeds to step S6, where the driving of the host vehicle V1 is autonomously controlled, by the function of the control unit 23, so that the position of the other vehicle is not included in the edge of the detection range. The process then proceeds to step S8.
In step S8, it is determined by the function of the support unit 20 whether the host vehicle V1 has reached the destination. If it is determined that the destination has been reached, execution of the routine ends and the display device 18 is used to prompt the driver of the host vehicle V1 to drive manually. In contrast, if it is determined that the host vehicle V1 has not reached the destination, the process returns to step S1. Note that manual driving means that the driving assistance device 19 does not perform autonomous driving control of the driving operation and the travel of the vehicle is controlled by the driver's operation.
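The branching of one cycle of the FIG. 7 flowchart (steps S1 to S7) can be summarized as a single decision function. This is a simplified sketch; the function and return-value names are our own, not from the embodiment, and the per-cycle inputs are assumed to come from the detection steps.

```python
def fig7_main_cycle(side_vehicle_present: bool,
                    at_edge: bool,
                    will_enter_edge: bool) -> str:
    """One control cycle of the FIG. 7 flow (simplified sketch).

    side_vehicle_present: S2 result (vehicle detected to the side in S1).
    at_edge:              S3 result (other vehicle at a detection-range edge).
    will_enter_edge:      S4-S5 result (predicted to enter an edge in time)."""
    if not side_vehicle_present:
        return "normal_autonomous_control"   # S2 no -> S7
    if at_edge:
        return "edge_avoidance_control"      # S3 yes -> S6
    if will_enter_edge:
        return "edge_avoidance_control"      # S5 yes -> S6
    return "normal_autonomous_control"       # S5 no -> S7
```

Either branch then proceeds to the destination check (step S8), which repeats the cycle until the destination is reached.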
Next, an example of the subroutine of step S6 in FIG. 7 will be described with reference to FIG. 8.
First, in step S11 of FIG. 8, the speed of the host vehicle V1 is compared with the speed of the other vehicle, and in the following step S12, it is determined whether the host vehicle V1 is faster than the other vehicle. If so, the process proceeds to step S13, where the vehicle speed control device 171 is used to accelerate the host vehicle V1 or to maintain its speed. In contrast, if the host vehicle V1 is not faster than the other vehicle (that is, its speed is equal to or lower than that of the other vehicle), the process proceeds to step S14, where it is determined whether the two speeds are the same.
If the speeds of the host vehicle V1 and the other vehicle differ (that is, the host vehicle V1 is slower than the other vehicle), the process proceeds to step S15, where the vehicle speed control device 171 is used to decelerate the host vehicle V1. In contrast, if the two speeds are the same, the process proceeds to step S16, where it is determined whether the position of the other vehicle is at an edge of the detection range of the imaging device 11. If it is, the process proceeds to step S17, where the vehicle speed control device 171 is used to accelerate or decelerate the host vehicle V1; if it is not, the process proceeds to step S18, where the speed of the host vehicle V1 is maintained. Note that steps S16 and S18 are not essential and may be provided as necessary.
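The FIG. 8 subroutine (steps S11 to S18) reduces to a speed comparison plus an edge check in the equal-speed case. A minimal sketch; the function and return-value names are hypothetical:

```python
def fig8_speed_subroutine(host_speed: float,
                          other_speed: float,
                          other_at_edge: bool) -> str:
    """One pass through the FIG. 8 subroutine (simplified sketch)."""
    if host_speed > other_speed:
        return "accelerate_or_maintain"          # S12 yes -> S13
    if host_speed == other_speed:                # S14 yes
        if other_at_edge:
            return "accelerate_or_decelerate"    # S16 yes -> S17
        return "maintain"                        # S16 no -> S18
    return "decelerate"                          # S14 no -> S15
```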
Next, another example of the subroutine of step S6 in FIG. 7 will be described with reference to FIG. 9.
First, in step S21 of FIG. 9, it is determined whether the host vehicle V1 will change lanes to the adjacent lane. If it is determined that it will not, the process proceeds to step S7 of FIG. 7, where normal autonomous driving control is executed. In contrast, if it is determined that the host vehicle V1 will change lanes to the adjacent lane, the process proceeds to step S22, where other vehicles traveling in the adjacent lane and the second adjacent lane are detected using the imaging device 11 by the function of the recognition unit 21. In step S23, it is determined whether another vehicle is traveling in the adjacent lane. If so, the process proceeds to step S24, where the lane change to the adjacent lane is not executed and autonomous control is performed by the function of the control unit 23 to, for example, maintain lane keeping control. If no other vehicle is traveling in the adjacent lane, the process proceeds to step S25.
In step S25, it is determined whether another vehicle is traveling in the second adjacent lane. If not, the process proceeds to step S27, where the lane change to the adjacent lane is executed by the function of the control unit 23. In contrast, if another vehicle is traveling in the second adjacent lane, the process proceeds to step S26, where it is determined whether the position of that vehicle will enter an edge of the detection range of the imaging device 11 within a predetermined time. If it is determined that it will, the process proceeds to step S24; if it is determined that it will not, the process proceeds to step S27.
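The lane-change decision of FIG. 9 (steps S21 to S27) can be sketched as follows; the function and return-value names are our own, not from the embodiment:

```python
def fig9_lane_change_subroutine(wants_lane_change: bool,
                                adjacent_occupied: bool,
                                second_adjacent_occupied: bool,
                                will_enter_edge: bool) -> str:
    """One pass through the FIG. 9 subroutine (simplified sketch)."""
    if not wants_lane_change:
        return "normal_autonomous_control"   # S21 no -> S7
    if adjacent_occupied:
        return "keep_lane"                   # S23 yes -> S24
    if not second_adjacent_occupied:
        return "change_lane"                 # S25 no -> S27
    if will_enter_edge:
        return "keep_lane"                   # S26 yes -> S24
    return "change_lane"                     # S26 no -> S27
```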
Next, yet another example of the subroutine of step S6 in FIG. 7 will be described with reference to FIG. 10.
First, in step S31 of FIG. 10, the preceding vehicle is detected using the imaging device 11 by the function of the recognition unit 21, and in the following step S32, it is determined by the function of the determination unit 22 whether the host vehicle V1 will overtake the preceding vehicle. If it is determined that it will not, the process proceeds to step S7 of FIG. 7, where normal autonomous driving control is executed. In contrast, if it is determined that the host vehicle V1 will overtake the preceding vehicle, the process proceeds to step S33, where other vehicles traveling in the adjacent lane and the second adjacent lane are detected using the imaging device 11 by the function of the recognition unit 21. In step S34, it is determined whether another vehicle is traveling in the adjacent lane. If so, the process proceeds to step S35, where overtaking of the preceding vehicle is not executed and the host vehicle instead follows the preceding vehicle by the function of the control unit 23, for example by following control. If no other vehicle is traveling in the adjacent lane, the process proceeds to step S36.
In step S36, it is determined whether or not another vehicle is traveling in the second adjacent lane. If it is determined that no other vehicle is traveling in the second adjacent lane, the process proceeds to step S39, where the function of the control unit 23 executes the overtaking of the preceding vehicle. If, on the other hand, it is determined that another vehicle is traveling in the second adjacent lane, the process proceeds to step S37, where it is determined whether or not the position of that other vehicle will enter the edge of the detection range of the imaging device 11 within a predetermined time. If it is determined that the position of the other vehicle traveling in the second adjacent lane will enter the edge of the detection range within the predetermined time, the process proceeds to step S38, where the vehicle speed control device 171 decelerates the host vehicle V1, and then to step S39. If it is determined that the position of the other vehicle will not enter the edge within the predetermined time, the process proceeds directly to step S39.
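The overtaking branch of FIG. 10 (steps S34 to S39) can be sketched in the same style. Again, the names and the string return values are hypothetical labels for the control actions described above, not part of the patent.

```python
def decide_overtake(adjacent_vehicles, second_adjacent_vehicles,
                    time_to_edge, t_threshold):
    """Sketch of the overtaking branch in FIG. 10 (steps S34-S39).

    Returns one of:
      "follow"              - S35: stay behind the preceding vehicle
      "overtake"            - S39: overtake at the current speed
      "decelerate_overtake" - S38 then S39: slow down, then overtake
    """
    if adjacent_vehicles:                     # S34: adjacent lane occupied
        return "follow"                       # S35: follow-up control
    if not second_adjacent_vehicles:          # S36: second adjacent clear
        return "overtake"                     # S39
    for vehicle in second_adjacent_vehicles:  # S37: edge-entry check
        if time_to_edge(vehicle) <= t_threshold:
            return "decelerate_overtake"      # S38 -> S39
    return "overtake"                         # S39
```

Note that, unlike the lane-change subroutine above, an imminent edge entry here does not cancel the maneuver: the host vehicle first decelerates and then overtakes.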
The driving assistance device 19 and driving assistance method according to the present invention can be used in any of the following cases: autonomous control of only the vehicle's driving speed, autonomous control of only the vehicle's steering operation, and autonomous control of both the driving speed and the steering operation. Furthermore, the driving assistance device 19 and driving assistance method according to the present invention can be used not only for autonomous driving control but also to assist the driver's driving operations during manual driving.
[Embodiments of the invention]
As described above, according to the present embodiment, there is provided a driving assistance method executed by a processor, in which the processor detects another vehicle traveling beside the host vehicle V1 using the imaging device 11, determines whether the position of the other vehicle is at the edge of the detection range of the imaging device 11 or will enter the edge within a predetermined time, and, if it determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, autonomously controls the traveling of the host vehicle V1 so that the position of the other vehicle is not included in the edge. This embodiment is referred to as embodiment (1). This makes it possible to suppress the influence that erroneous recognition of the traveling state of the other vehicle has on the traveling state of the host vehicle V1.
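Embodiment (1) hinges on deciding whether a detected vehicle lies in the edge of the detection range. In image coordinates this can be sketched as a simple bounding-box test; the 10 % edge margin below is an illustrative assumption, since the patent does not specify how wide the edge region is.

```python
def in_edge_region(bbox, image_width, margin_frac=0.1):
    """Return True if a detected vehicle's bounding box overlaps the
    lateral edge region of the camera frame.

    bbox: (x_left, x_right) of the detection, in pixels.
    image_width: width of the camera frame, in pixels.
    margin_frac: fraction of the frame width treated as "edge"
        (an illustrative assumption, not a value from the patent).
    """
    x_left, x_right = bbox
    margin = image_width * margin_frac
    # The box is "at the edge" if it reaches into either lateral margin.
    return x_left < margin or x_right > image_width - margin
```

A detection well inside the frame returns False; one touching either lateral margin returns True and would trigger the edge avoidance control.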
Furthermore, according to the driving assistance method of this embodiment, the processor may predict the course of the other vehicle based on the traveling state of the other vehicle, determine, based on that course, whether the position of the other vehicle will enter the edge within the predetermined time, and, if it determines that the position of the other vehicle will enter the edge within the predetermined time, autonomously control the traveling of the host vehicle V1 so that the speed difference between the host vehicle V1 and the other vehicle increases, or set the course of the host vehicle V1 based on the course of the other vehicle. This embodiment is referred to as embodiment (2). In this way, the edge avoidance control to be executed can be set in advance and executed smoothly.
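Under embodiment (2), predicting whether the other vehicle will enter the edge within the predetermined time reduces, in the simplest case, to extrapolating its course. The sketch below assumes a straight-line, constant-velocity course in the host vehicle's frame and treats the detection-range edge as a single longitudinal coordinate; both are simplifying assumptions, and the names are hypothetical.

```python
import math

def time_until_edge(rel_pos, rel_vel, edge_pos):
    """Predict when another vehicle reaches the edge of the detection
    range, assuming a constant-velocity course in the host frame.

    rel_pos: the other vehicle's longitudinal position relative to the
        host vehicle, in meters (negative = behind).
    rel_vel: rate of change of rel_pos, in m/s.
    edge_pos: longitudinal coordinate of the detection-range edge, in m.
    Returns the time in seconds until rel_pos crosses edge_pos, or
    math.inf if the predicted course never reaches the edge.
    """
    gap = edge_pos - rel_pos
    if gap == 0.0:
        return 0.0                 # already at the edge
    if rel_vel == 0.0 or gap * rel_vel < 0.0:
        return math.inf            # stationary relative motion, or moving away
    return gap / rel_vel
```

Comparing the returned time against the predetermined time yields the step-S26/S37-style decision; a result of 0.0 corresponds to "already at the edge".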
Furthermore, according to the driving assistance method of this embodiment, when the processor determines that the position of the other vehicle traveling behind the host vehicle V1, among the other vehicles traveling beside the host vehicle V1, is at the edge or will enter the edge within the predetermined time, the processor may autonomously control the traveling of the host vehicle V1 so that the position of that other vehicle is not included in the edge behind the host vehicle V1. This embodiment is referred to as embodiment (3). In this way, edge avoidance control can be executed even when an obstacle behind the host vehicle V1 is detected by an imaging device 11 with a short focal length.
Furthermore, according to the driving assistance method of this embodiment, when the processor determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, the processor may compare the vehicle speed of the host vehicle V1 with that of the other vehicle, accelerate the host vehicle V1 when its speed is faster than that of the other vehicle, decelerate the host vehicle V1 when its speed is slower than that of the other vehicle, and accelerate or decelerate the host vehicle V1 when the two speeds are the same. This embodiment is referred to as embodiment (4). In this way, edge avoidance control suited to the driving scene can be executed.
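The speed policy of embodiment (4) can be written as a three-way comparison. The tie-breaking `prefer` argument is an illustrative device for the "accelerate or decelerate" case, which the patent leaves open; the function name is hypothetical.

```python
def edge_avoidance_action(host_speed, other_speed, prefer="accelerate"):
    """Choose the speed adjustment of embodiment (4) once the other
    vehicle is at, or about to enter, the edge of the detection range.

    host_speed, other_speed: vehicle speeds in consistent units.
    prefer: action taken when the speeds are equal (either is allowed).
    """
    if host_speed > other_speed:
        return "accelerate"   # widen the gap so the other vehicle drops out of the edge
    if host_speed < other_speed:
        return "decelerate"   # let the faster vehicle clear the edge sooner
    return prefer             # equal speeds: either choice breaks the tie
```

In both unequal-speed branches the action increases the speed difference, consistent with embodiment (2)'s goal of moving the other vehicle out of the edge region quickly.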
Furthermore, according to the driving assistance method of this embodiment, when the processor determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, the processor may compare the vehicle speed of the host vehicle V1 with that of the other vehicle, maintain the speed of the host vehicle V1 when its speed is faster than that of the other vehicle, and decelerate the host vehicle V1 when its speed is slower than that of the other vehicle. When the two speeds are the same, the processor may determine whether the position of the other vehicle is at the edge, accelerate or decelerate the host vehicle V1 when it determines that the position of the other vehicle is at the edge, and maintain the speed of the host vehicle V1 when it determines that the position of the other vehicle is not at the edge. This embodiment is referred to as embodiment (5). In this way, edge avoidance control suited to the driving scene can be executed.
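Embodiment (5) differs from embodiment (4) only in the faster-than and equal-speed branches, which a variant of the same sketch makes explicit. As before, the names are hypothetical and the tie-break for the "accelerate or decelerate" case is an assumption.

```python
def edge_avoidance_action_v2(host_speed, other_speed, other_at_edge,
                             prefer="decelerate"):
    """Variant speed policy of embodiment (5).

    other_at_edge: True if the other vehicle is already at the edge
        (rather than merely predicted to enter it within the
        predetermined time).
    """
    if host_speed > other_speed:
        return "maintain"     # the gap is already opening; no action needed
    if host_speed < other_speed:
        return "decelerate"   # let the faster vehicle clear the edge
    if other_at_edge:         # equal speeds and already at the edge
        return prefer         # accelerate or decelerate to break the tie
    return "maintain"         # equal speeds, edge entry only predicted
```

Compared with embodiment (4), this variant avoids speed changes whenever the relative motion will resolve the edge condition on its own.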
Furthermore, according to the driving assistance method of this embodiment, when the host vehicle V1 changes lanes from the host lane in which it is traveling to an adjacent lane of the host lane, the processor may determine, from the detection result of the other vehicle, whether another vehicle is present in the adjacent lane and in the second adjacent lane adjacent to the adjacent lane. If it determines that no other vehicle is present in the adjacent lane but another vehicle is present in the second adjacent lane, the processor determines whether the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time, and, if it determines that the position of that vehicle is at the edge or will enter the edge within the predetermined time, the processor need not perform lane-change assistance by autonomous driving control. This embodiment is referred to as embodiment (6). This makes it possible to avoid a driving scene in which edge avoidance control is executed while a lane change is in progress.
Furthermore, according to the driving assistance method of this embodiment, when the host vehicle V1 overtakes a vehicle preceding it, the processor may determine, from the detection result of the other vehicle, whether another vehicle is present in the adjacent lane of the host lane in which the host vehicle V1 is traveling and in the second adjacent lane adjacent to that adjacent lane. If it determines that no other vehicle is present in the adjacent lane but another vehicle is present in the second adjacent lane, the processor determines whether the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time. When it determines that the position of that vehicle is not at the edge and will not enter the edge within the predetermined time, the processor may cause the host vehicle V1 to change lanes from the host lane to the adjacent lane by autonomous driving control in order to overtake the preceding vehicle; when it determines that the position of that vehicle is at the edge or will enter the edge within the predetermined time, the processor may, by the autonomous driving control, decelerate the host vehicle V1 and cause it to change lanes from the host lane to the adjacent lane. This embodiment is referred to as embodiment (7). This makes it possible to overtake a preceding vehicle while avoiding a driving scene in which edge avoidance control is executed during the lane change.
Furthermore, according to this embodiment, there is provided a driving assistance device 19 that includes a recognition unit 21 that detects another vehicle traveling beside the host vehicle V1 using an imaging device 11, a determination unit 22 that determines whether the position of the other vehicle is at the edge of the detection range of the imaging device 11 or will enter the edge within a predetermined time, and a control unit 23 that autonomously controls the traveling of the host vehicle V1 so that the position of the other vehicle is not included in the edge when the determination unit 22 determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time. This embodiment is referred to as embodiment (8). This makes it possible to suppress the influence that erroneous recognition of the traveling state of the other vehicle has on the traveling state of the host vehicle V1.
The position of the other vehicle may be taken to mean the other vehicle as a whole, or a specific position along the overall length (longitudinal direction) of the other vehicle (for example, the center of the vehicle or a specific part). The same reference may always be used for the position of the other vehicle, or the reference may be changed for each situation (driving scene). For example, specific parts such as the headlights, taillights, or the front and/or rear bumpers of the other vehicle may be used as the reference. Furthermore, for another vehicle approaching the host vehicle V1 from behind, a specific part of the front of that vehicle (the front of the vehicle body) may be used as the reference, while for another vehicle traveling ahead of the host vehicle V1, a specific part of the rear of that vehicle (the rear of the vehicle body) may be used as the reference.
[Combination of embodiments]
According to the driving assistance method of this embodiment, embodiment (1) may be combined with embodiment (2), with embodiment (3), or with embodiments (2) and (3). Likewise, embodiment (1) may be combined with embodiment (4), with embodiments (2) and (4), with embodiments (3) and (4), or with embodiments (2), (3), and (4).
The same applies to the driving assistance device 19 of this embodiment: embodiment (8) may be combined with embodiment (2), with embodiment (3), or with embodiments (2) and (3). Likewise, embodiment (8) may be combined with embodiment (4), with embodiments (2) and (4), with embodiments (3) and (4), or with embodiments (2), (3), and (4).
According to the driving assistance method of this embodiment, embodiment (1) may be combined with embodiment (5), with embodiments (2) and (5), with embodiments (3) and (5), or with embodiments (2), (3), and (5). Similarly, according to the driving assistance device 19 of this embodiment, embodiment (8) may be combined with embodiment (5), with embodiments (2) and (5), with embodiments (3) and (5), or with embodiments (2), (3), and (5).
According to the driving assistance method of this embodiment, embodiment (1) may be combined with embodiment (6); with embodiments (2) and (6); with embodiments (3) and (6); with embodiments (4) and (6); or with embodiments (5) and (6). Embodiment (1) may also be combined with embodiments (2), (3), and (6); with embodiments (2), (4), and (6); with embodiments (2), (5), and (6); with embodiments (3), (4), and (6); or with embodiments (3), (5), and (6). Furthermore, embodiment (1) may be combined with embodiments (2), (3), (4), and (6); with embodiments (2), (3), (5), and (6); with embodiments (3), (4), (5), and (6); or with embodiments (2) to (6).
The same applies to the driving assistance device 19 of this embodiment: embodiment (8) may be combined with embodiment (6); with embodiments (2) and (6); with embodiments (3) and (6); with embodiments (4) and (6); or with embodiments (5) and (6). Embodiment (8) may also be combined with embodiments (2), (3), and (6); with embodiments (2), (4), and (6); with embodiments (2), (5), and (6); with embodiments (3), (4), and (6); or with embodiments (3), (5), and (6). Furthermore, embodiment (8) may be combined with embodiments (2), (3), (4), and (6); with embodiments (2), (3), (5), and (6); with embodiments (3), (4), (5), and (6); or with embodiments (2) to (6).
According to the driving assistance method of this embodiment, embodiment (1) may be combined with embodiment (7); with embodiments (2) and (7); with embodiments (3) and (7); with embodiments (4) and (7); or with embodiments (5) and (7). Embodiment (1) may also be combined with embodiments (2), (3), and (7); with embodiments (2), (4), and (7); with embodiments (2), (5), and (7); with embodiments (3), (4), and (7); or with embodiments (3), (5), and (7). Furthermore, embodiment (1) may be combined with embodiments (2), (3), (4), and (7); with embodiments (2), (3), (5), and (7); with embodiments (3), (4), (5), and (7); or with embodiments (2) to (5) and (7).
The same applies to the driving assistance device 19 of this embodiment: embodiment (8) may be combined with embodiment (7); with embodiments (2) and (7); with embodiments (3) and (7); with embodiments (4) and (7); or with embodiments (5) and (7). Embodiment (8) may also be combined with embodiments (2), (3), and (7); with embodiments (2), (4), and (7); with embodiments (2), (5), and (7); with embodiments (3), (4), and (7); or with embodiments (3), (5), and (7). Furthermore, embodiment (8) may be combined with embodiments (2), (3), (4), and (7); with embodiments (2), (3), (5), and (7); with embodiments (3), (4), (5), and (7); or with embodiments (2) to (5) and (7).
REFERENCE SIGNS LIST
10 Driving assistance system
11 Imaging device
12 Distance measuring device
13 Host vehicle state detection device
14 Map information
15 Host vehicle position detection device
16 Navigation device
17 Vehicle control device
171 Vehicle speed control device
172 Steering control device
18 Display device
19 Driving assistance device
191 CPU (processor)
192 ROM
193 RAM
20 Support unit
21 Recognition unit
22 Determination unit
23 Control unit
A1, A2, B1, B2, B3, B4 Detection ranges
C1, C2, C3, C4 Edges
L1, L2, L3 Lanes
P1, P2, P3, P4, P5, P6, P7, P8 Positions (host vehicle)
Q1, Q2, Q3, Q4, Q5, Q6 Positions (other vehicles)
T1, T2, T3, T4, T5 Travel trajectories (host vehicle)
U1, Ux, Uy Travel trajectories (other vehicles)
V1 Host vehicle
V2, V3, V4, V5, V6 Other vehicles
Claims (8)
1. A driving assistance method executed by a processor, wherein the processor:
detects, using an imaging device, another vehicle traveling beside a host vehicle;
determines whether the position of the other vehicle is at an edge of a detection range of the imaging device or will enter the edge within a predetermined time; and
when it is determined that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, autonomously controls the traveling of the host vehicle so that the position of the other vehicle is not included in the edge.
2. The driving assistance method according to claim 1, wherein the processor:
predicts a course of the other vehicle based on a traveling state of the other vehicle;
determines, based on the course of the other vehicle, whether the position of the other vehicle will enter the edge within the predetermined time; and
when it is determined that the position of the other vehicle will enter the edge within the predetermined time, autonomously controls the traveling of the host vehicle so that a speed difference between the host vehicle and the other vehicle increases, or sets a course of the host vehicle based on the course of the other vehicle.
3. The driving assistance method according to claim 1 or 2, wherein, when the processor determines that the position of the other vehicle traveling behind the host vehicle, among the other vehicles traveling beside the host vehicle, is at the edge or will enter the edge within the predetermined time, the processor autonomously controls the traveling of the host vehicle so that the position of the other vehicle traveling behind the host vehicle is not included in the edge behind the host vehicle.
4. The driving assistance method according to any one of claims 1 to 3, wherein the processor:
when it is determined that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, compares a vehicle speed of the host vehicle with a vehicle speed of the other vehicle;
accelerates the host vehicle when the vehicle speed of the host vehicle is faster than the vehicle speed of the other vehicle;
decelerates the host vehicle when the vehicle speed of the host vehicle is slower than the vehicle speed of the other vehicle; and
accelerates or decelerates the host vehicle when the vehicle speed of the host vehicle is the same as the vehicle speed of the other vehicle.
5. The driving assistance method according to any one of claims 1 to 3, wherein the processor:
when it is determined that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, compares a vehicle speed of the host vehicle with a vehicle speed of the other vehicle;
maintains the vehicle speed of the host vehicle when the vehicle speed of the host vehicle is faster than the vehicle speed of the other vehicle;
decelerates the host vehicle when the vehicle speed of the host vehicle is slower than the vehicle speed of the other vehicle;
when the vehicle speed of the host vehicle is the same as the vehicle speed of the other vehicle, determines whether the position of the other vehicle is at the edge;
accelerates or decelerates the host vehicle when it is determined that the position of the other vehicle is at the edge; and
maintains the vehicle speed of the host vehicle when it is determined that the position of the other vehicle is not at the edge.
6. The driving assistance method according to any one of claims 1 to 5, wherein the processor:
when the host vehicle changes lanes from a host lane in which the host vehicle is traveling to an adjacent lane of the host lane, determines, from a detection result of the other vehicle, whether the other vehicle is present in the adjacent lane and in a second adjacent lane adjacent to the adjacent lane;
when it is determined that the other vehicle is not present in the adjacent lane and the other vehicle is present in the second adjacent lane, determines whether the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time; and
when it is determined that the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time, does not perform lane-change assistance by autonomous driving control.
7. The driving assistance method according to any one of claims 1 to 5, wherein the processor:
when the host vehicle overtakes a preceding vehicle of the host vehicle, determines, from a detection result of the other vehicle, whether the other vehicle is present in an adjacent lane of a host lane in which the host vehicle is traveling and in a second adjacent lane adjacent to the adjacent lane;
when it is determined that the other vehicle is not present in the adjacent lane and the other vehicle is present in the second adjacent lane, determines whether the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time;
when it is determined that the position of the other vehicle traveling in the second adjacent lane is not at the edge and will not enter the edge within the predetermined time, causes the host vehicle to change lanes from the host lane to the adjacent lane by autonomous driving control in order to overtake the preceding vehicle; and
when it is determined that the position of the other vehicle traveling in the second adjacent lane is at the edge or will enter the edge within the predetermined time, decelerates the host vehicle and causes the host vehicle to change lanes from the host lane to the adjacent lane by the autonomous driving control.
A driving assistance device comprising:
a recognition unit that detects another vehicle traveling beside a host vehicle by using an imaging device;
a determination unit that determines whether a position of the other vehicle is at an edge of a detection range of the imaging device or will enter the edge within a predetermined time; and
a control unit that, when the determination unit determines that the position of the other vehicle is at the edge or will enter the edge within the predetermined time, autonomously controls traveling of the host vehicle so that the position of the other vehicle is not included in the edge.
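The determination unit's test ("at the edge or will enter the edge within a predetermined time") can be illustrated with a constant-velocity prediction along one axis of the detection range. The function name `will_be_at_edge`, the one-dimensional range model, and the `edge_width` band are assumptions for illustration; the patent does not specify this implementation.

```python
def will_be_at_edge(position: float,
                    velocity: float,
                    range_min: float,
                    range_max: float,
                    edge_width: float,
                    horizon: float) -> bool:
    """Return True if `position` lies in an edge band of the detection range
    now, or is predicted (constant velocity) to enter that band within
    `horizon` seconds.  All quantities are along one axis of the range.
    """
    def in_edge(p: float) -> bool:
        inside = range_min <= p <= range_max
        near_edge = p <= range_min + edge_width or p >= range_max - edge_width
        return inside and near_edge

    predicted = position + velocity * horizon
    return in_edge(position) or in_edge(predicted)
```

When this returns True, the control unit would adjust the host vehicle's speed so the other vehicle moves back toward the interior of the detection range, matching the claim's requirement that its position not be included in the edge.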
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024548820A JP7743938B2 (en) | 2022-09-26 | 2022-09-26 | Driving assistance method and driving assistance device |
| PCT/JP2022/035674 WO2024069690A1 (en) | 2022-09-26 | 2022-09-26 | Driving assistance method and driving assistance device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/035674 WO2024069690A1 (en) | 2022-09-26 | 2022-09-26 | Driving assistance method and driving assistance device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024069690A1 true WO2024069690A1 (en) | 2024-04-04 |
Family
ID=90476606
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/035674 Ceased WO2024069690A1 (en) | 2022-09-26 | 2022-09-26 | Driving assistance method and driving assistance device |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7743938B2 (en) |
| WO (1) | WO2024069690A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021100827A (en) * | 2019-12-24 | 2021-07-08 | 日立Astemo株式会社 | Vehicle control system and vehicle control method |
| JP2021154935A (en) * | 2020-03-27 | 2021-10-07 | パナソニックIpマネジメント株式会社 | Vehicle simulation system, vehicle simulation method and computer program |
2022
- 2022-09-26 WO PCT/JP2022/035674 patent/WO2024069690A1/en not_active Ceased
- 2022-09-26 JP JP2024548820A patent/JP7743938B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| JP7743938B2 (en) | 2025-09-25 |
| JPWO2024069690A1 (en) | 2024-04-04 |
Similar Documents
| Publication | Title |
|---|---|
| US10676093B2 (en) | Vehicle control system, vehicle control method, and storage medium |
| US10643474B2 (en) | Vehicle control device, vehicle control method, and recording medium |
| JP6592852B2 (en) | Vehicle control device, vehicle control method, and program |
| JP6344695B2 (en) | Vehicle control device, vehicle control method, and vehicle control program |
| US20190071075A1 (en) | Vehicle control system, vehicle control method, and vehicle control program |
| US20190315348A1 (en) | Vehicle control device, vehicle control method, and storage medium |
| JP6843819B2 (en) | Traffic guide recognition device, traffic guide recognition method, and program |
| JP2019156174A (en) | Vehicle control device, vehicle, vehicle control method, and program |
| WO2018216194A1 (en) | Vehicle control system and vehicle control method |
| JPWO2017158731A1 (en) | Vehicle control system, vehicle control method, and vehicle control program |
| JP6600889B2 (en) | Vehicle control device, vehicle control method, and program |
| US10854083B2 (en) | Vehicle control device, vehicle control method, and storage medium |
| JP7377822B2 (en) | Driving support method and driving support device |
| WO2023089837A1 (en) | Travel assistance method and travel assistance device for vehicle |
| US20200211396A1 (en) | Notification device and vehicle control device |
| WO2018211645A1 (en) | Driving assistance method and driving assistance apparatus |
| WO2023067793A1 (en) | Driving assistance method and driving assistance device |
| JP6759611B2 (en) | Vehicle control system |
| JP7743938B2 (en) | Driving assistance method and driving assistance device |
| JP2023169524A (en) | Driving support method and driving support device for vehicle |
| JP2023167861A (en) | Driving support method and driving support device for vehicle |
| JP7782710B2 (en) | Driving assistance method and driving assistance device |
| JP7532242B2 (en) | Driving support method and driving support device |
| JP2022018853A (en) | Mobile control device |
| JP7768360B2 (en) | Information providing device and information providing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22960755; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024548820; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22960755; Country of ref document: EP; Kind code of ref document: A1 |