WO2017061183A1 - Human interface - Google Patents
Human interface
- Publication number
- WO2017061183A1 (application PCT/JP2016/074806)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- finger
- information
- display
- driver
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Definitions
- the present invention relates to a human interface, and more particularly to a human interface used in a vehicle.
- a pre-crash safety system (automatic braking system) that detects a pedestrian or vehicle ahead and, when there is a possibility of contact, alerts the driver and automatically brakes the vehicle
- advanced driver assistance systems (ADAS: Advanced Driver Assistance System)
- an adaptive cruise control system that detects a preceding vehicle, maintains the distance to it, and travels at a predetermined speed. Development of such systems is being advanced.
- Patent Document 1 discloses a system (human interface) that allows a driver to perform a query search operation while driving.
- the pointing of the driver's finger to the object (or item) is detected using the first input device, and the selection instruction of the object (or item) is received using the second input device.
- a first input device (for example, a camera)
- the display device (for example, a head-up display)
- the position of an object or the like on the display is determined.
- the second input device (for example, a microphone)
- the driver can perform a query search operation while driving.
- this system (human interface)
- in order to select an object or the like, it is necessary to perform both a pointing operation with a fingertip and an operation such as an utterance, so the driver's attention may be devoted to selecting (matching) an object rather than to driving, and the time required for selection (matching) may increase.
- the present invention has been made to solve the above-described problems, and an object thereof is to provide a human interface capable of matching and presenting information on an object existing around the vehicle that the driver desires, in a shorter time and by a simpler operation, without moving the line of sight to an operation switch or the like and without releasing both hands from the steering wheel.
- the human interface includes peripheral object information acquisition means for acquiring information on peripheral objects existing in the vicinity of the vehicle, and display means for superimposing landmark information indicating the peripheral objects acquired by the peripheral object information acquisition means on an actual scene around the vehicle.
- finger detection means for detecting the finger of the driver's hand holding the steering wheel of the vehicle
- a matching unit that matches, according to the position or movement of the finger detected by the finger detection means, the landmark information desired by the driver from the superimposed landmark information, and a display control unit that causes the display means to superimpose and display information on the peripheral object indicated by the landmark information matched by the matching unit.
- according to the human interface, information on peripheral objects existing around the vehicle is acquired, and landmark information indicating the acquired peripheral objects is superimposed on the actual scene around the vehicle. Therefore, the driver can recognize peripheral objects for which information can be provided while keeping the line of sight forward, for example. At that time, only the landmark information is superimposed, so the displayed information is prevented from becoming cluttered.
- the finger of the driver's hand holding the steering wheel of the vehicle is detected, and the landmark information desired by the driver is matched from the superimposed landmark information according to the detected finger's position (including presence/absence) or movement. Then, information on the peripheral object indicated by the matched landmark information is displayed in a superimposed manner.
- the driver can match the desired landmark information with a single motion (only the movement of a fingertip) while holding the steering wheel with both hands, and have information on the peripheral object indicated by the landmark information (detailed information) displayed in a superimposed manner.
- information on the object around the vehicle desired by the driver can thus be matched and presented in a shorter time by a simpler operation.
- the finger detection means detects a finger protruding forward from the steering wheel. In this way, the position or movement of the driver's finger can be detected reliably and accurately, and the landmark information desired by the driver can be matched accurately using the detection result (detection information).
- the finger detection means is disposed in an upper part of the vehicle interior and has a detection range directed downward from the disposed position. It is preferable that a finger protruding forward from the steering wheel enters this range and the steering wheel itself does not. In this way, when processing the detection information, it is not necessary to remove the steering wheel from the detection information, so the processing load can be reduced and the processing time shortened.
- the matching means divides the detection range into a plurality of regions and determines, according to the position or movement of the finger detected by the finger detection means, in which of the plurality of regions the driver's finger is present; it is preferable to match the landmark information corresponding to the region containing the finger. In this case, the landmark information desired by the driver can be determined simply by judging whether a finger has entered a region set within the detection range, so the processing load can be reduced and the processing time further shortened.
- it is preferable that the matching unit acquires the direction in which the finger is pointing based on the position or movement of the finger detected by the finger detection unit, and matches, from the acquired direction, the landmark information that the driver is pointing at with the finger.
- the direction in which the finger is pointing is acquired, and the landmark information that the driver is pointing at is matched from the acquired direction. Therefore, even when many peripheral objects are acquired and a large amount of landmark information is superimposed, the landmark information desired by the driver can be determined accurately from among them.
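The direction-based matching described above can be sketched as follows: a pointing bearing is derived from two detected finger positions (base and tip) and compared with the bearing of each superimposed landmark. This is an illustrative sketch, not the claimed implementation; the function name, the top-down coordinate frame, and the 15° tolerance are assumptions.

```python
import math

def match_icon_by_direction(finger_base, finger_tip, icon_bearings_deg,
                            tolerance_deg=15.0):
    """Return the icon whose bearing is closest to the finger's pointing
    direction, within a tolerance. Positions are (x, y) in a top-down
    frame: x to the right, y forward; bearing 0 deg = straight ahead."""
    dx = finger_tip[0] - finger_base[0]
    dy = finger_tip[1] - finger_base[1]
    pointing_deg = math.degrees(math.atan2(dx, dy))
    best, best_err = None, tolerance_deg
    for icon_id, bearing in icon_bearings_deg.items():
        err = abs(bearing - pointing_deg)
        if err <= best_err:
            best, best_err = icon_id, err
    return best
```

For example, a fingertip offset slightly to the right of the finger base yields a small positive bearing and would match an icon displayed just right of center, while returning no match when nothing lies within the tolerance.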
- the finger detection means is constituted by a distance image sensor.
- three-dimensional position information (three-dimensional coordinates) indicating the position of the finger can be obtained with high accuracy by the distance image sensor. Furthermore, from the three-dimensional position information, for example, the direction in which the driver is pointing can be acquired accurately.
- the landmark information is preferably an icon. In this way, the driver can easily recognize, by the icon superimposed on the actual scene around the vehicle, a peripheral object for which information can be provided.
- information on an object existing around the vehicle desired by the driver can be matched and presented in a shorter time by a simpler operation, without moving the line of sight to an operation switch or the like and without releasing both hands from the steering wheel.
- in the driving support system according to the first embodiment, a diagram showing an example in which icons of peripheral objects are superimposed on the real scene ahead of the vehicle.
- in the driving support system according to the first embodiment, a diagram showing an example in which detailed information on a peripheral object is superimposed on the real scene ahead of the vehicle.
- in the driving support system according to the second embodiment, a diagram showing an example in which icons of peripheral objects are superimposed on the real scene ahead of the vehicle.
- in the driving support system according to the second embodiment, a diagram showing an example in which detailed information on a peripheral object is superimposed on the real scene ahead of the vehicle.
- the human interface (HMI: Human Machine Interface)
- AR (Augmented Reality)
- a case will be described as an example in which the present invention is applied to a driving support system that presents information on objects existing in front of the vehicle (front objects) to the driver.
- the forward direction of the vehicle is “front (front side)”
- the backward direction of the vehicle is “rear (rear side)”
- the right side when facing the forward direction of the vehicle is “right (right side)”
- and the left side when facing the forward direction of the vehicle is "left (left side)".
- FIG. 1 is a block diagram showing the configuration of the driving support system 1.
- FIG. 2 is a diagram showing a detection range of the motion sensor 11 constituting the driving support system 1, wherein (a) is a front view and (b) is a side view.
- the driving support system 1 superimposes icons indicating peripheral objects for which information can be provided (corresponding to the landmark information described in the claims) on the actual scene around the vehicle, and superimposes detailed information of the peripheral object corresponding to the icon pointed at (selected) by the driver from among the superimposed icons.
- the driving support system 1 detects a gesture by a finger (for example, an index finger) of a hand holding the steering wheel by the driver and matches an icon pointed by the driver with the finger.
- the driving support system 1 determines the pointed icon by dividing the finger detection range into a plurality of regions and determining whether the finger has entered any one of the regions.
- an example will be described in which one icon is superimposed on each of the left side and the right side, and the detection range is divided into two regions, a left region and a right region.
- the driving support system 1 includes a peripheral object information acquisition device 10 (corresponding to the peripheral object information acquisition means described in the claims), a motion sensor 11 (corresponding to the finger detection means described in the claims), a head-up display 20 (corresponding to display means described in claims) and an electronic control unit (hereinafter referred to as “ECU (Electronic Control Unit)”) 31.
- the peripheral object information acquisition device 10 is a device that acquires information on peripheral objects existing around the vehicle (particularly in front of the vehicle).
- the peripheral object information acquisition device 10 may acquire information by detecting a peripheral object using a sensor mounted on the vehicle, or may acquire information on the peripheral object from the outside of the vehicle.
- the peripheral object information acquisition device 10 transmits the acquired peripheral object information (peripheral object information) to the ECU 31.
- the peripheral objects are, for example, automobiles, motorcycles, motorbikes, bicycles, pedestrians, animals, and the like.
- the peripheral object information is, for example, a position (for example, relative coordinates with reference to the host vehicle, or absolute coordinates such as latitude/longitude), a moving speed, a moving direction, a type, a vehicle type, and an engine displacement.
- a radar sensor such as a millimeter wave radar or a laser radar
- an image sensor using a camera or the like
- information such as the presence/absence of an object around the vehicle and, when an object exists, its position, size, type, and the like is acquired.
- from time-series data of the position of the object, the moving speed, moving direction, etc. of the object can also be acquired.
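The derivation of moving speed and moving direction from a position time series can be illustrated minimally. The sample format (time, x, y) and the endpoint-difference estimate are assumptions for illustration, not the publication's specified method.

```python
import math

def motion_from_track(samples):
    """Estimate moving speed (units/s) and heading (deg, 0 = +y/forward)
    from time-stamped positions [(t, x, y), ...], using the first and
    last samples of the track."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dt = t1 - t0
    dx, dy = x1 - x0, y1 - y0
    speed = math.hypot(dx, dy) / dt   # straight-line distance over elapsed time
    heading = math.degrees(math.atan2(dx, dy))
    return speed, heading
```

A real tracker would smooth over many samples rather than difference two endpoints, but the principle is the same.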
- with an image sensor, an object can be detected even when the surroundings are dark, such as at night, by using a near-infrared camera or an infrared camera.
- examples of the acquisition device from the outside of the vehicle include an inter-vehicle communication device and a road-vehicle communication device.
- an inter-vehicle communication device for example, information such as position, moving speed, moving direction, type, vehicle type, and displacement can be acquired from other vehicles within the communicable range of the inter-vehicle communication device.
- with a road-to-vehicle communication device, when the host vehicle enters the communicable range of a roadside device (traffic infrastructure), information such as the position, moving speed, moving direction, and type of objects acquired by the roadside device can be acquired.
- with information on peripheral objects obtained by vehicle-to-vehicle or road-to-vehicle communication, information on a peripheral object that exists in a location that is a blind spot for the vehicle (driver) can also be acquired.
- the motion sensor 11 is a sensor that detects the position (including presence/absence) and movement (hand gesture) of a finger put out forward (windshield side) of the steering wheel while the driver is holding the steering wheel (in particular, with both hands).
- as the motion sensor 11, it is preferable to use, for example, a TOF (Time Of Flight) distance image sensor using near-infrared light.
- this distance image sensor includes, for example, an LED that emits near-infrared light, an image sensor (for example, a CMOS or CCD image sensor) that receives the reflected light returned by an object (for example, a finger of a hand) and outputs a photoelectrically converted electrical signal for each pixel, and a TOF processing unit that measures the time from near-infrared light emission to light reception and calculates, for each pixel, the distance to the object from that time and the speed of light.
- the motion sensor 11 uses the distance information for each pixel to acquire position information (for example, three-dimensional coordinates relative to the motion sensor 11) of an object (a finger put out in front of the steering wheel) within the detection range. For example, the motion sensor 11 determines whether an object exists by judging, for each pixel, whether the distance is shorter than a predetermined value; if an object is determined to exist, its position information is acquired from the distances at the plurality of pixels in the region where the object exists and the two-dimensional positions of those pixels. The motion sensor 11 transmits detection information indicating the position information of the detected object to the ECU 31. Note that the motion sensor 11 may instead transmit the per-pixel distance information to the ECU 31, and the ECU 31 may perform the processing of determining whether an object exists and acquiring its position information from that data.
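The per-pixel processing just described (threshold each distance, then derive a position from the pixels belonging to the object) can be sketched as follows. This is a simplified stand-in for the TOF processing: the centroid-based position, the data layout, and the return format are assumptions.

```python
def detect_object(depth_map, max_range):
    """Find pixels in a per-pixel distance map (row-major list of rows,
    distances in metres) that are closer than max_range, and return the
    centroid pixel coordinates plus mean distance of those pixels as a
    coarse object position, or None if no object is detected."""
    hits = [(x, y, d)
            for y, row in enumerate(depth_map)
            for x, d in enumerate(row)
            if d < max_range]
    if not hits:
        return None
    n = len(hits)
    cx = sum(h[0] for h in hits) / n   # centroid column
    cy = sum(h[1] for h in hits) / n   # centroid row
    cd = sum(h[2] for h in hits) / n   # mean distance
    return cx, cy, cd
```

From the centroid pixel and mean distance, a full 3D coordinate relative to the sensor could then be computed using the sensor's optics (omitted here).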
- the motion sensor 11 is disposed at a predetermined location on the driver's seat side in the upper part of the passenger compartment, forward of the steering wheel (for example, at a predetermined location at the front end of the ceiling on the driver's side, or at the upper end of the windshield on the driver's side).
- the motion sensor 11 emits near-infrared light downward from the place where it is disposed, and has a detection range directed downward.
- the detection range R of the motion sensor 11 is preferably set to a range that the steering wheel W does not enter and that a finger FF put out forward from the steering wheel W enters.
- the detection range R is desirably set to a range that only a finger FF pointing in some forward direction (for example, the index finger) enters, and no other fingers, when the driver is holding the steering wheel W with both hands at the home position (for example, the 10:10 position while the vehicle is traveling straight). Further, it is desirable that the instrument panel arranged in front of the steering wheel W does not enter the detection range R.
- the detection range R is defined by, for example, the left viewing angle LA, the right viewing angle RA, the front viewing angle FA, the rear viewing angle BA, and the sensing distance L along the central axis A.
- the detection distance L is determined based on the vertical distance between the motion sensor 11 and the steering wheel W or the like.
- the left and right viewing angles LA and RA (for example, 30°) are determined based on the diameter of the steering wheel W (for example, 35 cm) and the detection distance L.
- the front and rear viewing angles FA and BA (for example, 15°) are determined based on the installation angle θ of the steering wheel W.
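The detection range bounded by the four viewing angles and the sensing distance can be checked geometrically as sketched below. The coordinate frame (central axis A pointing straight down from the sensor) and the default values, which echo the example figures in the text, are illustrative assumptions.

```python
import math

def in_detection_range(point, left_deg=30.0, right_deg=30.0,
                       front_deg=15.0, back_deg=15.0, max_dist=0.6):
    """Check whether a point (x right, y forward, z down, metres, sensor
    at origin, central axis A along +z) lies inside a pyramidal detection
    range bounded by the four viewing angles and the sensing distance."""
    x, y, z = point
    # Must be below the sensor and within the sensing distance L.
    if z <= 0 or math.sqrt(x * x + y * y + z * z) > max_dist:
        return False
    lateral = math.degrees(math.atan2(x, z))        # +: right of axis A
    longitudinal = math.degrees(math.atan2(y, z))   # +: forward of axis A
    return (-left_deg <= lateral <= right_deg
            and -back_deg <= longitudinal <= front_deg)
```

A point on the central axis is inside; a point displaced far laterally, or beyond the sensing distance, is outside.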
- the head-up display 20 is a display device that superimposes and displays various information on a real scene (landscape) in front of the vehicle.
- the head-up display 20 projects an image (real image) toward the windshield, and the image is reflected toward the driver by a half mirror provided in part of the inner surface of the windshield.
- a virtual image is thereby formed ahead of the windshield. Since the head-up display 20 is a well-known display device, description of its detailed configuration is omitted.
- when the head-up display 20 receives display control information from the ECU 31, it projects a predetermined image (for example, an icon indicating a peripheral object, or detailed information of the peripheral object) at the predetermined display position on the windshield indicated by the display control information.
- the ECU 31 is a control unit that comprehensively controls the driving support system 1.
- the ECU 31 includes, for example, a microprocessor that performs calculations, a ROM that stores programs for causing the microprocessor to execute each process, a RAM that stores various data such as calculation results, a backup RAM that retains its stored contents, and an input/output I/F.
- a program stored in the ROM or the like is executed by the microprocessor, whereby the functions of a matching unit 31a (corresponding to the matching means described in the claims) and a display control unit 31b (corresponding to the display control means described in the claims) are realized.
- the matching unit 31a matches the icon pointed at by the driver with the finger FF from the left icon and the right icon superimposed by the head-up display 20. Specifically, the matching unit 31a divides the detection range R of the motion sensor 11 into the left side and the right side of the central axis A, setting a left region and a right region within the detection range R. When the left icon and the right icon are superimposed, the matching unit 31a uses the position information of the object detected by the motion sensor 11 to determine whether the driver's finger FF is in the left region or the right region.
- the matching unit 31a determines that the driver's finger FF is in a region when the finger FF has been continuously in that region for a predetermined time (for example, from a fraction of a second to several seconds).
- when the driver's finger FF is in the left region, the matching unit 31a determines that the driver is pointing at the left icon; when the driver's finger FF is in the right region, it determines that the driver is pointing at the right icon.
- in this example, the detection range R is divided into two regions, a left region and a right region, and an icon is matched corresponding to each region.
- the detection range R may also be divided into three or more regions, with icons matched to correspond to those regions; for example, it may be divided into three regions: a left region, an intermediate region, and a right region.
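The region-based matching with a dwell-time condition described above can be sketched as a small state machine. The region boundaries, the 1-second dwell time, and the class interface are assumptions for illustration; the sketch generalizes naturally from two regions to three or more.

```python
class RegionMatcher:
    """Divide a lateral detection span into N equal regions (one icon per
    region, left to right) and match an icon once the finger has stayed
    in the same region continuously for dwell_s seconds."""

    def __init__(self, icons, x_min=-0.2, x_max=0.2, dwell_s=1.0):
        self.icons = icons
        self.x_min, self.x_max = x_min, x_max
        self.dwell_s = dwell_s
        self._region = None   # region the finger is currently in
        self._since = None    # time the finger entered that region

    def update(self, t, finger_x):
        """Feed one detection sample (time, lateral finger position, or
        None if no finger detected); return a matched icon id or None."""
        if finger_x is None or not (self.x_min <= finger_x < self.x_max):
            self._region, self._since = None, None
            return None
        width = (self.x_max - self.x_min) / len(self.icons)
        region = int((finger_x - self.x_min) / width)
        if region != self._region:
            self._region, self._since = region, t   # entered a new region
            return None
        if t - self._since >= self.dwell_s:
            return self.icons[region]
        return None
```

With `RegionMatcher(["I1", "I2"])`, a finger held on the left half of the span for the dwell time matches "I1"; moving to the right half restarts the dwell timer before "I2" can match.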
- the display control unit 31b causes the head-up display 20 to superimpose an icon indicating each peripheral object for which information has been acquired by the peripheral object information acquisition device 10 (information-providable peripheral object).
- the display control unit 31b divides the display range of the head-up display 20 into a left side and a right side, and displays icons one by one on the left side and the right side.
- the display control unit 31b converts the position of each of the left and right peripheral objects from the coordinate system of the peripheral object information acquisition device 10 into a position in the coordinate system of the head-up display 20. Further, the display control unit 31b sets, for each of the left and right peripheral objects, a position near the peripheral object's position in the head-up display 20 coordinate system at which the icon is to be superimposed. Then, the display control unit 31b transmits to the head-up display 20 display control information indicating the video information of the icons, the positions of the superimposed display, and so on. As for the icon pattern, color, etc., icons that are easily visible to the driver are selected as appropriate.
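The coordinate conversion from the acquisition device's frame to a display position on the head-up display can be sketched with a simple pinhole projection. The projection model, display resolution, focal length, and the fixed icon offset are all illustrative assumptions, not the publication's specified transform.

```python
def world_to_hud(obj_xyz, hud_w=800, hud_h=480, focal_px=600.0):
    """Project a peripheral object's position (x right, y up, z forward,
    metres, relative to the driver's eye point) to a pixel position on
    the head-up display with a pinhole model, then offset the icon
    slightly above the object. Returns None if the object is behind
    the display plane."""
    x, y, z = obj_xyz
    if z <= 0:
        return None
    u = hud_w / 2 + focal_px * x / z    # horizontal pixel
    v = hud_h / 2 - focal_px * y / z    # vertical pixel (screen y grows down)
    icon_v = v - 30                     # draw the icon just above the object
    return round(u), round(icon_v)
```

An object straight ahead projects to the display center (minus the icon offset); one displaced to the right projects right of center, with the displacement shrinking as distance grows.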
- the icons shown as examples in FIGS. 3, 4, 6 and 7 are pictograms in which an exclamation mark is arranged inside a triangle.
- the priority order is preferably determined according to, for example, whether the peripheral object is a moving object or a stationary object, whether it exists on the traveling road, whether it is moving toward the traveling road, whether it exists at a location that is a blind spot, its moving speed, and the like.
- the display control unit 31b causes the head-up display 20 to superimpose the detailed information of the peripheral object indicated by the icon. Specifically, the display control unit 31b sets a position at which the detailed information is to be superimposed near the position of the peripheral object (or of the icon) in the coordinate system of the head-up display 20. Then, the display control unit 31b transmits to the head-up display 20 the video information of the detailed information (for example, video information including a character string) and display control information indicating the position of the superimposed display and the like.
- the detailed information to be superimposed is, for example, the type, the vehicle type, the displacement, the moving speed, the moving direction, and the like.
- when detailed information is displayed, the superimposed display of all icons may be stopped, or only the superimposed display of icons of peripheral objects whose detailed information is not displayed may be stopped.
- the display control unit 31b transmits display control information for displaying the icon in a superimposed manner to the head-up display 20.
- FIG. 3 is a diagram illustrating an example when the icons of the peripheral objects are superimposed and displayed on the actual scene in front of the vehicle in the driving support system 1.
- FIG. 4 is a diagram illustrating an example when the detailed information of the surrounding objects is superimposed and displayed on the actual scene ahead of the vehicle in the driving support system 1.
- the motorized bicycle O1 (FIG. 3) existing on the front side of the parked vehicle PV, which is parked on the left side ahead of the vehicle equipped with the driving support system 1 (only the helmet of the rider of the motorized bicycle O1 can be seen beyond the parked vehicle PV), and the animal (cat) O2 existing on the right side ahead of the vehicle are the peripheral objects in front of the vehicle.
- the peripheral object information acquisition device 10 acquires each piece of information about the peripheral objects O1 and O2, and transmits a peripheral object information signal indicating these pieces of information to the ECU 31.
- the ECU 31 converts the position of each peripheral object acquired by the peripheral object information acquisition device 10 into a position in the coordinate system of the head-up display 20, and sets an icon to be superimposed near the converted position.
- the display control information indicating the video information of the icon and the position of the superimposed display is transmitted to the head-up display 20.
- the head-up display 20 receives the display control information, the head-up display 20 projects an icon image at a position indicated by the display control information.
- as a result, the icon I1 is superimposed above the front side of the parked vehicle PV on the left ahead of the vehicle, and the icon I2 is superimposed near the cat O2 on the right.
- the driver looking forward can visually recognize these two icons I1 and I2 and easily recognize that there are surrounding objects that can provide information by the icons I1 and I2.
- the driver can visually recognize the peripheral object (cat O2) indicated by the icon I2, but cannot visually recognize the peripheral object indicated by the icon I1, which is in the blind spot of the parked vehicle PV.
- suppose that the driver wants to obtain the detailed information of the peripheral object indicated by the icon I1.
- the driver points the index finger FF of the left hand LH toward the icon I1 while holding the steering wheel W with both hands LH and RH and looking forward. At that time, the driver only has to put out the index finger FF so as to enter the region on the left side of the detection range R of the motion sensor 11.
- the motion sensor 11 detects the finger FF that has entered the detection range R, and transmits a detection signal indicating position information of the detected finger FF to the ECU 31.
- the ECU 31 determines, using the position information of the finger FF detected by the motion sensor 11, that the finger FF is in the left region of the detection range R, and determines that the driver is pointing at the left icon I1 by the operation with the finger FF. Then, the ECU 31 sets a position at which the detailed information of the peripheral object O1 is to be superimposed near the position of the icon I1, and transmits to the head-up display 20 display control information indicating the video information of the detailed information of the peripheral object O1, the superimposed display position, and so on. When the head-up display 20 receives the display control information, it projects an image of the detailed information of the peripheral object O1 at the position indicated by the display control information.
- the driver looking forward sees this detailed information DI. From the detailed information DI, the driver learns that there is a motorized bicycle on the front side of the parked vehicle PV and that it may come out onto the road. The driver can therefore perform a driving operation such as decelerating the vehicle, anticipating that the motorized bicycle may come out when overtaking the parked vehicle PV.
- information on surrounding objects existing around the vehicle is acquired, and icons indicating the acquired surrounding objects are displayed superimposed on the actual scene around the vehicle.
- the driver can recognize a peripheral object that can provide information while keeping the line of sight forward, for example.
- the finger FF of the driver's hand holding the steering wheel W is detected, and according to the detected position (detection region) of the finger FF, the icon the driver desires is matched from among the superimposed icons, and information on the surrounding object corresponding to the matched icon is displayed in a superimposed manner.
- the driver can thus, while holding the steering wheel W with both hands, match a desired icon with a single action (pointing) and have the information on the surrounding object indicated by that icon (the detailed information) superimposed and displayed.
- the driver can therefore be presented with information on a desired surrounding object in a shorter time, by a simpler operation, without moving the line of sight to an operation switch or the like and without releasing either hand from the steering wheel W.
- the motion sensor 11 is disposed in the upper part of the vehicle interior, and its detection range, directed downward from that position, is set within a range that the steering wheel W does not enter and that a finger FF extended forward of the steering wheel W does enter. It is therefore unnecessary to remove the steering wheel W from the detection information of the motion sensor 11, so the processing load of the ECU 31 can be reduced and the processing time shortened.
- since the motion sensor 11 is configured as a TOF (time-of-flight) distance image sensor, it can obtain three-dimensional position information indicating the position of the finger FF with high accuracy. From this three-dimensional position information, for example, the direction in which the driver is pointing with the finger FF can be acquired accurately.
- the detection range of the motion sensor 11 is divided into a plurality of (for example, two) areas, and the detection information of the motion sensor 11 is used only to determine which of the areas the driver's finger has entered. This reduces the processing load on the ECU 31 and shortens the processing time.
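The area-based determination described above can be sketched in a few lines. The following Python is an illustrative sketch only — the function and parameter names, and the equal-width split of the detection range, are assumptions for exposition, not the patent's implementation:

```python
# Hypothetical sketch of the first embodiment's area-based icon matching:
# the detection range is split into len(icons) equal-width regions, and the
# region the finger enters selects the icon with the same index.

def match_icon_by_region(finger_x, range_left, range_right, icons):
    """Return the icon whose region contains finger_x, or None if the
    finger is outside the detection range or no icons are shown."""
    if not icons or not (range_left <= finger_x <= range_right):
        return None
    width = (range_right - range_left) / len(icons)
    index = int((finger_x - range_left) // width)
    return icons[min(index, len(icons) - 1)]  # clamp the right edge
```

With two icons and a detection range of [0.0, 1.0], a finger in the left half selects the first icon and a finger in the right half selects the second, matching the two-area example in the text.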
- the driver can match icons with a rough pointing gesture.
- the driving support system 2 is different from the driving support system 1 according to the first embodiment in the matching method of the icon that the driver points with the finger.
- the driving support system 2 calculates the direction in which the finger is pointing and matches the icon desired by the driver from the pointing direction.
- in the first embodiment, the number of icons that can be superimposed is limited by the number of areas into which the detection range is divided; in the present embodiment, the number of superimposed icons is not limited.
- the driving support system 2 includes a peripheral object information acquisition device 10, a motion sensor 11, a head-up display 20, and an ECU 32. Since the peripheral object information acquisition device 10, the motion sensor 11, and the head-up display 20 are the same as those described in the first embodiment, description thereof will be omitted, and the ECU 32 will be described below.
- the ECU 32 is a control unit that comprehensively controls the driving support system 2.
- the ECU 32 has, for example, a microprocessor, a ROM, a RAM, a backup RAM, an input / output I / F, and the like, similar to the ECU 31 described above.
- a program stored in the ROM or the like is executed by the microprocessor, whereby the functions of a matching unit 32a (corresponding to the matching means described in the claims) and a display control unit 32b (corresponding to the display control means described in the claims) are realized.
- the matching unit 32a matches the icon indicated by the driver with the finger FF from among the icons superimposed on the head-up display 20. Specifically, when the icons are superimposed, the matching unit 32a calculates the direction in which the driver's finger FF is pointing using the position information of the object (finger FF) detected by the motion sensor 11. For example, the matching unit 32a obtains a virtual line extending from the steering wheel side toward the windshield side using a plurality of three-dimensional coordinate data points of the object, and takes the direction of that virtual line as the pointing direction.
- the matching unit 32a calculates, from the direction indicated by the finger FF, the position indicated by the finger FF in the coordinate system of the head-up display 20. Then, for each superimposed icon, the matching unit 32a compares the position pointed to by the finger FF with the position of the icon in the coordinate system of the head-up display 20, and determines which icon the finger FF is pointing at. For example, the matching unit 32a sets a predetermined determination area around the position of each icon, and when the position pointed to by the finger FF remains inside the determination area continuously for a predetermined time (for example, several tenths of a second to several seconds), it determines that the finger FF is pointing at that icon.
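As a hedged illustration of the virtual-line idea, the pointing direction can be estimated by fitting a line through the detected 3D finger points and intersecting it with the display plane. The axis convention (windshield toward negative y) and all names below are assumptions made for this sketch, not details taken from the patent:

```python
import numpy as np

def pointing_direction(points):
    """Fit a 3D line through detected finger points (e.g., knuckle to tip)
    and return (centroid, unit direction), oriented toward the windshield
    (assumed here to lie toward negative y)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Principal axis of the point cloud = dominant right singular vector.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    if direction[1] > 0:           # flip so the ray points at the windshield
        direction = -direction
    return centroid, direction

def intersect_display_plane(centroid, direction, plane_y):
    """Extend the pointing ray to the assumed HUD plane y = plane_y and
    return the 3D intersection point."""
    t = (plane_y - centroid[1]) / direction[1]
    return centroid + t * direction
```

The intersection point, converted into the head-up display's 2D coordinate system, would then be compared against each icon's position as the text describes.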
- the display control unit 32b causes the head-up display 20 to superimpose an icon indicating each peripheral object for which information has been acquired by the peripheral object information acquisition device 10. Specifically, for each such peripheral object, the display control unit 32b obtains the superimposed display position of the icon by the same process as the display control unit 31b according to the first embodiment, and transmits display control information indicating the video information of the icon and the superimposed display position to the head-up display 20.
- the display control unit 32b causes the head-up display 20 to superimpose a cursor indicating the position pointed to by the driver's finger FF. Specifically, the display control unit 32b sets the position indicated by the finger FF obtained by the matching unit 32a as the position where the cursor is superimposed (for example, the center position of the cursor or the tip position of the cursor), and transmits display control information indicating the video information of the cursor and the superimposed display position to the head-up display 20. A cursor design, color, and so on that are easy for the driver to see are selected as appropriate.
- the cursor shown as an example in FIG. 6 is an arrow. The superimposed display of the cursor is stopped once an icon is matched. Note that the cursor need not be superimposed; instead of displaying a cursor, for example, the icon pointed to with the finger FF may be highlighted (its display color changed).
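The dwell-time determination (the pointed position staying inside an icon's determination area continuously for a predetermined time) could be sketched as follows. The rectangular determination area, the 0.5-second default dwell, and the class and parameter names are illustrative assumptions, not the patent's specifics:

```python
class DwellMatcher:
    """Matches an icon once the pointed position stays inside its
    determination area for a continuous dwell time."""

    def __init__(self, dwell_seconds=0.5, half_size=20.0):
        self.dwell = dwell_seconds
        self.half = half_size        # half-width of the square determination area
        self._candidate = None       # icon currently under the pointed position
        self._since = None           # timestamp when it first became the candidate

    def update(self, pointed_xy, icons, now):
        """icons: dict of name -> (x, y) in HUD coordinates; now: seconds.
        Returns the matched icon name once the dwell time elapses, else None."""
        hit = None
        for name, (ix, iy) in icons.items():
            if abs(pointed_xy[0] - ix) <= self.half and abs(pointed_xy[1] - iy) <= self.half:
                hit = name
                break
        if hit != self._candidate:   # pointed position moved to a new icon (or off all icons)
            self._candidate, self._since = hit, now
            return None
        if hit is not None and now - self._since >= self.dwell:
            return hit
        return None
```

Calling `update` on every sensor frame, the matcher only fires after the cursor position has rested on one icon long enough, which suppresses matches from a finger merely sweeping past.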
- when one icon is matched by the matching unit 32a, the display control unit 32b causes the head-up display 20 to superimpose the detailed information of the peripheral object indicated by that icon. Specifically, the display control unit 32b obtains a position at which the detailed information is superimposed by the same process as the display control unit 31b described above, and transmits the video information of the detailed information, display control information indicating the superimposed display position, and the like to the head-up display 20.
- FIG. 6 is a diagram illustrating an example when the icons of the peripheral objects are superimposed and displayed on the actual scene in front of the vehicle in the driving support system 2.
- FIG. 7 is a diagram illustrating an example when the detailed information of the surrounding objects is superimposed and displayed on the actual scene ahead of the vehicle in the driving support system 2.
- the cat O2, the pedestrians O3 and O4 trying to cross a pedestrian crossing in front of the vehicle, the preceding vehicle O5 existing ahead of the vehicle, and the oncoming vehicle O6 are peripheral objects existing in front of the vehicle.
- the peripheral object information acquisition device 10 acquires each piece of information on these peripheral objects O1 to O6, and transmits a peripheral information signal indicating these pieces of information to the ECU 32.
- the ECU 32 converts the position of each peripheral object acquired by the peripheral object information acquisition device 10 into a position in the coordinate system of the head-up display 20, and superimposes an icon near the converted position.
- the display control information indicating the video information of the icon and the position of the superimposed display is transmitted to the head-up display 20.
- when the head-up display 20 receives the display control information, it projects an icon image at the position indicated by the display control information.
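The conversion from a peripheral object's vehicle-frame position to a position in the head-up display's coordinate system is not detailed in the text; a minimal pinhole-style sketch, in which every scale and offset parameter is an assumption, might look like:

```python
def world_to_hud(obj_xyz, focal=1.0, hud_scale=(400.0, 300.0), hud_center=(400.0, 240.0)):
    """Hypothetical mapping from a vehicle-frame object position
    (x lateral, y up, z forward, in metres) to HUD pixel coordinates,
    so an icon can be drawn near where the driver sees the object
    through the windshield."""
    x, y, z = obj_xyz
    if z <= 0:
        return None                      # object is not ahead of the projection plane
    u = hud_center[0] + hud_scale[0] * focal * (x / z)   # lateral offset shrinks with distance
    v = hud_center[1] - hud_scale[1] * focal * (y / z)   # screen v grows downward
    return (u, v)
```

A real system would calibrate this mapping against the driver's eye position and the combiner optics; the sketch only shows the perspective-division step implied by "converts the position ... into a position in the coordinate system of the head-up display".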
- the icon I1 is superimposed on the front side of the left parked vehicle PV in front of the vehicle
- the icon I2 is superimposed on the right side of the cat O2
- icons I3 and I4 are superimposed near the pedestrians O3 and O4, respectively
- icon I5 is superimposed and displayed near the preceding vehicle O5
- icon I6 is superimposed and displayed near the oncoming vehicle O6.
- the driver looking forward looks at these six icons I1 to I6.
- the driver can visually recognize the surrounding objects (the cat O2, the pedestrians O3 and O4, and the vehicles O5 and O6) indicated by the icons I2 to I6, but cannot see the peripheral object indicated by the icon I1, which is in the blind spot of the parked vehicle PV; the driver therefore wants detailed information on the peripheral object indicated by the icon I1.
- the driver points the index finger FF of the left hand LH toward the icon I1 while looking forward while holding the steering wheel W with both hands LH and RH.
- the motion sensor 11 detects the finger FF that has entered the detection range R, and transmits position information of the detected finger FF to the ECU 32.
- the ECU 32 calculates the direction pointed to by the finger FF using the position information of the finger FF detected by the motion sensor 11, and from that direction calculates the position pointed to by the finger FF in the coordinate system of the head-up display 20. Then, in order to superimpose a cursor at the position indicated by the finger FF, the ECU 32 transmits display control information indicating the video information of the cursor, the superimposed display position, and the like to the head-up display 20.
- when the head-up display 20 receives the display control information, it projects a cursor image at the position indicated by the display control information.
- the cursor C indicating the position where the driver is pointing with the finger FF is superimposed and displayed.
- instead, the icon among I1 to I6 pointed to by the finger FF may be highlighted (its display color changed).
- the driver looking forward looks at the cursor C and moves the finger FF while looking at the cursor C.
- the driver moves the finger FF so that the cursor C approaches the icon I1.
- when the cursor C comes close to or overlaps the icon I1, the ECU 32 matches the icon I1 that the driver is pointing at with the finger FF, based on the direction pointed to by the finger FF (the position pointed to by the finger FF) calculated from the position information of the finger FF detected by the motion sensor 11. The ECU 32 then transmits the video information of the detailed information of the peripheral object O1 indicated by the icon I1, display control information indicating the superimposed display position, and the like to the head-up display 20.
- the head-up display 20 projects an image of detailed information of the peripheral object O1 at a position indicated by the display control information. Thereby, as shown in FIG. 7, the detailed information DI is superimposed and displayed in the vicinity of the peripheral object O1.
- the icon I1 indicating the peripheral object O1, on which the detailed information DI is superimposed, continues to be displayed, while the superimposed display of the icons I2 to I6 indicating the peripheral objects O2 to O6 is stopped.
- as in the first embodiment described above, the driver looking forward can see this detailed information DI and learn that the motorized bicycle on the front side of the parked vehicle PV may come out onto the road.
- the direction of the finger FF is acquired, and the icon the driver is pointing at is matched from the acquired direction. Therefore, even when many peripheral objects are acquired and many icons are superimposed, the icon the driver desires can be determined accurately from among them. Further, according to the present embodiment, since the cursor is superimposed during the matching operation with the driver's finger FF, the position pointed to by the finger FF can easily be brought onto the desired icon.
- the number of peripheral objects (icons) that can be superimposed is not limited. Furthermore, since only the detailed information of the matched icon (the desired detailed information) is displayed, the superimposed image is prevented from becoming cluttered.
- the present invention is not limited to the above-described embodiments, and various modifications can be made.
- in the embodiments, the head-up display 20 is used for superimposed display on the actual scene, but another display device such as a head-mounted display may be used.
- the present invention can also be applied to the case of superimposing and displaying on the actual scene in other directions such as the rear of the vehicle.
- in the embodiments, the icons of peripheral objects present in front are superimposed on the front real scene, but in addition, icons of peripheral objects existing to the rear or to the side may also be superimposed.
- for example, a vehicle approaching from the rear or the side is detected by detection means that detects objects existing behind or beside the vehicle, an icon indicating that vehicle is superimposed, and when the icon is pointed at, detailed information on the rear or side vehicle is superimposed and displayed.
- alternatively, an image acquired by a camera or the like is displayed as a substitute for a door mirror; for example, the driver is notified by an icon only when a parallel running vehicle exists on the left or right, and when the icon is pointed at, information on the parallel running vehicle is superimposed and displayed.
- in the embodiments, an icon indicates that there is a peripheral object for which information is available, but other mark information, such as a rectangular frame surrounding the peripheral object, may be used to indicate that the peripheral object exists.
- in the embodiments, a motion sensor configured as a distance image sensor is shown as the finger detection means, but finger detection means configured with a three-dimensional camera, a radar sensor, or the like may also be applied.
- in the embodiments, a range that the steering wheel W does not enter is set as the detection range of the motion sensor 11, but a detection range that the steering wheel W enters may also be set. In this case, for example, it is preferable to mask the information of the steering wheel W out of the detection information of the motion sensor 11 and determine the icon (detailed information) desired by the driver using the masked detection information.
- in the embodiments, the motion sensor 11 (distance image sensor) is disposed in the upper part of the vehicle interior, above the driver's seat and forward of the steering wheel (for example, at the front end of the ceiling on the driver's seat side, or at the driver's seat side of the windshield).
- the motion sensor 11 may be arranged between the windshield and the dashboard, or in the housing of the meter.
- in the embodiments, icons are matched based on the three-dimensional position information of the finger obtained by the motion sensor 11, but the motion sensor 11 may instead detect the three-dimensional coordinates of the fingertip and perform coordinate matching by synchronizing those three-dimensional coordinates with the coordinates on the head-up display 20.
- the number of icons displayed in a superimposed manner is not limited.
- 10 Peripheral object information acquisition device (peripheral object information acquisition means)
- 11 Motion sensor (finger detection means)
- 20 Head-up display (display means)
- 31, 32 ECU
- 31a, 32a Matching unit (matching means)
- 31b, 32b Display control unit (display control means)
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Instrument Panels (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to a human interface of a driving support system (1), comprising: a peripheral object information acquisition device (10) that acquires information on a peripheral object existing around a vehicle; a head-up display (20) that superimposes an icon representing the peripheral object on the actual scene around the vehicle; a motion sensor (11) that detects the fingers of the driver's hands gripping the steering wheel; a matching unit (31a) that matches the icon desired by the driver with one of the items of mark information according to a detected finger position or movement; and a display control unit (31b) that causes the head-up display (20) to display, in a superimposed manner, the information on the peripheral object indicated by the matched icon.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017544407A JP6589991B2 (ja) | 2015-10-05 | 2016-08-25 | ヒューマンインターフェース |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015-197375 | 2015-10-05 | ||
| JP2015197375 | 2015-10-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017061183A1 true WO2017061183A1 (fr) | 2017-04-13 |
Family
ID=58487504
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/074806 Ceased WO2017061183A1 (fr) | 2015-10-05 | 2016-08-25 | Interface humaine |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6589991B2 (fr) |
| WO (1) | WO2017061183A1 (fr) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019131926A1 (fr) * | 2017-12-28 | 2019-07-04 | 株式会社デンソー | Dispositif de sélection de cible |
| JP2019194079A (ja) * | 2019-06-14 | 2019-11-07 | 株式会社Subaru | 車両の情報表示装置 |
| JP2021174054A (ja) * | 2020-04-20 | 2021-11-01 | 株式会社小松製作所 | 作業機械の障害物報知システムおよび作業機械の障害物報知方法 |
| CN114360321A (zh) * | 2021-11-09 | 2022-04-15 | 易显智能科技有限责任公司 | 机动车驾驶员手部动作感知系统、培训系统和培训方法 |
| JP2023076045A (ja) * | 2021-11-22 | 2023-06-01 | トヨタ自動車株式会社 | 画像表示装置 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006335112A (ja) * | 2005-05-31 | 2006-12-14 | Nissan Motor Co Ltd | コマンド入力装置 |
| JP2008296792A (ja) * | 2007-05-31 | 2008-12-11 | Toyota Motor Corp | 車両用走行制御装置、及び車両 |
| JP2013079930A (ja) * | 2012-02-07 | 2013-05-02 | Pioneer Electronic Corp | ヘッドアップディスプレイ、制御方法、及び表示装置 |
| JP2014179097A (ja) * | 2013-03-13 | 2014-09-25 | Honda Motor Co Ltd | ポインティングによる情報クエリ |
| WO2014196038A1 (fr) * | 2013-06-05 | 2014-12-11 | 三菱電機株式会社 | Dispositif de traitement d'informations par détection de ligne de visibilité et procédé de traitement d'informations |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001330450A (ja) * | 2000-03-13 | 2001-11-30 | Alpine Electronics Inc | ナビゲーション装置 |
| JP2010108264A (ja) * | 2008-10-30 | 2010-05-13 | Honda Motor Co Ltd | 車両周辺監視装置 |
-
2016
- 2016-08-25 WO PCT/JP2016/074806 patent/WO2017061183A1/fr not_active Ceased
- 2016-08-25 JP JP2017544407A patent/JP6589991B2/ja active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006335112A (ja) * | 2005-05-31 | 2006-12-14 | Nissan Motor Co Ltd | コマンド入力装置 |
| JP2008296792A (ja) * | 2007-05-31 | 2008-12-11 | Toyota Motor Corp | 車両用走行制御装置、及び車両 |
| JP2013079930A (ja) * | 2012-02-07 | 2013-05-02 | Pioneer Electronic Corp | ヘッドアップディスプレイ、制御方法、及び表示装置 |
| JP2014179097A (ja) * | 2013-03-13 | 2014-09-25 | Honda Motor Co Ltd | ポインティングによる情報クエリ |
| WO2014196038A1 (fr) * | 2013-06-05 | 2014-12-11 | 三菱電機株式会社 | Dispositif de traitement d'informations par détection de ligne de visibilité et procédé de traitement d'informations |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019131926A1 (fr) * | 2017-12-28 | 2019-07-04 | 株式会社デンソー | Dispositif de sélection de cible |
| JP2019194079A (ja) * | 2019-06-14 | 2019-11-07 | 株式会社Subaru | 車両の情報表示装置 |
| JP2021174054A (ja) * | 2020-04-20 | 2021-11-01 | 株式会社小松製作所 | 作業機械の障害物報知システムおよび作業機械の障害物報知方法 |
| JP7560955B2 (ja) | 2020-04-20 | 2024-10-03 | 株式会社小松製作所 | ショベル、作業機械の障害物報知システムおよびショベルの障害物報知方法 |
| US12359405B2 (en) | 2020-04-20 | 2025-07-15 | Komatsu Ltd. | Work machine obstacle notification system and work machine obstacle notification method |
| CN114360321A (zh) * | 2021-11-09 | 2022-04-15 | 易显智能科技有限责任公司 | 机动车驾驶员手部动作感知系统、培训系统和培训方法 |
| JP2023076045A (ja) * | 2021-11-22 | 2023-06-01 | トヨタ自動車株式会社 | 画像表示装置 |
| JP7655202B2 (ja) | 2021-11-22 | 2025-04-02 | トヨタ自動車株式会社 | 画像表示装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6589991B2 (ja) | 2019-10-16 |
| JPWO2017061183A1 (ja) | 2018-08-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11364930B2 (en) | Vehicle control system, vehicle control method and program | |
| US11203360B2 (en) | Vehicle control system, vehicle control method and program | |
| US10351060B2 (en) | Parking assistance apparatus and vehicle having the same | |
| JP7048398B2 (ja) | 車両制御装置、車両制御方法、およびプログラム | |
| US10311618B2 (en) | Virtual viewpoint position control device and virtual viewpoint position control method | |
| US20180345790A1 (en) | Vehicle control system, vehicle control method, and storage medium | |
| US20180345991A1 (en) | Vehicle control system, vehicle control method, and storage medium | |
| JP6827378B2 (ja) | 車両制御システム、車両制御方法、およびプログラム | |
| US11565713B2 (en) | Vehicle control system, vehicle control method, and vehicle control program | |
| US11287879B2 (en) | Display control device, display control method, and program for display based on travel conditions | |
| JP6307895B2 (ja) | 車両用周辺監視装置 | |
| JP6744064B2 (ja) | 自動車用画像表示システム | |
| JP6589991B2 (ja) | ヒューマンインターフェース | |
| WO2017094316A1 (fr) | Dispositif d'assistance à la conduite | |
| US10503167B2 (en) | Vehicle control system, vehicle control method, and storage medium | |
| US20180348757A1 (en) | Vehicle control system, vehicle control method, and storage medium | |
| JP2010173530A (ja) | 走行支援装置 | |
| JP2015077876A (ja) | ヘッドアップディスプレイ装置 | |
| US20190244515A1 (en) | Augmented reality dsrc data visualization | |
| JP2013196595A (ja) | 車両走行支援システム | |
| JP2018203014A (ja) | 撮像表示ユニット | |
| JPWO2020105685A1 (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
| US20220203888A1 (en) | Attention calling device, attention calling method, and computer-readable medium | |
| JP2020140603A (ja) | 表示制御装置、表示制御方法、および表示制御プログラム | |
| JP2018163501A (ja) | 情報表示装置、情報表示方法及びプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16853337 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2017544407 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 16853337 Country of ref document: EP Kind code of ref document: A1 |