US12407794B2 - Method and apparatus for SVM top view image processing - Google Patents
Method and apparatus for SVM top view image processing
- Publication number
- US12407794B2 (application US18/345,323)
- Authority
- US
- United States
- Prior art keywords
- top view
- view image
- region
- vehicle
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/002—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
- B60Q9/004—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
- B60Q9/005—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/002—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
- B60Q9/004—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
- B60Q9/006—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a distance sensor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/105—Speed
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/102—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/18—Steering angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2400/00—Special features of vehicle units
- B60Y2400/30—Sensors
- B60Y2400/303—Speed sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30264—Parking
Definitions
- the present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image.
- a surround view monitor (SVM) system is a parking assistance system which displays the surroundings of a vehicle with a top view image. Using this system, a driver can easily check parking lines or obstacles in blind zones through an indoor monitor.
- the SVM system typically uses four super-wide-angle cameras, each having a view angle of 180 degrees or greater.
- the SVM system performs a perspective transformation procedure to correct the images collected by the cameras so that the parking lines, which appear curved due to the super-wide-angle lenses, become straight lines. After the perspective transformation, the four images are combined into one to provide a top view image.
- top view images that may be selected by a driver are limited to (1) front and rear top view images and (2) omnidirectional standard, reduced, and enlarged top view images.
- the parking distance warning (PDW) system is a system which detects an object around the vehicle using ultrasonic waves and then warns the driver through an auditory or visual display.
- the present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image.
- Particular embodiments relate to a method and an apparatus for processing an SVM top view image which can provide top view images having different regions of interest to a driver in various parking situations in cooperation with a parking distance warning (PDW) system.
- embodiments of the present disclosure provide a method and an apparatus that provide top view images having different regions of interest to a driver according to the situation around the vehicle by operating an SVM system and a PDW system in cooperation with each other, so that information about the vehicle's surroundings can be provided to the driver intuitively and clearly, and the driver's parking convenience can be increased.
- an image processing apparatus includes an image collection unit configured to collect an image of the surroundings around a vehicle using at least one camera attached to the vehicle.
- An object recognition sensor unit is configured to collect obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle.
- a first display unit is configured to display a parking warning image based on the obstacle location information.
- a control unit is configured to generate a top view image based on the image of the surroundings, and to determine whether to display the parking warning image based on the obstacle location information.
- a second display unit is configured to display the top view image. When a determination is made to display the parking warning image, the control unit generates the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information and automatically displays the top view image on the second display unit.
- an image processing method including an image collection step of collecting an image of surroundings around a vehicle using at least one camera attached to the vehicle, a top view image display step of generating a top view image based on the image of the surroundings and displaying the top view image on a central display, an obstacle location information collection step of collecting obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle, a parking warning determination step of determining whether to display a parking warning image based on the obstacle location information and displaying the parking warning image on a cluster display, and a top view image generation step of generating the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information, when the parking warning image is displayed, and automatically displaying the top view image on the central display.
- FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating a camera and an ultrasonic sensor attached to a vehicle including the image processing apparatus according to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating a cluster display and a central display attached to the vehicle including the image processing apparatus according to an embodiment of the present disclosure.
- FIG. 4 A is a diagram illustrating a top view image displayed on a first region of a second display unit according to some embodiments of the present disclosure.
- FIG. 4 B is a diagram illustrating a top view image displayed on a second region of the second display unit according to some embodiments of the present disclosure.
- FIG. 5 is a diagram illustrating a warning image displayed on a first display unit according to some embodiments of the present disclosure.
- FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
- FIGS. 7 A to 7 E are conceptual diagrams illustrating a cooperative operation between an SVM system and a PDW system according to embodiments of the present disclosure.
- FIGS. 8 A and 8 B are conceptual diagrams illustrating a top view image setting process of a user in an image control device according to some embodiments of the present disclosure.
- FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
- the image processing apparatus may include an SVM system 100 , a control unit 200 , a PDW system 300 , a speed sensor unit 400 , a gear sensor unit 500 , and a steering-angle sensor unit 600 .
- the SVM system 100 may include an image collection unit 110 , a second display unit 120 , and a user input unit 130 .
- the image collection unit 110 may include cameras 110 a to 110 d .
- the cameras 110 a to 110 d may be located on the front, rear, and/or left and right sides of the vehicle.
- the cameras 110 a to 110 d may collect images of the surroundings including obstacles (e.g., surrounding vehicles, pedestrians, pillars, etc.) by photographing the front, rear, and/or left and right sides of the vehicle.
- the image collection unit 110 may provide the collected images of the surroundings to the control unit 200 .
- the cameras 110 a to 110 d may include an image sensor, such as a complementary metal-oxide semiconductor (CMOS) sensor, a charge-coupled device (CCD), or an active pixel sensor, and any one of a linear lens, a concave lens, a convex lens, a wide-angle lens, or a fish eye lens.
- the second display unit 120 may display a top view image.
- the top view image may include a driver's vehicle, a surrounding vehicle, a pedestrian, a pillar, etc.
- the second display unit 120 may be located on the dashboard of the vehicle, between a driver's seat and a passenger's seat.
- the second display unit 120 is not limited to the location shown in FIG. 3 .
- the second display unit 120 may be divided into a first region 120 a and a second region 120 b .
- the first region 120 a may be located on the left side of the second display unit 120
- the second region 120 b may be located on the right side of the second display unit 120 .
- a screen of the second display unit 120 is not limited to the structure shown in FIG. 3 .
- the second display unit 120 may display an image of the surroundings photographed through each of the cameras 110 a to 110 d on each of the screens 120 a and 120 b .
- the screen displayed on each of the first region 120 a and the second region 120 b will be described with reference to FIGS. 4 A and 4 B .
- FIG. 4 A is a diagram illustrating some embodiments of the top view image displayed on the first region 120 a of the second display unit 120 .
- image (a) is the first region 120 a showing the top view image of the front of the vehicle 800 .
- a region of interest includes the entire front region of the vehicle 800 .
- image (b) is the first region 120 a showing the top view image of the rear of the vehicle 800 .
- a region of interest includes the entire rear region of the vehicle 800 .
- image (c) of FIG. 4 A is the first region 120 a which displays the top view image of the changed region of interest.
- to display the changed region of interest, each of the cameras 110 a to 110 d readjusts a focal point and/or a principal point, and the control unit 200 changes the reference point of an image collected by the image collection unit 110 and then performs perspective transformation again.
- the image displayed on the first region 120 a is not limited to the embodiments shown in FIG. 4 A . Those skilled in the art will recognize from embodiments of the present disclosure that various top view images having different regions of interest may be displayed on the first region 120 a.
- FIG. 4 B is a diagram illustrating some embodiments of the top view image displayed on the second region 120 b of the second display unit 120 .
- image (a) is a second region 120 b displaying an omnidirectional standard top view image of the vehicle 800 .
- image (b) is a second region 120 b displaying an omnidirectional reduced top view image of the vehicle 800 .
- image (c) is a second region 120 b displaying an omnidirectional enlarged top view image of the vehicle 800 .
- the image displayed on the second region 120 b is not limited to the embodiments shown in FIG. 4 B .
- various omnidirectional top view images having different sizes may be displayed on the second region 120 b .
- various configurations of top view images may be displayed on the second display unit 120 .
- each of the images displayed on the first region 120 a and the second region 120 b may overlap or have different relative sizes or different positions.
- an additional or auxiliary screen may be provided in addition to the first region 120 a and the second region 120 b.
- the second display unit 120 may be configured as a physical device including any one of an LCD display, an OLED display, an LED display, a flat panel display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
- the user input unit 130 may apply power to the SVM system 100 or set the image of the first region 120 a or the second region 120 b by receiving a driver's input.
- the user input unit 130 may include a touch panel.
- the user input unit 130 may be coupled with the second display unit 120 to be provided as a touch screen.
- the user input unit 130 may include an integrated module in which a touch panel is coupled to the central display, i.e., the second display unit 120 , in a stacked structure.
- the user input unit 130 may sense a driver's touch input and may output a touch event value corresponding to the sensed touch signal.
- the touch panel may be implemented as various types of touch sensors such as a capacitive type, a resistive type, or a piezoelectric type.
- the PDW system 300 is a parking assistance system which assists a driver in parking by notifying the driver when there is a risk of collision between a surrounding object and the vehicle.
- the PDW system 300 may include an object recognition sensor unit 310 , a first display unit 320 , and a PDW power supply unit 330 .
- the object recognition sensor unit 310 may sense an object around the vehicle 800 and may provide information about the object to the control unit 200 .
- the object recognition sensor unit 310 may include ultrasonic sensors 310 a to 310 d .
- the ultrasonic sensors 310 a to 310 d may be located on the front, rear, and/or left and right sides of the vehicle 800 .
- the ultrasonic sensors 310 a to 310 d may emit ultrasonic waves to the front, rear, and/or left and right sides of the vehicle 800 and may receive ultrasonic waves reflected from an obstacle (e.g., a surrounding vehicle, a pedestrian, a pillar, etc.).
- the object recognition sensor unit 310 may provide reflected ultrasonic-wave information to the control unit 200 .
- the control unit 200 may calculate information about the location, speed, and/or angle of the obstacle based on the reflected ultrasonic-wave information.
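- A minimal sketch, not taken from the patent, of how reflected ultrasonic-wave timing could be converted into a distance and a rough bearing; the sensor spacing, function names, and the two-sensor comparison are illustrative assumptions.

```python
# Minimal sketch (not from the patent): converting ultrasonic echo timing into
# an obstacle distance and a rough bearing. Sensor spacing and names are
# illustrative assumptions.
import math

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 C

def echo_to_distance(round_trip_s: float) -> float:
    """Distance to the reflecting obstacle: the echo travels out and back."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def estimate_bearing(d_left: float, d_right: float, sensor_spacing: float) -> float:
    """Crude bearing estimate (radians) from two adjacent sensors by comparing
    their measured distances; positive means the obstacle is closer to the
    right sensor."""
    # Clamp to avoid domain errors when the difference exceeds the spacing.
    ratio = max(-1.0, min(1.0, (d_left - d_right) / sensor_spacing))
    return math.asin(ratio)

# Example: a 6 ms round trip corresponds to an obstacle roughly 1.03 m away.
print(echo_to_distance(0.006))            # ~1.03
print(estimate_bearing(1.10, 1.03, 0.4))  # small positive angle (to the right)
```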
- the first display unit 320 may display one or more pieces of information including a direction in which an object is located, a distance between the vehicle 800 and the object, and a collision risk.
- the first display unit 320 is disposed to face the driver's seat.
- the first display unit 320 is not limited to the location shown in FIG. 3 .
- the first display unit 320 may be coupled to the second display unit 120 .
- the first display unit 320 may display a warning according to the warning level of the PDW system 300 .
- the warning level of the PDW system 300 may be divided into four levels.
- FIG. 5 is a diagram illustrating an image displayed on the first display unit 320 for each warning level of the PDW system 300 according to an embodiment of the present disclosure.
- the warning level may be divided into a non-warning level, a first level, a second level, and a third level.
- the first display unit 320 may notify the driver of each warning level by expressing a different image color for each warning level. By brightly displaying an area where an obstacle is located after dividing an area around the vehicle 800 in the image into a plurality of areas, the first display unit 320 may notify the driver of the location of the obstacle.
- image (a) is an image displayed on the first display unit 320 in the non-warning level of the PDW system 300 .
- in this case, the area around the vehicle 800 is displayed darkly or lightly on the first display unit 320 , without the surrounding oval shown in images (b), (c), and (d).
- image (b) is an image displayed on the first display unit 320 in the first level warning of the PDW system 300 .
- the first display unit 320 may display the entire area around the vehicle 800 in green or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in green.
- image (c) is an image displayed on the first display unit 320 in the second level warning of the PDW system 300 .
- the first display unit 320 may display the entire area around the vehicle 800 in yellow or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in yellow.
- image (d) is an image displayed on the first display unit 320 in the third level warning of the PDW system 300 .
- the first display unit 320 may display the entire area around the vehicle 800 in red or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in red.
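- A minimal sketch of the mapping described above from warning level to what the cluster display draws; the color names come from the description, while the sector names and data structure are illustrative assumptions.

```python
# Minimal sketch: mapping the PDW warning level to the color used on the
# cluster display, and highlighting only the sector of the vehicle
# surroundings where the obstacle was detected. Sector names are assumptions.
LEVEL_COLORS = {
    0: None,       # non-warning level: no surrounding oval is drawn
    1: "green",    # first level
    2: "yellow",   # second level
    3: "red",      # third level
}

SECTORS = ["front", "front_right", "right", "rear_right",
           "rear", "rear_left", "left", "front_left"]

def cluster_overlay(level: int, obstacle_sector: str) -> dict:
    """Return a simple description of what the cluster display should draw."""
    color = LEVEL_COLORS.get(level)
    if color is None:
        return {sector: None for sector in SECTORS}
    # Only the sector containing the obstacle is lit in the warning color.
    return {sector: (color if sector == obstacle_sector else None)
            for sector in SECTORS}

print(cluster_overlay(2, "front_right"))  # 'front_right' lit in yellow
```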
- the image displayed on the first display unit 320 is not limited to the embodiments shown in FIG. 5 .
- the top view image may be displayed on the first display unit 320 .
- the first display unit 320 may display an image in which the top view image and the warning image are overlapped.
- various images having different colors and surrounding areas may be displayed on the first display unit 320 .
- an additional or auxiliary screen may be provided in addition to the first display unit 320 .
- the first display unit 320 may be a cluster display and may be configured as a physical device including any one of an LCD display, a PDP display, an OLED display, an FED display, an LED display, a flat panel display, a 3D display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
- the PDW power supply unit 330 may drive the PDW system 300 by receiving a driver's input.
- the PDW power supply unit 330 may include a power button.
- the power button may be a touch type and may be combined with the user input unit 130 to be implemented as a touch screen.
- the speed sensor unit 400 may sense the driving speed of the vehicle and may transmit the driving speed information to the control unit 200 .
- the gear sensor unit 500 may sense the operation of a transmission gear lever by a driver's operation and may transmit information about the operation of the transmission gear lever to the control unit 200 .
- the steering-angle sensor unit 600 may sense the steering angle of the vehicle as a steering wheel is operated and may transmit information about the steering angle to the control unit 200 .
- the control unit 200 may include at least one core which may execute at least one command.
- the control unit 200 may execute commands stored in a memory.
- the control unit 200 may be a single processor or a plurality of processors.
- the control unit 200 may include at least one of an advanced driver assistance system (ADAS), a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), but embodiments of the present disclosure are not limited thereto.
- the control unit 200 may be implemented with software and hardware including the SVM system 100 .
- the control unit 200 may transform the collected images into a top view image, i.e., a view from above the vehicle. Meanwhile, since a specific method in which distorted images of the surroundings of the vehicle are transformed using a perspective transformation matrix and are combined into one top view image is known in the image processing field, a detailed description thereof will be omitted.
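- The omitted step is commonly implemented with per-camera homographies; the sketch below assumes OpenCV and known ground-plane calibration points for each camera (both are assumptions for illustration, not requirements of the disclosure). Each camera image is warped onto a shared top-view canvas and the warped images are blended into one composite.

```python
# Minimal sketch of one common way to build a top view image (assumes OpenCV;
# calibration points, canvas size, and blending are illustrative assumptions).
import cv2
import numpy as np

TOP_W, TOP_H = 600, 800  # output top-view canvas size in pixels (assumed)

def warp_to_top_view(image, img_pts, ground_pts):
    """img_pts: 4 pixel coordinates of ground markers in the camera image.
    ground_pts: the same 4 markers in top-view canvas coordinates."""
    H = cv2.getPerspectiveTransform(np.float32(img_pts), np.float32(ground_pts))
    return cv2.warpPerspective(image, H, (TOP_W, TOP_H))

def combine_top_views(warped_images):
    """Naive blend: average the non-black contributions of all cameras."""
    stack = np.stack([w.astype(np.float32) for w in warped_images])
    mask = (stack.sum(axis=-1, keepdims=True) > 0).astype(np.float32)
    counts = np.maximum(mask.sum(axis=0), 1.0)
    return (stack * mask).sum(axis=0) / counts  # float image, H x W x 3

if __name__ == "__main__":
    cam = np.zeros((480, 640, 3), np.uint8)                        # placeholder camera frame
    img_pts = [(100, 400), (540, 400), (420, 250), (220, 250)]     # marker pixels (assumed)
    ground_pts = [(150, 700), (450, 700), (450, 500), (150, 500)]  # canvas positions (assumed)
    top = warp_to_top_view(cam, img_pts, ground_pts)
    composite = combine_top_views([top, top, top, top])            # one warp per camera in practice
```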
- the control unit 200 may change the top view image displayed on the second display unit 120 based on the driver's input. For example, when a user scrolls the touch panel of the user input unit 130 , a top view image in which the region of interest is changed according to the scroll input may be generated. Further, when the user touches the touch panel of the user input unit 130 , a top view image in which the region of interest is changed according to the touch input may be generated. Meanwhile, since a specific method of performing the perspective transformation by changing a reference point according to the region of interest in the distorted image of the surroundings of the vehicle is known in the image processing field, a detailed description thereof will be omitted.
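- A minimal sketch of how a scroll gesture could be turned into a shifted region of interest before the transformation is re-run; the coordinate convention, gain, and function names are illustrative assumptions.

```python
# Minimal sketch (assumed naming): a scroll vector moves the ROI center on the
# top-view canvas, and the resulting crop is what the control unit would
# regenerate at full resolution.
def shift_roi(roi, scroll_dx, scroll_dy, gain=1.0):
    """roi = (cx, cy, w, h) in top-view canvas coordinates."""
    cx, cy, w, h = roi
    return (cx + gain * scroll_dx, cy + gain * scroll_dy, w, h)

def roi_to_crop(roi):
    cx, cy, w, h = roi
    return (int(cx - w / 2), int(cy - h / 2), int(w), int(h))  # x, y, w, h

default_roi = (300, 400, 300, 400)           # centered on the canvas (assumed)
new_roi = shift_roi(default_roi, scroll_dx=80, scroll_dy=-40)
print(roi_to_crop(new_roi))                  # crop rectangle for re-warping
```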
- the control unit 200 may be implemented with software and hardware including the PDW system 300 .
- the control unit 200 may recognize a driver's input for activating (turning on) the PDW system 300 .
- the control unit 200 may determine the parking warning level for the vehicle to be parked.
- the warning level may be divided into four levels (the non-warning level, the first level, the second level, and the third level).
- the PDW system 300 may recognize the location and distance of the obstacle (e.g., another vehicle, a pillar, a pedestrian, etc.) located around the vehicle 800 through the ultrasonic sensors 310 a to 310 d .
- the control unit 200 may determine the parking warning level based on information about the location and/or distance of the obstacle and the vehicle 800 .
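- A minimal sketch of mapping the closest measured obstacle distance to a warning level; the disclosure defines four levels but gives no numeric distances here, so the thresholds below are illustrative assumptions.

```python
# Minimal sketch (thresholds are illustrative assumptions): mapping the closest
# measured obstacle distance to one of the four PDW warning levels.
def warning_level(distance_m: float) -> int:
    """0 = non-warning, 1..3 = first to third level (3 is the most urgent)."""
    if distance_m <= 0.3:
        return 3
    if distance_m <= 0.6:
        return 2
    if distance_m <= 1.0:
        return 1
    return 0

print([warning_level(d) for d in (1.5, 0.9, 0.5, 0.2)])  # [0, 1, 2, 3]
```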
- the control unit 200 may transmit the warning level to the SVM system 100 when the warning level is determined.
- the SVM system 100 may display the top view image on the first region 120 a and the second region 120 b based on the warning level and obstacle information transmitted from the control unit 200 .
- in the non-warning level, a front top view or a rear top view is displayed on the first region 120 a
- a reduced top view is displayed on the second region 120 b . That is, the region of interest (ROI) is set omni-directionally and a top view range is set to the reduced range.
- in the first warning level, an image in which the obstacle is enlarged is displayed on the first region 120 a to draw a driver's attention and allow the driver to easily identify a surrounding obstacle, and a reduced top view is displayed on the second region 120 b . That is, the region of interest is set to the region around the vehicle in which the obstacle is located, and the top view range is set to the reduced range.
- the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a
- a standard top view or an enlarged top view may be displayed on the second region 120 b.
- in the second warning level, a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the first warning level, and a standard top view is displayed on the second region 120 b .
- a region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a
- a reduced top view or an enlarged top view may be displayed on the second region 120 b.
- in the third warning level, a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the second warning level, and an enlarged top view is displayed on the second region 120 b .
- embodiments of the present disclosure are not limited thereto, as the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a , and a reduced top view or a standard top view may be displayed on the second region 120 b.
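- The per-level defaults described above can be summarized as a small lookup, sketched below; the key and string names are illustrative assumptions, and as noted, the driver may override these settings.

```python
# Minimal sketch (assumed names, defaults restated from the description above):
# "roi" selects what the first region 120a focuses on and "zoom" selects the
# omnidirectional top-view range shown in the second region 120b.
SVM_CONFIG_BY_LEVEL = {
    0: {"roi": "front_or_rear",    "zoom": "reduced"},
    1: {"roi": "obstacle",         "zoom": "reduced"},
    2: {"roi": "obstacle_closer",  "zoom": "standard"},
    3: {"roi": "obstacle_closest", "zoom": "enlarged"},
}

def configure_display(level: int) -> dict:
    return SVM_CONFIG_BY_LEVEL.get(level, SVM_CONFIG_BY_LEVEL[0])

print(configure_display(3))  # {'roi': 'obstacle_closest', 'zoom': 'enlarged'}
```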
- the control unit 200 may collect information on the steering angle of the vehicle 800 from the steering-angle sensor unit 600 and may calculate the reverse path of the vehicle based on the collected information on the steering angle and the top view image.
- the control unit 200 may calculate the collision possibility of the vehicle 800 and the obstacle based on the reverse path, the top view image, and the obstacle information.
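- A minimal sketch of one way the reverse path and the collision possibility could be computed, assuming a simple kinematic bicycle model, an assumed wheelbase, and an assumed safety margin; none of these specifics come from the disclosure.

```python
# Minimal sketch (kinematic bicycle-model assumption; the patent does not
# specify the path model): predicting the reverse path from the steering angle
# and checking whether any predicted point comes within a margin of an obstacle.
import math

WHEELBASE_M = 2.7  # assumed vehicle wheelbase

def reverse_path(steering_angle_rad, step_m=0.1, length_m=5.0):
    """Sample points (x, y) along the predicted reverse arc, starting at the
    rear axle center with the vehicle pointing along +y and reversing."""
    points, x, y, heading = [], 0.0, 0.0, math.pi / 2
    curvature = math.tan(steering_angle_rad) / WHEELBASE_M
    for _ in range(int(length_m / step_m)):
        x -= step_m * math.cos(heading)   # reversing: move against the heading
        y -= step_m * math.sin(heading)
        heading -= step_m * curvature     # simplified heading update
        points.append((x, y))
    return points

def collision_possible(path, obstacle_xy, margin_m=1.0):
    ox, oy = obstacle_xy
    return any(math.hypot(px - ox, py - oy) < margin_m for px, py in path)

# An obstacle about 3 m behind the rear axle, offset to one side.
path = reverse_path(math.radians(20))
print(collision_possible(path, obstacle_xy=(-1.0, -3.0)))  # True for this steering angle
```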
- the control unit 200 may provide the reduced and/or enlarged top view image to a remote smart parking assist (RSPA) system with reference to the location and angle of a parking line, so as to easily secure the minimum distance between the vehicle 800 and the parking line required for activating the RSPA function.
- Not all components shown in FIG. 1 are essential components of embodiments of the present disclosure, and some components included in the image processing apparatus may be added, changed, or deleted in other embodiments. Meanwhile, the components shown in FIG. 1 represent functional elements which are functionally different from each other, and a plurality of components may be implemented in a form integrated with each other in an actual physical environment. In addition, the function of one component may be distributed across a plurality of physical devices in a computer system.
- FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
- in step S 610 , the control unit 200 determines whether the driver has applied power to the PDW system 300 through the PDW power supply unit 330 .
- in step S 670 , the second display unit 120 displays a front top view having the front as the region of interest in the first region 120 a by default and displays a standard top view in the second region 120 b .
- the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
- in step S 620 , the control unit 200 determines whether the speed of the vehicle 800 is less than a preset speed (e.g., 10 kph) based on information on the speed of the vehicle 800 obtained from the speed sensor unit 400 .
- in step S 640 , the control unit 200 determines whether the gear of the vehicle 800 is set to a D-gear (drive) or an R-gear (reverse) based on the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500 .
- in step S 660 , the second display unit 120 displays a front top view having the front area as the region of interest in the first region 120 a by default and displays a reduced top view in the second region 120 b .
- the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
- when the gear of the vehicle 800 is not the D-gear or the R-gear, the process moves from step S 640 to step S 670 .
- the process may move to step S 660 or step S 670 regardless of the gear information of the vehicle 800 .
- the process may move to step S 670 .
- in step S 630 , the control unit 200 determines whether the gear of the vehicle 800 is set to a P-gear (park) or an N-gear (neutral) through the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500 .
- in step S 650 , the SVM system 100 and the PDW system 300 are cooperatively operated. Such a cooperative operation may be performed through the control unit 200 .
- a detailed process in which the SVM system 100 and the PDW system 300 are cooperatively operated will be described below with reference to FIGS. 7 A to 7 E .
- when the gear of the vehicle 800 is not the P-gear or the N-gear, the process moves from step S 630 to step S 670 .
- the process may move to step S 650 or step S 670 regardless of the gear information of the vehicle 800 .
- the process may move to step S 670 .
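- The decisions of FIG. 6 can be sketched as below. This is a non-authoritative reading: the step numbers, the 10 kph threshold, and each step's tests come from the description above, but the figure itself is not reproduced here, so exactly which gear check follows each outcome of the speed check in step S 620 is an assumption.

```python
# Sketch of the FIG. 6 decision flow. Step numbers and the 10 kph threshold are
# from the description; the branch taken after the speed check (S620) is an
# assumption, since the figure is not reproduced here.
def select_mode(pdw_power_on: bool, speed_kph: float, gear: str) -> str:
    if not pdw_power_on:                      # S610: PDW not powered
        return "S670: front top view + standard omnidirectional top view"
    if speed_kph < 10:                        # S620: below the preset speed
        if gear in ("P", "N"):                # S630
            return "S650: SVM/PDW cooperative operation"
        return "S670: front top view + standard omnidirectional top view"
    if gear in ("D", "R"):                    # S640
        return "S660: front top view + reduced omnidirectional top view"
    return "S670: front top view + standard omnidirectional top view"

print(select_mode(True, 3, "P"))   # cooperative operation
print(select_mode(True, 20, "D"))  # reduced omnidirectional top view
```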
- FIGS. 7 A to 7 E are conceptual diagrams illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to embodiments of the present disclosure.
- FIG. 7 A is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t 1 , t 2 , and t 3 in the first level warning situation in the image control device according to an embodiment of the present disclosure.
- reference numerals will be omitted for the convenience of understanding.
- the parking warning is not displayed on the first display unit 320 , the front top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
- a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
- the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
- the obstacle 700 is located on the right side of the front of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset first level warning distance range.
- the object recognition sensor unit 310 transmits information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d ) receiving the ultrasonic signal to the control unit 200 .
- the control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on information received from the object recognition sensor unit 310 .
- the control unit 200 determines the warning level as the first level based on the calculated result.
- the control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320 .
- the first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200 .
- the control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110 , the warning level obtained from the PDW system 300 , and information on the obstacle 700 .
- the new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700 .
- the control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700 .
- the control unit 200 transmits the generated image to the second display unit 120 .
- the top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional reduced top view image is displayed on the second region 120 b so that the driver can easily identify the situation around the vehicle 800 and the obstacle 700 .
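- A minimal sketch of how a region of interest that contains the obstacle and is narrower than the omnidirectional view might be chosen; the vehicle-centered coordinates, padding, and function names are illustrative assumptions.

```python
# Minimal sketch (assumed coordinates): choosing a new region of interest that
# covers the detected obstacle plus the nearest vehicle corner, so the first
# region 120a can show the obstacle enlarged.
def roi_around_obstacle(obstacle_xy, vehicle_box, pad_m=1.0):
    """Return an axis-aligned ROI (x_min, y_min, x_max, y_max) in vehicle-
    centered meters covering the obstacle, the nearest vehicle corner, and a
    padding margin."""
    ox, oy = obstacle_xy
    vx_min, vy_min, vx_max, vy_max = vehicle_box
    # Keeping the nearest vehicle corner in view shows part of the ego vehicle.
    cx = vx_min if abs(ox - vx_min) < abs(ox - vx_max) else vx_max
    cy = vy_min if abs(oy - vy_min) < abs(oy - vy_max) else vy_max
    return (min(ox, cx) - pad_m, min(oy, cy) - pad_m,
            max(ox, cx) + pad_m, max(oy, cy) + pad_m)

# Obstacle ahead and to the right; vehicle footprint about 1.9 m x 4.5 m.
print(roi_around_obstacle((1.0, 3.7), (-0.95, -2.25, 0.95, 2.25)))
```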
- a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
- FIG. 7 B is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t 1 , t 2 , and t 3 in the third level warning situation in the image control device according to an embodiment of the present disclosure.
- reference numerals will be omitted for the convenience of understanding.
- the parking warning is not displayed on the first display unit 320 , the front top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
- a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
- the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
- the obstacle 700 is located on the right side of the front of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range.
- the object recognition sensor unit 310 transmits information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d ) receiving the ultrasonic signal to the control unit 200 .
- the control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on information received from the object recognition sensor unit 310 .
- the control unit 200 determines the warning level as the third level based on the calculated result.
- the control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320 .
- the first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200 .
- the control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110 , the warning level obtained from the PDW system 300 , and information on the obstacle 700 .
- the new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700 .
- the control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700 .
- the control unit 200 transmits the generated image to the second display unit 120 .
- the top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional enlarged top view image is displayed on the second region 120 b so that the driver can easily identify the locations of the vehicle 800 and the obstacle 700 .
- a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
- FIG. 7 C is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the steering angle of the vehicle 800 according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
- reference numerals will be omitted for the convenience of understanding.
- FIG. 7 C shows a situation where the vehicle 800 is reversing.
- a reverse path 810 according to the steering angle of the vehicle 800 is shown in the first region 120 a .
- the reverse path 810 may be displayed on the first region 120 a and/or the second region 120 b so that the driver can easily identify the reverse path 810 .
- alternatively, the reverse path 810 may not be displayed and may exist only conceptually for the cooperative operation of the SVM system 100 and the PDW system 300 .
- the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
- the warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
- the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
- the obstacle 700 is located on the right side of the rear of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
- the control unit 200 determines the new region of interest 900 based on the steering angle received from the steering-angle sensor unit 600 , the image of the surroundings received from the image collection unit 110 , the warning level received from the PDW system 300 , and information on the obstacle 700 .
- the control unit 200 first generates the top view image by transforming and combining the collected images.
- the control unit 200 calculates the reverse path 810 of the vehicle 800 based on the generated top view image and steering-angle information.
- the control unit 200 determines the new region of interest 900 including the reverse path 810 and the obstacle 700 based on the reverse path 810 and the obstacle 700 information.
- the control unit 200 generates a new top view image focused on the reverse path 810 and the obstacle 700 by retransforming and recombining collected images according to the new region of interest 900 .
- the control unit 200 transmits the generated image to the second display unit 120 .
- the top view image in which the reverse path 810 and the obstacle 700 are focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional enlarged top view image is displayed on the second region 120 b to allow the driver to move to another space for avoiding the obstacle 700 .
- a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
- FIG. 7 D is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the locations of the obstacles 700 a and 700 b according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
- reference numerals will be omitted for the convenience of understanding.
- the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
- warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
- all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
- the right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800 , and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800 .
- the control unit 200 may set the region of interest by assigning weights based on angle when a plurality of obstacles have the same warning level.
- the control unit 200 may assign a high weight to an obstacle located in a diagonal direction, which has a relatively high probability of collision. In this case, the region of interest 900 a focused on the right rear obstacle 700 a is determined.
- the top view image according to the third level warning and the new region of interest 900 a in which a weight is assigned to the angle of the obstacle is displayed in the first region 120 a and the second region 120 b.
- an angle between the vehicle 800 and the obstacle to which the weight is assigned may be determined using a region where the space maps assigned to each of the cameras 110 a to 110 d overlap. In another embodiment, the angle between the vehicle 800 and the obstacle to which the weight is assigned may be determined as an angle at which a collision is most likely to occur according to steering angle information received from the steering-angle sensor unit 600 .
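- A minimal sketch of the weighted selection among obstacles with the same warning level; the scoring function and weight values are illustrative assumptions. FIG. 7 D favors the obstacle at a diagonal angle, while FIG. 7 E , described below, favors the closer obstacle, so both weights are exposed here.

```python
# Minimal sketch (weights are illustrative assumptions): choosing which of
# several same-level obstacles the new region of interest should focus on.
import math

def focus_score(distance_m, bearing_rad, w_angle=1.0, w_distance=0.0):
    # Diagonal obstacles (bearing near 45 degrees from the vehicle axis) get a
    # higher angle score; closer obstacles get a higher distance score.
    angle_score = 1.0 - abs(abs(bearing_rad) - math.pi / 4) / (math.pi / 4)
    distance_score = 1.0 / max(distance_m, 0.1)
    return w_angle * angle_score + w_distance * distance_score

obstacles = {"right_rear": (1.2, math.radians(45)), "rear": (0.8, math.radians(0))}

# Angle-weighted selection (FIG. 7D): the diagonal right-rear obstacle wins.
print(max(obstacles, key=lambda k: focus_score(*obstacles[k], w_angle=1.0)))
# Distance-weighted selection (FIG. 7E): the closer rear obstacle wins.
print(max(obstacles, key=lambda k: focus_score(*obstacles[k], w_angle=0.0, w_distance=1.0)))
```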
- FIG. 7 E is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the locations of the obstacles 700 a and 700 b according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
- reference numerals will be omitted for the convenience of understanding.
- the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
- warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
- all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
- the right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800 , and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800 .
- When a plurality of obstacles have the same warning level, the control unit 200 may set the region of interest by assigning a weight to the distance of each obstacle.
- For example, the control unit 200 may assign a high weight to an obstacle located at a short distance, where the probability of collision is relatively high. In this case, the region of interest 900b, focused on the rear obstacle 700b, is determined.
- At t3, the top view image according to the third-level warning and the new region of interest 900b, in which a weight is assigned to the distance of the obstacle, is displayed in the first region 120a and the second region 120b.
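- Again purely as an illustrative sketch (not the disclosed implementation), the snippet below weights same-level obstacles by distance so that the closest obstacle drives the region of interest. The inverse-distance weight, the example distances, and names such as `pick_roi_target_by_distance` are assumptions.

```python
def distance_weight(distance_m: float, eps: float = 1e-6) -> float:
    """Simple inverse-distance weight: closer obstacles score higher."""
    return 1.0 / max(distance_m, eps)

def pick_roi_target_by_distance(obstacles):
    """obstacles: list of dicts with 'name' and 'distance_m'."""
    return max(obstacles, key=lambda o: distance_weight(o["distance_m"]))

obstacles = [
    {"name": "right_rear_700a", "distance_m": 2.6},  # farther from the vehicle
    {"name": "rear_700b", "distance_m": 2.3},        # closer to the rear bumper
]
print(pick_roi_target_by_distance(obstacles)["name"])  # rear_700b
```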
- FIG. 8A and FIG. 8B are conceptual diagrams illustrating an image setting process as the number of pixels of the camera increases and the region of interest varies, in the image control device according to an embodiment of the present disclosure.
- FIG. 8A and FIG. 8B illustrate a process in which the region of interest is changed as a driver scrolls or touches the touch panel, when the user input unit 130 includes the touch panel, in the image control device according to an embodiment of the present disclosure.
- When the touch panel is coupled to the second display unit 120 in a stacked structure, the touch panel may correspond to the first region 120a and/or the second region 120b.
- The driver may scroll the second region 120b, on which the top view image of the default region of interest 815 is displayed. When the driver scrolls at any location on the second region 120b, the user input unit 130 may recognize the length and direction of the scroll.
- The control unit 200 receives scroll information from the user input unit 130. Based on the scroll information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120b.
- Based on the pixel information of the cameras 110a to 110d, the control unit 200 may generate the top view image of the region of interest set by the user only to the extent that distortion of the top view image does not occur.
- The control unit 200 may restore the region of interest changed by the driver to the default. When the region of interest is restored to the default, the second region 120b returns to the initial screen shown in FIG. 8A.
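- A minimal Python sketch of the scroll interaction, assuming a stitched top-view canvas of fixed size that stands in for the cameras' available pixels; the canvas size, the default region of interest, and names such as `apply_scroll` are hypothetical and not taken from the disclosure. Clamping the crop to the canvas plays the role of limiting the user-set region to the extent that no distortion occurs.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ROI:
    x: int       # top-left corner in top-view pixels
    y: int
    width: int
    height: int

CANVAS_W, CANVAS_H = 1280, 1280                         # hypothetical stitched top-view size
DEFAULT_ROI = ROI(x=440, y=440, width=400, height=400)  # hypothetical default region of interest

def apply_scroll(roi: ROI, dx_px: int, dy_px: int) -> ROI:
    """Shift the ROI by the scroll vector and clamp it inside the canvas."""
    new_x = min(max(roi.x + dx_px, 0), CANVAS_W - roi.width)
    new_y = min(max(roi.y + dy_px, 0), CANVAS_H - roi.height)
    return replace(roi, x=new_x, y=new_y)

def restore_default() -> ROI:
    """Return to the default region of interest, e.g. after a reset."""
    return DEFAULT_ROI

roi = apply_scroll(DEFAULT_ROI, dx_px=-600, dy_px=0)  # scroll toward the left edge
print(roi)                                            # ROI(x=0, y=440, width=400, height=400) -- clamped
print(restore_default() == DEFAULT_ROI)               # True
```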
- In FIG. 8B, the driver may touch the second region 120b, on which the top view image of the default region of interest 815 is displayed.
- The user input unit 130 may recognize the driver's touch location.
- The control unit 200 receives touch information from the user input unit 130. Based on the touch information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120b.
- The driver may set a region of interest 830, which is further enlarged toward the front of the vehicle 800, by touching the new region of interest 820.
- Based on the pixel information of the cameras 110a to 110d, the control unit 200 may generate the top view image of the region of interest set by the user only to the extent that distortion of the top view image does not occur.
- The control unit 200 may restore the region of interest changed by the driver to the default. When the region of interest is restored to the default, the second region 120b returns to the initial screen shown in FIG. 8B.
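- In the same spirit, the sketch below recentres (and optionally enlarges) a rectangular region of interest on a touched point, clamped to the same kind of fixed-size canvas; the sizes, the growth step, and names such as `apply_touch` are assumptions, not the patented behaviour.

```python
CANVAS_W, CANVAS_H = 1280, 1280              # hypothetical stitched top-view size
DEFAULT_ROI = (440, 440, 400, 400)           # (x, y, width, height) in top-view pixels

def apply_touch(roi, touch_x, touch_y, grow_px=0):
    """Centre the ROI on the touch point, optionally growing it by grow_px per side,
    and clamp the result inside the canvas."""
    _, _, w, h = roi
    w = min(w + 2 * grow_px, CANVAS_W)
    h = min(h + 2 * grow_px, CANVAS_H)
    x = min(max(touch_x - w // 2, 0), CANVAS_W - w)
    y = min(max(touch_y - h // 2, 0), CANVAS_H - h)
    return (x, y, w, h)

# A first touch recentres the default view (cf. 815 -> 820); touching again farther toward
# the front of the vehicle enlarges the region in that direction (cf. 820 -> 830).
roi_820 = apply_touch(DEFAULT_ROI, touch_x=640, touch_y=300)
roi_830 = apply_touch(roi_820, touch_x=640, touch_y=150, grow_px=100)
print(roi_820)   # (440, 100, 400, 400)
print(roi_830)   # (340, 0, 600, 600)
```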
- Each component of the apparatus or method according to embodiments of the present disclosure may be implemented as hardware, software, or a combination of hardware and software. Further, the function of each component may be implemented in software, and a microprocessor may be configured to execute the function of the software corresponding to each component.
- Various implementations of systems and techniques described herein may be realized as digital electronic circuits, integrated circuits, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof.
- These various implementations may include one or more computer programs executable on a programmable system.
- The programmable system includes at least one programmable processor (which may be a special-purpose or general-purpose processor) coupled to receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- The computer programs (also known as programs, software, software applications, or code) contain instructions for a programmable processor and may be stored in a computer-readable recording medium.
- The computer-readable recording medium includes all types of recording devices in which data readable by a computer system is stored.
- The computer-readable recording medium may be a non-volatile or non-transitory medium, such as a ROM, a CD-ROM, a magnetic tape, a floppy disk, a memory card, a hard disk, a magneto-optical disk, or a storage device, and may further include a transitory medium such as a data transmission medium.
- The computer-readable recording medium may also be distributed over computer systems connected via a network, so that computer-readable code can be stored and executed in a distributed manner.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mathematical Physics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Closed-Circuit Television Systems (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (20)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020230003098A KR20240111376A (en) | 2023-01-09 | 2023-01-09 | Method And Apparatus for SVM Top View Image Processing |
| KR10-2023-0003098 | 2023-01-09 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20240236276A1 (en) | 2024-07-11 |
| US12407794B2 (en) | 2025-09-02 |
Family
ID=91732087
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/345,323 Active 2043-07-27 US12407794B2 (en) | 2023-01-09 | 2023-06-30 | Method and apparatus for SVM top view image processing |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US12407794B2 (en) |
| KR (1) | KR20240111376A (en) |
| CN (1) | CN118306314A (en) |
- 2023
- 2023-01-09 KR KR1020230003098A patent/KR20240111376A/en active Pending
- 2023-06-30 US US18/345,323 patent/US12407794B2/en active Active
- 2023-07-28 CN CN202310941520.1A patent/CN118306314A/en active Pending
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170028917A1 (en) * | 2014-01-27 | 2017-02-02 | Denso Corporation | Driving assistance device and driving assistance method |
| US20180137760A1 (en) * | 2015-05-11 | 2018-05-17 | Panasonic Intellectual Property Management Co. Ltd. | Monitoring-target-region setting device and monitoring-target-region setting method |
| US20180265004A1 (en) * | 2016-03-07 | 2018-09-20 | Mazda Motor Corporation | Vehicle periphery image display device |
| US20190215465A1 (en) * | 2017-02-28 | 2019-07-11 | JVC Kenwood Corporation | Bird's-eye view image generating device, bird's-eye view image generating system, bird's-eye view image generating method, and medium |
| US20180330175A1 (en) * | 2017-05-10 | 2018-11-15 | Fotonation Limited | Multi-camera vision system and method of monitoring |
| US20200242374A1 (en) * | 2018-04-02 | 2020-07-30 | Jvckenwood Corporation | Vehicle display control device, vehicle display system, vehicle display control method, non-transitory storage medium |
| US20200039506A1 (en) * | 2018-08-02 | 2020-02-06 | Faraday&Future Inc. | System and method for providing visual assistance during an autonomous driving maneuver |
| US20200081607A1 (en) * | 2018-09-07 | 2020-03-12 | Aisin Seiki Kabushiki Kaisha | Display control device |
| US20200137322A1 (en) * | 2018-10-26 | 2020-04-30 | Denso Corporation | Image processing apparatus |
| US20200398824A1 (en) * | 2019-06-24 | 2020-12-24 | Honda Motor Co., Ltd. | Parking assist system |
| US20210107511A1 (en) * | 2019-10-11 | 2021-04-15 | Toyota Jidosha Kabushiki Kaisha | Parking assist apparatus |
| US20230012629A1 (en) * | 2020-03-26 | 2023-01-19 | Samsung Electronics Co., Ltd. | Electronic device for displaying image by using camera monitoring system (cms) side display mounted in vehicle, and operation method thereof |
| US20220408062A1 (en) * | 2020-03-27 | 2022-12-22 | Jvckenwood Corporation | Display control apparatus, display control method, and program |
| US20230302999A1 (en) * | 2022-03-23 | 2023-09-28 | Isuzu Motors Limited | Vehicle rearward monitoring system and vehicle rearward monitoring method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN118306314A (en) | 2024-07-09 |
| KR20240111376A (en) | 2024-07-17 |
| US20240236276A1 (en) | 2024-07-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12220825B2 (en) | Display apparatus | |
| US10899277B2 (en) | Vehicular vision system with reduced distortion display | |
| US11858424B2 (en) | Electronic device for displaying image by using camera monitoring system (CMS) side display mounted in vehicle, and operation method thereof | |
| TWI478833B (en) | Method of adjusting the vehicle image device and system thereof | |
| US20150109444A1 (en) | Vision-based object sensing and highlighting in vehicle image display systems | |
| US20140114534A1 (en) | Dynamic rearview mirror display features | |
| US10495458B2 (en) | Image processing system for vehicle | |
| CN115087584A (en) | Vehicle Trailer Guidance System | |
| JP6014433B2 (en) | Image processing apparatus, image processing method, and image processing system | |
| US11760262B2 (en) | Surround view monitoring system and method for vehicle, and parking assist control system of vehicle | |
| JP5991648B2 (en) | Display control device for vehicle | |
| WO2013046407A1 (en) | Image display device, and image display method | |
| WO2013046408A1 (en) | Image display device and image display method | |
| JP2012076483A (en) | Parking support device | |
| JP2005186648A (en) | Vehicle periphery visual recognition device and display control device | |
| JP7631275B2 (en) | Mobile body and imaging device installation method | |
| WO2014155953A1 (en) | Vehicle-surroundings-monitoring control device | |
| WO2018159016A1 (en) | Bird's eye view image generation device, bird's eye view image generation system, bird's eye view image generation method and program | |
| US12185017B2 (en) | Display control apparatus, vehicle, and display control method | |
| US12240386B2 (en) | Vehicle sensing system with enhanced obstacle detection forward and sideward of the vehicle | |
| JP5445719B2 (en) | Image display device and image display method | |
| JP6617462B2 (en) | Vehicle periphery visual recognition device | |
| US12407794B2 (en) | Method and apparatus for SVM top view image processing | |
| KR20170133743A (en) | Vehicle control system based on user input and method thereof | |
| JP2016070951A (en) | Display device, control method, program, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: HYUNDAI MOTOR COMPANY, KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, MIN CHUL;REEL/FRAME:064128/0385. Effective date: 20230614. Owner name: KIA CORPORATION, KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, MIN CHUL;REEL/FRAME:064128/0385. Effective date: 20230614 |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | CC | Certificate of correction | |