
US12407794B2 - Method and apparatus for SVM top view image processing - Google Patents

Method and apparatus for SVM top view image processing

Info

Publication number
US12407794B2
Authority
US
United States
Prior art keywords
top view
view image
region
vehicle
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US18/345,323
Other versions
US20240236276A1 (en)
Inventor
Min Chul Kang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Motor Co
Kia Corp
Original Assignee
Hyundai Motor Co
Kia Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyundai Motor Co and Kia Corp
Assigned to HYUNDAI MOTOR COMPANY and KIA CORPORATION (assignment of assignors interest; see document for details). Assignors: KANG, MIN CHUL
Publication of US20240236276A1
Application granted
Publication of US12407794B2
Legal status: Active
Adjusted expiration

Classifications

    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • B60Q9/005: Signal devices for parking purposes, e.g. for warning of contact with an obstacle, using a video camera
    • B60Q9/006: Signal devices for parking purposes, using a distance sensor
    • B60R1/27: Real-time viewing arrangements for viewing an area outside the vehicle, providing all-round vision, e.g. using omnidirectional cameras
    • B60R16/023: Electric circuits specially adapted for vehicles, for transmission of signals between vehicle parts or subsystems
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W40/02: Estimation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/105: Estimation of driving parameters related to vehicle motion: speed
    • G06T7/70: Image analysis; determining position or orientation of objects or cameras
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects
    • H04N5/2628: Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, perspective, translation
    • H04N7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • B60R2300/102: Viewing arrangements using a 360 degree surveillance camera system
    • B60R2300/105: Viewing arrangements using multiple cameras
    • B60R2300/30: Viewing arrangements characterised by the type of image processing
    • B60R2300/607: Displaying vehicle exterior scenes from a transformed perspective, from a bird's eye viewpoint
    • B60R2300/806: Viewing arrangements for aiding parking
    • B60R2300/8093: Viewing arrangements for obstacle warning
    • B60W2050/143: Means for informing or warning the driver: alarm means
    • B60W2050/146: Means for informing or warning the driver: display means
    • B60W2420/403: Sensors: image sensing, e.g. optical camera
    • B60W2520/10: Vehicle dynamics input: longitudinal speed
    • B60W2540/18: Occupant input: steering angle
    • B60W2552/50: Infrastructure input: barriers
    • B60Y2400/303: Vehicle unit sensors: speed sensors
    • G06T2207/30261: Image analysis context: obstacle
    • G06T2207/30264: Image analysis context: parking

Definitions

  • the present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image.
  • a surround view monitor (SVM) system is a parking assistance system which displays the surroundings of a vehicle with a top view image. Using this system, a driver can easily check parking lines or obstacles in blind zones through an indoor monitor.
  • the SVM system typically uses four super-wide-angle cameras, each having a view angle of 180 degrees or greater.
  • the SVM system performs a perspective transformation procedure for correcting images collected by the camera to transform the parking lines, which appear curved due to a super-wide-angle lens, into straight lines. After the perspective transformation, four images are combined into one to provide a top view image.
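The perspective transformation described above can be modelled as a planar homography: a 3x3 matrix maps pixels of each (distortion-corrected) camera image onto a common ground plane before the four results are combined. The sketch below shows only the core point mapping; the matrix values are illustrative, not taken from the patent.

```python
def warp_point(H, u, v):
    """Apply 3x3 homography H (nested lists) to pixel (u, v)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w  # homogeneous -> Cartesian

# Hypothetical homography: identity plus a small perspective term.
H = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.001, 1.0]]

ground_x, ground_y = warp_point(H, 100.0, 200.0)
```

In a real SVM pipeline, H would come from camera calibration (e.g. ground-plane correspondences per camera), and the warp would be applied densely to every output pixel before stitching.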
  • in conventional systems, the top view images that may be selected by a driver are limited to (1) front and rear top view images and (2) omnidirectional standard, reduced, and enlarged top view images.
  • the parking distance warning (PDW) system is a system which detects an object around the vehicle using ultrasonic waves and then warns the driver through an auditory or visual display.
  • the present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image.
  • Particular embodiments relate to a method and an apparatus for processing an SVM top view image which can provide top view images having different regions of interest to a driver in various parking situations in cooperation with a parking distance warning (PDW) system.
  • embodiments of the present disclosure provide a method and an apparatus that present top view images having different regions of interest to the driver according to the situation around the vehicle by operating an SVM system and a PDW system in cooperation with each other. Information around the vehicle can thus be conveyed intuitively and clearly, increasing the driver's parking convenience.
  • an image processing apparatus includes an image collection unit configured to collect an image of the surroundings around a vehicle using at least one camera attached to the vehicle.
  • An object recognition sensor unit is configured to collect obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle.
  • a first display unit is configured to display a parking warning image based on the obstacle location information.
  • a control unit is configured to generate a top view image based on the image of the surroundings, and to determine whether to display the parking warning image based on the obstacle location information.
  • a second display unit is configured to display the top view image. When a determination is made to display the parking warning image, the control unit generates the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information and automatically displays the top view image on the second display unit.
  • an image processing method includes an image collection step of collecting an image of the surroundings around a vehicle using at least one camera attached to the vehicle; a top view image display step of generating a top view image based on the image of the surroundings and displaying the top view image on a central display; an obstacle location information collection step of collecting obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle; a parking warning determination step of determining whether to display a parking warning image based on the obstacle location information and displaying the parking warning image on a cluster display; and a top view image generation step of generating, when the parking warning image is displayed, the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information and automatically displaying it on the central display.
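The determination step above can be sketched as a small decision function. The function and record names below are hypothetical, since the patent describes behaviour rather than an API; the coordinate convention (y > 0 means ahead of the vehicle) is likewise an assumption.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    x: float  # metres, lateral offset in the vehicle frame (hypothetical)
    y: float  # metres, y > 0 means ahead of the vehicle (assumption)

def choose_view(obstacle):
    """Pick the region of interest for the top view: focus on the obstacle
    when a parking warning is active, otherwise keep the standard view."""
    if obstacle is None:
        return "omnidirectional_standard"
    if obstacle.y >= 0:
        return "front_focused"
    return "rear_focused"

assert choose_view(None) == "omnidirectional_standard"
assert choose_view(Obstacle(0.5, 1.2)) == "front_focused"
assert choose_view(Obstacle(-0.3, -0.8)) == "rear_focused"
```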
  • FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating a camera and an ultrasonic sensor attached to a vehicle including the image processing apparatus according to an embodiment of the present disclosure.
  • FIG. 3 is a diagram illustrating a cluster display and a central display attached to the vehicle including the image processing apparatus according to an embodiment of the present disclosure.
  • FIG. 4 A is a diagram illustrating a top view image displayed on a first region of a second display unit according to some embodiments of the present disclosure.
  • FIG. 4 B is a diagram illustrating a top view image displayed on a second region of the second display unit according to some embodiments of the present disclosure.
  • FIG. 5 is a diagram illustrating a warning image displayed on a first display unit according to some embodiments of the present disclosure.
  • FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
  • FIGS. 7 A to 7 E are conceptual diagrams illustrating a cooperative operation between an SVM system and a PDW system according to embodiments of the present disclosure.
  • FIGS. 8 A and 8 B are conceptual diagrams illustrating a top view image setting process of a user in an image control device according to some embodiments of the present disclosure.
  • FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
  • the image processing apparatus may include an SVM system 100 , a control unit 200 , a PDW system 300 , a speed sensor unit 400 , a gear sensor unit 500 , and a steering-angle sensor unit 600 .
  • the SVM system 100 may include an image collection unit 110 , a second display unit 120 , and a user input unit 130 .
  • the image collection unit 110 may include cameras 110 a to 110 d .
  • the cameras 110 a to 110 d may be located on the front, rear, and/or left and right sides of the vehicle.
  • the cameras 110 a to 110 d may collect images of the surroundings including obstacles (e.g., surrounding vehicles, pedestrians, pillars, etc.) by photographing the front, rear, and/or left and right sides of the vehicle.
  • the image collection unit 110 may provide the collected images of the surroundings to the control unit 200 .
  • the cameras 110 a to 110 d may each include an image sensor, such as a complementary metal-oxide semiconductor (CMOS) sensor, a charge-coupled device (CCD), or an active pixel sensor, and a lens, such as a linear lens, a concave lens, a convex lens, a wide-angle lens, or a fish-eye lens.
  • the second display unit 120 may display a top view image.
  • the top view image may include a driver's vehicle, a surrounding vehicle, a pedestrian, a pillar, etc.
  • the second display unit 120 may be located on a dashboard of the vehicle disposed between a driver's seat and a passenger's seat.
  • the second display unit 120 is not limited to the location shown in FIG. 3 .
  • the second display unit 120 may be divided into a first region 120 a and a second region 120 b .
  • the first region 120 a may be located on the left side of the second display unit 120
  • the second region 120 b may be located on the right side of the second display unit 120 .
  • a screen of the second display unit 120 is not limited to the structure shown in FIG. 3 .
  • the second display unit 120 may display an image of the surroundings photographed through each of the cameras 110 a to 110 d on each of the screens 120 a and 120 b .
  • the screen displayed on each of the first region 120 a and the second region 120 b will be described with reference to FIGS. 4 A and 4 B .
  • FIG. 4 A is a diagram illustrating some embodiments of the top view image displayed on the first region 120 a of the second display unit 120 .
  • image (a) is the first region 120 a showing the top view image of the front of the vehicle 800 .
  • a region of interest includes the entire front region of the vehicle 800 .
  • image (b) is the first region 120 a showing the top view image of the rear of the vehicle 800 .
  • a region of interest includes the entire rear region of the vehicle 800 .
  • image (c) of FIG. 4 A is the first region 120 a which displays the top view image of the changed region of interest.
  • to display the changed region of interest, each of the cameras 110 a to 110 d readjusts a focal point and/or a principal point, and the control unit 200 changes the reference point of an image collected by the image collection unit 110 and then performs perspective transformation again.
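Changing the reference point before re-running the perspective transformation can be modelled as composing the original homography with a translation in the top-view plane, which avoids refitting the transformation from scratch. The sketch below assumes a 3x3 nested-list homography and illustrative values; it is not the patent's implementation.

```python
def apply_homography(H, u, v):
    """Apply 3x3 homography H (nested lists) to point (u, v)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w

def shift_reference(H, dx, dy):
    """Compose H with an output-plane translation (dx, dy), i.e. T @ H,
    re-centring the top view on a new reference point."""
    return [
        [H[0][0] + dx * H[2][0], H[0][1] + dx * H[2][1], H[0][2] + dx * H[2][2]],
        [H[1][0] + dy * H[2][0], H[1][1] + dy * H[2][1], H[1][2] + dy * H[2][2]],
        list(H[2]),
    ]

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
H_shifted = shift_reference(IDENTITY, 5.0, 3.0)  # shift view 5 right, 3 up
```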
  • the image displayed on the first region 120 a is not limited to the embodiments shown in FIG. 4 A . Those skilled in the art will recognize from embodiments of the present disclosure that various top view images having different regions of interest may be displayed on the first region 120 a.
  • FIG. 4 B is a diagram illustrating some embodiments of the top view image displayed on the second region 120 b of the second display unit 120 .
  • image (a) is a second region 120 b displaying an omnidirectional standard top view image of the vehicle 800 .
  • image (b) is a second region 120 b displaying an omnidirectional reduced top view image of the vehicle 800 .
  • image (c) is a second region 120 b displaying an omnidirectional enlarged top view image of the vehicle 800 .
  • the image displayed on the second region 120 b is not limited to the embodiments shown in FIG. 4 B .
  • various omnidirectional top view images having different sizes may be displayed on the second region 120 b .
  • various configurations of top view images may be displayed on the second display unit 120 .
  • each of the images displayed on the first region 120 a and the second region 120 b may overlap or have different relative sizes or different positions.
  • an additional or auxiliary screen may be provided in addition to the first region 120 a and the second region 120 b.
  • the second display unit 120 may be configured as a physical device including any one of an LCD display, an OLED display, an LED display, a flat panel display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
  • the user input unit 130 may apply power to the SVM system 100 or set the image of the first region 120 a or the second region 120 b by receiving a driver's input.
  • the user input unit 130 may include a touch panel.
  • the user input unit 130 may be coupled with the second display unit 120 to be provided as a touch screen.
  • the user input unit 130 may include an integrated module in which a touch panel is coupled to the central display, i.e., the second display unit 120 , in a stacked structure.
  • the user input unit 130 may sense a driver's touch input and may output a touch event value corresponding to the sensed touch signal.
  • the touch panel may be implemented as various types of touch sensors such as a capacitive type, a resistive type, or a piezoelectric type.
  • the PDW system 300 is a parking assistance system which assists the driver in parking by giving notice of a possible collision when there is a risk of collision between the vehicle and a surrounding object.
  • the PDW system 300 may include an object recognition sensor unit 310 , a first display unit 320 , and a PDW power supply unit 330 .
  • the object recognition sensor unit 310 may sense an object around the vehicle 800 and may provide information about the object to the control unit 200 .
  • the object recognition sensor unit 310 may include ultrasonic sensors 310 a to 310 d .
  • the ultrasonic sensors 310 a to 310 d may be located on the front, rear, and/or left and right sides of the vehicle 800 .
  • the ultrasonic sensors 310 a to 310 d may emit ultrasonic waves to the front, rear, and/or left and right sides of the vehicle 800 and may receive ultrasonic waves reflected from an obstacle (e.g., a surrounding vehicle, a pedestrian, a pillar, etc.).
  • the object recognition sensor unit 310 may provide reflected ultrasonic-wave information to the control unit 200 .
  • the control unit 200 may calculate information about the location, speed, and/or angle of the obstacle based on the reflected ultrasonic-wave information.
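A minimal sketch of how such location information might be derived from the reflected ultrasonic waves, assuming time-of-flight ranging and a far-field two-sensor bearing estimate. The sensor spacing and echo timings are illustrative; the patent does not specify the calculation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def tof_distance(echo_time_s):
    """Range from one sensor: the ultrasonic pulse travels out and back."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def bearing_from_two_ranges(d_left, d_right, baseline):
    """Far-field bearing (radians, 0 = straight ahead) of an obstacle seen
    by two sensors `baseline` metres apart: the range difference is
    approximately baseline * sin(theta)."""
    ratio = (d_left - d_right) / baseline
    return math.asin(max(-1.0, min(1.0, ratio)))  # clamp for safety

range_m = tof_distance(0.01)                      # 10 ms echo
angle = bearing_from_two_ranges(1.8, 1.7, 0.5)    # hypothetical readings
```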
  • the first display unit 320 may display one or more pieces of information including a direction in which an object is located, a distance between the vehicle 800 and the object, and a collision risk.
  • the first display unit 320 is disposed to face the driver's seat.
  • the first display unit 320 is not limited to the location shown in FIG. 3 .
  • the first display unit 320 may be coupled to the second display unit 120 .
  • the first display unit 320 may display a warning according to the warning level of the PDW system 300 .
  • the warning level of the PDW system 300 may be divided into four levels.
  • FIG. 5 is a diagram illustrating an image displayed on the first display unit 320 for each warning level of the PDW system 300 according to an embodiment of the present disclosure.
  • the warning level may be divided into a non-warning level, a first level, a second level, and a third level.
  • the first display unit 320 may notify the driver of each warning level by displaying a different image color for each warning level. By dividing the area around the vehicle 800 in the image into a plurality of areas and brightly displaying the area where an obstacle is located, the first display unit 320 may notify the driver of the location of the obstacle.
  • image (a) is an image displayed on the first display unit 320 in the non-warning level of the PDW system 300 .
  • the area around the vehicle 800 is displayed darkly or lightly on the first display unit 320 , i.e., without the surrounding oval shown in images (b), (c), and (d).
  • image (b) is an image displayed on the first display unit 320 in the first level warning of the PDW system 300 .
  • the first display unit 320 may display the entire area around the vehicle 800 in green or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in green.
  • image (c) is an image displayed on the first display unit 320 in the second level warning of the PDW system 300 .
  • the first display unit 320 may display the entire area around the vehicle 800 in yellow or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in yellow.
  • image (d) is an image displayed on the first display unit 320 in the third level warning of the PDW system 300 .
  • the first display unit 320 may display the entire area around the vehicle 800 in red or only an area (e.g., the oval surrounding vehicle 800 ) where the obstacle is located in the area around the vehicle 800 in red.
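The level-to-color behavior of FIG. 5 can be sketched as a simple lookup. The distance thresholds below are hypothetical placeholders, since the excerpt defines only the level names and colors, not the distance ranges:

```python
# Illustrative sketch only: the patent defines four warning levels and a
# color per level, but the numeric thresholds here are hypothetical.

WARNING_LEVELS = [
    # (minimum distance in meters, level name, display color)
    (1.2, "first",  "green"),
    (0.8, "second", "yellow"),
    (0.0, "third",  "red"),
]

def warning_for_distance(distance_m, warn_start_m=1.5):
    """Map an obstacle distance to (level, color); color None = non-warning."""
    if distance_m >= warn_start_m:
        return ("non-warning", None)
    for min_d, level, color in WARNING_LEVELS:
        if distance_m >= min_d:
            return (level, color)
    return ("third", "red")

print(warning_for_distance(2.0))
print(warning_for_distance(1.0))
print(warning_for_distance(0.3))
```

The first display unit would then tint either the whole surrounding area or only the area containing the obstacle with the returned color.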
  • the image displayed on the first display unit 320 is not limited to the embodiments shown in FIG. 5 .
  • the top view image may be displayed on the first display unit 320 .
  • the first display unit 320 may display an image in which the top view image and the warning image are overlapped.
  • various images having different colors and surrounding areas may be displayed on the first display unit 320 .
  • an additional or auxiliary screen may be provided in addition to the first display unit 320 .
  • the first display unit 320 may be a cluster display and may be configured as a physical device including any one of an LCD display, a PDP display, an OLED display, an FED display, an LED display, a flat panel display, a 3D display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
  • the PDW power supply unit 330 may drive the PDW system 300 by receiving a driver's input.
  • the PDW power supply unit 330 may include a power button.
  • the power button may be a touch type and may be combined with the user input unit 130 to be implemented as a touch screen.
  • the speed sensor unit 400 may sense the driving speed of the vehicle and may transmit the driving speed information to the control unit 200 .
  • the gear sensor unit 500 may sense the operation of a transmission gear lever by a driver's operation and may transmit information about the operation of the transmission gear lever to the control unit 200 .
  • the steering-angle sensor unit 600 may sense the steering angle of the vehicle as a steering wheel is operated and may transmit information about the steering angle to the control unit 200 .
  • the control unit 200 may include at least one core which may execute at least one command.
  • the control unit 200 may execute commands stored in a memory.
  • the control unit 200 may be a single processor or a plurality of processors.
  • the control unit 200 may include at least one of an advanced driver assistance system (ADAS), a central processing unit (CPU), a microprocessor, a graphic processing unit (GPU), application specific integrated circuits (ASICs), or field programmable gate arrays (FPGAs), but embodiments of the present disclosure are not limited thereto.
  • the control unit 200 may be implemented with software and hardware including the SVM system 100 .
  • the control unit 200 may transform the collected images into a top view image, which is a perspective from above the vehicle. Meanwhile, since the specific method by which distorted images of the surroundings of the vehicle are transformed using a perspective transformation matrix and combined into one top view image is known in the image processing field, a detailed description thereof will be omitted.
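Although the transformation details are omitted as known art, the core operation is a homography: a 3x3 perspective matrix maps each camera pixel to ground-plane coordinates of the top view. A minimal sketch (the matrix values below are made up for illustration, not real calibration data):

```python
# Minimal homography sketch: a 3x3 perspective matrix H (obtained from
# camera calibration in practice; these values are made up) maps a
# camera pixel (u, v) to ground-plane coordinates for the top view.

H = [[1.0, 0.2,  -10.0],
     [0.0, 1.5,   -5.0],
     [0.0, 0.001,  1.0]]

def to_top_view(u, v, matrix):
    """Apply the homography in homogeneous coordinates, then dehomogenize."""
    x, y, w = (row[0] * u + row[1] * v + row[2] for row in matrix)
    return x / w, y / w

print(to_top_view(100.0, 50.0, H))
```

A full SVM pipeline would apply one such matrix per camera 110 a to 110 d and blend the four remapped images into the single top view.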
  • the control unit 200 may change the top view image displayed on the second display unit 120 based on the driver's input. For example, when a user scrolls the touch panel of the user input unit 130 , a top view image in which the region of interest is changed according to the scroll input may be formed. Further, when the user touches the touch panel of the user input unit 130 , a top view image in which the region of interest is changed according to the touch input may be formed. Meanwhile, since the specific method of performing the perspective transformation by changing a reference point according to the region of interest in the distorted image of the surroundings of the vehicle is known in the image processing field, a detailed description thereof will be omitted.
  • the control unit 200 may be implemented with software and hardware including the PDW system 300 .
  • the control unit 200 may recognize a driver's input for activating (turning on) the PDW system 300 .
  • the control unit 200 may determine the parking warning level for the vehicle to be parked.
  • the warning level may be divided into four levels (the non-warning level, the first level, the second level, and the third level).
  • the PDW system 300 may recognize the location and distance of the obstacle (e.g., another vehicle, a pillar, a pedestrian, etc.) located around the vehicle 800 through the ultrasonic sensors 310 a to 310 d .
  • the control unit 200 may determine the parking warning level based on information about the location and/or distance of the obstacle and the vehicle 800 .
  • the control unit 200 may transmit the warning level to the SVM system 100 when the warning level is determined.
  • the SVM system 100 may display the top view image on the first region 120 a and the second region 120 b based on the warning level and obstacle information transmitted from the control unit 200 .
  • a front top view or a rear top view is displayed on the first region 120 a
  • a reduced top view is displayed on the second region 120 b . That is, the region of interest (ROI) is set omnidirectionally and the top view range is set to the reduced range.
  • an image in which the obstacle is enlarged is displayed on the first region 120 a to draw the driver's attention and allow the driver to easily identify a surrounding obstacle, and a reduced top view is displayed on the second region 120 b . That is, the region of interest is set to the region around the vehicle in which the obstacle is located, and the top view range is set to the reduced range.
  • the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a
  • a standard top view or an enlarged top view may be displayed on the second region 120 b.
  • a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the first warning level, and a standard top view is displayed on the second region 120 b .
  • a region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a
  • a reduced top view or an enlarged top view may be displayed on the second region 120 b.
  • a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the second warning level, and an enlarged top view is displayed on the second region 120 b .
  • embodiments of the present disclosure are not limited thereto, as the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a , and a reduced top view or a standard top view may be displayed on the second region 120 b.
  • the control unit 200 may collect information on the steering angle of the vehicle 800 from the steering-angle sensor unit 600 and may calculate the reverse path of the vehicle based on the collected steering-angle information and the top view image.
  • the control unit 200 may calculate the collision possibility between the vehicle 800 and the obstacle based on the reverse path, the top view image, and the obstacle information.
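A hedged sketch of the two steps above, using a kinematic bicycle model for the reverse path and a simple clearance test for the collision check. The wheelbase, sampling step, and clearance are assumed values; the patent does not disclose its path or collision model:

```python
import math

# Hedged sketch: reverse path from the steering angle via a kinematic
# bicycle model, plus a clearance test against an obstacle position.
# Wheelbase, path length, and clearance are hypothetical values.

def reverse_path(steering_deg, wheelbase_m=2.7, length_m=3.0, n=12):
    """Sample (x, y) points of the predicted reverse arc in the vehicle
    frame (x: lateral, y: negative = rearward of the rear axle)."""
    delta = math.radians(steering_deg)
    pts = []
    for i in range(1, n + 1):
        s = length_m * i / n                   # arc length traveled so far
        if abs(delta) < 1e-6:                  # straight reverse
            pts.append((0.0, -s))
        else:
            r = wheelbase_m / math.tan(delta)  # turning radius
            pts.append((r * (1 - math.cos(s / r)), -r * math.sin(s / r)))
    return pts

def path_collides(path, obstacle_xy, clearance_m=0.5):
    """True if any sampled path point passes within clearance_m."""
    ox, oy = obstacle_xy
    return any(math.hypot(px - ox, py - oy) < clearance_m for px, py in path)
```

With a steering input the arc bends toward the steered side, which is what lets a controller flag an obstacle lying along the predicted reverse path.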
  • the control unit 200 may provide the reduced and/or enlarged top view image to a remote smart parking assist (RSPA) system with reference to the location and angle of a parking line to easily secure the distance between the vehicle 800 and the parking line that is minimally required for activating the RSPA function.
  • Not all components shown in FIG. 1 are essential to embodiments of the present disclosure; some components included in the image processing apparatus may be added, changed, or deleted in other embodiments. Meanwhile, the components shown in FIG. 1 represent functional elements which are functionally distinct from each other, and a plurality of components may be integrated with one another in an actual physical environment. In addition, the function of one component may be distributed across a plurality of physical devices in a computer system.
  • FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
  • step S 610 the control unit 200 determines whether the driver applies power to the PDW system 300 from the PDW power supply unit 330 .
  • step S 670 the second display unit 120 displays a front top view having the front as the region of interest in the first region 120 a by default and displays a standard top view in the second region 120 b .
  • the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
  • step S 620 the control unit 200 determines whether the speed of the vehicle 800 is less than a preset speed (e.g., 10 kph) based on information on the speed of the vehicle 800 obtained from the speed sensor unit 400 .
  • step S 640 the control unit 200 determines whether the gear of the vehicle 800 is set to a D-gear (drive) or an R-gear (reverse) based on the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500 .
  • step S 660 the second display unit 120 displays a front top view having the front area as the region of interest in the first region 120 a by default and displays a reduced top view in the second region 120 b .
  • the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
  • step S 640 When the gear of the vehicle 800 is not the D-gear or the R-gear, the process moves from step S 640 to step S 670 .
  • the process may move to step S 660 or step S 670 regardless of the gear information of the vehicle 800 .
  • the process may move to step S 670 .
  • step S 630 the control unit 200 determines whether the gear of the vehicle 800 is set to a P-gear (park) or an N-gear (neutral) through the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500 .
  • step S 650 the SVM system 100 and the PDW system 300 are cooperatively operated. Such a cooperative operation may be performed through the control unit 200 .
  • a detailed process in which the SVM system 100 and the PDW system 300 are cooperatively operated will be described below with reference to FIGS. 7 A to 7 E .
  • step S 630 When the gear of the vehicle 800 is not the P-gear or the N-gear, the process moves from step S 630 to step S 670 .
  • the process may move to step S 650 or step S 670 regardless of the gear information of the vehicle 800 .
  • the process may move to step S 670 .
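The flow of steps S610 to S670 can be summarized as branching logic. The excerpt leaves the exact ordering of the gear checks (S630 and S640) partly ambiguous, so this sketch assumes both gear checks run only below the preset speed; the step labels follow the flowchart, everything else is illustrative:

```python
# Sketch of the FIG. 6 flow (steps S610-S670). The branch ordering
# between the gear checks is an assumption; comments note the display
# associated with each terminal step.

def select_display_mode(pdw_power_on, speed_kph, gear, preset_kph=10):
    """Return the flowchart step the process lands on."""
    if not pdw_power_on or speed_kph >= preset_kph:   # S610 / S620
        return "S670"   # front top view + standard top view (default)
    if gear in ("P", "N"):                            # S630
        return "S650"   # SVM/PDW cooperative operation
    if gear in ("D", "R"):                            # S640
        return "S660"   # front top view + reduced top view
    return "S670"

print(select_display_mode(True, 5, "P"))   # low speed, parked -> S650
```

Per the text, the driver may still override the default regions of interest individually, so the terminal steps fix only the initial display.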
  • FIGS. 7 A to 7 E are conceptual diagrams illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to embodiments of the present disclosure.
  • FIG. 7 A is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t 1 , t 2 , and t 3 in the first level warning situation in the image control device according to an embodiment of the present disclosure.
  • reference numerals will be omitted for the convenience of understanding.
  • the parking warning is not displayed on the first display unit 320 , the front top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
  • a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
  • the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
  • the obstacle 700 is located on the right side of the front of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset first level warning distance range.
  • the object recognition sensor unit 310 transmits information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d ) receiving the ultrasonic signal to the control unit 200 .
  • the control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on information received from the object recognition sensor unit 310 .
  • the control unit 200 determines the warning level as the first level based on the calculated result.
  • the control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320 .
  • the first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200 .
  • the control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110 , the warning level obtained from the PDW system 300 , and information on the obstacle 700 .
  • the new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700 .
  • the control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700 .
  • the control unit 200 transmits the generated image to the second display unit 120 .
  • the top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional reduced top view image is displayed on the second region 120 b so that the driver can easily identify the situation around where the vehicle 800 and the obstacle 700 are located.
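A minimal sketch of deriving the narrower region of interest around the obstacle. The coordinate convention and margin are hypothetical; the patent states only that the new region includes the obstacle and is narrower than the omnidirectional region:

```python
# Illustrative sketch: a region of interest that contains both the
# vehicle footprint and the detected obstacle, with a small margin.
# Frame convention and margin value are assumptions.

def roi_around_obstacle(vehicle_box, obstacle_xy, margin_m=0.5):
    """vehicle_box = (x_min, y_min, x_max, y_max) in ground coordinates;
    returns the tight box covering vehicle and obstacle plus margin."""
    vx0, vy0, vx1, vy1 = vehicle_box
    ox, oy = obstacle_xy
    return (min(vx0, ox) - margin_m, min(vy0, oy) - margin_m,
            max(vx1, ox) + margin_m, max(vy1, oy) + margin_m)

# An obstacle at the right front corner pulls the ROI toward that side:
print(tuple(round(v, 2)
            for v in roi_around_obstacle((-0.9, -2.3, 0.9, 2.3), (1.6, 2.8))))
```

The collected camera images would then be re-transformed and combined over this smaller region, yielding the obstacle-focused top view for the first region 120 a.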
  • a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
  • FIG. 7 B is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t 1 , t 2 , and t 3 in the third level warning situation in the image control device according to an embodiment of the present disclosure.
  • reference numerals will be omitted for the convenience of understanding.
  • the parking warning is not displayed on the first display unit 320 , the front top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
  • a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
  • the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
  • the obstacle 700 is located on the right side of the front of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range.
  • the object recognition sensor unit 310 transmits information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d ) receiving the ultrasonic signal to the control unit 200 .
  • the control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on information received from the object recognition sensor unit 310 .
  • the control unit 200 determines the warning level as the third level based on the calculated result.
  • the control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320 .
  • the first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200 .
  • the control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110 , the warning level obtained from the PDW system 300 , and information on the obstacle 700 .
  • the new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700 .
  • the control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700 .
  • the control unit 200 transmits the generated image to the second display unit 120 .
  • the top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional enlarged top view image is displayed on the second region 120 b so that the driver can easily identify the locations of the vehicle 800 and the obstacle 700 .
  • a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
  • FIG. 7 C is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the steering angle of the vehicle 800 according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
  • reference numerals will be omitted for the convenience of understanding.
  • FIG. 7 C shows a situation where the vehicle 800 is reversing.
  • a reverse path 810 according to the steering angle of the vehicle 800 is shown in the first region 120 a .
  • the reverse path 810 may be displayed on the first region 120 a and/or the second region 120 b so that the driver can easily identify the reverse path 810 .
  • the reverse path 810 may exist only conceptually for the cooperative operation of the SVM system 100 and the PDW system 300 .
  • the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
  • the warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
  • the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
  • the obstacle 700 is located on the right side of the rear of the vehicle 800 , and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
  • the control unit 200 determines the new region of interest 900 based on the steering angle received from the steering-angle sensor unit 600 , the image of the surroundings received from the image collection unit 110 , the warning level received from the PDW system 300 , and information on the obstacle 700 .
  • the control unit 200 first generates the top view image by transforming and combining the collected images.
  • the control unit 200 calculates the reverse path 810 of the vehicle 800 based on the generated top view image and steering-angle information.
  • the control unit 200 determines the new region of interest 900 including the reverse path 810 and the obstacle 700 based on the reverse path 810 and the obstacle 700 information.
  • the control unit 200 generates a new top view image focused on the reverse path 810 and the obstacle 700 by retransforming and recombining collected images according to the new region of interest 900 .
  • the control unit 200 transmits the generated image to the second display unit 120 .
  • the top view image in which the reverse path 810 and the obstacle 700 are focused is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional enlarged top view image is displayed on the second region 120 b to allow the driver to move to another space for avoiding the obstacle 700 .
  • a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
  • FIG. 7 D is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the locations of the obstacles 700 a and 700 b according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
  • reference numerals will be omitted for the convenience of understanding.
  • the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
  • warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
  • all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
  • the right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800 , and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800 .
  • the control unit 200 may set the region of interest by assigning a weight to an angle among a plurality of obstacles having the same warning level.
  • the control unit 200 may assign a high weight to an obstacle located in a diagonal direction, which has a relatively high probability of collision. In this case, the region of interest 900 a , focused on the right rear obstacle 700 a , is determined.
  • the top view image according to the third level warning and the new region of interest 900 a in which a weight is assigned to the angle of the obstacle is displayed in the first region 120 a and the second region 120 b.
  • an angle between the vehicle 800 and the obstacle to which the weight is assigned may be determined from a region where the space maps assigned to each of the cameras 110 a to 110 d overlap. In another embodiment, the angle between the vehicle 800 and the obstacle to which the weight is assigned may be determined as the angle at which a collision is most likely to occur according to steering angle information received from the steering-angle sensor unit 600 .
  • FIG. 7 E is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the locations of the obstacles 700 a and 700 b according to the time sequence t 1 , t 2 , and t 3 , in the image control device according to an embodiment of the present disclosure.
  • reference numerals will be omitted for the convenience of understanding.
  • the parking warning is not displayed on the first display unit 320 , the rear top view is displayed on the first region 120 a of the second display unit 120 , and the omnidirectional standard top view is displayed on the second region 120 b.
  • warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
  • all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
  • the right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800 , and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800 .
  • the control unit 200 may set the region of interest by assigning a weight to a distance among a plurality of obstacles having the same warning level.
  • the control unit 200 may assign a high weight to an obstacle located within a short distance, which has a relatively high probability of collision. In this case, the region of interest 900 b , focused on the rear obstacle 700 b , is determined.
  • the top view image according to the third level warning and the new region of interest 900 b in which the weight is assigned to the distance of the obstacle is displayed in the first region 120 a and the second region 120 b.
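FIGS. 7D and 7E describe choosing among same-warning-level obstacles by weighting either angle or distance. One hypothetical scoring scheme (the diagonal-preference term and the weight values are illustrative; the patent states only that a weight may be assigned to angle or to distance):

```python
import math

# Hypothetical scoring for choosing which same-warning-level obstacle
# the new region of interest should focus on. Weights and the
# diagonal-preference term are illustrative, not from the patent.

def focus_obstacle(obstacles, w_angle=1.0, w_dist=1.0):
    """obstacles: (name, distance_m, bearing_deg off the vehicle axis).
    The highest-scoring obstacle becomes the ROI focus."""
    def score(obstacle):
        _, dist, bearing = obstacle
        diagonal = math.sin(math.radians(2 * bearing)) ** 2  # peaks at 45 deg
        return w_angle * diagonal + w_dist / max(dist, 0.1)  # nearer = higher
    return max(obstacles, key=score)[0]

rear       = ("rear obstacle 700b",       1.0,  0.0)   # closer, straight behind
right_rear = ("right rear obstacle 700a", 1.5, 45.0)   # farther, diagonal

print(focus_obstacle([rear, right_rear], w_angle=3.0))  # angle-weighted (FIG. 7D)
print(focus_obstacle([rear, right_rear], w_angle=0.0))  # distance-weighted (FIG. 7E)
```

With the angle weight dominant the diagonal obstacle 700 a is selected, as in FIG. 7D; with the distance weight dominant the nearer obstacle 700 b is selected, as in FIG. 7E.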
  • FIG. 8 A and FIG. 8 B are conceptual diagrams illustrating an image setting process as pixels of the camera increase and the region of interest varies in the image control device according to an embodiment of the present disclosure.
  • FIG. 8 A and FIG. 8 B illustrate a process in which the region of interest is changed as a driver scrolls or touches the touch panel, when the user input unit 130 includes the touch panel, in the image control device according to an embodiment of the present disclosure.
  • the touch panel when the touch panel is coupled to the second display unit 120 in a stacked structure, the touch panel may correspond to the first region 120 a and/or the second region 120 b.
  • the driver may scroll the second region 120 b on which the top view image of the default region of interest 815 is displayed. When the driver scrolls any location on the second region 120 b , the user input unit 130 may recognize the length and direction of the driver's scroll.
  • the control unit 200 receives scroll information from the user input unit 130 . Based on the scroll information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120 b.
  • the control unit 200 may generate the top view image of the region of interest which is set by a user to the extent that distortion of the top view image does not occur based on the pixel information of the cameras 110 a to 110 d.
  • the control unit 200 may restore the region of interest changed by the driver to the default. When the region of interest is restored to the default, the second region 120 b returns to the initial screen shown in FIG. 8 A .
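The scroll-driven shift of the region of interest, bounded by what the cameras can cover without distortion, can be sketched as follows. The meters-per-pixel scale and the coverage bounds are assumed values standing in for the pixel-information limit described above:

```python
# Sketch: shift the region of interest by a scroll gesture, clamped to
# the ground area the cameras can render without distortion. Scale and
# coverage bounds are hypothetical.

def scroll_roi(roi, scroll_px, m_per_px=0.02,
               coverage=(-8.0, -8.0, 8.0, 8.0)):
    """roi/coverage: (x_min, y_min, x_max, y_max); scroll_px: (dx, dy)."""
    dx, dy = scroll_px[0] * m_per_px, scroll_px[1] * m_per_px
    x0, y0, x1, y1 = roi
    cx0, cy0, cx1, cy1 = coverage
    dx = min(max(dx, cx0 - x0), cx1 - x1)  # keep ROI inside camera coverage
    dy = min(max(dy, cy0 - y0), cy1 - y1)
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)

print(scroll_roi((-3.0, -3.0, 3.0, 3.0), (100, 0)))   # shifts 2 m laterally
print(scroll_roi((-3.0, -3.0, 3.0, 3.0), (1000, 0)))  # clamped at the edge
```

Restoring the default would simply mean regenerating the top view with the original `roi`, matching the return to the initial screen described above.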
  • the driver may touch the second region 120 b on which the top view image of the default region of interest 815 is displayed.
  • the user input unit 130 may recognize a driver's touch location.
  • the control unit 200 receives touch information from the user input unit 130 . Based on the touch information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120 b.
  • the driver may set a region of interest 830 which is further enlarged to the front of the vehicle 800 by touching the new region of interest 820 .
  • the control unit 200 may generate the top view image of the region of interest which is set by a user to the extent that distortion of the top view image does not occur based on the pixel information of the cameras 110 a to 110 d.
  • the control unit 200 may restore the region of interest changed by the driver to the default. When the region of interest is restored to the default, the second region 120 b returns to the initial screen shown in FIG. 8 B .
  • Each component of the apparatus or method according to embodiments of the present disclosure may be implemented as hardware, software, or a combination of hardware and software. Further, the function of each component may be implemented as software, and a microprocessor may execute the software function corresponding to each component.
  • Various implementations of systems and techniques described herein may be realized as digital electronic circuits, integrated circuits, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof.
  • These various implementations may include one or more computer programs executable on a programmable system.
  • the programmable system includes at least one programmable processor (which may be a special-purpose processor or a general-purpose processor) coupled to receive and transmit data and instructions from and to a storage system, at least one input device, and at least one output device.
  • the computer programs (also known as programs, software, software applications, or code)
  • the computer-readable recording medium includes all types of recording devices in which data readable by a computer system is stored.
  • a computer-readable recording medium may be a non-volatile or non-transitory medium, such as ROM, a CD-ROM, a magnetic tape, a floppy disk, a memory card, a hard disk, a magneto-optical disk, or a storage device, and may further include a transitory medium such as a data transmission medium.
  • the computer-readable recording medium may be distributed in a computer system connected via a network, so that computer-readable codes may be stored and executed in a distributed manner.


Abstract

An embodiment image processing apparatus includes an image collector for collecting an image of surroundings around a vehicle using a camera attached to the vehicle, an object recognition sensor for collecting obstacle location information of an obstacle located around the vehicle, a first display for displaying a parking warning image based on the obstacle location information, a controller for generating a top view image based on the image of the surroundings and for determining whether to display the parking warning image based on the obstacle location information, and a second display for displaying the top view image. When the parking warning image is to be displayed, the controller generates the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information and automatically displays the top view image on the second display.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of Korean Patent Application No. 10-2023-0003098, filed on Jan. 9, 2023, which application is hereby incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image.
BACKGROUND
The content described in this section merely provides background information on the present embodiments and does not constitute the prior art.
A surround view monitor (SVM) system is a parking assistance system which displays the surroundings of a vehicle with a top view image. Using this system, a driver can easily check parking lines or obstacles in blind zones through an indoor monitor.
The SVM system typically uses four super-wide-angle cameras, each having a view angle of 180 degrees or greater. The SVM system performs a perspective transformation procedure that corrects the images collected by the cameras, transforming the parking lines, which appear curved due to the super-wide-angle lenses, into straight lines. After the perspective transformation, the four images are combined into one to provide a top view image.
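The perspective transformation described above can be illustrated in miniature. The sketch below is not the patent's implementation; the function and matrix values are hypothetical. It shows the per-pixel operation underlying such a correction: mapping one pixel coordinate through a 3×3 homography matrix in homogeneous coordinates.

```python
# Illustrative sketch (not the patent's implementation): mapping a
# single pixel coordinate (x, y) through a 3x3 perspective-transform
# (homography) matrix H. Warping a whole camera image to a top view
# applies this same operation per pixel. Matrix values are hypothetical.

def warp_point(H, x, y):
    """Map pixel (x, y) through homography H using homogeneous coordinates."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w  # divide out the homogeneous scale

# The identity homography leaves points unchanged.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(warp_point(I, 10.0, 20.0))  # (10.0, 20.0)
```

In practice the matrix is estimated from calibration points on the ground plane; combining the four warped images into one top view is then a stitching step.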
In the existing SVM system, top view images that may be selected by a driver are limited to (1) front and rear top view images and (2) omnidirectional standard, reduced, and enlarged top view images. In addition, it is inconvenient for the driver to manually select an appropriate top view image according to the parking situation.
On the other hand, a parking distance warning (PDW) system detects objects around the vehicle using ultrasonic waves and warns the driver through auditory or visual alerts.
The existing SVM system and PDW system operate independently. This limits how precisely a driver can make driving decisions in various parking situations, so the situation around the vehicle is not conveyed to the driver intuitively and clearly.
SUMMARY
The present disclosure relates to a method and an apparatus for processing a surround view monitor (SVM) top view image. Particular embodiments relate to a method and an apparatus for processing an SVM top view image which can provide top view images having different regions of interest to a driver in various parking situations in cooperation with a parking distance warning (PDW) system.
In view of the above, embodiments of the present disclosure provide a method and an apparatus that provide top view images having different regions of interest to a driver according to the situation around a vehicle by operating an SVM system and a PDW system in cooperation with each other. In this way, information around the vehicle can be provided to the driver intuitively and clearly, and the driver's parking convenience can be increased.
Features achievable by embodiments of the present disclosure are not limited to the above-mentioned features, and other features which are not mentioned will be clearly understood by those skilled in the art from the following description.
According to an embodiment of the present disclosure, an image processing apparatus is provided. The image processing apparatus includes an image collection unit configured to collect an image of the surroundings around a vehicle using at least one camera attached to the vehicle. An object recognition sensor unit is configured to collect obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle. A first display unit is configured to display a parking warning image based on the obstacle location information. A control unit is configured to generate a top view image based on the image of the surroundings, and to determine whether to display the parking warning image based on the obstacle location information. A second display unit is configured to display the top view image. When a determination is made to display the parking warning image, the control unit generates the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information and automatically displays the top view image on the second display unit.
According to another embodiment of the present disclosure, an image processing method is provided, the image processing method including an image collection step of collecting an image of surroundings around a vehicle using at least one camera attached to the vehicle, a top view image display step of generating a top view image based on the image of the surroundings and displaying the top view image on a central display, an obstacle location information collection step of collecting obstacle location information of an obstacle located around the vehicle using at least one object recognition sensor attached to the vehicle, a parking warning determination step of determining whether to display a parking warning image based on the obstacle location information and displaying the parking warning image on a cluster display, and a top view image generation step of generating the top view image focused on the obstacle based on the image of the surroundings and the obstacle location information, when the parking warning image is displayed, and automatically displaying the top view image on the central display.
According to an embodiment of the present disclosure, it is possible to provide various top view images to a driver by operating an SVM system and a PDW system in cooperation with each other and providing different regions of interest according to various parking situations.
According to an embodiment of the present disclosure, it is possible to improve a driver's parking convenience by operating an SVM system and a PDW system in cooperation with each other and automatically changing a top view image.
Effects of embodiments of the present disclosure are not limited to the above-mentioned effects, and other effects which are not mentioned will be clearly understood by those skilled in the art from the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a camera and an ultrasonic sensor attached to a vehicle including the image processing apparatus according to an embodiment of the present disclosure.
FIG. 3 is a diagram illustrating a cluster display and a central display attached to the vehicle including the image processing apparatus according to an embodiment of the present disclosure.
FIG. 4A is a diagram illustrating a top view image displayed on a first region of a second display unit according to some embodiments of the present disclosure.
FIG. 4B is a diagram illustrating a top view image displayed on a second region of the second display unit according to some embodiments of the present disclosure.
FIG. 5 is a diagram illustrating a warning image displayed on a first display unit according to some embodiments of the present disclosure.
FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
FIGS. 7A to 7E are conceptual diagrams illustrating a cooperative operation between an SVM system and a PDW system according to embodiments of the present disclosure.
FIGS. 8A and 8B are conceptual diagrams illustrating a top view image setting process of a user in an image control device according to some embodiments of the present disclosure.
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
Hereinafter, some exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the following description, like reference numerals designate like elements, although the elements are shown in different drawings. Further, in the following description of some embodiments, a detailed description of known functions and configurations incorporated therein will be omitted for the purpose of clarity and for brevity.
Additionally, various terms such as first, second, A, B, (a), (b), etc. are used solely to differentiate one component from another, not to imply or suggest the substance, order, or sequence of the components. Throughout this specification, when a part ‘includes’ or ‘comprises’ a component, the part is meant to further include other components, not to exclude them, unless specifically stated to the contrary.
The following detailed description, together with the accompanying drawings, is intended to describe exemplary embodiments of the present invention and is not intended to represent the only embodiments in which the present invention may be practiced.
FIG. 1 is a block diagram illustrating the configuration of an image processing apparatus according to an embodiment of the present disclosure.
Referring to FIG. 1 , the image processing apparatus may include an SVM system 100, a control unit 200, a PDW system 300, a speed sensor unit 400, a gear sensor unit 500, and a steering-angle sensor unit 600.
Referring to FIG. 1 , the SVM system 100 according to an embodiment may include an image collection unit 110, a second display unit 120, and a user input unit 130.
Referring to FIG. 2 , the image collection unit 110 may include cameras 110 a to 110 d. In an embodiment, the cameras 110 a to 110 d may be located on the front, rear, and/or left and right sides of the vehicle. The cameras 110 a to 110 d may collect images of the surroundings including obstacles (e.g., surrounding vehicles, pedestrians, pillars, etc.) by photographing the front, rear, and/or left and right sides of the vehicle. The image collection unit 110 may provide the collected images of the surroundings to the control unit 200.
The cameras 110 a to 110 d may each include an image sensor, such as a complementary metal-oxide semiconductor (CMOS) sensor, a charge-coupled device (CCD), or an active pixel sensor, and a lens, such as a linear lens, a concave lens, a convex lens, a wide-angle lens, or a fisheye lens. The cameras 110 a to 110 d may be analog or digital.
The second display unit 120 may display a top view image. The top view image may include a driver's vehicle, a surrounding vehicle, a pedestrian, a pillar, etc.
Referring to FIG. 3 , in an embodiment, the second display unit 120 may be located on a dashboard of the vehicle disposed between a driver's seat and a passenger's seat. However, the second display unit 120 is not limited to the location shown in FIG. 3 .
In an embodiment, the second display unit 120 may be divided into a first region 120 a and a second region 120 b. Referring to FIG. 3 , the first region 120 a may be located on the left side of the second display unit 120, while the second region 120 b may be located on the right side of the second display unit 120. However, a screen of the second display unit 120 is not limited to the structure shown in FIG. 3 .
In an embodiment, the second display unit 120 may display an image of the surroundings photographed through each of the cameras 110 a to 110 d on each of the screens 120 a and 120 b. The screen displayed on each of the first region 120 a and the second region 120 b will be described with reference to FIGS. 4A and 4B.
FIG. 4A is a diagram illustrating some embodiments of the top view image displayed on the first region 120 a of the second display unit 120.
In FIG. 4A, image (a) is the first region 120 a showing the top view image of the front of the vehicle 800. In image (a) of FIG. 4A, a region of interest includes the entire front region of the vehicle 800.
In FIG. 4A, image (b) is the first region 120 a showing the top view image of the rear of the vehicle 800. In image (b) of FIG. 4A, a region of interest includes the entire rear region of the vehicle 800.
Image (c) of FIG. 4A shows the first region 120 a displaying the top view image of a region of interest changed by a driver's manipulation or by the cooperative operation between the SVM system 100 and the PDW system 300. When the region of interest is changed, each of the cameras 110 a to 110 d readjusts a focal point and/or a principal point, and the control unit 200 changes the reference point of the image collected by the image collection unit 110 and then performs the perspective transformation again.
The image displayed on the first region 120 a is not limited to the embodiments shown in FIG. 4A. Those skilled in the art will recognize from embodiments of the present disclosure that various top view images having different regions of interest may be displayed on the first region 120 a.
FIG. 4B is a diagram illustrating some embodiments of the top view image displayed on the second region 120 b of the second display unit 120.
In FIG. 4B, image (a) is a second region 120 b displaying an omnidirectional standard top view image of the vehicle 800.
In FIG. 4B, image (b) is a second region 120 b displaying an omnidirectional reduced top view image of the vehicle 800.
In FIG. 4B, image (c) is a second region 120 b displaying an omnidirectional enlarged top view image of the vehicle 800.
The image displayed on the second region 120 b is not limited to the embodiments shown in FIG. 4B. Those skilled in the art will recognize from embodiments of the present disclosure that various omnidirectional top view images having different sizes may be displayed on the second region 120 b. Further, those skilled in the art will recognize from embodiments of the present disclosure that various configurations of top view images may be displayed on the second display unit 120. For example, each of the images displayed on the first region 120 a and the second region 120 b may overlap or have different relative sizes or different positions. Further, those skilled in the art will recognize from embodiments of the present disclosure that an additional or auxiliary screen may be provided in addition to the first region 120 a and the second region 120 b.
The second display unit 120 may be configured as a physical device including any one of an LCD display, an OLED display, an LED display, a flat panel display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
The user input unit 130 may apply power to the SVM system 100 or set the image of the first region 120 a or the second region 120 b by receiving a driver's input.
In an embodiment, the user input unit 130 may include a touch panel. The user input unit 130 may be coupled with the second display unit 120 to be provided as a touch screen. For example, the user input unit 130 may include an integrated module in which a touch panel is coupled to the central display, i.e., the second display unit 120, in a stacked structure.
In an embodiment, the user input unit 130 may sense a driver's touch input and may output a touch event value corresponding to the sensed touch signal. The touch panel may be implemented as various types of touch sensors such as a capacitive type, a resistive type, or a piezoelectric type.
The PDW system 300 is a parking assistance system that assists the driver in parking by notifying the driver of a possible collision when there is a risk of collision between a surrounding object and the vehicle.
Referring to FIG. 1 , according to an embodiment, the PDW system 300 may include an object recognition sensor unit 310, a first display unit 320, and a PDW power supply unit 330.
The object recognition sensor unit 310 may sense an object around the vehicle 800 and may provide information about the object to the control unit 200.
Referring to FIG. 2 , the object recognition sensor unit 310 may include ultrasonic sensors 310 a to 310 d. The ultrasonic sensors 310 a to 310 d may be located on the front, rear, and/or left and right sides of the vehicle 800.
The ultrasonic sensors 310 a to 310 d may emit ultrasonic waves to the front, rear, and/or left and right sides of the vehicle 800 and may receive ultrasonic waves reflected from an obstacle (e.g., a surrounding vehicle, a pedestrian, a pillar, etc.). The object recognition sensor unit 310 may provide reflected ultrasonic-wave information to the control unit 200. The control unit 200 may calculate information about the location, speed, and/or angle of the obstacle based on the reflected ultrasonic-wave information.
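As background to the distance calculation mentioned above, the following sketch illustrates the time-of-flight principle on which ultrasonic ranging relies. The function name and constant are illustrative, not taken from the patent: the echo travels to the obstacle and back, so the one-way distance is half the round-trip time multiplied by the speed of sound.

```python
# Illustrative time-of-flight sketch (not from the patent): converting
# the round-trip time of a reflected ultrasonic pulse into an obstacle
# distance. The speed of sound in air at ~20 degrees C is about 343 m/s.

SPEED_OF_SOUND_M_S = 343.0

def echo_to_distance_m(round_trip_s):
    """Convert a round-trip echo time in seconds to a one-way distance in meters."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo returning after about 5.8 ms corresponds to roughly 1 m.
print(echo_to_distance_m(0.0058))
```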
When the object around the vehicle 800 is sensed by the PDW system 300, the first display unit 320 may display one or more pieces of information including a direction in which an object is located, a distance between the vehicle 800 and the object, and a collision risk.
Referring to FIG. 3 , the first display unit 320 is disposed to face the driver's seat. However, the first display unit 320 is not limited to the location shown in FIG. 3 . For example, the first display unit 320 may be coupled to the second display unit 120.
In some embodiments, the first display unit 320 may display a warning according to the warning level of the PDW system 300. In an embodiment, the warning level of the PDW system 300 may be divided into four levels.
FIG. 5 is a diagram illustrating an image displayed on the first display unit 320 for each warning level of the PDW system 300 according to an embodiment of the present disclosure. The warning level may be divided into a non-warning level, a first level, a second level, and a third level.
The first display unit 320 may notify the driver of each warning level by displaying a different image color for each level. It may also notify the driver of the location of the obstacle by dividing the area around the vehicle 800 in the image into a plurality of areas and brightly displaying the area where the obstacle is located.
In FIG. 5 , image (a) is an image displayed on the first display unit 320 in the non-warning level of the PDW system 300. In the case of the non-warning level, the first display unit 320 displays the area around the vehicle 800 darkly or lightly, without the surrounding oval shown in images (b), (c), and (d).
In FIG. 5 , image (b) is an image displayed on the first display unit 320 in the first level warning of the PDW system 300. In the case of the first level warning, the first display unit 320 may display the entire area around the vehicle 800 in green or only an area (e.g., the oval surrounding vehicle 800) where the obstacle is located in the area around the vehicle 800 in green.
In FIG. 5 , image (c) is an image displayed on the first display unit 320 in the second level warning of the PDW system 300. In the case of the second level warning, the first display unit 320 may display the entire area around the vehicle 800 in yellow or only an area (e.g., the oval surrounding vehicle 800) where the obstacle is located in the area around the vehicle 800 in yellow.
In FIG. 5 , image (d) is an image displayed on the first display unit 320 in the third level warning of the PDW system 300. In the case of the third level warning, the first display unit 320 may display the entire area around the vehicle 800 in red or only an area (e.g., the oval surrounding vehicle 800) where the obstacle is located in the area around the vehicle 800 in red.
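One plausible way to express the four-level mapping described above is sketched below. The numeric distance thresholds are assumptions introduced purely for illustration; the patent does not specify distance values for the warning levels, only the level-to-color correspondence.

```python
# Hedged sketch: mapping an obstacle distance to the PDW warning level
# and display color described above. The distance thresholds (120/80/40 cm)
# are illustrative assumptions, not values from the patent.

def pdw_level(distance_cm):
    """Return (warning_level, display_color) for an obstacle at the given distance."""
    if distance_cm > 120:
        return 0, None      # non-warning level: no highlighted area
    if distance_cm > 80:
        return 1, "green"   # first level warning
    if distance_cm > 40:
        return 2, "yellow"  # second level warning
    return 3, "red"         # third level warning

print(pdw_level(100))  # (1, 'green')
```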
The image displayed on the first display unit 320 is not limited to the embodiments shown in FIG. 5 . In an embodiment, the top view image may be displayed on the first display unit 320. In another embodiment, the first display unit 320 may display an image in which the top view image and the warning image are overlapped. Those skilled in the art will recognize from embodiments of the present disclosure that various images having different colors and surrounding areas may be displayed on the first display unit 320. Further, those skilled in the art will recognize from embodiments of the present disclosure that an additional or auxiliary screen may be provided in addition to the first display unit 320.
The first display unit 320 may be a cluster display and may be configured as a physical device including any one of an LCD display, a PDP display, an OLED display, an FED display, an LED display, a flat panel display, a 3D display, and a transparent display, for example, but embodiments of the present disclosure are not limited thereto.
The PDW power supply unit 330 may drive the PDW system 300 by receiving a driver's input. In an embodiment, the PDW power supply unit 330 may include a power button. However, embodiments of the present disclosure are not limited thereto, as the power button may be a touch type and may be combined with the user input unit 130 to be implemented as a touch screen.
When power is applied to the PDW system 300, information on the obstacle around the vehicle is identified through the object recognition sensor unit 310, and the obstacle information is transmitted to the control unit 200.
The speed sensor unit 400 may sense the driving speed of the vehicle and may transmit the driving speed information to the control unit 200.
The gear sensor unit 500 may sense the operation of a transmission gear lever by a driver's operation and may transmit information about the operation of the transmission gear lever to the control unit 200.
The steering-angle sensor unit 600 may sense the steering angle of the vehicle as a steering wheel is operated and may transmit information about the steering angle to the control unit 200.
The control unit 200 may include at least one core which may execute at least one command. The control unit 200 may execute commands stored in a memory. The control unit 200 may be a single processor or a plurality of processors. The control unit 200 may include at least one of an advanced driver assistance system (ADAS), a central processing unit (CPU), a microprocessor, a graphic processing unit (GPU), application specific integrated circuits (ASICs), or field programmable gate arrays (FPGAs), but embodiments of the present disclosure are not limited thereto.
In an embodiment, the control unit 200 may be implemented with software and hardware including the SVM system 100. The control unit 200 may transform the collected images into a top view image, which is a perspective from above the vehicle. Meanwhile, since a specific method in which distorted images of the surroundings of the vehicle are transformed using a perspective transformation matrix and combined into one top view image is known in the image processing field, a detailed description thereof will be omitted.
In an embodiment, the control unit 200 may change the top view image displayed on the second display unit 120 based on the driver's input. For example, when a user scrolls the touch panel of the user input unit 130, a top view image in which the region of interest is changed according to the scroll input may be generated. Further, when the user touches the touch panel of the user input unit 130, a top view image in which the region of interest is changed according to the touch input may be generated. Meanwhile, since a specific method of performing the perspective transformation by changing a reference point according to the region of interest in the distorted image of the surroundings of the vehicle is known in the image processing field, a detailed description thereof will be omitted.
In an embodiment, the control unit 200 may be implemented with software and hardware including the PDW system 300. In an embodiment, as the driver manipulates a parking safety button, the control unit 200 may recognize a driver's input for activating (turning on) the PDW system 300.
In an embodiment, the control unit 200 may determine the parking warning level for the vehicle to be parked. In some embodiments, the warning level may be divided into four levels (the non-warning level, the first level, the second level, and the third level). The PDW system 300 may recognize the location and distance of the obstacle (e.g., another vehicle, a pillar, a pedestrian, etc.) located around the vehicle 800 through the ultrasonic sensors 310 a to 310 d. The control unit 200 may determine the parking warning level based on information about the location and/or distance of the obstacle and the vehicle 800. The control unit 200 may transmit the warning level to the SVM system 100 when the warning level is determined.
The SVM system 100 may display the top view image on the first region 120 a and the second region 120 b based on the warning level and obstacle information transmitted from the control unit 200.
In an embodiment, in the non-warning level, in order to secure a wide field of view for the driver, a front top view or a rear top view is displayed on the first region 120 a, and a reduced top view is displayed on the second region 120 b. That is, the region of interest (ROI) is set omni-directionally and a top view range is set to the reduced range.
In an embodiment, in the case of the first warning level, an image in which the obstacle is enlarged is displayed on the first region 120 a to draw a driver's attention and allow a driver to easily identify a surrounding obstacle, and a reduced top view is displayed on the second region 120 b. That is, the region of interest is set to the region around the vehicle in which the obstacle is located, and the top view range is set to the reduced range. However, embodiments of the present disclosure are not limited thereto, as the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a, and a standard top view or an enlarged top view may be displayed on the second region 120 b.
In an embodiment, in the case of the second warning level, a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the first warning level, and a standard top view is displayed on the second region 120 b. However, embodiments of the present disclosure are not limited thereto, as the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a, and a reduced top view or an enlarged top view may be displayed on the second region 120 b.
In an embodiment, in the case of the third warning level, a more enlarged image of the obstacle is displayed on the first region 120 a as compared to the second warning level, and an enlarged top view is displayed on the second region 120 b. However, embodiments of the present disclosure are not limited thereto, as the region of interest which is set by a driver or the omnidirectional region may be displayed on the first region 120 a, and a reduced top view or a standard top view may be displayed on the second region 120 b.
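The cooperative behavior across the warning levels described above can be summarized as a lookup from level to display configuration. The zoom factors below are illustrative assumptions; the second-region view names (reduced, standard, enlarged) follow the embodiments described above.

```python
# Sketch of the level-to-view mapping described above: as the warning
# level rises, the first region magnifies the obstacle more strongly and
# the second region steps from a reduced to a standard to an enlarged
# omnidirectional top view. Zoom factors are illustrative assumptions.

def svm_view_for_level(level):
    """Return (first_region_zoom, second_region_view) for a PDW warning level."""
    first_zoom = {0: 1.0, 1: 1.5, 2: 2.0, 3: 2.5}[level]  # obstacle magnification
    second_view = {0: "reduced", 1: "reduced",
                   2: "standard", 3: "enlarged"}[level]
    return first_zoom, second_view

print(svm_view_for_level(3))  # (2.5, 'enlarged')
```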
In an embodiment, the control unit 200 may collect information on the steering angle of the vehicle 800 from the steering-angle sensor unit 600 and may calculate the reverse path of the vehicle based on the collected information on the steering angle and the top view image. The control unit 200 may calculate the collision possibility of the vehicle 800 and the obstacle based on the reverse path, the top view image, and the obstacle information.
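The patent does not detail how the reverse path is computed from the steering angle. A common simple approximation, shown here purely for illustration and not attributed to the patent, is the kinematic bicycle model, in which the turning radius follows from the wheelbase and the road-wheel angle.

```python
import math

# Illustrative sketch (not the patent's method): the kinematic bicycle
# model gives the turning radius of the rear axle as
#   R = wheelbase / tan(road-wheel angle).
# The wheelbase value below is hypothetical.

def turning_radius_m(wheelbase_m, steer_rad):
    """Turning radius of the rear axle for a given road-wheel angle in radians."""
    return wheelbase_m / math.tan(steer_rad)

# A 2.7 m wheelbase at a 15-degree road-wheel angle gives roughly a 10 m radius.
print(turning_radius_m(2.7, math.radians(15.0)))
```

Sweeping this radius through the steering range traces candidate reverse paths, which can then be checked against obstacle positions for collision risk.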
In an embodiment, the control unit 200 may provide the reduced and/or enlarged top view image to a remote smart parking assist (RSPA) system, with reference to the location and angle of a parking line, to easily secure the minimum distance between the vehicle 800 and the parking line required for activating the RSPA function. Thus, it is possible to increase the likelihood of the RSPA system recognizing the parking line.
Not all components shown in FIG. 1 are essential to embodiments of the present disclosure, and some components included in the image processing apparatus may be added, changed, or deleted in other embodiments. Meanwhile, the components shown in FIG. 1 represent functionally distinct elements, and a plurality of components may be implemented in a form integrated with each other in an actual physical environment. In addition, the function of one component may be distributed across a plurality of physical devices and performed in a computer system.
FIG. 6 is a flowchart illustrating a method of operating an image processing apparatus according to an embodiment of the present disclosure.
In step S610, the control unit 200 determines whether the driver applies power to the PDW system 300 from the PDW power supply unit 330.
When power is not applied to the PDW system 300, the process moves from step S610 to step S670. In step S670, the second display unit 120 displays a front top view having the front as the region of interest in the first region 120 a by default and displays a standard top view in the second region 120 b. In another embodiment, the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
When power is applied to the PDW system 300, the process moves from step S610 to step S620. In step S620, the control unit 200 determines whether the speed of the vehicle 800 is less than a preset speed (e.g., 10 kph) based on information on the speed of the vehicle 800 obtained from the speed sensor unit 400.
When the speed of the vehicle 800 is equal to or greater than the preset speed, the process moves from step S620 to step S640. In step S640, the control unit 200 determines whether the gear of the vehicle 800 is set to a D-gear (drive) or an R-gear (reverse) based on the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500.
When the gear of the vehicle 800 is the D-gear or the R-gear, the process moves from step S640 to step S660. If the vehicle 800 is driven at a preset speed or faster and the gear is the D-gear or the R-gear, it is necessary to provide a wide field of view to the driver. Therefore, in step S660, the second display unit 120 displays a front top view having the front area as the region of interest in the first region 120 a by default and displays a reduced top view in the second region 120 b. In another embodiment, the driver may individually control the SVM system. Therefore, when the driver sets the region of interest differently from the default, different images may be displayed on the first region 120 a and the second region 120 b.
When the gear of the vehicle 800 is not the D-gear or the R-gear, the process moves from step S640 to step S670. However, embodiments of the present disclosure are not limited thereto, and in another embodiment, the process may move to step S660 or step S670 regardless of the gear information of the vehicle 800. In a further embodiment, when the gear of the vehicle 800 is the D-gear or the R-gear, the process may move to step S670.
Returning to step S620, when the speed of the vehicle 800 is less than the preset speed, the process moves from step S620 to step S630. In step S630, the control unit 200 determines whether the gear of the vehicle 800 is set to a P-gear (park) or an N-gear (neutral) based on the transmission gear information of the vehicle 800 obtained from the gear sensor unit 500.
When the gear of the vehicle 800 is the P-gear or the N-gear, the process moves from step S630 to step S650. In step S650, the SVM system 100 and the PDW system 300 are cooperatively operated. Such a cooperative operation may be performed through the control unit 200. A detailed process in which the SVM system 100 and the PDW system 300 are cooperatively operated will be described below with reference to FIGS. 7A to 7E.
When the gear of the vehicle 800 is not the P-gear or the N-gear, the process moves from step S630 to step S670. However, embodiments of the present disclosure are not limited thereto, and in another embodiment, the process may move to step S650 or step S670 regardless of the gear information of the vehicle 800. In a further embodiment, when the gear of the vehicle 800 is the P-gear or the N-gear, the process may move to step S670.
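The branch conditions of steps S610 through S670 described above can be summarized in a short sketch. The function below is purely illustrative; its name and signature are assumptions, and the 10 kph threshold is only the example value given in the description, not part of the disclosure.

```python
# Illustrative sketch of the display-mode selection flow (steps S610-S670).
# The function and the return strings are hypothetical; the 10 kph value is
# the example threshold mentioned in the description.

PRESET_SPEED_KPH = 10

def select_display_mode(pdw_powered: bool, speed_kph: float, gear: str) -> str:
    """Return which screens the second display unit shows by default."""
    if not pdw_powered:                                      # step S610
        return "S670: front top view + standard top view"
    if speed_kph < PRESET_SPEED_KPH:                         # step S620
        if gear in ("P", "N"):                               # step S630
            return "S650: cooperative SVM/PDW operation"
        return "S670: front top view + standard top view"
    if gear in ("D", "R"):                                   # step S640
        return "S660: front top view + reduced top view"
    return "S670: front top view + standard top view"
```

As the description notes, other embodiments may branch differently (e.g., ignoring gear information), so this mapping reflects only the default flow.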
FIGS. 7A to 7E are conceptual diagrams illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to embodiments of the present disclosure.
FIG. 7A is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t1, t2, and t3 in the first level warning situation in the image control device according to an embodiment of the present disclosure. For the redundant components shown in FIG. 7A, reference numerals will be omitted for the convenience of understanding.
Referring to FIG. 7A, in the non-warning level just before the PDW system 300 recognizes the obstacle 700 (t1), the parking warning is not displayed on the first display unit 320, the front top view is displayed on the first region 120 a of the second display unit 120, and the omnidirectional standard top view is displayed on the second region 120 b.
Immediately after the PDW system 300 recognizes the obstacle 700 (t2), a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
Afterwards (t3), the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
Referring to FIG. 7A, the obstacle 700 is located on the right side of the front of the vehicle 800, and a distance between the obstacle 700 and the vehicle 800 falls within a preset first level warning distance range. The object recognition sensor unit 310 transmits, to the control unit 200, information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d) that received the ultrasonic signal. The control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on the information received from the object recognition sensor unit 310. The control unit 200 determines the warning level as the first level based on the calculated result.
The control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320. The first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200.
The control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110, the warning level obtained from the PDW system 300, and information on the obstacle 700. The new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700. The control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700. The control unit 200 transmits the generated image to the second display unit 120. The top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120, and the omnidirectional reduced top view image is displayed on the second region 120 b so that the driver can easily identify the situation around where the vehicle 800 and the obstacle 700 are located.
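The narrowing of the region of interest around a recognized obstacle can be sketched as a bounding box covering both the vehicle and the obstacle. The coordinate frame (vehicle-centred top view, metres), the vehicle footprint, and the margin below are illustrative assumptions; the disclosure does not specify how the narrowed region 900 is computed.

```python
# Hypothetical sketch of a narrowed region of interest that includes both
# the vehicle footprint and a detected obstacle, in a vehicle-centred
# top-view frame (x lateral, y longitudinal, metres). The half-size and
# margin values are assumptions for illustration.

def focused_roi(obstacle_xy, vehicle_half_size=(1.0, 2.3), margin=0.5):
    """Return (x_min, y_min, x_max, y_max) covering vehicle and obstacle."""
    ox, oy = obstacle_xy
    hx, hy = vehicle_half_size
    x_min = min(-hx, ox) - margin
    x_max = max(hx, ox) + margin
    y_min = min(-hy, oy) - margin
    y_max = max(hy, oy) + margin
    return (x_min, y_min, x_max, y_max)
```

Such a box is necessarily narrower than the omnidirectional view whenever the obstacle is close to the vehicle, which matches the stated goal of making the obstacle easier to identify.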
However, embodiments of the present disclosure are not limited thereto. In another embodiment, a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
FIG. 7B is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the time sequence t1, t2, and t3 in the third level warning situation in the image control device according to an embodiment of the present disclosure. For the redundant components shown in FIG. 7B, reference numerals will be omitted for the convenience of understanding.
Referring to FIG. 7B, in the non-warning level just before the PDW system 300 recognizes the obstacle 700 (t1), the parking warning is not displayed on the first display unit 320, the front top view is displayed on the first region 120 a of the second display unit 120, and the omnidirectional standard top view is displayed on the second region 120 b.
Immediately after the PDW system 300 recognizes the obstacle 700 (t2), a warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
Afterwards (t3), the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
Referring to FIG. 7B, the obstacle 700 is located on the right side of the front of the vehicle 800, and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range. The object recognition sensor unit 310 transmits, to the control unit 200, information including the ultrasonic signal reflected from the obstacle 700 and/or the ultrasonic sensor (at least one of 310 a to 310 d) that received the ultrasonic signal. The control unit 200 calculates the location, distance, and/or angle of the obstacle 700 based on the information received from the object recognition sensor unit 310. The control unit 200 determines the warning level as the third level based on the calculated result.
The control unit 200 transmits information about the location, distance, and/or warning level of the obstacle 700 to the first display unit 320. The first display unit 320 displays the warning light 950 in the direction in which the obstacle is located based on information received from the control unit 200.
The control unit 200 determines the new region of interest 900 based on the image collected from the image collection unit 110, the warning level obtained from the PDW system 300, and information on the obstacle 700. The new region of interest 900 is determined as a region which includes the obstacle 700 and is narrower than the omnidirectional region of interest so that the driver can more easily identify the obstacle 700. The control unit 200 transforms and combines collected images according to the new region of interest 900 to generate the top view image focused on the obstacle 700. The control unit 200 transmits the generated image to the second display unit 120. The top view image in which the obstacle 700 is focused is displayed on the first region 120 a of the second display unit 120, and the omnidirectional enlarged top view image is displayed on the second region 120 b so that the driver can easily identify the locations of the vehicle 800 and the obstacle 700.
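Comparing FIG. 7A and FIG. 7B, one consistent reading is that the second region 120 b shows a reduced omnidirectional view at the first warning level and an enlarged one at the third level. A hypothetical mapping, assuming those two cases generalize (the disclosure only describes levels one and three), might look like:

```python
# Hypothetical warning-level -> second-region view mapping, consistent with
# FIG. 7A (first level -> reduced) and FIG. 7B (third level -> enlarged).
# The dictionary and default are assumptions, not stated in the disclosure.
SECOND_REGION_VIEW = {
    0: "standard",  # no warning: omnidirectional standard top view
    1: "reduced",   # first level: wide situational view (FIG. 7A)
    3: "enlarged",  # third level: close-up of vehicle and obstacle (FIG. 7B)
}

def second_region_view(warning_level: int) -> str:
    return SECOND_REGION_VIEW.get(warning_level, "standard")
```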
However, embodiments of the present disclosure are not limited thereto. In another embodiment, a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
FIG. 7C is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the steering angle of the vehicle 800 according to the time sequence t1, t2, and t3, in the image control device according to an embodiment of the present disclosure. For the redundant components shown in FIG. 7C, reference numerals will be omitted for the convenience of understanding.
FIG. 7C shows a situation where the vehicle 800 is reversing. Referring to FIG. 7C, a reverse path 810 according to the steering angle of the vehicle 800 is shown in the first region 120 a. In an embodiment, the reverse path 810 may be displayed on the first region 120 a and/or the second region 120 b so that the driver can easily identify the reverse path 810. However, in another embodiment, the reverse path 810 may exist only conceptually for the cooperative operation of the SVM system 100 and the PDW system 300.
Referring to FIG. 7C, just before the PDW system 300 recognizes the obstacle 700 (t1) and/or when there is no risk of collision between the vehicle 800 and the obstacle 700, the parking warning is not displayed on the first display unit 320, the rear top view is displayed on the first region 120 a of the second display unit 120, and the omnidirectional standard top view is displayed on the second region 120 b.
When the PDW system 300 recognizes the obstacle 700 while the vehicle 800 is reversing along the reverse path 810 (t2), so that there is a risk of collision between the vehicle 800 and the obstacle 700, the warning light 950 is displayed on the first display unit 320 in a direction in which the obstacle is located, and a new region of interest 900 is set.
Afterwards (t3), the top view image according to the new region of interest 900 is displayed on the first region 120 a and the second region 120 b.
Referring to FIG. 7C, the obstacle 700 is located on the right side of the rear of the vehicle 800, and a distance between the obstacle 700 and the vehicle 800 falls within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted.
The control unit 200 determines the new region of interest 900 based on the steering angle received from the steering-angle sensor unit 600, the image of the surroundings received from the image collection unit 110, the warning level received from the PDW system 300, and information on the obstacle 700.
More specifically, the control unit 200 first generates the top view image by transforming and combining collected images. The control unit 200 calculates the reverse path 810 of the vehicle 800 based on the generated top view image and steering-angle information. The control unit 200 determines the new region of interest 900 including the reverse path 810 and the obstacle 700 based on the reverse path 810 and the obstacle 700 information. The control unit 200 generates a new top view image focused on the reverse path 810 and the obstacle 700 by retransforming and recombining collected images according to the new region of interest 900.
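The reverse-path calculation from the steering angle could, for instance, use a simple kinematic bicycle model. The wheelbase, step size, and sample count below are illustrative assumptions; the disclosure does not specify the path model the control unit 200 uses.

```python
import math

# Sketch of sampling a reverse path from a fixed steering angle with a
# kinematic bicycle model. Wheelbase (2.8 m), step length (0.5 m), and the
# number of samples are assumptions for illustration only.

def reverse_path(steering_angle_deg, wheelbase_m=2.8, step_m=0.5, n=6):
    """Sample rear-axle positions (x, y) while reversing at a fixed angle."""
    delta = math.radians(steering_angle_deg)
    x = y = heading = 0.0
    path = [(x, y)]
    for _ in range(n):
        x -= step_m * math.sin(heading)                  # reverse motion
        y -= step_m * math.cos(heading)
        heading -= (step_m / wheelbase_m) * math.tan(delta)
        path.append((x, y))
    return path
```

The new region of interest 900 could then be chosen as a box covering the sampled path points together with the obstacle 700, in the same spirit as the obstacle-focused region described for FIGS. 7A and 7B.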
The control unit 200 transmits the generated image to the second display unit 120. The top view image in which the reverse path 810 and the obstacle 700 are focused is displayed on the first region 120 a of the second display unit 120, and the omnidirectional enlarged top view image is displayed on the second region 120 b to allow the driver to move to another space for avoiding the obstacle 700.
However, embodiments of the present disclosure are not limited thereto. In another embodiment, a driver may change, reduce, and enlarge the region of interest by touching or scrolling the first region 120 a and/or the second region 120 b which is set in the image control device.
FIG. 7D is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 based on the locations of the obstacles 700 a and 700 b according to the time sequence t1, t2, and t3, in the image control device according to an embodiment of the present disclosure. For the redundant components shown in FIG. 7D, reference numerals will be omitted for the convenience of understanding.
Referring to FIG. 7D, in the non-warning level just before the PDW system 300 recognizes the obstacles 700 a and 700 b (t1), the parking warning is not displayed on the first display unit 320, the rear top view is displayed on the first region 120 a of the second display unit 120, and the omnidirectional standard top view is displayed on the second region 120 b.
Immediately after the PDW system 300 recognizes the obstacles 700 a and 700 b (t2), warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
Referring to FIG. 7D, all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted. The right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800, and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800.
The control unit 200 may set the region of interest by assigning a weight based on the angle of each obstacle when a plurality of obstacles have the same warning level.
In an embodiment where a weight is assigned to the angle, the control unit 200 may assign a high weight to an obstacle located in a diagonal direction having a relatively high probability of collision. In this case, the region of interest 900 a focused on the right rear obstacle 700 a is determined.
Afterwards (t3), the top view image according to the third level warning and the new region of interest 900 a in which a weight is assigned to the angle of the obstacle is displayed in the first region 120 a and the second region 120 b.
In an embodiment, the angle between the vehicle 800 and the weighted obstacle may be determined based on a region where the space maps assigned to the cameras 110 a to 110 d overlap. In another embodiment, the angle between the vehicle 800 and the weighted obstacle may be determined as the angle at which a collision is most likely to occur according to the steering-angle information received from the steering-angle sensor unit 600.
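A minimal sketch of the angle-based weighting, assuming the diagonal is taken as 45 degrees off the vehicle's longitudinal axis (the disclosure states only that a diagonal obstacle may receive a higher weight, without fixing a numeric scheme):

```python
import math

# Hypothetical angle weighting: obstacles nearer the vehicle's diagonal
# (assumed here to be 45 degrees off the longitudinal axis) score higher.
# Obstacle positions are (x lateral, y longitudinal) in metres.

def diagonal_weight(obstacle_xy):
    """Weight in [0, 1]; 1.0 when the obstacle lies exactly on a diagonal."""
    angle = math.degrees(math.atan2(abs(obstacle_xy[0]), abs(obstacle_xy[1])))
    return 1.0 - abs(angle - 45.0) / 45.0

def pick_focus_obstacle(obstacles):
    """Among same-warning-level obstacles, pick the most diagonal one."""
    return max(obstacles, key=diagonal_weight)
```

With the FIG. 7D layout (a right rear obstacle on the diagonal and an obstacle directly behind), this selection focuses the diagonal obstacle 700 a, matching the figure.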
FIG. 7E is a conceptual diagram illustrating the cooperative operation between the SVM system 100 and the PDW system 300 according to the locations of the obstacles 700 a and 700 b according to the time sequence t1, t2, and t3, in the image control device according to an embodiment of the present disclosure. For the redundant components shown in FIG. 7E, reference numerals will be omitted for the convenience of understanding.
Referring to FIG. 7E, in the non-warning level just before the PDW system 300 recognizes the obstacles 700 a and 700 b (t1), the parking warning is not displayed on the first display unit 320, the rear top view is displayed on the first region 120 a of the second display unit 120, and the omnidirectional standard top view is displayed on the second region 120 b.
Immediately after the PDW system 300 recognizes the obstacles 700 a and 700 b (t2), warning lights 950 a and 950 b are displayed on the first display unit 320 in directions in which the obstacles are located, and new regions of interest 900 a and 900 b are set.
Referring to FIG. 7E, all the distances between the vehicle and the obstacles 700 a and 700 b fall within a preset third level warning distance range. Since the process of the control unit 200 and the PDW system 300 determining the warning level has been described above, a redundant description thereof will be omitted. The right rear obstacle 700 a and the rear obstacle 700 b are located around the vehicle 800, and a distance between the right rear obstacle 700 a and the vehicle 800 is greater than a distance between the rear obstacle 700 b and the vehicle 800.
The control unit 200 may set the region of interest by assigning a weight based on the distance of each obstacle when a plurality of obstacles have the same warning level.
In an embodiment in which a weight is assigned to the distance, the control unit 200 may assign a high weight to an obstacle located within a short distance having a relatively high probability of collision. In this case, the region of interest 900 b focused on the rear obstacle 700 b is determined.
Afterwards (t3), the top view image according to the third level warning and the new region of interest 900 b in which the weight is assigned to the distance of the obstacle is displayed in the first region 120 a and the second region 120 b.
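In the simplest reading, the distance-based weighting of FIG. 7E reduces to focusing the nearest same-level obstacle. A hypothetical sketch (function name and position convention are assumptions):

```python
import math

# Hypothetical distance weighting (FIG. 7E): among obstacles with the same
# warning level, the nearest obstacle receives the highest weight and is
# focused. Positions are (x, y) offsets from the vehicle in metres.

def pick_nearest_obstacle(obstacles):
    return min(obstacles, key=lambda p: math.hypot(p[0], p[1]))
```

Applied to the FIG. 7E layout, where the rear obstacle 700 b is closer than the right rear obstacle 700 a, this selects 700 b, matching the region of interest 900 b in the figure.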
FIG. 8A and FIG. 8B are conceptual diagrams illustrating an image setting process according to the pixel count of the cameras and changes in the region of interest in the image control device according to an embodiment of the present disclosure.
FIG. 8A and FIG. 8B illustrate a process in which the region of interest is changed as a driver scrolls or touches the touch panel, when the user input unit 130 includes the touch panel, in the image control device according to an embodiment of the present disclosure.
Referring to FIG. 8A and FIG. 8B, when the touch panel is coupled to the second display unit 120 in a stacked structure, the touch panel may correspond to the first region 120 a and/or the second region 120 b.
Referring to FIG. 8A, the driver may scroll the second region 120 b on which the top view image of the default region of interest 815 is displayed. When the driver scrolls at any location on the second region 120 b, the user input unit 130 may recognize the length and direction of the driver's scroll.
The control unit 200 receives scroll information from the user input unit 130. Based on the scroll information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120 b.
The control unit 200 may generate the top view image of the region of interest which is set by a user to the extent that distortion of the top view image does not occur based on the pixel information of the cameras 110 a to 110 d.
In another embodiment, when a preset time interval is exceeded, the control unit 200 may restore the region of interest changed by the driver to the default. In the case that the region of interest is restored to the default, the second region 120 b returns to an initial screen shown in FIG. 8A.
Referring to FIG. 8B, the driver may touch the second region 120 b on which the top view image of the default region of interest 815 is displayed. When the driver touches the front location of the vehicle 800 displayed on the second region 120 b, the user input unit 130 may recognize a driver's touch location.
The control unit 200 receives touch information from the user input unit 130. Based on the touch information, the control unit 200 generates a new top view image including a new region of interest 820 and displays it on the second region 120 b.
The driver may set a region of interest 830 which is further enlarged to the front of the vehicle 800 by touching the new region of interest 820. The control unit 200 may generate the top view image of the region of interest which is set by a user to the extent that distortion of the top view image does not occur based on the pixel information of the cameras 110 a to 110 d.
In another embodiment, when a preset time interval is exceeded, the control unit 200 may restore the region of interest changed by the driver to the default. In the case that the region of interest is restored to the default, the second region 120 b returns to an initial screen shown in FIG. 8B.
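Limiting the user-selected region of interest "to the extent that distortion of the top view image does not occur" could be sketched as clamping the requested view width against the cameras' pixel density. The 30 m outer limit and the one-source-pixel-per-display-pixel rule below are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical clamp on a user-requested region-of-interest width, so the
# top view is neither magnified beyond the source cameras' resolution nor
# widened past a fixed coverage limit. Both thresholds are assumptions.

def clamp_roi(requested_width_m, display_width_px, camera_px_per_m,
              max_width_m=30.0):
    """Clamp the requested ROI width (metres) to a distortion-free range."""
    # Narrowest ROI that still yields one camera pixel per display pixel.
    min_width_m = display_width_px / camera_px_per_m
    return min(max(requested_width_m, min_width_m), max_width_m)
```

A touch or scroll input would first move or resize the region of interest, then pass the result through such a clamp before the control unit 200 regenerates the top view image.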
Each component of the apparatus or method according to embodiments of the present disclosure may be implemented as hardware or software or a combination of hardware and software. Further, the function of each component may be implemented as software and a microprocessor may be implemented to execute the function of software corresponding to each component.
Various implementations of systems and techniques described herein may be realized as digital electronic circuits, integrated circuits, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include one or more computer programs executable on a programmable system. The programmable system includes at least one programmable processor (which may be a special-purpose processor or a general-purpose processor) coupled to receive and transmit data and instructions from and to a storage system, at least one input device, and at least one output device. The computer programs (also known as programs, software, software applications, or codes) contain commands for a programmable processor and are stored in a “computer-readable recording medium”.
The computer-readable recording medium includes all types of recording devices in which data readable by a computer system is stored. Such a computer-readable recording medium may be a non-volatile or non-transitory medium, such as ROM, a CD-ROM, a magnetic tape, a floppy disk, a memory card, a hard disk, a magneto-optical disk, or a storage device, and may further include a transitory medium such as a data transmission medium. In addition, the computer-readable recording medium may be distributed in a computer system connected via a network, so that computer-readable codes may be stored and executed in a distributed manner.
The flowchart/timing diagram of the present specification describes that processes are sequentially executed, but this is merely illustrative of the technical idea of an embodiment of the present disclosure. In other words, since it is apparent to those skilled in the art that an order described in the flowchart/timing diagram may be changed or one or more processes may be executed in parallel without departing from the essential characteristics of an embodiment of the present disclosure, the flowchart/timing diagram is not limited to a time-series order.
Although exemplary embodiments of the present disclosure have been described for illustrative purposes, those skilled in the art will appreciate that various modifications, additions, and substitutions are possible, without departing from the idea and scope of the claimed invention. Therefore, exemplary embodiments of the present disclosure have been described for the sake of brevity and clarity. The scope of the technical idea of the present embodiments is not limited by the illustrations. Accordingly, one of ordinary skill would understand that the scope of the claimed invention is not to be limited by the above explicitly described embodiments but by the claims and equivalents thereof.

Claims (20)

What is claimed is:
1. An image processing apparatus comprising:
a plurality of cameras configured to collect a plurality of images of surroundings around a vehicle;
a plurality of object recognition sensors configured to collect obstacle location information of a plurality of obstacles located around the vehicle;
a controller configured to generate one or more top view images based on the plurality of images of the surroundings and to determine whether to display a parking warning image based on the obstacle location information;
a first display configured to display the parking warning image; and
a second display configured to display the one or more top view images;
wherein the controller is configured to:
select a parking warning level from a plurality of preset parking warning levels based on the obstacle location information;
display the parking warning image corresponding to the selected parking warning level on the first display; and
in response to the determination to display the parking warning image, display the top view image of the plurality of obstacles on the second display, wherein in a situation in which each obstacle of the plurality of obstacles has the same parking warning level, display the top view image focused on an obstacle located in a diagonal direction of the vehicle based on a steering angle of the vehicle and directions from the vehicle to the plurality of obstacles.
2. The apparatus of claim 1, wherein:
the second display is divided into a plurality of regions including a first region and a second region; and
in response to the determination to display the parking warning image, the controller is configured to automatically display a first top view image focused on the plurality of obstacles on the first region and automatically display a second top view image including all regions around the vehicle on the second region.
3. The apparatus of claim 2, wherein the controller is configured to control such that the first top view image in which a width of a region of interest is changed is automatically displayed on the first region and the second top view image in which the width of the region of interest is changed is automatically displayed on the second region according to the selected parking warning level.
4. The apparatus of claim 3, wherein the controller is configured to control such that the first top view image in which the width of the region of interest is reduced is automatically displayed on the first region and the second top view image in which the width of the region of interest is enlarged is automatically displayed on the second region according to the selected parking warning level.
5. The apparatus of claim 3, further comprising a speed sensor configured to detect a speed of the vehicle and transmit information about the speed to the controller, wherein the controller is configured to control to display the first top view image and the second top view image corresponding to the selected parking warning level on the second display in response to the detected speed being less than a preset speed.
6. The apparatus of claim 3, further comprising a gear sensor configured to detect an operation of a transmission gear lever and transmit information about the operation of the transmission gear lever to the controller, wherein the controller is configured to control to display the first top view image and the second top view image corresponding to the selected parking warning level on the second display in response to the transmission gear lever being set to any gear of a plurality of preset gears.
7. The apparatus of claim 3, further comprising a steering-angle sensor configured to detect the steering angle of the vehicle and transmit information about the steering angle to the controller, wherein the controller is configured to:
calculate a reverse path of the vehicle based on the steering angle and the top view image;
calculate a collision possibility between the obstacle and the vehicle based on the reverse path and the obstacle location information; and
generate the first top view image and the second top view image based on the collision possibility, the parking warning level, or the obstacle location information.
8. The apparatus of claim 1, further comprising a user input device configured to recognize an input in a scroll method or a touch method, wherein the controller is configured to generate the top view image in which a region of interest is changed based on the input.
9. An image processing method, the method comprising:
collecting a plurality of images of surroundings around a vehicle using a plurality of cameras attached to the vehicle;
collecting obstacle location information of a plurality of obstacles located around the vehicle;
determining, for each of the plurality of obstacles, a parking warning level from a plurality of preset parking warning levels based on the obstacle location information, wherein each obstacle of the plurality of obstacles has the same parking warning level;
displaying a parking warning image corresponding to the determined parking warning level on a first display;
generating a top view image focused on the plurality of obstacles based on the plurality of images of the surroundings and the obstacle location information, wherein the top view image is focused on an obstacle located in a diagonal direction of the vehicle based on a steering angle of the vehicle and directions from the vehicle to the plurality of obstacles; and
displaying the top view image on a second display.
10. The method of claim 9, wherein generating the top view image comprises:
automatically generating a first top view image focused on the plurality of obstacles on a first region of the second display; and
automatically generating a second top view image including all regions around the vehicle on a second region of the second display.
11. The method of claim 10, wherein generating the top view image comprises:
automatically displaying the first top view image in which a width of a region of interest is changed on the first region according to the parking warning level; and
automatically displaying the second top view image in which the width of the region of interest is changed on the second region according to the parking warning level.
12. The method of claim 11, wherein generating the top view image further comprises:
automatically displaying the first top view image in which the width of the region of interest is reduced on the first region according to the parking warning level; and
automatically displaying the second top view image in which the width of the region of interest is enlarged on the second region according to the parking warning level.
13. The method of claim 11, wherein generating the top view image further comprises determining whether a speed of the vehicle is less than a preset speed.
14. The method of claim 11, wherein generating the top view image further comprises determining whether a transmission gear lever of the vehicle is set to any gear of a plurality of preset gears.
15. The method of claim 11, wherein generating the top view image comprises:
calculating a reverse path of the vehicle based on the steering angle of the vehicle and the top view image;
calculating a collision possibility between the plurality of obstacles and the vehicle based on the reverse path and the obstacle location information; and
generating the first top view image and the second top view image based on the collision possibility, the parking warning level, or the obstacle location information.
16. An image processing method, the method comprising:
collecting a plurality of images of surroundings around a vehicle using a plurality of cameras attached to the vehicle;
collecting obstacle location information of a plurality of obstacles located around the vehicle;
determining, for each of the plurality of obstacles, a parking warning level from a plurality of preset parking warning levels based on the obstacle location information, wherein each obstacle of the plurality of obstacles has the same parking warning level;
displaying a parking warning image corresponding to the determined parking warning level on a first display;
generating a top view image focused on the plurality of obstacles based on the plurality of images of the surroundings and the obstacle location information, wherein the top view image is focused on an obstacle located in a diagonal direction of the vehicle based on a steering angle of the vehicle and directions from the vehicle to the plurality of obstacles;
displaying the top view image on a second display;
recognizing an input received according to a scroll method or a touch method; and
re-generating the top view image in which a region of interest is changed based on the input.
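Two steps of claim 16 lend themselves to a short sketch: mapping obstacle distances to a preset warning level, and picking the obstacle in the rear-diagonal direction indicated by the steering angle. The thresholds, level count, and diagonal geometry below are assumptions for illustration only:

```python
import math

def parking_warning_level(obstacle_distances, thresholds=(0.3, 0.6, 1.0)):
    """Map the nearest obstacle distance to a preset level (3 = most
    urgent, 0 = none). The metre thresholds are illustrative, not the
    patent's values."""
    d = min(obstacle_distances)
    for i, t in enumerate(thresholds, start=1):
        if d <= t:
            return len(thresholds) + 1 - i
    return 0

def focus_obstacle(obstacles, steering_deg):
    """Pick the obstacle lying closest to the rear-diagonal direction
    the steering angle points toward (rear-left for negative angles,
    rear-right otherwise). Purely illustrative geometry: the vehicle is
    at the origin facing +y."""
    diag = math.radians(-135.0 if steering_deg < 0 else -45.0)
    return min(obstacles, key=lambda o: abs(math.atan2(o[1], o[0]) - diag))
```

For example, with steering to the right, an obstacle at the rear-right corner is selected over one at the rear-left.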
17. The method of claim 16, wherein generating the top view image comprises:
automatically generating a first top view image focused on the plurality of obstacles on a first region of the second display; and
automatically generating a second top view image including all regions around the vehicle on a second region of the second display.
18. The method of claim 17, wherein generating the top view image comprises:
automatically displaying the first top view image in which a width of the region of interest is changed on the first region according to the parking warning level; and
automatically displaying the second top view image in which the width of the region of interest is changed on the second region according to the parking warning level.
19. The method of claim 18, wherein generating the top view image further comprises:
automatically displaying the first top view image in which the width of the region of interest is reduced on the first region according to the parking warning level; and
automatically displaying the second top view image in which the width of the region of interest is enlarged on the second region according to the parking warning level.
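The opposed resizing in claims 18 and 19 — the focused first-region view narrowing while the overview second-region view widens as the warning level rises — can be sketched as a pair of linearly scaled region-of-interest widths. The base width, step, and linear scaling are assumptions, not disclosed values:

```python
def roi_widths(warning_level, base=6.0, step=1.0, levels=3):
    """Return (focused_roi_width, overview_roi_width) in metres.

    Higher warning level -> narrower focused view (zoom in on the
    hazard) and wider overview (more surrounding context). The numbers
    are illustrative.
    """
    k = max(0, min(warning_level, levels))     # clamp to valid levels
    return base - step * k, base + step * k
```

At level 0 both regions show the same width; at the maximum level the focused view is halved while the overview is half again as wide.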
20. The method of claim 18, wherein generating the top view image further comprises:
determining whether a speed of the vehicle is less than a preset speed; and
determining whether a transmission gear lever of the vehicle is set to any gear of a plurality of preset gears.
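The two gating checks of claim 20 amount to a simple activation predicate: the top-view generation proceeds only below a preset speed and with a preset gear engaged. The 10 km/h limit and the gear set below are hypothetical placeholders, since the claim leaves both unspecified:

```python
def svm_active(speed_kph, gear, max_speed=10.0, gears=("R", "D")):
    """Gate for top-view generation: vehicle slower than the preset
    speed AND transmission set to one of the preset gears. The limit
    and gear set are illustrative assumptions."""
    return speed_kph < max_speed and gear in gears
```

Typical surround-view systems activate in reverse or at creeping forward speeds, which motivates this choice of placeholder gear set.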
US18/345,323 2023-01-09 2023-06-30 Method and apparatus for SVM top view image processing Active 2043-07-27 US12407794B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020230003098A KR20240111376A (en) 2023-01-09 2023-01-09 Method And Apparatus for SVM Top View Image Processing
KR10-2023-0003098 2023-01-09

Publications (2)

Publication Number Publication Date
US20240236276A1 US20240236276A1 (en) 2024-07-11
US12407794B2 true US12407794B2 (en) 2025-09-02

Family

ID=91732087

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/345,323 Active 2043-07-27 US12407794B2 (en) 2023-01-09 2023-06-30 Method and apparatus for SVM top view image processing

Country Status (3)

Country Link
US (1) US12407794B2 (en)
KR (1) KR20240111376A (en)
CN (1) CN118306314A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170028917A1 (en) * 2014-01-27 2017-02-02 Denso Corporation Driving assistance device and driving assistance method
US20180137760A1 (en) * 2015-05-11 2018-05-17 Panasonic Intellectual Property Management Co. Ltd. Monitoring-target-region setting device and monitoring-target-region setting method
US20180265004A1 (en) * 2016-03-07 2018-09-20 Mazda Motor Corporation Vehicle periphery image display device
US20180330175A1 (en) * 2017-05-10 2018-11-15 Fotonation Limited Multi-camera vision system and method of monitoring
US20190215465A1 (en) * 2017-02-28 2019-07-11 JVC Kenwood Corporation Bird's-eye view image generating device, bird's-eye view image generating system, bird's-eye view image generating method, and medium
US20200039506A1 (en) * 2018-08-02 2020-02-06 Faraday&Future Inc. System and method for providing visual assistance during an autonomous driving maneuver
US20200081607A1 (en) * 2018-09-07 2020-03-12 Aisin Seiki Kabushiki Kaisha Display control device
US20200137322A1 (en) * 2018-10-26 2020-04-30 Denso Corporation Image processing apparatus
US20200242374A1 (en) * 2018-04-02 2020-07-30 Jvckenwood Corporation Vehicle display control device, vehicle display system, vehicle display control method, non-transitory storage medium
US20200398824A1 (en) * 2019-06-24 2020-12-24 Honda Motor Co., Ltd. Parking assist system
US20210107511A1 (en) * 2019-10-11 2021-04-15 Toyota Jidosha Kabushiki Kaisha Parking assist apparatus
US20220408062A1 (en) * 2020-03-27 2022-12-22 Jvckenwood Corporation Display control apparatus, display control method, and program
US20230012629A1 (en) * 2020-03-26 2023-01-19 Samsung Electronics Co., Ltd. Electronic device for displaying image by using camera monitoring system (cms) side display mounted in vehicle, and operation method thereof
US20230302999A1 (en) * 2022-03-23 2023-09-28 Isuzu Motors Limited Vehicle rearward monitoring system and vehicle rearward monitoring method

Also Published As

Publication number Publication date
CN118306314A (en) 2024-07-09
KR20240111376A (en) 2024-07-17
US20240236276A1 (en) 2024-07-11

Similar Documents

Publication Publication Date Title
US12220825B2 (en) Display apparatus
US10899277B2 (en) Vehicular vision system with reduced distortion display
US11858424B2 (en) Electronic device for displaying image by using camera monitoring system (CMS) side display mounted in vehicle, and operation method thereof
TWI478833B (en) Method of adjusting the vehicle image device and system thereof
US20150109444A1 (en) Vision-based object sensing and highlighting in vehicle image display systems
US20140114534A1 (en) Dynamic rearview mirror display features
US10495458B2 (en) Image processing system for vehicle
CN115087584A (en) Vehicle Trailer Guidance System
JP6014433B2 (en) Image processing apparatus, image processing method, and image processing system
US11760262B2 (en) Surround view monitoring system and method for vehicle, and parking assist control system of vehicle
JP5991648B2 (en) Display control device for vehicle
WO2013046407A1 (en) Image display device, and image display method
WO2013046408A1 (en) Image display device and image display method
JP2012076483A (en) Parking support device
JP2005186648A (en) Vehicle periphery visual recognition device and display control device
JP7631275B2 (en) Mobile body and imaging device installation method
WO2014155953A1 (en) Vehicle-surroundings-monitoring control device
WO2018159016A1 (en) Bird's eye view image generation device, bird's eye view image generation system, bird's eye view image generation method and program
US12185017B2 (en) Display control apparatus, vehicle, and display control method
US12240386B2 (en) Vehicle sensing system with enhanced obstacle detection forward and sideward of the vehicle
JP5445719B2 (en) Image display device and image display method
JP6617462B2 (en) Vehicle periphery visual recognition device
US12407794B2 (en) Method and apparatus for SVM top view image processing
KR20170133743A (en) Vehicle control system based on user input and method thereof
JP2016070951A (en) Display device, control method, program, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: HYUNDAI MOTOR COMPANY, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, MIN CHUL;REEL/FRAME:064128/0385

Effective date: 20230614

Owner name: KIA CORPORATION, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, MIN CHUL;REEL/FRAME:064128/0385

Effective date: 20230614

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction