WO2022038981A1 - Information processing device and method, imaging device, mobile device, and computer program - Google Patents
Information processing device and method, imaging device, mobile device, and computer program
- Publication number
- WO2022038981A1 (PCT/JP2021/027934)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- image data
- correction amount
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/091—Digital circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
Definitions
- The present disclosure relates to an information processing device, an information processing method, an imaging device, a mobile device, and a computer program that control the exposure of an in-vehicle camera and perform processing for developing, adjusting, and synthesizing the captured image.
- For example, for in-vehicle cameras used in ADAS (Advanced Driver Assistance Systems), an exposure control device has been proposed that measures the brightness farther ahead than the recognition area, detects a change in brightness by comparing measurements taken at different times, predicts the timing at which the change in brightness will occur in the recognition area, and controls the exposure of the in-vehicle camera at the predicted timing (see Patent Document 1).
- A proposal has also been made for an exposure control method for an in-vehicle surveillance camera having an autobracketing function that, after performing autobracketing shooting around a predetermined reference exposure value, selects from the obtained plurality of image data the image data with appropriate exposure, or closest to appropriate exposure, and sets its exposure value as the new reference exposure value.
- An object of the technique disclosed in the present specification is to provide an information processing device, an information processing method, an imaging device, a mobile device, and a computer program that control the exposure of an in-vehicle camera and perform processing for developing, adjusting, and synthesizing the captured image.
- The information processing device according to the present disclosure comprises: a recognition unit that recognizes a first object from first image data generated by a processing unit developing imaging data output by an image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- Therefore, the color tone of the first object can be recognized with high accuracy by using the second image data generated based on the first correction amount specialized for the color tone of the first object.
- The recognition unit also recognizes a second object different from the first object from the first image data, and the processing unit detects the color tone of the second object, calculates a second correction amount for correcting that color tone, and generates still another second image data, which is data obtained by correcting the color tone of the first image data based on the second correction amount.
- The processing unit calculates an image composition ratio for synthesizing the plurality of different second image data, and generates a composite image obtained by synthesizing the plurality of different second image data based on the image composition ratio.
- The information processing device further includes a control unit that controls the imaging of the image sensor based on the first correction amount and the second correction amount.
- The control unit feeds back the correction value obtained by the processing unit to perform imaging control (for example, exposure control) of the image sensor, so that the image sensor can thereafter capture more appropriate imaging data.
- A determination unit for determining blinking of the first object is further provided.
- The processing unit uses the first correction amount calculated based on the determination result; that is, the correction value is calculated based on the determination unit's determination of whether the first object is blinking.
- The color tone of the first object can thus be recognized with high accuracy by using the second image data generated based on the first correction amount specialized for the color tone of the first object.
- The information processing device further includes a control unit that controls the imaging of the image sensor based on the first correction amount calculated based on the determination result.
- The control unit feeds back the correction value obtained by the processing unit to perform imaging control (for example, exposure control) of the image sensor, so that the image sensor can thereafter capture more appropriate imaging data.
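- As a rough illustration of how such a blinking determination can feed into the correction amount, the following Python sketch flags the first object (for example, an LED traffic light) as blinking when its brightness varies strongly across frames, and then lengthens the exposure so that one exposure spans the flicker period. The threshold, the assumed 100 Hz drive frequency, and all names are illustrative and not taken from the disclosure.

```python
import numpy as np

def is_blinking(brightness_history, threshold=0.3):
    """Determination unit: large frame-to-frame brightness swings imply blinking."""
    h = np.asarray(brightness_history, dtype=float)
    return float(h.max() - h.min()) > threshold

def corrected_exposure(blinking, current_exposure_s):
    """Exposure part of the first correction amount: cover one flicker period."""
    flicker_period_s = 1.0 / 100.0  # assumed LED drive frequency of 100 Hz
    return max(current_exposure_s, flicker_period_s) if blinking else current_exposure_s

history = [0.80, 0.10, 0.75, 0.05]  # per-frame brightness of the first object
print(corrected_exposure(is_blinking(history), current_exposure_s=1 / 240))
```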
- The image sensor is mounted on a moving body and used.
- The first object is a traffic light.
- Therefore, the color tone of the traffic light (red, yellow, blue) can be recognized with high accuracy.
- When the moving body is an autonomous driving vehicle, it is thus possible to appropriately determine whether the autonomous driving vehicle should proceed or stop.
- The recognition unit recognizes the first object and a second object different from the first object, respectively, from a plurality of different first image data generated by the processing unit developing a plurality of different imaging data output by the image sensor, the plural imaging data having been captured simultaneously under different imaging controls. The processing unit detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates, based on the first correction amount, one second image data used for recognizing the first object, which is data obtained by correcting the color tone of one of the first image data. The processing unit also detects the color tone of the second object, calculates a second correction amount for correcting that color tone, and generates, based on the second correction amount, a separate second image data used for recognizing the second object, which is data obtained by correcting the color tone of a separate one of the first image data.
- Therefore, regardless of the color tones of the first object and the second object in the plurality of different first image data generated by the processing unit developing the plurality of different imaging data captured simultaneously under different imaging controls, the color tone of the first object can be recognized with high accuracy by using the one second image data generated based on the first correction amount specialized for the color tone of the first object. Further, the color tone of the second object can be recognized with high accuracy by using the separate second image data generated based on the second correction amount specialized for the color tone of the second object.
- The information processing method according to the present disclosure recognizes a first object from first image data generated by developing imaging data output by an image sensor, detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- The imaging device according to the present disclosure comprises: an image sensor; a recognition unit that recognizes a first object from first image data generated by a processing unit developing imaging data output by the image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- The mobile device according to the present disclosure comprises: a mobile body; an image sensor mounted on the mobile body; a recognition unit that recognizes a first object from first image data generated by a processing unit developing imaging data output by the image sensor; the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount; and an operation control unit that controls the operation of the mobile body based on the result of recognition performed on the second image data.
- The computer program according to the embodiment of the present disclosure is described in a computer-readable format so as to cause a computer to function as: a recognition unit that recognizes a first object from first image data generated by a processing unit developing imaging data output by an image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- According to the present disclosure, an information processing device, an information processing method, an imaging device, a mobile device, and a computer program that control the exposure of an in-vehicle camera and perform processing for developing, adjusting, and synthesizing the captured image can be provided.
- A first embodiment of the operation flow of the image pickup apparatus 200 is schematically shown.
- A second embodiment of the operation flow of the image pickup apparatus 200 is schematically shown.
- An example of an HDR composite image is shown.
- An example of an HDR composite image is shown.
- An example of an HDR composite image is shown.
- An example of an HDR composite image is shown.
- An example of an HDR composite image is shown.
- An example of an HDR composite image is shown.
- FIG. 1 is a block diagram showing a schematic functional configuration example of a vehicle control system 100, which is an example of a mobile body control system to which the present technology can be applied.
- Hereinafter, when the vehicle provided with the vehicle control system 100 is to be distinguished from other vehicles, it is referred to as the own vehicle or own car.
- The vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, an in-vehicle device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a body system control unit 109, a body system 110, a storage unit 111, and an automatic driving control unit 112.
- The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the body system control unit 109, the storage unit 111, and the automatic driving control unit 112 are interconnected via the communication network 121.
- The communication network 121 is, for example, an in-vehicle communication network or bus compliant with an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). Each part of the vehicle control system 100 may also be directly connected without going through the communication network 121.
- Hereinafter, when each unit of the vehicle control system 100 communicates via the communication network 121, the description of the communication network 121 is omitted. For example, when the input unit 101 and the automatic driving control unit 112 communicate via the communication network 121, it is simply described that the input unit 101 and the automatic driving control unit 112 communicate with each other.
- The input unit 101 includes devices used by a passenger to input various data, instructions, and the like.
- For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by a method other than manual operation, such as by voice or gesture.
- Further, the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or wearable device supporting the operation of the vehicle control system 100.
- the input unit 101 generates an input signal based on data, instructions, and the like input by the passenger, and supplies the input signal to each unit of the vehicle control system 100.
- the data acquisition unit 102 includes various sensors and the like that acquire data used for processing of the vehicle control system 100, and supplies the acquired data to each unit of the vehicle control system 100.
- the data acquisition unit 102 includes various sensors for detecting the state of the own vehicle and the like.
- Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the motor rotation speed, the wheel rotation speed, and the like.
- the data acquisition unit 102 includes various sensors for detecting information outside the own vehicle.
- For example, the data acquisition unit 102 includes imaging devices such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a polarization camera, and other cameras.
- Further, the data acquisition unit 102 includes an environment sensor for detecting the weather or meteorological conditions, and a surrounding information detection sensor for detecting objects around the own vehicle.
- the environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
- The surrounding information detection sensor includes, for example, an ultrasonic sensor, a radar, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a sonar, and the like.
- the data acquisition unit 102 includes various sensors for detecting the current position of the own vehicle.
- the data acquisition unit 102 includes a GNSS receiver or the like that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite.
- the data acquisition unit 102 includes various sensors for detecting information in the vehicle.
- the data acquisition unit 102 includes an image pickup device that captures an image of the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
- The biosensor is provided, for example, on the seat surface or the steering wheel, and detects the biometric information of a passenger sitting in the seat or the driver holding the steering wheel.
- The communication unit 103 communicates with the in-vehicle device 104 and with various devices, servers, and base stations outside the vehicle, transmits data supplied from each unit of the vehicle control system 100, and supplies received data to each unit of the vehicle control system 100.
- the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
- For example, the communication unit 103 wirelessly communicates with the in-vehicle device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the in-vehicle device 104 by USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like via a connection terminal (and a cable if necessary) (not shown).
- Further, for example, the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point.
- Further, for example, the communication unit 103 uses P2P (Peer To Peer) technology to communicate with a terminal existing near the own vehicle (for example, a pedestrian or store terminal, or an MTC (Machine Type Communication) terminal).
- Further, for example, the communication unit 103 performs V2X communication such as vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication. Further, for example, the communication unit 103 includes a beacon receiving unit that receives radio waves or electromagnetic waves transmitted from radio stations or the like installed on the road and acquires information such as the current position, traffic congestion, traffic regulations, or required time.
- the in-vehicle device 104 includes, for example, a mobile device or a wearable device owned by a passenger, an information device carried in or attached to the own vehicle, a navigation device for searching a route to an arbitrary destination, and the like.
- the output control unit 105 controls the output of various information to the passengers of the own vehicle or the outside of the vehicle.
- Specifically, for example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies it to the output unit 106, thereby controlling the output of visual and auditory information from the output unit 106.
- For example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
- Further, for example, the output control unit 105 generates voice data including a warning sound or a warning message for dangers such as collision, contact, and entry into a danger zone, and supplies an output signal including the generated voice data to the output unit 106.
- The output unit 106 includes devices capable of outputting visual or auditory information to the passengers of the own vehicle or to the outside of the vehicle.
- the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a spectacle-type display worn by a passenger, a projector, a lamp, and the like.
- The display device included in the output unit 106 may be, in addition to a device having a normal display, a display device that displays visual information in the driver's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function.
- The drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. Further, the drive system control unit 107 supplies control signals to units other than the drive system 108 as necessary, and notifies them of the control state of the drive system 108.
- The drive system 108 includes various devices related to the drive system of the own vehicle.
- For example, the drive system 108 includes a driving force generator for generating the driving force of an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
- The body system control unit 109 controls the body system 110 by generating various control signals and supplying them to the body system 110. Further, the body system control unit 109 supplies control signals to units other than the body system 110 as necessary, and notifies them of the control state of the body system 110.
- the body system 110 includes various body devices equipped on the vehicle body.
- For example, the body system 110 includes a keyless entry system, a smart key system, a power window device, power seats, a steering wheel, an air conditioner, and various lamps (for example, headlamps, back lamps, brake lamps, turn signals, fog lamps, etc.).
- The storage unit 111 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
- the storage unit 111 stores various programs, data, and the like used by each unit of the vehicle control system 100.
- For example, the storage unit 111 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map but covers a wide area, and a local map containing information around the own vehicle.
- The automatic driving control unit 112 controls automatic driving such as autonomous driving or driving support. Specifically, for example, the automatic driving control unit 112 performs coordinated control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including collision avoidance or impact mitigation of the own vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, collision warning of the own vehicle, lane departure warning of the own vehicle, and the like. Further, for example, the automatic driving control unit 112 performs coordinated control for the purpose of automatic driving in which the vehicle travels autonomously without depending on the driver's operation.
- The automatic driving control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
- the detection unit 131 detects various types of information necessary for controlling automatic operation.
- the detection unit 131 includes an outside information detection unit 141, an in-vehicle information detection unit 142, and a vehicle state detection unit 143.
- the vehicle outside information detection unit 141 performs detection processing of information outside the own vehicle based on data or signals from each unit of the vehicle control system 100. For example, the vehicle outside information detection unit 141 performs detection processing, recognition processing, tracking processing, and distance detection processing for an object around the own vehicle. Objects to be detected include, for example, vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, road signs, and the like. Further, for example, the vehicle outside information detection unit 141 performs detection processing of the environment around the own vehicle. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like.
- The vehicle outside information detection unit 141 supplies data indicating the result of the detection processing to the self-position estimation unit 132; to the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133; to the emergency situation avoidance unit 171 of the operation control unit 135; and the like.
- the in-vehicle information detection unit 142 performs in-vehicle information detection processing based on data or signals from each unit of the vehicle control system 100.
- the in-vehicle information detection unit 142 performs driver authentication processing and recognition processing, driver status detection processing, passenger detection processing, vehicle interior environment detection processing, and the like.
- the state of the driver to be detected includes, for example, physical condition, arousal degree, concentration degree, fatigue degree, line-of-sight direction, and the like.
- the environment inside the vehicle to be detected includes, for example, temperature, humidity, brightness, odor, and the like.
- the in-vehicle information detection unit 142 supplies data indicating the result of the detection process to the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
- the vehicle state detection unit 143 performs detection processing of the state of the own vehicle based on data or signals from each part of the vehicle control system 100.
- The state of the own vehicle to be detected includes, for example, speed, acceleration, steering angle, presence/absence and content of an abnormality, driving operation state, position and tilt of the power seat, door lock state, and the states of other in-vehicle devices.
- the vehicle state detection unit 143 supplies data indicating the result of the detection process to the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
- The self-position estimation unit 132 performs estimation processing of the position and posture of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. Further, the self-position estimation unit 132 generates a local map used for self-position estimation (hereinafter referred to as a self-position estimation map) as necessary.
- The self-position estimation map is, for example, a highly accurate map using a technique such as SLAM (Simultaneous Localization and Mapping).
- the self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 111.
- the situation analysis unit 133 analyzes the situation of the own vehicle and the surroundings.
- the situational analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situational awareness unit 153, and a situational prediction unit 154.
- The map analysis unit 151 performs analysis processing of various maps stored in the storage unit 111, using data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132 and the vehicle outside information detection unit 141, as necessary, and builds a map containing information necessary for automatic driving processing.
- The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, and the situation prediction unit 154, as well as to the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134, and the like.
- The traffic rule recognition unit 152 performs recognition processing of the traffic rules around the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle outside information detection unit 141, and the map analysis unit 151. By this recognition processing, for example, the positions and states of traffic signals around the own vehicle, the content of traffic regulations around the own vehicle, the lanes in which the vehicle can travel, and the like are recognized.
- the traffic rule recognition unit 152 supplies data indicating the result of the recognition process to the situation prediction unit 154 and the like.
- The situation recognition unit 153 performs recognition processing of the situation related to the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the in-vehicle information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 performs recognition processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver of the own vehicle, and the like. The situation recognition unit 153 also generates a local map used for recognizing the situation around the own vehicle (hereinafter referred to as a situation recognition map) as necessary.
- The situation recognition map is, for example, an occupancy grid map (Occupancy Grid Map).
- the status of the own vehicle to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the own vehicle, and the presence / absence and content of an abnormality.
- The situation around the own vehicle to be recognized includes, for example, the types and positions of surrounding stationary objects; the types, positions, and movements (for example, speed, acceleration, moving direction, etc.) of surrounding moving objects; the configuration of surrounding roads and the road surface condition; and the surrounding weather, temperature, humidity, brightness, and the like.
- the state of the driver to be recognized includes, for example, physical condition, arousal level, concentration level, fatigue level, eye movement, driving operation, and the like.
- The situation recognition unit 153 supplies data indicating the result of the recognition processing (including the situation recognition map, if necessary) to the self-position estimation unit 132, the situation prediction unit 154, and the like. Further, the situation recognition unit 153 stores the situation recognition map in the storage unit 111.
- the situation prediction unit 154 performs a situation prediction process related to the own vehicle based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153. For example, the situation prediction unit 154 performs prediction processing such as the situation of the own vehicle, the situation around the own vehicle, and the situation of the driver.
- the situation of the own vehicle to be predicted includes, for example, the behavior of the own vehicle, the occurrence of an abnormality, the mileage, and the like.
- The situation around the own vehicle to be predicted includes, for example, the behavior of moving objects around the own vehicle, changes in traffic signal states, changes in the environment such as the weather, and the like.
- the driver's situation to be predicted includes, for example, the driver's behavior and physical condition.
- The situation prediction unit 154 supplies data indicating the result of the prediction processing, together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153, to the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134, and the like.
- the route planning unit 161 plans a route to the destination based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the route planning unit 161 sets a route from the current position to the specified destination based on the global map. Further, for example, the route planning unit 161 appropriately changes the route based on the conditions of traffic congestion, accidents, traffic restrictions, construction work, etc., and the physical condition of the driver. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
- The action planning unit 162 plans actions of the own vehicle for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the action planning unit 162 plans starting, stopping, traveling direction (for example, forward, backward, left turn, right turn, turning, etc.), traveling lane, traveling speed, overtaking, and the like. The action planning unit 162 supplies data indicating the planned actions of the own vehicle to the operation planning unit 163 and the like.
- The operation planning unit 163 plans operations of the own vehicle for realizing the actions planned by the action planning unit 162, based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the operation planning unit 163 plans acceleration, deceleration, traveling track, and the like. The operation planning unit 163 supplies data indicating the planned operations of the own vehicle to the acceleration/deceleration control unit 172 and the direction control unit 173 of the operation control unit 135.
- The operation control unit 135 controls the operation of the own vehicle.
- the operation control unit 135 includes an emergency situation avoidance unit 171, an acceleration / deceleration control unit 172, and a direction control unit 173.
- The emergency situation avoidance unit 171 performs detection processing of emergencies such as collision, contact, entry into a danger zone, driver abnormality, and vehicle abnormality, based on the detection results of the vehicle outside information detection unit 141, the in-vehicle information detection unit 142, and the vehicle state detection unit 143.
- When the emergency situation avoidance unit 171 detects the occurrence of an emergency, it plans operations of the own vehicle for avoiding the emergency, such as a sudden stop or a sharp turn.
- the emergency situation avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the acceleration / deceleration control unit 172, the direction control unit 173, and the like.
- The acceleration/deceleration control unit 172 performs acceleration/deceleration control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency situation avoidance unit 171.
- For example, the acceleration/deceleration control unit 172 calculates a control target value of the driving force generator or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
- The direction control unit 173 performs direction control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency situation avoidance unit 171. For example, the direction control unit 173 calculates a control target value of the steering mechanism for realizing the traveling track or sharp turn planned by the operation planning unit 163 or the emergency situation avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
- Various external recognition sensors such as cameras, millimeter-wave radars, and laser radars are beginning to be installed in vehicles in order to perform more accurate external recognition toward the realization of autonomous driving and ADAS.
- Each sensor has its strengths and weaknesses depending on the detection principle. For example, a camera that captures visible light is not good at dark places, and a radar that detects the reflection of radio waves is not good at objects that do not easily reflect radio waves, such as people and animals. Further, by utilizing the fusion technology that combines two or more sensors, it is possible to realize more accurate external recognition by making the best use of the characteristics of each sensor.
- The in-vehicle camera is intended to realize automatic driving of the own vehicle, such as collision avoidance and lane control, and ADAS. Therefore, the image taken by the in-vehicle camera preferably is not one that faithfully reproduces the photographed landscape so as to look natural to humans, but rather one from which specific objects such as vehicles, pedestrians, obstacles, and road surfaces can be recognized with high accuracy. Further, since it bears on the determination of whether the self-driving vehicle should proceed or stop, it is preferable that the hue of the red, yellow, and blue lights of a traffic light can be recognized from the image with high accuracy.
- Patent Document 2 discloses a method for determining an exposure control value of an in-vehicle camera, in which a target is detected from an image captured by the in-vehicle camera with the target placed in the imaging range, the brightness of the target in the detected image is measured, and an exposure control value for bringing the measured brightness to a target value suitable for target recognition is calculated.
- FIG. 2 schematically shows a functional configuration example of the image pickup apparatus 200 according to the first embodiment of the technique disclosed in the present specification. It is assumed that the image pickup apparatus 200 is mainly mounted on a vehicle and used.
- the image pickup device 200 corresponds to one of the image pickup devices included in the data acquisition unit 102 in the vehicle control system 100.
- the illustrated image pickup device 200 includes a lens 201, an image sensor 202, a signal processing unit 203, a recognition unit 204, and a control unit 205.
- The image sensor 202 is configured using an element such as a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge Coupled Device), and captures the image formed on its imaging surface by the lens 201.
- The signal processing unit 203 performs processing also called "development" and "HDR synthesis" on the RAW data output from the image sensor 202. For example, demosaicing, noise reduction, white balance adjustment, gamma correction, sensor spectral correction, edge correction, and YC conversion correspond to the development processing.
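- As a minimal sketch of these development steps, the following Python code demosaics a Bayer RAW frame, applies white balance, and performs gamma correction; the RGGB pattern, the gain values, and the gamma value are assumptions made for illustration and are not taken from the patent.

```python
import numpy as np

def develop(raw, wb_gains=(1.8, 1.0, 1.5), gamma=2.2):
    """Turn a 16-bit RGGB Bayer RAW frame into an 8-bit RGB image."""
    raw = raw.astype(np.float64) / 65535.0
    # Crude demosaic: collapse each 2x2 Bayer cell into one RGB pixel.
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1)
    # White balance: per-channel gains chosen for the assumed illuminant.
    rgb *= np.asarray(wb_gains)
    # Gamma correction for the display-referred output.
    rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)
    return (rgb * 255.0).astype(np.uint8)

raw = np.random.randint(0, 65536, (480, 640), dtype=np.uint16)  # stand-in RAW data
image = develop(raw)  # noise reduction, edge correction, etc. would follow here
```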
- the recognition unit 204 recognizes an object included in the captured image after processing by the signal processing unit 203.
- the recognition unit 204 targets objects that must be recognized or detected in order to realize automatic driving and ADAS, such as peripheral vehicles, road surfaces, and traffic lights.
- The recognition unit 204 can further incorporate other objects such as motorcycles, bicycles, pedestrians, road signs, lanes, medians, guardrails, roadside trees, and street lights into the recognition targets.
- Objects to be recognized may be added, deleted, or changed according to the manual operation of the passenger of the own vehicle such as a driver.
- an object to be recognized may be added, deleted, or changed depending on the application.
- The recognition unit 204 may recognize objects using a machine learning or deep learning model, or artificial intelligence (AI).
- the control unit 205 controls the image pickup operation in the image sensor 202 and the signal processing in the signal processing unit 203 based on the state of the region of each object recognized by the recognition unit 204 in the image captured by the image sensor 202.
- The control unit 205 controls the image sensor 202 and the signal processing unit 203 so that objects such as vehicles, road surfaces, and traffic lights, whose recognition or detection is indispensable for realizing automatic driving and ADAS, can be recognized with high accuracy. Therefore, the image captured by the image sensor 202 under the control of the control unit 205 and developed by the signal processing unit 203 has its image quality adjusted so that objects to be identified, such as vehicles, road surfaces, and traffic signals, are easily recognized in the recognition processing performed in the subsequent vehicle control system 100. Consequently, the image output from the imaging device 200 to the vehicle control system 100 is not always a faithful reproduction of the original landscape, and may look unnatural to a person viewing it.
- By recognizing an image captured by the image sensor 202 and developed by the signal processing unit 203 under the control of the control unit 205, peripheral vehicles, road surfaces, and traffic lights can be recognized with high accuracy, or at a high recognition rate. Based on such image recognition results, the vehicle control system 100 performs vehicle control for automatic driving or ADAS, such as adaptive cruise control (ACC), lane departure warning (LDW), lane keep assist (LKA), automatic emergency braking (AEB), and blind spot detection (BSD), and further controls the drive of each drive unit such as an active cornering light (ACL), a brake actuator (BRK), and a steering device (STR). This can contribute to the safe driving of the own vehicle.
- FIG. 9 schematically shows a first embodiment of the operation flow of the image pickup apparatus 200.
- the image sensor 202 inputs the captured image data to the signal processing unit 203 (step S11).
- the signal processing unit 203 develops the image pickup data to generate the first image data (object recognition image data) and inputs it to the recognition unit 204 (step S12).
- the recognition unit 204 detects a target subject (vehicle, road surface, traffic light, etc.) from the first image data (image data for object recognition) and notifies the signal processing unit 203 (step S13).
- Next, the signal processing unit 203 obtains a correction value specialized for the target subject (vehicle, road surface, traffic light, etc.) based on the information from the recognition unit 204, and adjusts the image quality (exposure adjustment, noise removal, edge correction, gradation correction, color tone correction, HDR composition, etc.).
- Then, the signal processing unit 203 generates the second image data (sensing image data output to the vehicle control system 100), which is data obtained by correcting the developed first image data (object recognition image data) (step S14).
- Further, the control unit 205 feeds back the correction value obtained by the signal processing unit 203 to perform imaging control (for example, exposure control) of the image sensor 202 (step S15), so that the image sensor 202 can thereafter capture more appropriate imaging data.
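- The flow of steps S11 to S15 can be summarized in code roughly as follows; the stub sensor, the single detection frame, and the target brightness of 0.5 are assumptions made only to keep the sketch self-contained.

```python
import numpy as np

class ImageSensor:
    """Stub sensor: the exposure value simply scales a synthetic scene."""
    def __init__(self):
        self.exposure = 1.0
    def capture(self):
        scene = np.random.rand(240, 320)  # stand-in radiance
        return np.clip(scene * self.exposure, 0.0, 1.0)

def develop(raw):                                  # S12: generate first image data
    return np.clip(raw, 0.0, 1.0) ** (1.0 / 2.2)

def recognize(image):                              # S13: placeholder recognizer
    # A real recognizer would return frames for vehicles, road surfaces,
    # traffic lights, etc.; here a single fixed frame stands in.
    return {"traffic_light": (slice(10, 40), slice(50, 90))}

def correction_amount(image, frame, target=0.5):   # subject-specialized correction
    return target / max(float(image[frame].mean()), 1e-6)

sensor = ImageSensor()
raw = sensor.capture()                             # S11
first_image = develop(raw)                         # S12
frames = recognize(first_image)                    # S13
gain = correction_amount(first_image, frames["traffic_light"])
second_image = np.clip(first_image * gain, 0.0, 1.0)  # S14: sensing image data
sensor.exposure *= gain                            # S15: feedback to imaging control
```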
- FIG. 10 schematically shows a second embodiment of the operation flow of the image pickup apparatus 200.
- the image sensor 202 inputs a plurality of different imaging data captured to the signal processing unit 203 (step S21).
- The plurality of different imaging data are imaging data specialized for the respective subjects (vehicle, road surface, traffic light, etc.), captured simultaneously with different exposure values for each subject.
- the signal processing unit 203 develops a plurality of different imaging data to generate a plurality of different first image data (image data for object recognition) and inputs the data to the recognition unit 204 (step S22).
- Multiple different image data are, for example, image data that makes it easy to recognize a distant subject, image data that makes it easy to recognize a subject at night by removing noise, and image data that makes it easy to recognize the light color of a signal by lengthening the exposure time.
- Next, the recognition unit 204 detects the target subjects (vehicle, road surface, traffic signal, etc.) from the plurality of different first image data (object recognition image data) specialized for the respective subjects, and notifies the signal processing unit 203 (step S23).
- The signal processing unit 203 then obtains a correction value specialized for each target subject (vehicle, road surface, traffic light, etc.) based on the information from the recognition unit 204, and adjusts the image quality (exposure adjustment, noise removal, edge correction, gradation correction, color tone correction, HDR composition, etc.).
- Then, the signal processing unit 203 generates a plurality of different second image data (sensing image data output to the vehicle control system 100), which are data obtained by correcting the plurality of developed first image data (object recognition image data) (step S24).
- the signal processing unit 203 may HDR-synthesize a plurality of different second image data to generate an HDR composite image and output it to the vehicle control system 100.
- Further, the control unit 205 feeds back the correction values obtained by the signal processing unit 203 to perform imaging control (for example, exposure control) of the image sensor 202 (step S25), so that the image sensor 202 can thereafter capture more appropriate imaging data.
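- A compressed sketch of this second flow, including the optional HDR composition, might look as follows in Python; the three exposure values, the per-pixel weighted-average composition, and the composition ratios are illustrative assumptions rather than the patent's own method.

```python
import numpy as np

def capture_bracketed(scene, exposures=(0.5, 1.0, 4.0)):
    """S21: plural imaging data captured simultaneously with different exposures."""
    return [np.clip(scene * e, 0.0, 1.0) for e in exposures]

def develop(raw):
    """S22: development into object-recognition image data."""
    return raw ** (1.0 / 2.2)

def hdr_composite(images, ratios):
    """Blend plural second image data according to an image composition ratio."""
    ratios = np.asarray(ratios, dtype=np.float64)
    return np.tensordot(ratios / ratios.sum(), np.stack(images), axes=1)

scene = np.random.rand(240, 320)
raws = capture_bracketed(scene)                    # S21
firsts = [develop(r) for r in raws]                # S22
# S23/S24 would detect each subject and apply its specialized correction;
# here the developed images are composited directly, favouring the middle
# exposure, to show the composition-ratio step.
composite = hdr_composite(firsts, ratios=(0.2, 0.6, 0.2))
```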
- Hereinafter, the second embodiment (FIG. 10) will be mainly described; that is, the case where the signal processing unit 203 processes a plurality of imaging data specialized for respective subjects (vehicle, road surface, traffic light, etc.).
- FIG. 3 schematically shows a general exposure control loop in the image pickup apparatus 200.
- the image sensor 202 includes a shutter 301, an element unit 302, and an analog gain processing unit 303.
- the light collected by the lens 201 passes through the shutter 301 and reaches the image pickup surface of the element unit 302.
- the element unit 302 is composed of a two-dimensional pixel array, and a pixel signal corresponding to the amount of received light is output from each pixel.
- Each pixel signal is amplified in the analog region by the analog gain processing unit 303, then digitally converted and output to the signal processing unit 203.
- the signal processing unit 203 includes a development processing unit 304, a detection unit 305, and a comparison unit 306.
- the development processing unit 304 performs development processing including digital gain processing, gamma processing, and HDR (High Dynamic Range) processing on the digital pixel signal output from the image sensor 202.
- The detection unit 305 performs OPD (Optical Detection) on the entire screen captured by the image sensor 202, and detects the brightness (luminance) of the screen and the color tone (color ratio) of the screen.
- the comparison unit 306 compares the brightness of the entire screen detected by the detection unit 305 with a predetermined reference value (Ref).
- The control unit 205 controls the opening/closing timing (that is, the exposure time) of the shutter 301 and adjusts the analog gain of the analog gain processing unit 303 based on the difference between the screen brightness output from the comparison unit 306 and the reference value. Alternatively, the digital gain, gamma, HDR, and other development parameters in the development processing unit 304 are adjusted, so that the image captured by the image sensor 202 is controlled to an appropriate brightness and an appropriate color tone.
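- The loop of FIG. 3 amounts to ordinary feedback automatic exposure, which a toy Python version can make concrete; the reference value, the proportional gain, and the iteration count are assumptions for the sketch.

```python
import numpy as np

REF = 0.45   # predetermined reference value for whole-screen brightness
K_P = 0.8    # proportional feedback gain (assumed)

def opd_detect(image):
    """Detection unit 305: mean brightness over the entire screen."""
    return float(image.mean())

exposure = 1.0
scene = np.random.rand(240, 320) * 0.2             # a dark scene
for _ in range(20):
    frame = np.clip(scene * exposure, 0.0, 1.0)    # shutter + analog gain, combined
    error = REF - opd_detect(frame)                # comparison unit 306 vs. Ref
    exposure *= 1.0 + K_P * error                  # control unit 205 adjustment
print(f"converged exposure: {exposure:.2f}")
```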
- With such general exposure control, the brightness of the entire screen can be adjusted, but necessary subjects such as peripheral vehicles, the road surface, and traffic lights are not necessarily captured at an appropriate brightness (or at a brightness suitable for image recognition). For example, when driving under strong sunlight and passing under roadside trees, there is a concern that the contrast between the sunlit and shaded parts becomes too strong, resulting in a captured image in which vehicles and the road surface in the shade cannot be recognized with high accuracy. There is also a concern that, due to flicker, the captured image shows all the signal lights extinguished, or that, due to the influence of sunlight, the red and yellow colors of the traffic light cannot be distinguished in the captured image.
- FIG. 4 schematically shows an exposure control loop to which the technique disclosed in the present specification is applied in the image pickup apparatus 200.
- the control loop shown in FIG. 4 is configured to perform optimum image creation for each object recognized by the recognition unit 204 from the captured image.
- In the following description, the recognition unit 204 detects a vehicle (the nearest preceding vehicle), the road surface, and a traffic light as objects whose recognition or detection is indispensable for realizing automatic driving and ADAS.
- the image sensor 202 includes a shutter 401, an element unit 402, and an analog gain processing unit 403.
- the light collected by the lens 201 passes through the shutter 401 and reaches the image pickup surface of the element unit 402.
- the element unit 402 is composed of a two-dimensional pixel array, and a pixel signal corresponding to the amount of received light is output from each pixel.
- Each pixel signal is amplified in the analog region by the analog gain processing unit 403, digitally converted, and output to the signal processing unit 203.
- the image sensor 202 will be described as performing a single exposure (that is, having a single exposure timing within one frame period).
- the signal processing unit 203 includes a development processing unit 404, a detection unit 405, and a comparison unit 406.
- The detection unit 405 includes a vehicle detection unit 405-1 that detects the brightness of the vehicle portion in the captured image, a road surface detection unit 405-2 that detects the brightness of the road surface portion in the captured image, and a traffic light detection unit 405-3 that detects the brightness and color tone of the traffic light portion in the captured image, so that OPD detection can be performed for each object recognized by the recognition unit 204 from the captured image. The detection unit 405 also includes an entire-screen detection unit 405-4 that performs OPD detection on the entire screen captured by the image sensor 202. Further, when the recognition unit 204 additionally detects an object other than a vehicle, the road surface, or a traffic light, a detection unit that detects the brightness of the added object may be further provided.
- The development processing unit 404 includes a vehicle development processing unit 404-1, a road surface development processing unit 404-2, and a traffic light development processing unit 404-3, so that development processing suitable for each object recognized by the recognition unit 204 from the captured image can be performed individually. The development processing unit 404 also includes an entire-screen development processing unit 404-4 that performs development processing suitable for the entire screen. Further, when the recognition unit 204 additionally detects an object other than a vehicle, the road surface, or a traffic light, a development processing unit for the added object may be further provided.
- the development processing unit 404 further includes an HDR processing unit 407.
- In FIG. 4, normal automatic exposure control is carried out within the range indicated by reference number 410. That is, the entire-screen detection unit 405-4 detects the brightness of the screen by performing OPD detection on the entire screen captured by the image sensor 202. Then, the comparison unit 406 compares the brightness of the entire screen detected by the detection unit 405 with a predetermined reference value (Ref). The control unit 205 controls the opening/closing timing (that is, the exposure time) of the shutter 401 and adjusts the analog gain of the analog gain processing unit 403 based on the difference between the screen brightness output from the comparison unit 406 and the reference value. Alternatively, the digital gain and other development parameters in the entire-screen development processing unit 404-4 are adjusted, so that the image captured by the image sensor 202 is controlled to the optimum brightness.
- the recognition unit 204 recognizes an object included in the captured image 420 after processing by the signal processing unit 203.
- Then, the recognition unit 204 recognizes the vehicle 421 (second object), the road surface 422 (second object), and the traffic light 423 (first object) as objects whose recognition or detection is indispensable for realizing automatic driving and ADAS.
- Of course, other objects such as motorcycles, bicycles, pedestrians, road signs, guardrails, roadside trees, and street lights can be further incorporated into the recognition targets.
- the vehicle detection unit 405-1 sets a detection frame of the vehicle 421 in the captured image 420 based on the recognition result of the vehicle 421 by the recognition unit 204, detects the brightness in the detection frame, and detects the entire screen. The difference between the brightness of the entire screen detected by the detection unit 405-4 and the brightness in the detection frame of the vehicle 421 is calculated. Then, the developing processing unit 404-1 for a vehicle adjusts the digital gain and the gamma value based on the difference to adjust the brightness within the frame of the vehicle 421, and solves the problem as a resolution according to the distance of the subject. Edge correction is performed based on the image characteristics to develop.
- In short, the vehicle development processing unit 404-1 performs the development processing that is optimal for recognizing the vehicle 421 from the captured image (if the detection frame of the vehicle 421 is too dark relative to the entire screen, development is performed so that the vehicle 421 becomes brighter; if it is too bright relative to the entire screen, development is performed so that the vehicle 421 becomes darker).
- The captured image developed by the vehicle development processing unit 404-1 does not necessarily reproduce the original landscape faithfully and may look unnatural to a human viewer, but it is an image from which the vehicle 421 can be recognized with high accuracy.
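- The per-object adjustment described above can be sketched as follows; the gain and gamma formulas are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np

def frame_brightness(image: np.ndarray, box: tuple) -> float:
    """Mean luminance inside a detection frame (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    return float(image[y0:y1, x0:x1].mean())

def develop_for_object(image: np.ndarray, box: tuple,
                       gamma: float = 1.0) -> np.ndarray:
    """Adjust digital gain and gamma so the brightness inside the
    object's detection frame approaches the whole-screen brightness."""
    whole = float(image.mean())
    local = frame_brightness(image, box)
    gain = whole / max(local, 1.0)      # brighten a too-dark frame
    out = np.clip(image.astype(np.float32) * gain, 0, 255) / 255.0
    return (np.power(out, 1.0 / gamma) * 255.0).astype(np.uint8)
```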
- the road surface detection unit 405-2 sets a detection frame for the road surface 422 in the captured image 420 based on the recognition result of the road surface 422 by the recognition unit 204, and detects the brightness in the detection frame.
- the difference between the brightness of the entire screen detected by the entire screen detection unit 405-4 and the brightness within the detection frame of the road surface 422 is calculated.
- The road surface development processing unit 404-2 then adjusts the digital gain and the gamma value based on this difference to adjust the brightness within the frame of the road surface 422, and performs edge correction based on the image characteristics, such as the resolution expected at the subject's distance, to develop the image.
- In short, the road surface development processing unit 404-2 performs the development processing that is optimal for recognizing the road surface 422 from the captured image (if the detection frame of the road surface 422 is too dark relative to the entire screen, development is performed so that the road surface 422 becomes brighter; if it is too bright relative to the entire screen, development is performed so that the road surface 422 becomes darker).
- The captured image developed by the road surface development processing unit 404-2 does not necessarily reproduce the original landscape faithfully and may look unnatural to a human viewer, but it is an image from which the road surface 422 can be recognized with high accuracy.
- The traffic light detection unit 405-3 sets a detection frame for the traffic light 423 in the captured image 420 based on the recognition result of the traffic light 423 by the recognition unit 204, detects the brightness within the detection frame, and calculates the difference between the brightness of the entire screen detected by the entire-screen detection unit 405-4 and the brightness within the detection frame of the traffic light 423. The traffic light development processing unit 404-3 then adjusts the digital gain and the gamma value based on this difference to adjust the brightness within the frame of the traffic light 423, and performs edge correction based on the image characteristics, such as the resolution expected at the subject's distance, to develop the image.
- In short, the traffic light development processing unit 404-3 performs the development processing that is optimal for recognizing the traffic light 423 from the captured image (if the detection frame of the traffic light 423 is too dark relative to the entire screen, development is performed so that the traffic light 423 becomes brighter; if it is too bright relative to the entire screen, development is performed so that the traffic light 423 becomes darker).
- The traffic light detection unit 405-3 also sets a detection frame for the traffic light 423 in the captured image 420 based on the recognition result of the traffic light 423 by the recognition unit 204, detects the color tone within the detection frame, and calculates the difference from a predetermined reference value.
- The traffic light development processing unit 404-3 then adjusts the color tone correction value based on this difference and adjusts the color tone within the frame of the traffic light 423 (if the hue deviates from the reference range for the red, yellow, and blue signal lights of the traffic light 423, saturation and lightness are also adjusted during development so that the hue falls within the range).
- The captured image developed by the traffic light development processing unit 404-3 does not necessarily reproduce the original landscape faithfully and may look unnatural to a human viewer, but it is an image from which the traffic light 423 can be recognized with high accuracy.
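- The hue-based color tone adjustment can be sketched as below; the hue ranges and boost factors are assumptions for illustration.

```python
import colorsys

# Assumed reference hue ranges (degrees) for the signal lights.
HUE_RANGES = {"red": (350, 10), "yellow": (40, 70), "blue": (160, 220)}

def hue_in_range(hue_deg: float, lo: float, hi: float) -> bool:
    """Hue range test that tolerates wrap-around at 360 degrees."""
    return lo <= hue_deg <= hi if lo <= hi else hue_deg >= lo or hue_deg <= hi

def correct_light_color(r: float, g: float, b: float, color: str) -> tuple:
    """If the lamp's hue falls outside its reference range, raise
    saturation and lightness so the light color reads unambiguously."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    lo, hi = HUE_RANGES[color]
    if not hue_in_range(h * 360.0, lo, hi):
        s = min(1.0, s * 1.5)   # assumed saturation boost
        l = min(0.9, l * 1.2)   # assumed lightness boost
    return colorsys.hls_to_rgb(h, l, s)
```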
- The HDR processing unit 407 HDR-combines the image developed by the vehicle development processing unit 404-1 to the tone and image quality for the vehicle, the image developed by the road surface development processing unit 404-2 to the tone and image quality for the road surface, and the image developed by the traffic light development processing unit 404-3 to the gradation, color tone, and image quality for the traffic light. In this way, the HDR processing unit 407 creates an HDR composite image developed to a gradation, color tone, and image quality suited to the entire screen.
- The vehicle detection unit 405-1 may also detect the color tone within the detection frame of the vehicle 421, calculate the difference from a predetermined reference value, and adjust the color tone within the frame of the vehicle 421 by adjusting the color tone correction value based on that difference.
- Likewise, the road surface detection unit 405-2 may detect the color tone within the detection frame of the road surface 422, calculate the difference from a predetermined reference value, and adjust the color tone within the frame of the road surface 422 by adjusting the color tone correction value based on that difference.
- In that case, the image developed by the vehicle development processing unit 404-1 to the tone, color tone, and image quality for the vehicle, the image developed by the road surface development processing unit 404-2 to the tone, color tone, and image quality for the road surface, and the image developed by the traffic light development processing unit 404-3 to the gradation, color tone, and image quality for the traffic light may be HDR-combined.
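- A minimal sketch of such an HDR composition, assuming per-pixel weight maps derived from the detection frames (the normalization rule is an assumption):

```python
import numpy as np

def hdr_composite(images: list, weights: list) -> np.ndarray:
    """Blend per-object developed images (H x W x 3, uint8) using
    per-pixel weight maps (H x W), normalized to sum to 1 per pixel."""
    stack = np.stack([w.astype(np.float32) for w in weights])
    stack /= np.maximum(stack.sum(axis=0, keepdims=True), 1e-6)
    out = sum(w[..., None] * img.astype(np.float32)
              for img, w in zip(images, stack))
    return np.clip(out, 0, 255).astype(np.uint8)
```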
- As described above, the image pickup apparatus 200 shown in FIG. 4 includes the recognition unit 204 and mounts a plurality of development processing units 404 and a plurality of detection units 405, one for each object recognized by the recognition unit 204. It can therefore perform development that is optimal for image recognition of each object while stably performing normal automatic exposure control and automatic color tone correction at the same time.
- In the example described here, the recognition unit 204 recognizes three types of objects, namely a vehicle, the road surface, and a traffic light, and a plurality of detection units 405 and a plurality of development processing units 404 are mounted for the vehicle, the road surface, and the traffic light.
- The recognition unit 204 can also incorporate other objects such as motorcycles, bicycles, pedestrians, road signs, lanes, median strips, guardrails, roadside trees, and street lights into the recognition target, and detection units and development processing units can be added accordingly.
- The image sensor 202 has been described as performing a single exposure, but it may of course be capable of a plurality of simultaneous exposures (that is, having a plurality of exposure timings within one frame period and producing a plurality of captured images within one frame period).
- The image pickup device 200 may use the plurality of simultaneous exposures for separate purposes, such as automatic exposure control for each brightness group (low-brightness-side automatic exposure and high-brightness-side automatic exposure) and LED (Light Emitting Diode) flicker countermeasures, and may output an image for each purpose.
- FIG. 5 shows, in the form of a flowchart, the basic operation procedure for automatic exposure control and automatic color tone correction in the image pickup apparatus 200 shown in FIG. 4. Here, it is assumed that the image sensor 202 is capable of a plurality of simultaneous exposures and that four exposures are performed: for vehicles, the road surface, traffic lights, and the entire screen.
- the entire screen detection unit 405-4 detects the brightness of the entire screen of the captured image by the image sensor 202 (step S501).
- The comparison unit 406 compares the brightness (luminance) of the entire screen and the color tone (ratio) of the entire screen detected by the entire-screen detection unit 405-4 with predetermined reference values (Ref) to calculate the error amounts (step S502).
- The control unit 205 performs device control, such as setting the opening/closing timing of the shutter 401 (that is, the exposure time) and adjusting the analog gain of the analog gain processing unit 403, based on the error amount calculated in step S502 (step S503).
- Each device control (steps S503-1, S503-2, S503-3, S503-4) may be performed separately for low brightness, high brightness, the traffic light, and the entire screen. Alternatively, each device control may have the same contents.
- The processing of steps S501 to S503 corresponds to normal automatic exposure control.
- the recognition unit 204 performs image recognition processing on each captured image processed based on the device control in steps S503-1 to S503-3 (step S504).
- If the recognition unit 204 cannot recognize any target object (vehicle, road surface, traffic light) from the captured image (No in step S504), this process ends.
- If the recognition unit 204 can recognize a target object (vehicle, road surface, traffic light) from the captured image (Yes in step S504), the signal processing unit 203 acquires image recognition information regarding the target object (vehicle, road surface, traffic light) from the recognition unit 204 (step S505).
- the vehicle detection unit 405-1 sets the vehicle detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness of the object in the vehicle detection frame (step S506).
- When the recognition unit 204 can recognize N vehicles (where N is an integer of 2 or more) from the captured image, the vehicle detection unit 405-1 sets detection frames for the N vehicles. The brightness of all vehicles may be detected individually, the average brightness of the objects in all detection frames may be detected, or the brightness of up to a predetermined number of vehicles with higher priority may be detected. The priority of a vehicle may be assigned based on its likelihood of collision with the own vehicle, for example its distance from the own vehicle.
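- Priority-based selection of detection frames might look like the following sketch, where using distance alone as the priority key is an assumption:

```python
from dataclasses import dataclass

@dataclass
class DetectedVehicle:
    box: tuple          # detection frame (x0, y0, x1, y1)
    distance_m: float   # estimated distance from the own vehicle

def select_by_priority(vehicles: list, max_count: int = 3) -> list:
    """Keep only the closest vehicles, i.e. those most likely to
    collide with the own vehicle, up to a predetermined number."""
    return sorted(vehicles, key=lambda v: v.distance_m)[:max_count]
```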
- Next, the vehicle detection unit 405-1 compares the brightness of the object in the vehicle detection frame with the brightness of the entire screen detected by the entire-screen detection unit 405-4 and calculates the error amount (step S507). The vehicle development processing unit 404-1 then calculates a vehicle correction amount for adjusting noise and developing to the tone and image quality for the vehicle, based on the error amount calculated in step S507 for the image captured for the vehicle by the plurality of simultaneous exposures, the image recognition information acquired from the recognition unit 204, and the detection information of the vehicle detection unit 405-1 (step S513). The vehicle development processing unit 404-1 develops the image captured for the vehicle to the tone and image quality for the vehicle based on the vehicle correction amount (step S508).
- the road surface detection unit 405-2 sets the road surface detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness of the object in the road surface detection frame (step S509).
- The road surface detection unit 405-2 compares the brightness of the object in the road surface detection frame with the brightness of the entire screen detected by the entire-screen detection unit 405-4 and calculates the error amount (step S510).
- The road surface development processing unit 404-2 calculates a road surface correction amount for adjusting noise and developing to the tone and image quality for the road surface, based on the error amount calculated in step S510 for the image captured for the road surface by the plurality of simultaneous exposures (step S514).
- the road surface development processing unit 404-2 develops the image captured for the road surface to the tone and image quality for the road surface based on the correction amount for the road surface (step S511).
- the traffic light detection unit 405-3 sets the traffic light detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness and color tone of the object in the traffic light detection frame (step S515).
- The traffic light detection unit 405-3 compares the brightness and color tone of the object in the traffic light detection frame with the brightness and color tone of the entire screen detected by the entire-screen detection unit 405-4 and calculates the error amount (step S516).
- The traffic light development processing unit 404-3 calculates a traffic light correction amount for adjusting noise and developing to the gradation, color tone, and image quality for the traffic light, based on the error amount calculated in step S516 for the image captured for the traffic light by the plurality of simultaneous exposures, the image recognition information acquired from the recognition unit 204, and the detection information of the traffic light detection unit 405-3 (step S517). The traffic light development processing unit 404-3 develops the image captured for the traffic light to the gradation, color tone, and image quality for the traffic light based on the traffic light correction amount (step S518).
- the development processing unit 404 carries out development processing for LED flicker countermeasures (step S512).
- As described above, the image pickup apparatus 200 is equipped with a plurality of simultaneous exposure functions, a plurality of detection functions, and a plurality of development functions. With such a configuration, the image pickup apparatus 200 can maintain stable exposure even under sudden changes in the subject while suppressing LED flicker. As a result, the image pickup apparatus 200 can perform optimum development processing for a plurality of subjects without their exposures depending on one another, and can reflect the error amount immediately during development. Further, the image pickup apparatus 200 can maintain the subject detection function.
- Examples of states where image recognition is not possible include the entrance and exit of a tunnel and exposure to the headlights of an oncoming vehicle.
- Examples of states in which the light color of a traffic light cannot be recognized include a state in which the traffic light appears extinguished (a state in which all the light colors are extinguished due to flicker) and a state in which the red and yellow colors of the traffic light cannot be distinguished due to the influence of sunlight.
- In the following, an imaging device will be described that determines, based on the image recognition result, a problem scene in which the light color of the traffic light cannot be recognized, and adaptively controls automatic exposure in that problem scene.
- Examples of such problem scenes include a state in which the traffic light appears extinguished (a state in which all the light colors are turned off by flicker) and a state in which red and yellow cannot be distinguished due to the influence of sunlight.
- Examples of the adaptive control of automatic exposure in such a problem scene include variable control of the convergence speed of automatic exposure and variable control of the automatic exposure detection region. Details of the problem scene determination method and the adaptive control of automatic exposure will be described later.
- FIG. 6 schematically shows a functional configuration example of the image pickup apparatus 600 according to the second embodiment. It is assumed that the image pickup apparatus 600 is mainly mounted on a vehicle and used.
- the illustrated image pickup device 600 includes a lens 601, an image sensor 602, a signal processing unit 603, a recognition unit 604, a determination unit 605, and a control unit 606.
- The image sensor 602 is configured using elements such as a CMOS or CCD, captures the image formed on the imaging surface by the lens 601, and performs HDR composition.
- The signal processing unit 603 performs development processing on the RAW data output from the image sensor 602. For example, demosaicing, noise reduction, white balance adjustment, gamma correction, sensor spectral correction, edge correction, and YC conversion correspond to development processing.
- the recognition unit 604 recognizes an object included in the captured image after processing by the signal processing unit 603.
- The recognition unit 604 basically recognizes peripheral vehicles and lanes as the objects used by the determination unit 605 in the subsequent stage to determine the problem scene.
- the recognition unit 604 can further incorporate other objects such as motorcycles, bicycles, pedestrians, road signs, traffic lights, guardrails, roadside trees and street lights into the recognition target.
- The determination unit 605 determines a problem scene in which the light color of the traffic light cannot be recognized, based on the image recognition result of the recognition unit 604. Specifically, the determination unit 605 determines a state in which the traffic light appears extinguished (a state in which all the light colors are extinguished due to flicker) and a state in which the red and yellow colors of the traffic light cannot be distinguished due to the influence of sunlight.
- the determination unit 605 calculates the degree of contrast of the image based on the OPD detection.
- The determination unit 605 calculates the degree of contrast by two methods: comparing the detection value of the road surface region in the captured image with the detection values of the other regions, and judging from the shape of the histogram (the brightness distribution of the image).
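- The two contrast measures could be sketched as follows; the percentile-spread proxy for the histogram-shape method is an assumption:

```python
import numpy as np

def contrast_by_region(image: np.ndarray, road_mask: np.ndarray) -> float:
    """Method 1: compare the detection value of the road surface region
    with the detection value of the remaining regions."""
    road = float(image[road_mask].mean())
    rest = float(image[~road_mask].mean())
    return abs(road - rest) / 255.0

def contrast_by_histogram(image: np.ndarray) -> float:
    """Method 2: judge from the histogram shape; here the spread between
    the 5th and 95th luminance percentiles stands in for it."""
    lo, hi = np.percentile(image, [5, 95])
    return float(hi - lo) / 255.0
```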
- In a problem scene in which the determination unit 605 determines that the light color of the traffic light cannot be recognized, the control unit 606 controls the image pickup operation of the image sensor 602 and the development processing of the signal processing unit 603 to adaptively control automatic exposure. Specifically, the control unit 606 implements variable control of the convergence speed of automatic exposure and variable control of the automatic exposure detection region as the adaptive control of automatic exposure in the problem scene.
- Peripheral vehicles can be recognized with high accuracy, or at a high recognition rate, by recognizing an image captured by the image sensor 602 and developed by the signal processing unit 603 under the control of the control unit 606. Based on such an image recognition result, the vehicle control system 100 performs vehicle control for automated driving or ADAS, such as inter-vehicle distance control (ACC), lane departure warning (LDW), lane keeping assist (LKA), automatic emergency braking (AEB), and blind spot detection (BSD), and further controls the driving of each drive unit such as the active cornering light (ACL), the brake actuator (BRK), and the steering device (STR). This can contribute to the safe driving of the own vehicle.
- In the image pickup apparatus 600 according to the second embodiment, as in the first embodiment (that is, the image pickup apparatus 200 shown in FIG. 4), the image sensor 602 performs a plurality of exposures, and the signal processing unit 603 and the detection unit have a plurality of systems, one for each object. Therefore, hereinafter, the configuration in common with the first embodiment (that is, everything except the determination unit 605) may be described using the names and reference numerals of FIG. 4.
- FIG. 7 shows a basic operation procedure for automatic exposure control in the image pickup apparatus 600 shown in FIG. 6 in the form of a flowchart.
- the determination unit 605 acquires the image recognition information by the recognition unit 604 (step S701).
- The recognition unit 604 can recognize objects such as the road surface, lanes on the road surface, vehicles, and traffic lights from the image captured by the image sensor 602. Based on the image recognition result of the recognition unit 604, detection frames can then be arranged on the ground (road surface), the sky, vehicles, traffic lights, and the like in the captured image.
- Next, the determination unit 605 calculates a traffic light region based on the image recognition information from the recognition unit 604 and extracts the signal lights, arrow lights, and the like from the calculated traffic light region (step S702). The determination unit 605 further acquires the detection value of the traffic light region (step S703).
- The determination unit 605 calculates the degree of contrast of the image, the brightness (luminance) of the screen, and the color tone (ratio) of the screen based on OPD detection. Specifically, the determination unit 605 calculates the degree of contrast by comparing the detection value of the traffic light region in the captured image with the detection values of the other regions, and also calculates the degree of contrast based on the shape of the histogram (the luminance distribution of the image).
- The determination unit 605 checks whether or not the detection value of the traffic light region acquired in step S703 is equal to or greater than the lower limit threshold value and less than the upper limit threshold value (that is, within a predetermined range) (step S704). In short, the determination unit 605 checks whether or not the traffic light region is blown out (overexposed) or degraded by reflections.
- the determination unit 605 determines whether or not the recognition unit 604 can recognize the blinking of the traffic light (step S705).
- Here, "the blinking of the traffic light can be recognized" means that the traffic light is lit and the likelihood (reliability) of the recognition is high.
- Conversely, "the blinking of the traffic light cannot be recognized" means, for example, that the traffic light region is blown out or degraded by reflections, that the traffic light is lit but the light color cannot be distinguished between red and yellow, or that the traffic light appears extinguished (a state in which all the light colors are extinguished by flicker).
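- The checks of steps S704 and S705 can be condensed into a sketch like the following, with all thresholds assumed:

```python
LOWER_TH, UPPER_TH = 0.05, 0.95   # assumed normalized detection bounds

def blinking_recognizable(detect_value: float, lit: bool,
                          likelihood: float,
                          likelihood_th: float = 0.8) -> bool:
    """True when the detection value of the traffic light region stays
    within the assumed bounds (neither blown out nor degraded), the
    lamp is lit, and the recognition likelihood is high."""
    in_range = LOWER_TH <= detect_value < UPPER_TH
    return in_range and lit and likelihood >= likelihood_th
```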
- the control unit 606 acquires the detection value of the traffic light region in the captured image and calculates the correction value for the traffic light used for performing the traffic light blinking correction process (step S706).
- the control unit 606 inputs the correction value for the traffic light to the image sensor 602, and controls the automatic exposure of the image sensor 602 and the development processing in the signal processing unit 603 using only the detection value in the traffic light region (step S707).
- The control unit 606 also inputs the traffic light correction value to the signal processing unit 603 (the development processing unit 404 of the signal processing unit 203 in FIG. 4) as feedback for the development processing of the next frame.
- the traffic light development processing unit 404-3 develops the image captured for the traffic light into the tone, color tone, and image quality for the traffic light based on the correction amount for the traffic light.
- FIG. 8 shows a basic operation procedure for HDR composition processing in the image pickup apparatus 200 shown in FIG. 4 in the form of a flowchart.
- the signal processing unit 203 acquires image recognition information regarding the target object (vehicle, road surface, traffic light) recognized from the captured image from the recognition unit 204 (step S801).
- the vehicle detection unit 405-1 sets the vehicle detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness of the object in the vehicle detection frame.
- the road surface detection unit 405-2 sets the road surface detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness of the object in the road surface detection frame.
- the traffic light detection unit 405-3 sets a traffic light detection frame based on the image recognition information acquired from the recognition unit 204, and detects the brightness and color tone of the object in the traffic light detection frame (step S802).
- the vehicle development processing unit 404-1 develops the image captured for the vehicle into the tone and image quality for the vehicle based on the correction amount for the vehicle.
- the road surface development processing unit 404-2 develops the image captured for the road surface into the tone and image quality for the road surface based on the correction amount for the road surface.
- the traffic light development processing unit 404-3 develops the image captured for the traffic light into the tone, color tone, and image quality for the traffic light based on the correction amount for the traffic light.
- The HDR processing unit 407 acquires the vehicle development processing image generated by the vehicle development processing unit 404-1, the road surface development processing image generated by the road surface development processing unit 404-2, and the traffic light development processing image generated by the traffic light development processing unit 404-3 (step S803).
- the HDR processing unit 407 calculates the image composition ratio when the development processing image for the vehicle, the development processing image for the road surface, and the development processing image for the traffic light are combined (step S804). For example, the HDR processing unit 407 calculates the composite ratio by giving priority to the gradation ratio of the developed image in order to expand the dynamic range. Alternatively, the HDR processing unit 407 calculates the composite ratio by giving priority to the exposure ratio of the developed image in order to reduce the flicker of the signal lamp. Alternatively, the HDR processing unit 407 calculates the composition ratio with priority given to the control amount of the developed image in order to reduce random noise.
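- One way to realize these priority-weighted composition ratios is sketched below; the cue names and weightings are assumptions for illustration:

```python
import numpy as np

def composition_ratio(cues: dict, priority: str) -> np.ndarray:
    """Per-pixel composition score for one developed image. `cues` maps
    cue names ("gradation", "exposure", "control") to per-pixel scores
    in 0..1; the prioritized cue receives a larger assumed weight."""
    return sum((0.6 if name == priority else 0.2) * score
               for name, score in cues.items())

def normalize(ratios: list) -> list:
    """Make the per-image ratios sum to 1 at every pixel before blending."""
    total = np.maximum(sum(ratios), 1e-6)
    return [r / total for r in ratios]
```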
- Next, the HDR processing unit 407 calculates a vehicle correction amount for correcting the gradation and color tone of the developed image for the vehicle, a road surface correction amount for correcting the gradation and color tone of the developed image for the road surface, and a traffic light correction amount for correcting the gradation and color tone of the developed image for the traffic light (step S805).
- For example, the HDR processing unit 407 calculates a correction amount that corrects the gradation and color tone in conjunction with the brightness and color tone obtained by recognition detection, in order to reduce bleeding of the signal lamps. The HDR processing unit 407 likewise calculates such a correction amount in order to suppress the influence of scenes such as tunnel entrances and exits, daytime, and nighttime.
- the HDR processing unit 407 corrects the developed image for the vehicle, the developed image for the road surface, and the developed image for the traffic light based on the calculated correction amount (step S805).
- Then, the HDR processing unit 407 synthesizes the corrected image for the vehicle, the corrected image for the road surface, and the corrected image for the traffic light based on the image composition ratio calculated in step S804.
- the HDR processing unit 407 generates an HDR composite image from the developed image for the vehicle, the developed image for the road surface, and the developed image for the signal device (step S806).
- For example, the HDR processing unit 407 calculates the composition ratio by giving priority to the gradation ratio of the plurality of developed images (for the vehicle, the road surface, and the traffic light) in order to expand the dynamic range, and HDR-combines the plurality of developed images based on the calculated composition ratio. As a result, an HDR composite image in which the tone at a tunnel entrance containing a vehicle is corrected is generated.
- Further, the HDR processing unit 407 calculates a composition ratio for adjusting the exposure of the plurality of developed images (for the vehicle, the road surface, and the traffic light) and removing noise in order to increase the contrast, and HDR-combines the plurality of developed images based on the calculated composition ratio. This produces an HDR composite image with reduced fog haze.
- Further, the HDR processing unit 407 calculates a composition ratio for adjusting the exposure of the plurality of developed images (for the vehicle, the road surface, and the traffic light) and correcting the color tone, and HDR-combines the plurality of developed images based on the calculated composition ratio. As a result, an HDR composite image with improved gradation under backlight is generated.
- Further, the HDR processing unit 407 corrects the edges of the plurality of developed images (for the vehicle, the road surface, and the traffic light) to correct the tone, calculates the composition ratio with priority given to resolution and color tone, and HDR-combines the plurality of developed images based on the calculated composition ratio. As a result, an HDR composite image with improved resolution of distant subjects is generated.
- Further, the HDR processing unit 407 adjusts the exposure of the plurality of developed images (for the vehicle, the road surface, and the traffic light) to remove noise. To this end, it increases the composition ratio of the developed image for the traffic light, calculates the composition ratio with priority given to the S/N ratio, and HDR-combines the plurality of developed images based on the calculated composition ratio. As a result, an HDR composite image with reduced noise in dark areas is generated.
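- These scene-dependent strategies amount to choosing which cue drives the composition ratio; a dispatch table like the following (with assumed scene names) captures the idea:

```python
# Assumed mapping from scene type to the prioritized cue; the names
# are illustrative, not terms defined in the present disclosure.
SCENE_PRIORITY = {
    "tunnel_entrance": "gradation",    # expand the dynamic range
    "fog":             "exposure",     # raise contrast, remove noise
    "backlight":       "color_tone",   # recover gradation and tint
    "distant_subject": "resolution",   # edge-corrected fine detail
    "night_dark_area": "snr",          # weight the traffic light image
}

def pick_priority(scene: str) -> str:
    """Choose the cue that drives the composition ratio for a scene."""
    return SCENE_PRIORITY.get(scene, "gradation")
```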
- Image quality adjustment systems for viewing are typically tuned on the basis of subjective evaluation (evaluation by humans), where the subjective evaluation method consists of psychological measurement and statistical processing for data analysis.
- Image quality adjusted based on an image quality standard for viewing is not optimal for sensing. For example, since the entire screen is adjusted to an appropriate overall image quality, the image quality is not switched dynamically to emphasize the resolution of a distant subject, and a subject whose distance or position changes frequently sometimes cannot be recognized. A long exposure time at night is advantageous for a blinking subject such as an LED light source, but the system does not switch dynamically to an image quality that emphasizes a specific subject. It has also been difficult to balance S/N, noise, edge enhancement, and contrast. And because changes in the subject are handled through subjective evaluation, sudden changes in image quality may not be handled in time.
- HDR composition for viewing is adjusted so that the gradation is linear, and the gradation of the entire screen is expressed in the image obtained by combining the multiple exposed images.
- HDR rendering adjusted based on image quality standards for viewing is not optimal for sensing.
- Because the gradation is adjusted to be linear and the HDR composition algorithm does not change dynamically, a blinking area such as an LED light source may not be reflected in the composition.
- With an algorithm that expands the gradation, it has been difficult to handle both changes in blinking brightness, such as those of an LED light source, and moving subjects. And because changes over time are handled through subjective evaluation, sudden changes in brightness may not be handled in time.
- the signal processing unit 203 is configured to receive feedback from the recognition unit 204.
- the recognition unit 204 instructs the signal processing unit 203 of information necessary for recognition.
- the signal processing unit 203 can acquire an image specialized for a specific image quality in a specific area.
- The signal processing unit 203 can realize an image with the optimum image quality for sensing by adjusting the image quality to be specialized for a specific region based on the fed-back information and performing HDR composition.
- When the signal processing unit 203 adjusts the image quality, the adjustment data need not be given priorities and weights through statistical processing.
- The signal processing unit 203 and the recognition unit 204 are connected (integrated) so that the image quality can follow the target subject.
- the recognition unit 204 detects a target subject from the image data.
- the recognition unit 204 notifies the signal processing unit 203 of the target subject.
- the signal processing unit 203 adjusts the image quality specifically for the target subject.
- A high-pixel-count image sensor 202 can immediately follow the target subject over a wide area. A high-frame-rate image sensor 202 can immediately follow a fast-moving target subject. A high-sensitivity image sensor 202 can immediately follow the target subject over a wide illuminance range. An image sensor 202 capable of capturing a plurality of images can immediately follow a plurality of target subjects.
- the signal processing unit 203 can adjust to the optimum image quality without adjustment data based on subjective evaluation.
- The signal processing unit 203 can adjust the image quality to be specific to the subject without assigning priorities and weights through statistical processing. Since the signal processing unit 203 does not perform statistical processing on adjustment data, the signal processing can be sped up. And since the signal processing unit 203 generates image quality specialized for the target subject, the recognition accuracy can be improved.
- A high-pixel-count image sensor 202 allows the image quality to be adjusted over a wide field of view without being affected by the composition. A high-frame-rate image sensor 202 immediately yields image quality that follows a moving subject. A high-sensitivity image sensor 202 allows the image quality to be adjusted over a wide illuminance range without being affected by the brightness. An image sensor 202 capable of capturing a plurality of images yields image quality that follows a plurality of subjects.
- Connecting the signal processing unit 203 and the recognition unit 204 in this way can also be applied to a conventional signal processing unit 203, yielding image quality more specialized for the target subject.
- Further, the priority can be set according to the distance of the subject, and image quality specialized for the target subject can be obtained.
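- The recognition-to-signal-processing feedback can be sketched as follows; the parameter presets and the distance-based priority are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class RecognitionFeedback:
    box: tuple          # detection frame of the target subject
    label: str          # e.g. "vehicle", "road", "traffic_light"
    distance_m: float   # used to derive the priority

def tune_for_subject(params: dict, fb: RecognitionFeedback) -> dict:
    """Specialize the development parameters for the notified subject;
    the per-label presets below are placeholders, not disclosed values."""
    presets = {
        "vehicle":       {"edge_gain": 1.4, "digital_gain": 1.1},
        "road":          {"edge_gain": 1.1, "digital_gain": 1.0},
        "traffic_light": {"edge_gain": 1.0, "digital_gain": 0.9,
                          "saturation": 1.3},
    }
    out = dict(params)
    out.update(presets.get(fb.label, {}))
    out["roi"] = fb.box                        # tune inside the frame
    out["priority"] = 1.0 / max(fb.distance_m, 1.0)
    return out
```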
- (1) An information processing device comprising: a recognition unit that recognizes a first object from first image data generated by a processing unit developing image pickup data output by an image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- (2) The information processing device according to (1) above, wherein the recognition unit recognizes a second object different from the first object from the first image data, and the processing unit detects the tone of the second object, calculates a second correction amount for correcting the tone, and further generates separate second image data, which is data obtained by correcting the tone of the first image data based on the second correction amount.
- (3) The information processing device according to (2) above, wherein the processing unit calculates an image composition ratio for synthesizing a plurality of different second image data and generates a composite image obtained by synthesizing the plurality of different second image data based on the image composition ratio.
- (4) The information processing device according to (1) or (2) above, further comprising a control unit that controls image pickup of the image sensor based on the first correction amount and the second correction amount.
- (5) The information processing device according to any one of (1) to (4) above, further comprising a determination unit that determines blinking of the first object, wherein the processing unit obtains the first correction amount calculated based on the determination result.
- (6) The information processing device according to (5) above, further comprising a control unit that controls image pickup of the image sensor based on the first correction amount calculated based on the determination result.
- (7) An information processing device comprising: a recognition unit that recognizes a first object and a second object different from the first object, respectively, from a plurality of different first image data generated by a processing unit developing a plurality of different image pickup data output by an image sensor, the image pickup data having been captured simultaneously under different image pickup controls; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates one second image data used for recognizing the first object, which is data obtained by correcting the color tone of one of the first image data based on the first correction amount, and which detects the tone of the second object, calculates a second correction amount for correcting the tone, and generates separate second image data used for recognizing the second object, which is data obtained by correcting the tone of a separate first image data based on the second correction amount.
- (8) The information processing device according to (7) above, wherein the processing unit calculates an image composition ratio for synthesizing a plurality of different second image data and generates a composite image obtained by synthesizing the plurality of different second image data based on the image composition ratio.
- (9) The information processing device according to (7) or (8) above, further comprising a control unit that controls image pickup of the image sensor based on the first correction amount and the second correction amount.
- (10) The information processing device according to any one of (7) to (9) above, further comprising a determination unit that determines blinking of the first object, wherein the processing unit obtains the first correction amount calculated based on the determination result.
- (11) The information processing device according to (10) above, further comprising a control unit that controls image pickup of the image sensor based on the first correction amount calculated based on the determination result.
- (12) The information processing device according to any one of (1) to (11) above, wherein the image sensor is mounted on a moving body and used, and the first object is a traffic light.
- (13) An information processing method comprising: recognizing a first object from first image data generated by developing image pickup data output by an image sensor; and detecting the color tone of the first object, calculating a first correction amount for correcting the color tone, and generating second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- (14) An image pickup apparatus comprising: an image sensor; a recognition unit that recognizes a first object from first image data generated by a processing unit developing image pickup data output by the image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- (15) A moving body device comprising: an image sensor; a recognition unit that recognizes a first object from first image data generated by a processing unit developing image pickup data output by the image sensor; the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount; and an operation control unit that controls operation of the moving body based on the result of recognizing the second image data.
- (16) A computer program written in a computer-readable format so as to cause a computer to function as: a recognition unit that recognizes a first object from first image data generated by a processing unit developing image pickup data output by an image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data based on the first correction amount.
- (17) A non-transient computer-readable storage medium recording a computer program written in a computer-readable format so as to cause a computer to function as the recognition unit and the processing unit described in (16) above.
- 100 ... Vehicle control system, 101 ... Input unit, 102 ... Data acquisition unit, 103 ... Communication unit, 104 ... In-vehicle equipment, 105 ... Output control unit, 106 ... Output unit, 107 ... Drive system control unit, 108 ... Drive system, 109 ... Body system control unit, 110 ... Body system, 111 ... Storage unit, 112 ... Automatic operation control unit, 121 ... Communication network, 131 ... Detection unit, 132 ... Self-position estimation unit, 133 ... Situation analysis unit, 134 ... Planning unit, 135 ... Operation control unit, 141 ... Out-of-vehicle information detection unit, 142 ... In-vehicle information detection unit, 143 ... Vehicle condition detection unit, 151 ... Map analysis unit, 152 ... Traffic rule recognition unit, 153 ... Situation recognition unit, 154 ... Situation prediction unit, 161 ... Route planning unit, 162 ... Action planning unit, 163 ... Operation planning unit, 171 ... Emergency avoidance unit, 172 ... Acceleration/deceleration control unit, 173 ... Direction control unit, 200 ... Imaging device, 201 ... Lens, 202 ... Image sensor, 203 ... Signal processing unit, 204 ... Recognition unit, 205 ... Control unit, 301 ... Shutter, 302 ... Element unit, 303 ... Analog gain processing unit, 304 ... Development processing unit, 305 ... Detection unit, 306 ... Comparison unit, 401 ... Shutter, 402 ... Element unit, 403 ... Analog gain processing unit, 404-1 ... Vehicle development processing unit, 404-2 ... Road surface development processing unit, 404-4 ... Entire screen development processing unit, 405-1 ... Vehicle detection unit, 405-2 ... Road surface detection unit, 405-4 ... Entire screen detection unit, 406 ... Comparison unit, 407 ... HDR processing unit, 600 ... Imaging device, 601 ... Lens, 602 ... Image sensor, 603 ... Signal processing unit, 604 ... Recognition unit, 605 ... Determination unit, 606 ... Control unit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Exposure Control For Cameras (AREA)
- Studio Devices (AREA)
- Traffic Control Systems (AREA)
Abstract
This information processing device comprises: a recognition unit that recognizes a first object from first image data generated when a processing unit develops image capture data output by an image sensor; and the processing unit, which detects the color tone of the first object, calculates a first correction amount for correcting the color tone, and generates second image data, which is data obtained by correcting the color tone of the first image data on the basis of the first correction amount. The recognition unit recognizes a second object, different from the first object, using the first image data. The processing unit detects the tonal gradation of the second object, calculates a second correction amount for correcting the tonal gradation, and further generates separate second image data, which is data obtained by correcting the tonal gradation of the first image data on the basis of the second correction amount.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020138556A JP2022034720A (ja) | 2020-08-19 | 2020-08-19 | 情報処理装置及び情報処理方法、撮像装置、移動体装置、並びにコンピュータプログラム |
| JP2020-138556 | 2020-08-19 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022038981A1 true WO2022038981A1 (fr) | 2022-02-24 |
Family
ID=80350356
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/027934 Ceased WO2022038981A1 (fr) | 2020-08-19 | 2021-07-28 | Dispositif et procede de traitement d'informations, dispositif de capture d'image, dispositif mobile et programme informatique |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2022034720A (fr) |
| WO (1) | WO2022038981A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102540632B1 (ko) * | 2022-10-27 | 2023-06-13 | 주식회사 모빌테크 | 색상 보정을 적용한 컬러맵 생성 방법 및 이를 실행하기 위하여 기록매체에 기록된 컴퓨터 프로그램 |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011254340A (ja) * | 2010-06-03 | 2011-12-15 | Hitachi Ltd | 撮像装置 |
| JP2020068008A (ja) * | 2018-10-19 | 2020-04-30 | ソニー株式会社 | センサ装置、パラメータ設定方法 |
-
2020
- 2020-08-19 JP JP2020138556A patent/JP2022034720A/ja active Pending
-
2021
- 2021-07-28 WO PCT/JP2021/027934 patent/WO2022038981A1/fr not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011254340A (ja) * | 2010-06-03 | 2011-12-15 | Hitachi Ltd | 撮像装置 |
| JP2020068008A (ja) * | 2018-10-19 | 2020-04-30 | ソニー株式会社 | センサ装置、パラメータ設定方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2022034720A (ja) | 2022-03-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7646358B2 (ja) | 情報処理装置及び情報処理方法、撮像装置、移動体装置、並びにコンピュータプログラム | |
| KR102533860B1 (ko) | 화상 처리 장치 및 화상 처리 방법 | |
| KR102685934B1 (ko) | 정보 처리 장치, 이동체, 제어 시스템, 정보 처리 방법 및 프로그램 | |
| JP6939283B2 (ja) | 画像処理装置、および画像処理方法、並びにプログラム | |
| JP7226440B2 (ja) | 情報処理装置、情報処理方法、撮影装置、照明装置、及び、移動体 | |
| JPWO2019082669A1 (ja) | 情報処理装置、情報処理方法、プログラム、及び、移動体 | |
| US11272115B2 (en) | Control apparatus for controlling multiple camera, and associated control method | |
| KR102749769B1 (ko) | 노광 제어 장치, 노광 제어 방법, 프로그램, 촬영 장치, 및 이동체 | |
| JP6977722B2 (ja) | 撮像装置、および画像処理システム | |
| US20230045772A9 (en) | Information processing apparatus, information processing method, and program | |
| CN110012215B (zh) | 图像处理装置和图像处理方法 | |
| WO2022038981A1 (fr) | Dispositif et procede de traitement d'informations, dispositif de capture d'image, dispositif mobile et programme informatique | |
| JPWO2018056070A1 (ja) | 信号処理装置、撮影装置、及び、信号処理方法 | |
| JP7318656B2 (ja) | 画像処理装置と画像処理方法およびプログラム | |
| WO2021229983A1 (fr) | Dispositif et programme de capture d'image | |
| WO2022085479A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme | |
| KR20200119790A (ko) | 인식 장치와 인식 방법 그리고 프로그램 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21858130 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21858130 Country of ref document: EP Kind code of ref document: A1 |