
WO2017034323A1 - Image processing device and method for adaptively improving low illumination, and object detection device using same - Google Patents

Image processing device and method for adaptively improving low illumination, and object detection device using same

Info

Publication number
WO2017034323A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel value
image
pixel
target
input image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2016/009394
Other languages
English (en)
Korean (ko)
Inventor
정순기
구재호
쟈베드샤지드
오선호
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industry Academic Cooperation Foundation of KNU
Original Assignee
Industry Academic Cooperation Foundation of KNU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industry Academic Cooperation Foundation of KNU filed Critical Industry Academic Cooperation Foundation of KNU
Publication of WO2017034323A1 publication Critical patent/WO2017034323A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/36Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering

Definitions

  • the present invention relates to an image processing apparatus and method, and an object detecting apparatus using the same.
  • In general, a method of confirming events from video captured by a camera is used.
  • An event may be identified from video either by a user who watches the video and directly determines whether a specific event has occurred, or by a computer that automatically determines whether an event has occurred by detecting objects in the video through image processing.
  • Whether a human visually checks an image or a computer detects objects through image processing, the image must have a certain level of illuminance in order to determine whether an event has occurred. When an object is detected by processing a low-light image, the detection result is inaccurate and its reliability is limited, and even when a human visually checks the image, it is difficult to identify objects in footage captured in a dark environment.
  • An embodiment of the present invention provides an image processing apparatus and method, and an object detecting apparatus, that improve the accuracy and reliability of image processing results by enhancing low-light images.
  • An embodiment of the present invention provides an image processing apparatus and method, and an object detecting apparatus, that solve the problem of object identification becoming impossible because pixel values are partially saturated in the result image after illuminance-improving image processing.
  • An embodiment of the present invention provides an image processing apparatus and method, and an object detecting apparatus, that mitigate the color distortion of the result image after image processing.
  • An embodiment of the present invention provides an image processing apparatus and method, and an object detecting apparatus, that enable real-time processing by reducing the amount of computation required for image processing.
  • An image processing apparatus according to an embodiment of the present invention may include: a pixel value coordinate generator for generating pixel value coordinates comprising the pixel value of a target pixel in a target image obtained from an input image and the pixel value of a reference pixel corresponding to the target pixel in a reference image; a reference pixel value determiner for determining a reference pixel value corresponding to each pixel value of the target image based on the pixel value coordinates; and a pixel value changer configured to change the pixel value of each pixel in the input image to the reference pixel value corresponding to that pixel value.
  • the pixel value coordinate generator may generate two-dimensional pixel value coordinates including a pixel value of the target pixel and a pixel value of the reference pixel.
  • the target image may be a background image obtained by separating the foreground from the input image.
  • the target image may be a background image frame obtained by separating the foreground from an image frame obtained at predetermined time intervals among consecutive input images.
  • the reference image may be a background image obtained by separating the foreground from an image of a scene identical to the input image in an environment brighter than the input image.
  • The reference pixel value determiner may obtain a relational expression of the reference pixel value with respect to the pixel value of the target image by calculating the equation of a figure that minimizes the sum of the distances from the pixel value coordinate points corresponding to the pixel value coordinates in the coordinate system representing those coordinates.
  • the reference pixel value determiner may classify the pixel value coordinates into a plurality of groups and calculate an equation of the figure for each group.
  • the reference pixel value determiner may classify the pixel value coordinates based on a boundary obtained by parallelly moving the figure in the coordinate system.
  • the reference pixel value determiner may adjust the equation of the figure for the group by moving the figure for at least one of the other groups toward the figure for one group in the coordinate system.
  • The pixel value changer may change each pixel value of the input image to the reference pixel value obtained by substituting that pixel value into the variable corresponding to the pixel value of the target image in the relational expression.
  • The image processing apparatus may further include a color space model converter configured to convert a first color space model used in the input image, the target image, and the reference image into a second color space model comprising luminance and chromaticity, and to convert the input image whose pixel values have been changed from the second color space model back into the first color space model.
  • the pixel value coordinate generator may generate the pixel value coordinates by using the luminance pixel value corresponding to the luminance of the target pixel and the luminance pixel value corresponding to the luminance of the reference pixel.
  • The image processing apparatus may further include a chromaticity pixel value changer for changing the chromaticity pixel value corresponding to the chromaticity of each pixel in the input image, based on the chromaticity pixel values of the target pixels in the target image and the chromaticity pixel values of the reference pixels in the reference image.
  • The chromaticity pixel value changer may obtain the average of the chromaticity pixel values of the target pixels and the average of the chromaticity pixel values of the reference pixels, subtract the average of the chromaticity pixel values of the target pixels from the chromaticity pixel value of each pixel in the input image, and then add the average of the chromaticity pixel values of the reference pixels.
  • An image processing method according to an embodiment of the present invention is a method of processing an image using an image processing apparatus, and may include: generating pixel value coordinates comprising the pixel value of a target pixel in a target image obtained from an input image and the pixel value of a reference pixel corresponding to the target pixel in a reference image; determining a reference pixel value corresponding to each pixel value of the target image based on the pixel value coordinates; and changing the pixel value of each pixel in the input image to the corresponding reference pixel value.
  • An image processing method may be implemented as a computer executable program and recorded on a computer readable recording medium.
  • An image processing method may be implemented as a computer program stored in a medium for execution in combination with a computer.
  • An object detecting apparatus according to an embodiment of the present invention may include: a preprocessor for preprocessing an input image; a background separator for separating a background from the preprocessed input image to obtain a foreground image; and an object detector configured to detect an object using the foreground image.
  • The preprocessor may include: a pixel value coordinate generator for generating pixel value coordinates comprising the pixel value of a target pixel in a target image obtained from the input image and the pixel value of a reference pixel corresponding to the target pixel in a reference image; a reference pixel value determiner for determining a reference pixel value corresponding to each pixel value of the target image based on the pixel value coordinates; and a pixel value changer configured to change the pixel value of each pixel in the input image to the reference pixel value corresponding to that pixel value.
  • The object detecting apparatus may further include a brightness analyzer that analyzes the brightness of the input image, transmits the input image to the preprocessor when the brightness is below a preset threshold, and transmits the input image to the background separator when the brightness is at or above the threshold.
  • the target image may be a background image obtained by separating the foreground from the input image.
  • the target image may be a background image frame obtained by separating the foreground from an image frame obtained at predetermined time intervals among consecutive input images.
  • the reference image may be a background image obtained by separating the foreground from an image of a scene identical to the input image in an environment brighter than the input image.
  • the accuracy and reliability of the image processing result may be improved by improving the low light image.
  • According to an embodiment of the present invention, real-time processing may be achieved because the amount of computation required for image processing is reduced and the processing speed is thereby improved.
  • FIG. 1 is an exemplary block diagram of an object detecting apparatus according to an embodiment of the present invention.
  • FIG. 2 is a view schematically showing the configuration of a preprocessor according to an embodiment of the present invention.
  • FIG. 3 is an example of an input image that is image-processed according to an embodiment of the present invention.
  • FIG. 4 is an example of a reference image used to process the input image of FIG. 3.
  • FIG. 5 is a graph illustrating pixel value coordinates generated from a target image obtained from the input image of FIG. 3 and a reference image of FIG. 4, according to an exemplary embodiment.
  • FIG. 6 is a graph illustrating a straight line calculated from the pixel value coordinate points of FIG. 5 according to an exemplary embodiment of the present invention.
  • FIG. 7 is a graph illustrating a state in which pixel value coordinates are classified into a plurality of groups according to another exemplary embodiment of the present invention.
  • FIG. 8 is a graph illustrating a straight line calculated for each group of pixel value coordinates according to another exemplary embodiment of the present invention.
  • FIG. 9 is a result image obtained by processing the input image of FIG. 3 based on the straight lines shown in FIG. 8, according to another embodiment of the present invention.
  • FIG. 10 is a graph showing straight lines whose equations have been adjusted according to another embodiment of the present invention.
  • FIG. 11 is a result image obtained by processing the input image of FIG. 3 based on the adjusted straight lines shown in FIG. 10, according to another embodiment of the present invention.
  • FIG. 12 is an exemplary flowchart of an object detection method according to an embodiment of the present invention.
  • FIG. 13 is an exemplary flowchart of a preprocessing process according to an embodiment of the present invention.
  • FIG. 14 is an exemplary flowchart of a process of changing a chromaticity pixel value of each pixel in an input image according to an embodiment of the present invention.
  • The terms '~unit', '~er', '~block', '~module', and the like used throughout the present specification may mean a unit for processing at least one function or operation.
  • Such a unit may be implemented as software, or as a hardware component such as an FPGA or an ASIC.
  • However, '~unit', '~er', '~block', '~module', and the like are not limited to software or hardware; such a unit may be configured to reside in an addressable storage medium or to run on one or more processors.
  • Accordingly, a '~unit', '~er', '~block', or '~module' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
  • The functions provided by the components and the '~units', '~ers', '~blocks', and '~modules' may be combined into a smaller number of components, or further separated into additional components and units.
  • FIG. 1 is an exemplary block diagram of an object detecting apparatus 10 according to an embodiment of the present invention.
  • the object detecting apparatus 10 may include a preprocessor 110, a background separator 120, and an object detector 130.
  • the preprocessor 110 preprocesses the input image.
  • the background separator 120 separates a background from the preprocessed input image to obtain a foreground image.
  • the object detector 130 detects an object by using the foreground image.
  • The object detecting apparatus 10 and the components included therein according to an embodiment of the present invention are processors capable of processing image data according to a predetermined algorithm, and may be implemented by, for example, a CPU or a GPU, but are not limited thereto.
  • The processor may process image data by calling and executing a program stored in a storage device such as an HDD or an SSD.
  • The processed image may be provided to an output device such as a display and shown on a screen, transmitted to another device that uses the processed image, or stored in the storage device.
  • The image processed according to an embodiment of the present invention may be, for example, a surveillance image captured by a CCTV camera that photographs one spatial region from the same position in the same posture, but is not limited thereto; according to an embodiment, the image processing of the present invention may also be applied to images captured while the camera moves dynamically.
  • The object detecting apparatus 10 may determine whether a specific event has occurred by processing an input image and detecting objects from it. For example, the object detecting apparatus 10 may detect and track a moving object in the video, discover an object that appears in the monitored space from a certain point in time, or recognize an object that disappears from the monitored space from a certain point in time.
  • the preprocessor 110 preprocesses the input image to improve the accuracy of the detection result before detecting the object from the image.
  • the preprocessing unit 110 may improve the accuracy and reliability of various image processing results including object detection by improving an input image having low illumination taken at night time or in an environment in which lighting is insufficient.
  • FIG. 2 is a view schematically showing the configuration of the pre-processing unit 110 according to an embodiment of the present invention.
  • the preprocessor 110 may include a pixel value coordinate generator 112, a reference pixel value determiner 113, and a pixel value changer 114.
  • The pixel value coordinate generator 112 may generate pixel value coordinates comprising the pixel value of a target pixel in the target image obtained from the input image and the pixel value of the reference pixel corresponding to the target pixel in the reference image.
  • the reference pixel value determiner 113 may determine a reference pixel value corresponding to the pixel value of the target image based on the pixel value coordinates.
  • the pixel value changing unit 114 may change the pixel value of each pixel in the input image into the reference pixel value corresponding to the pixel value.
  • The preprocessor 110 may improve an image by changing the pixel values of an input image (e.g., a low-light image) using a target image and a reference image.
  • the target image may be obtained from an input image.
  • the target image may be a background image obtained by separating the foreground from the input image.
  • When the preprocessor 110 uses the input image itself as the target image, the target image may be identical to the input image.
  • The reference image is an image of the same scene as the input image, captured in a brighter environment than the input image. Furthermore, according to an embodiment, the reference image may be a background image obtained by separating the foreground from such an image.
  • For example, when the preprocessor 110 receives, as the input image, a surveillance image that continuously photographs an outdoor space, the reference image may be a frame captured at a time when the scene is brightest and contains few low-light regions such as shade (e.g., a daytime frame).
  • the pixel value coordinate generator 112 may generate pixel value coordinates including a pixel value of a target pixel and a pixel value of a reference pixel in a reference image for each target pixel in the target image.
  • Specifically, the pixel value coordinate generator 112 may generate two-dimensional pixel value coordinates (P_T, P_R) comprising the pixel value of a target pixel and the pixel value of a reference pixel.
  • FIG. 3 is an example of an input image that is image-processed according to an embodiment of the present invention.
  • FIG. 4 is an example of a reference image used to process the input image of FIG. 3.
  • FIG. 5 is a graph illustrating pixel value coordinates generated from a target image obtained from the input image of FIG. 3 and a reference image of FIG. 4, according to an exemplary embodiment.
  • the preprocessor 110 may improve the illumination of the image by preprocessing the input image using the reference image as shown in FIG. 4.
  • the reference image of FIG. 4 is an image of the same outdoor space as the input image of FIG. 3 in the same posture at the same position, except that the photographing time is in daytime instead of night.
  • The pixel value coordinate generator 112 generates two-dimensional pixel value coordinates (P_T, P_R) composed of the pixel values of the target pixels in the target image and the pixel values of the reference pixels in the reference image. Plotting the coordinate points corresponding to the coordinates (P_T, P_R) on a two-dimensional coordinate plane yields the graph of FIG. 5.
  • In FIG. 5, the horizontal axis corresponds to the pixel value of the target image, and the vertical axis corresponds to the pixel value of the reference image.
  • Here, the target image is a background image obtained by separating the foreground from the input image, and the reference image is likewise a background image obtained by separating the foreground from an image of the same scene captured in a brighter environment (that is, during daytime) than the input image.
  • Each pixel value coordinate (P_T, P_R) has the pixel value P_T of a target pixel in the target image as one coordinate value and the pixel value P_R of the corresponding reference pixel in the reference image as the other coordinate value.
  • The target pixel and the corresponding reference pixel are pixels at the same position in their respective images.
  • Since the imaging times differ, the pixel value P_T of the target pixel and the pixel value P_R of the reference pixel generally do not coincide, and it can be seen that the pixel value coordinate points are scattered over the coordinate plane; a sketch of how such coordinates can be formed follows.
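  • As a concrete illustration (not part of the original disclosure), the pixel value coordinates can be formed by pairing co-located pixel values; the following is a minimal sketch assuming the target and reference images are equal-sized 8-bit grayscale (luminance) numpy arrays:

```python
import numpy as np

def pixel_value_coordinates(target_img, reference_img):
    """Pair each target pixel value P_T with the pixel value P_R of the
    reference pixel at the same position, giving 2-D coordinates (P_T, P_R)."""
    assert target_img.shape == reference_img.shape
    p_t = target_img.astype(np.float64).ravel()
    p_r = reference_img.astype(np.float64).ravel()
    return p_t, p_r
```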
  • The reference pixel value determiner 113 may determine, based on the pixel value coordinates (P_T, P_R), a reference pixel value corresponding to each pixel value of the target image.
  • Specifically, the reference pixel value determiner 113 may obtain a relational expression of the reference pixel value with respect to the pixel value of the target image by calculating the equation of a figure that minimizes the sum of the distances from the pixel value coordinate points in the coordinate system representing the coordinates (P_T, P_R).
  • In the present embodiment, because the pixel value coordinates are two-dimensional, the reference pixel value determiner 113 calculates the equation of a straight line represented by a linear function.
  • However, the reference pixel value determiner 113 may also calculate equations of other figures, such as curves expressed by higher-order functions, based on the two-dimensional pixel value coordinates (P_T, P_R).
  • If the pixel value coordinates have three or more dimensions, the equation of a spatial figure such as a plane or a curved surface may be calculated.
  • FIG. 6 is a graph illustrating a straight line calculated from the pixel value coordinate points of FIG. 5 according to an exemplary embodiment of the present invention.
  • Referring to FIG. 6, the reference pixel value determiner 113 may calculate the equation of the straight line l_m that minimizes the sum of the distances from the pixel value coordinate points corresponding to the pixel value coordinates (P_T, P_R).
  • In doing so, the reference pixel value determiner 113 may measure each distance in the direction parallel to the coordinate axis corresponding to the pixel value P_R of the reference pixel (that is, the vertical axis in FIG. 6).
  • As described above, the reference pixel value determiner 113 may obtain a relational expression of the reference pixel value P_R with respect to the pixel value P_T of the target image by calculating the equation of the figure (e.g., a straight line, curve, broken line, plane, or curved surface) that minimizes the sum of the distances from the pixel value coordinate points obtained from the target image and the reference image.
  • the pixel value changing unit 114 may change the pixel value of each pixel in the input image into a reference pixel value corresponding to the pixel value.
  • Specifically, the pixel value changer 114 may obtain the reference pixel value P_R by substituting each pixel value of the input image into the variable P_T corresponding to the pixel value of the target image in the relational expression (that is, the equation of the figure) obtained by the reference pixel value determiner 113, and may change that pixel value of the input image to the obtained reference pixel value P_R, as sketched below.
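  • A minimal sketch of these two steps, using the coordinates from the sketch above; ordinary least squares (which minimizes the sum of squared vertical offsets) is used here as a stand-in for the distance-sum criterion described in the text:

```python
import numpy as np

def fit_relation(p_t, p_r):
    """Fit the line P_R = a * P_T + b through the coordinate points,
    minimizing the squared offsets along the P_R axis (the line l_m)."""
    a, b = np.polyfit(p_t, p_r, deg=1)
    return a, b

def enhance(input_img, a, b):
    """Substitute each input pixel value into the relation and clip
    the result to the valid 8-bit range."""
    out = a * input_img.astype(np.float64) + b
    return np.clip(out, 0, 255).astype(np.uint8)
```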
  • Through this, the preprocessor 110 may adjust the pixel values of the input image to match those of the reference image by processing the input image based on the reference image. If the input image is a low-light image captured in a dark environment and the reference image is a high-illuminance image of the same scene captured in a brighter environment, the image processing according to the embodiment described above raises the input image to a level of illuminance corresponding to the reference image.
  • the preprocessor 110 may further increase the illuminance of the image by changing the pixel value of the input image according to a plurality of different relational expressions.
  • To this end, the reference pixel value determiner 113 may classify the pixel value coordinates (P_T, P_R) into a plurality of groups and calculate the equation of a figure (e.g., a straight line) for each group.
  • FIG. 7 is a graph illustrating a state in which pixel value coordinates are classified into a plurality of groups G_h, G_m, and G_l according to another embodiment of the present invention.
  • Referring to FIG. 7, the reference pixel value determiner 113 classifies the pixel value coordinates obtained from the target image and the reference image into a plurality of groups (G_h, G_m, G_l).
  • Here the pixel value coordinates are classified into three groups, an upper group G_h, an intermediate group G_m, and a lower group G_l, but the number of groups may vary according to the embodiment.
  • The reference pixel value determiner 113 may classify the pixel value coordinates based on the pixel value P_R of the reference pixel. For example, referring to FIG. 7, coordinates with a large reference pixel value P_R constitute the upper group G_h, coordinates with a small P_R constitute the lower group G_l, and coordinates whose P_R is at an intermediate level constitute the intermediate group G_m.
  • Specifically, the reference pixel value determiner 113 may classify the pixel value coordinates based on boundaries obtained by moving the figure (that is, the straight line l_m in FIG. 7) in parallel within the coordinate system representing the pixel value coordinates.
  • For example, the reference pixel value determiner 113 may identify the coordinates belonging to the upper group G_h using the upper boundary b_h, obtained by translating the straight line l_m in the direction of increasing reference pixel value P_R, and the coordinates belonging to the lower group G_l using the lower boundary b_l, obtained by translating l_m in the direction of decreasing P_R.
  • However, the boundary or criterion for dividing the pixel value coordinates is not limited thereto, and the coordinates may be classified in various other ways depending on the embodiment; the sketch below illustrates the boundary-based variant.
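  • A minimal sketch of such a grouping, assuming the fitted line l_m from above and a hypothetical offset `delta` (not a value from the original disclosure) for the parallel boundaries b_h and b_l:

```python
import numpy as np

def classify_groups(p_t, p_r, a, b, delta):
    """Split the coordinates into G_h / G_m / G_l using the boundaries
    obtained by shifting l_m by +/- delta along the P_R axis."""
    resid = p_r - (a * p_t + b)      # signed offset above/below l_m
    upper = resid > delta            # above b_h  -> group G_h
    lower = resid < -delta           # below b_l  -> group G_l
    middle = ~(upper | lower)        # between the boundaries -> G_m
    return upper, middle, lower
```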
  • For each group of pixel value coordinates, the reference pixel value determiner 113 may calculate the equation of a figure that minimizes the sum of the distances from the coordinate points belonging to that group.
  • FIG. 8 is a graph illustrating the straight lines l_h, l_m, and l_l calculated for the groups G_h, G_m, and G_l of pixel value coordinates, according to another embodiment of the present invention.
  • Referring to FIG. 8, for each group classified through the above-described process, the reference pixel value determiner 113 may calculate a figure (a straight line in FIG. 8) that minimizes the sum of the distances from the coordinate points corresponding to the pixel value coordinates belonging to that group.
  • As a result, the reference pixel value determiner 113 obtains as many straight-line equations l_h, l_m, and l_l as there are groups, and the equation of each straight line serves, for the pixel value coordinates (P_T, P_R) belonging to that group, as the relational expression of the reference pixel value P_R with respect to the pixel value P_T of the target image.
  • Each pixel in the input image is associated with the group to which its pixel value coordinate belongs, so its pixel value may be adaptively changed by one of the plurality of relational expressions.
  • In other words, the pixel values of the input image are not adjusted uniformly by a single relation but adaptively, each according to one of the plurality of relations, so that a different degree of enhancement may be achieved for each region of the input image, as sketched below.
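  • A minimal sketch of the per-group fit and the adaptive pixel change, reusing the group masks from the sketch above (each mask entry corresponds to one pixel position, so the masks can be reshaped to the image shape):

```python
import numpy as np

def fit_group_relations(p_t, p_r, masks):
    """Fit one line per group, giving the relations l_h, l_m, l_l."""
    return [np.polyfit(p_t[m], p_r[m], deg=1) for m in masks]

def enhance_adaptive(input_img, masks, lines):
    """Change each input pixel by the relation of the group that its
    pixel value coordinate belongs to."""
    out = np.zeros(input_img.shape, dtype=np.float64)
    for mask, (a, b) in zip(masks, lines):
        m = mask.reshape(input_img.shape)
        out[m] = a * input_img[m].astype(np.float64) + b
    return np.clip(out, 0, 255).astype(np.uint8)
```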
  • FIG. 9 is a result image obtained by processing the input image of FIG. 3 based on the straight lines l_h, l_m, and l_l shown in FIG. 8, according to another embodiment of the present invention.
  • Referring to FIG. 9, the illuminance of the image is significantly improved compared with the input image of FIG. 3, making it easier to detect objects over the entire image, and it can be seen that no white regions caused by saturated pixel values appear in the image.
  • Meanwhile, the reference pixel value determiner 113 may obtain a more natural result image from the input image by adjusting the plurality of relational expressions obtained through the above process.
  • Specifically, in the coordinate system representing the pixel value coordinates (P_T, P_R), the reference pixel value determiner 113 may adjust the equation of the figure for a group by moving the figure (e.g., a straight line) for at least one of the remaining groups toward the figure for one group.
  • FIG. 10 is a graph showing the straight lines l_h', l_m, and l_l' whose equations have been adjusted according to another embodiment of the present invention.
  • Referring to FIG. 10, among the straight lines l_h, l_m, and l_l calculated in the embodiment of FIG. 8, the reference pixel value determiner 113 may obtain the straight lines l_h' and l_l' by adjusting the equations of l_h and l_l so that the straight line l_h for the upper group G_h and the straight line l_l for the lower group G_l each move in parallel toward the straight line l_m for the intermediate group G_m.
  • In other words, the relation with the largest P_R-axis intercept (that is, the largest reference pixel value P_R when the target pixel value P_T is 0; in FIG. 10, the equation of the straight line l_h) is translated in the direction of decreasing P_R, and the relation with the smallest P_R-axis intercept (in FIG. 10, the equation of the straight line l_l) is translated in the direction of increasing P_R, so that the reference pixel values P_R produced by the relations can be kept from exceeding the upper limit or falling below the lower limit of the pixel value range, as sketched below.
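  • A minimal sketch of this adjustment; the pull factor `shrink` is an assumed tuning parameter, not a value from the original disclosure:

```python
def adjust_intercepts(lines, shrink=0.5):
    """Translate l_h toward decreasing P_R and l_l toward increasing P_R
    by pulling their P_R-axis intercepts part-way toward that of l_m."""
    (a_h, b_h), (a_m, b_m), (a_l, b_l) = lines
    b_h_adj = b_h - shrink * (b_h - b_m)   # lower the largest intercept
    b_l_adj = b_l + shrink * (b_m - b_l)   # raise the smallest intercept
    return [(a_h, b_h_adj), (a_m, b_m), (a_l, b_l_adj)]
```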
  • FIG. 11 is a result image obtained by processing the input image of FIG. 3 based on the adjusted straight lines l_h', l_m, and l_l' shown in FIG. 10, according to another embodiment of the present invention.
  • Compared with the result image of FIG. 9, the result image of FIG. 11 does not unduly emphasize particular regions of the image, and it can be seen that a more natural result is obtained.
  • The result image obtained by the preprocessor 110 is useful not only when a computer detects objects from it through image processing but also when it is provided to a person who identifies objects visually, since the viewer can identify objects in the image more easily and accurately.
  • the preprocessor 110 may further include a color space model converter 111.
  • The color space model converter 111 converts the color space model used in an image. According to an exemplary embodiment of the present invention, it may convert the first color space model used in the input image, the target image, and the reference image into a second color space model that includes luminance and chromaticity, and may convert the input image whose pixel values have been changed by the pixel value changer 114 (that is, the result image) from the second color space model back into the first color space model.
  • For example, the first color space model may be an RGB color space model and the second color space model may be a CIE L*a*b* color space model, but the types of the color space models are not limited thereto.
  • the color space model converter 111 may convert the RGB color space model into another color space model including luminance and chromaticity, for example, a YCbCr color space model.
  • The input image, the target image, and the reference image converted by the color space model converter 111 into the second color space model including luminance and chromaticity are image-processed as described with reference to FIGS. 3 to 11, and the result image may then be converted back into the first color space model.
  • the image processing of FIGS. 3 to 11 may be performed on luminance pixel values corresponding to luminance among pixel values.
  • In this case, the pixel value coordinate generator 112 may generate the pixel value coordinates (P_T, P_R) using the luminance pixel value corresponding to the luminance of the target pixel and the luminance pixel value corresponding to the luminance of the reference pixel.
  • Likewise, the reference pixel value determiner 113 determines a reference luminance pixel value corresponding to each luminance pixel value of the target image based on the pixel value coordinates (P_T, P_R) for luminance, and the pixel value changer 114 may change the luminance pixel value of each pixel in the input image to the corresponding reference luminance pixel value.
  • the preprocessing unit 110 may further include a chroma pixel value changing unit 115.
  • the chroma pixel value changing unit 115 changes the chroma pixel value corresponding to the chroma of the input image.
  • Specifically, the chromaticity pixel value changer 115 may change the chromaticity pixel value corresponding to the chromaticity of each pixel in the input image based on the chromaticity pixel values corresponding to the chromaticity of the target pixels in the target image and the chromaticity pixel values corresponding to the chromaticity of the reference pixels in the reference image.
  • The chromaticity pixel value changer 115 obtains the average μ_T of the chromaticity pixel values of the target pixels and the average μ_R of the chromaticity pixel values of the reference pixels, subtracts μ_T from the chromaticity pixel value of each pixel in the input image, and then adds μ_R.
  • In other words, where P_i is the chromaticity pixel value of the input image and P_result is the chromaticity pixel value of the result image, the chromaticity pixel value changer 115 changes the chromaticity pixel value of each pixel in the input image as follows:
    P_result = P_i - μ_T + μ_R
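  • A minimal sketch of this chromaticity adjustment, assuming each argument is a numpy array holding one chromaticity channel (e.g., the a* or b* channel):

```python
import numpy as np

def change_chroma(input_chroma, target_chroma, reference_chroma):
    """P_result = P_i - mu_T + mu_R: shift the input chromaticity away
    from the target image's mean toward the reference image's mean."""
    mu_t = target_chroma.astype(np.float64).mean()
    mu_r = reference_chroma.astype(np.float64).mean()
    return input_chroma.astype(np.float64) - mu_t + mu_r
```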
  • Through this, the preprocessor 110 may mitigate the color distortion of the result image obtained by the preprocessing.
  • Without this adjustment, the pixel values of the a* and b* channels, which correspond to the chromaticity pixel values of the result image, tend to take positive values, so that red and yellow stand out and the overall color of the image shifts toward orange; the adjustment alleviates this color distortion.
  • As described above, the preprocessor 110 may improve the accuracy and reliability of subsequent image processing, such as object detection, by preprocessing the input image based on the reference image.
  • the background separator 120 may obtain a foreground image by separating a background from an input image preprocessed by the preprocessor 110. Then, the object detector 130 may detect the object by using the foreground image.
  • For example, the background separator 120 may obtain the background image and the foreground image from the preprocessed input image using at least one of Principal Component Analysis (PCA), Robust PCA (RPCA), and Online Robust PCA (ORPCA).
  • In this case, the background image can be obtained through a low-rank matrix, and the foreground image through a sparse matrix.
  • The foreground image obtained through the sparse matrix may be transmitted to the object detector 130 and used to detect various kinds of objects (e.g., moving objects), while the background image obtained through the low-rank matrix may be used by the preprocessor 110 as the target image or the reference image; a sketch of this low-rank/sparse split follows.
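  • As a rough illustration only: a truncated SVD of the stacked frames yields a low-rank part and a residual, standing in here for the low-rank/sparse decomposition that PCA/RPCA/ORPCA would compute (the rank-1 choice is an assumption, not from the original disclosure):

```python
import numpy as np

def background_foreground(frames, rank=1):
    """Stack frames as columns, then split into a low-rank (background)
    component and the residual (foreground) component."""
    h, w = frames[0].shape
    data = np.stack([f.ravel() for f in frames], axis=1).astype(np.float64)
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    low_rank = (u[:, :rank] * s[:rank]) @ vt[:rank]
    residual = data - low_rank
    background = low_rank[:, -1].reshape(h, w)   # latest frame's background
    foreground = residual[:, -1].reshape(h, w)   # latest frame's foreground
    return background, foreground
```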
  • the target image may be a background image frame obtained by separating the foreground from an image frame obtained at predetermined time intervals among consecutive input images.
  • If the target image were obtained by separating the foreground from every frame of a continuous input image, the amount of computation would increase and the processing speed would slow down.
  • Accordingly, an embodiment of the present invention acquires an image frame at predetermined time intervals (for example, every hour) from the continuous input images, separates the foreground from it, and updates and uses the resulting background image frame as the target image at each interval.
  • In this way, the embodiment of the present invention achieves a fast processing speed with a small amount of computation even when continuously input images are preprocessed using the target image and the reference image, enabling real-time processing.
  • the object detecting apparatus 10 may further include a brightness analyzer 100.
  • The brightness analyzer 100 analyzes the brightness of the input image, transmits the input image to the preprocessor 110 when the brightness is below a preset threshold, and transmits the input image to the background separator 120 when the brightness is at or above the threshold.
  • the object detecting apparatus 10 may selectively perform preprocessing according to the illuminance of the input image by analyzing the brightness of the input image through the brightness analyzer 100.
  • In other words, the embodiment of the present invention does not need to preprocess every continuously input image; by preprocessing only low-light input images whose illuminance is below the threshold (e.g., images captured at night), it can lower the amount of computation, as in the sketch below.
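  • A minimal sketch of this gating; the threshold value is an assumed parameter, not from the original disclosure:

```python
import numpy as np

LOW_LIGHT_THRESHOLD = 60.0   # assumed mean-luminance threshold (8-bit scale)

def needs_preprocessing(input_img):
    """Route a frame to the preprocessor only when its mean brightness
    falls below the threshold; otherwise it goes straight to the
    background separator."""
    return float(np.mean(input_img)) < LOW_LIGHT_THRESHOLD
```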
  • FIG. 12 is an exemplary flowchart of an object detection method 20 according to an embodiment of the present invention.
  • the object detecting method 20 may be performed by the object detecting apparatus 10 according to the above-described embodiment of the present invention.
  • the object detecting apparatus 10 may detect an object from an input image by calling and executing an object detecting program stored in a storage device.
  • The object detecting method 20 may include preprocessing an input image (S220), obtaining a foreground image by separating a background from the input image (S230), and detecting an object using the foreground image (S240).
  • The object detecting method 20 may further include analyzing the brightness of the input image (S200) before the preprocessing (S220); the input image is preprocessed when its brightness is below a preset threshold (YES in S210), and when the brightness is at or above the threshold (NO in S210) the preprocessing is skipped and the foreground image is obtained by separating the background from the input image.
  • FIG. 13 is an exemplary flowchart of a pre-processing process S220 according to an embodiment of the present invention.
  • The preprocessing process S220 may be performed by the preprocessor 110 according to the above-described embodiment of the present invention.
  • As described above, the preprocessor 110 uses a target image and a reference image to preprocess the input image.
  • the target image is a background image obtained by separating the foreground from the input image by the background separator 120.
  • the reference image may be a background image obtained by separating the foreground from an image of a scene identical to the input image in a brighter environment than the input image.
  • the target image may be a background image frame obtained by separating the foreground from an image frame obtained at predetermined time intervals among consecutive input images.
  • The preprocessing step S220 may include: generating pixel value coordinates (P_T, P_R) comprising the pixel value of a target pixel in the target image and the pixel value of the reference pixel corresponding to the target pixel in the reference image; determining a reference pixel value P_R corresponding to each pixel value of the target image based on the pixel value coordinates (S223); and changing the pixel value of each pixel in the input image to the reference pixel value P_R corresponding to that pixel value (S224).
  • The determining of the reference pixel value (S223) may include calculating the equation of a figure (e.g., the straight line l_m in a two-dimensional coordinate system) that minimizes the sum of the distances from the pixel value coordinate points in the coordinate system representing the pixel value coordinates (P_T, P_R), thereby obtaining a relational expression of the reference pixel value P_R with respect to the pixel value P_T of the target image.
  • The determining of the reference pixel value (S223) may further include classifying the pixel value coordinates (P_T, P_R) into a plurality of groups (G_h, G_m, G_l) and calculating the equations of the figures l_h, l_m, and l_l for the respective groups.
  • The classifying of the pixel value coordinates (P_T, P_R) into the plurality of groups (G_h, G_m, G_l) may include classifying the coordinates based on boundaries (b_h, b_l) obtained by moving the figure l_m in parallel within the coordinate system.
  • Furthermore, the determining of the reference pixel value (S223) may further include obtaining the adjusted equations l_h' and l_l' by moving the figures l_h and l_l for the remaining groups G_h and G_l toward the figure l_m for one group G_m in the coordinate system.
  • In the changing of the pixel values (S224), the pixel value of each pixel in the input image is changed to the reference pixel value corresponding to that pixel value; this may include substituting the pixel value of the input image into the variable P_T corresponding to the pixel value of the target image in the relational expression and changing it to the reference pixel value P_R thus obtained.
  • In addition, the preprocessing step S220 may include converting the first color space model (e.g., an RGB color space model) used in the input image, the target image, and the reference image into a second color space model including luminance and chromaticity (e.g., a CIE L*a*b* color space model) (S221).
  • The preprocessing step S220 may further include converting the input image whose pixel values have been changed (that is, the result image) from the second color space model back into the first color space model (S226).
  • The generating of the pixel value coordinates (P_T, P_R) may include generating the coordinates using the luminance pixel value corresponding to the luminance of the target pixel and the luminance pixel value corresponding to the luminance of the reference pixel.
  • The preprocessing step S220 may further include changing the chromaticity pixel value corresponding to the chromaticity of each pixel in the input image based on the chromaticity pixel values corresponding to the chromaticity of the target pixels in the target image and the chromaticity pixel values corresponding to the chromaticity of the reference pixels in the reference image (S225).
  • FIG. 14 is an exemplary flowchart of a process S225 of changing a chromaticity pixel value of each pixel in an input image according to an embodiment of the present invention.
  • Referring to FIG. 14, the changing of the chromaticity pixel values (S225) may include obtaining the average μ_T of the chromaticity pixel values of the target pixels and the average μ_R of the chromaticity pixel values of the reference pixels (S2251), subtracting μ_T from the chromaticity pixel value P_i of each pixel in the input image (S2252), and adding μ_R to the chromaticity pixel value obtained by the subtraction (S2253).
  • At least one of the object detecting method and the image processing method for the preprocessing described above may be produced as a program for execution on a computer and stored in a computer-readable recording medium.
  • the computer readable recording medium includes all kinds of storage devices for storing data that can be read by a computer system. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage devices, and the like.
  • the method may be implemented as a computer program stored in a medium for execution in combination with a computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Nonlinear Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an image processing device and method, and to an object detection device using the same. The image processing device according to an embodiment of the present invention may comprise: a pixel value coordinate generation unit for generating pixel value coordinates comprising a pixel value of a target pixel in a target image obtained from an input image and a pixel value of a reference pixel corresponding to the target pixel in a reference image; a reference pixel value determination unit for determining a reference pixel value corresponding to the pixel value of the target image on the basis of the pixel value coordinates; and a pixel value change unit for changing a pixel value of each pixel in the input image to the reference pixel value corresponding to that pixel value.
PCT/KR2016/009394 2015-08-25 2016-08-24 Image processing device and method for adaptively improving low illumination, and object detection device using same Ceased WO2017034323A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2015-0119344 2015-08-25
KR1020150119344A KR101715247B1 (ko) 2015-08-25 2015-08-25 Image processing apparatus and method for adaptively improving low illumination, and object detection apparatus using the same

Publications (1)

Publication Number Publication Date
WO2017034323A1 true WO2017034323A1 (fr) 2017-03-02

Family

ID=58100632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/009394 Ceased WO2017034323A1 (fr) 2015-08-25 2016-08-24 Image processing device and method for adaptively improving low illumination, and object detection device using same

Country Status (2)

Country Link
KR (1) KR101715247B1 (fr)
WO (1) WO2017034323A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12026862B2 (en) 2019-10-11 2024-07-02 3M Innovative Properties Company Apparatus and methods for preprocessing images having elements of interest

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102422128B1 (ko) * 2017-08-31 2022-07-20 한화테크윈 주식회사 Image processing system and image processing method thereby
KR102860057B1 (ko) 2022-10-27 2025-09-12 한전케이디엔주식회사 Apparatus and method for diagnosing underground facilities based on image illuminance enhancement technology
KR102899834B1 (ko) * 2023-06-14 2025-12-11 장은영 System and method for detecting violations of work standards

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090024898A (ko) * 2007-09-05 2009-03-10 한국전자통신연구원 Apparatus and method for extracting video objects
KR20090048191A (ko) * 2007-11-09 2009-05-13 주식회사 쎄이미지 Color image processing apparatus and method for extending dynamic range
KR100927554B1 (ko) * 2009-03-13 2009-11-20 주식회사 일리시스 Night-time video surveillance system and method based on day/night image synthesis
KR20100056143A (ko) * 2008-11-19 2010-05-27 (주)투미르 Method for detecting moving objects using region-based processing
KR20110114096A (ko) * 2010-04-12 2011-10-19 주식회사 영국전자 Surveillance system employing a thermal camera and night-time surveillance method using the same



Also Published As

Publication number Publication date
KR20170024287A (ko) 2017-03-07
KR101715247B1 (ko) 2017-03-10

Similar Documents

Publication Publication Date Title
WO2014148829A1 Method and apparatus for processing an image based on detected information
WO2020085881A1 Method and apparatus for image segmentation using an event sensor
WO2017039259A1 Apparatus and method for diagnosing electrical equipment using a thermal camera
WO2016163755A1 Method and apparatus for face recognition based on quality measurement
WO2017034323A1 Image processing device and method for adaptively improving low illumination, and object detection device using same
WO2014058248A1 Image monitoring apparatus for estimating the gradient of a singleton, and method therefor
WO2012005387A1 Method and system for tracking a moving object over a wide area using multiple cameras and an object tracking algorithm
WO2010041836A2 Method for detecting a skin-color region using a variable skin-color model
EP3017422A1 Image processing method and apparatus for a curved display device
WO2016076497A1 Method and device for displaying images based on metadata, and recording medium therefor
WO2015182904A1 Region-of-interest survey apparatus and method for detecting an object of interest
WO2019009664A1 Apparatus for optimizing the exterior inspection of a target object and method therefor
WO2022050668A1 Method for detecting hand motion of a wearable augmented reality device using a depth image, and wearable augmented reality device capable of detecting hand motion using a depth image
WO2011019192A2 System and method for recognizing a face using infrared illumination
WO2022086001A1 Image processing method and image processing apparatus using the same
WO2022019675A1 Device and method for analyzing symbols included in a site floor plan
WO2022225102A1 Adjusting the shutter value of a surveillance camera through AI-based object recognition
WO2017090892A1 Camera for generating on-screen display information, terminal for synthesizing on-screen display information (20), and on-screen display information sharing system comprising the same
KR101044903B1 Fire detection method using hidden Markov models in a video surveillance system
WO2022092743A1 Method for extracting vehicle license plate characters and license plate character extraction device applying the method
WO2017111257A1 Image processing apparatus and image processing method
WO2023120831A1 De-identification method and computer program recorded on a recording medium for executing the same
WO2013085278A1 Monitoring device using a selective attention model and monitoring method therefor
KR20130015713A Apparatus and method for detecting fire based on image data
WO2016086380A1 Object detection method and device, remote-control mobile device, and flying vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16839612

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16839612

Country of ref document: EP

Kind code of ref document: A1