
WO2018173243A1 - Stereo camera device - Google Patents

Stereo camera device

Info

Publication number
WO2018173243A1
WO2018173243A1 (PCT/JP2017/011909, JP2017011909W)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
stereo
camera
video input
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/011909
Other languages
English (en)
Japanese (ja)
Inventor
媛 李
三好 雅則
秀行 粂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to PCT/JP2017/011909 priority Critical patent/WO2018173243A1/fr
Publication of WO2018173243A1 publication Critical patent/WO2018173243A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Definitions

  • a stereo measurement apparatus that measures an object or an environment using a plurality of cameras has been put into practical use.
  • For example, in automobiles, vehicle control is performed based on various measurements from an in-vehicle stereo camera.
  • Even in automatic driving, the vehicle is controlled by measuring the front-rear direction with cameras.
  • Stereo cameras are also used in the surveillance and marketing fields.
  • In stereo three-dimensional measurement, two cameras photograph the same object at the same time, distortion correction and parallelization are applied to the obtained images, and the difference (parallax) between the images is calculated. The shape of a three-dimensional object and the like can then be measured using the calculated parallax (see the depth-from-parallax sketch after this list).
  • However, the measurement range is fixed by specifications such as the camera parameters, angle of view, and parallax depth.
  • For example, if the measurement range is defined as 2 m to 40 m in front of the camera, there is the problem that distances of 2 m or less and 40 m or more from the front of the camera cannot be measured.
  • As a means with a high ability to detect the surrounding situation, there is one described in Patent Document 1.
  • It discloses an image recognition apparatus that includes an optical flow unit that detects the moving state of an object and a moving-object state detection unit that detects the state of the moving object from the detection result of a stereo measurement unit that detects the distance to the object.
  • Patent Document 2 discloses that high-accuracy three-dimensional measurement is realized even in a wide range by searching for high-precision parallax using images taken from a plurality of different viewpoints.
  • In Patent Document 1, it is possible to propose a method of monitoring the blind-spot area that cannot be measured in stereo with a separate sensor or an additional camera.
  • However, this requires integration with another sensor and incurs additional cost.
  • Moreover, since the information from another sensor or an additional camera does not include three-dimensional information, it is not easy to maintain monitoring accuracy. Changing camera setting values such as the shooting direction and zoom based on the detection result is also not disclosed.
  • In Patent Document 2, although the measurement accuracy of three-dimensional measurement within a certain range is improved, it is not easy to dynamically adjust the measurement range according to a danger or abnormality that has occurred in the field.
  • an object of the present invention is to provide a highly reliable stereo camera device that takes into account the blind spot area.
  • To solve the above problems, the stereo camera device comprises: a plurality of real video input units that capture an image of an object or region; a camera control unit that changes setting values used when the real video input units capture images; an image acquisition unit that converts the electronic signals input from the real video input units into images; a monitoring unit that detects an abnormality or danger based on the image information acquired by the image acquisition unit; a stereo control unit that calculates, based on the information detected by the monitoring unit, a control value used when making the real video input units stereo; and a distance calculation unit that calculates the distance from the real video input units to the object or region based on the information acquired by the image acquisition unit. The device is characterized in that the camera control unit changes the setting values used at the time of shooting by the real video input units based on the control value calculated by the stereo control unit.
  • FIG. 1 is a block diagram showing the system configuration of the stereo camera device of the first embodiment.
  • the real image input unit (stereo camera) 10 basically includes an imaging unit and a lens.
  • the stereo camera receives external images from the left and right camera lenses.
  • the imaging unit is a mechanism including an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge Coupled Device) which is an image sensor.
  • the lens is a zoomable lens. By zooming the lens, it is possible to image not only the near area but also the telephoto area.
  • the scene to be photographed is not specified, but if it is a stereo camera for the monitoring function, a moving object, a surveillance area, or the like can be considered as a photographing target.
  • a road or a car ahead can be considered as a shooting scene.
  • Shooting can be performed in both complicated and simple scenes, and the captured video is input.
  • the camera of the first embodiment may be a camera that can change the focal length, the rotation angle, the distance between the cameras, and the like.
  • the image acquisition unit 11 is a unit that converts an electronic signal input from the real video input unit 10 into an image.
  • There are various image formats, such as BMP, JPEG, and PNG, and the format to be used is determined by the system.
  • The real video input unit 10, the image acquisition unit 11, and the camera control unit 12 described above are provided for each of the right and left cameras. That is, as shown in FIG. 1, the system comprises a right real video input unit, a left real video input unit, a right image acquisition unit, a left image acquisition unit, a right camera control unit, and a left camera control unit.
  • the monitoring unit 13 uses the image information input from the image acquisition unit 11 to monitor and detect abnormalities and dangers.
  • the abnormalities and dangers here vary depending on the shooting scene.
  • Since Example 1 uses a vehicle-mounted camera, this corresponds to cases where a person or another vehicle suddenly appears in front during driving or a suspicious dangerous object enters the scene.
  • Various methods have been studied for intruder detection. For example, with the method of “Matsuyama et al., Background difference robust against lighting changes”, a background model of the monitored scene can be built and an intruder can be detected from it (a background-subtraction sketch is given after this list).
  • In addition, methods that detect a specific person or an abnormal object from an image by machine learning have been proposed. Typical image features include SIFT (Scale-Invariant Feature Transform) and HOG (Histograms of Oriented Gradients), and typical discriminators include AdaBoost and the SVM (Support Vector Machine); a sketch combining HOG features with an SVM is given after this list.
  • the stereo control unit 14 receives the alarm output from the monitoring unit 13, and calculates a control value for the stereo conversion of the camera based on the detected abnormality and danger information. Based on the calculated control value, the camera control unit 12 adjusts the parameters of the right camera and the left camera. This makes the camera stereo.
  • FIG. 3 is a diagram illustrating a configuration example of the stereo control unit 14.
  • the alarm observation unit 31 determines whether there is an alarm 30 from the monitoring unit 13.
  • the alarm information is output from the alarm observation unit 31 to the alarm analysis unit 32.
  • distance information detected by a laser sensor used in the monitoring unit 13 can be considered as alarm information.
  • The alarm analysis unit 32 specifies the camera to be controlled based on the alarm information. For example, if a detection is received from the right camera, it is determined that the observation area of the right camera needs to be measured in stereo with both cameras, and it can be decided to control the left camera. Then, the current camera setting unit 33 acquires the current setting values of the left and right cameras.
  • Here, the setting value means a parameter value such as the zoom value, the rotation value, and the inter-camera distance of the camera.
  • The control calculation unit 34 calculates a control value 20 for making the cameras stereo, based on the camera setting values acquired by the current camera setting unit 33 and the result determined by the alarm analysis unit 32. Based on the calculated control value 20, the left camera control unit controls the camera to form a stereo pair (a geometric sketch of such a control calculation is given after this list). This operation makes it possible to control the stereo camera intelligently: the angle and direction of each camera can be changed separately, and a stereo camera device that can be made stereo dynamically in response to abnormalities and dangers can be provided. In addition, when two different cameras are made stereo, camera resources can be used to the fullest. Furthermore, if necessary, a reduction in camera cost can also be expected by making the cameras stereo.
  • FIG. 4 is a diagram illustrating another configuration example of the stereo control unit 14.
  • In this configuration, the alarm is not based on information detected by a sensor.
  • This is an example in which an alarm is detected using an image and a control value 20 is estimated.
  • As the alarm information, for example, the position on the image obtained from the right or left camera can be considered.
  • The alarm observation unit is the same as in FIG. 3.
  • the image analysis unit 40 receives the left and right camera images 41 as input and analyzes the danger information on the images.
  • the image analysis unit 40 can analyze a region range where the danger has occurred, a place where the danger has occurred, and the like. For example, when a danger is detected from the image of the right camera, the left camera is controlled to make the left and right cameras stereo.
  • the control estimation unit 42 changes the parameters of the left camera while comparing the images of the left and right cameras, and estimates the control value until it can be made stereo. Then, the estimated control value is output.
  • The camera control unit 12 quickly controls the cameras based on the control value obtained by the stereo control unit 14, and a stereo image is generated. In this way, the abnormal or dangerous state can be captured in detail as left and right camera images and camera information.
  • the distance calculation unit 15 calculates the distance using the acquired left and right camera images and camera information.
  • FIG. 5 is a diagram illustrating a configuration example of the distance calculation unit.
  • The camera calibration unit estimates the current camera parameters from the camera setting values 50. Camera parameters are camera information indicating the focal length, orientation, and so on of the camera, and are broadly divided into two kinds: internal parameters and external parameters.
  • K is the internal parameter matrix, where f is the focal length, a is the aspect ratio, s is the skew, and (vc, uc) is the center coordinate of the image coordinates.
  • D is the external parameter matrix, where (r11, r12, r13, r21, r22, r23, r31, r32, r33) indicates the orientation of the camera and (tx, ty, tz) indicates the world coordinates of the camera installation position.
  • When defined by Euler angles, the camera orientation (r11, r12, ..., r33) in the external parameters is expressed by three parameters, the pan, tilt, and roll installation angles of the camera. Therefore, the number of camera parameters necessary for associating image coordinates with world coordinates is 11 in total: 5 internal parameters and 6 external parameters (a sketch that constructs these matrices is given after this list).
  • The distortion correction unit 54 corrects the distortion of the camera images using the camera parameters.
  • The parallelization unit rectifies the left and right camera images so that they are parallel; the 3D measurement value of the measurement object is then obtained from the parallax between the rectified images (see the rectification and distance sketch after this list).
  • FIG. 6 is a diagram illustrating another configuration example of the distance calculation unit 15.
  • the camera automatic calibration unit automatically estimates camera parameters using the input image.
  • Various estimation methods have been developed. One of them is the automatic calibration method described in “Development of self-calibration technology for variable-parameter stereo camera, Li, 22nd Image Sensing Symposium, IS1-04”. After estimating the parameters, the distance can be obtained by the same principle as in FIG. 5.
  • In this way, the stereo camera parameters can be adjusted intelligently. In other words, when a danger or abnormality is detected, the cameras can be quickly controlled and made stereo, and the current situation can be measured correctly.
  • As a result, safer and more secure control can be performed. For example, when one camera monitors a nearby area and the other camera monitors a telephoto area, if an abnormality or danger occurs in either area, the cameras are quickly controlled and made stereo, so that the abnormality or danger can be measured in detail.
  • When measuring the vicinity region, the measurement ranges of the left and right cameras are 71 and 72, and 73 and 74 on the image.
  • Similarly, when measuring the telephoto region, the measurement ranges of the left and right cameras are 75 and 76, and 77 and 78 on the image.
  • Here, the left and right cameras measure separate areas, the near area and the telephoto area. Specifically, the left camera measures the near area and the right camera measures the telephoto area. When the danger 712 is detected by the left camera, stereo control is performed immediately, and the right camera, which had been measuring the telephoto area, is controlled. The area where the danger has occurred is then measured with the left and right cameras in a stereo state. As a result, the blind spot area in the vicinity of the in-vehicle stereo camera can be eliminated, and the danger can be predicted and measured.
  • FIG. 8 is used to explain another embodiment of the stereo camera device.
  • Here too, the left and right cameras measure separate areas, a near area and a telephoto area.
  • the left camera measures the vicinity area 71
  • the right camera measures the telephoto area 76.
  • the image corresponding to the left camera at that time is 73
  • the image corresponding to the right camera is 74.
  • the right camera detects the danger 80
  • the stereo control is immediately performed and the left camera is controlled.
  • The area where the danger 80 has occurred is then measured by the left and right cameras in a stereo state.
  • the blind spot area can be eliminated from the telephoto area of the in-vehicle stereo camera, and the danger can be predicted and measured.
  • The image region 73 of the camera 71 has a region 75 that is assumed to be distant. Therefore, virtual stereo processing can be performed using the region 75 and the region 74. As a result, stereo three-dimensional measurement can be performed easily without any camera control. However, in this case the measurement resolution is expected to be reduced, so it is desirable to apply a super-resolution technique or the like.
  • the fourth embodiment includes a monitoring unit 90 that monitors danger and abnormality, and a stereo control unit 91 that instructs the camera to perform stereo control in accordance with the danger and abnormality risk information acquired by the monitoring unit 90.
  • the left and right camera control units 12 are connected to the actual video input unit 10, and the camera control unit 12 adjusts the camera according to the control information of the stereo control unit 91.
  • the image acquisition unit 11 converts the electronic signal input from the real video input unit 10 into an image.
  • The monitoring unit 90 and the stereo control unit 91 are connected to the camera control unit 12, the real video input unit 10, and the image acquisition unit 11 via the Internet. Accordingly, for example, when the monitoring unit 90 detects a danger or abnormality, the stereo control unit 91's instruction to stereo-control the cameras according to the danger information acts on the camera control unit 12 via the Internet connection to perform camera control.
  • the distance calculation unit 15 calculates the distance using the acquired left and right camera images and camera information. Since the distance calculation unit 15 is connected to the real video input unit 10 and the image acquisition unit 11 via the Internet line, the distance calculation unit 15 can acquire camera images and camera information necessary for distance calculation via the Internet.
  • each component is connected to the Internet as described above, but the Internet connection location may be changed as appropriate.
  • Example 5 will be described with reference to FIG.
  • a place where a surveillance camera is installed such as a railway platform is assumed.
  • The stereo camera 100 described in the present embodiment monitors the near field with one camera and the far field with the other, so that the left and right cameras cover different ranges, as proposed in the second and third embodiments.
  • When a danger or abnormality is detected, the stereo camera 100 can be made stereo toward the vicinity area and monitor it, so that the risk can be prevented.
  • the stereo camera device can perform shooting and measurement for multiple purposes, and can provide a more secure and safe service.
  • the sixth embodiment is an embodiment in which a stereo camera device using a commercially available PTZ camera is mounted.
  • A PTZ camera is a camera whose pan, tilt, and zoom can be controlled. The specifications and control methods of each camera vary from manufacturer to manufacturer.
  • two or more PTZ cameras 110 are connected to the camera control unit 113 and the image acquisition unit 114 via the network 112, a cable, and the like to perform camera control and distance measurement.
  • the camera control unit 113 is a unit that controls the PTZ camera 110 and the baseline stage 111.
  • the image acquisition unit 114 is a unit that acquires an image from the PTZ camera via the network 112.
  • the monitoring unit 116 is a unit that monitors abnormalities and dangers on the image using the camera image acquired by the image acquisition unit 114.
  • The stereo control unit 115 instructs the camera control unit 113 to make the cameras stereo based on the danger or abnormality detected by the monitoring unit 116. The camera control unit 113 then makes the PTZ cameras 110 stereo based on the instruction from the stereo control unit 115 (a sketch of this control flow against a hypothetical PTZ interface is given after this list).
  • the distance calculation unit 15 calculates the distance using the camera image and camera information acquired by the PTZ camera, as in the first embodiment.
  • This embodiment makes it possible to adjust the parameters of commercially available PTZ cameras intelligently; when a danger or abnormality is detected, the cameras can be quickly made stereo and the current situation can be measured correctly.
  • As a result, safer and more secure control can be performed. For example, when one of the PTZ cameras monitors a nearby area and the other PTZ camera monitors a far-field area, if an abnormality or danger occurs in either area, the cameras are quickly controlled and made stereo, so that the abnormality or danger can be measured in detail.
  • Since commercially available PTZ cameras can be used, camera control can be further simplified and the development cost can be reduced. Further, by connecting the PTZ cameras to each component via the network 112, the measurement range of the cameras can be further expanded and adapted to various environments.
  • each component is connected to the network as described above, but the network connection location may be changed as necessary.
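To make the measurement-range discussion in the bullets above concrete, here is a minimal Python sketch of depth from parallax for an ideal rectified stereo pair, Z = f * B / d. The focal length, baseline, and disparity limits are illustrative assumptions, not values from the patent; only the 2 m to 40 m example range comes from the text.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for an ideal rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Assumed example values (not from the patent): 1000 px focal length, 0.30 m baseline.
f_px, b_m = 1000.0, 0.30
# If the matcher can resolve disparities between about 7.5 px and 150 px,
# the measurable range is roughly 2 m to 40 m, matching the example range in the text.
print(depth_from_disparity(f_px, b_m, 150.0))  # ~2.0 m (near limit)
print(depth_from_disparity(f_px, b_m, 7.5))    # ~40.0 m (far limit)
```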
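The intruder-detection bullet cites a background-subtraction method that is robust to lighting changes. The sketch below substitutes OpenCV's built-in MOG2 background subtractor for the cited Matsuyama et al. technique, purely to illustrate the idea; the video source, blur kernel, area threshold, and alarm handling are assumptions.

```python
import cv2

cap = cv2.VideoCapture(0)  # placeholder source: any camera index or video file path
# MOG2 keeps a per-pixel Gaussian-mixture background model; shadow detection helps
# suppress some illumination-induced false positives.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)          # foreground mask (intruder candidates)
    mask = cv2.medianBlur(mask, 5)          # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if any(cv2.contourArea(c) > 1500 for c in contours):   # assumed area threshold
        print("intruder alarm")             # in the device this would feed the stereo control unit
    if cv2.waitKey(1) == 27:                # Esc to quit
        break
cap.release()
```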
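For the machine-learning detection mentioned above (HOG features with an SVM discriminator), the following sketch uses OpenCV's bundled pedestrian detector, which pairs HOG features with a pre-trained linear SVM. The image path and confidence threshold are placeholders, and this is only one possible realization of the feature/discriminator combination named in the text.

```python
import cv2
import numpy as np

hog = cv2.HOGDescriptor()                                   # default 64x128 person window
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

image = cv2.imread("scene.jpg")                             # placeholder image path
rects, weights = hog.detectMultiScale(image, winStride=(8, 8), padding=(8, 8), scale=1.05)
for rect, score in zip(rects, np.ravel(weights)):
    if score > 0.5:                                         # assumed confidence threshold
        x, y, w, h = rect
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", image)
```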
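The text does not spell out how the control calculation unit derives the control value 20, so here is a minimal geometric sketch under stated assumptions: the detecting camera supplies an approximate 3D position of the danger in a shared world frame (x right, y forward, z up), the assisting camera's mounting position is known, and the control value is simply the pan/tilt pair that points the assisting camera at that position. All names and numbers are illustrative.

```python
import math
from dataclasses import dataclass


@dataclass
class PanTilt:
    pan_deg: float
    tilt_deg: float


def aim_second_camera(target_xyz, camera_xyz) -> PanTilt:
    """Pan/tilt (degrees) that points a camera at a target, both given in a shared
    world frame with x right, y forward, z up. Purely geometric illustration."""
    dx = target_xyz[0] - camera_xyz[0]
    dy = target_xyz[1] - camera_xyz[1]
    dz = target_xyz[2] - camera_xyz[2]
    pan = math.degrees(math.atan2(dx, dy))                   # rotation about the vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation above the horizontal
    return PanTilt(pan, tilt)


# Assumed example: danger detected 1.5 m to the right of and 8 m ahead of the origin,
# assisting camera mounted 0.5 m to the right at the same height.
print(aim_second_camera((1.5, 8.0, 0.0), (0.5, 0.0, 0.0)))   # roughly pan 7.1 deg, tilt 0.0 deg
```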
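The K and D matrices referred to in the distance-calculation bullets are not reproduced in this text-only rendering. The sketch below reconstructs their standard forms from the listed parameters: K from the 5 internal parameters (f, a, s, and the image centre) and D = [R | t] from the 6 external parameters (pan, tilt, roll, tx, ty, tz). The Euler-angle order and the example numbers are assumptions, since the text does not fix a convention.

```python
import numpy as np


def intrinsic_matrix(f, a, s, uc, vc):
    """K from the 5 internal parameters: focal length f, aspect ratio a, skew s,
    image centre (uc, vc)."""
    return np.array([[f,   s,     uc],
                     [0.0, a * f, vc],
                     [0.0, 0.0,   1.0]])


def extrinsic_matrix(pan, tilt, roll, tx, ty, tz):
    """D = [R | t] from the 6 external parameters. The Z-Y-X (pan-tilt-roll)
    rotation order is an assumed convention; angles are in radians."""
    cz, sz = np.cos(pan), np.sin(pan)
    cy, sy = np.cos(tilt), np.sin(tilt)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    R = Rz @ Ry @ Rx
    t = np.array([[tx], [ty], [tz]])
    return np.hstack([R, t])


# Projection of a homogeneous world point to pixel coordinates: x ~ K D X.
K = intrinsic_matrix(f=1000.0, a=1.0, s=0.0, uc=640.0, vc=360.0)   # assumed values
D = extrinsic_matrix(0.0, 0.0, 0.0, tx=0.0, ty=0.0, tz=0.0)
X = np.array([0.5, 0.2, 5.0, 1.0])        # world point 5 m in front of the camera
u, v, w = K @ D @ X
print(u / w, v / w)                        # pixel coordinates (740.0, 400.0)
```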
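The distortion-correction, parallelization, and distance steps described for the distance calculation unit can be illustrated with OpenCV as follows, assuming the calibration results (camera matrices, distortion coefficients, and the rotation R and translation T between the two cameras) are already available from the calibration unit. The calibration numbers, image size, and file names are placeholders, and the choice of SGBM block matching is an assumption rather than something specified in the text.

```python
import cv2
import numpy as np

# Placeholder calibration values; in the device these come from the camera calibration
# unit (or from the automatic calibration unit of FIG. 6).
size = (1280, 720)
K_l = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
K_r = K_l.copy()
dist_l = np.zeros(5)
dist_r = np.zeros(5)
R = np.eye(3)                            # relative rotation between the cameras
T = np.array([[-0.30], [0.0], [0.0]])    # assumed 0.30 m baseline along x

# Rectification (the "parallelization" step): rectifying transforms, then remapping.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K_l, dist_l, K_r, dist_r, size, R, T)
map_lx, map_ly = cv2.initUndistortRectifyMap(K_l, dist_l, R1, P1, size, cv2.CV_32FC1)
map_rx, map_ry = cv2.initUndistortRectifyMap(K_r, dist_r, R2, P2, size, cv2.CV_32FC1)

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder image files
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
left_r = cv2.remap(left, map_lx, map_ly, cv2.INTER_LINEAR)
right_r = cv2.remap(right, map_rx, map_ry, cv2.INTER_LINEAR)

# Block matching gives disparity; reprojectImageTo3D converts it into metric 3D points.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=9)
disparity = matcher.compute(left_r, right_r).astype(np.float32) / 16.0
points_3d = cv2.reprojectImageTo3D(disparity, Q)
depth_m = points_3d[:, :, 2]             # per-pixel distance from the rectified left camera
print(float(np.nanmedian(depth_m[disparity > 0])))
```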
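Finally, because commercial PTZ control protocols differ between manufacturers, this last sketch deliberately uses a hypothetical PTZCamera interface rather than any real vendor API; only the control flow (alarm, compute a control value, drive the assisting camera, then measure) mirrors the PTZ embodiment. The class names and example values are invented for illustration.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class PTZState:
    pan_deg: float
    tilt_deg: float
    zoom: float


class PTZCamera(Protocol):
    """Hypothetical driver interface; a real implementation would wrap the
    manufacturer-specific network protocol mentioned in the text."""
    def get_state(self) -> PTZState: ...
    def move_to(self, state: PTZState) -> None: ...


def make_stereo(assisting: PTZCamera, alarm_pan_deg: float, alarm_tilt_deg: float) -> None:
    """Drive the assisting camera toward the alarmed direction so that both cameras
    observe the same region (stereo control unit -> camera control unit)."""
    current = assisting.get_state()
    assisting.move_to(PTZState(alarm_pan_deg, alarm_tilt_deg, current.zoom))
    # Both cameras now cover the alarmed region; their images would be handed to the
    # distance calculation unit, e.g. via the rectification sketch above.


@dataclass
class FakePTZCamera:
    """In-memory stand-in so the sketch runs without hardware."""
    state: PTZState
    def get_state(self) -> PTZState:
        return self.state
    def move_to(self, state: PTZState) -> None:
        self.state = state


right_cam = FakePTZCamera(PTZState(30.0, 0.0, 1.0))
make_stereo(right_cam, alarm_pan_deg=5.0, alarm_tilt_deg=-2.0)
print(right_cam.state)   # PTZState(pan_deg=5.0, tilt_deg=-2.0, zoom=1.0)
```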

Landscapes

  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The purpose of the present invention is to provide a highly reliable stereo camera device that takes a blind-spot region into consideration. To solve the above problem, the stereo camera device according to the invention comprises: a plurality of real video input units 10 for photographing an object or a region; a camera control unit 12 for changing a setting value used at the time of shooting by the real video input units 10; an image acquisition unit 11 for converting an electronic signal input from the real video input units 10 into an image; a monitoring unit 13 for detecting an abnormality or danger on the basis of image information acquired by the image acquisition unit 11; a stereo control unit 14 for calculating a control value 20 used when making the real video input units 10 stereo, on the basis of information detected by the monitoring unit 13; and a distance calculation unit 15 for calculating the distance from the real video input units 10 to the object or region on the basis of the information acquired by the image acquisition unit 11. The invention is characterized in that the camera control unit 12 changes the setting value used at the time of shooting by the real video input units 10 on the basis of the control value 20 calculated by the stereo control unit 14.
PCT/JP2017/011909 2017-03-24 2017-03-24 Dispositif de type appareil photo stéréo Ceased WO2018173243A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/011909 WO2018173243A1 (fr) 2017-03-24 2017-03-24 Dispositif de type appareil photo stéréo

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/011909 WO2018173243A1 (fr) 2017-03-24 2017-03-24 Dispositif de type appareil photo stéréo

Publications (1)

Publication Number Publication Date
WO2018173243A1 true WO2018173243A1 (fr) 2018-09-27

Family

ID=63584296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011909 Ceased WO2018173243A1 (fr) 2017-03-24 2017-03-24 Dispositif de type appareil photo stéréo

Country Status (1)

Country Link
WO (1) WO2018173243A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003284053A (ja) * 2002-03-27 2003-10-03 Minolta Co Ltd 監視カメラシステムおよび監視カメラ制御装置
JP2004364212A (ja) * 2003-06-09 2004-12-24 Fujitsu Ltd 物体撮影装置、物体撮影方法及び物体撮影プログラム
JP2005176143A (ja) * 2003-12-12 2005-06-30 Sony Corp 監視装置
JP2007235655A (ja) * 2006-03-02 2007-09-13 Victor Co Of Japan Ltd 映像撮像制御システム


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17902516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17902516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP