WO2022079910A1 - Heat Trace Region Extraction Method, Heat Trace Region Extraction Device, and Program - Google Patents
- Publication number
- WO2022079910A1 (PCT/JP2020/039126, JP2020039126W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- heat trace
- difference
- thermal
- background
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details
- G01J5/08—Optical arrangements
- G01J5/0859—Sighting arrangements, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/80—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for detecting, monitoring or modelling epidemics or pandemics, e.g. flu
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
Definitions
- the present invention relates to a heat trace region extraction method, a heat trace region extraction device, and a program.
- a method for improving the efficiency of disinfection using a drone has been proposed (for example, Non-Patent Document 1).
- Disinfection at regular time intervals cannot prevent infections mediated by objects, that is, infections caused by an infected person touching an object and another person touching the same object within that interval. If disinfection can instead be performed flexibly according to human use, such spread of infection is thought to be further reduced. In addition, since such an approach avoids unnecessary disinfection, a reduction in disinfectant consumption can be expected. In other words, if disinfection is performed according to the use of objects by people detected with surveillance cameras, it can be expected to reduce labor, prevent the spread of infection, and save disinfectant compared with disinfecting, at regular time intervals, every object that may have been used.
- Non-Patent Document 2 proposes a method for detecting contact with an object using a shadow.
- the method of Non-Patent Document 2 requires a strong light source such as a projector.
- the recognition accuracy is significantly affected by the positional relationship between the camera and the light source.
- a strong light source cannot be installed freely in many environments, so the method is considered unsuitable for the purpose of detecting and presenting places touched by a person in various locations to support disinfection.
- the present invention has been made in view of the above points, and an object of the present invention is to improve the detection accuracy of a place touched by a person.
- a heat trace region extraction method in which a computer executes: a difference visible image generation procedure for generating, for a visible image in which a certain range is captured, a first difference image with respect to a visible image of the background of the certain range; a differential thermal image generation procedure for generating, for a thermal image in which the certain range is captured, a second difference image with respect to a thermal image of the background; and an extraction procedure for extracting a heat trace region based on the first difference image and the second difference image.
- a device, a method, and a program for detecting a place touched by a person by using a thermal image are disclosed, with the aim of helping to sterilize or disinfect viruses. Since humans are homeothermic animals whose limbs carry heat, when a person touches an object, heat remains at the touched place for a certain period of time. For example, a method of exploiting this heat trace to infer the passcode of a smartphone has been reported ("Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017").
- Heat traces remain not only on smartphone screens but also on desks and walls. That is, if the heat trace is identified based on the image (thermal image) of the thermal camera, it is possible to pinpoint the place touched by a person indoors or the like.
- the heat trace area can be extracted by background subtraction with the thermal image from before a human touch as the background.
- this method extracts the human body region as well as the heat trace. Therefore, in the present embodiment, the visible image is acquired at the same time as the thermal image, and the thermal trace region is extracted by comparing the thermal image and the visible image.
- background subtraction is performed for each of the visible image and the thermal image, and the thermal trace region is extracted by the difference in the result of the background subtraction. Since heat traces cannot be observed with a visible image (that is, with the naked eye), they cannot be extracted even if background subtraction is performed on the visible image with the visible image before being touched by a person as the background. On the other hand, when there is a person on the spot, the area of the person is extracted by performing background subtraction with the visible image taken in the absence of the person as the background. That is, when the region extracted by background subtraction in the thermal image is similarly extracted in the visible image, it can be seen that the region is not a thermal trace.
- the region extracted in the thermal image due to background subtraction and not extracted in the visible image is likely to be a thermal trace.
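The decision rule above can be sketched per pixel as follows. This is an illustrative simplification with hypothetical thresholds (the embodiment described later actually compares labelled regions, not single pixels); images are given as nested lists of scalars.

```python
def heat_trace_mask(thermal, bg_thermal, visible, bg_visible,
                    th_thermal=2.0, th_visible=20):
    """Return a binary mask (1 = heat-trace candidate): a pixel is a
    candidate when it differs from the thermal background but NOT from
    the visible background. Thresholds are illustrative assumptions."""
    rows, cols = len(thermal), len(thermal[0])
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            thermal_changed = abs(thermal[r][c] - bg_thermal[r][c]) > th_thermal
            visible_changed = abs(visible[r][c] - bg_visible[r][c]) > th_visible
            # changed in the thermal image only -> likely a heat trace
            if thermal_changed and not visible_changed:
                mask[r][c] = 1
    return mask
```

A region that changes in both images (e.g., the arm itself) is rejected because the visible difference is also large.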
- the heat trace area extracted by such a method is visualized, and the place touched by a person is transmitted to the user.
- a sensor node equipped with a visible light camera and a thermal camera ("Yoshinari Shirai, Yasue Kishino, Takayuki Suyama, Shin Mizutani: PASNIC: a thermal based privacy-aware sensor node for image capturing, UbiComp/ISWC '19 Adjunct, pp. 202-205, 2019") or a similar device may be used.
- FIG. 1 is a schematic diagram of a visible image and a thermal image taken at the same place at the same time.
- FIG. 1 shows a schematic diagram of an image of a hand touching a door with a handle taken simultaneously with a visible light camera and a thermal camera.
- (a) and (a') show time t1 (before the hand touches the door), (b) and (b') show time t2 (while the hand is touching the door), and (c) and (c') show the visible image and the thermal image at time t3 (after the hand touches the door).
- when a person touches the door, the temperature of the touched place rises as shown in FIG. 1 (c').
- FIG. 2 is a diagram showing an example of an image obtained by background subtraction.
- FIG. 2 shows the difference images of the time t2 and the time t3 when the image of the time t1 of FIG. 1 is used as the background image.
- at time t2, the shape of the arm is extracted as a difference region in both the visible image and the thermal image, whereas at time t3, the portion that touched the door is extracted as a difference region only in the thermal image.
- the difference region extracted by background subtraction with respect to the thermal image at time t2 includes a portion not touching the door.
- the difference region extracted from the thermal image at time t2 is the region where the human body exists, not the region of the heat trace left by actually touching the door. From the viewpoint of disinfection, only the difference region extracted at time t3 needs to be identified, and the difference region extracted by background subtraction at time t2 is unnecessary.
- when a difference region extracted from the thermal image is similarly extracted from the visible image, that difference region is not a heat trace region.
- at time t3, the region extracted by background subtraction of the thermal image is not extracted in the visible image, so the difference region extracted from the thermal image is determined to be the heat trace region (that is, the portion touched by a person). If the system presents information indicating the heat trace region extracted based on such a determination, a user who sees the information can efficiently disinfect the parts touched by a person.
- FIG. 3 is a diagram showing a hardware configuration example of the heat trace region extraction device 10 according to the embodiment of the present invention.
- the heat trace area extraction device 10 of FIG. 3 has a drive device 100, an auxiliary storage device 102, a memory device 103, a CPU 104, an interface device 105, and the like, which are connected to each other by a bus B, respectively.
- the program that realizes the processing in the heat trace area extraction device 10 is provided by a recording medium 101 such as a CD-ROM.
- the program is installed in the auxiliary storage device 102 from the recording medium 101 via the drive device 100.
- the program does not necessarily have to be installed from the recording medium 101, and may be downloaded from another computer via the network.
- the auxiliary storage device 102 stores the installed program and also stores necessary files, data, and the like.
- the memory device 103 reads a program from the auxiliary storage device 102 and stores it when there is an instruction to start the program.
- the CPU 104 executes the function related to the heat trace area extraction device 10 according to the program stored in the memory device 103.
- the interface device 105 is used as an interface for connecting to a network.
- FIG. 4 is a diagram showing a functional configuration example of the heat trace region extraction device 10 according to the embodiment of the present invention.
- the heat trace region extraction device 10 includes a visible image acquisition unit 11, a background visible image generation unit 12, a difference visible image generation unit 13, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace region extraction unit 17, a heat trace region output unit 18, and the like. Each of these units is realized by processing that one or more programs installed in the heat trace region extraction device 10 cause the CPU 104 to execute.
- the heat trace area extraction device 10 is connected to each of these cameras so that images can be input from the visible light camera 21 and the thermal camera 22.
- the visible light camera 21 and the thermal camera 22 are installed so as to be able to photograph the same place (the same range). That is, this embodiment is based on the premise that the shooting area of the visible light camera 21 and the shooting area of the thermal camera 22 coincide with each other on a pixel-by-pixel basis. If the captured portions of the visible light camera 21 and the thermal camera 22 do not match, it is sufficient to perform calibration in advance so that the correspondence between the pixels of the visible image and the thermal image can be grasped.
- FIG. 5 is a flowchart for explaining an example of the processing procedure executed by the heat trace area extraction device 10.
- in step S101, the visible image acquisition unit 11 acquires the visible image captured by and input from the visible light camera 21, and the thermal image acquisition unit 14 acquires the thermal image captured by and input from the thermal camera 22.
- the acquisition of the visible image by the visible image acquisition unit 11 and the acquisition of the thermal image by the thermal image acquisition unit 14 may or may not be performed at the same time. If they are not simultaneous, some frames of the camera with the faster frame rate may be ignored to match the camera with the slower frame rate. Further, as long as the frame rate is relatively fast, still images may be acquired alternately from the visible light camera 21 and the thermal camera 22 and treated as if they had been acquired at the same time.
- the visible image acquisition unit 11 transmits the acquired visible image to the background visible image generation unit 12, and the thermal image acquisition unit 14 transmits the acquired thermal image to the background thermal image generation unit 15.
- the background visible image generation unit 12 stores the visible image transmitted from the visible image acquisition unit 11 in the auxiliary storage device 102, and the background thermal image generation unit 15 stores the thermal image transmitted from the thermal image acquisition unit 14 in the auxiliary storage device 102 (S102).
- Steps S101 and S102 are repeated until the predetermined time T1 elapses.
- the predetermined time T1 may be a period during which one or more visible images and one or more thermal images are accumulated in the auxiliary storage device 102.
- in step S104, the background visible image generation unit 12 generates a background image of the shooting range (hereinafter referred to as the "background visible image") based on the group of visible images stored in the auxiliary storage device 102 during the predetermined period T1. Also in step S104, the background thermal image generation unit 15 generates a background image of the shooting range (hereinafter referred to as the "background thermal image") based on the group of thermal images stored in the auxiliary storage device 102 during the predetermined period T1. For example, the per-pixel median of the pixel values of each stored image group (for the visible images, the median of each RGB channel) may be used as the background image.
- the predetermined time T1 corresponds to the time t1 in FIG. That is, the time t1 does not have to be a momentary timing.
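Background generation from the frames accumulated during T1 can be sketched as follows, assuming the per-pixel median (center value) mentioned above; the function name and nested-list image format are illustrative, not from the patent.

```python
from statistics import median

def build_background(frames):
    """frames: list of 2D images (nested lists of scalars) collected
    during period T1. Returns the per-pixel median image, which is
    robust against people briefly passing through the scene."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[median(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]
```

For RGB visible images the same computation would be applied per channel.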
- after step S104, step S105 and subsequent steps are executed. Note that steps S101 to S104 and step S105 onward do not have to be executed synchronously; for example, step S105 and subsequent steps may be started in response to an instruction different from the execution instruction of steps S101 to S104.
- step S105 the visible image acquisition unit 11 and the thermal image acquisition unit 14 wait for the elapse of the predetermined time T2.
- the predetermined time T2 is, for example, the elapsed time from the time t2 to the time t3 in FIG. 2.
- the visible image acquisition unit 11 acquires a visible image input from the visible light camera 21 (hereinafter referred to as the "target visible image"), and the thermal image acquisition unit 14 acquires a thermal image input from the thermal camera 22 (hereinafter referred to as the "target thermal image") (S106). It is desirable that the target visible image and the target thermal image be taken at the same time (or almost at the same time).
- the difference visible image generation unit 13 compares the background visible image generated by the background visible image generation unit 12 with the target visible image by the background subtraction method and extracts from the target visible image the difference region with respect to the background visible image (the region that differs from the background visible image), thereby generating a difference image showing the difference (hereinafter referred to as the "difference visible image"). Similarly, the differential thermal image generation unit 16 compares the background thermal image generated by the background thermal image generation unit 15 with the target thermal image by the background subtraction method and extracts from the target thermal image the difference region with respect to the background thermal image (the region that differs from the background thermal image), thereby generating a difference image showing the difference (hereinafter referred to as the "differential thermal image").
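A minimal sketch of the background subtraction that produces each binary difference image; the threshold and nested-list image format are assumptions (practical systems typically add noise suppression and morphological filtering).

```python
def difference_image(image, background, threshold):
    """Binary difference image: 1 where the pixel differs from the
    background by more than `threshold`, 0 elsewhere. Applied to the
    target visible image and the target thermal image alike."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(img_row, bg_row)]
            for img_row, bg_row in zip(image, background)]
```

The same routine yields the difference visible image and the differential thermal image, each against its own background.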
- each difference image is sent to the heat trace region extraction unit 17.
- the heat trace region extraction unit 17 compares the difference visible image with the differential thermal image and extracts the heat trace region in the photographing range (S108).
- when extracting regions of the differential thermal image that are dissimilar to the difference regions of the difference visible image, the heat trace region extraction unit 17 may use a similarity determination between the difference regions of the two difference images. For example, the heat trace region extraction unit 17 first labels (extracts the connected regions of) each binary image, that is, the difference visible image and the differential thermal image. Next, for each of the one or more difference regions obtained by labeling the differential thermal image (hereinafter referred to as "differential thermal regions"), the heat trace region extraction unit 17 compares the degree of overlap with each of the one or more difference regions obtained by labeling the difference visible image (hereinafter referred to as "difference visible regions").
- the heat trace region extraction unit 17 counts, on a pixel-by-pixel basis, whether the two difference regions being compared match, and if the match rate is equal to or higher than a certain threshold value, the two compared difference regions are determined to be similar. The heat trace region extraction unit 17 extracts a differential thermal region that is dissimilar to every difference visible region as a heat trace region.
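The labelling and similarity check of step S108 can be sketched as follows, under simplifying assumptions: 4-connected labelling, overlap measured as the fraction of a thermal region's pixels covered by a visible region, and a hypothetical threshold `overlap_th`. None of these specifics are fixed by the patent.

```python
def label_regions(binary):
    """4-connected component labelling of a binary image (nested
    lists); returns each connected region as a set of (row, col)."""
    rows, cols = len(binary), len(binary[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and (r, c) not in seen:
                stack, region = [(r, c)], set()
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    region.add((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                regions.append(region)
    return regions

def heat_trace_regions(diff_thermal, diff_visible, overlap_th=0.5):
    """Keep each differential thermal region whose overlap ratio with
    EVERY difference visible region stays below overlap_th, i.e. the
    region is dissimilar to all visible difference regions."""
    visible_regions = label_regions(diff_visible)
    traces = []
    for t in label_regions(diff_thermal):
        if all(len(t & v) / len(t) < overlap_th for v in visible_regions):
            traces.append(t)
    return traces
```

In the FIG. 2 scenario, the arm region at time t2 overlaps its visible counterpart and is discarded, while the touched spot at time t3 has no visible counterpart and survives.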
- the heat trace region extraction unit 17 transmits information indicating the heat trace region, together with the background visible image, to the heat trace region output unit 18. At this time, the heat trace region extraction unit 17 may generate a binary image in which the heat trace region is white and the rest is black and transmit that binary image to the heat trace region output unit 18 as the information indicating the heat trace region. Note that region similarity determination is actively studied in pattern matching research and the like, and the present embodiment is not limited to any particular method. Subsequently, the heat trace region output unit 18 outputs information indicating the extraction result of the heat trace region so that the user can confirm it (S109).
- the heat trace region output unit 18 may output an image obtained by synthesizing white pixels of a binary image showing a heat trace region on a background visible image.
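One possible way to composite the binary mask onto the background visible image (the function and `mark` value are illustrative; the patent leaves the concrete output form open):

```python
def overlay_mask(background, mask, mark=255):
    """Paint the mask's set pixels onto a copy of the (grayscale)
    background image so the heat trace locations stand out."""
    return [[mark if mask[r][c] else background[r][c]
             for c in range(len(background[0]))]
            for r in range(len(background))]
```

For an RGB background the marked pixels would be set to a conspicuous color instead of a single intensity value.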
- the output form is not limited to the predetermined form.
- for example, display on a display device, storage in the auxiliary storage device 102, transmission to a user terminal via a network, and the like may be performed.
- after step S109, step S105 and subsequent steps are repeated.
- step S109 may be executed after steps S105 to S108 are repeated a plurality of times. In this case, the heat trace regions extracted in the plurality of times can be collectively output.
- FIG. 6 is a schematic diagram showing an output example of the extraction result of the heat trace region.
- the portion touched by a human hand is painted black (however, black is a color for convenience, and the actual color may be a different color such as white). The user can recognize the portion as a heat trace area.
- a projector or the like may be used to project a binary image showing a heat trace area with respect to the shooting range in the environment.
- in this case, the heat trace image is projected onto the heat trace portions in the environment, so the places touched by a person can be conveyed directly to each person in the environment.
- steps S101 to S103 may be executed in parallel with steps S105 and subsequent steps.
- the background visible image and the background thermal image are updated periodically. Therefore, robustness against changes in the background over time can be expected to improve.
- according to the present embodiment, it is possible to improve the detection accuracy of places touched by a person. As a result, for example, it becomes possible to efficiently sterilize and disinfect places to which a virus such as the novel coronavirus may have adhered.
- the difference visible image is an example of the first difference image.
- the differential thermal image is an example of the second differential image.
- the heat trace region extraction unit 17 is an example of the extraction unit.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Multimedia (AREA)
- Spectroscopy & Molecular Physics (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Data Mining & Analysis (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
- Radiation Pyrometers (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Studio Devices (AREA)
Description
In the present embodiment, with the aim of helping to sterilize or disinfect viruses, a device, a method, and a program for detecting a place touched by a person using a thermal image are disclosed. Since humans are homeothermic animals whose limbs carry heat, when a person touches an object, heat remains at the touched place for a certain period of time. For example, a method of exploiting this heat trace to infer the passcode of a smartphone has been reported ("Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017").
The aim of the embodiment of the present invention will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram of a visible image and a thermal image taken at the same place at the same time; it shows schematic diagrams of images of a hand touching a door with a handle, taken simultaneously with a visible light camera and a thermal camera. (a) and (a') show time t1 (before the hand touches the door), (b) and (b') show time t2 (while the hand is touching the door), and (c) and (c') show the visible image and the thermal image at time t3 (after the hand touches the door). When a person touches the door, the temperature of the touched place rises as shown in FIG. 1(c').
Subsequently, the heat trace region output unit 18 outputs information indicating the extraction result of the heat trace region so that the user can confirm it (S109). For example, the heat trace region output unit 18 may output an image obtained by compositing the white pixels of the binary image indicating the heat trace region onto the background visible image. The output form is not limited to any particular form; for example, display on a display device, storage in the auxiliary storage device 102, or transmission to a user terminal via a network may be performed.
11 Visible image acquisition unit
12 Background visible image generation unit
13 Difference visible image generation unit
14 Thermal image acquisition unit
15 Background thermal image generation unit
16 Differential thermal image generation unit
17 Heat trace region extraction unit
18 Heat trace region output unit
21 Visible light camera
22 Thermal camera
100 Drive device
101 Recording medium
102 Auxiliary storage device
103 Memory device
104 CPU
105 Interface device
B Bus
Claims (7)
- 1. A heat trace region extraction method in which a computer executes: a difference visible image generation procedure for generating, for a visible image in which a certain range is captured, a first difference image with respect to a visible image of the background of the certain range; a differential thermal image generation procedure for generating, for a thermal image in which the certain range is captured, a second difference image with respect to a thermal image of the background; and an extraction procedure for extracting a heat trace region based on the first difference image and the second difference image.
- 2. The heat trace region extraction method according to claim 1, wherein the extraction procedure extracts, as the heat trace region, a region among the one or more difference regions indicated by the second difference image that is dissimilar to every one of the one or more difference regions indicated by the first difference image.
- 3. The heat trace region extraction method according to claim 1 or 2, wherein the computer further executes an output procedure for outputting information indicating, in a visible image of the certain range, the heat trace region extracted in the extraction procedure.
- 4. A heat trace region extraction device comprising: a difference visible image generation unit that generates, for a visible image in which a certain range is captured, a first difference image with respect to a visible image of the background of the certain range; a differential thermal image generation unit that generates, for a thermal image in which the certain range is captured, a second difference image with respect to a thermal image of the background; and an extraction unit that extracts a heat trace region based on the first difference image and the second difference image.
- 5. The heat trace region extraction device according to claim 4, wherein the extraction unit extracts, as the heat trace region, a region among the one or more difference regions indicated by the second difference image that is dissimilar to every one of the one or more difference regions indicated by the first difference image.
- 6. The heat trace region extraction device according to claim 4 or 5, further comprising an output unit that outputs information indicating, in a visible image of the certain range, the heat trace region extracted by the extraction unit.
- 7. A program that causes a computer to execute the heat trace region extraction method according to any one of claims 1 to 3.
Priority Applications (15)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/248,295 US20240019309A1 (en) | 2020-10-16 | 2020-10-16 | Remaining thermal trace extraction method, remaining thermal trace extraction apparatus and program |
| JP2022556819A JP7552711B2 (ja) | 2020-10-16 | 2020-10-16 | 熱痕跡領域抽出方法、熱痕跡領域抽出装置及びプログラム |
| PCT/JP2020/039126 WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | 熱痕跡領域抽出方法、熱痕跡領域抽出装置及びプログラム |
| JP2022556993A JP7552713B2 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| JP2022556991A JP7485071B2 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037676 WO2022080350A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037677 WO2022080351A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| US18/031,141 US12159414B2 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
| JP2022556992A JP7485072B2 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037678 WO2022080352A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| JP2022556990A JP7485070B2 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| US18/031,147 US20230412768A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
| US18/031,346 US20230377159A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
| PCT/JP2021/037679 WO2022080353A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| US18/031,341 US20230377165A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2020/039126 WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | 熱痕跡領域抽出方法、熱痕跡領域抽出装置及びプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022079910A1 true WO2022079910A1 (ja) | 2022-04-21 |
Family
ID=81208224
Family Applications (5)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/039126 Ceased WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | 熱痕跡領域抽出方法、熱痕跡領域抽出装置及びプログラム |
| PCT/JP2021/037679 Ceased WO2022080353A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037678 Ceased WO2022080352A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037676 Ceased WO2022080350A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037677 Ceased WO2022080351A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
Family Applications After (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/037679 Ceased WO2022080353A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037678 Ceased WO2022080352A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037676 Ceased WO2022080350A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
| PCT/JP2021/037677 Ceased WO2022080351A1 (ja) | 2020-10-16 | 2021-10-12 | 熱痕跡領域抽出装置、熱痕跡領域抽出方法及びプログラム |
Country Status (3)
| Country | Link |
|---|---|
| US (5) | US20240019309A1 (ja) |
| JP (5) | JP7552711B2 (ja) |
| WO (5) | WO2022079910A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024047807A1 (ja) * | 2022-08-31 | 2024-03-07 | Nippon Telegraph And Telephone Corporation | Threshold determination apparatus, method and program |
| JP7810276B2 (ja) | 2022-08-31 | 2026-02-03 | NTT, Inc. | Threshold determination apparatus, method and program |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7772240B2 (ja) * | 2022-09-06 | 2025-11-18 | NTT, Inc. | Background update apparatus, method and program |
| WO2024052973A1 (ja) * | 2022-09-06 | 2024-03-14 | Nippon Telegraph And Telephone Corporation | Background update apparatus, method and program |
| WO2024262050A1 (ja) * | 2023-06-23 | 2024-12-26 | YSK Co., Ltd. | Radio wave transmitter detection apparatus, radio wave transmitter detection method, and radio wave transmitter detection program |
| TWI898925B (zh) * | 2024-10-03 | 2025-09-21 | Nanya Technology Corporation | Temperature detection device and method |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017067503A (ja) * | 2015-09-28 | 2017-04-06 | Fujitsu Limited | Position estimation apparatus, position estimation method, and position estimation program |
| JP2017090277A (ja) * | 2015-11-11 | 2017-05-25 | Kyushu University | Grip information acquisition apparatus, robot teaching apparatus and robot control apparatus, and grip information acquisition method, robot teaching method and robot control method |
| WO2020027210A1 (ja) * | 2018-08-03 | 2020-02-06 | Nippon Telegraph And Telephone Corporation | Image processing apparatus, image processing method, and image processing program |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5354767B2 (ja) | 2007-10-17 | 2013-11-27 | Hitachi Kokusai Electric Inc. | Object detection apparatus |
| JP7024713B2 (ja) * | 2016-08-04 | 2022-02-24 | Sony Group Corporation | Image processing apparatus and image processing method |
| TWI637352B (zh) * | 2017-08-23 | 2018-10-01 | Wistron Corporation | Image processing apparatus and method |
| US12023415B2 (en) * | 2020-01-29 | 2024-07-02 | Safetyspect Inc | Inspection and sanitation device and method |
| US11615694B2 (en) * | 2020-05-05 | 2023-03-28 | Macondo Vision, Inc. | Clean surface sensor indicator and system |
- 2020
- 2020-10-16 WO PCT/JP2020/039126 patent/WO2022079910A1/ja not_active Ceased
- 2020-10-16 JP JP2022556819A patent/JP7552711B2/ja active Active
- 2020-10-16 US US18/248,295 patent/US20240019309A1/en not_active Abandoned
- 2021
- 2021-10-12 WO PCT/JP2021/037679 patent/WO2022080353A1/ja not_active Ceased
- 2021-10-12 JP JP2022556993A patent/JP7552713B2/ja active Active
- 2021-10-12 US US18/031,147 patent/US20230412768A1/en not_active Abandoned
- 2021-10-12 WO PCT/JP2021/037678 patent/WO2022080352A1/ja not_active Ceased
- 2021-10-12 WO PCT/JP2021/037676 patent/WO2022080350A1/ja not_active Ceased
- 2021-10-12 US US18/031,346 patent/US20230377159A1/en not_active Abandoned
- 2021-10-12 JP JP2022556990A patent/JP7485070B2/ja active Active
- 2021-10-12 JP JP2022556991A patent/JP7485071B2/ja active Active
- 2021-10-12 WO PCT/JP2021/037677 patent/WO2022080351A1/ja not_active Ceased
- 2021-10-12 US US18/031,341 patent/US20230377165A1/en not_active Abandoned
- 2021-10-12 US US18/031,141 patent/US12159414B2/en active Active
- 2021-10-12 JP JP2022556992A patent/JP7485072B2/ja active Active
Also Published As
| Publication number | Publication date |
|---|---|
| JP7552713B2 (ja) | 2024-09-18 |
| WO2022080350A1 (ja) | 2022-04-21 |
| JPWO2022079910A1 (ja) | 2022-04-21 |
| WO2022080352A1 (ja) | 2022-04-21 |
| JPWO2022080352A1 (ja) | 2022-04-21 |
| JP7485070B2 (ja) | 2024-05-16 |
| JPWO2022080351A1 (ja) | 2022-04-21 |
| US12159414B2 (en) | 2024-12-03 |
| US20230377159A1 (en) | 2023-11-23 |
| US20230384162A1 (en) | 2023-11-30 |
| JP7485072B2 (ja) | 2024-05-16 |
| JP7485071B2 (ja) | 2024-05-16 |
| US20240019309A1 (en) | 2024-01-18 |
| WO2022080351A1 (ja) | 2022-04-21 |
| US20230412768A1 (en) | 2023-12-21 |
| US20230377165A1 (en) | 2023-11-23 |
| JPWO2022080353A1 (ja) | 2022-04-21 |
| JP7552711B2 (ja) | 2024-09-18 |
| JPWO2022080350A1 (ja) | 2022-04-21 |
| WO2022080353A1 (ja) | 2022-04-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2022079910A1 (ja) | Heat trace area extraction method, heat trace area extraction apparatus and program | |
| JP5991224B2 (ja) | Image processing apparatus, image processing method, and image processing program | |
| JP2023052914A (ja) | Living body detection apparatus, living body detection method, and living body detection program | |
| CN107431761A (zh) | Image processing device, image processing method, and image processing system | |
| US20220100196A1 (en) | Smart Thermal Tracking to Guide Surface Sanitization | |
| WO2013121711A1 (ja) | Analysis processing apparatus | |
| JP2014157316A (ja) | Projector apparatus | |
| JP2020507863A5 (ja) | ||
| JP6516646B2 (ja) | Identification apparatus, identification method and program for identifying individual subjects from images captured by a plurality of cameras | |
| JP6194327B2 (ja) | Authentication system, authentication method and program | |
| JP2021152758A (ja) | Information processing system, information processing apparatus and information processing method | |
| JP6839116B2 (ja) | Learning apparatus, estimation apparatus, learning method, estimation method and computer program | |
| WO2017029841A1 (ja) | Image analysis apparatus, image analysis method, and image analysis program | |
| JP6255968B2 (ja) | Image processing apparatus, heat haze correction method and program | |
| JP2019101745A (ja) | Biometric image processing apparatus, biometric image processing method, and biometric image processing program | |
| Ko et al. | An efficient method for extracting the depth data from the user | |
| JP7197651B2 (ja) | Monitoring system for monitoring a surface, monitoring method, computer program product and cleaning system | |
| Wang et al. | Using Environmental Sensing to Measure Hand Hygiene Quality | |
| Shetty et al. | DETECTION OF FACE MASKS USING DEEP LEARNING | |
| JP2023046553A (ja) | Identification program, identification method and information processing apparatus | |
| Ruffin | A Stereoscopic and Deep Learning Approach for Enhanced Fever Detection Systems | |
| JP2015127910A (ja) | Color change detection apparatus, color change detection method and color change detection program | |
| Fuchs et al. | Poster: SmartLobby: Using a 24/7 Remote Head-Eye-Tracking for Content Personalization | |
| JP2023179903A (ja) | Image processing apparatus, image processing method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20957739; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022556819; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 18248295; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20957739; Country of ref document: EP; Kind code of ref document: A1 |