IL303207A - System and method for thermal breast cancer screening tests - Google Patents
- Publication number: IL303207A
- Authority: IL (Israel)
- Prior art keywords: thermal, cameras, sensor, camera, nir
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—By temperature mapping of body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0091—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for mammography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/43—Detecting, measuring or recording for evaluating the reproductive systems
- A61B5/4306—Detecting, measuring or recording for evaluating the reproductive systems for evaluating the female reproductive systems, e.g. gynaecological evaluations
- A61B5/4312—Breast evaluation or disorder diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Gynecology & Obstetrics (AREA)
- Reproductive Health (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Description
A system and method for thermal breast cancer screening
FIELD OF THE INVENTION
[Para 1] The present invention relates to the field of thermographic screening
for breast cancer diagnosis and more specifically to a thermal system (system)
and method designed for breast cancer screening without ionizing radiation
and body contact.
BACKGROUND OF THE INVENTION AND PRIOR ART
[Para 2] Few applications/patents describe means for thermographic
screening. In our opinion, the closest prior art is
US20210267463A1 (Higia, Inc., 2.9.2021), teaching a system and method that
includes: an exterior enclosure defining an interior examination volume; a
thermographic camera defining a field of view encompassing a target
examination region within the interior examination volume; and a set of
lighting elements configured to illuminate reference locations within the
interior examination volume.
[Para 3] The above system, including a computational subsystem, is configured
to, for each target pose in a series of target poses: illuminate a reference
location via the set of lighting elements, the reference location corresponding
to the target pose; prompt the patient (subject) to locate within the target
examination region and to orient relative to the reference location illuminated
via the set of lighting elements; and capture a thermographic image of the
subject in the target pose. The computational subsystem is further configured
to generate a diagnostic assessment for the subject based on the
thermographic image of the subject in each target pose.
[Para 4] The method for administering a thermographic examination is
executed at an examination cell including an exterior enclosure defining an
interior examination volume, a thermographic camera arranged within the
interior examination volume and defining a field of view encompassing a target
examination region within the interior examination volume, and a set of
lighting elements configured to illuminate a set of reference locations within
the interior examination volume.
[Para 5] The system and method disclosed hereunder are different and have
new and inventive features and advantages over the above prior art.
[Para 6] The thermal system and method described hereunder are designed
for breast cancer screening without ionizing radiation and without body
contact. The main advantages of the system are:
(1) A robust multi-modal imaging system is employed leveraging an array of
cameras and sensors to capture a comprehensive suite of data types, such
as Near-Infrared (NIR), thermal or Long Wave Infrared (LWIR), three-
dimensional (3D), and RGB imagery, all from various perspectives. This
system is built around an assembly of five thermal cameras that primarily
capture high-resolution LWIR data. Additionally, integrated 3D Time-of-
Flight cameras contribute detailed 3D images and NIR data. To further enrich
the array of data, optional RGB cameras can be utilized, which are seamlessly
integrated with the 3D cameras, providing invaluable information from the
visual spectrum. This comprehensive and integrated approach offers precise
depth assessment of the breast and potential areas of concern while
capturing thermal and NIR imaging data. This setup, of using multiple
cameras covering the entire area from multiple angles, allows optimal
thermal screening without repositioning the subject during the screening
while offering a comprehensive view of the breast's thermal, 3D and vascular
structure, further aiding in the detection of breast cancer.
(2) The cameras move toward or away from the subject to cover a specific
Region of Interest (ROI) based on the subject's characteristics. This allows
high-resolution skin coverage; e.g., if the subject is slim, the cameras
move forward, and if the subject is large, the cameras move backward to
maintain an optimal distance.
(3) By inducing a temperature difference through chest cooling via air over a
period, the body's temperature drops, and the recovery mechanism can be
utilized to assess the thermal recovery rates of various tissue types and
regions. This aids in identifying asymmetries and irregular patterns that may
indicate breast cancer. By recording live video, it is possible to monitor the
changes over time, which increases accuracy compared to a single image,
which cannot capture dynamic changes. Thermal cooling and recovery are
non-homogeneous, and their rates are not linear. Certain thermal changes are
very fast, hence having a view of a region of interest from multiple angles
simultaneously helps with continuous detailed monitoring of the whole
breast region.
(4) Using thermal image calibration methods to make sure all images are on
the exact same scale: gain and offset, with high sensitivity cameras.
(5) Using deep-learning analysis algorithms increases robustness and accuracy
compared to older classical methods when dealing with thermal
environmental effects and body-temperature effects, in addition to finding
features that describe the dynamic changes caused by the hot/cool protocol.
(6) Establishing a standardized and automatic procedure for thermal screening
with an optimal configuration. This entails the development of a
standardized approach to thermal screening that includes specific guidelines
for camera placement, calibration techniques, and optimal settings to ensure
accuracy and reliability.
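The gain-and-offset calibration mentioned in point (4) is, in essence, a two-point calibration against reference sources at known temperatures. A minimal sketch follows; the sensor counts and reference temperatures are illustrative, not the device's actual values:

```python
import numpy as np

def two_point_calibration(raw_cold, raw_hot, t_cold, t_hot):
    """Per-pixel gain/offset from two uniform-reference frames.

    raw_cold, raw_hot : raw sensor counts imaging the cold/hot references
    t_cold, t_hot     : known reference temperatures (deg C)
    Returns (gain, offset) such that temperature = gain * raw + offset.
    """
    gain = (t_hot - t_cold) / (raw_hot - raw_cold)
    offset = t_cold - gain * raw_cold
    return gain, offset

# Example: a 2x2 sensor with slightly different per-pixel responses
raw_cold = np.array([[100.0, 102.0], [98.0, 101.0]])
raw_hot = np.array([[300.0, 305.0], [295.0, 302.0]])
gain, offset = two_point_calibration(raw_cold, raw_hot, 20.0, 40.0)

raw_frame = (raw_cold + raw_hot) / 2   # a mid-range scene
temps = gain * raw_frame + offset      # calibrated, on one common scale
```

After calibration, every pixel of every camera reports on the same temperature scale, which is what allows frames from the five cameras to be compared directly.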
[Para 7] Furthermore, the thermal system is designed to overcome some of
the problems encountered in the prior art. For example, minimizing noise in
the image; obtaining accurate 3D measurements and optimal coverage of the
ROI from multiple angles during the entire dynamic screening, and ensuring a
high resolution of the skin thermal image, meaning that the number of pixels
per square cm of skin is high.
BRIEF DESCRIPTION OF THE FIGURES
[Para 8] Fig. 1 shows an illustration of a screening station with a chair, where
the screened subject sits.
[Para 9] Fig. 2 shows a block diagram of the screening Thermal Device (TD).
[Para 10] Fig. 3 shows an illustration of the TD.
[Para 11] Fig. 3a shows a closeup of a thermal camera including a sensor.
[Para 12] Fig. 3b shows an exploded view of a 3D camera-sensor.
[Para 13] Fig. 4 shows the distribution of cancer locations by breast regions.
[Para 14] Fig. 5 shows a breast with ptosis.
[Para 15] Fig. 6 shows image quality for different areas of the breast and
different camera angles.
[Para 16] Fig. 7 shows un-distortion correction.
[Para 17] Fig. 8 is a diagram showing the subject information station and
screening-system initialization flow.
[Para 18] Fig. 9 is a thermal-screening general flow diagram.
[Para 19] Fig. 10 shows the Artificial Intelligence (AI) model architecture.
[Para 20] Fig. 11 shows sensor positioning.
[Para 21] Fig. 11a shows sensor focus by rotating the lens.
[Para 22] Fig. 12 shows an AI model architecture.
[Para 23] Fig. 13a shows the camera close to the subject, horizontal rail, head
not visible.
[Para 24] Fig. 13b shows the camera far from the subject, horizontal rail, head
visible.
[Para 25] Fig. 13c shows the camera close to the subject, rail at an angle, head
not visible.
[Para 26] Fig. 13d shows the camera far from the subject, rail at an angle, head
not visible.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[Para 27] An embodiment is an example or implementation of the inventions.
The various appearances of "one embodiment," "an embodiment" or "some
embodiments" do not necessarily all refer to the same embodiments. Although
various features of the invention may be described in the context of a single
embodiment, the features may also be provided separately or in any suitable
combination. Conversely, although the invention may be described herein in
the context of separate embodiments for clarity, the invention may also be
implemented in a single embodiment.
[Para 28] Reference in the specification to "one embodiment", "an
embodiment", "some embodiments" or "other embodiments" means that a
particular feature, structure, or characteristic described in connection with the
embodiments is included in at least one embodiment, but not necessarily all
embodiments, of the inventions. It is understood that the phraseology and
terminology employed herein is not to be construed as limiting and are for
descriptive purpose only.
[Para 29] The present invention relates to the field of breast cancer diagnosis.
The thermal system is designed for breast cancer screening without radiation
and without body contact and may serve the diagnosis of any disease which
potentially affects metabolic and vascular functionality within the body (e.g.,
inflammation).
[Para 30] The goal of the disclosed system and method is:
1. Facilitating optimal thermal image acquisition from all angles simultaneously,
without necessitating the repositioning of the subject during the screening
process and eliminating potential bias from the device operator by utilizing
software assistance for accurately positioning the sensors relative to the
subject for capturing high-resolution images.
2. Minimizing noise within the acquired images, ensuring greater clarity and
accuracy.
3. Obtaining precise 3D measurements, as well as accurately matching the 2D
thermal image onto the 3D point cloud, thus integrating depth data with
thermal signals.
4. Integrating near-infrared signals, which are used to detect blood vessels, and
effectively aligning them with the associated thermal signals.
5. Ensuring a high resolution of the skin thermal image, signifying a substantial
number of pixels per square centimeter of skin, irrespective of the subject's
size, by strategically moving the cameras toward or away from the subject in
a manner that ensures the face remains excluded from the captured images.
Setup without repositioning the subject is optimal.
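Goal 3, matching the 2D thermal image onto the 3D point cloud, amounts to projecting each 3D point through the thermal camera's model and sampling the thermal frame at the resulting pixel. A minimal sketch assuming an ideal pinhole camera with known intrinsics K and extrinsics R, t; a real system would also need lens-distortion correction and multi-camera registration:

```python
import numpy as np

def sample_thermal_on_points(points, thermal, K, R=np.eye(3), t=np.zeros(3)):
    """Attach a thermal value to each 3D point by projecting it into the
    thermal camera (pinhole model, nearest-neighbour pixel lookup)."""
    cam = points @ R.T + t                   # world -> camera frame
    z = cam[:, 2]
    u = K[0, 0] * cam[:, 0] / z + K[0, 2]    # pixel column
    v = K[1, 1] * cam[:, 1] / z + K[1, 2]    # pixel row
    ui = np.clip(np.round(u).astype(int), 0, thermal.shape[1] - 1)
    vi = np.clip(np.round(v).astype(int), 0, thermal.shape[0] - 1)
    return thermal[vi, ui]

# Toy example: two points in front of the camera, a 4x4 thermal frame
K = np.array([[2.0, 0.0, 2.0],
              [0.0, 2.0, 2.0],
              [0.0, 0.0, 1.0]])
thermal = np.arange(16.0).reshape(4, 4)
points = np.array([[0.0, 0.0, 1.0],      # on the optical axis
                   [-1.0, -1.0, 1.0]])   # toward the image corner
vals = sample_thermal_on_points(points, thermal, K)
```

The returned values give each point of the cloud a temperature, integrating the depth data with the thermal signal as the goal describes.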
[Para 31] One of the methods for scanning the subject would be to move one
or multiple cameras around the subject to record the skin temperature
distribution from different angles. This approach is beneficial as it allows the
use of a smaller number of cameras and enables the understanding of the 3D
structure of the object from the collected multi-angle data. The camera
movement may be performed in two ways.
1. The first way is to move the camera continuously around the subject back
and forth. The main drawback of this method is the motion blur effect which
reduces the quality of the images collected. For this reason, a setup with fixed
cameras is more beneficial.
2. The second way is to overcome the above effect, by moving the camera and
making several stops around the subject. For example, instead of using
fixed cameras, it is possible to move one camera from one position to another
and wait at each position for 1 second to collect unblurred data, and then
move to the next stop. Assuming that it is possible to move the camera fast
enough from one position to another in 1 second and spending 1 second at
each position out of five, it would require 9 seconds (5 sec + 4 sec) to move
the camera from left to right. Then moving the camera back into initial
position requires another 4 seconds. So, one cycle would take 13 seconds.
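The timing above can be reproduced directly: with five stops, a 1-second dwell at each, and 1 second per move, a forward sweep takes 9 seconds and the return (no dwells) a further 4:

```python
def scan_cycle_seconds(n_stops, dwell_s=1.0, move_s=1.0):
    """Total time for one sweep of a single moving camera.

    The forward pass dwells at every stop; the return pass moves
    straight back to the initial position without stopping."""
    forward = n_stops * dwell_s + (n_stops - 1) * move_s  # 5 + 4 = 9 s
    back = (n_stops - 1) * move_s                         # 4 s
    return forward + back

cycle = scan_cycle_seconds(5)  # the 13-second cycle from the text
```

A 13-second cycle means each skin region is revisited only every 13 seconds, which is the core argument for preferring five fixed cameras over one moving camera.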
[Para 32] The temperature changes, at the beginning and immediately
following the stress test, are rapid, which makes a comprehensive
understanding of each skin area's dynamics during the evaluation challenging.
However, the use of five static camera-sensors enables us to capture these
swift temperature transitions across all skin regions effectively. Consequently,
this allows us to create models that take these dynamics into account.
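The recovery dynamics captured by the static cameras can be summarized, for example, by fitting an exponential return-to-baseline model per region. This is a sketch under two simplifying assumptions not stated in the text: the pre-cooling baseline temperature is known, and recovery follows a single time constant:

```python
import numpy as np

def recovery_rate(times, temps, t_baseline):
    """Estimate the thermal recovery time constant tau (seconds) for one
    skin region, assuming exponential return to a known baseline:
        T(t) = t_baseline - dT * exp(-t / tau)
    Linearised: log(t_baseline - T) = log(dT) - t / tau."""
    y = np.log(t_baseline - np.asarray(temps, float))
    slope, _ = np.polyfit(np.asarray(times, float), y, 1)
    return -1.0 / slope

# Synthetic series: tau = 30 s, skin cooled 4 degC below a 34 degC baseline
t = np.arange(0.0, 60.0, 5.0)
temps = 34.0 - 4.0 * np.exp(-t / 30.0)
tau = recovery_rate(t, temps, 34.0)
```

Regions whose fitted time constants differ markedly between the left and right sides are exactly the asymmetries and irregular patterns the screening looks for.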
3D structure measurements on top of thermal signal
[Para 33] The knowledge of 3D structure is very beneficial for this analysis:
- It allows for a more accurate estimation of asymmetry between the left and
right breast, as the 3D structure of each breast can be precisely matched
and compared. By understanding the 3D shape of each breast, it is possible
to better understand how the size and shape of one breast differs from the
other, and this information can be used for estimating asymmetry more
accurately between the two breasts.
- It can significantly improve the precision of breast/non-breast
segmentation. The depth information that is provided by 3D structure
measurements is used to define the borders of the breast, making it easier
to differentiate more clearly between breast tissue and surrounding tissue.
This can lead to more accurate segmentation results and a better
understanding of the breast tissue overall.
- Furthermore, in cases where a spot has been detected as suspicious for
cancer, the 3D structure aids in determining the depth of the tumor. The
curvature of the area where the spot is located is considered to play a
crucial role in determining the depth of the tumor.
- Emissivity of the skin is not uniform in all directions. With 3D information
it is possible to determine the skin orientation and make necessary
corrections to the measured thermal signal.
Near infrared signal on top of long infrared signal
[Para 34] Near-Infrared (NIR) imaging provides valuable information in
addition to Long Wave Infrared (LWIR) imaging and aids in the detection of
vessel structures which are crucial for breast cancer detection. There are
several reasons why NIR images help to detect vessel structures and improve
breast cancer detection:
[Para 35] Tissue penetration and differentiation: Near-Infrared (NIR) imaging
takes advantage of the optical window in the biological tissue where light
absorption is relatively low, generally in the wavelength range of 650 to 950
nm (known as the first NIR window or NIR-I). This allows NIR light to penetrate
deeper than visible light, making it possible to visualize structures beneath
the skin.
[Para 36] The specific absorption characteristics of different tissues within this
range make it easier to distinguish between them and create a contrast effect.
For example, the two primary absorbers of NIR light in tissue are oxygenated
and deoxygenated haemoglobin in the blood, and water.
Different tissues (like muscles, blood vessels, and fat) will vary in their relative
composition of these absorbers, leading to different absorption and scattering
characteristics.
[Para 37] As a result, the presence of blood vessels can be more easily
identified as they will show up as areas of increased absorption. This helps in
detecting abnormal vessel networks that are indicative of tumor growth.
[Para 38] Reduced scattering: The NIR wavelength range is less prone to
scattering than visible light, resulting in clearer images of the breast tissue
and vessels. This improves image quality and leads to more accurate detection
and localization of abnormalities, such as cancerous tumors.
[Para 39] Multispectral analysis: Combining NIR and LWIR imaging allows for
a more comprehensive analysis of the breast tissue. The LWIR imaging
provides information on temperature variations, which may indicate an
increased metabolic activity in cancerous tissue, while NIR imaging reveals the
presence of abnormal vessel structures. The combination of these two imaging
modalities improves the specificity and sensitivity of breast cancer detection.
[Para 40] To conclude, NIR imaging alongside 3D structure measurements and
thermal imaging significantly enhances the detection of vessel structures and
the overall accuracy of breast cancer detection. The ability to visualize and
quantify abnormal vessel networks and tumor depth, as well as the improved
image quality and multispectral analysis, all contribute to the enhanced
effectiveness of breast cancer detection.
Resolution of the image:
[Para 41] The accurate measurement of the heat pattern is fundamental to the
system's functioning. Therefore, achieving a high degree of resolution and
minimal noise is critical for this process. This prompts the question of how
many cameras would be required to ensure a comprehensive and uniform
pixel distribution across the entire breast.
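The pixel-density question can be answered per camera from its sensor resolution, field of view, and working distance (a flat surface facing the camera is assumed; the resolution and lens figures below are illustrative, not the actual sensors'):

```python
import math

def pixels_per_cm2(h_pixels, v_pixels, hfov_deg, vfov_deg, distance_cm):
    """Pixel density on a flat surface facing the camera at a given
    distance, derived from sensor resolution and field of view."""
    width_cm = 2.0 * distance_cm * math.tan(math.radians(hfov_deg / 2.0))
    height_cm = 2.0 * distance_cm * math.tan(math.radians(vfov_deg / 2.0))
    return (h_pixels * v_pixels) / (width_cm * height_cm)

# Illustrative: a 640x480 LWIR core with a 45x34 degree lens at 50 cm
density = pixels_per_cm2(640, 480, 45.0, 34.0, 50.0)
```

Density falls with the square of distance and with surface tilt away from the camera, which is why several cameras at different angles, each moved to an optimal distance, are needed for uniform coverage.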
[Para 42] The frontal camera 103, including sensor 203, captures high-
resolution images of the frontal side of the breast, but the left, right and lower
sides are not visible (for the distribution of cancer locations, see Fig. 4).
[Para 43] Cancer locations are frequently found in the nipple area, which may
be difficult to capture with thermal frontal camera 103 (including sensor 203)
for women with ptosis, where the nipple is directed downwards (see fig. 5).
For this reason, in Thermal Device (TD) 100 there are at least two cameras 102
& 104 (including sensors 203) positioned facing the right and left breasts from
a 45-degree angle below main frontal camera 103 (see Fig. 3).
[Para 44] As depicted in Figure 3, a configuration utilizing five cameras
(inclusive of sensors 203) permits a thorough temperature evaluation of the
entire breast, including the neck and axillary region, from the front, sides,
and underside. This arrangement is meticulously designed to ensure
expansive coverage of the breast, chest, armpits and neck, leaving no zone
unchecked. This comprehensive monitoring is crucial for precise temperature
measurement and the identification of any potential areas of concern.
[Para 45] In one embodiment, the system includes a Thermal Device (TD) 100
that comprises frontal camera 103 including a 3D thermal depth sensor 203;
two cameras 102 & 104, each including a depth sensor 203, positioned at a
45-degree angle from below and a 45-degree angle from the side; two cameras
101 & 105, each including a depth sensor 203, positioned at 80-90-degree
angles from the sides on the same level as frontal camera 103; and fans 106.
[Para 46] In other embodiments the TD comprises a distinct combination of
sensors 101-105, capable of acquiring data ranging from 4-dimensional to 8-
dimensional. The 8 dimensions come from adding up the dimensions of each
type of data: 3 from colour (RGB), 3 from the 3D structure, 1 from heat (LWIR),
and 1 from NIR. However, depending on the application's needs, other
configurations may be used. For example, a camera system can be constructed
using only thermal and 3D data, which would result in a 4D model.
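The dimensional accounting above can be made concrete as a per-pixel channel stack, assuming all modalities have already been registered to a common pixel grid (shapes are illustrative):

```python
import numpy as np

H, W = 4, 4                   # tiny frame for illustration
rgb = np.zeros((H, W, 3))     # 3 channels: colour (RGB)
xyz = np.zeros((H, W, 3))     # 3 channels: 3D point per pixel
lwir = np.zeros((H, W, 1))    # 1 channel: heat (LWIR)
nir = np.zeros((H, W, 1))     # 1 channel: near-infrared

# The "8D" frame is the per-pixel concatenation of all four modalities
frame_8d = np.concatenate([rgb, xyz, lwir, nir], axis=-1)

# Dropping RGB and NIR gives the 4D (thermal + 3D) variant from the text
frame_4d = np.concatenate([xyz, lwir], axis=-1)
```

The "dimension" of a configuration is thus simply the channel count of the fused frame, which downstream AI models consume as a single tensor.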
[Para 47] In another embodiment, a 5D thermal camera system is used,
comprising:
a) a long-wave infrared (LWIR) sensor configured to detect and capture
long-wave infrared radiation emitted by objects within a scene.
b) a 3D depth sensor configured to measure the distance between the
camera and objects within the scene based on the time it takes for emitted
light pulses to return to the sensor after reflecting off said objects.
c) a near-infrared (NIR) sensor configured to detect and capture near-
infrared radiation reflected by objects within the scene.
d) a video streaming module configured to process and transmit captured
data from the LWIR, 3D, and NIR sensors in real-time, enabling live video
streaming.
e) an integration module configured to enable communication and data
exchange between the 5D thermal camera and other external sensor
systems as part of a multiple sensor system, wherein the combination of
LWIR, 3D, and NIR sensors provides a comprehensive and dynamic five-
dimensional representation of the scene, allowing for enhanced situational
awareness and improved analysis of environmental conditions.
[Para 48] The 5D imaging system further comprises an integrated machine
learning module, wherein said module utilizes artificial intelligence algorithms
for real-time object detection, classification, and tracking, enhancing the
system's ability to recognize and analyse complex scenes, and facilitating
various computer vision applications such as autonomous navigation, security
and surveillance, and advanced human-computer interaction.
[Para 49] The 5D imaging system further comprises an adaptive fusion
mechanism, wherein said mechanism dynamically adjusts the weights
assigned to each imaging modality based on scene characteristics, ambient
conditions, or specific application requirements, optimizing the output video
stream for improved clarity, accuracy, and contextual information.
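One simple way to realize such an adaptive fusion mechanism is a per-modality weighted sum whose weights are renormalized from scene-dependent scores; the softmax scoring below is an illustrative choice, not the claimed mechanism:

```python
import numpy as np

def fuse(modalities, scores):
    """Weighted fusion of registered single-channel modality images.

    `scores` are unnormalised quality/relevance scores (e.g. produced by
    scene analysis); a softmax turns them into weights summing to 1."""
    w = np.exp(scores - np.max(scores))   # numerically stable softmax
    w = w / w.sum()
    stack = np.stack(modalities, axis=0)  # (n_modalities, H, W)
    return np.tensordot(w, stack, axes=1) # weighted sum over modalities

lwir = np.full((2, 2), 0.8)
nir = np.full((2, 2), 0.2)
fused = fuse([lwir, nir], scores=np.array([0.0, 0.0]))  # equal weights
```

Raising one modality's score (say, LWIR at night, when RGB carries little information) smoothly shifts the fused stream toward that modality without any hard switching.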
[Para 50] The 5D imaging system further comprises a modular design, allowing
for the interchangeability and upgradability of individual sensor components
or the addition of new imaging modalities, thereby enabling customization
and adaptation of the system to meet specific use-case demands or to
accommodate advances in sensor technology.
[Para 51] The processing unit of the 5D imaging system is further configured to
perform real-time image enhancement techniques, such as noise reduction,
contrast stretching, and edge sharpening, on the LWIR, NIR, and 3D sensor
data prior to alignment and merging, thereby improving the quality and
reliability of the resulting 5D video stream.
[Para 52] The 5D imaging system is utilized for non-invasive medical
diagnostics, allowing healthcare professionals to visualize and analyse surface
temperature variations and blood perfusion in the human body, aiding in the
detection of inflammation, infection, or other abnormalities.
[Para 53] The 5D imaging system is utilized for monitoring treatment response
in cancer patients by not only providing comprehensive and non-invasive
assessment of tumour characteristics, such as size, shape, and
vascularization, but also evaluating various physiological parameters and
biomarkers, including body temperature, blood perfusion, inflammation rates
and metabolic activity by integrating data from the LWIR, NIR, and 3D sensors
with additional diagnostic information.
[Para 54] This comprehensive monitoring approach enables healthcare
professionals to track and evaluate the effectiveness of therapeutic
interventions, monitor patients' overall health status, and adjust treatment
plans accordingly for optimized patient outcomes.
[Para 55] The method of compressing and encoding the 5D video stream
generated by the imaging system employs specialized algorithms and data
structures designed to exploit redundancies and correlations within and
between the different imaging modalities, resulting in a compressed video
stream that retains essential information while reducing storage and
transmission bandwidth requirements.
[Para 56] In yet another embodiment an 8D imaging system is used. The 8D
system is constructed by adding an RGB camera to the 5D system, which
comprises an LWIR sensor for capturing long-wave infrared images, a NIR
sensor for capturing near-infrared images, and a 3D sensor for obtaining depth
information, together with a processing unit configured to process and align
the data acquired from the LWIR, NIR, 3D, and RGB sensors.
[Para 57] A communication interface transmits the 8D video stream
generated from the aligned and merged data of the LWIR, NIR, 3D, and RGB
sensors. The 8D imaging system allows for enhanced scene understanding
and analysis by combining the complementary imaging modalities into a
single, coherent, and integrated video stream.
[Para 58] This system can be utilised in various applications, for example
adapted for installation on unmanned aerial vehicles (UAVs), such as drones,
allowing for enhanced remote monitoring and data collection in a variety of
environments and applications.
[Para 59] The UAV-mounted 8D imaging system can be employed for
purposes such as environmental monitoring, wildlife observation, search and
rescue operations, and infrastructure inspection, providing comprehensive
and multi-dimensional data that combines LWIR, NIR, RGB and 3D information
to enable more informed decision-making and improved outcomes in each
respective field and improved visibility under varying conditions.
[Para 60] The fusion of multiple imaging modalities allows the system to
perform well under different lighting conditions and in challenging
environments, such as low-light, fog, or smoke, by leveraging the strengths
of each imaging type.
[Para 61] As presented in fig.3b, a camera comprises an RGB sensor 201,
capturing high-resolution images with accurate colour reproduction, thereby
representing the scene authentically.
[Para 62] The camera further incorporates a Time-of-Flight (ToF) depth
module, consisting of a Near-Infrared (NIR) emitter (202), sensor (203), an
optics system (204), and a computation unit (205). The NIR emitter transmits
light, which, upon reflection from objects in the scene, is captured by the NIR
sensor.
[Para 63] Optics system 204 focuses this reflected NIR light onto the sensor.
Computation unit 205 processes the time taken for this light to travel back
and forth, thereby calculating distance measurements. This data enables the
creation of a detailed 3D representation of the scene, while simultaneously
producing a NIR image.
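As a hedged illustration of the Time-of-Flight principle described above (the function name and sample values are hypothetical; the real computation unit 205 performs this per pixel in hardware):

```python
# Illustrative sketch of the ToF distance computation described above.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to an object from the measured round-trip time of a NIR pulse.

    The emitted pulse travels to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after about 6.67 nanoseconds corresponds to roughly 1 metre.
print(round(tof_distance(6.67e-9), 3))  # → 1.0
```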
[Para 64] Thermal cameras (101-105) produce far-infrared images. Each
detects infrared radiation to generate a 'heat map' of the scene, revealing
temperature distribution and anomalies.
[Para 65] Infrared lens (206) of thermal cameras (101-105) focuses incoming
infrared radiation onto thermal detector 207. The lens is made of materials
such as germanium, which are transparent to infrared light. Thermal Detector
(207) is composed of a grid of pixels whose electrical resistance changes with
temperature; common materials include Vanadium Oxide or Amorphous
Silicon.
[Para 66] Signal Processing Electronics (208) amplifies the signal, reduces
noise, and converts the signal to digital format for image creation.
[Para 67] Signal Processing Module (209) includes a computational module
enabling simultaneous processing and merging of data streams from all
sensors. This module synchronizes data, resulting in comprehensive, multi-
dimensional scene representation and turns the output from all detectors into
a form usable by external devices.
[Para 68] The camera's high-precision components are housed within a
durable body 210, safeguarding against external elements, facilitating user-
friendly operation, and ensuring accessibility for all user proficiency levels.
[Para 69] To reconstruct the 3D shape of the breast with high precision, the
3D sensors 203 within cameras 101-105 are used. The skin is scanned with
high resolution over the entire breast.
[Para 70] The cameras 101-105 are positioned on arms 107 of TD 100 and are
directed at the subject in the centre of the multiple angles (see fig.1). TD 100
has a degree of freedom allowing it to adjust to the subject, based on her
individual characteristics. This allows for a fixed thermal acquisition, thereby
removing any bias from the technician who performs the screening.
[Para 71] TD 100 is easily operated by a certified technician/nurse during the
screening process. Thermal sensors capture synchronised thermal videos of
subject’s chest area over time to analyse the thermal recovery rates of
different areas and tissue types. Sensors 203 enhance the accuracy of thermal
signals processed by adding information about the depth of the signal and
helping to reconstruct the 3D model of the thermal map.
[Para 72] Furthermore, the blood vessel modeling is enriched by incorporating
near-infrared signal acquisition, which is aligned with thermal and depth
signals.
Screening Process:
[Para 73] The screening process includes 3 phases in a row:
- The first minutes are for acclimatisation, during which the Sensors capture
  the body calming down and reaching equilibrium with the room
  temperature.
- Then the cooling phase starts, which automatically switches on fans 106
  mounted on TD 100. Fans 106 blow air towards the body's chest area to
  cool it down by several degrees Celsius. This creates contrast that highlights
  some of the blood vessels. When fans 106 automatically switch off, the
  stabilisation phase starts.
- During the stabilisation phase, which takes a few minutes, the Sensors
  capture the recovery of the body temperature from the cool state to the
  normal state, to further analyse the recovery rates of different body areas
  and tissue types.
[Para 74] The contrast fades away but different tissue types have different
recovery rates. Some areas recover faster than others. With screening it is
possible to capture the entire dynamic continuum and construct a dynamic
thermal map, which is further analysed to identify asymmetries and anomalies
in those patterns that indicate disease.
[Para 75] After the screening is completed, the captured signals including
thermal, depth, and NIR streams are automatically uploaded to the system for
further Artificial Intelligence (AI) analysis to find metabolic abnormalities
related to breast cancer.
Camera Positioning on arms:
[Para 76] VFOV is a Vertical Field of View of thermal camera, HFOV is a
Horizontal Field of View.
[Para 77] In one embodiment, Sensors 203, and rails 107 are positioned on TD
100 in the following way:
• optical axis of the top Sensors is horizontal;
• optical axis of the bottom Sensors is at 45°;
• The rail of the top Sensors is at a vertical angle = VFOV/2;
• The rail of the bottom Sensors is at an angle = 45° + VFOV/2;
• In addition, for the top left/right cameras 101 & 105 the rail additionally
rotates to the left/right in the horizontal plane with respect to the
direction of the optical axis by an angle of HFOV/2 (see figs 11 & 11a).
[Para 78] This is done to achieve the minimum possible movements necessary
to adapt the camera positions to the screening subject size.
[Para 79] This setup also guarantees that the woman's head is never visible
from any possible position of the cameras. This is the only setup that
achieves such a result with only one degree of freedom per camera.
[Para 80] Figs 13c-d demonstrate this effect for one top camera in comparison
to situation in Figs 13a-b when the rail is horizontal.
[Para 81] With a horizontal rail the camera may be adjusted so that the upper
boundary of its FOV falls on the neck of the screened subject. If the next
screened subject is larger, the cameras move back to see the ROI. The problem
is that the head becomes visible, and since the head is outside the ROI, it is
not desirable to spend pixels of the image on the head. Privacy is another
reason for limiting the ROI so that it does not include the head.
[Para 82] If the rail is directed at the VFOV/2 angle, and if the subject's head
is always at the same height from the floor for all subjects, then this rail
angle guarantees that for every position of the camera the top boundary of the
field of view is always at neck level and never higher.
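The invariance claimed above can be checked numerically. This is only a sketch: the field-of-view value, starting coordinates, and the assumption that the rail descends at VFOV/2 as the camera moves away from the subject are illustrative, not taken from the patent.

```python
import math

VFOV_DEG = 40.0                       # hypothetical vertical field of view
a = math.radians(VFOV_DEG / 2.0)      # rail inclination = VFOV/2

def top_boundary_height(slide: float, x0: float = 1.0, y0: float = 1.2) -> float:
    """Height at which the upper FOV ray crosses the subject plane (x = 0).

    The camera starts at (x0, y0) and slides a distance `slide` along a rail
    that descends at angle VFOV/2 as the camera moves away; its optical axis
    stays horizontal, so the upper FOV ray climbs at +VFOV/2.
    """
    x_cam = x0 + slide * math.cos(a)   # farther from the subject
    y_cam = y0 - slide * math.sin(a)   # lower, following the inclined rail
    return y_cam + x_cam * math.tan(a)

# The top FOV boundary height is identical for every camera position,
# because the drop along the rail cancels the climb of the upper FOV ray.
print(round(top_boundary_height(0.0), 6) == round(top_boundary_height(0.5), 6))  # → True
```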
[Para 83] The same principle works for the lower cameras 102 & 104; there
the rail must be rotated with respect to the optical axis by an angle of
45° + VFOV/2.
[Para 84] For the top left camera 105, with rail 107 at a horizontal angle of
HFOV/2, the same effect is achieved for the head always being at the top
border of the frame, provided the chair is always at the left border of the frame.
[Para 85] In this way the system guarantees that the head of the screened
subject is never visible from any position of the camera, preserving privacy and
ensuring optimal ROI capturing for different body sizes with minimum number
of movements of camera.
Thermal sensor calibration for accurate thermal acquisition
[Para 86] Intrinsic calibration: un-distortion correction
A lens might suffer from a distortion effect (the red and blue images), while
the goal is to correct the distortion and obtain the green image, in which all
the lines are straight (see fig.7). This eliminates lens artifacts and keeps each
image of each camera uniform (see https://purveyoroflight.com/blog/correcting-
lens-distortion-in-adobe-lightroom).
[Para 87] In order to eliminate lens artifacts, intrinsic parameters are
calculated with a chessboard optimized for thermal cameras (black is hot/high
emissivity, and white is cold/low emissivity). These parameters are fixed for
each lens.
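A minimal sketch of the un-distortion step is given below, using the common Brown-Conrady radial model. The coefficient values, function name, and the fixed-point solver are hypothetical illustrations; in practice the per-lens parameters are estimated from the thermal chessboard target described above.

```python
# Hypothetical sketch of radial un-distortion (Brown-Conrady model).
def undistort_point(xd, yd, k1, k2, cx=0.0, cy=0.0):
    """Map a distorted normalized image point back toward its ideal position.

    Fixed-point iteration of x_u = x_d / (1 + k1*r^2 + k2*r^4), where r is
    the radius of the current estimate around the principal point (cx, cy).
    """
    x, y = xd - cx, yd - cy
    for _ in range(5):                      # a few iterations suffice for mild distortion
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = (xd - cx) / scale, (yd - cy) / scale
    return x + cx, y + cy

# With zero coefficients the point is unchanged (straight lines stay straight):
print(undistort_point(0.3, 0.4, k1=0.0, k2=0.0))  # → (0.3, 0.4)
```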
Thermal camera stabilization (fig. 8).
[Para 88] When uncooled TD 100 is turned on, it takes several minutes for the
Sensor's temperature to stabilize (step 3). Once stabilized, the measurement
drift is slow, and the non-uniformity of the image is stable for a long time.
However, if the environment changes during screening, it might affect
stabilization, so the change must be measured and compensated in real time.
Therefore, stabilization measurement is required.
[Para 89] The stabilization measurement is achieved as follows:
a. Predefined ROI, which is a known temperature marker with emissivity >
0.95 that was added to the device’s chair, is found in the image.
b. The median digital level over X sec is measured.
c. If the median is less than the stabilization threshold, then the camera is
stable, otherwise a 1-minute wait is required and then, the process is
repeated.
d. If there is a spike during the measurement (something moved in front of
the camera) the measurement must be restarted or ignored.
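Steps a-d above can be sketched as a single measurement-window check. The readings, thresholds, and return values here are hypothetical placeholders for the device's real marker-ROI digital levels:

```python
import statistics

def stability_check(marker_levels, stable_threshold, spike_limit):
    """One measurement window over the high-emissivity temperature marker ROI.

    Returns "restart" if a spike invalidates the window (step d: something
    moved in front of the camera), otherwise "stable"/"unstable" per step c.
    """
    med = statistics.median(marker_levels)
    if any(abs(v - med) > spike_limit for v in marker_levels):
        return "restart"                      # step d: spike during measurement
    return "stable" if med < stable_threshold else "unstable"   # step c

print(stability_check([10, 11, 10, 12], stable_threshold=20, spike_limit=5))  # → stable
print(stability_check([10, 11, 90, 12], stable_threshold=20, spike_limit=5))  # → restart
```

Per Para 90, the same logic could take the thermal sensor's own temperature readings in place of the marker's median digital level.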
[Para 90] An alternative approach for stabilization measurement entails using
the temperature of the thermal sensor in lieu of the median digital level. In
this case the above logic would be applied identically.
Image uniformity check
[Para 91] An uncooled camera's image is influenced by thermal changes in the
environment, which cause thermal non-uniformity of the image. Thermal non-
uniformity means that if the camera observes a uniform temperature (like a
black body), the digital levels of the pixels corresponding to that temperature
are not equal across the image. Before screening, the goal is to get a uniform image.
[Para 92] The assumption is that the digital level differences are caused by an
offset. Therefore, Non-Uniformity Correction (NUC) is performed as a standard
step in thermal cameras as follows: the shutter is closed (uniform temperature),
and then each pixel's offset from the average digital level of the image is
subtracted. To make sure that it was performed correctly (meaning that the
shutter was fully closed), the following algorithm applies:
- Immediately after NUC, the camera's shutter is closed again.
- The Standard Deviation (STD) of the image's pixel intensities is checked.
- If the STD is smaller than the uniformity threshold, the image is uniform.
  Otherwise, NUC and the STD check are repeated as described above.
Gain factor and offset calibration for all cameras.
[Para 93] Uniform gain and offset across all cameras, at all times, are
essential for uniform data analysis. Therefore, the goal is to normalize the
gain and offset of all cameras (any remaining difference will be at the noise level).
[Para 94] The equation of a linear line is Y = mX + b where m is the slope (or
multiplier of the line) and b is the offset (or the y-intercept of the line).
[Para 95] In a thermal camera, the slope is the gain of each pixel: digital level
per temperature difference. The offset is the digital level at a specific temperature.
[Para 96] Therefore, gain calculation requires two different temperatures,
while offset requires only one. When the image is uniform (after NUC), two
objects with known temperatures and high emissivity in the FOV are needed to
calculate the gain and the offset and normalize them according to the linear
line equation. The algorithm is as follows:
- The uniformity check must pass.
- Find the two ROIs with the thermometers, which have predefined different
  temperatures, in the FOV.
- Calculate the gain of each camera (from the difference between the
  temperatures).
- Normalize all cameras' gains to be the same by multiplying by a per-camera
  factor.
- Add an offset to each camera by setting one temperature to be equal for all
  of them (subtract the offset for each camera).
- Once calibrated, the calibration gain and offset parameters of each camera
  are fed to the recording mechanism, to make sure the recorded data of all
  cameras is equally normalized.
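The two-point normalization above follows directly from the line Y = mX + b. The digital levels, temperatures, and reference line below are hypothetical, chosen only to show two cameras agreeing after normalization:

```python
def camera_gain_offset(dl_cold, dl_hot, t_cold, t_hot):
    """Per-camera slope and intercept of Y = m*X + b, where X is temperature
    and Y is the camera's digital level at the two known-temperature markers."""
    m = (dl_hot - dl_cold) / (t_hot - t_cold)   # gain: digital level per degree
    b = dl_cold - m * t_cold                    # offset: digital level at 0 degrees
    return m, b

def normalize(dl, m, b, m_ref=1.0, b_ref=0.0):
    """Map a camera's digital level onto a common reference line so all
    cameras report the same value for the same temperature."""
    temperature = (dl - b) / m                  # invert this camera's line
    return m_ref * temperature + b_ref          # re-apply the reference line

# Two cameras observing the same 30 °C target report the same normalized value:
m1, b1 = camera_gain_offset(dl_cold=1000, dl_hot=1400, t_cold=20.0, t_hot=40.0)
m2, b2 = camera_gain_offset(dl_cold=2000, dl_hot=2600, t_cold=20.0, t_hot=40.0)
print(normalize(1200, m1, b1) == normalize(2300, m2, b2))  # → True (both are 30 °C)
```

Because the temperature markers stay in the FOV during screening (Para 97), this per-camera line can in principle be re-estimated on every frame.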
[Para 97] Since these temperature markers are in the FOV during screening, it
is possible to perform this calibration for each frame during screening.
[Para 98] The method of preparing the subject for the actual screening, and
the method of deriving results, are described in fig 8. The operator of the
screening is the "system user", meaning a certified technician or clinical staff.
Step 1: Inserting subject information (PI) into the PI station. eCRF is an
electronic Case Report Form, meaning the subject information station 109;
Step 2: Initializing TD 100 for the subject screening;
Step 3: System calibration;
Step 4: Screening with the Thermal Device;
Step 5: Saving the results in the cloud together with PI. TM cloud means the
cloud database.
Calibration.
[Para 99] For every new screened subject, the screening device needs
calibration to get optimal thermal acquisition.
After a new subject sits on the screening chair, first, the technician adjusts the
height of the chair to be sure that the head of the subject is positioned on the
same predefined height from the floor.
[Para 100] Before starting the screening, the technician does a calibration
procedure. This makes sure the cameras are in the right position and focused
properly. This is needed to achieve optimal position of ROI in the field of view
and optimal focus of the image.
[Para 101] The calibration of the device consists of positioning each Sensor to
its optimal distance from the screened subject and rotating the lens for optimal
focus. It is done for optimal thermal screening acquisition, adjusted to the
characteristics of each subject. The calibration is assisted by the software.
[Para 102] For each Sensor, the software displays a score ranging from 0-100%
to assess the current position and suggest moving the camera closer or farther
from the subject. The score is based on the size of the ROI in the frame, aiming
at displaying the full ROI and minimizing the area outside it. Sensors 203 are
held on sliding rail 107.
[Para 103] The Sensors do not move along the rail; rather, the rail moves
together with the camera, to ensure that the rail will not be in the field of view
of the camera. Sensors 203 can rotate 360° horizontally and tilt 90° up and
down (see fig 11).
[Para 104] However, with the rail and camera angles properly fixed, the only
necessary adjustment for the technician is to move the camera along the rail
towards or away from the patient. This effectively means the camera operates
with a single degree of freedom.
[Para 105] Once all cameras are positioned optimally, the user adjusts the focus
of Sensors 203 of cameras 101-105 by rotating the lens of each camera in both
directions. The software guides the user to the best position of the lens for
sharpness of the ROI. The software signals a green light when the image is
optimally sharp, (see fig 11a).
[Para 106] After calibration is complete and all cameras are recognized by the
software to be in the optimal position and focus, the user may begin the
screening.
[Para 107] The calibration steps (moving the cameras to optimal position and
focus) may be operated automatically by motors.
[Para 108] During the first couple of seconds of screening, the Non-Uniformity
Correction (NUC) is performed. Thermal cameras need NUC to adjust for the
inherent variations in pixel sensitivity across the Sensor, ensuring accurate and
consistent temperature measurements. During NUC, the shutter of the camera
is closed to prevent incoming radiation from affecting the measurement.
[Para 109] Shutter 210 is a mechanical device that covers the camera's sensor,
controls the duration of exposure to incoming radiation, and helps to produce
accurate temperature readings. During calibration shutter 210 is closed twice,
once to perform NUC, and the second time, to ensure that shutter 210 is
functioning by checking the noise distribution after NUC.
Vision One (VO) Screening software
[Para 110] The operation and screening of TD is performed by TM VO system.
As shown on fig 10, the software platform has a main central window and at
least 5 small windows on the side. The small windows show the screening process
from each Sensor on the screening device, meaning different angle views of
the chest. The main window shows the enlarged selected angle view.
Motion correction algorithm
[Para 111] When capturing dynamic thermal imaging to screen a subject, small
movements such as breathing and temporary loss of balance may occur,
leading to variations between images.
[Para 112] To minimize the impact of these movements on thermal signals, all
images are registered. Registration is the process of aligning or "matching"
images. In this case, registration is used to establish a relationship between
two images of the same scene, for achieving the best possible overlap.
[Para 113] The first image in the sequence is considered the reference image,
while the remaining images are considered as transformable images.
[Para 114] The registration is performed in two stages. In the first stage, a rigid
body registration is applied based on intensities with geometric
transformation consisting of translation, rotation, and scale. The second stage
uses non-rigid 2D registration with Residual Complexity (RC), (see Liu, H., Zhang,
J., Yang, K., Hu, X. and Stiefelhagen, R., 2022. CMX: Cross-Modal Fusion for RGB-X Semantic
Segmentation with Transformers. arXiv preprint arXiv:2203.04838).
[Para 115] Given that the image intensity changes over time due to cooling,
Normalized Mutual Information (NMI) is used as the similarity measure
between the reference image and all frames in the stabilized video.
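For illustration, NMI between two quantized, flattened images can be computed from marginal and joint histograms. The NMI variant used here, (H(A)+H(B))/H(A,B), and the binning are assumptions; implementations differ, and constant images (zero joint entropy) would need separate handling:

```python
import math
from collections import Counter

def entropy(counts, n):
    """Shannon entropy of a histogram given as a Counter over n samples."""
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def nmi(img_a, img_b):
    """NMI = (H(A) + H(B)) / H(A, B): 2.0 for identical images, approaching
    1.0 as the images become statistically independent."""
    n = len(img_a)
    h_a = entropy(Counter(img_a), n)
    h_b = entropy(Counter(img_b), n)
    h_ab = entropy(Counter(zip(img_a, img_b)), n)   # joint histogram of pixel pairs
    return (h_a + h_b) / h_ab

a = [0, 0, 1, 1, 2, 2]
print(round(nmi(a, a), 3))  # → 2.0 (an image is maximally informative about itself)
```

NMI is preferred here over plain intensity difference because it tolerates the global intensity change as the body warms back up after cooling.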
AI model architecture (fig. 12)
[Para 116] Each frame contains spatial information, and the sequence of those
frames contains temporal information. To model both aspects, a hybrid
architecture is used consisting of convolutions (for spatial processing) as well
as recurrent layers (for temporal processing). Specifically, a Convolutional
Neural Network (CNN) and a Recurrent Neural Network (RNN), consisting of
Gated recurrent unit (GRU) layers, are used. This kind of hybrid architecture is
known as a CNN-RNN.
[Para 117] The images of a video, along with the corresponding NIR images and
3D structure information, are fed to a CNN model to extract high-level features.
In parallel, features are extracted from the vessel map derived from the thermal
and NIR images. After concatenating the features from all 5 angles they are fed to
an RNN layer and the output of the RNN layer is connected to a fully connected
layer to get the classification output.
[Para 118] In one of the implementations, a ResNet18 pre-trained on
ImageNet (~11 mln parameters) is used as the base CNN model, together with
an RNN model with hidden size 100 and two layers. The hidden state from the
RNN model is then concatenated with features extracted from risk factors, and
together they are used to predict malignancy.
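A hedged PyTorch sketch of the CNN-RNN data flow is shown below. The text fixes only a 512-d CNN feature vector (ResNet18), a 2-layer GRU with hidden size 100, concatenation of the 5 angle views, and a final fully connected layer; the small stand-in CNN, input shapes, and class count here are assumptions, and the risk-factor concatenation is omitted for brevity:

```python
import torch
import torch.nn as nn

class ThermalCnnRnn(nn.Module):
    """CNN features per frame/angle -> concat 5 angles -> GRU -> FC classifier."""
    def __init__(self, num_angles=5, feat_dim=512, hidden=100, num_classes=2):
        super().__init__()
        # Stand-in encoder; the described implementation uses a ResNet18
        # pretrained on ImageNet producing the same 512-d feature vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        self.rnn = nn.GRU(feat_dim * num_angles, hidden, num_layers=2, batch_first=True)
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, x):
        b, t, a, c, h, w = x.shape                 # (batch, time, angle, 3, H, W)
        feats = self.cnn(x.reshape(b * t * a, c, h, w))
        feats = feats.reshape(b, t, a * feats.shape[-1])   # concatenate the 5 angles
        out, _ = self.rnn(feats)                           # temporal modelling (GRU)
        return self.fc(out[:, -1])                         # classify the last hidden state

model = ThermalCnnRnn()
print(model(torch.randn(2, 4, 5, 3, 32, 32)).shape)  # → torch.Size([2, 2])
```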
Claims (34)
1. A thermal system for breast cancer screening without radiation and without body contact comprising: a Thermal Device (TD) comprising: - at least one frontal thermal camera including at least one thermal 3D sensor; at least one near infrared sensor (NIR); and at least one shutter for each sensor; - at least two movable thermal cameras, including depth and NIR sensors each, positioned at a 45° angle from below and a 45° angle from the side; - at least one rail; - at least one fan; - a designated software; - thermal cameras imaging systems constructed to use from 4D up to 8D models; wherein the sensors are either an integral part of the camera or a separate device connected to the camera; and wherein at least one camera is positioned on at least one rail of the TD, directed at the subject sitting on a chair in the centre of the multiple angles; and wherein the cameras rotate 360° horizontally, leaning 90° up and down; wherein the TD has a degree of freedom allowing the adjustment to subject's individual characteristics; wherein the sensors capture synchronised thermal videos of subject’s chest area over time, analysing the thermal recovery rates of different areas and tissue types.
2. The Thermal Device of claim 1 wherein the cameras and rails are positioned in the following way: • optical axis of top cameras is horizontal; • optical axis of bottom cameras is at 45°; • The rail of top cameras is at a vertical angle = (vertical field of view) VFOV/2; • The rail of bottom cameras is at an angle = 45° + VFOV/2.
3. The Thermal Device of claims 1 & 2 wherein the rail of the top left and right cameras rotates to the left or to the right in the horizontal plane with respect to the direction of the optical axis by an angle of (horizontal field of view) HFOV/2.
4. The Thermal Device of any one of claims 1-3 wherein the cameras on the horizontal rail may be adapted to have the upper boundary of the field of view (FOV) on the neck of the screened subject.
5. The Thermal Device of any one of claims 1- 4 wherein the cameras on the horizontal rail move according to screened subject's body structure so that the head is always out of the frame.
6. The Thermal Device of any one of claims 1- 5 wherein when the rail is directed under VFOV/2 angle and the level of subject’s head is on the same height from the floor, then the rail angle guarantees that in every position of the camera the top boundary of the FOV is always on the neck level and never higher.
7. The Thermal Device of any one of claims 1-6 wherein the same mechanism of claim 6 applies for the lower cameras, where the rail is rotated with respect to the optical axis by a 45° + VFOV/2 angle.
8. The Thermal Device of any one of claims 1-7 wherein the top left/right cameras with horizontal angle HFOV/2 achieve the same effect as for the head being always on the top border of the frame, when the chair is always on the left/right border of the frame.
9. The Thermal Device of any one of claims 1- 8 wherein the intrinsic parameters of the lens artefacts are determined by chessboard, a circle board, or any other calibrated target specifically tailored for thermal camera applications, being designed to stand out from the background in thermal imagery due to its emissivity and temperature and being maintained constantly for each individual lens.
10. The Thermal Device of any one of claims 1- 9 wherein the TD is being calibrated before every screening to get optimal thermal acquisition.
11. The Thermal Device of any one of claims 1-10, operated by the Vision One (VO) system, wherein the software platform has a main central window showing the enlarged selected angle view and at least 5 small windows on the side, showing the screening process from each Sensor at different angle views of the chest.
12. The Thermal Device of any one of claims 1-11 wherein the shutter is a mechanical device covering the camera's sensor controlling the duration of exposure to incoming radiation and helping to produce accurate temperature readings.
13. The Thermal Device of any one of claims 1-12 comprising a 4D thermal camera system constructed to use thermal and 3D data.
14. The Thermal Device of any one of claims 1-12 comprising a 5D thermal camera system, comprising: a) a long-wave infrared (LWIR) sensor configured to detect and capture long-wave infrared radiation emitted by objects within a scene (thermal sensor); b) a 3D depth sensor configured to measure the distance between the camera and objects within the scene based on the time it takes for emitted light pulses to return to the sensor after reflecting off said objects; c) a near-infrared (NIR) sensor configured to detect and capture near-infrared radiation reflected by objects within the scene; d) a video streaming module configured to process and transmit captured data from the LWIR, 3D, and NIR sensors in real-time, enabling live video streaming; e) an integration module configured to enable communication and data exchange between the 5D thermal camera and other external sensor systems as part of a multiple sensor system; wherein the combination of LWIR, 3D, and NIR sensors provides a comprehensive and dynamic five-dimensional representation of the scene.
15. The 5D imaging system of claim 14, further comprising an integrated machine learning module, wherein said module utilizes artificial intelligence algorithms for real-time object detection, classification, and tracking, enhancing the system's ability to recognize and analyse complex scenes, and facilitating various computer vision applications such as autonomous navigation, security and surveillance, and advanced human-computer interaction.
16. The 5D imaging system of claim 14, further comprising an adaptive fusion mechanism, wherein said mechanism dynamically adjusts the weights assigned to each imaging modality based on scene characteristics, ambient conditions, or specific application requirements, optimizing the output video stream for improved clarity, accuracy, and contextual information.
17. The 5D imaging system of claim 14, further comprising a modular design, allowing for the interchangeability and upgradability of individual sensor components or the addition of new imaging modalities, thereby enabling customization and adaptation of the system to meet specific use-case demands or to accommodate advances in sensor technology.
18. The 5D imaging system of claim 14 wherein the processing unit is further configured to perform real-time image enhancement techniques, such as noise reduction, contrast stretching, and edge sharpening, on the LWIR, NIR, and 3D sensor data prior to alignment and merging, thereby improving the quality and reliability of the resulting 5D video stream.
19. The 5D imaging system of claim 14, wherein the system is utilized for non-invasive medical diagnostics, allowing healthcare professionals to visualize and analyse surface temperature variations and blood perfusion in the human body, aiding in the early detection of inflammation, infection, or other abnormalities.
20. The 5D imaging system of claim 14, wherein the system is utilized for monitoring treatment response in cancer patients by not only providing comprehensive and non-invasive assessment of tumour characteristics, such as size, shape, and vascularization, but also evaluating various physiological parameters and biomarkers, including body temperature, blood perfusion, inflammation rates and metabolic activity by integrating data from the LWIR, NIR, and 3D sensors with additional diagnostic information.
21. The 8D imaging system of claim 1, constructed by adding an RGB camera to a system comprising an LWIR sensor for capturing long-wave infrared images, a NIR sensor for capturing near-infrared images, and a 3D sensor for obtaining depth information, wherein the RGB sensor captures visible light images, and a processing unit is configured to process and align the data acquired from the LWIR, NIR, 3D, and RGB sensors.
22. The 8D imaging system of claim 21 further comprising a communication interface transmitting the 8D video stream generated from the aligned and merged data of the LWIR, NIR, 3D, and RGB sensors.
23. The 8D imaging system of any one of claims 21 to 22 further allowing the enhanced scene understanding and analysis by combining the complementary imaging modalities into a single, coherent, and integrated video stream.
24. The 8D imaging system of any one of claims 21 to 23 can be utilised for installation on unmanned aerial vehicles (UAVs), allowing for enhanced remote monitoring and data collection in a variety of environments and applications.
25. The UAV-mounted 8D imaging system of claim 24 can be employed for environmental monitoring, wildlife observation, search and rescue operations, and infrastructure inspection, providing comprehensive and multi-dimensional data that combines LWIR, NIR, RGB and 3D information to enable more informed decision-making and improved outcomes in each respective field and improved visibility under varying conditions.
26. The Thermal Device of any one of claims 1-12 comprising an 8D thermal camera system constructed to use the dimensions of each type of data: 3 from colour (RGB), 3 from the 3D structure, 1 from heat or LWIR, and 1 from NIR.
27. The fusion of multiple imaging modalities of any one of claims 14 to 26 allows the system to perform well under different lighting conditions and in challenging environments, such as low-light, fog, or smoke, by leveraging the strengths of each imaging type.
28. A method for breast cancer screening by the TD without radiation and without body contact, the method comprising the steps of: a. Turning on the TD and waiting a few minutes for stabilization of cameras' temperature and uniformity of the image; b. Performing a non-uniformity correction (NUC) in thermal cameras; c. Inserting patient information (PI) into the PI station; d. Initializing the TD for screening; e. Calibrating the TD assisted by the software, comprising: positioning each thermal sensor to its optimal distance from the screened subject and rotating the lens for optimal focus adjusted to the characteristics of each subject; f. Displaying by the software a score for each sensor ranging from 0-100%, assessing the current position and suggesting moving the cameras closer to or farther from the subject, wherein the score is based on the size of the region of interest (ROI) in the frame, with the aim of displaying the full ROI and minimizing the area outside it; g. Displaying by the software a score for each sensor ranging from 0-100%, assessing the current lens position and suggesting the rotation of the lens for optimal focus; h. Screening with the TD; when the calibration is complete and all cameras are recognized by the software to be in the optimal position and focus, the screening starts; i. Performing the Non-Uniformity Correction (NUC) during the first couple of seconds of the screening, wherein the shutter of the thermal camera is closed, thus preventing incoming radiation from affecting the measurement; j. After the screening is completed, the captured signals including thermal, depth, and NIR streams are automatically uploaded to the system for further Artificial Intelligence (AI) analysis to find metabolic abnormalities related to breast cancer.
29. The method of claim 28 wherein the calibration steps may be performed automatically by motors.
30. The method of claim 28 wherein during calibration the shutter is closed twice: once to perform the NUC, and a second time to verify that the shutter is functioning, by checking the noise distribution after the NUC.
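The shutter check of claim 30 can be approximated by testing whether the post-NUC frame is near-uniform: with the shutter closed the sensor views a flat target, so a wide residual pixel spread suggests the shutter did not close and scene radiation is leaking in. The threshold value and frame representation below are assumptions for illustration.

```python
def shutter_ok(frame, max_std=2.0):
    """Return True if the spatial noise of a post-NUC frame (flat
    list of raw counts) is within an assumed uniformity threshold,
    i.e. the shutter plausibly closed during the NUC."""
    n = len(frame)
    mean = sum(frame) / n
    var = sum((p - mean) ** 2 for p in frame) / n
    return var ** 0.5 <= max_std

closed = [100, 101, 100, 99, 100, 101]   # near-uniform: shutter closed
leaking = [100, 140, 90, 160, 80, 150]   # scene structure leaking in
```

In practice the check would run on the full sensor array and could compare against a calibrated noise model rather than a fixed threshold; the principle of testing the post-NUC noise distribution is the same.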
31. The method of claim 28 wherein all images are registered, registration being a process of aligning or matching images that establishes a relationship between two images of the same scene, wherein the first image in the sequence is the reference image and the remaining images are the transformable images.
32. The method of claim 31 wherein the process of image registration aligns or matches images of different modalities, including but not limited to long-infrared (LIR), NIR, and 3D images.
33. The method of claim 28 wherein the step of preparation for the screening comprises three phases in a row:
- Acclimatisation phase – the sensors capture the body settling and reaching equilibrium with the room temperature;
- Cooling phase – fans mounted on the TD automatically switch on, blowing air toward the chest area and cooling the chest temperature by several degrees Celsius, thus creating contrast that highlights some of the blood vessels;
- Stabilisation phase – the fans automatically switch off and the sensors capture the recovery of body temperature from the cooled state to the normal state, for further analysis of the recovery rates of different body areas and tissue types.
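The recovery-rate analysis of the stabilisation phase can be sketched with a two-sample fit of an assumed exponential recovery model, T(t) = T_normal − ΔT·exp(−k·t); the model, the function name, and the interpretation of k are illustrative assumptions, not the claimed analysis.

```python
import math

def recovery_rate(t0, temp0, t1, temp1, t_normal):
    """Estimate the exponential recovery constant k (1/s) from two
    skin-temperature samples taken after cooling, assuming
    T(t) = t_normal - dT * exp(-k * t). A region recovering faster
    (larger k) may indicate higher local metabolic activity."""
    d0, d1 = t_normal - temp0, t_normal - temp1
    if d0 <= 0 or d1 <= 0:
        raise ValueError("samples must be below the normal temperature")
    return math.log(d0 / d1) / (t1 - t0)

# Cooled 3 degC below normal; the deficit halves in 60 s.
k = recovery_rate(0.0, 31.0, 60.0, 32.5, 34.0)  # ln(2)/60 per second
```

With the full thermal stream, k could be fitted per pixel or per tissue region, giving the per-area recovery-rate map the claim refers to.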
34. The method for compressing and encoding the 5D video stream generated by the imaging system of any one of claims 14 to 20, wherein the method employs specialized algorithms and data structures designed to exploit redundancies and correlations within and between the different imaging modalities, resulting in a compressed video stream that retains essential information while reducing storage and transmission bandwidth requirements.
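One simple instance of exploiting the redundancy claim 34 refers to is delta-encoding consecutive frames before entropy coding: slowly varying thermal video yields long runs of small deltas that compress tightly. The claimed specialized algorithms are not disclosed here; the scheme below is an assumed toy codec using Python's standard `zlib` as the entropy stage.

```python
import zlib

def compress_stream(frames):
    """Delta-encode consecutive frames (flat lists of 0-255 ints)
    modulo 256, then zlib-compress the concatenated bytes."""
    out = bytearray(frames[0])
    prev = frames[0]
    for f in frames[1:]:
        out += bytes((p - q) & 0xFF for p, q in zip(f, prev))
        prev = f
    return zlib.compress(bytes(out))

def decompress_stream(blob, n_frames, size):
    """Invert compress_stream: decode deltas back into frames."""
    raw = zlib.decompress(blob)
    frames = [list(raw[:size])]
    for i in range(1, n_frames):
        chunk = raw[i * size:(i + 1) * size]
        frames.append([(p + d) & 0xFF for p, d in zip(frames[-1], chunk)])
    return frames
```

A real 5D codec would additionally exploit correlations between modalities (e.g. predicting NIR structure from thermal edges) and across the spatial dimensions, but temporal delta coding already shows how redundancy translates into reduced storage and bandwidth.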
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL303207A (en) | 2023-05-24 | 2023-05-24 | System and method for thermal breast cancer screening tests |
| PCT/IB2024/055047 WO2024241277A1 (en) | 2023-05-24 | 2024-05-23 | System and method for determining an existence of a physiological abnormality in a body of a subject |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| IL303207A (en) | 2024-12-01 |
Family
ID=93589057
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024241277A1 (en) | 2024-11-28 |