WO2014012070A1 - Systems and methods for infant monitoring using thermal imaging - Google Patents
Systems and methods for infant monitoring using thermal imaging
- Publication number
- WO2014012070A1 (PCT Application No. PCT/US2013/050393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- infant
- images
- thermal images
- imaging module
- infrared imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—By temperature mapping of body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Measuring devices for evaluating the respiratory organs
- A61B5/087—Measuring breath flow
- A61B5/0878—Measuring breath flow using temperature sensing means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/23—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from thermal infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/67—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response
- H04N25/671—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction
- H04N25/673—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction by using reference sources
- H04N25/674—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction by using reference sources based on the scene itself, e.g. defocusing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/04—Babies, e.g. for SIDS detection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/04—Babies, e.g. for SIDS detection
- A61B2503/045—Newborns, e.g. premature baby monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/06—Children, e.g. for attention deficit diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/08—Elderly
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/04—Constructional details of apparatus
- A61B2560/0431—Portable apparatus, e.g. comprising a handle or case
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
- A61B5/7207—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
- A61B5/7214—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts using signal cancellation, e.g. based on input of two identical physiological sensors spaced apart, or based on two signals derived from the same sensor, for different optical wavelengths
Definitions
- One or more embodiments of the invention relate generally to thermal imaging devices and more particularly, for example, to the use of thermal images to provide monitoring of an infant, an elderly person, a patient, or other persons who may need observation.
- Infants may need to be monitored for abnormal breathing patterns, including apnea, which may be associated with sudden infant death syndrome (SIDS). Likewise, an abnormal body temperature may be a sign of serious illness that requires immediate attention, yet no temperature reading can be obtained through conventional video images.
- While conventional solutions may be available for limited active monitoring that detects movement, heartbeat, or temperature, these conventional solutions are based on techniques that require contact. That is, these solutions require patches and/or electrodes in direct contact with the body of the infant, patches and/or electrodes in diapers or clothes, sensor pads on mattresses, or other sensors in direct or indirect contact with the infant to detect temperature, movement, or heartbeat.
- Such contact-based solutions may not only be inconvenient but may also restrict the choice of monitoring location (e.g., only on a bed or in a crib).
- an infant monitoring system may include an infrared imaging module, a visible light camera, a processor, a display, a communication module, and a memory.
- the monitoring system may capture thermal images of a scene including at least a partial view of an infant, using the infrared imaging module enclosed in a portable or mountable housing configured to be positioned for remote monitoring of the infant.
- Various thermal image processing and analysis operations may be performed on the thermal images to generate monitoring information relating to the infant.
- the monitoring information may include various alarms that actively provide warnings to caregivers, and user-viewable images of the scene.
- the monitoring information may be presented at external devices or the display located remotely for convenient viewing by caregivers.
- a monitoring system includes an infrared imaging module comprising a focal plane array (FPA) configured to capture thermal images of a scene within a field of view (FOV) of the infrared imaging module; a housing substantially enclosing the infrared imaging module and configured to be positioned to place at least a portion of an infant within the FOV; and a processor in communication with the infrared imaging module, the processor configured to analyze the thermal images to generate monitoring information relating to the infant.
- In another embodiment, a method includes capturing, at an FPA of an infrared imaging module, thermal images of a scene within an FOV of the infrared imaging module, wherein the infrared imaging module is positioned so that at least a portion of an infant is placed within the FOV of the infrared imaging module; and analyzing the thermal images to generate monitoring information relating to the infant.
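- As a purely illustrative sketch (not the disclosed algorithm: the frame source, region-of-interest coordinates, percentile choice, and FFT-based rate estimate are all assumptions), the following Python fragment shows how a processor might derive simple monitoring information, such as an apparent body temperature and a breathing-rate estimate, from a stream of radiometric thermal frames:

```python
import numpy as np

def analyze_thermal_frames(frames, fps, face_roi, nose_roi):
    """Derive illustrative monitoring information from radiometric thermal frames.

    frames:   iterable of 2D numpy arrays of per-pixel temperatures (deg C)
    fps:      rate of the (possibly averaged) frames
    face_roi: (row_slice, col_slice) covering the infant's face  [assumed known]
    nose_roi: (row_slice, col_slice) near the nose and mouth     [assumed known]
    """
    nose_means, body_temp = [], None
    for frame in frames:
        # Approximate body temperature as a hot percentile of the face region.
        body_temp = float(np.percentile(frame[face_roi], 95))
        # Exhaled air periodically warms the region near the nose; track it.
        nose_means.append(float(frame[nose_roi].mean()))

    # Breathing modulates the nose-region signal; estimate its dominant
    # frequency with a simple FFT (a real system would be far more robust).
    signal = np.asarray(nose_means) - np.mean(nose_means)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    breaths_per_minute = 60.0 * freqs[np.argmax(spectrum[1:]) + 1]  # skip DC bin
    return body_temp, breaths_per_minute
```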
- Fig. 1 illustrates an infrared imaging module configured to be implemented in a host device in accordance with an embodiment of the disclosure.
- Fig. 2 illustrates an assembled infrared imaging module in accordance with an embodiment of the disclosure.
- Fig. 3 illustrates an exploded view of an infrared imaging module juxtaposed over a socket in accordance with an embodiment of the disclosure.
- Fig. 4 illustrates a block diagram of an infrared sensor assembly including an array of infrared sensors in accordance with an embodiment of the disclosure.
- Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure.
- Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure.
- Fig. 7 illustrates a flat field correction technique in accordance with an embodiment of the disclosure.
- Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline in accordance with an embodiment of the disclosure.
- Fig. 9 illustrates a temporal noise reduction process in accordance with an embodiment of the disclosure.
- Fig. 10 illustrates particular implementation details of several processes of the image processing pipeline of Fig. 8 in accordance with an embodiment of the disclosure.
- Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure.
- Fig. 12 illustrates a block diagram of an infant monitoring system having an infrared imaging module in accordance with an embodiment of the disclosure.
- Fig. 13 illustrates an example thermal image of an infant that may be captured by an infrared imaging module and analyzed by a processor in accordance with an embodiment of the disclosure.
- Fig. 14 illustrates an infant monitoring system provided in two separate housings in accordance with an embodiment of the disclosure.
- Fig. 15 illustrates a process for monitoring an infant using thermal imaging in accordance with an embodiment of the disclosure.
- Fig. 16 illustrates a process to combine thermal images and visible light images in accordance with an embodiment of the disclosure.
- Embodiments of the invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
- Fig. 1 illustrates an infrared imaging module 100 (e.g., an infrared camera or an infrared imaging device) configured to be implemented in a host device 102 in accordance with an embodiment of the disclosure.
- Infrared imaging module 100 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques.
- infrared imaging module 100 may be configured to be implemented in a small portable host device 102, such as a mobile telephone, a tablet computing device, a laptop computing device, a personal digital assistant, a visible light camera, a music player, or any other appropriate mobile device.
- infrared imaging module 100 may be used to provide infrared imaging features to host device 102.
- infrared imaging module 100 may be configured to capture, process, and/or otherwise manage infrared images and provide such infrared images to host device 102 for use in any desired fashion (e.g., for further processing, to store in memory, to display, to use by various applications running on host device 102, to export to other devices, or other uses).
- infrared imaging module 100 may be configured to operate at low voltage levels and over a wide temperature range.
- infrared imaging module 100 may operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or lower voltages, and operate over a temperature range of approximately -20 degrees C to approximately +60 degrees C (e.g., providing a suitable dynamic range and performance over an environmental temperature range of approximately 80 degrees C).
- infrared imaging module 100 may experience reduced amounts of self heating in comparison with other types of infrared imaging devices. As a result, infrared imaging module 100 may be operated with reduced measures to compensate for such self heating.
- host device 102 may include a socket 104, a shutter 105, motion sensors 194, a processor 195, a memory 196, a display 197, and/or other components 198.
- Socket 104 may be configured to receive infrared imaging module 100 as identified by arrow 101.
- Fig. 2 illustrates infrared imaging module 100 assembled in socket 104 in accordance with an embodiment of the disclosure.
- Motion sensors 194 may be implemented by one or more accelerometers, gyroscopes, or other appropriate devices that may be used to detect movement of host device 102. Motion sensors 194 may be monitored by and provide information to processing module 160 or processor 195 to detect motion. In various embodiments, motion sensors 194 may be implemented as part of host device 102 (as shown in Fig. 1 ), infrared imaging module 100, or other devices attached to or otherwise interfaced with host device 102.
- Processor 195 may be implemented as any appropriate processing device (e.g., logic device, microcontroller, processor, application specific integrated circuit (ASIC), or other device) that may be used by host device 102 to execute appropriate instructions, such as software instructions provided in memory 196.
- Display 197 may be used to display captured and/or processed infrared images and/or other images, data, and information.
- Other components 198 may be used to implement any features of host device 102 as may be desired for various applications (e.g., clocks, temperature sensors, a visible light camera, or other components).
- a machine readable medium 193 may be provided for storing non-transitory instructions for loading into memory 196 and execution by processor 195.
- infrared imaging module 100 and socket 104 may be implemented for mass production to facilitate high volume applications, such as for implementation in mobile telephones or other devices requiring small form factors.
- the combination of infrared imaging module 100 and socket 104 may exhibit overall dimensions of approximately 8.5 mm by 8.5 mm by 5.9 mm while infrared imaging module 100 is installed in socket 104.
- FIG. 3 illustrates an exploded view of infrared imaging module 100 juxtaposed over socket 104 in accordance with an embodiment of the disclosure.
- Infrared imaging module 100 may include a lens barrel 110, a housing 120, an infrared sensor assembly 128, a circuit board 170, a base 150, and a processing module 160.
- Lens barrel 110 may at least partially enclose an optical element 180 (e.g., a lens) which is partially visible in Fig. 3 through an aperture 112 in lens barrel 110.
- Lens barrel 110 may include a substantially cylindrical extension 114 which may be used to interface lens barrel 110 with an aperture 122 in housing 120.
- Infrared sensor assembly 128 may be implemented, for example, with a cap 130 (e.g., a lid) mounted on a substrate 140.
- Infrared sensor assembly 128 may include a plurality of infrared sensors 132 (e.g., infrared detectors) implemented in an array or other fashion on substrate 140 and covered by cap 130.
- infrared sensor assembly 128 may be implemented as a focal plane array (FPA).
- Such a focal plane array may be implemented, for example, as a vacuum package assembly (e.g., sealed by cap 130 and substrate 140).
- infrared sensor assembly 128 may be implemented as a wafer level package (e.g., infrared sensor assembly 128 may be singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, infrared sensor assembly 128 may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
- Infrared sensors 132 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands (MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations.
- infrared sensor assembly 128 may be provided in accordance with wafer level packaging techniques.
- Infrared sensors 132 may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels.
- infrared sensors 132 may be implemented as vanadium oxide (VOx) detectors with a 17 µm pixel pitch.
- Substrate 140 may include various circuitry including, for example, a read out integrated circuit (ROIC) with dimensions less than approximately 5.5 mm by 5.5 mm in one embodiment.
- Substrate 140 may also include bond pads 142 that may be used to contact complementary connections positioned on inside surfaces of housing 120 when infrared imaging module 100 is assembled as shown in Figs. 5A, 5B, and 5C.
- the ROIC may be implemented with low-dropout regulators (LDO) to perform voltage regulation to reduce power supply noise introduced to infrared sensor assembly 128 and thus provide an improved power supply rejection ratio (PSRR).
- Fig. 4 illustrates a block diagram of infrared sensor assembly 128 including an array of infrared sensors 132 in accordance with an embodiment of the disclosure.
- infrared sensors 132 are provided as part of a unit cell array of a ROIC 402.
- ROIC 402 includes bias generation and timing control circuitry 404, column amplifiers 405, a column multiplexer 406, a row multiplexer 408, and an output amplifier 410.
- Image frames (e.g., thermal images) captured by infrared sensors 132 may be provided by output amplifier 410 to processing module 160, processor 195, and/or any other appropriate components to perform various processing techniques described herein.
- Although a particular array configuration is shown in Fig. 4, any desired array configuration may be used in other embodiments.
- Further descriptions of ROICs and infrared sensors may be found in U.S. Patent No. 6,028,309 issued February 22, 2000, which is incorporated herein by reference in its entirety.
- Infrared sensor assembly 128 may capture images (e.g., image frames) and provide such images from its ROIC at various rates.
- Processing module 160 may be used to perform appropriate processing of captured infrared images and may be implemented in accordance with any appropriate architecture.
- processing module 160 may be implemented as an ASIC.
- ASIC may be configured to perform image processing with high performance and/or high efficiency.
- processing module 160 may be implemented with a general purpose central processing unit (CPU) which may be configured to execute appropriate software instructions to perform image processing, coordinate and perform image processing with various image processing blocks, coordinate interfacing between processing module 160 and host device 102, and/or other operations.
- processing module 160 may be implemented with a field programmable gate array (FPGA).
- Processing module 160 may be implemented with other types of processing and/or logic circuits in other embodiments as would be understood by one skilled in the art.
- processing module 160 may also be implemented with other components where appropriate, such as volatile memory, non-volatile memory, and/or one or more interfaces (e.g., infrared detector interfaces, inter-integrated circuit (I2C) interfaces, mobile industry processor interfaces (MIPI), joint test action group (JTAG) interfaces (e.g., IEEE 1149.1 standard test access port and boundary-scan architecture), and/or other interfaces).
- infrared imaging module 100 may further include one or more actuators 199 which may be used to adjust the focus of infrared image frames captured by infrared sensor assembly 128.
- actuators 199 may be used to move optical element 180.
- Actuators 199 may be implemented in accordance with any type of motion-inducing apparatus or mechanism, and may be positioned at any location within or external to infrared imaging module 100 as appropriate for different applications.
- When infrared imaging module 100 is assembled, housing 120 may substantially enclose infrared sensor assembly 128, base 150, and processing module 160. Housing 120 may facilitate connection of various components of infrared imaging module 100. For example, in one embodiment, housing 120 may provide electrical connections 126 to connect various components as further described.
- Electrical connections 126 may be electrically connected with bond pads 142 when infrared imaging module 100 is assembled.
- electrical connections 126 may be embedded in housing 120, provided on inside surfaces of housing 120, and/or otherwise provided by housing 120. Electrical connections 126 may terminate in connections 124 protruding from the bottom surface of housing 120 as shown in Fig. 3. Connections 124 may connect with circuit board 170 when infrared imaging module 100 is assembled (e.g., housing 120 may rest atop circuit board 170 in various embodiments).
- Processing module 160 may be electrically connected with circuit board 170 through appropriate electrical connections.
- infrared sensor assembly 128 may be electrically connected with processing module 160 through, for example, conductive electrical paths provided by: bond pads 142, complementary connections on inside surfaces of housing 120, electrical connections 126 of housing 120, connections 124, and circuit board 170.
- electrical connections 126 in housing 120 may be made from any desired material (e.g., copper or any other appropriate conductive material).
- electrical connections 126 may aid in dissipating heat from infrared imaging module 100.
- sensor assembly 128 may be attached to processing module 160 through a ceramic board that connects to sensor assembly 128 by wire bonds and to processing module 160 by a ball grid array (BGA).
- sensor assembly 128 may be mounted directly on a rigid flexible board and electrically connected with wire bonds, and processing module 160 may be mounted and connected to the rigid flexible board with wire bonds or a BGA.
- The various implementations of infrared imaging module 100 and host device 102 set forth herein are provided for purposes of example, rather than limitation. In this regard, any of the various techniques described herein may be applied to any infrared camera system, infrared imager, or other device for performing infrared/thermal imaging.
- Substrate 140 of infrared sensor assembly 128 may be mounted on base 150 (e.g., a pedestal).
- base 150 may be made, for example, of copper formed by metal injection molding (MIM) and provided with a black oxide or nickel-coated finish.
- base 150 may be made of any desired material, such as for example zinc, aluminum, or magnesium, as desired for a given application and may be formed by any desired applicable process, such as for example aluminum casting, MIM, or zinc rapid casting, as may be desired for particular applications.
- base 150 may be implemented to provide structural support, various circuit paths, thermal heat sink properties, and other features where appropriate.
- base 150 may be a multi-layer structure implemented at least in part using ceramic material.
- circuit board 170 may receive housing 120 and thus may physically support the various components of infrared imaging module 100.
- circuit board 170 may be implemented as a printed circuit board (e.g., an FR4 circuit board or other types of circuit boards), a rigid or flexible interconnect (e.g., tape or other type of interconnects), a flexible circuit substrate, a flexible plastic substrate, or other appropriate structures.
- base 150 may be implemented with the various features and attributes described for circuit board 170, and vice versa.
- Socket 104 may include a cavity 106 configured to receive infrared imaging module 100.
- Infrared imaging module 100 and/or socket 104 may include appropriate tabs, arms, pins, fasteners, or any other appropriate engagement members which may be used to secure infrared imaging module 100 to or within socket 104 using friction, tension, adhesion, and/or any other appropriate manner.
- Socket 104 may include engagement members 107 that may engage surfaces 109 of housing 120 when infrared imaging module 100 is inserted into a cavity 106 of socket 104. Other types of engagement members may be used in other embodiments.
- Infrared imaging module 100 may be electrically connected with socket 104 through appropriate electrical connections (e.g., contacts, pins, wires, or any other appropriate connections).
- socket 104 may include electrical connections 108 which may contact corresponding electrical connections of infrared imaging module 100 (e.g., interconnect pads, contacts, or other electrical connections on side or bottom surfaces of circuit board 170, bond pads 142 or other electrical connections on base 150, or other connections).
- Electrical connections 108 may be made from any desired material (e.g., copper or any other appropriate conductive material).
- electrical connections 108 may be mechanically biased to press against electrical connections of infrared imaging module 100 when infrared imaging module 100 is inserted into cavity 106 of socket 104.
- electrical connections 108 may at least partially secure infrared imaging module 100 in socket 104.
- Other types of electrical connections may be used in other embodiments.
- Socket 104 may be electrically connected with host device 102 through similar types of electrical connections.
- host device 102 may include electrical connections (e.g., soldered connections, snap-in connections, or other connections) that connect with electrical connections 108 passing through apertures 190. In various embodiments, such electrical connections may be made to the sides and/or bottom of socket 104.
- infrared imaging module 100 may be implemented with flip chip technology which may be used to mount components directly to circuit boards without the additional clearances typically needed for wire bond connections.
- Flip chip connections may be used, as an example, to reduce the overall size of infrared imaging module 100 for use in compact small form factor applications.
- processing module 160 may be mounted to circuit board 170 using flip chip connections.
- infrared imaging module 100 may be implemented with such flip chip configurations.
- infrared imaging module 100 and/or associated components may be implemented in accordance with various techniques (e.g., wafer level packaging techniques) as set forth in U.S. Patent Application No. 12/844,124 filed July 27, 2010, and U.S. Provisional Patent Application No. 61/469,651 filed March 30, 2011, which are incorporated herein by reference in their entirety.
- infrared imaging module 100 and/or associated components may be implemented, calibrated, tested, and/or used in accordance with various techniques, such as for example as set forth in U.S. Patent No. 7,470,902 issued December 30, 2008, U.S. Patent No. 6,028,309 issued February 22, 2000, U.S. Patent No. 6,812,465 issued November 2, 2004, U.S. Patent No. 7,034,301 issued April 25, 2006, U.S. Patent No. 7,679,048 issued March 16, 2010, U.S. Patent No. 7,470,904 issued
- host device 102 may include shutter 105.
- shutter 105 may be selectively positioned over socket 104 (e.g., as identified by arrows 103) while infrared imaging module 100 is installed therein.
- shutter 105 may be used, for example, to protect infrared imaging module 100 when not in use.
- Shutter 105 may also be used as a temperature reference as part of a calibration process (e.g., a NUC process or other calibration processes) for infrared imaging module 100 as would be understood by one skilled in the art.
- shutter 105 may be made from various materials such as, for example, polymers, glass, aluminum (e.g., painted or anodized) or other materials.
- shutter 105 may include one or more coatings to selectively filter electromagnetic radiation and/or adjust various optical properties of shutter 105 (e.g., a uniform blackbody coating or a reflective gold coating).
- shutter 105 may be fixed in place to protect infrared imaging module 100 at all times.
- shutter 105 or a portion of shutter 105 may be made from appropriate materials (e.g., polymers or infrared transmitting materials such as silicon, germanium, zinc selenide, or chalcogenide glasses) that do not substantially filter desired infrared wavelengths.
- a shutter may be implemented as part of infrared imaging module 100 (e.g., within or as part of a lens barrel or other components of infrared imaging module 100), as would be understood by one skilled in the art.
- In other embodiments, a shutter (e.g., shutter 105 or other type of external or internal shutter) need not be provided, and a NUC process or other type of calibration may be performed using shutterless techniques.
- a NUC process or other type of calibration using shutterless techniques may be performed in combination with shutter- based techniques.
- Infrared imaging module 100 and host device 102 may be implemented in accordance with any of the various techniques set forth in U.S. Provisional Patent Application No.
- the components of host device 102 and/or infrared imaging module 100 may be implemented as a local or distributed system with components in communication with each other over wired and/or wireless networks. Accordingly, the various operations identified in this disclosure may be performed by local and/or remote components as may be desired in particular implementations.
- Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure.
- the operations of Fig. 5 may be performed by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
- infrared sensors 132 begin capturing image frames of a scene.
- the scene will be the real world environment in which host device 102 is currently located.
- Shutter 105, if optionally provided, may be opened to permit infrared imaging module 100 to receive infrared radiation from the scene.
- Infrared sensors 132 may continue capturing image frames during all operations shown in Fig. 5.
- the continuously captured image frames may be used for various operations as further discussed.
- the captured image frames may be temporally filtered (e.g., in accordance with the process of block 826 further described herein with regard to Fig. 8) and be processed by other terms (e.g., factory gain terms 812, factory offset terms 816, previously determined NUC terms 817, column FPN terms 820, and row FPN terms 824 as further described herein with regard to Fig. 8) before they are used in the operations shown in Fig. 5.
- a NUC process initiating event is detected.
- the NUC process may be initiated in response to physical movement of host device 102. Such movement may be detected, for example, by motion sensors 194 which may be polled by a processor.
- a user may move host device 102 in a particular manner, such as by intentionally waving host device 102 back and forth in an "erase" or "swipe" movement. In this regard, the user may move host device 102 in accordance with a predetermined speed and direction (velocity), such as in an up and down, side to side, or other pattern, to initiate the NUC process.
- the use of such movements may permit the user to intuitively operate host device 102 to simulate the "erasing" of noise in captured image frames.
- a NUC process may be initiated by host device 102 if detected motion exceeds a threshold value (e.g., motion greater than expected for ordinary use). It is contemplated that any desired type of spatial translation of host device 102 may be used to initiate the NUC process.
- a NUC process may be initiated by host device 102 if a minimum time has elapsed since a previously performed NUC process.
- a NUC process may be initiated by host device 102 if infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process.
- a NUC process may be continuously initiated and repeated.
- the NUC process may be selectively initiated based on whether one or more additional conditions are met. For example, in one embodiment, the NUC process may not be performed unless a minimum time has elapsed since a previously performed NUC process. In another embodiment, the NUC process may not be performed unless infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. Other criteria or conditions may be used in other embodiments. If appropriate criteria or conditions have been met, then the flow diagram continues to block 520. Otherwise, the flow diagram returns to block 505.
- blurred image frames may be used to determine NUC terms which may be applied to captured image frames to correct for FPN.
- the blurred image frames may be obtained by accumulating multiple image frames of a moving scene (e.g., captured while the scene and/or the thermal imager is in motion).
- the blurred image frames may be obtained by defocusing an optical element or other component of the thermal imager.
- In block 520, a choice of either approach is provided. If the motion-based approach is used, then the flow diagram continues to block 525. If the defocus-based approach is used, then the flow diagram continues to block 530.
- In block 525, motion is detected.
- In one embodiment, motion may be detected based on the image frames captured by infrared sensors 132. For example, an appropriate motion detection process (e.g., an image registration process, a frame-to-frame difference calculation, or other appropriate process) may be applied to the captured image frames to determine whether motion is present.
- it can be determined whether pixels or regions around the pixels of consecutive image frames have changed more than a user defined amount (e.g., a percentage and/or threshold value). If at least a given percentage of pixels have changed by at least the user defined amount, then motion will be detected with sufficient certainty to proceed to block 535.
- motion may be determined on a per pixel basis, wherein only pixels that exhibit significant changes are accumulated to provide the blurred image frame.
- counters may be provided for each pixel and used to ensure that the same number of pixel values are accumulated for each pixel, or used to average the pixel values based on the number of pixel values actually accumulated for each pixel.
- Other types of image-based motion detection may be performed such as performing a Radon transform.
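- To make the frame-difference variant concrete, here is a minimal Python sketch (the threshold values and function name are illustrative placeholders, not values from the disclosure):

```python
import numpy as np

def motion_detected(prev_frame, curr_frame, pixel_thresh=20, fraction_thresh=0.05):
    """Return True if enough pixels changed between consecutive image frames.

    pixel_thresh:    per-pixel change considered significant (user defined)
    fraction_thresh: fraction of pixels that must change to declare motion
    """
    diff = np.abs(curr_frame.astype(np.int32) - prev_frame.astype(np.int32))
    changed = np.count_nonzero(diff > pixel_thresh)
    return changed >= fraction_thresh * diff.size
```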
- motion may be detected based on data provided by motion sensors 194. In one embodiment, such motion detection may include detecting whether host device 102 is moving along a relatively straight trajectory through space. For example, if host device 102 is moving along a relatively straight trajectory, then it is possible that certain objects appearing in the imaged scene may not be sufficiently blurred (e.g., objects in the scene that may be aligned with or moving substantially parallel to the straight trajectory). Thus, in such an embodiment, the motion detected by motion sensors 194 may be conditioned on host device 102 exhibiting, or not exhibiting, particular trajectories.
- both a motion detection process and motion sensors 194 may be used.
- a determination can be made as to whether or not each image frame was captured while at least a portion of the scene and host device 102 were in motion relative to each other (e.g., which may be caused by host device 102 moving relative to the scene, at least a portion of the scene moving relative to host device 102, or both).
- the image frames for which motion was detected may exhibit some secondary blurring of the captured scene (e.g., blurred thermal image data associated with the scene) due to the thermal time constants of infrared sensors 132 (e.g., microbolometer thermal time constants) interacting with the scene movement.
- image frames for which motion was detected are accumulated. For example, if motion is detected for a continuous series of image frames, then the image frames of the series may be accumulated. As another example, if motion is detected for only some image frames, then the non-moving image frames may be skipped and not included in the accumulation. Thus, a continuous or discontinuous set of image frames may be selected to be accumulated based on the detected motion.
- the accumulated image frames are averaged to provide a blurred image frame. Because the accumulated image frames were captured during motion, it is expected that actual scene information will vary between the image frames and thus cause the scene information to be further blurred in the resulting blurred image frame (block 545). In contrast, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain fixed over at least short periods of time and over at least limited changes in scene irradiance during motion. As a result, image frames captured in close proximity in time and space during motion will suffer from identical or at least very similar FPN. Thus, although scene information may change in consecutive image frames, the FPN will stay essentially constant. By averaging, multiple image frames captured during motion will blur the scene information, but will not blur the FPN. As a result, FPN will remain more clearly defined in the blurred image frame provided in block 545 than the scene information.
- 32 or more image frames are accumulated and averaged in blocks 535 and 540.
- any desired number of image frames may be used in other embodiments, but with generally decreasing correction accuracy as frame count is decreased.
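- A minimal sketch of this motion-based accumulation (illustrative only; the inline motion test mirrors the earlier sketch and all thresholds are placeholders):

```python
import numpy as np

def blurred_frame_from_motion(frames, pixel_thresh=20, fraction_thresh=0.05,
                              min_frames=32):
    """Accumulate frames captured during motion and average them.

    Scene content varies frame to frame and blurs out in the average,
    while FPN stays fixed and therefore remains clearly defined.
    """
    accum, count, prev = None, 0, None
    for frame in frames:
        if prev is not None:
            diff = np.abs(frame.astype(np.int32) - prev.astype(np.int32))
            moving = np.count_nonzero(diff > pixel_thresh) >= fraction_thresh * diff.size
            if moving:  # non-moving frames are skipped; the set may be discontinuous
                accum = frame.astype(np.float64) if accum is None else accum + frame
                count += 1
        prev = frame
    if count < min_frames:
        return None  # too few moving frames for a reliable blurred frame
    return accum / count
```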
- a defocus operation may be performed to intentionally defocus the image frames captured by infrared sensors 132.
- For example, one or more actuators 199 may be used to adjust, move, or otherwise translate optical element 180, infrared sensor assembly 128, and/or other components of infrared imaging module 100 to cause infrared sensors 132 to capture a blurred (e.g., unfocused) image frame of the scene.
- Other non-actuator based techniques are also contemplated for intentionally defocusing infrared image frames such as, for example, manual (e.g., user- initiated) defocusing.
- While the scene may appear blurred, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain unaffected by the defocusing operation.
- a blurred image frame of the scene will be provided (block 545) with FPN remaining more clearly defined in the blurred image than the scene information.
- the defocus-based approach has been described with regard to a single captured image frame.
- the defocus-based approach may include accumulating multiple image frames while the infrared imaging module 100 has been defocused and averaging the defocused image frames to remove the effects of temporal noise and provide a blurred image frame in block 545.
- a blurred image frame may be provided in block 545 by either the motion-based approach or the defocus-based approach. Because much of the scene information will be blurred by either motion, defocusing, or both, the blurred image frame may be effectively considered a low pass filtered version of the original captured image frames with respect to scene information.
- the blurred image frame is processed to determine updated row and column FPN terms (e.g., if row and column FPN terms have not been previously determined then the updated row and column FPN terms may be new row and column FPN terms in the first iteration of block 550).
- As used herein, the terms row and column may be used interchangeably depending on the orientation of infrared sensors 132 and/or other components of infrared imaging module 100.
- block 550 includes determining a spatial FPN correction term for each row of the blurred image frame (e.g., each row may have its own spatial FPN correction term), and also determining a spatial FPN correction term for each column of the blurred image frame (e.g., each column may have its own spatial FPN correction term).
- Such processing may be used to reduce the spatial and slowly varying (1/f) row and column FPN inherent in thermal imagers caused by, for example, 1/f noise characteristics of amplifiers in ROIC 402 which may manifest as vertical and horizontal stripes in image frames.
- row and column FPN terms may be determined by considering differences between neighboring pixels of the blurred image frame.
- Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure. Specifically, in Fig. 6 a pixel 610 is compared to its 8 nearest horizontal neighbors: d0-d3 on one side and d4-d7 on the other side. Differences between the neighbor pixels can be averaged to obtain an estimate of the offset error of the illustrated group of pixels. An offset error may be calculated for each pixel in a row or column and the average result may be used to correct the entire row or column.
- To avoid interpreting real scene data as noise, upper and lower threshold values may be used (thPix and -thPix). Pixel values falling outside these threshold values (pixels d1 and d4 in this example) are not used to obtain the offset error. In addition, the maximum amount of row and column FPN correction may be limited by these threshold values. Further techniques for performing spatial row and column FPN correction processing are set forth in U.S. Patent Application No. 12/396,340 filed March 2, 2009 which is incorporated herein by reference in its entirety.
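- An illustrative sketch of this per-column offset estimation (row terms would be computed symmetrically; the threshold value and sign convention are assumptions for illustration):

```python
import numpy as np

def column_fpn_terms(blurred, th_pix=10.0):
    """Estimate one offset correction term per column of a blurred frame.

    For each pixel, differences to its 8 nearest horizontal neighbors are
    averaged; differences beyond +/-th_pix are treated as scene edges and
    excluded, limiting how much scene data can leak into the terms.
    """
    h, w = blurred.shape
    terms = np.zeros(w)
    for c in range(w):
        neighbors = [n for n in range(c - 4, c + 5) if 0 <= n < w and n != c]
        diffs = np.concatenate([blurred[:, n] - blurred[:, c] for n in neighbors])
        diffs = diffs[np.abs(diffs) < th_pix]  # clip out likely scene edges
        if diffs.size:
            # The average neighbor-vs-column difference approximates this
            # column's offset error; adding it corrects the column.
            terms[c] = diffs.mean()
    return terms
```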
- the updated row and column FPN terms determined in block 550 are stored (block 552) and applied (block 555) to the blurred image frame provided in block 545. After these terms are applied, some of the spatial row and column FPN in the blurred image frame may be reduced. However, because such terms are applied generally to rows and columns, additional FPN may remain such as spatially uncorrelated FPN associated with pixel to pixel drift or other causes. Neighborhoods of spatially correlated FPN may also remain which may not be directly associated with individual rows and columns. Accordingly, further processing may be performed as discussed below to determine NUC terms.
- In block 560, local contrast values (e.g., edges or absolute values of gradients between adjacent or small groups of pixels) are determined in order to identify scene information in the blurred image frame that includes contrasting areas which have not been significantly blurred (e.g., high contrast edges in the original scene data).
- local contrast values in the blurred image frame may be calculated, or any other desired type of edge detection process may be applied to identify certain pixels in the blurred image as being part of an area of local contrast. Pixels that are marked in this manner may be considered as containing excessive high spatial frequency scene information that would be interpreted as FPN (e.g., such regions may correspond to portions of the scene that have not been sufficiently blurred). As such, these pixels may be excluded from being used in the further determination of NUC terms.
- contrast detection processing may rely on a threshold that is higher than the expected contrast value associated with FPN (e.g., pixels exhibiting a contrast value higher than the threshold may be considered to be scene information, and those lower than the threshold may be considered to be exhibiting FPN).
- the contrast determination of block 560 may be performed on the blurred image frame after row and column FPN terms have been applied to the blurred image frame (e.g., as shown in Fig. 5). In another embodiment, block 560 may be performed prior to block 550 to determine contrast before row and column FPN terms are determined (e.g., to prevent scene based contrast from contributing to the determination of such terms). Following block 560, it is expected that any high spatial frequency content remaining in the blurred image frame may be generally attributed to spatially uncorrelated FPN.
- the blurred image frame is high pass filtered. In one embodiment, this may include applying a high pass filter to extract the high spatial frequency content from the blurred image frame. In another embodiment, this may include applying a low pass filter to the blurred image frame and taking a difference between the low pass filtered image frame and the unfiltered blurred image frame to obtain the high spatial frequency content.
- a high pass filter may be implemented by calculating a mean difference between a sensor signal (e.g., a pixel value) and its neighbors.
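- For example, the low-pass/difference form of high pass filtering might be sketched as follows (the box filter and kernel size are illustrative choices; scipy is assumed available):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def high_pass(blurred, kernel=3):
    """High pass filter a frame by subtracting a local (box) mean.

    Equivalent to the mean difference between each pixel and its neighbors:
    what remains is the high spatial frequency content, i.e. candidate FPN.
    """
    return blurred - uniform_filter(blurred.astype(np.float64), size=kernel)
```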
- a flat field correction process is performed on the high pass filtered blurred image frame to determine updated NUC terms (e.g., if a NUC process has not previously been performed then the updated NUC terms may be new NUC terms in the first iteration of block 570).
- Fig. 7 illustrates a flat field correction technique 700 in accordance with an embodiment of the disclosure.
- a NUC term may be determined for each pixel 710 of the blurred image frame using the values of its neighboring pixels 712 to 726.
- several gradients may be determined based on the absolute difference between the values of various adjacent pixels. For example, absolute value differences may be determined between: pixels 712 and 714 (a left to right diagonal gradient), pixels 716 and 718 (a top to bottom vertical gradient), pixels 720 and 722 (a right to left diagonal gradient), and pixels 724 and 726 (a left to right horizontal gradient).
- a weight value may be determined for pixel 710 that is inversely proportional to the summed gradient. This process may be performed for all pixels 710 of the blurred image frame until a weight value is provided for each pixel 710. For areas with low gradients (e.g., areas that are blurry or have low contrast), the weight value will be close to one. Conversely, for areas with high gradients, the weight value will be zero or close to zero.
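- A sketch of this weight computation (the neighbor pairs follow Fig. 7 only loosely, and the mapping from summed gradient to weight is an illustrative choice):

```python
import numpy as np

def ffc_weights(frame):
    """Per-pixel weight, inversely related to the summed local gradient.

    Low-gradient (well blurred) areas get weights near 1; high-gradient
    areas (likely residual scene edges) get weights near 0.
    """
    f = frame.astype(np.float64)
    g = np.zeros_like(f)
    core = (slice(1, -1), slice(1, -1))
    g[core] = (
        np.abs(f[:-2, :-2] - f[2:, 2:])       # diagonal gradient
        + np.abs(f[:-2, 1:-1] - f[2:, 1:-1])  # vertical gradient
        + np.abs(f[:-2, 2:] - f[2:, :-2])     # opposite diagonal gradient
        + np.abs(f[1:-1, :-2] - f[1:-1, 2:])  # horizontal gradient
    )
    return 1.0 / (1.0 + g)  # ~1 in flat areas, toward 0 at strong gradients
```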
- The update to the NUC term, as estimated by the high pass filter, is multiplied by the weight value.
- the risk of introducing scene information into the NUC terms can be further reduced by applying some amount of temporal damping to the NUC term determination process.
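- For illustration only (this exact formulation is an assumption rather than a quotation from the disclosure), such damping can be expressed as a blend of the old term and the weighted update, with damping factor $\lambda \in [0, 1]$:

$$\mathrm{NUC}_{\mathrm{new}} = \lambda \cdot \mathrm{NUC}_{\mathrm{old}} + (1 - \lambda)\left(\mathrm{NUC}_{\mathrm{old}} + w \cdot \Delta_{\mathrm{HPF}}\right)$$

where $w$ is the flat field weight described above and $\Delta_{\mathrm{HPF}}$ is the update estimated by the high pass filter; setting $\lambda$ close to 1 strongly damps updates and retains mostly the old terms.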
- Although the determination of NUC terms has been described with regard to gradients, local contrast values may be used instead where appropriate. Other techniques may also be used such as, for example, standard deviation calculations. Other types of flat field correction processes may be performed to determine NUC terms including, for example, various processes identified in U.S. Patent No. 6,028,309 issued February 22, 2000 and U.S. Patent No. 6,812,465 issued November 2, 2004.
- block 570 may include additional processing of the NUC terms.
- the sum of all NUC terms may be normalized to zero by subtracting the NUC term mean from each NUC term.
- the mean value of each row and column may be subtracted from the NUC terms for each row and column.
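- These normalizations might be sketched as follows (illustrative only):

```python
import numpy as np

def normalize_nuc(nuc):
    """Normalize NUC terms so they carry no overall or row/column mean offset.

    Removing per-row and per-column means leaves that structure to the
    dedicated row and column FPN filters, which use more data per
    coefficient and handle spatially correlated FPN more robustly.
    """
    nuc = nuc - nuc.mean()                       # zero the overall mean
    nuc = nuc - nuc.mean(axis=1, keepdims=True)  # zero each row's mean
    nuc = nuc - nuc.mean(axis=0, keepdims=True)  # zero each column's mean
    return nuc
```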
- row and column FPN filters using the row and column FPN terms determined in block 550 may be better able to filter out row and column noise in further iterations (e.g., as further shown in Fig. 8) after the NUC terms are applied to captured images (e.g., in block 580 further discussed herein).
- the row and column FPN filters may in general use more data to calculate the per row and per column offset coefficients (e.g., row and column FPN terms) and may thus provide a more robust alternative for reducing spatially correlated FPN than the NUC terms which are based on high pass filtering to capture spatially uncorrelated noise.
- In blocks 571-573, additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN with lower spatial frequency than previously removed by the row and column FPN terms.
- some variability in infrared sensors 132 or other components of infrared imaging module 100 may result in spatially correlated FPN noise that cannot be easily modeled as row or column noise.
- Such spatially correlated FPN may include, for example, window defects on a sensor package or a cluster of infrared sensors 132 that respond differently to irradiance than neighboring infrared sensors 132.
- such spatially correlated FPN may be mitigated with an offset correction. If the amount of such spatially correlated FPN is significant, then the noise may also be detectable in the blurred image frame.
- If the high pass filtering of block 565 is performed with a small kernel (e.g., considering only immediately adjacent pixels that fall within a neighborhood of pixels affected by spatially correlated FPN), then broadly distributed spatially correlated FPN may not be detected: all values used in the high pass filter may be taken from the neighborhood of affected pixels and thus may be affected by the same offset error.
- Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure.
- a neighborhood of pixels 1110 may exhibit spatially correlated FPN that is not precisely correlated to individual rows and columns and is distributed over a neighborhood of several pixels (e.g., a neighborhood of approximately 4 by 4 pixels in this example).
- Sample image frame 1100 also includes a set of pixels 1120 exhibiting substantially uniform response that are not used in filtering calculations, and a set of pixels 1130 that are used to estimate a low pass value for the neighborhood of pixels 1110.
- pixels 1130 may be a number of pixels divisible by two in order to facilitate efficient hardware or software calculations.
- additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN such as exhibited by pixels 1110.
- the updated NUC terms determined in block 570 are applied to the blurred image frame.
- the blurred image frame will have been initially corrected for spatially correlated FPN (e.g., by application of the updated row and column FPN terms in block 555), and also initially corrected for spatially uncorrelated FPN (e.g., by application of the updated NUC terms applied in block 571).
- a further high pass filter is applied with a larger kernel than was used in block 565, and further updated NUC terms may be determined in block 573.
- the high pass filter applied in block 572 may include data from a sufficiently large enough neighborhood of pixels such that differences can be determined between unaffected pixels (e.g., pixels 1120) and affected pixels (e.g., pixels 1110).
- a low pass filter with a large kernel can be used (e.g., an N by N kernel that is much greater than 3 by 3 pixels) and the results may be subtracted to perform appropriate high pass filtering.
- In one embodiment, a sparse kernel may be used such that only a small number of neighboring pixels inside an N by N neighborhood are used. For any given high pass filter operation using distant neighbors (e.g., a large kernel), there is a risk of modeling actual (potentially blurred) scene information as spatially correlated FPN.
- the temporal damping factor λ may be set close to 1 for updated NUC terms determined in block 573.
- blocks 571-573 may be repeated (e.g., cascaded) to iteratively perform high pass filtering with increasing kernel sizes to provide further updated NUC terms that further correct for spatially correlated FPN of desired neighborhood sizes.
- the decision to perform such iterations may be determined by whether spatially correlated FPN has actually been removed by the updated NUC terms of the previous performance of blocks 571-573.
- thresholding criteria may be applied to individual pixels to determine which pixels receive updated NUC terms.
- the threshold values may correspond to differences between the newly calculated NUC terms and previously calculated NUC terms.
- the threshold values may be independent of previously calculated NUC terms. Other tests may be applied (e.g., spatial correlation tests) to determine whether the NUC terms should be applied.
- If these criteria are not met, the flow diagram returns to block 505. Otherwise, the newly determined NUC terms are stored (block 575) to replace previous NUC terms (e.g., determined by a previously performed iteration of Fig. 5) and applied (block 580) to captured image frames.
- Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline 800 in accordance with an embodiment of the disclosure.
- pipeline 800 identifies various operations of Fig. 5 in the context of an overall iterative image processing scheme for correcting image frames provided by infrared imaging module 100.
- pipeline 800 may be provided by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
- Image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 with an improved signal to noise ratio.
- Frame averager 804 may be effectively provided by infrared sensors 132, ROIC 402, and other components of infrared sensor assembly 128 that are implemented to support high image capture rates.
- infrared sensor assembly 128 may capture infrared image frames at a frame rate of 240 Hz (e.g., 240 images per second).
- a high frame rate may be implemented, for example, by operating infrared sensor assembly 128 at relatively low voltages (e.g., compatible with mobile telephone voltages) and by using a relatively small array of infrared sensors 132 (e.g., an array of 64 by 64 infrared sensors in one embodiment).
- such infrared image frames may be provided from infrared sensor assembly 128 to processing module 160 at a high frame rate (e.g., 240 Hz or other frame rates).
- infrared sensor assembly 128 may integrate over longer time periods, or multiple time periods, to provide integrated (e.g., averaged) infrared image frames to processing module 160 at a lower frame rate (e.g., 30 Hz, 9 Hz, or other frame rates). Further information regarding implementations that may be used to provide high image capture rates may be found in
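- As a simplified sketch of the frame averaging just described, high-rate frames might be integrated down to a lower output rate as follows; the grouping of eight 240 Hz frames into one 30 Hz output frame is an assumption chosen to match the example rates:

```python
import numpy as np

def frame_averager(frames_240hz, group_size=8):
    """Average consecutive high-rate frames to improve signal-to-noise ratio.

    frames_240hz: iterable of 2D numpy arrays captured at 240 Hz.
    group_size: number of frames per average (8 yields 30 Hz output).
    Yields averaged frames at the reduced rate.
    """
    group = []
    for frame in frames_240hz:
        group.append(frame.astype(np.float64))
        if len(group) == group_size:
            yield np.mean(group, axis=0)
            group = []
```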
- Image frames 802 proceed through pipeline 800 where they are adjusted by various terms, temporally filtered, used to determine the various adjustment terms, and gain compensated.
- factory gain terms 812 and factory offset terms 816 are applied to image frames 802 to compensate for gain and offset differences, respectively, between the various infrared sensors 132 and/or other components of infrared imaging module 100 determined during manufacturing and testing.
- NUC terms 817 are applied to image frames 802 to correct for FPN as discussed.
- block 580 may not be performed or initialization values may be used for NUC terms 817 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
- column FPN terms 820 and row FPN terms 824 are applied to image frames 802.
- Column FPN terms 820 and row FPN terms 824 may be determined in accordance with block 550 as discussed. In one embodiment, if the column FPN terms 820 and row FPN terms 824 have not yet been determined (e.g., before a NUC process has been initiated), then blocks 818 and 822 may not be performed or initialization values may be used for the column FPN terms 820 and row FPN terms 824 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
- temporal filtering is performed on image frames 802 in accordance with a temporal noise reduction (TNR) process.
- Fig. 9 illustrates a TNR process in accordance with an embodiment of the disclosure.
- a presently received image frame 802a and a previously temporally filtered image frame 802b are processed to determine a new temporally filtered image frame 802e.
- Image frames 802a and 802b include local neighborhoods of pixels 803a and 803b centered around pixels 805a and 805b, respectively. Neighborhoods 803a and 803b correspond to the same locations within image frames 802a and 802b and are subsets of the total pixels in image frames 802a and 802b. In the illustrated embodiment, neighborhoods 803a and 803b include areas of 5 by 5 pixels. Other neighborhood sizes may be used in other embodiments.
- Averaged delta value 805c may be used to determine weight values in block 807 to be applied to pixels 805a and 805b of image frames 802a and 802b.
- the weight values determined in block 807 may be inversely proportional to averaged delta value 805c such that weight values drop rapidly towards zero when there are large differences between neighborhoods 803a and 803b.
- large differences between neighborhoods 803a and 803b may indicate that changes have occurred within the scene (e.g., due to motion) and pixels 805a and 805b may be appropriately weighted, in one embodiment, to avoid introducing blur across frame-to-frame scene changes.
- Other associations between weight values and averaged delta value 805c may be used in various embodiments.
- the weight values determined in block 807 may be applied to pixels 805a and 805b to determine a value for corresponding pixel 805e of image frame 802e (block 811).
- pixel 805e may have a value that is a weighted average (or other combination) of pixels 805a and 805b, depending on averaged delta value 805c and the weight values determined in block 807.
- pixel 805e of temporally filtered image frame 802e may be a weighted sum of pixels 805a and 805b of image frames 802a and 802b. If the average difference between pixels 805a and 805b is due to noise, then it may be expected that the average change between neighborhoods 803a and 803b will be close to zero (e.g., corresponding to the average of uncorrelated changes). Under such circumstances, it may be expected that the sum of the differences between neighborhoods 803a and 803b will be close to zero. In this case, pixels 805a and 805b of image frames 802a and 802b may both be appropriately weighted so as to contribute to the value of pixel 805e.
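- A simplified sketch of this neighborhood-based temporal filter follows, assuming 5 by 5 neighborhoods and one possible inverse weighting function; the exact weight curve and the `strength` constant are implementation-dependent assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def temporal_filter(current, previous, strength=10.0):
    """Blend the current frame with the previously filtered frame.

    The per-pixel weight of the previous frame falls toward zero as the
    local 5x5 neighborhood difference grows (i.e., when the scene changes),
    which avoids blurring across frame-to-frame motion.
    """
    cur = current.astype(np.float64)
    prev = previous.astype(np.float64)
    avg_delta = uniform_filter(np.abs(cur - prev), size=5)  # neighborhood average
    weight = 1.0 / (1.0 + avg_delta / strength)             # inversely related to delta
    return weight * prev + (1.0 - weight) * cur
```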
- averaged delta value 805c has been described as being determined based on neighborhoods 803a and 803b; in other embodiments, averaged delta value 805c may be determined based on any desired criteria (e.g., based on individual pixels or other types of groups or sets of pixels).
- image frame 802a has been described as a presently received image frame and image frame 802b has been described as a previously temporally filtered image frame.
- image frames 802a and 802b may be first and second image frames captured by infrared imaging module 100 that have not been temporally filtered.
- Fig. 10 illustrates further implementation details in relation to the TNR process of block 826.
- image frames 802a and 802b may be read into line buffers 1010a and 1010b, respectively, and image frame 802b (e.g., the previous image frame) may be stored in a frame buffer 1020 before being read into line buffer 1010b.
- line buffers 1010a-b and frame buffer 1020 may be implemented by a block of random access memory (RAM) provided by any appropriate component of infrared imaging module 100 and/or host device 102.
- image frame 802e may be passed to an automatic gain compensation block 828 for further processing to provide a result image frame 830 that may be used by host device 102 as desired.
- Fig. 8 further illustrates various operations that may be performed to determine row and column FPN terms and NUC terms as discussed.
- these operations may use image frames 802e as shown in Fig. 8. Because image frames 802e have already been temporally filtered, at least some temporal noise may be removed and thus will not inadvertently affect the determination of row and column FPN terms 824 and 820 and NUC terms 817. In another embodiment, non-temporally filtered image frames 802 may be used.
- a NUC process may be selectively initiated and performed in response to various NUC process initiating events and based on various criteria or conditions.
- the NUC process may be performed in accordance with a motion-based approach (blocks 525, 535, and 540) or a defocus-based approach (block 530) to provide a blurred image frame (block 545).
- Fig. 8 further illustrates various additional blocks 550, 552, 555, 560, 565, 570, 571, 572, 573, and 575 previously discussed with regard to Fig. 5.
- row and column FPN terms 824 and 820 and NUC terms 817 may be determined and applied in an iterative fashion such that updated terms are determined using image frames 802 to which previous terms have already been applied. As a result, the overall process of Fig. 8 may repeatedly update and apply such terms to continuously reduce the noise in image frames 830 to be used by host device 102.
- blocks 525, 535, and 540 are shown as operating at the normal frame rate of image frames 802 received by pipeline 800.
- the determination made in block 525 is represented as a decision diamond used to determine whether a given image frame 802 has sufficiently changed such that it may be considered an image frame that will enhance the blur if added to other image frames and is therefore accumulated (block 535 is represented by an arrow in this embodiment) and averaged (block 540).
- the determination of column FPN terms 820 (block 550) is shown as operating at an update rate that in this example is 1/32 of the sensor frame rate (e.g., normal frame rate) due to the averaging performed in block 540. Other update rates may be used in other embodiments. Although only column FPN terms 820 are identified in Fig. 10, row FPN terms 824 may be implemented in a similar fashion at the reduced frame rate.
- Fig. 10 also illustrates further implementation details in relation to the NUC process.
- the blurred image frame may be read to a line buffer 1030 (e.g., implemented by a block of RAM provided by any appropriate component of infrared imaging module 100 and/or host device 102).
- the flat field correction technique 700 of Fig. 7 may be performed on the blurred image frame.
- the rate at which row and column FPN terms and/or NUC terms are updated can be inversely proportional to the estimated amount of blur in the blurred image frame and/or inversely proportional to the magnitude of local contrast values (e.g., determined in block 560).
- the described techniques may provide advantages over conventional shutter-based noise correction techniques.
- for example, a shutter (e.g., such as shutter 105) need not be provided, thus permitting reductions in size, weight, cost, and mechanical complexity.
- Power and maximum voltage supplied to, or generated by, infrared imaging module 100 may also be reduced if a shutter does not need to be mechanically operated. Reliability will be improved by removing the shutter as a potential point of failure.
- a shutterless process also eliminates potential image interruption caused by the temporary blockage of the imaged scene by a shutter.
- noise correction may be performed on image frames that have irradiance levels similar to those of the actual scene desired to be imaged. This can improve the accuracy and effectiveness of noise correction terms determined in accordance with the various described techniques.
- Referring now to Fig. 12, a block diagram is shown of an infant monitoring system 1200 having an infrared imaging module 1202 in accordance with an embodiment of the disclosure. While an infant 1232 is depicted in this and other examples of the disclosure as a baby or young child, it will be appreciated that systems and methods disclosed herein may be used to monitor older children, elderly persons, patients, or any other person for whom monitoring or observation may be required and/or desired. Thus, an "infant" in the present disclosure should be interpreted to include all and any such persons, and infant monitoring system 1200 may be utilized in any other suitable setting such as in a nursing home for the elderly or in a hospital.
- Monitoring system 1200 may include infrared imaging module 1202, a visible light camera 1206, a processor 1208, a memory 1210, a display 1212, a communication module 1214, motion sensors 1216, a control panel 1217, and/or miscellaneous components 1218.
- components of system 1200 may be implemented in the same or similar manner as corresponding components of host device 102 of Fig. 1. Moreover, components of system 1200 may be configured to perform various NUC processes and other processes described herein.
- infrared imaging module 1202 may be a small form factor infrared camera or a small form factor infrared imaging device implemented in accordance with various embodiments disclosed herein.
- Infrared imaging module 1202 may include an FPA implemented, for example, in accordance with various embodiments disclosed herein or others where appropriate. Infrared imaging module 1202 may be configured to capture, process, and/or otherwise manage infrared images (e.g., including thermal images) of a scene 1230 that comprises at least a partial view of infant 1232. In this regard, infrared imaging module 1202 may be attached, mounted, installed, or otherwise disposed at any suitable location that allows at least a portion of infant 1232 to be placed within a field of view (FOV) 1204 of infrared imaging module 1202.
- infrared imaging module 1202 may be adjustably attached to a wall, a bed rail, a headboard, a crib barrier, a frame of a stroller, a car seatback, or any suitable part of any structure or piece of furniture as needed to at least partially place infant 1232 within FOV 1204.
- Infrared imaging module 1202 may be housed in a housing 1220, which in some embodiments comprises clamps, clips, suction cups, or other suitable attachment mechanisms to releasably attach housing 1220, and hence infrared imaging module 1202, to a suitable location as listed above.
- housing 1220 may be fixedly attached to a suitable location with an appropriate fastener.
- housing 1220 may comprise a stand that allows housing 1220 to be placed on a table top or any other substantially horizontal surface.
- the housing may comprise at least one articulable joint or other similar mechanism for further adjusting the position, orientation, and/or angle of infrared imaging module 1202 housed within it.
- the housing may be configured for suitably positioning infrared imaging module 1202 to at least partially place infant 1232 within FOV 1204.
- infrared imaging module 1202 may include various optical elements 1203 (e.g., infrared-transmissive lens, infrared-transmissive prisms, infrared-reflective mirrors, infrared fiber optics) that guide infrared radiation from scene 1230 to an FPA of infrared imaging module 1202.
- Optical elements 1203 may be useful when it is difficult to mount infrared imaging module 1202 at a desired angle and/or location. For example, if there is little or no room for mounting infrared imaging module 1202 at a desired location in an incubator for premature babies, a flexible fiber-optic cable and lens may be utilized to route infrared radiation to infrared imaging module 1202 mounted elsewhere.
- optical elements 1203 may be used to suitably define or alter FOV 1204 of infrared imaging module 1202.
- a switchable FOV (e.g., selectable by infrared imaging module 1202 and/or processor 1208) may optionally be provided, which may be useful when, for example, a selective close-up view of the facial area of infant 1232 is desired.
- Optical elements 1203 may also include one or more filters adapted to pass infrared radiation of certain wavelengths but substantially block off others (e.g., short-wave infrared (SWIR) filters, mid-wave infrared (MWIR) filters, long-wave infrared (LWIR) filters, and narrow-band filters).
- filters may be utilized to tailor infrared imaging module 1202 for increased sensitivity to a desired band of infrared wavelengths. For example, when detecting exhaled breaths of infant 1232 as further described herein, a better result may be achieved by utilizing a narrow-band filter that transmits only in the wavelengths matching a specific absorption/emission spectrum of carbon dioxide (CO2) or other constituent gases of an exhaled breath.
- filters may be selectable (e.g., provided as a selectable filter wheel). In other embodiments, filters may be fixed as appropriate for a desired application of monitoring system 1200.
- Infrared images captured, processed, and/or otherwise managed by infrared imaging module 1202 may be radiometrically normalized infrared images (e.g., thermal images). That is, pixels that make up the captured image may contain calibrated thermal data (e.g., temperature).
- infrared imaging module 1202 and/or associated components may be calibrated using appropriate techniques so that images captured by infrared imaging module 1202 are properly calibrated thermal images.
- appropriate calibration processes may be performed periodically by infrared imaging module 1202 and/or processor 1208 so that infrared imaging module 1202, and hence the thermal images captured by it, may maintain proper calibration.
- Radiometric normalization permits infrared imaging module 1202 and/or processor 1208 to efficiently detect, from thermal images, objects having a specific range of temperature.
- Infrared imaging module 1202 and/or processor 1208 may detect such objects efficiently and effectively, because thermal images of objects having a specific temperature may be easily discernible from a background and other objects, and yet less susceptible to lighting conditions or obscuring (e.g., obscured by clothing).
- in contrast, object detection operations performed on visible light images (e.g., images captured by CMOS or CCD sensors) or non-normalized infrared images, such as performing edge detection and/or pattern recognition algorithms on such images, may be computationally complex yet ineffective.
- infrared imaging module 1202 and/or processor 1208 may be configured to detect from thermal images a contiguous region of pixels (also referred to as a "blob" or "warm blob") having a temperature approximately in the range of a clothed infant, for example, between approximately 75°F (e.g., clothed part of a body) and approximately 110°F (e.g., exposed part of a body such as a face and hands).
- Such a "warm blob" may indicate a presence of an infant (e.g., infant 1232) in scene 1230, and may be analyzed further as described herein to ascertain the presence of the infant, track the facial area of the infant, and determine various attributes associated with the infant.
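- For illustration only, detecting such a warm blob from a radiometrically calibrated thermal image might be sketched as follows; scipy's connected-component labeling and the minimum blob size stand in for whatever region analysis an implementation actually uses:

```python
import numpy as np
from scipy.ndimage import label

def find_warm_blobs(temps_f, low=75.0, high=110.0, min_pixels=50):
    """Find contiguous pixel regions within the clothed-infant temperature range.

    temps_f: 2D array of per-pixel temperatures in degrees Fahrenheit.
    Returns a list of boolean masks, one per sufficiently large warm blob.
    """
    in_range = (temps_f >= low) & (temps_f <= high)
    labeled, count = label(in_range)  # contiguous regions of in-range pixels
    return [labeled == i for i in range(1, count + 1)
            if np.count_nonzero(labeled == i) >= min_pixels]
```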
- Visible light camera 1206 may be a small form factor visible light imaging module or imaging device, and may be implemented in a similar manner as various embodiments of infrared imaging module 1202 disclosed herein, but with one or more sensors responsive to visible light (e.g., radiation in the visible spectrum).
- visible light camera 1206 may be implemented with a charge-coupled device (CCD) sensor, an electron multiplying CCD (EMCCD) sensor, a complementary metal-oxide-semiconductor (CMOS) sensor, a scientific CMOS (sCMOS) sensor, or other sensors.
- visible light images captured by visible light camera 1206 may be received by processor 1208, which may be configured to fuse, superimpose, or otherwise combine the visible light images with the thermal images captured by infrared imaging module 1202 as further described herein.
- visible light camera 1206 may be co-located with infrared imaging module 1202 in housing 1220 and oriented so that an FOV 1207 of visible light camera 1206 at least partially overlaps FOV 1204 of infrared imaging module 1202.
- infrared imaging module 1202 and visible light camera 1206 may be implemented as a dual sensor module sharing a common substrate according to various techniques described in U.S. Provisional Patent Application No. 61/748,018 previously referenced herein.
- Such a dual sensor module implementation may include common circuitry and/or common restraint devices for infrared imaging and visible light imaging, thereby potentially reducing an overall size of infant monitoring system 1200 as compared to embodiments where infrared imaging module 1202 and visible light camera 1206 are implemented as individual modules. Additionally, the dual sensor module implementation may be adapted to reduce a parallax error between images captured by infrared imaging module 1202 and visible light camera 1206 by spacing them closer together.
- Processor 1208 may be implemented as any appropriate processing device as described with regard to processor 195 in Fig. 1. In some embodiments, at least some part or some functionalities of processor 1208 described herein may be implemented as part of infrared imaging module 1202, for example, at processing module 160 described above in connection with Fig. 1. In some embodiments, at least some part or some functionalities of processor 1208 may be part of or implemented with other existing processors of an external device such as a mobile phone, a tablet device, a laptop computer, a desktop computer, an automobile information display system, or any other devices that may be used to present monitoring information from monitoring system 1200. In other embodiments, processor 1208 may interface and communicate with such other external processors and components associated with such processors.
- Processor 1208 may be configured to interface and communicate with other components of monitoring system 1200 to perform various processing and analysis operations described herein.
- Processor 1208 may be configured to receive thermal images captured by infrared imaging module 1202.
- Processor 1208 may be configured to perform, on the received thermal images of a scene (e.g., scene 1230) including at least a partial view of an infant (e.g., infant 1232), various thermal image processing and analysis operations as further described herein, for example, to detect and track the infant, and determine various attributes associated with the infant.
- Processor 1208 may be configured to collect, compile, analyze, or otherwise process the outcome of the thermal image processing and analysis operations to generate monitoring information regarding the infant.
- processor 1208 may be configured to determine whether the infant is breathing normally or not, and generate an alarm upon determining that the infant is not breathing normally (e.g., indicating a pattern of apnea, hyperventilation, or other abnormal breathing patterns).
- processor 1208 may be configured to detect and track the face and facial features of the infant in the thermal images according to one or more embodiments of the disclosure.
- Referring now to Fig. 13, an example thermal image (shown as a user-viewable thermal image for ease of understanding, with lighter portions representing higher temperatures) that may be captured by infrared imaging module 1202 is shown.
- a face 1334 of an infant generally exhibits a higher temperature than a covered body 1335 or a background.
- facial features such as the eyes, mouth, and nostrils generally exhibit even higher temperatures.
- thus, processor 1208 may be configured to detect and track a face (e.g., face 1334) and facial features such as the eyes, nose, and mouth (e.g., eye regions 1336, tear duct regions 1339, and an oronasal region 1337) by analyzing such thermal characteristics in the thermal images.
- processor 1208 may be configured to track the face and facial features based additionally or alternatively on the visible light images.
- the visible light images may provide more detail and contrast than the thermal images in certain ambient light conditions, and thus may be analyzed using suitable face tracking algorithms in such favorable light conditions.
- both the visible light images and the thermal images may be analyzed to complementarily increase detection and tracking accuracy.
- the thermal images and the visible light images may be combined or fused as further described herein, and the combined or fused images may be analyzed to track the face and facial features. If processor 1208 is configured to detect and track the face and facial features using the visible light images, processor 1208 may be further configured to convert pixel coordinates of the tracked face and facial features in the visible light images to corresponding pixel coordinates in the thermal images.
- Whether or not the infant is breathing normally may be determined by analyzing the thermal images to detect exhaled breaths of the infant and analyzing the intervals between the detected exhalation, according to an embodiment of the disclosure.
- processor 1208 may be configured to detect a presence of exhaled breaths 1338 in or near oronasal region 1337 being tracked.
- Exhaled breaths 1338 may appear in the thermal images for a short period after each exhalation, and may be detectable as a distinct plume of gas rich in CO2 and having a temperature slightly lower than the body temperature.
- by detecting such plumes of gas appearing in or near oronasal region 1337, exhaled breaths 1338 may be detected.
- narrow-band filters may be utilized in some embodiments, so that infrared radiation absorbed and emitted by CO2 may be shown more clearly and in higher contrast to infrared radiation from other substances for an improved detection of exhaled breaths 1338.
- Processor 1208 may be configured to generate an alarm when, for example, no exhalation is detected for a certain period of time (e.g., indicative of apnea), the interval between the detected exhalations is too long (e.g., indicative of apnea), or the interval between the detected exhalations is too short (e.g., indicative of hyperventilation).
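- The interval analysis just described might be sketched as follows; the apnea and hyperventilation thresholds shown are illustrative assumptions only, not values taken from the disclosure:

```python
def classify_breathing(exhalation_times, max_interval=20.0, min_interval=0.5):
    """Classify breathing from timestamps (in seconds) of detected exhalations.

    Returns 'apnea' when exhalations are too far apart (or absent),
    'hyperventilation' when too close together, and 'normal' otherwise.
    """
    if len(exhalation_times) < 2:
        return 'apnea'  # no exhalation detected within the observation window
    intervals = [b - a for a, b in zip(exhalation_times, exhalation_times[1:])]
    if max(intervals) > max_interval:
        return 'apnea'
    if min(intervals) < min_interval:
        return 'hyperventilation'
    return 'normal'
```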
- processor 1208 may be configured to detect breathing by analyzing tracked oronasal region 1337 to detect periodic variations in the temperature and/or shape of oronasal region 1337.
- processor 1208 may be configured to detect periodic alteration of slightly higher and lower temperatures in the nostrils and/or periodic movement of oronasal region 1337, which may be indicative of periodic inhalation and exhalation cycles.
- processor 1208 may be configured to detect breathing by performing other suitable analysis and/or processing operations, for example, for detecting various periodic variations indicative of breathing.
- processor 1208 may be configured to detect breathing by performing any combination of breathing detection operations described herein.
- monitoring information that may be generated by processor 1208 includes an approximate body temperature of an infant and/or an alarm to warn of abnormal body temperature.
- processor 1208 may be configured to locate and track the face of an infant in the thermal images by analyzing the thermal images, visible light images, and/or combined thermal-visible light images. In one embodiment, processor 1208 may be configured to determine an approximate body temperature by aggregating, averaging, and/or otherwise analyzing the radiometric data (e.g., temperature data) associated with thermal image pixels that correspond to the face of the infant.
- processor 1208 may be configured to determine an approximate body temperature by obtaining a temperature associated with tear duct (also referred to as lachrymal duct or nasolacrimal duct) regions 1339 of the infant's eyes.
- tear duct regions 1339 exhibit temperatures that are more stable (e.g., less affected by ambient temperatures) and closer to the core temperature of a human body than other exposed skin parts of the body.
- processor 1208 in this embodiment may be configured to detect and track tear duct regions 1339 (e.g., inside corners of the eyes) as shown in Fig. 13, and determine an approximate body temperature by analyzing the radiometric data (e.g., temperature data) associated with thermal image pixels that correspond to the detected tear duct regions 1339.
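- A minimal sketch of this estimate follows, assuming a radiometric temperature array and a boolean mask marking the tracked tear duct regions (both hypothetical inputs produced by the tracking step):

```python
import numpy as np

def body_temperature_from_tear_ducts(temps_f, tear_duct_mask):
    """Estimate core body temperature from tear duct (inner eye corner) pixels.

    temps_f: 2D array of per-pixel temperatures; tear_duct_mask: boolean
    array marking pixels of the tracked tear duct regions.
    Returns the mean region temperature, or None if the region is empty.
    """
    region = temps_f[tear_duct_mask]
    return float(np.mean(region)) if region.size else None
```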
- processor 1208 may be configured to calculate an approximate body temperature by performing other appropriate processing and analysis operations on the thermal images and the radiometric data contained therein. In various embodiments, processor 1208 may be configured to generate an alarm if the approximate body temperature determined from the thermal images is higher or lower than certain threshold values, so as to warn of high fever or other abnormal health conditions. In yet another example of generating monitoring information, processor 1208 may be configured to analyze the thermal images to detect a presence of a foreign substance in an oronasal region of an infant. In one embodiment, processor 1208 may be configured to analyze the tracked oronasal region (e.g., oronasal region 1337) for patterns indicative of presence of a foreign substance.
- because foreign substances may exhibit radiometric properties (e.g., temperature, emission/absorption wavelengths, emissivity, reflectance, and/or transmittance) that differ from those of skin, the thermal images of the tracked oronasal region may be analyzed for variances that may be indicative of presence of foreign substances.
- processor 1208 may be configured to detect presence of a foreign substance by performing other appropriate object detection operations suitable for thermal images.
- processor 1208 may be configured to generate an alarm if a foreign substance is detected in the oronasal region, so as to notify that the infant may need to be cleaned up to prevent potential choking, or may otherwise need assistance.
- processor 1208 may be configured to analyze the thermal images to determine the approximate posture of an infant (e.g., whether the infant is prone, supine, sitting up, or standing). As described above, the location of body, face, and facial features of an infant may be tracked in the thermal images. In one embodiment, processor 1208 may be configured to determine the approximate posture by analyzing the location and/or orientation of the face relative to the body. In another embodiment, the profile and/or the aspect ratio of the infant in the thermal images may be analyzed to determine the posture. In various embodiments, processor 1208 may be configured to determine the posture of the infant by performing any combination of posture detection operations described herein and other appropriate thermal image analysis operations for posture detection.
- processor 1208 may be configured to receive a selection of an alarm- triggering posture from a user, and generate an alarm if the approximate posture of the infant is detected as matching the selected posture.
- a user may choose to be notified or warned if the infant is standing up in a baby crib, so that the user may tend to the infant and/or prevent the infant from falling.
- monitoring information that may be generated by processor 1208 includes user-viewable images (e.g., thermograms) of a scene (e.g., scene 1230) captured by infrared imaging module 1202.
- Processor 1208 may be configured to convert the thermal images using appropriate methods and algorithms.
- for example, the radiometric data (e.g., temperature data) contained in the pixels of the thermal images may be converted into gray-scaled or color-scaled pixels to construct images that can be viewed by a person.
- User-viewable thermal images may optionally include a legend or scale that indicates the approximate temperature of corresponding pixel color and/or intensity.
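- By way of example, mapping radiometric pixel data to gray-scaled display pixels might look like the following sketch; the display temperature range is an assumed choice, and a color palette could be applied instead of grayscale:

```python
import numpy as np

def to_user_viewable(temps_f, display_min=60.0, display_max=110.0):
    """Convert per-pixel temperatures to 8-bit grayscale for display.

    Temperatures at or below display_min map to black, at or above
    display_max to white; values in between scale linearly.
    """
    scaled = (temps_f - display_min) / (display_max - display_min)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)
```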
- Such user-viewable images may be viewed by a user (e.g., a parent, a caregiver) to visually check the condition of the infant even when lights are turned off or dimmed (e.g., at night when the infant is in bed).
- processor 1208 may be configured to superimpose, fuse, blend, or otherwise combine the thermal images and the visible light images to generate user-viewable images having a higher definition and/or contrast.
- processor 1208 may be configured to generate combined images including radiometric data and/or other infrared characteristics corresponding to scene 1230 but with significantly more object detail (e.g., contour and/or edge detail) and/or contrast than typically provided by the thermal or visible light images alone, as further described herein.
- the combined images may include radiometric data and visible light characteristics (e.g., a visible spectrum color) corresponding to one or more objects (e.g., infant 1232) in scene 1230, as described for appropriate embodiments disclosed in various patent applications referenced herein such as, for example, U.S. Patent Application Nos. 61/473,207, 61/746,069, 61/746,074, 61/792,582, 61/793,952, 12/766,739, 13/105,765, or 13/437,645, or International Patent Application No. PCT/EP2011/056432, or others as appropriate.
- Combined images generated in these examples may provide sufficient radiometric data, edge detail, and contrast to allow easier recognition and/or interpretation of the condition of infant 1232.
- monitoring information that may be generated by processor 1208 includes an approximate ambient temperature near an infant.
- processor 1208 may be configured to determine the ambient temperature from the radiometric data (e.g., temperature data) of pixels that correspond to the background.
- the radiometric data may be aggregated and/or averaged for a more accurate determination of the ambient temperature.
- processor 1208 may be configured to calculate an approximate ambient temperature by performing other appropriate processing and analysis operations on the thermal images and the radiometric data contained therein.
- a conventional temperature sensor may be used to determine the ambient temperature in place of, or in addition to, the ambient temperature detection operation using the thermal images.
- Memory 1210 may include one or more memory devices to store data and information, including thermal images and monitoring information.
- the one or more memory devices may include various types of memory for thermal image and other information storage including volatile and non-volatile memory devices, such as RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically-Erasable Read-Only Memory), flash memory, and/or a disk drive.
- thermal images and monitoring information stored in the one or more memory devices may be retrieved later for purposes of reviewing and/or further diagnosing the conditions of the infant monitored by monitoring system 1200.
- processor 1208 may be configured to execute software instructions stored on memory 1210 to perform various methods, processes, or operations in the manner described herein.
- Display 1212 may be configured to present, indicate, or otherwise convey monitoring information generated by processor 1208.
- display 1212 may be implemented with an electronic display screen, such as a liquid crystal display (LCD), a cathode ray tube (CRT), or various other types of generally known video displays and monitors.
- Display 1212 according to such embodiments may be suitable for presenting user-viewable thermal images converted by processor 1208 from thermal images captured by infrared imaging module 1202.
- display 1212 may be housed in a second housing distinct from housing 1220 where infrared imaging module 1202 may be disposed, so that the monitoring information may be viewed by a user (e.g., a parent, a caregiver) at a location remote from a scene (e.g., scene 1230) that may include at least a partial view of the infant.
- in some embodiments, existing display screens on external devices (such as mobile phones, tablet devices, laptop computers, desktop computers, automobile information display systems, or any other devices that may receive the thermal images and/or the monitoring information from monitoring system 1200) may be used to present the monitoring information to a user.
- communication modules 1214 may be configured to handle, manage, or otherwise facilitate wired and/or wireless communication between various components of monitoring system 1200 and between monitoring system 1200 and an external device.
- infrared imaging module 1202 which may be disposed in housing 1220 and located at a suitable location for capturing thermal images of infant 1232, may transmit and receive data to and from processor 1208, which may be located at another location for viewing by a user, through communication modules 1214.
- processor 1208 may transmit and receive data to and from an external device, which may receive and further process raw/processed thermal images and/or monitoring information for presentation to a user, through communication module 1214 configured to manage wired and/or wireless connections.
- communication modules 1214 may include a wireless communication component (e.g., based on the IEEE 802.11 WiFi standards, the Bluetooth™ standard, the ZigBee™ standard, or other appropriate short range wireless communication standards), a wireless broadband component (e.g., based on WiMax technologies), a mobile cellular component, a wireless satellite component, or other appropriate wireless communication components.
- Communication module 1214 may also be configured for a proprietary wireless communication protocol and interface based on radio frequency (RF), microwave frequency (MWF), infrared frequency (IRF), and/or other appropriate wireless transmission technologies.
- Communication module 1214 may include an antenna coupled thereto for wireless communication purposes.
- communication module 1214 may handle, manage, or otherwise facilitate wireless communication by establishing wireless link to a wireless router, hub, or other appropriate wireless networking devices.
- communication module 1214 may be configured to interface with a wired network via a wired communication component such as an Ethernet interface, a power-line modem, a Digital Subscriber Line (DSL) modem, a Public Switched Telephone Network (PSTN) modem, a cable modem, and/or other appropriate components for wired communication.
- communication module 1214 may be configured to communicate over a wired link (e.g., through a network router, switch, hub, or other network devices) for wired communication purposes.
- a wired link may be implemented with a power-line cable, a coaxial cable, a fiber-optic cable, or other appropriate cables or wires that support corresponding wired network technologies.
- in various embodiments, monitoring system 1200 may comprise as many such communication modules 1214 as desired for various applications of monitoring system 1200 to suit various types of monitoring environments.
- communication module 1214 may be integrated into or implemented as part of various other components of monitoring system 1200.
- infrared imaging module 1202, processor 1208, and display 1212 may each comprise a subcomponent that may be configured to perform the operations of communication module 1214, and may communicate via wired and/or wireless connection without separate communication module 1214.
- Motion sensors 1216 may be implemented in the same or similar manner as described with regard to motion sensors 194 in Fig. 1. Motion sensors 1216 may be monitored by and provide information to infrared imaging module 1202 and/or processor 1208 for performing various NUC techniques described herein.
- monitoring system 1200 may include a control panel 1217 having one or more user-activated mechanisms (e.g., buttons, knobs, sliders, etc.) configured to interface with a user and receive user input control signals.
- control panel 1217 may be part of display 1212 configured to function as both a user input device and a display device.
- control panel 1217 may be implemented as a graphical user interface (GUI) presented on display 1212 (e.g., a user-actuated touch screen), having one or more images of the user-activated mechanisms (e.g., buttons, knobs, sliders, etc.) configured to interface with a user and receive user input control signals via display 1212.
- a user may selectively turn on or off the various detections/alarms provided by monitoring system 1200 or adjust other configurations of monitoring system 1200 using control panel 1217.
- a control panel may be implemented or presented at an external device (e.g., a mobile phone, a tablet device, a laptop computer, a desktop computer, an automobile information display system, or any other device that may be used to process thermal images and/or present monitoring information), which may receive user input signals and communicate them to monitoring system 1200.
- Miscellaneous components 1218 may include any other device or component as may be desired for various applications of monitoring system 1200.
- miscellaneous components 1218 may include a warning light (e.g., a strobe light, a flashing light), a chime, a speaker with associated circuitry for generating a tone, or other appropriate devices that may be used to generate an audible and/or visible alert in response to the alarm generated by processor 1208.
- miscellaneous components 1218 may include a microphone for capturing sound from, for example, infant 1232 in scene 1230, so that a user may hear any sound that infant 1232 makes in addition to viewing the monitoring information.
- in some embodiments, miscellaneous components 1218 may include a temperature sensor (e.g., a thermocouple, a thermometer), a moisture sensor, and other sensors that may provide reference data points for calibrating or verifying the various thermal image analytics described herein.
- processor 1208 may be combined with infrared imaging module 1202, memory 1210, and/or communication module 1214.
- processor 1208 may be combined with infrared imaging module 1202 with only certain operations of processor 1208 performed by circuitry (e.g., a processor, logic device, microprocessor, microcontroller, etc.) within infrared imaging module 1202.
- Fig. 14 illustrates an infant monitoring system 1400 provided in a camera housing 1420 and a display housing 1422 in accordance with an embodiment of the disclosure.
- Monitoring system 1400 may include an infrared imaging module 1402, a visible light camera 1406, a processor 1408, a memory 1410, a display 1412, communication modules 1414, motion sensors 1416, a control panel 1417, and other miscellaneous components 1418, any one of which may be implemented in the same or similar manner as the corresponding components of monitoring system 1200 of Fig. 12.
- Camera housing 1420 may be implemented in a similar manner as housing 1220, and may house infrared imaging module 1402, visible light camera 1406, communication module 1414, and motion sensors 1416.
- Camera housing 1420 may comprise a clamp 1424 or other suitable attachment mechanisms to releasably attach camera housing 1420 to a suitable structure 1428 (e.g., a bed rail, a headboard, a crib barrier, a frame of a stroller, a car seatback, or any other suitable part of a piece of furniture) at a location that allows at least a portion of an infant 1432 to be placed within an FOV 1404 of infrared imaging module 1402.
- Camera housing 1420 may further comprise an articulable joint 1426 or other similar mechanism for further adjusting the position, orientation, and/or angle of camera housing 1420.
- a user may releasably attach and/or adjust camera housing 1420 to position infrared imaging module 1402 for capturing a scene 1430 that includes at least a portion of infant 1432, so as to monitor an infant in a crib, on a bed, in a play area, in a stroller, in a car, or any other place where an infant or other persons needing observation may be placed.
- camera housing 1420 may alternatively or additionally comprise a stand that may be configured to allow camera housing 1420 to be placed on a table top or any other substantially horizontal surface.
- a display housing 1422 may be used to house display 1412 and communication module 1414. Other remaining components, such as processor 1408, memory 1410, and miscellaneous components 1418, may be housed in camera housing 1420, display housing 1422, or both (e.g., components may be replicated or divided into parts) as desired for various applications of monitoring system 1400.
- Display housing 1422 may be portable and separate from camera housing 1420, so that monitoring information may be viewed by a user at a location remote from scene 1430 captured by infrared imaging module 1402 in camera housing 1420.
- communication modules 1414 may facilitate communication between a component (e.g., infrared imaging module 1402) housed in camera housing 1420 and another component (e.g., processor 1408) housed in display housing 1422 via a wired link 1413 (e.g., including a network router, switch, or hub) or a wireless link (e.g., including a wireless router or hub).
- monitoring system 1200/1400 may allow a user to define a virtual boundary 1440.
- a user may define virtual boundary 1440 through, for example, an interaction with control panel 1217/1417 and/or the GUI presented on display 1212/1412.
- Virtual boundary 1440 may be defined by a user to delineate an area where it may be unsafe or otherwise undesirable for an infant to be present. For example, the area in scene 1430 outside virtual boundary 1440 may be indicated by a user as unsafe or otherwise undesirable.
- processor 1208/1408 may be configured to detect the presence of and track the location of an infant as described above.
- Processor 1208/1408 may be further configured to determine whether the approximate location of the infant falls outside a safe area defined by virtual boundary 1440, and generate an alarm upon determining that the infant may be outside virtual boundary 1440 or undetected in the thermal images.
- a user can be alerted if an infant crawls out of a safe play zone, an infant falls off a bed, or otherwise moves out of a safe area defined by a virtual boundary.
- in some embodiments, processor 1208/1408 may be configured to detect an infant out of bounds by analyzing and comparing pixel coordinates of the location of an infant with those of a safe area defined by a virtual boundary. In other embodiments, the detection may be performed using one or more image analysis operations (also referred to as video analytics), which may include scene reconstruction operations, object tracking operations, and/or virtual tripwire detection operations.
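- For illustration, a rectangular safe-area check against the tracked infant location might be sketched as follows; a real implementation could instead use arbitrary polygons or virtual tripwire analytics:

```python
def outside_safe_area(infant_xy, safe_rect):
    """Return True when the infant's tracked pixel location leaves the safe area.

    infant_xy: (x, y) centroid of the tracked infant blob, or None if the
    infant is undetected in the thermal images (also treated as an alarm).
    safe_rect: (x_min, y_min, x_max, y_max) in thermal image pixel coordinates.
    """
    if infant_xy is None:
        return True
    x, y = infant_xy
    x_min, y_min, x_max, y_max = safe_rect
    return not (x_min <= x <= x_max and y_min <= y <= y_max)
```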
- Fig. 14 also shows an example screenshot on display 1412.
- Some or all of the monitoring information generated by processor 1208/1408 may be presented on display 1212/1412 in various text and/or graphical forms. In some embodiments, some or all of the monitoring information may be provided additionally or alternatively in audible form, as well as through various indicators and lights (e.g., flashing alarm lights).
- This example screenshot shows an alarm 1442, temperature information 1443, infant posture information 1444, and a user-viewable image of the scene presented on display 1212/1412 for viewing by a user. The user-viewable image may show a thermographic shape 1446 of the infant, as well as a temperature scale 1448.
- the user- viewable image may also be presented in a more natural color (e.g., using visible light images alone or combined with thermal images), in addition to or as an alternative to presenting thermograms.
- a user may view images of the scene (e.g., scene 1230/1430) including the infant (e.g., infant 1232/1432) even in complete darkness, and at the same time advantageously obtain various alerts and descriptions of the monitoring information. Therefore, monitoring system 1200/1400 may be conveniently placed for a remote monitoring of an infant or other persons for whom observation is desired or needed.
- monitoring system 1200/1400 may determine various conditions associated with the scene and the infant, and generate monitoring information.
- the monitoring information may include, but is not limited to, alarms to warn of an abnormal breathing, an abnormal temperature, a posture change, a foreign substance in mouth/nose, and an infant out of a safe zone, as well as various conditions (e.g., posture, temperature) associated with the infant and user-viewable images converted from thermal images of the scene.
- Monitoring system 1200/1400 may thus advantageously provide an active warning to caregivers, and thereby help prevent death, injury, or other harm attributable to SIDS and other conditions of the infant and/or the environment.
- Monitoring system 1200/1400 may also advantageously provide to caregivers a clear view of the infant even when the infant is placed in a low or no light environment.
- process 1500 may be performed by monitoring system 1200/1400 for monitoring infant 1232/1432.
- monitoring system 1200/1400 and infant 1232/1432 are identified only for purposes of giving examples; any other suitable system may be used to perform all or part of process 1500.
- thermal images (e.g., containing pixels with radiometric data) of a scene (e.g., scene 1230/1430) including at least a partial view of an infant (e.g., infant 1232/1432) may be captured by an infrared imaging module (e.g., infrared imaging module 1202/1402).
- the captured thermal images may be radiometrically calibrated thermal images as described above in connection with infrared imaging module 1202/1402.
- the captured thermal images may be scale and/or perspective calibrated thermal images. That is, geometric properties (e.g., size and position) of objects (e.g., an infant) in the actual scene can be derived from the pixel coordinates of objects in the thermal images.
- Scale/perspective calibration may be performed manually or automatically using suitable techniques when infrared imaging module (e.g., infrared imaging module 1202/1402) is first installed at a desired location.
- automatic recalibration may also be performed using suitable techniques periodically after installation.
- the captured thermal images may be received, for example, at processor 1208/1408 that is communicatively coupled to infrared imaging module 1202/1402.
- the captured thermal images may be transmitted from an infrared imaging module via wireless or wired connection using appropriate network protocols and interfaces (e.g., through communication modules 1214/1414).
- the captured thermal images may be transmitted wirelessly to processor 1208/1408, which may be co-located with display 1212/1412 in display housing 1422 placed near a user (e.g., a parent, a caregiver) for remote monitoring of an infant.
- an NUC process may be performed on the captured thermal images to remove noise therein, for example, by using various NUC techniques disclosed herein.
- the captured thermal images may be analyzed to generate monitoring information regarding the infant. For example, various analysis and processing operations may be performed on the captured thermal images to detect and track the infant, and determine various attributes associated with the infant and/or the scene.
- regions of contiguous pixels having temperature values in a specific range may be detected from the radiometrically calibrated thermal images for detection and tracking of the infant.
- the detection operation may differentiate a region (or a "blob") having a surface temperature distribution that is characteristic of an infant (e.g., with an exposed face).
- the thermal images and the blob detected therein may be further processed and/or analyzed, for example, by performing various filtering operations and analyzing the size, shape, and/or thermal characteristics of the blobs, to ascertain the detection of the infant and to further localize the face and facial features for tracking.
- facial features such as the eyes, mouth, and nostrils generally exhibit temperatures higher than other exposed areas of the face.
- filtering operations such as dilation and threshold filtering performed on the detected blob may be utilized to further localize the facial features.
- the size, shape, and/or radiometric properties of the localized facial features may be further analyzed if needed to ascertain the detection of the facial features.
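- The dilation and threshold filtering mentioned above might be sketched like this; the feature temperature threshold and single dilation pass are assumed values for illustration:

```python
import numpy as np
from scipy.ndimage import binary_dilation

def localize_facial_features(temps_f, face_mask, feature_threshold=95.0):
    """Localize hot facial features (eyes, nostrils, mouth) within a face blob.

    Thresholds for pixels hotter than the surrounding face, then dilates
    the result slightly so small features form contiguous regions.
    """
    hot = (temps_f > feature_threshold) & face_mask
    return binary_dilation(hot, iterations=1)
```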
- the thermal images may be analyzed to detect one or more candidate foreground objects, for example, using background modeling techniques, edge detection techniques, or other foreground object detection techniques suitable for use with thermal images.
- the radiometric properties (e.g., surface temperature distribution) of the candidate objects may then be analyzed to determine whether they correspond to those of an infant that may be present in the scene. For example, a doll placed on a baby crib may initially be detected as a candidate foreground object, but its radiometric properties may then quickly reveal that it does not have a surface temperature distribution characteristic of an infant and thus is not an infant.
- object detection using the thermal images may be less susceptible to false detection of spurious objects compared with object detection techniques using visible light images.
- the size and shape of the candidate objects may also be analyzed, so that the detection may be ascertained based on the size, the shape, and the radiometric properties of the detected candidates. As described above, further processing and analysis operations may be performed if needed to localize and track the facial features of the infant.
- background modeling techniques may be used to detect objects in the scene. Because the background (e.g., a baby crib or bed) of the scene rarely changes and because thermal images are generally insensitive to changing lighting conditions, a background model (e.g., pixels that belong to a background) may be constructed with high accuracy, and a region of pixels different from the background (also referred to as a "region of interest") may easily be distinguished as a candidate foreground object. As described above, the radiometric properties of such a region of interest (ROI) may then be analyzed to further ascertain whether the detected ROI likely represents an infant or not.
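- A minimal background-model sketch in this spirit follows, using a running average of thermal frames and a fixed difference threshold; both the learning rate and the threshold are assumptions:

```python
import numpy as np

class ThermalBackgroundModel:
    """Running-average background model for a mostly static thermal scene."""

    def __init__(self, first_frame, learning_rate=0.01, threshold=2.0):
        self.background = first_frame.astype(np.float64)
        self.learning_rate = learning_rate
        self.threshold = threshold  # degrees of difference marking foreground

    def update(self, frame):
        """Fold a new frame into the background and return the foreground mask."""
        frame = frame.astype(np.float64)
        foreground = np.abs(frame - self.background) > self.threshold
        # Only adapt the background where the scene still looks like background.
        bg = ~foreground
        self.background[bg] += self.learning_rate * (frame[bg] - self.background[bg])
        return foreground
```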
- the various processing and analysis operations described for block 1506 may be omitted or included, and may be performed in any other order as appropriate for detecting and tracking an infant and/or its face. For example, in some embodiments, detecting a warm "blob" in the thermal images may be sufficient to detect and track an infant in a scene, whereas in other embodiments various thermal image analytics may be performed in combination to increase the accuracy of the detection and tracking.
- operations for block 1506 may additionally or alternatively involve performing suitable face detection and tracking algorithms on the visible light images or combined images of the visible light images and the thermal images. If the detection and tracking of the face and facial features are performed using the visible light images, operations for block 1506 may further involve converting pixel coordinates of the tracked face and facial features in the visible light images to corresponding pixel coordinates in the thermal images. Other appropriate techniques for detecting and tracking objects in the thermal images by analyzing the thermal images, visible light images, and/or combined images may also be utilized for block 1506.
- Various attributes associated with the infant and/or the scene may be determined to generate monitoring information, by further analysis and processing and/or during the processing and analysis performed for detection and tracking.
- the approximate body temperature, the approximate ambient temperature, the relative location of the infant in the scene, and the posture of the infant may be determined by analyzing and processing the thermal images as described above for processor 1208 of Fig. 12.
- the various attributes may be further analyzed and/or processed to generate alarms to warn of an abnormal body temperature, a posture change, and an infant moving out of a safe zone.
- exhaled breaths from the infant may be detected by further analyzing the tracked oronasal region, and alarms may be generated if an abnormal breathing pattern is detected as described above with respect to processor 1208 of Fig. 12. As also described for processor 1208, an alarm may be generated if a foreign substance is detected in the tracked oronasal region of the infant.
- user-viewable images of the scene may be generated.
- the user-viewable images may be generated by converting the thermal images using appropriate methods and algorithms. For example, the thermal data (e.g., temperature data) contained in the pixels of the thermal images may be converted into gray-scaled or color-scaled pixels to construct images that can be viewed by a person.
- the user-viewable thermal images may optionally include a legend or scale that indicates the approximate temperature of corresponding pixel color and/or intensity.
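- a minimal sketch of such a conversion, assuming radiometric pixels already expressed in degrees Celsius; the display range and legend step count are illustrative choices:

```python
import numpy as np

def to_user_viewable(frame, t_min=20.0, t_max=40.0):
    """Linearly map temperatures to 8-bit gray levels, clipping outside the range."""
    scaled = (np.clip(frame, t_min, t_max) - t_min) / (t_max - t_min)
    return (scaled * 255.0).astype(np.uint8)

def legend_entries(t_min=20.0, t_max=40.0, steps=5):
    """Pair gray levels with approximate temperatures for an on-screen legend or scale."""
    temps = np.linspace(t_min, t_max, steps)
    levels = ((temps - t_min) / (t_max - t_min) * 255.0).astype(np.uint8)
    return list(zip(levels.tolist(), temps.tolist()))
```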
- process 1500 may further include capturing a visible light image of the scene using a visible light camera (e.g., visible light camera 1206/1406).
- user-viewable images may optionally be presented in a more natural color, using the visible light images alone or combined with the thermal images instead of presenting thermograms when, for example, enough light is available to generate discernible visible light images.
- operations for block 1508 may also involve fusing or combining the thermal images and the visible light images to generate user-viewable images having a higher definition, contrast, and/or detail.
- Fig. 16 is a flowchart of a process 1600 to combine or fuse the thermal images and the visible light images.
- the combined images may include radiometric data and/or other infrared characteristics corresponding to scene 1230/1430, but with significantly more object detail (e.g., contour or edge detail) and/or contrast than typically provided by the thermal or visible light images alone.
- the combined images generated in these examples may beneficially provide sufficient radiometric data, detail, and contrast to allow easier recognition and/or interpretation of the condition of the infant.
- visible light images may be received.
- visible light images of scene 1230/1430 may be captured by visible light camera 1206/1406, and the captured visible light images may be received by processor 1208/1408 in a similar manner as described for receiving the thermal images in block 1502.
- processor 1208/1408 may perform various operations of process 1600 using both the thermal images and visible light images, for example.
- high spatial frequency content from one or more of the visible light and thermal images may be derived at block 1608.
- processor 1208/1408 may be configured to derive high spatial frequency content from one or more of the visible light and thermal images received in blocks 1602 and/or 1502.
- High spatial frequency content derived according to various embodiments may include edge/contour details and/or high contrast pixels extracted from the one or more of the visible light and thermal images, for example.
- high spatial frequency content may be derived from the received images by performing a high pass filter (e.g., a spatial filter) operation on the images, where the result of the high pass filter operation is the high spatial frequency content.
- high spatial frequency content may be derived from the received images by performing a low pass filter operation on the images, and then subtracting the result from the original images to get the remaining content, which is the high spatial frequency content.
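- a sketch of the low-pass-then-subtract derivation just described, assuming scipy is available; the Gaussian kernel width is an illustrative choice, and a direct high pass filter would serve equally:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def high_freq_by_subtraction(image, sigma=2.0):
    """Subtract a low-pass filtered copy; the residue is the high spatial frequency content."""
    img = np.asarray(image, dtype=np.float32)
    lowpass = gaussian_filter(img, sigma=sigma)  # low pass filter operation
    return img - lowpass                         # edge/contour detail remains
```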
- high spatial frequency content may be derived from a selection of images through difference imaging, for example, where one image is subtracted from a second image that is perturbed from the first image in some fashion, and the result of the subtraction is the high spatial frequency content.
- optical elements 1203 of infrared imaging module 1202/1402 and/or optical elements of visible light camera 1206/1406 may be configured to introduce vibration, de-focusing, and/or movement artifacts into a series of images captured by one or both of infrared imaging module 1202/1402 and visible light camera 1206/1406.
- High spatial frequency content may be derived from subtractions of adjacent or semi-adjacent images in the series.
- high spatial frequency content may be derived from only the visible light images or the thermal images. In other embodiments, high spatial frequency content may be derived from only a single visible light or thermal image. In further embodiments, high spatial frequency content may be derived from one or more components of the visible light and/or thermal images, such as a luminance component of visible light images, for example, or a radiometric component of thermal images. Resulting high spatial frequency content may be stored temporarily (e.g., in memory 1210/1410) and/or may be further processed according to block 1608.
- one or more thermal images may be de-noised.
- processor 1208/1408 may be configured to de-noise, smooth, or blur one or more thermal images of scene 1230/1430 using a variety of image processing operations.
- removing high spatial frequency noise from the thermal images allows the processed thermal images to be combined with high spatial frequency content derived according to block 1604 with significantly less risk of introducing double edges (e.g., edge noise) to objects depicted in combined images of scene 1230/1430.
- removing noise from the thermal images may include performing a low pass filter (e.g., a spatial and/or temporal filter) operation on the images, where the result of the low pass filter operation is de-noised or processed thermal images.
- removing noise from one or more thermal images may include down-sampling the thermal images and then up-sampling the images back to the original resolution.
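- a minimal sketch of the down-/up-sampling de-noise just described, using plain numpy block averaging for the downsample and nearest-neighbour repetition for the upsample; the factor of 2 is an assumed choice:

```python
import numpy as np

def denoise_by_resampling(image, factor=2):
    """Average factor x factor blocks, then expand back to the original resolution."""
    img = np.asarray(image, dtype=np.float32)
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor      # crop to a multiple of factor
    small = img[:h2, :w2].reshape(h2 // factor, factor,
                                  w2 // factor, factor).mean(axis=(1, 3))
    up = np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)
    out = img.copy()
    out[:h2, :w2] = up                           # high spatial frequency noise removed
    return out
```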
- processed thermal images may be derived by actively blurring thermal images of scene 1230/1430.
- optical elements 1203 may be configured to slightly de-focus one or more thermal images captured by infrared imaging module 1202/1402.
- the resulting intentionally blurred thermal images may be sufficiently de-noised or blurred so as to reduce or eliminate a risk of introducing double edges into combined images of scene 1230/1430, as further described below.
- blurring or smoothing image processing operations may be performed by processor 1208/1408 on the received thermal images as an alternative or supplement to using optical elements 1203 to actively blur thermal images of scene 1230/1430.
- Resulting processed thermal images may be stored temporarily (e.g., in memory 1210/1410) and/or may be further processed according to block 1608.
- high spatial frequency content may be blended with one or more thermal images.
- processor 1208/1408 may be configured to blend high spatial frequency content derived in block 1604 with one or more thermal images of scene 1230/1430, such as the processed thermal images provided in block 1606.
- high spatial frequency content may be blended with thermal images by superimposing the high spatial frequency content onto the thermal images, where the high spatial frequency content replaces or overwrites those portions of the thermal images corresponding to where the high spatial frequency content exists.
- the high spatial frequency content may include edges of objects depicted in images of scene 1230/1430, but may not exist within the interior of such objects.
- blended image data may simply include the high spatial frequency content, which may subsequently be encoded into one or more components of combined images, as described in block 1610.
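- a hedged sketch of this superposition, where the high spatial frequency content overwrites the de-noised thermal pixels only where it is significant (e.g., at object edges); the significance threshold is an assumption:

```python
import numpy as np

def superimpose(thermal, high_freq, rel_thresh=0.05):
    """Replace thermal pixels wherever the high-frequency magnitude is significant."""
    out = np.asarray(thermal, dtype=np.float32).copy()
    peak = np.abs(high_freq).max()
    if peak > 0.0:
        mask = np.abs(high_freq) > rel_thresh * peak  # edge pixels only
        out[mask] = high_freq[mask]                   # object interiors keep thermal values
    return out
```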
- a radiometric component of thermal images may be a chrominance component of the thermal images, and the high spatial frequency content may be derived from the luminance and/or chrominance components of visible light images.
- combined images may include the radiometric component (e.g., the chrominance component of the thermal images) encoded into a chrominance component of the combined images and the high spatial frequency content directly encoded (e.g., as blended image data but with no thermal image contribution) into a luminance component of the combined images. By doing so, a radiometric calibration of the radiometric component of the thermal images may be retained.
- blended image data may include the high spatial frequency content added to a luminance component of the thermal images, and the resulting blended data encoded into a luminance component of resulting combined images.
- high spatial frequency content may be derived from one or more particular components of one or a series of visible light and/or thermal images, and the high spatial frequency content may be encoded into corresponding one or more components of combined images.
- the high spatial frequency content may be derived from a luminance component of visible spectrum images, and the high spatial frequency content, which in this embodiment is all luminance image data, may be encoded into a luminance component of combined images.
- high spatial frequency content may be blended with thermal images using a blending parameter and an arithmetic equation.
- the high spatial frequency content may be derived from a luminance component of visible light images.
- the high spatial frequency content may be blended with a corresponding luminance component of the thermal images according to a blending parameter and a blending equation to produce blended image data.
- the blended image data may be encoded into a luminance component of combined images, for example, and the chrominance component of the thermal images may be encoded into the chrominance component of the combined images.
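- as an illustrative sketch of such parameterized blending, a simple convex combination of the thermal luminance and the visible-derived high spatial frequency content may be used; the convex form and the parameter name zeta are assumptions, since the disclosure admits other blending equations:

```python
import numpy as np

def blend_luminance(y_ir, hp_y_vis, zeta=0.5):
    """zeta = 1 keeps only thermal luminance; zeta = 0 keeps only visible detail."""
    return zeta * np.asarray(y_ir, dtype=np.float32) + (1.0 - zeta) * hp_y_vis

def combine_ycrcb(y_ir, cr_ir, cb_ir, hp_y_vis, zeta=0.5):
    """Encode blended data into Y; thermal chrominance retains radiometric calibration."""
    return blend_luminance(y_ir, hp_y_vis, zeta), cr_ir, cb_ir
```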
- because the radiometric component of the thermal images may be their chrominance component, the combined images may retain a radiometric calibration of the thermal images.
- portions of the radiometric component may be blended with the high spatial frequency content and then encoded into combined images.
- the high spatial frequency content may be derived from one or more components of the visible light images and/or the thermal images.
- the high spatial frequency content may be blended with one or more components of the thermal images to produce blended image data (e.g., using a blending parameter and a blending equation), and resulting combined images may include the blended image data encoded into corresponding one or more components of the combined images.
- the one or more components of the blended data do not have to correspond to the eventual one or more components of the combined images (e.g., a color space/format conversion may be performed as part of an encoding process).
- a blending parameter value may be selected by a user (e.g., via control panel 1217/1417), or may be automatically determined by processor 1208/1408 according to context or other data, for example, or according to an image enhancement level expected by infant monitoring system 1200/1400.
- the blending parameter may be adjusted or refined using a knob of control panel 1217/1417, for example, while combined images are being displayed by display 1212/1412.
- a blending parameter may be selected such that blended image data includes only thermal characteristics, or, alternatively, only visible light characteristics.
- a blending parameter may also be limited in range, for example, so as not to produce blended data that is out-of-bounds with respect to a dynamic range of a particular color space/format or a display.
- processing according to the high contrast mode may include one or more processing steps, ordering of processing steps, arithmetic combinations, and/or adjustments to blending parameters as disclosed in U.S. Patent Application No. 13/437,645 previously referenced herein.
- the following equations may be used to determine the components Y, Cr and Cb for the combined images with the Y component from the high pass filtered visible light images and the Cr and Cb components from the thermal images.
- hp_y_vis = highpass(y_vis)
- (y_ir, cr_ir, cb_ir) = colored(lowpass(ir_signal_linear))
- highpass(y_vis) may be high spatial frequency content derived from high pass filtering a luminance component of visible light images.
- Colored(lowpass(ir_signal_linear)) may be the resulting luminance and chrominance components of the thermal images after the thermal images are low pass filtered.
- the thermal images may include a luminance component that is selected to be 0.5 times a maximum luminance (e.g., of a display and/or a processing step).
- the radiometric component of the thermal images may be the chrominance component of the thermal images.
- the y_ir component of the thermal images may be dropped, and the components of the combined images may be (hp_y_vis, cr_ir, cb_ir), using the notation above.
- the following equations may be used to determine the components Y, Cr and Cb for combined images with the Y component from the high pass filtered visible light images and the Cr and Cb components from the thermal images.
- alpha can in principle be an infinitely large number, but in practice a limitation will likely be necessary to restrict alpha to a range convenient for the current application.
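- a sketch consistent with the unbounded-alpha remark above: an additive form such as comb_y = y_ir + alpha * hp_y_vis admits arbitrarily large alpha, so the value is clamped to an application-chosen maximum; this exact equation is inferred from the surrounding text and should be read as an assumption rather than a quotation:

```python
import numpy as np

ALPHA_MAX = 10.0  # assumed practical bound for the current application

def combine_high_contrast(y_ir, cr_ir, cb_ir, hp_y_vis, alpha=1.0):
    """Y from thermal luminance plus scaled visible detail; Cr/Cb from the thermal images."""
    alpha = min(alpha, ALPHA_MAX)                      # limit the size of alpha
    comb_y = np.clip(y_ir + alpha * hp_y_vis, 0, 255)  # keep Y within the display range
    return comb_y, cr_ir, cb_ir
```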
- processing may proceed to block 1610, where blended data may be encoded into components of the combined images in order to form the combined images.
- the blended data may be encoded into one or more components of the combined images.
- processor 1208/1408 may be configured to encode blended data derived or produced in accordance with block 1608 into combined images in a manner that increases, refines, or otherwise enhances the information conveyed by either the visible light or thermal images viewed by themselves.
- encoding blended image data into a component of combined images may include additional image processing operations such as dynamic range adjustment, normalization, gain and offset operations, noise reduction, and color space conversions, for example.
- processor 1208/1408 may be configured to encode other image data into combined images. For example, if blended image data is encoded into a luminance component of combined images, a chrominance component of either visible light images or thermal images may be encoded into a chrominance component of combined images. Selection of source images may be made through user input, for example, or may be determined automatically based on context or other data. More generally, in some embodiments, a component of combined images that is not encoded with blended data may be encoded with a corresponding component of visible light images or thermal images. By doing so, a radiometric calibration of thermal images and/or a color space calibration of visible light images may be retained in the resulting combined images.
- the combined images obtained according to one or more embodiments of process 1600 may then be utilized at block 1508 to generate user-viewable images having higher contrast and/or detail than those that may be generated using thermal images alone.
- the generated monitoring information, including the user-viewable images, may be presented to a user.
- some or all of the monitoring information may be presented on a display (e.g., display 1212/1412) as text descriptions, graphics, and/or icons, as shown in the example screenshot of display 1412.
- some of the monitoring information may be presented additionally or alternatively in audible form.
- users may be notified of the various alarms by sounding a siren and/or delivering a computer-generated or pre-recorded speech announcement, using a speaker, a bell, a siren, a chime, and/or other components for generating sound.
- some or all of the monitoring information may be presented using various lights and indicators.
- segmented LED indicators may be used to present temperature information, and flashing lights may be used to indicate the various alarms.
- the generated monitoring information may be transmitted from a processor via wireless or wired connection using appropriate network protocols and interfaces (e.g., through communication module 1214/1414) to a remotely located display or an external device to present the monitoring information.
- the generated monitoring information may be converted, wrapped, structured, or otherwise formatted for data exchange with an external device using suitable application layer protocols (e.g., Simple Object Access Protocol (SOAP), Hypertext Transfer Protocol (HTTP)) or a proprietary data exchange format.
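- as one hedged illustration of such formatting, monitoring information could be serialized as JSON and exchanged over HTTP using only the Python standard library; the endpoint URL and payload fields below are hypothetical, and SOAP or a proprietary format would be equally valid per the description above:

```python
import json
import urllib.request

def send_monitoring_info(body_temp_c, posture, alarms,
                         url="http://monitor.example/api/status"):  # hypothetical endpoint
    payload = json.dumps({
        "body_temperature_c": body_temp_c,
        "posture": posture,
        "alarms": alarms,
    }).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # sends a POST, since data is provided
        return resp.status
```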
- process 1500 may advantageously provide non-contact (e.g., without placing sensor patches on the body or sensor pads on mattresses) and active (e.g., automatically detecting) monitoring of various conditions associated with an infant, and permit a user to conveniently view the monitoring information at a remote location and/or on an external device.
- Process 1500 may also advantageously provide user-viewable images of a scene including at least a partial view of an infant, even when the scene receives little or no illumination.
- visible images and/or thermal images may be blended or otherwise combined in accordance with any of the techniques set forth in U.S. Patent Application Nos. 61/473,207, 61/746,069, 61/746,074, 61/792,582, 61/793,952, 12/766,739, 13/105,765, or 13/437,645, or International Patent
- Non-transitory instructions, program code, and/or data can be stored on one or more non-transitory machine readable mediums. It is also contemplated that software identified herein can be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein can be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Toxicology (AREA)
- Physiology (AREA)
- Pulmonology (AREA)
- Studio Devices (AREA)
- Radiation Pyrometers (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Photometry And Measurement Of Optical Pulse Characteristics (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201380047256.1A CN104684465B (zh) | 2012-07-12 | 2013-07-12 | 使用热成像的婴儿监测系统及方法 |
Applications Claiming Priority (12)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201261670824P | 2012-07-12 | 2012-07-12 | |
| US61/670,824 | 2012-07-12 | ||
| US201261746074P | 2012-12-26 | 2012-12-26 | |
| US201261746069P | 2012-12-26 | 2012-12-26 | |
| US61/746,074 | 2012-12-26 | ||
| US61/746,069 | 2012-12-26 | ||
| US201261748018P | 2012-12-31 | 2012-12-31 | |
| US61/748,018 | 2012-12-31 | ||
| US201361793952P | 2013-03-15 | 2013-03-15 | |
| US201361792582P | 2013-03-15 | 2013-03-15 | |
| US61/792,582 | 2013-03-15 | ||
| US61/793,952 | 2013-03-15 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014012070A1 true WO2014012070A1 (fr) | 2014-01-16 |
Family
ID=48857017
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2013/050393 Ceased WO2014012070A1 (fr) | 2012-07-12 | 2013-07-12 | Systèmes et procédés de surveillance de nourrisson à l'aide d'une imagerie thermique |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN104684465B (fr) |
| WO (1) | WO2014012070A1 (fr) |
Cited By (36)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104486562A (zh) * | 2014-12-26 | 2015-04-01 | 昆明物理研究所 | 基于固定积分时间的嵌入式红外图像超帧处理方法 |
| EP3029600A1 (fr) * | 2014-12-01 | 2016-06-08 | Axis AB | Détection d'objet d'image thermique |
| WO2016108225A1 (fr) * | 2014-12-31 | 2016-07-07 | Parenting Science Ltd | Systèmes et procédés permettant de surveiller et d'encourager l'exercice physique chez les nourrissons |
| WO2016116307A1 (fr) | 2015-01-19 | 2016-07-28 | Koninklijke Philips N.V. | Dispositif, système et procédé de détection de peau |
| CN106413545A (zh) * | 2014-05-13 | 2017-02-15 | 欧姆龙株式会社 | 姿势估计装置、姿势估计系统、姿势估计方法、姿势估计程序和记录该程序的计算机介质 |
| CN106919806A (zh) * | 2017-04-27 | 2017-07-04 | 刘斌 | 一种人体监测方法、装置以及系统及计算机可读存储设备 |
| EP3164990A4 (fr) * | 2014-04-08 | 2017-12-06 | UdiSense Inc. | Systèmes et procédés de configuration de caméras de surveillance de bébé servant à obtenir des ensembles de données uniformes à des fins d'analyse |
| WO2018020509A1 (fr) * | 2016-07-28 | 2018-02-01 | Chigru Innovations (OPC) Private Limited | Système de surveillance de nourrisson |
| US10083501B2 (en) | 2015-10-23 | 2018-09-25 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
| GB2565279A (en) * | 2017-08-01 | 2019-02-13 | Jaguar Land Rover Ltd | Image processor and method for image processing |
| US10271020B2 (en) | 2014-10-24 | 2019-04-23 | Fluke Corporation | Imaging system employing fixed, modular mobile, and portable infrared cameras with ability to receive, communicate, and display data and images with proximity detection |
| USD854074S1 (en) | 2016-05-10 | 2019-07-16 | Udisense Inc. | Wall-assisted floor-mount for a monitoring camera |
| US10357117B2 (en) | 2016-07-13 | 2019-07-23 | Chigru Innovations (OPC) Private Limited | Rocking cradle |
| USD855684S1 (en) | 2017-08-06 | 2019-08-06 | Udisense Inc. | Wall mount for a monitoring camera |
| CN110633710A (zh) * | 2019-09-09 | 2019-12-31 | 重庆小富农康农业科技服务有限公司 | 一种生猪疾病预警系统 |
| US10530977B2 (en) | 2015-09-16 | 2020-01-07 | Fluke Corporation | Systems and methods for placing an imaging tool in a test and measurement tool |
| US10539268B2 (en) | 2016-07-13 | 2020-01-21 | Chigru Innovations (OPC) Private Limited | Oscillation systems |
| US10602082B2 (en) | 2014-09-17 | 2020-03-24 | Fluke Corporation | Triggered operation and/or recording of test and measurement or imaging tools |
| US10708550B2 (en) | 2014-04-08 | 2020-07-07 | Udisense Inc. | Monitoring camera and mount |
| CN111507268A (zh) * | 2020-04-17 | 2020-08-07 | 浙江大华技术股份有限公司 | 报警方法及装置、存储介质和电子装置 |
| USD900431S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle blanket with decorative pattern |
| USD900428S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle band |
| USD900429S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle band with decorative pattern |
| USD900430S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle blanket |
| RU2737138C1 (ru) * | 2020-08-19 | 2020-11-25 | ООО "Ай Ти Ви групп" | Система и способ контроля температуры тела людей по видеоданным |
| CN112057074A (zh) * | 2020-07-21 | 2020-12-11 | 北京迈格威科技有限公司 | 呼吸速率测量方法、装置、电子设备及计算机存储介质 |
| US10874332B2 (en) | 2017-11-22 | 2020-12-29 | Udisense Inc. | Respiration monitor |
| US20210212595A1 (en) * | 2018-06-13 | 2021-07-15 | Braintrain2020 Limited | Apparatus for sensing |
| WO2021161016A1 (fr) * | 2020-02-11 | 2021-08-19 | BreatheOx Limited | Dispositif de surveillance respiratoire |
| RU2754392C1 (ru) * | 2018-05-18 | 2021-09-01 | Эссити Хайджиен Энд Хэлс Актиболаг | Обнаружение присутствия и отсутствия |
| US20210358284A1 (en) * | 2020-05-14 | 2021-11-18 | Yun yun AI Baby camera Co., Ltd. | Visible-light-image physiological monitoring system with thermal detecting assistance |
| IL275524B (en) * | 2020-06-18 | 2021-12-01 | Elbit Systems C4I & Cyber Ltd | System and method for measuring parameters without contact |
| TWI755907B (zh) * | 2020-10-23 | 2022-02-21 | 正修學校財團法人正修科技大學 | 人臉影像真偽辨識系統及其方法 |
| US11669962B2 (en) | 2020-10-26 | 2023-06-06 | Covidien Lp | Temperature monitoring with a thermal camera |
| CN117690159A (zh) * | 2023-12-07 | 2024-03-12 | 武汉星巡智能科技有限公司 | 基于多模态数据融合的婴幼儿趴睡监测方法、装置及设备 |
| US12211237B2 (en) | 2018-05-09 | 2025-01-28 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Correlation of thermal satellite image data for generating thermal maps at high spatial resolution |
Families Citing this family (46)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104586401A (zh) * | 2015-01-19 | 2015-05-06 | 赵树乔 | 一种追踪人体姿态的方法 |
| CN105389943B (zh) * | 2015-11-18 | 2017-10-13 | 上海斐讯数据通信技术有限公司 | 防止跌落的安全装置、安全监控系统及方法 |
| US10912516B2 (en) | 2015-12-07 | 2021-02-09 | Panasonic Corporation | Living body information measurement device, living body information measurement method, and storage medium storing program |
| JP6763719B2 (ja) * | 2015-12-07 | 2020-09-30 | パナソニック株式会社 | 生体情報測定装置、生体情報測定方法及びプログラム |
| CN105539217A (zh) * | 2016-02-03 | 2016-05-04 | 成都欧贝乐商贸有限公司 | 一种用于记录儿童健康数据的儿童安全座椅 |
| CN105554477A (zh) * | 2016-02-04 | 2016-05-04 | 武克易 | 物联网智能拍摄系统 |
| WO2017132931A1 (fr) * | 2016-02-04 | 2017-08-10 | 武克易 | Dispositif intelligent de l'internet des objets assurant une fonction de prestation de soins |
| CN105551189A (zh) * | 2016-02-04 | 2016-05-04 | 武克易 | 物联网设备智能看护方法 |
| CN105551188A (zh) * | 2016-02-04 | 2016-05-04 | 武克易 | 具有看护功能的物联网智能设备实现方法 |
| CN105554476A (zh) * | 2016-02-04 | 2016-05-04 | 武克易 | 具有看护功能的物联网智能设备 |
| CN107582303A (zh) * | 2016-05-24 | 2018-01-16 | 朱林清 | 一种血液科一体式手术床 |
| CN106037681A (zh) * | 2016-06-30 | 2016-10-26 | 宁德师范学院 | 一种提高生活质量的床 |
| WO2018069790A1 (fr) * | 2016-10-14 | 2018-04-19 | Facense Ltd. | Systèmes et procédés pour détecter des paramètres respiratoires et fournir une biorétroaction |
| CN106781380A (zh) * | 2016-12-13 | 2017-05-31 | 安徽乐年健康养老产业有限公司 | 一种红外智能语音看护系统 |
| CN106725358A (zh) * | 2016-12-29 | 2017-05-31 | 杭州博博科技有限公司 | 一种病房体温测量数据采集系统 |
| CN107592335A (zh) * | 2017-07-25 | 2018-01-16 | 深圳市盛路物联通讯技术有限公司 | 一种活动区域管理方法及物联网服务器 |
| TWI637352B (zh) * | 2017-08-23 | 2018-10-01 | 緯創資通股份有限公司 | 影像處理裝置和方法 |
| CN107647854A (zh) * | 2017-10-19 | 2018-02-02 | 宋彦震 | 基于物联网的人体信息采集终端 |
| CN107679518A (zh) * | 2017-10-27 | 2018-02-09 | 深圳极视角科技有限公司 | 一种检测系统 |
| CN107944346B (zh) * | 2017-11-02 | 2020-07-03 | 歌尔股份有限公司 | 基于图像处理的异常情况监测方法及监测设备 |
| CN108652625B (zh) * | 2018-02-05 | 2021-07-16 | 苏州朗润医疗系统有限公司 | 一种用于保障磁共振扫描安全的图像识别方法及系统 |
| CN112041848B (zh) * | 2018-03-27 | 2024-05-31 | 泰立戴恩菲力尔有限责任公司 | 人数统计和跟踪系统及方法 |
| CN108852362A (zh) * | 2018-03-29 | 2018-11-23 | 广东美的制冷设备有限公司 | 睡眠状态的检测方法、装置、空调器及可读存储介质 |
| CN108682112A (zh) * | 2018-05-15 | 2018-10-19 | 京东方科技集团股份有限公司 | 一种婴幼儿监护装置、终端、系统、方法和存储介质 |
| CN109091303A (zh) * | 2018-05-24 | 2018-12-28 | 何泽熹 | 智能监护系统 |
| CN108600706B (zh) * | 2018-06-15 | 2023-12-15 | 云南电网有限责任公司文山供电局 | 手持式测温仪扩展无人远程监控系统及监控方法 |
| CN109211409A (zh) * | 2018-09-27 | 2019-01-15 | 中国医学科学院北京协和医院 | 病床监测系统 |
| CN110974186B (zh) * | 2018-10-02 | 2022-08-30 | 希尔-罗姆服务公司 | 用于确定目标区域温度变化的温度监测系统和方法 |
| CN109907739B (zh) * | 2019-03-21 | 2021-07-30 | 苏州浪潮智能科技有限公司 | 一种基于图像识别的睡眠时患感冒的告警方法与系统 |
| CN111507290A (zh) * | 2019-05-28 | 2020-08-07 | 小蚁科技(香港)有限公司 | 受抚者监视和看护系统 |
| CN110338769A (zh) * | 2019-06-18 | 2019-10-18 | 秒针信息技术有限公司 | 告警处理方法、装置、存储介质及电子装置 |
| CN111272293A (zh) * | 2019-12-31 | 2020-06-12 | 扬州海通电子科技有限公司 | 一种热分布监控系统及其检测方法 |
| CN113645435A (zh) * | 2020-04-27 | 2021-11-12 | 财团法人工业技术研究院 | 影像监控装置与方法 |
| CN116057362A (zh) * | 2020-06-01 | 2023-05-02 | 前视红外系统股份公司 | 升高温度筛查系统和方法 |
| CN111696684A (zh) * | 2020-06-12 | 2020-09-22 | 南通沪联智慧医疗科技有限公司 | 一种智能测温物联网大数据预警平台的方法 |
| CN111772633B (zh) * | 2020-07-16 | 2023-06-23 | 韩锋 | 一种遥感呼吸功能监护装置及方法 |
| GB202017750D0 (en) * | 2020-11-10 | 2020-12-23 | Mead Johnson Nutrition Co | Systems and methods for recognising children suffering cows' milk allergy |
| CN113989935A (zh) * | 2021-10-29 | 2022-01-28 | 中汽创智科技有限公司 | 目标检测方法、装置、存储介质和电子设备 |
| CN114216564B (zh) * | 2021-11-26 | 2025-02-11 | 杭州七格智联科技有限公司 | 一种基于头部多区域定位的婴幼儿智能体温检测方法 |
| CN114387644A (zh) * | 2021-12-28 | 2022-04-22 | 卢嘉颖 | 非侵入式呼吸状态识别方法、系统、设备及存储介质 |
| CN114732246B (zh) * | 2022-03-30 | 2024-02-06 | 浙江梦神家居股份有限公司 | 一种智能床垫软硬度调节方法、系统、存储介质及智能终端 |
| CN115191781B (zh) * | 2022-07-28 | 2023-07-21 | 慕思健康睡眠股份有限公司 | 一种基于智能床垫的画面抓取方法及相关产品 |
| CN115381440B (zh) * | 2022-09-30 | 2023-05-23 | 广东工业大学 | 一种床边跌倒检测方法 |
| TWI895879B (zh) * | 2022-12-29 | 2025-09-01 | 財團法人工業技術研究院 | 臉部辨識系統及生理資訊生成方法 |
| CN117373110B (zh) * | 2023-08-30 | 2025-02-11 | 武汉星巡智能科技有限公司 | 可见光-热红外成像的婴幼儿行为识别方法、装置及设备 |
| CN118097512A (zh) * | 2024-03-13 | 2024-05-28 | 武汉星巡智能科技有限公司 | 共眠场景下婴幼儿和成人识别方法、装置、设备及介质 |
Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5704367A (en) * | 1995-03-28 | 1998-01-06 | Nihon Kohden Corporation | Respiration monitor for monitoring respiration based upon an image signal of a facial region |
| US5903659A (en) * | 1997-04-17 | 1999-05-11 | Raytheon Company | Adaptive non-uniformity compensation algorithm |
| US6028309A (en) | 1997-02-11 | 2000-02-22 | Indigo Systems Corporation | Methods and circuitry for correcting temperature-induced errors in microbolometer focal plane array |
| US20040076316A1 (en) * | 2000-12-15 | 2004-04-22 | Fauci Mark A | Method and apparatus for measuring physiology by means of infrared detector |
| US6812465B2 (en) | 2002-02-27 | 2004-11-02 | Indigo Systems Corporation | Microbolometer focal plane array methods and circuitry |
| US7034301B2 (en) | 2002-02-27 | 2006-04-25 | Indigo Systems Corporation | Microbolometer focal plane array systems and methods |
| US20060232675A1 (en) * | 2003-04-25 | 2006-10-19 | Land Instruments International Limited | Thermal imaging system and method |
| US7470902B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera electronic architectures |
| US7470904B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera packaging |
| US7679048B1 (en) | 2008-04-18 | 2010-03-16 | Flir Systems, Inc. | Systems and methods for selecting microbolometers within microbolometer focal plane arrays |
| US20100191124A1 (en) * | 2007-04-17 | 2010-07-29 | Prokoski Francine J | System and method for using three dimensional infrared imaging to provide psychological profiles of individuals |
| WO2011151806A1 (fr) * | 2010-06-04 | 2011-12-08 | Tecnimed S.R.L. | Procédé et dispositif de mesure de la température corporelle interne d'un patient |
| US20120075462A1 (en) * | 2010-09-23 | 2012-03-29 | Sony Computer Entertainment Inc. | Blow tracking user interface system and method |
| EP2460469A1 (fr) * | 2010-12-01 | 2012-06-06 | Hill-Rom Services, Inc. | Système de suivi de patients |
| US11486508B2 (en) | 2017-06-08 | 2022-11-01 | Superior Energy Services, Llc | Deep set safety valve |
-
2013
- 2013-07-12 WO PCT/US2013/050393 patent/WO2014012070A1/fr not_active Ceased
- 2013-07-12 CN CN201380047256.1A patent/CN104684465B/zh not_active Expired - Fee Related
Patent Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5704367A (en) * | 1995-03-28 | 1998-01-06 | Nihon Kohden Corporation | Respiration monitor for monitoring respiration based upon an image signal of a facial region |
| US6028309A (en) | 1997-02-11 | 2000-02-22 | Indigo Systems Corporation | Methods and circuitry for correcting temperature-induced errors in microbolometer focal plane array |
| US5903659A (en) * | 1997-04-17 | 1999-05-11 | Raytheon Company | Adaptive non-uniformity compensation algorithm |
| US20040076316A1 (en) * | 2000-12-15 | 2004-04-22 | Fauci Mark A | Method and apparatus for measuring physiology by means of infrared detector |
| US6812465B2 (en) | 2002-02-27 | 2004-11-02 | Indigo Systems Corporation | Microbolometer focal plane array methods and circuitry |
| US7034301B2 (en) | 2002-02-27 | 2006-04-25 | Indigo Systems Corporation | Microbolometer focal plane array systems and methods |
| US20060232675A1 (en) * | 2003-04-25 | 2006-10-19 | Land Instruments International Limited | Thermal imaging system and method |
| US7470902B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera electronic architectures |
| US7470904B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera packaging |
| US20100191124A1 (en) * | 2007-04-17 | 2010-07-29 | Prokoski Francine J | System and method for using three dimensional infrared imaging to provide psychological profiles of individuals |
| US7679048B1 (en) | 2008-04-18 | 2010-03-16 | Flir Systems, Inc. | Systems and methods for selecting microbolometers within microbolometer focal plane arrays |
| WO2011151806A1 (fr) * | 2010-06-04 | 2011-12-08 | Tecnimed S.R.L. | Procédé et dispositif de mesure de la température corporelle interne d'un patient |
| US20120075462A1 (en) * | 2010-09-23 | 2012-03-29 | Sony Computer Entertainment Inc. | Blow tracking user interface system and method |
| EP2460469A1 (fr) * | 2010-12-01 | 2012-06-06 | Hill-Rom Services, Inc. | Système de suivi de patients |
| US11486508B2 (en) | 2017-06-08 | 2022-11-01 | Superior Energy Services, Llc | Deep set safety valve |
Non-Patent Citations (1)
| Title |
|---|
| FRANKENBERGER R T ET AL: "MESSUNG SEITLICHER HAUTTEMPERATURPROFILE VON FRUEHGEBORENEN IN INKUBATOREN MITTELS THERMOGRAPHIE. MEASURING LATERAL SKIN TEMPERATURE PROFILES OF PRETERM INFANTS IN INCUBATORS BY THERMOGRAPHY", BIOMEDIZINISCHE TECHNIK, FACHVERLAG SCHIELE UND SCHOEN GMBH. BERLIN, DE, vol. 43, no. 6, 1 June 1998 (1998-06-01), pages 174 - 178, XP000765166, ISSN: 0013-5585 * |
Cited By (54)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10708550B2 (en) | 2014-04-08 | 2020-07-07 | Udisense Inc. | Monitoring camera and mount |
| US10165230B2 (en) | 2014-04-08 | 2018-12-25 | Udisense Inc. | Systems and methods for configuring baby monitor cameras to provide uniform data sets for analysis and to provide an advantageous view point of babies |
| EP3164990A4 (fr) * | 2014-04-08 | 2017-12-06 | UdiSense Inc. | Systèmes et procédés de configuration de caméras de surveillance de bébé servant à obtenir des ensembles de données uniformes à des fins d'analyse |
| CN106413545B (zh) * | 2014-05-13 | 2019-07-05 | 欧姆龙株式会社 | 姿势估计装置、姿势估计系统和姿势估计方法 |
| US10198813B2 (en) | 2014-05-13 | 2019-02-05 | Omron Corporation | Posture estimation device, posture estimation system, posture estimation method, posture estimation program, and computer-readable recording medium on which posture estimation program is recorded |
| CN106413545A (zh) * | 2014-05-13 | 2017-02-15 | 欧姆龙株式会社 | 姿势估计装置、姿势估计系统、姿势估计方法、姿势估计程序和记录该程序的计算机介质 |
| US10602082B2 (en) | 2014-09-17 | 2020-03-24 | Fluke Corporation | Triggered operation and/or recording of test and measurement or imaging tools |
| US10271020B2 (en) | 2014-10-24 | 2019-04-23 | Fluke Corporation | Imaging system employing fixed, modular mobile, and portable infrared cameras with ability to receive, communicate, and display data and images with proximity detection |
| EP3029600A1 (fr) * | 2014-12-01 | 2016-06-08 | Axis AB | Détection d'objet d'image thermique |
| CN104486562A (zh) * | 2014-12-26 | 2015-04-01 | 昆明物理研究所 | 基于固定积分时间的嵌入式红外图像超帧处理方法 |
| WO2016108225A1 (fr) * | 2014-12-31 | 2016-07-07 | Parenting Science Ltd | Systèmes et procédés permettant de surveiller et d'encourager l'exercice physique chez les nourrissons |
| WO2016116307A1 (fr) | 2015-01-19 | 2016-07-28 | Koninklijke Philips N.V. | Dispositif, système et procédé de détection de peau |
| US10530977B2 (en) | 2015-09-16 | 2020-01-07 | Fluke Corporation | Systems and methods for placing an imaging tool in a test and measurement tool |
| US10083501B2 (en) | 2015-10-23 | 2018-09-25 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
| US10586319B2 (en) | 2015-10-23 | 2020-03-10 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
| US11210776B2 (en) | 2015-10-23 | 2021-12-28 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
| US12293501B2 (en) | 2015-10-23 | 2025-05-06 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
| USD854074S1 (en) | 2016-05-10 | 2019-07-16 | Udisense Inc. | Wall-assisted floor-mount for a monitoring camera |
| US10357117B2 (en) | 2016-07-13 | 2019-07-23 | Chigru Innovations (OPC) Private Limited | Rocking cradle |
| US10539268B2 (en) | 2016-07-13 | 2020-01-21 | Chigru Innovations (OPC) Private Limited | Oscillation systems |
| US10447972B2 (en) | 2016-07-28 | 2019-10-15 | Chigru Innovations (OPC) Private Limited | Infant monitoring system |
| WO2018020509A1 (fr) * | 2016-07-28 | 2018-02-01 | Chigru Innovations (OPC) Private Limited | Système de surveillance de nourrisson |
| CN106919806A (zh) * | 2017-04-27 | 2017-07-04 | 刘斌 | 一种人体监测方法、装置以及系统及计算机可读存储设备 |
| US10614556B2 (en) | 2017-08-01 | 2020-04-07 | Jaguar Land Rover Limited | Image processor and method for image processing |
| GB2565279B (en) * | 2017-08-01 | 2020-02-12 | Jaguar Land Rover Ltd | Image processor and method for image processing |
| DE102018212179B4 (de) * | 2017-08-01 | 2025-12-31 | Jaguar Land Rover Limited | Bildverarbeitungsvorrichtung und verfahren zur bildverarbeitung |
| GB2565279A (en) * | 2017-08-01 | 2019-02-13 | Jaguar Land Rover Ltd | Image processor and method for image processing |
| USD855684S1 (en) | 2017-08-06 | 2019-08-06 | Udisense Inc. | Wall mount for a monitoring camera |
| US10874332B2 (en) | 2017-11-22 | 2020-12-29 | Udisense Inc. | Respiration monitor |
| US12211237B2 (en) | 2018-05-09 | 2025-01-28 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Correlation of thermal satellite image data for generating thermal maps at high spatial resolution |
| US11928846B2 (en) | 2018-05-18 | 2024-03-12 | Essity Hygiene And Health Aktiebolag | Presence and absence detection |
| RU2754392C1 (ru) * | 2018-05-18 | 2021-09-01 | Эссити Хайджиен Энд Хэлс Актиболаг | Обнаружение присутствия и отсутствия |
| US20210212595A1 (en) * | 2018-06-13 | 2021-07-15 | Braintrain2020 Limited | Apparatus for sensing |
| USD900431S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle blanket with decorative pattern |
| USD900430S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle blanket |
| USD900429S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle band with decorative pattern |
| USD900428S1 (en) | 2019-01-28 | 2020-11-03 | Udisense Inc. | Swaddle band |
| CN110633710A (zh) * | 2019-09-09 | 2019-12-31 | 重庆小富农康农业科技服务有限公司 | 一种生猪疾病预警系统 |
| CN110633710B (zh) * | 2019-09-09 | 2022-04-15 | 重庆大直科技有限公司 | 一种生猪疾病预警系统 |
| WO2021161016A1 (fr) * | 2020-02-11 | 2021-08-19 | BreatheOx Limited | Dispositif de surveillance respiratoire |
| CN111507268B (zh) * | 2020-04-17 | 2024-02-20 | 浙江华感科技有限公司 | 报警方法及装置、存储介质和电子装置 |
| CN111507268A (zh) * | 2020-04-17 | 2020-08-07 | 浙江大华技术股份有限公司 | 报警方法及装置、存储介质和电子装置 |
| US11574532B2 (en) * | 2020-05-14 | 2023-02-07 | Yun yun AI Baby camera Co., Ltd. | Visible-light-image physiological monitoring system with thermal detecting assistance |
| US20210358284A1 (en) * | 2020-05-14 | 2021-11-18 | Yun yun AI Baby camera Co., Ltd. | Visible-light-image physiological monitoring system with thermal detecting assistance |
| WO2021255738A1 (fr) * | 2020-06-18 | 2021-12-23 | Elbit Systems C4I and Cyber Ltd. | Système et procédé de mesure de paramètres sans contact |
| US11783564B2 (en) | 2020-06-18 | 2023-10-10 | Elbit Systems C4I and Cyber Ltd. | Contactless parameters measurement system and method |
| IL275524B (en) * | 2020-06-18 | 2021-12-01 | Elbit Systems C4I & Cyber Ltd | System and method for measuring parameters without contact |
| CN112057074A (zh) * | 2020-07-21 | 2020-12-11 | 北京迈格威科技有限公司 | 呼吸速率测量方法、装置、电子设备及计算机存储介质 |
| RU2737138C1 (ru) * | 2020-08-19 | 2020-11-25 | ООО "Ай Ти Ви групп" | Система и способ контроля температуры тела людей по видеоданным |
| TWI755907B (zh) * | 2020-10-23 | 2022-02-21 | 正修學校財團法人正修科技大學 | 人臉影像真偽辨識系統及其方法 |
| US11669962B2 (en) | 2020-10-26 | 2023-06-06 | Covidien Lp | Temperature monitoring with a thermal camera |
| US11978205B2 (en) | 2020-10-26 | 2024-05-07 | Covidien Lp | Temperature monitoring with a thermal camera |
| CN117690159B (zh) * | 2023-12-07 | 2024-06-11 | 武汉星巡智能科技有限公司 | 基于多模态数据融合的婴幼儿趴睡监测方法、装置及设备 |
| CN117690159A (zh) * | 2023-12-07 | 2024-03-12 | 武汉星巡智能科技有限公司 | 基于多模态数据融合的婴幼儿趴睡监测方法、装置及设备 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN104684465B (zh) | 2017-07-07 |
| CN104684465A (zh) | 2015-06-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9843743B2 (en) | Infant monitoring systems and methods using thermal imaging | |
| CN104684465B (zh) | 使用热成像的婴儿监测系统及方法 | |
| CN103561647B (zh) | 用于监测用户的皮肤颜色的方法和系统 | |
| US9813643B2 (en) | Thermal recognition systems and methods | |
| US9900478B2 (en) | Device attachment with infrared imaging sensor | |
| US9986175B2 (en) | Device attachment with infrared imaging sensor | |
| KR101831486B1 (ko) | 스마트 감시 카메라 시스템 및 방법 | |
| EP3164990B1 (fr) | Systèmes et procédés de configuration de caméras de surveillance de bébé servant à obtenir des ensembles de données uniformes à des fins d'analyse | |
| US20170035344A1 (en) | Detection of an Allergic Reaction Using Thermal Measurements of the Face | |
| US20160156880A1 (en) | Durable compact multisensor observation devices | |
| US20160074724A1 (en) | Thermal-assisted golf rangefinder systems and methods | |
| JP6488291B2 (ja) | リモート光体積変動記録法のための自動カメラ調整 | |
| US20160206216A1 (en) | Device, system and method for skin detection | |
| CN103927250B (zh) | 一种终端设备用户姿态检测方法 | |
| EP2923187B1 (fr) | Réseau hybride de capteurs infrarouges présentant des capteurs infrarouges hétérogènes | |
| WO2014054293A1 (fr) | Dispositif d'estimation d'endormissement, procédé d'estimation d'endormissement et support d'enregistrement non transitoire lisible par ordinateur | |
| KR20130101566A (ko) | 적외선 어레이 센서를 이용한 온도측정장치 및 온도측정방법 | |
| WO2014105241A1 (fr) | Fixation de dispositif avec capteur d'imagerie infrarouge | |
| JP2015037547A (ja) | 遠隔で医療診断を行うためのシステムおよび方法 | |
| JP2023548886A (ja) | カメラを制御するための装置及び方法 | |
| CN116130079A (zh) | 全方位看护的健康云系统 | |
| JP2024016619A (ja) | 画像処理装置、画像処理方法、及びプログラム | |
| JP2018533240A (ja) | 占有検出 | |
| US20230126197A1 (en) | Distance compensation for thermal imaging temperature measurement of inner canthus systems and methods | |
| KR101880448B1 (ko) | 제스처 인식기능을 갖는 디지털 미러 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13740177 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 13740177 Country of ref document: EP Kind code of ref document: A1 |