
WO2019144247A1 - Systems and methods for automated facial acne assessment from digital photographic images - Google Patents


Info

Publication number
WO2019144247A1
Authority
WO
WIPO (PCT)
Prior art keywords
acne
image
subject
features
digital image
Prior art date
Legal status
Ceased
Application number
PCT/CA2019/050108
Other languages
French (fr)
Inventor
Nicholas Bennett MACKINNON
Fartash VASEFI
Current Assignee
Etreat Medical Diagnostics Inc
Original Assignee
Etreat Medical Diagnostics Inc
Priority date
Filing date
Publication date
Application filed by Etreat Medical Diagnostics Inc filed Critical Etreat Medical Diagnostics Inc
Publication of WO2019144247A1 (patent/WO2019144247A1/en)
Legal status: Ceased


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/444Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/10ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • This application relates to systems and methods for facial acne assessment and monitoring.
  • Acne vulgaris, involving mostly the face, may have a great impact on a patient's emotional, social, and psychological functioning, particularly in adolescents and young adults.
  • The chronic inflammatory lesions characteristic of acne biology are associated with symptomatic discomfort, scarring, emotional and psychosocial distress, occupational consequences and potential psychiatric disturbances including depression and suicide [1].
  • stage 0 (Clear) assessed as no inflammatory or non-inflammatory lesions
  • stage 1 (almost clear) considered as a few scattered comedones and a few small papules
  • stage 2 (mild), easily recognizable with less than half the surface involved and some comedones and some papules and pustules present
  • stage 3 (moderate) with more than half of the surface involved and many comedones, papules and pustules and one nodule may be present
  • stage 4 (severe), where the entire surface is involved, covered with comedones and numerous papules and pustules, and a few nodules may be present
  • the lesion count is assessed in both face (from the forehead, left and right cheeks and chin above the jaw line, excluding the nose) and trunk (shoulders, upper back and upper anterior chest) looking for the number of non-inflammatory lesions (open comedones and closed comedones), inflammatory lesions (papules, pustules), and other lesions (nodules and cysts).
  • the first problem is standardization of the images so that relative facial locations can be compared from image to image. This is most commonly accomplished by standardizing the image capture conditions, for example by placing the face and head in a fixture such as a chin rest.
  • Analysis of digital color images can be based on color or gray-scale thresholding, using RGB image data or other color spaces such as YCbCr or the CIE 1976 L*a*b* color space, together with some kind of feature recognition and classification algorithm.
  • k-means clustering using color features has been used by Ramli et al [17]. More recently, texture analysis using a bimodal Gaussian mixture model (GMM) to distinguish Gabor features of normal skin from skin imperfections has been used for acne analysis [18]. Alamdari et al compared multiple machine learning algorithms, such as texture analysis, k-means clustering, HSV model segmentation and two-level k-means clustering, and showed that the highest accuracy in differentiating acne scarring from active inflammatory lesions was achieved by fuzzy c-means and support vector machine methods [8]. These methods can be applied to images created using the large and expensive instrumentation described above, but can also be applied to simpler low-cost image capture devices.
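As a hedged illustration of the color-feature clustering used in this prior art, a minimal k-means over per-pixel color vectors might look like the following. The initialization and feature choice are assumptions for the demo, not details from the cited papers:

```python
import numpy as np

def kmeans_colors(X, k, iters=20):
    """Minimal k-means over per-pixel color features (rows of X).
    Uses the first k points as initial centers -- fine for a demo;
    real use would want k-means++ or similar."""
    X = np.asarray(X, dtype=float)
    centers = X[:k].copy()
    for _ in range(iters):
        # assign each pixel to its nearest center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # move each center to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

On a face image one would run this over per-pixel RGB or a*b* values and inspect which cluster captures the reddish lesion pixels.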
  • this invention comprises a method and system that allows an individual concerned about or experiencing chronic acne disease to use their smart phone to collect information and to assess their facial acne lesions. This information can be analyzed to identify skin regions of interest and to track changes in such regions over time without the need to use facial fixturing to provide acceptable levels of positional accuracy.
  • the method and system collects sensor data from the smart phone and analyzes and correlates this with biographical and treatment information including age, gender, disease status, pain status, medication status, treatment status, diet, cleansing routine, medical history, or other useful biographic, treatment and/or environmental information relating to skin, such as experiential measures of discomfort, medication, weather, sunlight, UV exposure, and regional demographics. It is intended to integrate with existing health record systems compliant with the ISO/IEEE 11073 standards, meet HIPAA and other privacy standards, and connect to personal health records such as Microsoft HealthVault.
  • the invention comprises a mobile software application on a smart phone that collects basic biographical information, captures and calibrates images of the face, and performs skin analysis of the calibrated facial image to identify face regions of interest and detect the position of acne lesions within or overlapping these regions.
  • the data is transferred and stored on a cloud database server connected wirelessly to the smart phone.
  • the calibration and analysis of the data is performed by software deployed on a cloud processing server connected to a cloud database server.
  • the analyzed data and reports are transferred to a personal health record system on a cloud database server.
  • the analysis method identifies five regions of the face and detects the presence of inflammatory acne lesions and tracks them over time. Individuals may provide their personal physician, or other health providers, access to this information via their personal health record.
  • the method incorporates biographical, treatment and/or environmental data into the database and analyzes such data to provide graphical reports of correlations between changes in acne features and other factors, such a cleansing or treatment regimens or diet.
  • other potentially relevant factors may include weather, location, age, and gender. Such factors can be compared to typical expectations for those who are without comparable symptoms, etc.
  • FIG. 1 displays a monochrome rendering of a user-captured color facial image with the desired orientation and field of view.
  • FIG. 2 is a flowchart that describes a method for image-capture.
  • FIG. 3 is a flowchart that describes a method for face recognition and face normalization.
  • FIG. 4 is a flowchart that describes a method for face regions of interest (ROIs) specification.
  • FIG. 5 displays a monochrome rendering of a user-captured image with five regions of interest specified on the image.
  • FIG. 6 is a flowchart that describes a method for identification of acne lesions from the image.
  • FIG. 7 displays a monochrome rendering of a user-captured image and the process of acne identification using a* image and Otsu thresholding.
  • FIG. 8 is a flowchart that describes a method for classification of acne lesions into papule, pustule, and scab categories.
  • FIG. 9 displays a monochrome rendering of a sample of papule and pustule color images and their differences in the binary image, used for classification.
  • FIG. 10 displays monochrome renderings of a user-captured image and the identified and classified acne lesions.
  • FIG. 11 is a flowchart that describes a method for face image registration.
  • FIG. 12 displays monochrome renderings of user-captured images obtained at two different times with slight variation in the face angle and the resulting image registration.
  • ‘Software’ is computer-implemented programming that performs a task or tasks either automatically or based on specific user inputs or prompts.
  • A ‘Cloud Server’ is a virtual private Internet server that enables users to install and run applications, maintain databases and communicate with external input/output devices much like a physical server. It offers flexible, fast outward scalability for operations that is not offered by physical servers.
  • A ‘Cloud Processing Server’ is a Cloud Server equipped with sufficiently powerful central processing units (CPUs) and available memory that functions primarily to process or analyze information, for example complex image processing.
  • A ‘Cloud Database Server’ is a Cloud Server that functions primarily to store and retrieve data that can then be processed, analyzed or reviewed, typically after being transferred to another computer system.
  • A ‘mobile application’ is a software application that runs in a mobile platform environment, such as Android, Apple iOS or Windows Mobile, deployed on smart phones and tablets.
  • An ‘electronic health record’ is a digital record of patient and physician information that can be shared across different health care settings.
  • the invention comprises a mobile device such as a smart phone or tablet with Internet connectivity, a mobile application installed on the smart phone or tablet and software to process data provided from the smart phone or tablet to the processing software.
  • the processing software is installed on a Cloud Server.
  • the processing software may be installed on the mobile device.
  • the system and method comprises capturing images of the face using the mobile application on the smart phone and uploading the images to a cloud server for storage and processing.
  • the mobile device, the mobile application, the cloud data processing server, the cloud data processing software, the cloud database server, the electronic health record software, and the secure communication software are collectively known as the system.
  • the front-end of the system comprises the mobile device and the mobile application, which provides an interface for the user to capture and input images and other data and provides an interface to review past reports and analyses.
  • the front-end may further comprise a mobile application providing a connection to an electronic health record where user information can be stored.
  • the back-end of the system comprises the Cloud Processing Server, the data processing software, the Cloud Database Server, and the electronic health record software.
  • the complexity of the data processing software currently requires code structure that cannot be deployed natively on all smart phone environments in a consistent manner.
  • the Cloud Database Server hosts the electronic health record software and associated databases storing each unique user’s data and images, and interfaces with the cloud processing server.
  • An advantage of deploying both the database and the data processing software on cloud servers is that the system operates with low-latency communication between the data processing server and the database server, providing a faster response time for communicating results to the mobile device.
  • a further advantage of cloud servers is that they provide a deployment environment that is easily scalable for high growth and a secure framework for sensitive patient data.
  • FIG. 1 provides an example of a user-taken image that can be processed by the system.
  • An image of the face is captured using the front or back camera.
  • the image must contain only one face, and all of the face is preferably within and at the center of the image frame [100]. This is the preferred placement and field of view for image capture and subsequent processing.
  • the application provides user guidance for image capture so that the images will meet basic requirements for analysis.
  • the image of the face should be taken from the front, with all of the face within the image frame and the eyes open.
  • FIG. 2 is a flowchart that describes the steps taken by the user during this image capture process. All of the face must be within the field of view of the camera.
  • the mobile application provides instructions to the user, guiding the user to orient the device until the camera is reasonably close to parallel to the face to minimize spatial distortion [200].
  • the application imports the image for user review and approval [220] and then uploads it to the cloud server for processing [230]. If the image is not approved, the user can retake it. Possible reasons for not approving the image include an oblique face angle, image blur due to movement, poor focus, or poor brightness and contrast.
  • FIG. 3 is a flowchart that describes a method used by the system to perform human face recognition and normalization.
  • a face recognition algorithm detects nine main facial regions, including the eyes, eyebrows, nose, lips, and face boundary, and defines the coordinates of these features using 68 main landmarks on the face in the image [300].
  • a human face normalization algorithm detects the location of the irises using a circle detection algorithm [310]. Once the positions of the irises are known, the image is optionally rotated about the image centerline to ensure the eyes are horizontal to one another [320], and the image is resized to make the eyes a set number of pixels apart [330]. Horizontal cropping is done based on the distance between the centers of the irises and the rule of facial fifths [340].
  • the rule of facial fifths describes the ideal transverse proportions of the face as five equal fifths, each segment roughly equal to one eye width. Enough margin is left on each side of the face to prevent cropping part of the face.
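The rotation, rescaling and eye-levelling steps above amount to a similarity transform defined by the two iris centers. A minimal sketch follows; the `target_ipd` of 200 pixels is an assumed value, not from the patent:

```python
import numpy as np

def eye_normalizer(left_iris, right_iris, target_ipd=200.0):
    """Returns a function that maps image points through the rotation and
    scaling that make the inter-iris segment horizontal and target_ipd
    pixels long, keeping the midpoint between the eyes fixed."""
    l = np.asarray(left_iris, dtype=float)
    r = np.asarray(right_iris, dtype=float)
    d = r - l
    angle = np.arctan2(d[1], d[0])            # tilt of the inter-iris line
    scale = target_ipd / np.hypot(d[0], d[1])
    c, s = np.cos(angle), np.sin(angle)
    R = scale * np.array([[c, s], [-s, c]])   # rotate by -angle, then scale
    mid = (l + r) / 2.0

    def apply(p):
        return R @ (np.asarray(p, dtype=float) - mid) + mid

    return apply
```

Applying the same transform to every pixel coordinate (or its inverse when resampling) yields the normalized image with level eyes and a fixed interpupillary distance.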
  • the output image provides the
  • FIG. 4 is a flowchart that describes a method used by the system to specify facial regions of interest (ROIs).
  • This process specifies five main regions of the face: the forehead, nose, left and right cheeks, and chin.
  • the accurate and consistent specification of ROIs is important to provide a quantitative assessment of acne lesions (size and number) over time.
  • the system uses five trapezoids to represent the facial ROIs.
  • the location of the trapezoids depends on the individual’s face as well as the angle between the face and the camera (both up-down and left-right angles). Ideally, the individual is facing directly toward the camera (zero-degree sideways rotation).
  • the ROIs’ sizes and locations are specified statistically based on the distance between the centers of the irises of the respective eyes, which is generally constant for any given individual [400].
  • Another process is designed to dynamically update the ROIs based on the angle of the face toward the camera.
  • This process uses the relative locations of the irises and the top lip or nose to determine whether the face is rotated [410] and updates the location and size of the ROIs accordingly [420].
  • This process ensures that the regions of interest are accurately mapped onto an image of the face, even if that subsequent image is slightly laterally or vertically rotated relative to the camera. This provides correction for the slight variability that can be expected in images captured from time to time. It also ensures that the locations of lesions within the ROIs are consistent enough for evaluation.
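IPD-scaled trapezoidal ROIs might be sketched as follows. All proportions below are invented for illustration, since the patent specifies them statistically per individual and adjusts them for head pose:

```python
import numpy as np

def face_rois(left_iris, right_iris):
    """Five illustrative trapezoidal ROIs (forehead, nose, left/right cheek,
    chin) placed relative to the inter-iris distance. Image coordinates:
    y grows downward. All proportions are assumptions for the demo."""
    l = np.asarray(left_iris, dtype=float)
    r = np.asarray(right_iris, dtype=float)
    mid = (l + r) / 2.0
    u = np.hypot(*(r - l))          # interpupillary distance as the unit

    def trap(cx, cy, top_w, bot_w, h):
        # corners clockwise from top-left
        return np.array([
            [cx - top_w / 2, cy - h / 2],
            [cx + top_w / 2, cy - h / 2],
            [cx + bot_w / 2, cy + h / 2],
            [cx - bot_w / 2, cy + h / 2],
        ])

    return {
        'forehead':    trap(mid[0], mid[1] - 0.9 * u, 1.6 * u, 1.8 * u, 0.6 * u),
        'nose':        trap(mid[0], mid[1] + 0.5 * u, 0.4 * u, 0.6 * u, 0.8 * u),
        'left_cheek':  trap(mid[0] - 0.8 * u, mid[1] + 0.6 * u, 0.7 * u, 0.5 * u, 0.7 * u),
        'right_cheek': trap(mid[0] + 0.8 * u, mid[1] + 0.6 * u, 0.7 * u, 0.5 * u, 0.7 * u),
        'chin':        trap(mid[0], mid[1] + 1.4 * u, 0.8 * u, 0.6 * u, 0.4 * u),
    }
```

Because every coordinate is expressed in iris-distance units about the eye midpoint, the same recipe re-derives consistent ROIs on each new image of the same face.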
  • FIG. 5 shows an exemplary user-captured image after image normalization and specification of regions of interest [500].
  • FIG. 6 is a flowchart that describes a method used by the system to identify acne lesions and separate the portion of the face representing acne lesions from the normal skin image.
  • One method of acne identification comprises conversion of the image to a luminance-chrominance color space such as the CIE L*a*b* color space [600].
  • the a* channel represents the redness of the pixel color independent of its luminance and can be used to robustly detect increases in skin redness and identify acne regions, since most of the skin is covered by the non-inflamed normal skin color. Therefore, a new image is generated using a low-pass 2D Gaussian filter to blur the image and remove details and noise [610].
  • the Gaussian kernel standard deviation is chosen to ensure that the acne lesions are eliminated in the smoothed image.
  • the output image is smoothed and represents the ROIs with normal skin color and without any details such as acne.
  • the difference between the a* image before and after smoothing is used to find the redness change in each pixel [620]. Pixels with large differences in redness are then considered acne (via Otsu thresholding) and used to create a binary image of acne lesions [630].
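The blur, difference and Otsu-threshold pipeline can be sketched in NumPy. This assumes the a* plane has already been extracted; the `sigma` value and the synthetic test values are demo choices, not parameters from the patent:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable low-pass Gaussian filter, wide enough to wash out lesions."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    padded = np.pad(img, radius, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)

def otsu_threshold(values, bins=256):
    """Otsu's method: the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)
    mu = np.cumsum(p * centers)
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu[-1] * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

def detect_acne(a_channel, sigma=15.0):
    """Binary lesion mask: pixels much redder than the local smoothed skin."""
    smoothed = gaussian_blur(a_channel, sigma)
    diff = a_channel - smoothed          # per-pixel redness increase
    return diff > otsu_threshold(diff)
```

A synthetic "lesion" (a small patch of elevated a* on a flat background) survives the blur-difference and is isolated by the threshold, while unblemished skin is suppressed.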
  • FIG. 7 shows an exemplary RGB image exhibiting a user-captured image [700], the a* image before [710] and after Gaussian filter implementation [720], the difference of redness [730], and the generated binary image of acne lesions [740].
  • FIG. 8 is a flowchart that describes a method used by the system to classify acne lesions into papule, pustule, and scab categories.
  • the method of acne classification uses the properties of the binary image and the lesions’ color saturation.
  • the center of a pustule region remains zero in the binary image: the white center has a relatively smaller redness difference compared to the surrounding region, which shows high redness due to skin inflammation. This results in a binary mask with a circular hole inside each pustule (the white area).
  • the Euler number of the binary image is used to automatically separate the binary regions without and with holes (papules/scabs and pustules, respectively) [800].
  • the Euler number is the total number of objects in the binary region minus the total number of holes in that region.
  • a Euler number equal to one represents a papule or scab, and a Euler number equal to zero represents a pustule [810].
  • a scab is differentiated from a papule using the color saturation of the scab and the average saturation of the image [820].
  • the saturation of the scab is significantly higher than that of the papule [830].
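The Euler-number split can be emulated by counting enclosed holes with a flood fill (for a single connected lesion, Euler number = 1 − number of holes). The 1.5× saturation factor used below to separate scabs from papules is an invented placeholder; the patent does not give a value:

```python
from collections import deque
import numpy as np

def count_holes(mask):
    """Count background components not reachable from the border
    (4-connectivity). For one connected lesion, Euler number = 1 - holes."""
    h, w = mask.shape
    visited = np.zeros((h, w), dtype=bool)

    def flood(sy, sx):
        q = deque([(sy, sx)])
        visited[sy, sx] = True
        while q:
            y, x = q.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not visited[ny, nx] and not mask[ny, nx]:
                    visited[ny, nx] = True
                    q.append((ny, nx))

    for y in range(h):                      # flood outer background from the border
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) and not mask[y, x] and not visited[y, x]:
                flood(y, x)
    holes = 0
    for y in range(h):                      # any background left is an enclosed hole
        for x in range(w):
            if not mask[y, x] and not visited[y, x]:
                holes += 1
                flood(y, x)
    return holes

def classify_lesion(mask, lesion_saturation, image_mean_saturation):
    """Pustule if the binary region encloses a hole (white pus center);
    otherwise split scab from papule by saturation (1.5x factor is invented)."""
    if count_holes(mask) >= 1:
        return 'pustule'
    if lesion_saturation > 1.5 * image_mean_saturation:
        return 'scab'
    return 'papule'
```
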
  • FIG. 9 shows an exemplary RGB image of a papule [900] and a pustule [910] with their corresponding binary objects [920] [930].
  • FIG. 10 shows an exemplary monochrome rendering of an RGB image of the face with the detected papules, pustules, and scabs visualized using solid circles [1000], dashed circles [1010], and squares [1020], respectively.
  • FIG. 11 is a flowchart that describes a method used by the system to register face images over time [1100].
  • the newly submitted image (subsequent image) is registered to the reference image (i.e., a prior image of the subject) by finding the 2D transformation between the face images.
  • the spatial transformation is calculated using the coordinates of the five region-of-interest trapezoids of the two images [1110]. The calculated
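Estimating the 2D transformation from corresponding trapezoid corners can be sketched as a least-squares affine fit. The affine family is an assumption for illustration; the patent does not name the transform model:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2x3 affine transform mapping src points onto dst points,
    e.g. the 20 corners of the five ROI trapezoids in two face images."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])    # rows of [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params.T                                 # 2x3 matrix [L | t]

def apply_affine(M, pts):
    """Apply a 2x3 affine matrix to an (n, 2) array of points."""
    pts = np.asarray(pts, dtype=float)
    return pts @ M[:, :2].T + M[:, 2]
```

Resampling the subsequent image through the fitted transform brings its lesions into the reference image's coordinate frame, so lesions can be matched position-by-position across visits.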
  • FIG. 12 shows exemplary monochrome renderings of RGB images of the face - reference image [1200] and subsequent image [1210] - and the registered image
  • the method thus comprises collecting the locations, number and features of the acne lesions (such as size, type, stage, severity, characteristics and classification) as a data set that can be compared from time to time to determine disease progression, healing or other changes that may be diagnostically, therapeutically and/or cosmetically useful.
  • the method can be employed to compare the facial appearance of a subject before and after make-up is applied.
  • the method can further comprise collecting sensor information from the smart phone comprising at least one of geographic location, time and date, ambient light levels, smart phone camera settings and characteristics and correlating these with the measurements as part of the dataset.
  • the method can further comprise correlating the geographic location and time and date with external databases containing weather data, sunlight, ultraviolet (UV) light exposure, population statistics such as mortality, disease incidence and similar measures and correlating them with the image analysis.
  • the method can further comprise collecting biographic information from the subject comprising at least one of age, gender, disease status, pain status, medication status, treatment status, diet, cleansing routine, medical history or other useful biographic, treatment, and/or environmental variables and correlating them with the image analysis.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Epidemiology (AREA)
  • Veterinary Medicine (AREA)
  • Primary Health Care (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Medicinal Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Dermatology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

A method and system are provided for characterizing inflammatory acne lesions in the face. This invention comprises a smart phone or tablet deployable mobile software application that uses device sensors, internet connectivity and cloud-based image processing to document and analyze chronic facial acne lesions. The application facilitates image capture and performs image processing that identifies acne regions in the face, classifies acne type, and tracks the change of the acne over time to report on and quantify the effects of treatment, diet, and/or skin cleansing routine on chronic skin disease.

Description

SYSTEMS AND METHODS FOR AUTOMATED FACIAL ACNE ASSESSMENT FROM DIGITAL PHOTOGRAPHIC IMAGES
Reference to Related Applications
[0001] This application claims priority to and the benefit of United States Provisional Patent Application No. 62/623,487 filed 29 January 2018, which is incorporated by reference herein in its entirety for all purposes.
Technical Field
[0002] This application relates to systems and methods for facial acne assessment and monitoring.
Background
[0003] Acne vulgaris, involving mostly the face, may have a great impact on a patient's emotional, social, and psychological functions, particularly in adolescents and young adults. The chronic inflammatory lesions characteristic of acne biology are associated with symptomatic discomfort, scarring, emotional and psychosocial distress, occupational consequences and potential psychiatric disturbances including depression and suicide [1]
[0004] It has also been shown that treatment can have a significant effect in reducing the mental and social distress associated with acne [2]. However, optimal treatment of acne vulgaris depends on accurate severity assessment [3]. Global prevalence of acne is about 9.4% according to the Global Burden of Disease Project [4], but prevalence tells only part of the story, since approximately 85% of 11-30 year-olds are affected by acne at some time [5]. The total cost of acne treatment exceeds US$1 billion every year [6].
[0005] Many research reports show that the optimal treatment of acne depends on accurate assessment of acne severity, which depends on both global assessment and lesion counting.
[0006] Acne develops in the hair follicles of the skin due to trapped oil and dead skin cells. These blockages form comedones as material builds up behind the blockage. If the follicle is closed near the surface of the skin and the blockage is cut off from the air, it is called a closed comedone. This build-up of cell debris and sebum forms a whitish spot commonly called a whitehead. If the follicle is open to the air, the paste of cellular debris and sebum oxidizes and turns black. Such open comedones are also called blackheads. Growth of bacteria, primarily P. acnes, in these blocked follicles can trigger inflammation causing an erythematous skin response, resulting in redness and swelling around the follicle. This creates the characteristic acne lesion known as a papule. If infection is present and pus builds up under the skin, this can create a whitish-yellow spot, typically in the center of the acne lesion. This type of lesion is called a pustule. Acne severity grading systems are usually composed of both global assessments and lesion counting. The global assessment has five scales: stage 0 (clear), assessed as no inflammatory or non-inflammatory lesions; stage 1 (almost clear), considered as a few scattered comedones and a few small papules; stage 2 (mild), easily recognizable, with less than half the surface involved and some comedones, papules and pustules present; stage 3 (moderate), with more than half of the surface involved, many comedones, papules and pustules, and possibly one nodule; and finally stage 4 (severe), where the entire surface is involved, covered with comedones and numerous papules and pustules, and a few nodules may be present. The lesion count is assessed in both the face (the forehead, left and right cheeks and chin above the jaw line, excluding the nose) and the trunk (shoulders, upper back and upper anterior chest), counting the number of non-inflammatory lesions (open and closed comedones), inflammatory lesions (papules, pustules), and other lesions (nodules and cysts).
[0007] Acne classification systems have focused on clinical counting and classification of lesions [7][8][9]. Automated systems that both count and classify are challenging, so solutions for chronic acne have focused on counting only. For nuisance acne, it is prognosis, rather than change in the number of lesions, that is of concern.
[0008] There have been a number of research initiatives to use optical systems and methods to detect and assess acne lesions including standard flash photography, fluorescence photography, polarization photography, and multispectral imaging. In many areas of disease, the understanding of the interaction of light and tissue and its application in diagnosis has expanded rapidly. However, most of these techniques require specialized equipment for measurement and interpretation.
[0009] When analyzing dermatological features in images over time, there are two problems. The first problem is standardization of the images so that relative facial locations can be compared from image to image. This is most commonly accomplished by
standardizing the image capture conditions by placing the face and head in a fixture, such as a chin rest, and fixing the camera or other device in relation to that location when imaging or making measurements on the face. The other problem is that dermatological features like acne lesions change in appearance as they evolve from comedones through papules, pustules or nodules, or as they resolve. This means that matching lesions by image registration relies heavily on precise positional knowledge from image to image.
[0010] This has made it difficult to capture images that can be used to assess progress outside a clinical or laboratory setting. With the advent of wireless mobile computing devices, such as smart phones and tablets, this constraint is rapidly changing. High quality image capture and the availability of sensor data and face recognition algorithms are constantly improving automation and accuracy of image capture. However, these have not yet been able to provide the accuracy of the face positioning fixture and image capture systems used in clinical settings and described above.
[0011] Various data analysis methods have been used for acne detection and classification, which mainly fall into: 1) spectral (color) intensity analysis or 2) analysis of spatial information in the image, such as the size and distribution of shapes or textures. Analysis of digital color images can be based on color or gray-scale thresholding, using RGB image data or other color spaces such as YCbCr or CIE 1976 L*a*b*, together with some kind of feature recognition and classification algorithm. In 2012, k-means clustering using color features was used by Ramli et al. [17]. More recently, texture analysis using a bimodal Gaussian mixture model (GMM) to distinguish Gabor features of normal skin from skin imperfections has been used for acne analysis [18]. Alamdari et al. compared multiple machine learning algorithms, such as texture analysis, k-means clustering, HSV model segmentation techniques and two-level k-means clustering, and showed that the highest accuracy in differentiating acne scarring from active inflammatory lesions was achieved by fuzzy c-means and support vector machine methods [8]. These methods can be applied to images created using the large and expensive instrumentation described above but can also be applied to simpler, low-cost image capture devices.
[0012] Mobile devices are becoming part of the health care ecosystem and applications for smart phones and tablets are proliferating rapidly. The use of imaging and other sensors in smart phone applications is now common and available for the majority of the population in the developed world and many in the developing world.
[0013] It is a goal of this invention to overcome existing difficulties in observing the evolution of acne over time without the need for face and head positioning fixtures and complex imaging devices that require deployment in a clinical or laboratory setting and to provide individuals who may be suffering from chronic acne with digital tools to assess and monitor the progress of their disease using their smart phone or tablet as a mobile medical device.
Summary
[0014] The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more of the above-described problems have been reduced or eliminated, while other embodiments are directed to other improvements.
[0015] In some embodiments, this invention comprises a method and system that allows an individual concerned about or experiencing chronic acne disease to use their smart phone to collect information and to assess their facial acne lesions. This information can be analyzed to identify skin regions of interest and to track changes in such regions over time without the need to use facial fixturing to provide acceptable levels of positional accuracy. In some embodiments, the method and system collects sensor data from the smart phone and analyzes and correlates this with biographical and treatment information including age, gender, disease status, pain status, medication status, treatment status, diet, cleansing routine, medical history or other useful biographic, treatment and/or environmental information relating to skin, such as experiential measures of discomfort, medication, weather, sunlight, UV exposure, and regional demographics. It is intended to integrate with existing health record systems compliant with the ISO/IEEE 11073 standards, meet HIPAA and other privacy standards, and connect to personal health records, like Microsoft HealthVault.
[0016] In some embodiments the invention comprises a mobile software application on a smart phone that collects basic biographical information, captures and calibrates images of the face, performs skin analysis of the calibrated facial image to identify face regions of interest and detect the position of acne lesions within or overlapping these region
boundaries with sufficient precision to measure, report and track these assessments over time. In some embodiments the data is transferred and stored on a cloud database server connected wirelessly to the smart phone.
[0017] In some embodiments the calibration and analysis of the data is performed by software deployed on a cloud processing server connected to a cloud database server. In some embodiments the analyzed data and reports are transferred to a personal health record system on a cloud database server. In some embodiments the analysis method identifies five regions of the face and detects the presence of inflammatory acne lesions and tracks them over time. Individuals may provide their personal physician, or other health providers, access to this information via their personal health record.
[0018] In some embodiments the method incorporates biographical, treatment and/or environmental data into the database and analyzes such data to provide graphical reports of correlations between changes in acne features and other factors, such as cleansing or treatment regimens or diet. As described herein, other potentially relevant factors may include weather, location, age, and gender. Such factors can be compared to typical expectations for those who are without comparable symptoms.
[0019] It is to be understood that this summary is provided as a means for generally determining what follows in the drawings and detailed description and is not intended to limit the scope of the invention. The foregoing and other objects, features, and advantages of the invention can be readily understood upon consideration of the following detailed description taken in conjunction with the accompanying drawings.
Brief Description of the Drawings
[0020] Exemplary embodiments are illustrated in referenced figures of the drawings. It is intended that the embodiments and figures disclosed herein are to be considered illustrative rather than restrictive.
[0021] FIG. 1 displays a monochrome rendering of a user-captured color facial image with the desired orientation and field of view.
[0022] FIG. 2 is a flowchart that describes a method for image-capture.
[0023] FIG. 3 is a flowchart that describes a method for face recognition and face normalization.
[0024] FIG. 4 is a flowchart that describes a method for face regions of interest (ROIs) specification.
[0025] FIG. 5 displays a monochrome rendering of a user-captured image with five regions of interest specified on the image.
[0026] FIG. 6 is a flowchart that describes a method for identification of acne lesions from the image.
[0027] FIG. 7 displays a monochrome rendering of a user-captured image and the process of acne identification using the a* image and Otsu thresholding.
[0028] FIG. 8 is a flowchart that describes a method for classification of acne lesions into papule, pustule, and scab categories.

[0029] FIG. 9 displays a monochrome rendering of a sample of papule and pustule color images and their differences in the binary image, used for classification.
[0030] FIG. 10 displays monochrome renderings of a user-captured image and the identified and classified acne lesions.
[0031] FIG. 11 is a flowchart that describes a method for face image registration.
[0032] FIG. 12 displays monochrome renderings of user-captured images obtained at two different times with slight variation in the face angle and the resulting image registration.
Definitions
[0033] The following section provides definitions for terms and processes used in the description.
[0034] ‘Software’ is computer implemented programming to perform a task or tasks either automatically or based on specific user inputs or prompts.
[0035] A ‘Cloud Server’ is a virtual private Internet server that enables users to install and run applications, maintain databases and communicate with external input/output devices much like a physical server. It offers flexible, rapid outward scalability for operations that physical servers do not.
[0036] A ‘Cloud Processing Server’ is a Cloud Server equipped with sufficiently powerful central processing units (CPUs) and available memory and that functions primarily to process or analyze information, for example complex image processing.
[0037] A ‘Cloud Database Server’ is a Cloud Server that functions primarily to store and retrieve data that can then be processed, analyzed or reviewed, typically after being transferred to another computer system.

[0038] A ‘mobile application’ is a software application that runs on a mobile platform environment, such as Android, Apple iOS or Windows Mobile, deployed on smart phones and tablets.
[0039] An‘electronic health record’ is a digital record of patient and physician information that can be shared across different health care settings.
Description
[0040] Throughout the following description specific details are set forth in order to provide a more thorough understanding to persons skilled in the art. However, well known elements may not have been shown or described in detail to avoid unnecessarily obscuring the disclosure. Accordingly, the description and drawings are to be regarded in an illustrative, rather than a restrictive, sense.
[0041] In some embodiments the invention comprises a mobile device, such as a smart phone or tablet with Internet connectivity, a mobile application installed on the smart phone or tablet, and software to process data provided from the smart phone or tablet. In one embodiment, the processing software is installed on a Cloud Server. In another embodiment, the processing software may be installed on the mobile device. At this time, mobile devices lack sufficient processing capability for some applications; for those applications where the processing capability of the mobile device is sufficient, data processing may occur on the mobile device. In some embodiments, the system and method comprises capturing images of the face using the mobile application on the smart phone and uploading the images to a cloud server for storage and processing.
[0042] In some embodiments, the mobile device, the mobile application, the cloud data processing server, the cloud data processing software, the cloud database server, the electronic health record software, and the secure communication software is collectively known as the system. The front-end of the system comprises the mobile device and the mobile application, which provides an interface for the user to capture and input images and other data and provides an interface to review past reports and analyses. The front-end may further comprise a mobile application providing a connection to an electronic health record where user information can be stored.
[0043] The back-end of the system comprises the Cloud Processing Server, the data processing software, the Cloud Database Server, and the electronic health record software. The complexity of the data processing software currently requires a code structure that cannot be deployed natively on all smart phone environments in a consistent manner. Therefore, it is an advantage of the system to use a Cloud Processing Server to ensure consistency of data processing across many mobile platforms and to provide streamlined performance.
[0044] The Cloud Database Server hosts the electronic health record software and associated databases storing each unique user’s data and images, and interfaces with the cloud processing server. Deploying both the database and the data processing software on a cloud server ensures that the system operates with low-latency communication between the data processing server and the database server, providing a faster response time for communicating results to the mobile device. A further advantage of cloud servers is that they provide a deployment environment that is easily scalable for high growth and a secure framework for sensitive patient data.
[0045] Turning to the figures, FIG. 1 provides an example of a user-taken image that can be processed by the system. An image of the face is captured using a front or back camera. Only one face may be in the image frame, and the entire face is preferably within and at the center of the image frame [100]. This is the preferred placement and field of view for image capture and subsequent processing. The application provides user guidance for image capture so that the images will meet basic requirements for analysis. The image of the face should be taken from the front, with the entire face within the image frame and the eyes open.
[0046] FIG. 2 is a flowchart that describes the steps taken by the user during this image capture process. All of the face must be within the field of view of the camera. The mobile application provides instructions guiding the user to orient the device until the camera is reasonably close to parallel to the face, to minimize spatial distortion [200]. After capturing the image, the application imports the image for user review and approval [220] and then uploads it to the cloud server for processing [230]. If the image is not approved, the user can retake it. Possible reasons for not approving the image are an oblique face angle, blur due to movement, poor focus, or poor brightness and contrast.
[0047] FIG. 3 is a flowchart that describes a method used by the system to perform human face recognition and normalization. In one embodiment, a face recognition algorithm detects nine main facial regions, including the eyes, eyebrows, nose, lips, and face boundary, and defines the coordinates of these features using 68 main landmarks on the face in the image [300]. A face normalization algorithm detects the location of the irises using a circle detection algorithm [310]. Once the positions of the irises are known, the image is optionally rotated about the image centerline to ensure the eyes are completely horizontal to one another [320], and the image is resized to make the eyes a set number of pixels apart from one another [330]. Horizontal cropping is done based on the distance between the centers of the irises and the rule of facial fifths [340]. The rule of facial fifths describes the ideal transverse proportions of the face as comprising equal fifths, each segment roughly equal to one eye width. Sufficient margin is left on each side of the face to prevent cropping part of the face. The output image provides the isolated and normalized image of the face, with the eyes at specific locations, the background completely removed, and a constant image size.
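The rotate-and-resize steps [320] and [330] can be illustrated with a short numerical sketch. The function names and the 200-pixel target inter-iris spacing are illustrative assumptions, not taken from the disclosure; the sketch shows only the similarity transform (rotation about the left iris plus scaling) that levels the eyes and fixes their spacing.

```python
import numpy as np

def normalization_transform(left_iris, right_iris, target_dist=200.0):
    """Build a rotate-plus-scale matrix that levels the eye line and sets
    the inter-iris distance to target_dist pixels (sketch of [320]/[330])."""
    left = np.asarray(left_iris, dtype=float)
    right = np.asarray(right_iris, dtype=float)
    delta = right - left
    angle = np.arctan2(delta[1], delta[0])   # roll of the eye line
    scale = target_dist / np.hypot(*delta)   # resize factor for the eye spacing
    c, s = np.cos(-angle), np.sin(-angle)
    R = scale * np.array([[c, -s], [s, c]])  # rotate eyes to horizontal, then scale
    return R, angle, scale

def apply_transform(R, origin, point):
    """Rotate/scale `point` about `origin` with the matrix from above."""
    return R @ (np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)) \
        + np.asarray(origin, dtype=float)
```

Applying the returned matrix about the left iris places the right iris level with the left at exactly the target spacing, regardless of the original head roll.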
[0048] FIG. 4 is a flowchart that describes a method used by the system to specify facial regions of interest (ROIs). This process specifies five main regions of the face: the forehead, nose, left and right cheeks, and chin. The accurate and consistent specification of ROIs is important to provide a quantitative assessment of acne lesions (size and number) over time. The system uses five trapezoids to represent the facial ROIs. The location of the trapezoids depends on the individual’s face as well as the angle of the face and camera with respect to each other (both up-down and left-right angles). Ideally, the individual is facing directly toward the camera (zero-degree sideways rotation). In such cases, the ROIs’ sizes and locations are specified statistically based on the distance between the centers of the irises of the respective eyes, which is generally constant for any given individual [400]. To provide consistent ROIs even when the face is slightly rotated with respect to the camera (±5°), another process is designed to dynamically update the ROIs based on the angle of the face toward the camera. This process uses the relative locations of the irises and the top lip or nose to determine whether the face is rotated [410] and updates the location and size of the ROIs [420]. This process ensures that the regions of interest are accurately mapped onto an image of the face, even if that subsequent image is slightly laterally or vertically rotated relative to the camera. This provides correction for the slight variability that can be expected in images captured from time to time. It also ensures that the locations of lesions within the ROIs are consistent enough for evaluation.
[0049] FIG. 5 shows an exemplary user-captured image after image normalization and specification of regions of interest [500].
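As a hedged sketch of the dynamic ROI update described above, the fragment below estimates a normalized yaw cue from the nose position relative to the iris midpoint and shifts a trapezoid in proportion. The `gain` factor and the proportional-shift rule are assumptions for illustration only; the disclosure does not specify the update formula.

```python
def yaw_offset(left_iris, right_iris, nose_tip):
    """Normalized horizontal offset of the nose tip from the iris midpoint.
    Near 0 for a frontal face; the sign indicates the rotation direction."""
    mid_x = (left_iris[0] + right_iris[0]) / 2.0
    inter_iris = right_iris[0] - left_iris[0]
    return (nose_tip[0] - mid_x) / inter_iris

def shifted_roi(trapezoid, offset, inter_iris, gain=0.5):
    """Translate a trapezoid ROI (list of (x, y) corners) horizontally in
    proportion to the yaw offset; `gain` is a hypothetical tuning factor."""
    dx = gain * offset * inter_iris
    return [(x + dx, y) for x, y in trapezoid]
```

A frontal face yields an offset of zero, leaving the statically placed trapezoids unchanged; a small rotation slides them toward the visible side of the face.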
[0050] FIG. 6 is a flowchart that describes a method used by the system to perform acne identification, separating the portion of the face representing acne lesions from the normal skin image. One method of acne identification comprises conversion of the image to a luminance-chrominance color space, such as the CIE L*a*b* color space [600]. The a* channel represents the redness of the pixel color independent of its luminance and can be used to robustly detect an increase of skin redness and identify acne regions, since most of the skin is covered by the non-inflamed normal skin color. A new image is generated using a lowpass 2D Gaussian filter to blur the image and remove details and noise [610]. The Gaussian kernel standard deviation is defined to ensure that the acne lesions are eliminated in the smoothed image. The output image is smoothed and represents the ROIs with normal skin color and without any details such as acne. The difference between the a* image before and after smoothing is used to find the redness change in each pixel [620]. Pixels with large differences in redness are then considered acne (Otsu thresholding) and used to create a binary image of acne lesions [630].
[0051] FIG. 7 shows an exemplary RGB image exhibiting a user-captured image [700], the a* image before [710] and after Gaussian filter implementation [720], the difference in redness [730], and the generated binary image of acne lesions [740].
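The redness-difference and thresholding steps [620] and [630] can be sketched in plain NumPy. The hand-rolled Otsu implementation below stands in for a library call so the example is self-contained, and the Gaussian-smoothed a* image is assumed to be supplied by a separate blurring step (here it is simply passed in).

```python
import numpy as np

def otsu_threshold(values):
    """Otsu's method on values in [0, 255]: pick the threshold that
    maximizes between-class variance of the two resulting classes."""
    hist, _ = np.histogram(values, bins=256, range=(0, 256))
    total = values.size
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = cum[t - 1]          # pixels below threshold
        w1 = total - w0          # pixels at or above threshold
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t - 1] / w0
        m1 = (cum_mean[-1] - cum_mean[t - 1]) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def acne_mask(a_channel, smoothed_a):
    """Binary lesion mask from the redness (a*) increase over the
    smoothed skin tone, per steps [620]-[630]."""
    diff = np.clip(np.asarray(a_channel, float) - np.asarray(smoothed_a, float), 0, 255)
    return diff > otsu_threshold(diff.ravel())
```

On a synthetic a* channel where a small patch of elevated redness sits on a uniform skin tone, the mask isolates exactly that patch.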
[0052] FIG. 8 is a flowchart that describes a method used by the system to classify acne lesions by severity into papule, pustule, and scab categories. The method of acne classification uses the properties of the binary image and the lesions' color saturation. During the acne identification process, the center of a pustule region remains zero in the binary image: the white center has relatively less redness difference compared to the surrounding region, which has high redness due to the skin inflammation. This results in a binary mask with a circular-shaped hole inside the pustules (white area). The Euler number of the binary regions is used to automatically separate the regions with and without holes (papules/scabs and pustules) [800]. The Euler number is the total number of objects in the binary region minus the total number of holes in that region. An Euler number equal to one represents a papule or scab and an Euler number equal to zero represents a pustule [810]. A scab is differentiated from a papule using the color saturation of the scab and the average saturation in the image [820]. The saturation of a scab is significantly higher than that of a papule [830].
[0053] FIG. 9 shows exemplary RGB images of a papule [900] and a pustule [910] with their corresponding binary objects [920] and [930].
[0054] FIG. 10 shows an exemplary monochrome rendering of an RGB image of the face with the detected papules, pustules, and scabs visualized using solid circles [1000], dashed circles [1010], and squares [1020], respectively.
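The Euler-number test of steps [800]-[810] amounts to asking whether a binary lesion object encloses a background hole. A minimal, library-free sketch flood-fills the background from the image border; any background pixel left unreached is an enclosed hole, marking the object as pustule-like. The 4-connectivity choice is an illustrative assumption, not taken from the disclosure.

```python
import numpy as np

def has_hole(binary_obj):
    """True if the binary object encloses a background hole
    (Euler number 0 -> pustule); False otherwise (Euler 1 -> papule/scab)."""
    bg = ~np.asarray(binary_obj, dtype=bool)
    h, w = bg.shape
    reachable = np.zeros_like(bg)
    # Seed the flood fill with every background pixel on the border.
    stack = [(r, c) for r in range(h) for c in (0, w - 1) if bg[r, c]]
    stack += [(r, c) for c in range(w) for r in (0, h - 1) if bg[r, c]]
    for r, c in stack:
        reachable[r, c] = True
    # Flood-fill the background with 4-connectivity.
    while stack:
        r, c = stack.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and bg[nr, nc] and not reachable[nr, nc]:
                reachable[nr, nc] = True
                stack.append((nr, nc))
    # Background pixels not reachable from the border are enclosed holes.
    return bool((bg & ~reachable).any())
```

A ring-shaped object (bright rim around a dark center, as in a pustule mask) reports a hole; a solid object, or one with a concavity open to the border, does not.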
[0055] FIG. 11 is a flowchart that describes a method used by the system to register face images over time [1100]. The newly submitted image (subsequent image) is registered to the reference image (i.e., a prior image of the subject) by finding the 2D transformation between the face images. The spatial transformation is calculated using the coordinates of the five region-of-interest trapezoids of the two images [1110]. The calculated transformation is then applied to the five regions of interest from the reference image [1120]. This warps the trapezoids, which are referenced to the iris and lip or nose positions, so that the trapezoids on a subsequent face image that is slightly rotated or tilted match those of the reference image. This region-of-interest registration method is implemented to ensure that the same acne lesions are compared over time and that slight variation in the imaging conditions of subsequent images (i.e., rotation of the face with respect to the camera) does not affect the tracking aspect of the software [1130].

[0056] FIG. 12 shows exemplary monochrome renderings of RGB images of the face - a reference image [1200] and a subsequent image [1210] - and the registered image superimposed on the reference image [1220].
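The 2D transformation of step [1110] can be sketched as a least-squares affine fit between corresponding trapezoid-corner coordinates in the two images. The exact transformation model used by the system is not specified in the disclosure, so the affine model below is an illustrative assumption.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares 2D affine transform mapping src_pts -> dst_pts.
    src_pts/dst_pts: (N, 2) arrays of corresponding ROI-corner coordinates,
    N >= 3 and not collinear."""
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    A = np.hstack([src, np.ones((len(src), 1))])   # rows are [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params                                   # (3, 2): linear part + translation

def apply_affine(params, pts):
    """Map points through the fitted affine transform."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params
```

Fitting on the trapezoid corners of the reference and subsequent images yields a transform that can then warp all five ROIs, so lesions are compared within corresponding regions from visit to visit.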
[0057] The method thus comprises collecting the locations, number and features of the acne lesions (such as size, type, stage, severity, characteristics and classification) as a data set that can be compared from time to time to determine disease progression, healing or other changes that may be diagnostically, therapeutically and/or cosmetically useful. In one embodiment, the method can be employed to compare the facial appearance of a subject before and after make-up is applied. The method can further comprise collecting sensor information from the smart phone comprising at least one of geographic location, time and date, ambient light levels, smart phone camera settings and characteristics and correlating these with the measurements as part of the dataset. The method can further comprise correlating the geographic location and time and date with external databases containing weather data, sunlight, ultraviolet (UV) light exposure, population statistics such as mortality, disease incidence and similar measures and correlating them with the image analysis. The method can further comprise collecting biographic information from the subject comprising at least one of age, gender, disease status, pain status, medication status, treatment status, diet, cleansing routine, medical history or other useful biographic, treatment, and/or environmental variables and correlating them with the image analysis.
[0058] While the foregoing description of the systems and methods is directed to imaging of the face for diagnosis and monitoring of chronic acne disease, it will be obvious to one skilled in the art that the method is equally applicable to diagnosing and monitoring other chronic skin conditions or conditions in different skin regions.
[0059] While the described systems and methods refer to analysis of two-dimensional images, it is also obvious that the method is not limited to two-dimensional images but may be applied to three-dimensional images, such as those captured using a depth camera, multi-angle imaging reconstruction or any other method of creating a three-dimensional image of an object.

[0060] While a number of exemplary aspects and embodiments have been discussed above, those of skill in the art will recognize certain modifications, permutations, additions and sub-combinations thereof. It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions and sub-combinations as are consistent with the broadest interpretation of the specification as a whole.
References
[1] Mallon E, Newton JN, Klassen A, Stewart-Brown SL, Ryan TJ, Finlay AY. The quality of life in acne: a comparison with general medical conditions using generic questionnaires. British Journal of Dermatology. 1999 Apr 1;140(4):672-6.
[2] Halvorsen JA, Stern RS, Dalgard F, Thoresen M, Bjertness E, Lien L. Suicidal ideation, mental health problems, and social impairment are increased in adolescents with acne: a population-based study. Journal of Investigative Dermatology. 2011;131(2):363-370.
[3] Rizova E, Kligman A. New photographic techniques for clinical evaluation of acne. J Eur Acad Dermatol Venereol. 2001;15:13-18.
[4] Dreno B, Poli F. Epidemiology of acne. Dermatology 2003; 206: 7-10.
[5] Tan K. Current measures for the evaluation of acne severity. Expert Rev Dermatol 2008;3:595-603.
[6] Cordain L, Lindeberg S, Hurtado M, Hill K, Eaton S. Acne vulgaris: a disease of Western civilization. Arch Dermatol 2002;138:1584-1590.
[7] Abas, Fazly Salleh, Benjamin Kaffenberger, Joseph Bikowski, and Metin N. Gurcan.
"Acne image analysis: lesion localization and classification." In SPIE Medical Imaging, pp. 97850B-97850B. International Society for Optics and Photonics, 2016.
[8] Alamdari, Nasim, Kouhyar Tavakolian, Minhal Alhashim, and Reza Fazel-Rezai.
"Detection and classification of acne lesions in acne patients: A mobile application." In Electro Information Technology (EIT), 2016 IEEE International Conference on, pp. 0739- 0743. IEEE, 2016.
[9] Kittigul, Natchapol, and Bunyarit Uyyanonvara. "Automatic acne detection system for medical treatment progress report." In 2016 7th International Conference of Information and Communication Technology for Embedded Systems (IC-ICTES), pp. 41-44. IEEE, 2016.
[10] Lucchina LC, Kollias N, Gillies R, Phillips SB, Muccini JA, Stiller MJ, et al. Fluorescence photography in the evaluation of acne. J Am Acad Dermatol. 1996;35:58-63.
[11] Pagnoni A, Kligman AM, Kollias N, Goldberg S, Stoudemayer T. Digital fluorescence photography can assess the suppressive effect of benzoyl peroxide on Propionibacterium acnes. J Am Acad Dermatol. 1999;41:710-716.
[12] Lee WL, Shalita AR, Poh-Fitzpatrick MB. Comparative studies of porphyrin production in Propionibacterium acnes and Propionibacterium granulosum. J Bacteriol. 1978;133:811-815.
[13] McGinley KJ, Webster GF, Leyden JJ. Facial follicular porphyrin fluorescence: correlation with age and density of Propionibacterium acnes. Br J Dermatol.
1980;102:437-441.
[14] Chiang A, Hafeez F, Maibach HI. Skin lesion metrics: role of photography in acne. Journal of Dermatological Treatment. 2014 Apr 1;25(2):100-5.
[15] Phillips SB, Kollias N, Gillies R, Muccini JA, Drake LA. Polarized light photography enhances visualization of inflammatory lesions of acne vulgaris. J Am Acad Dermatol.
1997;37:948-952.
[16] Fujii H, Yanagisawa T, Mitsui M, et al. Extraction of acne lesion in acne patients from multispectral images. 30th Annual IEEE EMBS Conference; 20-24 August 2008; Vancouver, British Columbia, Canada. New York: IEEE; 2008:4078-4081.
[17] R. Ramli, A. S. Malik, A. F. M. Hani, A. Jamil, "Acne analysis, grading and computational assessment methods: an overview", Skin Res. Technol., vol. 18, no. 1, pp. 1-14, Feb. 2012.
[18] N. Batool, R. Chellappa, "Detection and inpainting of facial wrinkles using texture orientation fields and Markov random field modeling", IEEE Trans. Image Process., vol. 23, no. 9, pp. 3773-88, Sep. 2014.

Claims

1. A method for assessment of facial acne in a human subject comprising:
(a) obtaining a color digital image of a face of the subject;
(b) analyzing the digital image to locate facial features;
(c) normalizing the digital image based on the facial features;
(d) defining one or more regions of interest in the digital image;
(e) identifying the location of acne lesions within or overlapping the boundary of the one or more regions of interest; and
(f) determining features of the acne lesions, wherein the features are selected from the group consisting of the size, type, stage, severity, characteristics and classification of the acne lesions.
2. The method of claim 1, comprising storing acne data for the subject comprising one or more of the number, location and the features of the acne lesions in an electronic health record specific to the subject.
3. The method of claim 1 or claim 2, comprising comparing the acne data to previously determined reference acne data for the subject and providing a report indicating any changes in one or more of the number, location and the features of the acne lesions over time.
4. The method of claim 3, comprising collecting biographical and treatment data
relevant to the subject, wherein the biographical and treatment data comprises at least one of age, gender, disease status, pain status, medication status, medical history, treatment history, skin cleansing routine and diet of the subject.
5. The method of claim 4, comprising storing the biographical and treatment data in the electronic health record.
6. The method of claim 4 or claim 5, comprising correlating the biographical and
treatment data and any changes in one or more of the number, location and the features of the acne lesions over time.
7. The method of claim 6, comprising providing the subject with acne treatment recommendations based on the correlating, wherein the acne treatment
recommendations are selected from the group consisting of skin cleansing, medical treatment and diet recommendations.
8. The method of any one of claims 1-7, wherein obtaining the digital image comprises capturing the digital image by a digital camera incorporated in a mobile wireless device, wherein the wireless device is selected from the group consisting of a smart phone, a cell phone, a tablet-type computer, notebook-type computer and a wireless computer.
9. The method of any one of claims 1-8, wherein the method of assessment is
performed by software executable by one or more processors, wherein the one or more processors are located on a mobile wireless device and/or a cloud-based computer system.
10. The method of any one of claims 1-9, wherein normalizing the digital image comprises identifying the boundaries of the face of the subject, searching for substantially circular regions corresponding to the irises of the eyes of the subject, optionally rotating the image around the center of an imaginary line between the irises until the irises are aligned substantially horizontally with respect to one another, resampling the image such that the left and right irises have a fixed number of pixels between them, wherein the center of the left iris is located at a pre-defined reference coordinate in the image, and wherein the boundaries of the image are set to a predefined distance to the left and right of the centers of the irises and above and below the centers of the irises, and cropping the image based on the distance between the center points of the irises and the rule of facial fifths.
11. The method of any one of claims 1-10, wherein the regions of interest are
trapezoidal in shape.
12. A system for assessment of facial acne in a human subject comprising: (a) a mobile device comprising a camera for capturing a color digital image of a face of the subject; and
(b) software executable by one or more processors configured to analyze the digital image to locate facial features; normalize the digital image based on the facial features; define one or more regions of interest in the digital image; identify the location of acne lesions within or overlapping the boundary of the one or more regions of interest; and determine features of the acne lesions, wherein the features are selected from the group consisting of the size, type, stage, severity, characteristics and classification of the acne lesions.
13. The system of claim 12, wherein said mobile device is a mobile wireless device, wherein the wireless device is selected from the group consisting of a smart phone, a cell phone, a tablet-type computer, a notebook computer and a wireless computer.
14. The system of claim 12 or claim 13, wherein the mobile device has Internet
connectivity and wherein the one or more processors comprises a cloud processing server.
15. The system of claim 12 or claim 13, wherein the one or more processors comprises a processor of the mobile device.
16. The system of any one of claims 12-15, wherein the software is operable to
determine whether image capture conditions of the digital image are acceptable for further processing of the digital image.
17. The system of any one of claims 12-16, wherein the mobile device is configured to provide an interface enabling a user of the mobile device to input biographical and treatment data relevant to the subject, wherein the biographical and treatment data comprises at least one of age, gender, disease status, pain status, medication status, medical history, treatment history, skin cleansing routine and diet of the subject.
18. The system of any one of claims 12-17, comprising electronic health record software executable on a cloud database server for storing and retrieving acne data for the subject comprising one or more of the number, location and the features of the acne lesions specific to the subject.
19. The system of claim 18, wherein the health record software is executable to
produce a report indicating any changes in the acne data over time.
20. A method for assessment of facial acne in a human subject comprising:
(a) obtaining a color digital image of a face of the subject;
(b) analyzing the digital image to locate facial features;
(c) identifying the boundaries of the face;
(d) identifying substantially circular regions within the digital image corresponding to the irises of the eyes of the subject;
(e) optionally rotating the image around the center of an imaginary line between the irises until the irises are aligned substantially horizontal with respect to one another;
(f) resampling the image such that the left and right irises have a fixed number of pixels between them, wherein the center of the left iris is located at a pre-defined reference coordinate in the image, and wherein the boundaries of the image are set to a predefined distance to the left and right of the center of the irises and above and below the center of the irises;
(g) cropping the image based on the distance between the center points of the irises and the rule of facial fifths;
(h) defining one or more regions of interest in the image of the face;
(i) analyzing and determining the orientation of the face and scaling the location and dimensions of the regions of interest based on the orientation;
(j) identifying the location of acne lesions located within or overlapping the
boundary of one or more of the regions of interest; and
(k) determining features of the acne lesions, wherein the features are selected from the group consisting of the size, type, stage, severity, characteristics and classification of the acne lesions.
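Step (j) above — deciding whether a lesion falls within or overlaps a region of interest — reduces to a polygon containment test when the ROIs are the trapezoids of claim 11. A minimal sketch, assuming lesions are represented by axis-aligned bounding boxes (a simplification; the patent does not specify the lesion representation):

```python
def point_in_polygon(pt, poly):
    """Standard even-odd ray-casting test; poly is a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edges straddling the horizontal ray cast to the right of pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def lesion_overlaps_roi(lesion_box, roi_poly):
    """True if the lesion's bounding box (x0, y0, x1, y1) is within or
    overlaps the ROI polygon, per step (j).

    Checking only the box corners is a simplification: a complete test
    would also handle ROI-inside-lesion and edge-crossing cases.
    """
    x0, y0, x1, y1 = lesion_box
    corners = [(x0, y0), (x1, y0), (x0, y1), (x1, y1)]
    return any(point_in_polygon(c, roi_poly) for c in corners)
```

The same test works for any convex or concave ROI shape, so restricting ROIs to trapezoids (claim 11) costs nothing in this formulation.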
21. The method of claim 20, comprising comparing the number, location and features of the acne lesions to previously determined reference acne data for the subject and providing a report indicating any changes in the number, location and/or features of the acne lesions over time.
22. The method of claim 20 or claim 21, comprising collecting biographical and treatment data relevant to the subject, wherein the biographical and treatment data comprises at least one of age, gender, disease status, pain status, medication status, medical history, treatment history, skin cleansing routine and diet.
23. The method of claim 22, comprising correlating the biographical and treatment data and any changes in one or more of the number, location and the features of the acne lesions over time.
24. The method of claim 23, comprising providing the subject with acne treatment
recommendations based on the correlating, wherein the acne treatment
recommendations are selected from the group consisting of skin cleansing, medical treatment and diet recommendations.
25. A system for assessment of facial acne in a human subject comprising:
(a) a mobile device having Internet connectivity and comprising a camera for
capturing a color digital image of a face of the subject;
(b) a mobile application executable by a mobile device processor configured to
collect sensor data relating to the digital image; and
(c) processing software executable by a cloud processing server configured to
analyze the digital image and sensor data received from the mobile device, wherein the processing software is operable to locate facial features; normalize the digital image based on the facial features; define one or more regions of interest in the digital image; identify the location of acne lesions within or overlapping the boundary of the one or more regions of interest; and determine features of the acne lesions, wherein the features are selected from the group consisting of the size, type, stage, severity, characteristics and classification of the acne lesions.
PCT/CA2019/050108 2018-01-29 2019-01-29 Systems and methods for automated facial acne assessment from digital photographic images Ceased WO2019144247A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862623487P 2018-01-29 2018-01-29
US62/623,487 2018-01-29

Publications (1)

Publication Number Publication Date
WO2019144247A1 (en) 2019-08-01

Family

ID=67395184

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2019/050108 Ceased WO2019144247A1 (en) 2018-01-29 2019-01-29 Systems and methods for automated facial acne assessment from digital photographic images

Country Status (1)

Country Link
WO (1) WO2019144247A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7454046B2 (en) * 2005-09-20 2008-11-18 Brightex Bio-Photonics, Llc Method and system for analyzing skin conditions using digital images
CA2690952C (en) * 2009-11-13 2013-03-26 Institute For Information Industry Facial skin defect resolution system, method and computer program product
EP3017755A1 (en) * 2014-11-04 2016-05-11 Samsung Electronics Co., Ltd. Electronic device, and method for analyzing face information in electronic device
US20170270350A1 (en) * 2016-03-21 2017-09-21 Xerox Corporation Method and system for assessing facial skin health from a mobile selfie image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ESKANDARI, MARYAM, FUSION OF FACE AND IRIS BIOMETRICS FOR PERSON IDENTITY VERIFICATION, 2014, Retrieved from the Internet <URL:https://pdfs.semanticscholar.org/4a05/351596a9aa762682914a1833c68ca793d481.pdf>; <URL:http://i-rep.emu.edu.tr:8080/xmlui/handle/11129/3836> *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210201008A1 (en) * 2019-12-31 2021-07-01 L'oreal High-resolution and hyperspectral imaging of skin
US12190623B2 (en) * 2019-12-31 2025-01-07 L'oreal High-resolution and hyperspectral imaging of skin
CN116322486A (en) * 2020-10-20 2023-06-23 坎菲尔德科技有限公司 Acne severity grading method and device
CN113499036A (en) * 2021-07-23 2021-10-15 厦门美图之家科技有限公司 Skin monitoring method and device, electronic equipment and computer readable storage medium
US12190515B2 (en) * 2023-01-27 2025-01-07 BelleTorus Corporation Compute system with acne diagnostic mechanism and method of operation thereof
WO2024249717A1 (en) 2023-05-31 2024-12-05 The Procter & Gamble Company Method and system for determining cosmetic skin attributes based on disorder value
WO2024249716A1 (en) 2023-05-31 2024-12-05 The Procter & Gamble Company Method and system for visualizing color gradient of human face or cosmetic skin attributes based on such color gradient

Similar Documents

Publication Publication Date Title
US10949965B2 (en) System and method for assessing wound
WO2019144247A1 (en) Systems and methods for automated facial acne assessment from digital photographic images
US10499845B2 (en) Method and device for analysing an image
US10032287B2 (en) System and method for assessing wound
De Greef et al. Bilicam: using mobile phones to monitor newborn jaundice
Likitlersuang et al. Egocentric video: a new tool for capturing hand use of individuals with spinal cord injury at home
US11854200B2 (en) Skin abnormality monitoring systems and methods
US20210174505A1 (en) Method and system for imaging and analysis of anatomical features
Karargyris et al. DERMA/care: An advanced image-processing mobile application for monitoring skin cancer
US20190340762A1 (en) Skin Abnormality Monitoring Systems and Methods
US11798189B2 (en) Computer implemented methods and devices for determining dimensions and distances of head features
US20180279943A1 (en) System and method for the analysis and transmission of data, images and video relating to mammalian skin damage conditions
Anping et al. Assessment for facial nerve paralysis based on facial asymmetry
CN111524080A (en) Face skin feature identification method, terminal and computer equipment
Nejati et al. Smartphone and mobile image processing for assisted living: Health-monitoring apps powered by advanced mobile imaging algorithms
Do et al. Early melanoma diagnosis with mobile imaging
KR102274330B1 (en) Face Image Analysis Method and System for Diagnosing Stroke
EP3413790A1 (en) Systems and methods for evaluating pigmented tissue lesions
Abas et al. Acne image analysis: lesion localization and classification
KR102172192B1 (en) Facial Wrinkle Recognition Method, System, and Stroke Detection Method through Facial Wrinkle Recognition
ES3034281T3 (en) Method and system for anonymizing facial images
Amini et al. Automated facial acne assessment from smartphone images
Fadzil et al. Independent component analysis for assessing therapeutic response in vitiligo skin disorder
Ko et al. Image-processing based facial imperfection region detection and segmentation
Nethravathi et al. Acne Vulgaris Severity Analysis Application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19743861

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM XXXX DATED 13.11.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 19743861

Country of ref document: EP

Kind code of ref document: A1