
WO2021138477A1 - Image process systems for skin condition detection - Google Patents


Info

Publication number
WO2021138477A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
calibration
image
data
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2020/067548
Other languages
French (fr)
Inventor
Kyle Yeates
Ozgur Yildirim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LOreal SA
Original Assignee
LOreal SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LOreal SA
Publication of WO2021138477A1


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • G06T7/0016Biomedical image inspection using an image reference approach involving temporal comparison
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal

Definitions

  • Embodiments of the present disclosure relate to image processing.
  • image processing techniques are employed for skin condition diagnostics and/or treatment.
  • calibration techniques can be employed.
  • a computer implemented method for determining changes in a skin condition of a subject comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, wherein each image taken is separated in time by a time period; and determining one or more differences between the plurality of images.
  • the method may further comprise generating an image map of the area of interest, the image map indicative of the differences between the plurality of images.
  • the image map indicates changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest.
  • the method may further comprise determining a skin condition based on the image map.
  • the method may further comprise recommending one of a treatment or a product based on the determined skin condition.
  • the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
  • the time period is selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year.
  • the method may further comprise notifying the user that a change has been detected if the difference detected is greater than a preselected threshold value.
  • the user is notified via an email, a text message, or a banner notification on a user device.
  • the area of interest includes a back, a face, an arm, a neck, a shoulder, or regions thereof.
  • the method may further comprise determining a skin condition based on the determined differences; and recommending one of a treatment or a product based on the determined skin condition.
  • the plurality of images are obtained by a camera, the camera including one or more adjustable settings, and wherein the method further comprises: automatically calibrating the camera by adjusting one or more of the camera settings based on an associated calibration device.
  • the method may further comprise calibrating the captured images based on calibration data presented in one or more of the captured images.
  • the calibration data is provided by a calibration device captured in the images.
  • the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
  • the attribute is a color.
  • the attribute is indicia indicative of a color.
  • the method further comprises obtaining calibration data based on the indicia by accessing a data store linked to the indicia.
  • the indicia include a QR code or a bar code.
  • the method may further comprise obtaining calibration data from a calibration device captured by the image.
  • the calibration device is a cosmetics apparatus or packaging associated therewith.
  • a system for determining changes in a skin condition of a subject includes a camera configured to capture one or more images; and one or more processing engines including circuitry configured to: cause the camera to capture one or more images of an area of interest associated with the subject, the one or more images taken sequentially over time so as to obtain a plurality of images separated in time by a time period selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year; determine one or more differences between the captured images, the differences indicative of changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest; and determine a skin condition based on the determined differences or identify the object for subsequent analysis if the differences are greater than a preselected threshold.
  • the one or more processing engines include circuitry configured to: determine the skin condition based on the determined differences; and recommend a treatment protocol or a product based on the determined skin condition.
  • the one or more processing engines include circuitry configured to determine changes in one or more of: size, shape, color, uniformity of an existing lesion, detect new lesions, detect the absence of previously detected lesion(s), or detect a progression of a lesion.
  • the one or more processing engines include circuitry configured to: detect a progression of a lesion from the detected differences in the plurality of images; and determine one or more stages of the lesion based on the detected progression of the lesion.
  • the one or more processing engines include: a user interface engine including circuitry configured to cause the camera to capture the plurality of images; an image analysis engine including circuitry for comparing two or more images using a similarity/difference algorithm to determine one or more differences between the images; and a skin condition engine including circuitry configured for analyzing an image map of the determined one or more differences to locate a lesion, and for determining the stage of the lesion located in the image map.
  • the one or more processing engines further include: a recommendation engine including circuitry configured to recommend a treatment protocol and/or product for each region based at least on the determined skin condition.
  • the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
  • the system may further comprise a calibration engine including circuitry configured to calibrate the camera prior to image capture for generating calibrated images or to calibrate the images captured by the camera for generating calibrated images, the calibration engine obtaining calibration data from a calibration device.
  • the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference.
  • the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
  • the attribute is a reference color.
  • the attribute is indicia indicative of a color or color data.
  • the calibration engine is configured to retrieve the calibration data based on the indicia by accessing a data store linked to the indicia.
  • the indicia include a QR code or a bar code.
  • a computer-implemented method for determining changes in a skin condition of a subject comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, with each image separated in time by a time period; determining a skin condition based on at least the plurality of images; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
  • the obtaining, by a first computing device, a plurality of images of an area of interest associated with the subject includes capturing, by a camera of the first computing device, the plurality of images.
  • the determining a skin condition based on at least the plurality of images or the determining at least one product recommendation based on at least the determined skin condition is carried out by a second computing device remote from the first computing device.
  • the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
  • a computer-implemented method for determining changes in a skin condition of a subject.
  • the method comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, with each image separated in time by a time period; calibrating the plurality of images based on calibration data obtained via the images; determining a skin condition based on at least the calibrated plurality of images; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
  • a computer implemented method for accurate skin diagnosis.
  • the method includes calibrating, by a computing device, one or more images of an area of interest associated with a subject; and determining a skin condition based on the one or more calibrated images.
  • the one or more images include a plurality of images taken sequentially over a period of time, and wherein the determining a skin condition is based on the plurality of images.
  • the method may further comprise generating a treatment protocol and/or product recommendation for an area of interest of the subject based on the determined skin condition.
  • the calibrating, by a computing device, one or more images of an area of interest associated with a subject includes obtaining calibration data from a calibration device; calibrating a camera based on the calibration data; and capturing the one or more images of a user with the calibrated camera.
  • the method may further comprise generating, via the calibration device, light meter data or color temperature data of the subject; receiving the calibration data from the calibration device; and adjusting one or more camera settings for calibrating the camera prior to image capture.
  • the calibrating, by a computing device, one or more images of an area of interest associated with a subject includes capturing the one or more images via a camera associated with the computing device; obtaining calibration data from a calibration device associated with the one or more images captured by the camera; and calibrating the one or more images captured by the camera based on the calibration data.
  • the method may further comprise generating, via the calibration device, light meter data or color temperature data of the subject; receiving the light meter data and/or color meter data from the calibration device, and using the light meter data and/or color meter data obtained from the calibration device to calibrate the captured images.
  • the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference, the method further comprising capturing at least one image of the subject in the presence of the calibration device.
  • the calibration device is a cosmetics apparatus.
  • in accordance with another aspect of the present disclosure, a method includes obtaining calibration data from a calibration device; and generating, by a computing device, calibrated images by one of: calibrating a camera of a mobile computing device based on the calibration data and capturing one or more images of a user with the calibrated camera; or calibrating one or more images captured with the camera based on the calibration data.
  • the method may further comprise determining a skin condition based on the one or more calibrated images. In any embodiment, the method may further comprise recommending one or more of: a skin treatment protocol; and a product configured to treat the skin condition.
  • the method may further comprise operating the calibration device to generate data indicative of light meter data or color temperature data of the subject.
  • the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data as calibration data.
  • the images captured include an area of interest associated with the user, the images of the area of interest usable for tracking skin conditions of the subject.
  • the method may further comprise capturing the one or more images of the subject in the presence of the calibration device.
  • the calibration device includes calibration data represented by one selected from the group consisting of: a color card, a color chip and a calibration reference.
  • the method may further comprise generating the calibrated image by adjusting one or more camera settings based on the calibration data.
  • the calibration engine automatically detects a color reference associated with the captured image and uses the color reference in order to correct the colors of the captured image to generate calibrated images.
  • the method may further comprise comparing the image captured to a reference image stored in a calibration data store.
  • in accordance with another aspect of the present disclosure, a computer system includes a user interface engine including circuitry configured to cause an image capture device to capture images of the user; a calibration engine including circuitry configured to calibrate the image capture device prior to image capture for generating calibrated images or to calibrate the images captured by the image capture device for generating calibrated images, the calibration engine obtaining calibration data from a calibration device; and a skin condition engine configured to determine a skin condition of the user based on the generated calibrated images.
  • the system may further comprise a recommendation engine including circuitry configured to recommend a treatment protocol or a product based at least on the determined skin condition.
  • the calibration device includes one or more sensors configured to generate data indicative of calibration data, and wherein the calibration engine is configured to receive the calibration data and adjust one or more suitable camera settings for calibrating the camera prior to image capture.
  • the calibration device includes an attribute suitable for use by the calibration engine to generate the calibrated images.
  • the attribute is a color or indicia indicative of a color.
  • the calibration engine is configured to obtain calibration data based on the indicia.
  • the calibration engine includes circuitry configured to obtain the calibration data from the image captured by the image capture device, the image captured including an image of the calibration device.
  • the calibration device is a cosmetics apparatus or packaging associated therewith.
  • the calibration engine is configured to: automatically detect a color reference associated with the captured image; and use the color reference in order to correct the colors of the captured image to generate calibrated images.
  • the calibration device includes one selected from the group consisting of: a color card, a color chip and a calibration reference.
  • the calibration engine includes circuitry configured to extract calibration data from the captured image and generate the calibrated images.
  • the calibration engine includes circuitry configured to adjust one or more image attributes of the image after image capture for calibrating the image.
  • the image attributes include white balance, brightness, and/or color values.
  • the user interface engine includes circuitry configured for creating a subject profile and storing the subject profile in a subject data store.
  • the subject profile includes previously provided recommendations, biographical data of the subject, area(s) of interest of the subject, current medicaments used, cosmetic products used, and/or treatment protocols used.
  • the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data.
  • the generated light meter data and/or color meter data is used by the calibration engine to calibrate either the image capture device or the images captured by the image capture device.
  • the calibration engine is configured to receive the calibration data from the calibration device and adjust one or more camera settings for calibrating the camera prior to image capture.
  • the calibration engine is configured to use the light meter data and/or color meter data obtained from the calibration device to calibrate the images captured by the image capture device.
  • the captured images are calibrated by adjustment to one or more settings of the image capture device based on the calibration data.
  • the calibration engine includes circuitry configured to compare the images captured to one or more reference images stored in the calibration data store to determine color calibration data for the captured images.
  • the calibration device includes one or more colors with a known color value.
  • the calibration engine includes circuitry configured to retrieve from a calibration data store color data associated with the known color value.
  • a computing device configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine at least one product recommendation based on the at least one difference between the first image and the second image; and provide the at least one product recommendation to the subject.
  • a computing device configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine if the difference between the first image and the second image is greater than a preselected value; identify the area of the images in which the differences are greater than the preselected value; and recommend review of the area identified.
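The two-image comparison summarized in the claims above (analyze images at T1 and T2, find differences greater than a preselected value, and identify the areas to review) can be sketched as a simple block-wise difference check. This is a minimal illustration, not the claimed implementation; the function name, block size, and threshold below are assumptions chosen for clarity.

```python
import numpy as np

def flag_changed_regions(img_t1, img_t2, threshold=0.05, block=16):
    """Compare two aligned, same-size RGB images of an area of interest
    taken at times T1 and T2, and return the top-left corners of the
    blocks whose mean absolute change exceeds `threshold` (expressed as
    a fraction of the full 0-255 intensity range)."""
    diff = np.abs(img_t2.astype(np.float32) - img_t1.astype(np.float32)) / 255.0
    h, w = diff.shape[:2]
    flagged = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if diff[y:y + block, x:x + block].mean() > threshold:
                flagged.append((y, x))  # region recommended for review
    return flagged
```

In practice the two images would first need to be spatially registered (e.g., via facial landmarks) and color-calibrated, as described later in this disclosure, before a pixel-wise comparison is meaningful.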
  • FIG. 1 is a schematic diagram that illustrates a non-limiting example of a system for detecting and/or diagnosing skin conditions of a user according to an aspect of the present disclosure;
  • FIG. 2 is a block diagram that illustrates a non-limiting example of a mobile computing device according to an aspect of the present disclosure;
  • FIG. 3 is a block diagram that illustrates a non-limiting example of a server computing device according to an aspect of the present disclosure;
  • FIG. 4 is a flowchart that illustrates a non-limiting example of a method for detecting and/or diagnosing a skin condition according to an aspect of the present disclosure;
  • FIG. 5 is a schematic diagram that illustrates a non-limiting example of a system for calibrating images of a user according to an aspect of the present disclosure, the calibrated images being suitable for use in applications such as diagnosing skin conditions, facial recognition, cosmetic recommendations, etc., an example of which is set forth in FIG. 1;
  • FIG. 6 is a block diagram that illustrates a non-limiting example of a mobile computing device according to various aspects of the present disclosure;
  • FIG. 7 is a block diagram that illustrates a non-limiting example of a server computing device according to an aspect of the present disclosure;
  • FIG. 8 is a flowchart that illustrates a non-limiting example of a method for generating calibrated images according to an aspect of the present disclosure.
  • FIG. 9 is a block diagram that illustrates a non-limiting example of a computing device appropriate for use as a computing device with embodiments of the present disclosure.
  • Any changes in skin conditions over time may be used as a diagnosis and/or treatment aid for a physician. Any changes in skin conditions over time may be also used in a computer implemented method that provides diagnosis and/or treatment recommendations.
  • improved image capture methodologies and technologies can be used.
  • these examples of methodologies and technologies for improved image capture are non-limiting.
  • the methodologies and technologies for improved image capture can be additionally or alternatively used in other applications, such as product selection, facial recognition, etc., or any application that would benefit by improved image capture.
  • the disclosed subject matter provides examples of systems and methods for detecting a skin condition, such as acne, by looking at multiple images of a user taken at different points in time (e.g., once a day for 1-2 weeks, once a day for a month, etc.) and using image processing techniques to detect changes of size, shape, color, uniformity, etc., of areas of the image to determine whether the changes represent characteristics (e.g., blemishes) caused by a skin condition (e.g., acne).
  • the images can be captured by a camera of the consumer product (e.g., mobile phone, tablet, etc.) and then transferred to a computer system that stores the images for subsequent access and analysis.
  • the computer system is part of the consumer product (e.g., mobile phone, tablet, etc.). After a number of images are collected, the computer system compares the images for detecting changes in the images over time (e.g., from the earliest image to the latest image). If any changes are detected, skin condition analysis can be carried out in some embodiments to determine how many acne blemishes exist, how severe the user's acne is, what stage of acne each blemish is in, etc.
  • the system and methods in some examples can recommend a treatment based on results of the skin condition analysis.
  • the treatment recommendation can include one or more treatment protocols and may include, for example, one or more product recommendations.
  • the systems and methods can track the efficacy of the recommendation and can train the system for improved recommendations in subsequent uses.
  • features on the face are static (e.g., location of nose, lips, chin, moles, freckles, etc.) relative to acne blemishes.
  • Acne blemishes last anywhere from 5-10 days to months, and during this span the acne blemish follows an understood trajectory (e.g., blocked pore, black head, white head, papule, pustule, lesion, scar).
  • Each stage of the blemish has unique colors and sizes relative to the other stages.
  • multiple images of an area of interest of the user taken over time can be analyzed via image processing techniques for determining changes in skin condition(s). If the changes to certain areas (e.g., pixel groups) of the images match, for example, the progression of a known skin condition (e.g., an acne blemish), the systems and methods in some examples identify groups of pixels as a blemish and can create an acne profile of the user associated with this area of interest.
  • the profile may include, for example, assignment of an acne stage(s) to each blemish or sections thereof. This profile can then be matched to suggested products and treatment protocols to address the skin condition. While the face is described in some embodiments, other body locations of the user can be monitored, such as the back, the chest, arms, etc.
  • multiple areas of interest can be analyzed, and an acne profile can be generated for each area of interest.
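The stage-assignment step described above (matching each blemish to a point on the known acne trajectory and building a per-area profile) can be illustrated with a toy rule-based classifier. The stage names follow the progression listed earlier in this disclosure, but the diameter and redness thresholds below are invented purely for illustration and have no clinical basis.

```python
def classify_stage(diameter_mm, redness):
    """Map simple blemish measurements to an acne stage.
    `redness` is assumed to be a normalized 0-1 score; all thresholds
    are illustrative placeholders, not clinical values."""
    if redness < 0.2:
        return "blocked pore" if diameter_mm < 1.0 else "blackhead"
    if redness < 0.5:
        return "whitehead" if diameter_mm < 2.0 else "papule"
    return "pustule" if diameter_mm < 5.0 else "lesion"

def build_acne_profile(blemishes):
    """Build a per-area acne profile mapping blemish id -> assigned stage.
    `blemishes` maps an id to a (diameter_mm, redness) tuple."""
    return {bid: classify_stage(d, r) for bid, (d, r) in blemishes.items()}
```

A production system would more likely learn this mapping from labeled images rather than hand-coded thresholds, but the resulting profile (blemish -> stage) is the structure that gets matched to products and treatment protocols.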
  • the system and methods again capture images of an area of interest (e.g., the back) taken at different points in time.
  • the time period is extended (e.g., every 6 months, every year).
  • the images are then transferred to a computer system that stores the images for subsequent access and analysis.
  • the computer system is part of the image capture device (e.g., mobile phone, tablet, etc.).
  • the computer system can compare the images to identify, for example, new lesions (e.g., moles, sun spots, aging spots, etc.) that did not exist before, or flag lesions that underwent a change (e.g., size, shape, color, uniformity, etc.) greater than a predetermined threshold (e.g., 2-5% change).
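The threshold-based flagging of new or changed lesions might be sketched as follows, assuming lesion areas have already been segmented and matched by id between the two time points (both assumptions for illustration). The 2% default mirrors the low end of the 2-5% range mentioned above.

```python
def percent_change(old, new):
    """Absolute percent change of a lesion attribute such as area."""
    return abs(new - old) / old * 100.0

def flag_lesions(lesions_t1, lesions_t2, threshold_pct=2.0):
    """Flag lesion ids that are new at T2, or whose area changed by more
    than `threshold_pct` percent between T1 and T2.
    Each argument maps a lesion id to its measured area."""
    flagged = []
    for lid, area2 in sorted(lesions_t2.items()):
        area1 = lesions_t1.get(lid)
        if area1 is None or percent_change(area1, area2) > threshold_pct:
            flagged.append(lid)  # new lesion, or change above threshold
    return flagged
```

The same comparison could also detect the absence of a previously detected lesion by checking for ids present at T1 but missing at T2.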
  • Examples of the present disclosure also relate to systems and methods for generating more accurate image(s) of a user via a camera of a consumer product (e.g., mobile phone, tablet, laptop, etc.) for subsequent use in, for example, computer implemented applications, such as skin diagnosis, including the examples briefly described above, or for facial recognition or cosmetic simulation, selection and/or recommendation, etc.
  • Examples of the systems and methods improve image accuracy and quality by addressing issues relating to unpredictable and inconsistent lighting conditions, among others.
  • the system includes a mobile computing device and an object with known lighting and/or color attributes (e.g., a reference). Such an object acts as a calibration device for images to be captured by the mobile computing device.
  • the calibration device can provide light or color meter data, color card data, or other calibration data to the mobile computing device. By accessing or receiving calibration data from the calibration device, the mobile computing device can generate calibrated images to compensate for non-uniform lighting conditions, for example.
  • the calibration data can be used prior to image capture for camera setting(s) adjustment. In other embodiments, the calibration data can be alternatively used after image capture for calibrating the images when the captured images are processed for storage.
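As a concrete sketch of the post-capture calibration path, the per-channel gain correction below scales an image so that a reference patch of known color (e.g., from a color card captured in frame) reads its true value. The function and parameter names are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np

def color_correct(image, measured_ref, known_ref):
    """Correct an RGB image using a reference patch: scale each channel
    by the ratio of the patch's known color to its measured color.
    `image` is a uint8 HxWx3 array; `measured_ref` and `known_ref` are
    per-channel RGB values of the reference patch."""
    gains = np.asarray(known_ref, np.float32) / np.asarray(measured_ref, np.float32)
    corrected = image.astype(np.float32) * gains  # broadcast over channels
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

A diagonal (per-channel) gain is the simplest white-balance model; a fuller color-card calibration would typically fit a 3x3 color matrix across many patches of known value.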
  • examples of the systems and methods provide an extremely powerful tool that can be deployed on a simple consumer product, such as a smart phone, tablet, etc., with optional cloud or server storage systems for assisting dermatologists in identifying potential problems, such as cancer.
  • because these systems and methods can be deployed in consumer products owned by or accessible to most users, they can be utilized to assist the user in tracking the changes over time (e.g., reduction) of individual lesions (blemishes, acne lesions, dark spots, etc.) to demonstrate the effectiveness of their cosmetic interventions and to provide encouragement to continue such treatment by demonstrating the actual changes over time. If such treatment is shown by the systems and methods of the present disclosure to be ineffective, the user is able to change treatment protocols sooner than without such tools.
  • a computing system that includes, for example, a handheld smart device (e.g., a smart phone, tablet, laptop, game console, etc.) with a camera and memory.
  • An optional cloud data store can be accessed by the system for storage of images of the user at different time points with appropriate metadata (e.g., date, user ID, user annotations etc.).
  • the computing system also includes one or more image processing algorithms or engines that are either local to the handheld smart device or remote to the handheld smart device (e.g., server/cloud system) for analyzing the captured images.
  • an image processing algorithm or engine compares and interprets the gross changes of lesions over time to determine and flag (e.g., identify, highlight, mark, etc.) a subset of lesions that are categorized as "suspicious."
  • an image processing algorithm or engine compares and interprets the changes of lesions over time for generating a skin condition profile (e.g., acne profile).
  • a user interface can be presented by the handheld smart device to aid the user in image capture, image storage, access to previously stored images, interaction with the analysis engines and to notify and/or display any lesions flagged as suspicious by the system.
  • some methodologies and technologies of the disclosure are provided to a user as a computer application (i.e., an "App") through a mobile computing device, such as a smart phone, a tablet, a wearable computing device, or other computing devices that are mobile and are configured to provide an App to a user.
  • the methodologies and technologies of the disclosure may be provided to a user on a computer device by way of a network, through the Internet, or directly through hardware configured to provide the methodologies and technologies to a user.
  • FIG. 1 is a schematic diagram that illustrates a non-limiting embodiment of a system for detecting changes in the skin condition of a user according to an aspect of the present disclosure.
  • a user 102 interacts with a mobile computing device 104.
  • the mobile computing device 104 may be used to capture one or more images of the user 102, from which at least one skin condition, such as acne, eczema, psoriasis, or suspicious lesion can be diagnosed.
  • the mobile computing device 104 can be used to capture one or more image(s) of the user's area of interest (e.g., back, face, neck, etc.) at different points in time (e.g., once a week, once a month, once every six months, once a year, etc.).
  • the mobile computing device 104 is used to process the collected images in order to determine changes of the area of interest over a selected period of time.
  • the selected period of time can be, for example, one week, one month, one year, etc.
  • the results of the processed images can then be used for diagnostic purposes by a physician. For example, the results of the processed images may indicate a suspicious lesion. The physician can then use the results to determine whether a biopsy or other further analysis should be made.
  • the mobile computing device 104 analyzes the changes reflected in the processed images for determining skin conditions associated with the area of interest. With this skin condition information, the mobile computing device may also be used for determining a product recommendation, treatment protocol, etc., to be presented to the user 102. The efficacy of the treatment protocol, product usage, etc., may then be tracked with subsequent image capture and analysis by the mobile computing device 104.
  • the mobile computing device 104 in some embodiments transmits the captured images to the server computing device 108 via a network 110 for image processing and/or storage.
  • the network 110 may include any suitable wireless communication technology (including but not limited to Wi-Fi, WiMAX, Bluetooth, 2G, 3G, 4G, 5G, and LTE), wired communication technology (including but not limited to Ethernet, USB, and FireWire), or combinations thereof.
  • FIG. 2 is a block diagram that illustrates a non-limiting example embodiment of a system that includes a mobile computing device 104 according to an aspect of the present disclosure.
  • the mobile computing device 104 is configured to collect information from a user 102 in the form of images of an area of interest.
  • the area of interest can be a specific body part of the user, such as the back, face, arm, neck, etc., or can be region(s) thereof, such as the forehead, chin, or nose of the face, the shoulder, dorsum, or lumbus of the back, etc.
  • the mobile computing device 104 may be a smartphone. In some embodiments, the mobile computing device 104 may be any other type of computing device having the illustrated components, including but not limited to a tablet computing device or a laptop computing device. In some embodiments, the mobile computing device 104 may not be mobile, but may instead be a stationary computing device such as a desktop computing device or computer kiosk. In some embodiments, the illustrated components of the mobile computing device 104 may be within a single housing. In some embodiments, the illustrated components of the mobile computing device 104 may be in separate housings that are communicatively coupled through wired or wireless connections (such as a laptop computing device with an external camera connected via a USB cable). The mobile computing device 104 also includes other components that are not illustrated, including but not limited to one or more processors, a non-transitory computer-readable medium, a power source, and one or more communication interfaces.
  • the mobile computing device 104 includes a display device 202, a camera 204, an image analysis engine 206, a skin condition engine 208, a user interface engine 210, a recommendation engine 212, and one or more data stores, such as a user data store 214, a product data store 216 and/or skin condition data store 218.
  • the display device 202 is an LED display, an OLED display, or another type of display for presenting a user interface.
  • the display device 202 may be combined with or include a touch-sensitive layer, such that a user 102 may interact with a user interface presented on the display device 202 by touching the display.
  • a separate user interface device including but not limited to a mouse, a keyboard, or a stylus, may be used to interact with a user interface presented on the display device 202.
  • the user interface engine 210 is configured to present a user interface on the display device 202.
  • the user interface engine 210 may be configured to use the camera 204 to capture images of the user 102.
  • a separate image capture engine may also be employed to carry out at least some of the functionality of the user interface engine 210.
  • the user interface presented on the display device 202 can aid the user in capturing images, storing the captured images, accessing the previously stored images, interacting with the other engines, etc.
  • the user interface presented on the display device 202 can also present one or more lesions that were flagged as suspicious by the system, and/or can present a treatment protocol to the user 102 with or without product recommendations.
  • the user interface engine 210 may also be configured to create a user profile.
  • Information in the user profile may be stored in a data store, such as the user data store 214.
  • Data generated and/or gathered by the system 100 (e.g., images, analysis data, statistical data, user activity data, or other data) may also be stored in the user profile.
  • the user profile information may therefore incorporate information the user provides to the system through an input means, for example, such as a keyboard, a touchscreen, or any other input means.
  • the user profile may further incorporate information generated or gathered by the system 100, such as statistical results and recommendations, and may include information gathered from social network sites, such as Facebook™, Instagram, etc.
  • the user may input information such as the user's name, the user's email address, social network information pertaining to the user, the user's age, user's area of interest, and any medications, topical creams or ointments, cosmetic products, treatment protocol, etc., currently used by the user, previously recommended treatments and/or products, etc.
  • the camera 204 is any suitable type of digital camera that is used by the mobile computing device 104.
  • the mobile computing device 104 may include more than one camera 204, such as a front-facing camera and a rear- facing camera.
  • any reference to images being utilized by embodiments of the present disclosure should be understood to reference video, images (one or more images), or video and images (one or more images), as the present disclosure is operable to utilize video, images (one or more images), or video and images (one or more images) in its methods and systems described herein.
  • the mobile computing device 104 may use an image capture engine (not shown) to capture images of the user.
  • the image capture engine is part of the user interface engine 210.
  • the image capture engine is configured to capture one or more images of an area of interest.
  • the area of interest can be for example the back, the face, the neck, the chest, or sections thereof, of the user 102.
  • the images can be captured by the user 102 as a "selfie," or the mobile computing device 104 can be used by a third party for capturing images of a user 102.
  • the image capture engine timestamps the captured image(s) and stores the images according to the user profile with other data, such as flash/camera settings.
  • the image capture engine may also send the images with the associated information to the server computing device 108 for storage, optional processing, and subsequent retrieval, as will be described in more detail below.
  • the image analysis engine 206 is configured to compare two or more images.
  • the image analysis engine 206 checks the timestamps of the images and runs a similar/difference algorithm or image processing routine.
  • the similar/difference algorithm determines or detects changes in size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc.
  • the image analysis engine 206 compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious."
  • the lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., by an amount greater than a predetermined threshold.
  • This subset of lesions can be highlighted on the image, represented in a skin condition map or profile, etc.
  • the image analysis engine 206 can identify the changes in the images as acne blemishes, which can also be highlighted on the image, represented in a skin condition map or profile, etc.
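  • As a non-authoritative illustration of the kind of similar/difference routine described above, the following pure-Python sketch builds a per-pixel change mask from two registered, same-size images and reduces it to a change fraction. The function names and the grayscale threshold are hypothetical and not taken from the disclosure; a production implementation would likely use an image-processing library.

```python
def change_map(img_a, img_b, thresh=30):
    """Binary mask of pixels whose grayscale intensity changed by more
    than `thresh` between two registered, same-size images.
    Images are nested lists of (r, g, b) tuples (illustrative format)."""
    def gray(px):
        return sum(px) / 3.0
    return [[abs(gray(pa) - gray(pb)) > thresh
             for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def change_fraction(mask):
    """Fraction of pixels flagged as changed, suitable for comparison
    against a preset threshold when deciding whether to flag a region."""
    flagged = sum(sum(row) for row in mask)
    total = sum(len(row) for row in mask)
    return flagged / total
```

A per-pixel diff like this only makes sense after the images are registered (aligned), which is why the registration step discussed elsewhere in the disclosure matters.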
  • the skin condition engine 208 is configured to analyze, for example, the skin condition map or profile, and can determine, for example, the stages of acne for each region of the image. In doing so, the skin condition engine 208 can access data from the skin condition data store 218. In some embodiments, the skin condition engine 208 identifies a progression of a skin condition, such as acne (e.g., determined from an analysis of the images).
  • the skin condition engine 208 can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.).
  • the recommendation engine 212 in some embodiments is configured to recommend a treatment protocol and/or product (e.g., topical formula, such as an ointment, cream, lotion, etc.) for each region based at least on the determined skin condition (e.g., stage of acne, etc.). In doing so, the recommendation engine 212 can access data from the product data store 216 and/or the user data store 214. Any recommendation generated by the recommendation engine 212 can be presented to the user in any fashion via the user interface engine 210 on display 202.
  • Engine refers to logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, Java™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASP, Microsoft .NET™, Go, and/or the like.
  • An engine may be compiled into executable programs or written in interpreted programming languages.
  • Software engines may be callable from other engines or from themselves.
  • the engines described herein refer to logical modules that can be merged with other engines or can be divided into sub-engines.
  • the engines can be stored in any type of computer-readable medium or computer storage device and be stored on and executed by one or more general purpose computers, thus creating a special purpose computer configured to provide the engine or the functionality thereof.
  • Data store refers to any suitable device configured to store data for access by a computing device.
  • a data store is a highly reliable, high-speed relational database management system (DBMS) executing on one or more computing devices and accessible over a high-speed network.
  • Another example of a data store is a key-value store.
  • any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries may be used, and the computing device may be accessible locally instead of over a network or may be provided as a cloud-based service.
  • a data store may also include data stored in an organized manner on a computer-readable storage medium, such as a hard disk drive, a flash memory, RAM, ROM, or any other type of computer-readable storage medium.
  • FIG. 3 is a block diagram that illustrates various components of a non-limiting example of an optional server computing device 108 according to an aspect of the present disclosure.
  • the server computing device 108 includes one or more computing devices that each include one or more processors, non-transitory computer-readable media, and network communication interfaces that are collectively configured to provide the components illustrated below.
  • the one or more computing devices that make up the server computing device 108 may be rack-mount computing devices, desktop computing devices, or computing devices of a cloud computing service.
  • image processing and/or storage of the captured images can be additionally or alternatively carried out at an optional server computing device 108.
  • the server computing device 108 can receive captured and/or processed images from the mobile computing device 104 over the network 110 for processing and/or storage.
  • the server computing device 108 optionally includes an image analysis engine 306, a skin condition engine 308, a recommendation engine 312, and one or more data stores, such as a user data store 314, a product data store 316 and/or skin condition data store 318.
  • the image analysis engine 306, skin condition engine 308, recommendation engine 312, and data stores (user data store 314, product data store 316, and/or skin condition data store 318) are substantially identical in structure and functionality to the image analysis engine 206, skin condition engine 208, recommendation engine 212, and data stores (user data store 214, product data store 216, and/or skin condition data store 218) of the mobile computing device 104 illustrated in FIG. 2.
  • FIG. 4 is a flowchart that illustrates a non-limiting example embodiment of a method 400 for determining changes in skin conditions of a user according to various aspects of the present disclosure.
  • the method 400 also analyzes the changes in skin conditions and optionally recommends a treatment protocol and/or product to treat the user 102.
  • the following method steps can be carried out in any order or at the same time, unless an order is set forth in an express manner or understood in view of the context of the various operation(s). Additional process steps can also be carried out. Of course, some of the method steps can be combined or omitted in example embodiments.
  • the method 400 proceeds to block 402, where a mobile computing device 104 captures image(s) of the user 102 at a time (Tn).
  • the mobile computing device 104 uses the camera 204 to capture at least one image.
  • more than one image with different lighting conditions may be captured in order to allow an accurate color determination to be generated.
  • the captured image is of an area of interest to the user 102.
  • the area of interest can be one of the face, the neck, the back, etc., for tracking lesions (such as moles, sun spots, acne, eczema, etc.), for skin condition analysis, etc.
  • the one or more images can be stored in the user data store 214 at the mobile computing device 104 and/or server computer 108.
  • additional data collected at the time of image capture can be associated with the images.
  • each image is time stamped, and may include other information, such as camera settings, flash settings, etc., area of interest captured, etc.
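  • The time stamping and metadata association described above can be sketched as follows; this is a hypothetical illustration (the record fields and function names are not from the disclosure), showing one way a captured image could be bundled with a timestamp, the area of interest, and camera/flash settings so later comparisons can match like-for-like captures.

```python
import json
from datetime import datetime, timezone

def image_record(path, area_of_interest, camera_settings):
    """Bundle a captured image path with a UTC timestamp and the
    capture metadata (camera settings, flash settings, area of
    interest) for storage in a user data store."""
    return {
        "path": path,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "area_of_interest": area_of_interest,
        "camera_settings": camera_settings,  # e.g. flash, exposure
    }

def save_record(record, sidecar_path):
    """Persist the metadata next to the image as a JSON sidecar file."""
    with open(sidecar_path, "w") as f:
        json.dump(record, f, indent=2)
```

Keeping the metadata with each image lets the analysis engine select images of the same area of interest captured under comparable settings.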
  • the user interface engine 210 can be used to create a user profile, as described above.
  • the user interface engine 210 may query the user to enter the intended location (e.g., back, face, arm, neck, etc.) so that the captured image can be associated with the user's area of interest.
  • the area of interest can be a specific body part of the user, such as the back, face, arm, neck, etc., or can be regions thereof, such as the forehead, chin, or nose of the face, the shoulder, dorsum, or lumbus of the back, etc.
  • the user interface engine 210 can be repeatedly used until all images are captured.
  • the captured images are stored in the user data store 214. If stored at the server computer 108 in user data store 314, the mobile computing device 104 can transmit the images over the network 110.
  • Images of the same area of interest are then captured sequentially over a period of time (T1, T2, T3, ..., Tn) at block 402.
  • the images can be captured daily, weekly, bi-weekly, monthly, bi-monthly, semi-annually, annually, etc.
  • the period of image capture can change during observation of the area of interest. For example, if an area of interest is flagged by the system and the user is notified, or if the user notices changes when reviewing one or more of the captured images, the frequency of image capture can be adjusted accordingly.
  • the images captured over a period of time are processed by the image analysis engine 206 of the mobile computing device 104 or the image analysis engine 306 of the server computing device 108.
  • the images collected over time are processed, for example, to detect differences or changes in the images by comparing each image to the other images.
  • the image analysis engine is initiated by user input (e.g., via user interface 210).
  • the image analysis engine may automatically analyze the images once the images are stored in user data store 214 and/or 314. If differences are determined, the image analysis engine is configured to notify the user. For example, if the determined differences are greater than a preset threshold value, the user is notified. Notification can be carried out via email, text message, banner notification via the user interface, etc., the preference of which can be set up in the user profile.
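  • The notification logic described above could look like the following sketch. The profile keys, default threshold, and callback are illustrative assumptions, not details from the disclosure.

```python
def maybe_notify(change_fraction, user_profile, notify):
    """Notify the user via their preferred channel when the measured
    change exceeds the preset threshold stored in the user profile.
    `notify` is a callback (channel, message) supplied by the app."""
    threshold = user_profile.get("change_threshold", 0.03)  # e.g. 3%
    if change_fraction > threshold:
        channel = user_profile.get("notify_via", "banner")  # email/text/banner
        notify(channel, f"Detected {change_fraction:.1%} change in your area of interest")
        return True
    return False
```

The channel preference (email, text message, banner) mirrors the profile-based preference setup described in the text.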
  • the image analysis engine can employ one or more image processing techniques to determine the area of interest of the user.
  • the image analysis engine may access information from a data store to assist in this determination.
  • the captured images may be compared to images with known static body (e.g., facial) features, such as the eyes, nose, and ears in order to determine the area of interest.
  • registration between captured images is performed to improve the analysis. This can be accomplished in some embodiments by referencing static body (e.g., facial) features present in each of the images to be analyzed. In some embodiments, one or more of these processes can be trained.
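  • A simplified, non-authoritative sketch of registration by static features: assuming landmark coordinates for features such as the eyes, nose, and ears have already been located in each image, a pure-translation alignment can be estimated by averaging the per-landmark offsets. (A production system would more likely fit a full affine or homography transform; the function names here are hypothetical.)

```python
def estimate_translation(landmarks_ref, landmarks_new):
    """Estimate the (dx, dy) shift that best aligns the new image's
    static features with the reference image's, by averaging the
    per-landmark offsets. Landmarks are lists of (x, y) tuples in
    corresponding order."""
    n = len(landmarks_ref)
    dx = sum(xn - xr for (xr, _), (xn, _) in zip(landmarks_ref, landmarks_new)) / n
    dy = sum(yn - yr for (_, yr), (_, yn) in zip(landmarks_ref, landmarks_new)) / n
    return dx, dy

def shift_point(p, dx, dy):
    """Map a point in the new image back into the reference frame."""
    return (p[0] - dx, p[1] - dy)
```

With the images in a common frame, per-pixel and per-lesion comparisons become meaningful.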
  • the example of the method 400 proceeds to block 406, where an image map is generated depicting changes to the area of interest over time.
  • the image analysis engine determines or detects changes in one or more of size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc.
  • the image analysis engine compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious.”
  • the lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., by an amount greater than a predetermined threshold (e.g., 1-3%, 2-4%, 3-5%, etc.).
  • This subset of lesions can be represented in an image map in the form of a skin condition map or profile, etc.
  • the image analysis engine can identify the changes in the images as acne blemishes, or other skin conditions, which can also be represented in a skin condition map or profile, etc.
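  • The per-lesion flagging against a predetermined threshold can be sketched as below. The attribute names and the 3% default are illustrative assumptions consistent with the example ranges above, not values specified by the disclosure.

```python
def lesion_change(prev, curr):
    """Relative change in tracked lesion attributes between two visits.
    `prev` and `curr` are dicts of measurements for one lesion, e.g.
    {"area_mm2": ..., "perimeter_mm": ...} (field names hypothetical)."""
    changes = {}
    for key in ("area_mm2", "perimeter_mm"):
        if prev.get(key):
            changes[key] = (curr[key] - prev[key]) / prev[key]
    return changes

def is_suspicious(changes, threshold=0.03):
    """Flag the lesion when any attribute changed by more than the
    predetermined threshold (here 3%, per the example ranges)."""
    return any(abs(v) > threshold for v in changes.values())
```

The flagged subset can then be highlighted on the image or added to the skin condition map described above.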
  • a skin condition of the area of interest is determined based on the skin condition map or profile.
  • the skin condition engine 208 of the mobile computing device 104 or the skin condition engine 308 of the server computing device 108 analyzes the skin condition map or profile and determines, for example, the stages of acne for each region of the area of interest. In doing so, the skin condition engine can access data from the skin condition data store 218, 318.
  • the skin condition engine identifies a progression of a skin condition, such as acne (determined from an analysis of the images). In other embodiments, this step can be carried out, at least in part, by the image analysis engine.
  • the skin condition engine can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.).
  • the example of the method 400 then proceeds to block 410, where a treatment protocol and/or product are recommended for each region of the area of interest based on the determined skin condition (e.g., stage of acne, etc.).
  • data can be accessed from the product data store 216, 316, user data store 214, 314, etc.
  • Different products and/or treatment protocols can be recommended for regions with different skin condition levels.
  • Any recommendation generated by the recommendation engine can be presented to the user in any fashion via the user interface engine 210 on display 202.
  • the recommendation can be saved in the user’s profile in user data store 214, 314.
  • previous recommendations and/or treatments administered by the user can be used in the product and/or treatment protocol recommendation.
  • the efficacy of the recommendation can be tracked, which can be used to train the recommendation engine and/or data stored in the product data store for improved recommendations in subsequent uses.
  • the method 400 then proceeds to an end block and terminates.
  • FIG. 5 is a schematic diagram that illustrates a non-limiting example of a system 500 for generating calibrated images of a user according to various aspects of the present disclosure.
  • the system 500 or aspects thereof may use the calibrated images for diagnosing a skin condition of a user, for example.
  • one or more aspects of the system 500 can be employed by the system 100 described above.
  • a calibration routine from system 500 can be carried out by the system 100 prior to area of interest image analysis.
  • the images captured by the camera can be employed by the system 100 in order to carry out the skin condition determination method 400 described above with reference to FIG. 4.
  • system 500 can be configured to carry out the skin condition methodologies and technologies of system 100, as shown schematically in FIGURES 6 and 7.
  • the system 500 can use the calibrated images for other applications, such as facial recognition applications, for example.
  • the calibrated images may be utilized for generating a recommendation for cosmetic products.
  • cosmetic products may be for skin, anti-aging, face, nails, and hair, or any other beauty or health product.
  • products may include creams, cosmetics, nail polish, shampoo, conditioner, other hair products, vitamins, any health-related products of any nature, or any other product that offers results visible in a person's appearance, such as a person's skin, hair, nails, or other aspects of a person's appearance.
  • treatments may include diet treatments, physical fitness treatments, acupuncture treatments, acne treatments, appearance modification treatments, or any other treatment that offers results visible in a person's appearance.
  • a user 502 interacts with a mobile computing device 504 and a calibration device 506.
  • the mobile computing device 504 is used to capture one or more images of the user 502 in the presence of the calibration device 506.
  • the calibration device 506 is associated with or generates calibration data, such as light meter data, color meter data, color card (e.g., color reference) data, etc.
  • the calibration data is used by the system 500 to generate calibrated images via the mobile computing device 504, for example.
  • the calibration device 506 can be used to calibrate the mobile computing device 504 (e.g., a camera of the mobile computing device) prior to image capture in order to generate calibrated images.
  • the calibration data can be used by the mobile computing device 504 after image capture for generating calibrated images. Because of the calibration data provided by the calibration device, the images can be either captured or processed in a way to obtain, for example, true colors of the user, regardless of the lighting conditions, etc., in which the image was taken.
  • the calibration device 506 is a cosmetic, such as lipstick.
  • the cosmetic packaging includes one or more colors that can be used as a color calibration reference.
  • the color(s) is chosen from a list of colors from the Macbeth chart.
  • the Macbeth chart is comprised of a number of colors with known color values.
  • Other color reference systems can be also used.
  • the color(s) can be on the exterior of the cosmetic packaging or on a part thereof (e.g., cap, lid, etc.) that can be visible to the mobile computing device 504.
  • the calibration data of the calibration device 506 can be associated with other material obtained at the point of sale, for example, the box or other container/packaging, the product literature, etc.
  • the associated material includes a color card, or parts thereof, for example.
  • the color card can include colors, for example, of the Macbeth chart.
  • the associated material includes one or more colors and/or associated indicia.
  • the associated indicia (e.g., QR code, bar code, symbol, etc.) can be used by the system to obtain, for example, the color value(s) of the one or more colors included in the associated material or of the cosmetic packaging for calibration purposes.
  • the associated indicia can be linked to color value(s) in a calibration data store.
  • the calibration device 506 includes one or more sensors configured to generate color meter data, light meter data, etc.
  • the calibration device 506 in one embodiment includes one or more photosensors (e.g., photodiodes) configured to sense light conditions and generate light calibration data.
  • the calibration device 506 in other embodiments includes one or more photosensors (e.g., filtered photodiodes) configured to sense color temperature and generate color calibration data.
  • the mobile device may include such sensors, and may be used to capture such calibration-affecting data.
  • the calibration device 506 can take many forms or functions.
  • the calibration device 506 can be a cosmetic, such as a lipstick, eyeshadow, foundation, etc., a hair brush, a toothbrush, etc., or an appliance, such as a Clarisonic branded skin care appliance.
  • the only function of the calibration device 506 is to provide calibration data.
  • the calibration device 506 is configured to transmit the calibration data to the mobile computing device 504.
  • the calibration device 506 can be coupled (e.g., wired or wirelessly) in data communication with the mobile computing device 504 according to any known or future developed protocol, such as universal serial bus, Bluetooth, WiFi, Infrared, ZigBee, etc.
  • the calibration device 506 includes a transmitter for transmitting the calibration data.
  • once the calibration device 506 is turned on and in range of the mobile computing device 504, it automatically pairs and sends the calibration data to the mobile computing device 504. In other embodiments, the mobile computing device 504 pulls the calibration data from the calibration device 506 via a request or otherwise. In yet other embodiments, the mobile computing device 504 obtains the calibration data from a local data store or a remote data store, such as the calibration data store, based on the associated indicia of the calibration device 506.
  • the mobile computing device 504 in some embodiments can carry out a device calibration routine to adjust camera settings, such as white balance, brightness, contrast, exposure, aperture, flash, etc., based on the provided calibration data prior to image capture.
  • the calibration data can be also used after image capture in some embodiments. For example, an image captured along with calibration data can be adjusted via imaging processing. In one embodiment in which color data is obtained via a color reference, the image can be compared to a reference image that also contains the color reference with the same color value(s). From the comparison(s), various attributes of the image(s) can be adjusted to calibrate the image.
  • the calibration device includes associated indicia that can be used to retrieve color values of the calibration device. From the retrieved color value(s), various attributes of the image(s) can be adjusted to calibrate the captured image.
  • the calibration device can transmit light and/or color meter data to the mobile computing device. With the light and/or color meter data, calibrated images are generated by the mobile computing device from the captured images.
  • the images captured and/or processed by the mobile computing device 504 would look the same whether the user has taken the photo in a dark room, a bright room, or a room with non-uniform and highly angled lighting.
  • calibrated images are generated by the mobile computing device. This standardization process can lead to a reduction or elimination of the variability in the quality of images used for applications ranging from diagnosing skin conditions and/or cosmetic recommendations to facial recognition, for example.
  • the mobile computing device 504 in some embodiments transmits the captured images to the server computing device 508 via a network 510 for image processing (e.g., calibration, skin condition diagnosis, product recommendation, facial recognition, etc.) and/or storage.
  • the network 510 may include any suitable wireless communication technology (including but not limited to Wi-Fi, WiMAX, Bluetooth, 2G, 3G, 4G, 5G, and LTE), wired communication technology (including but not limited to Ethernet, USB, and FireWire), or combinations thereof.
  • the server computing device 508 may process the captured images for calibration purposes and/or store the calibrated images for subsequent retrieval.
  • calibrated images are transmitted to the server computing device 508 for storage and/or further processing, such as skin condition diagnosis, etc.
  • the server computing device 508 can serve calibration data to the mobile computing device 504 for local processing.
  • FIG. 6 is a block diagram that illustrates a non-limiting example of a mobile computing device 504 according to an aspect of the present disclosure.
  • the mobile computing device 504 may be a smartphone.
  • the mobile computing device 504 may be any other type of computing device having the illustrated components, including but not limited to a tablet computing device or a laptop computing device.
  • the mobile computing device 504 may not be mobile, but may instead be a stationary computing device such as a desktop computing device or computer kiosk.
  • the illustrated components of the mobile computing device 504 may be within a single housing.
  • the illustrated components of the mobile computing device 504 may be in separate housings that are communicatively coupled through wired or wireless connections (such as a laptop computing device with an external camera connected via a USB cable).
  • the mobile computing device 504 also includes other components that are not illustrated, including but not limited to one or more processors, a non-transitory computer-readable medium, a power source, and one or more communication interfaces.
  • the mobile computing device 504 includes a display device 602, a camera 604, an optional image analysis engine 606, a skin condition engine 608 (optional), a user interface engine 610, a recommendation engine 612 (optional), and one or more optional data stores, such as a user data store 614, a product data store 616 and/or skin condition data store 618, and a calibration data store 620.
  • the mobile computing device 504 further includes a calibration engine 622. Each of these components will be described in turn.
  • the display device 602 is an LED display, an OLED display, or another type of display for presenting a user interface.
  • the display device 602 may be combined with or include a touch-sensitive layer, such that a user 502 may interact with a user interface presented on the display device 602 by touching the display.
  • a separate user interface device including but not limited to a mouse, a keyboard, or a stylus, may be used to interact with a user interface presented on the display device 602.
  • the user interface engine 610 is configured to present a user interface on the display device 602.
  • the user interface engine 610 may be configured to use the camera 604 to capture images of the user 502.
  • the user 502 may take a "selfie" with the mobile computing device 504 via camera 604.
  • a separate image capture engine may also be employed to carry out at least some of the functionality of the user interface engine 610.
  • the user interface presented on the display device 602 can aid the user in capturing images, storing the captured images, accessing the previously stored images, interacting with the other engines, etc.
  • the camera 604 is any suitable type of digital camera that is used by the mobile computing device 504.
  • the mobile computing device 504 may include more than one camera 604, such as a front-facing camera and a rear-facing camera.
  • the camera 604 includes adjustable settings, such as white balance, brightness, contrast, exposure, aperture, and/or flash, etc.
  • any reference to images in the present disclosure should be understood to encompass video, one or more images, or both video and images, as the methods and systems described herein are operable with any of these.
  • the calibration engine 606 is configured to calibrate the camera 604 of the mobile computing device 504 based on calibration data obtained from at least one of the calibration device 506 or the calibration data store 620. In some embodiments, the calibration engine 606 is configured to adjust the settings of the camera 604 prior to image capture. In other embodiments, instead of calibrating the camera 604 prior to image capture, the calibration engine 606 is configured to calibrate the images after image capture. For example, calibration data from the calibration device 506 can be used when processing the captured images prior to or during storage.
  • the calibration engine 606 detects a color reference (such as a color card) within the captured image and uses the color reference in order to correct the colors of the captured image for calibration purposes. For example, the calibration engine 606 in some embodiments compares the image captured by the camera 604 to a reference image stored in the calibration data store 620.
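The color-correction step described here could be sketched as follows. This is a minimal illustration only; the function name, the diagonal per-channel correction, and the patch values are assumptions, not details taken from the disclosure:

```python
import numpy as np

def correct_colors(image, observed_patch_rgb, known_patch_rgb):
    """Scale each color channel so the color reference patch observed in
    the captured image matches its known color value from the data store."""
    image = np.asarray(image, dtype=np.float64)
    # Per-channel gains that map the observed patch color to the known color.
    gains = np.asarray(known_patch_rgb, dtype=np.float64) / np.asarray(
        observed_patch_rgb, dtype=np.float64
    )
    corrected = image * gains  # gains broadcast over every pixel
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

In practice a full color card would yield several patches and a least-squares color transform, but the single-patch gain above conveys the idea.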
  • the reference image contains some or all of the color calibration data of the captured image.
  • the calibration device 506 may be, e.g., cosmetic packaging, product literature, an appliance handle, etc.
  • the calibration device 506 may include a color card, a color chip, or other color reference, etc., to be compared to the reference image stored in calibration data store 620.
  • the color of the calibration device 506 has a known color value.
  • the calibration device 506 includes one or more colors with a known color value that can be retrieved from the calibration data store 620 via indicia visibly associated with the calibration device 506.
  • the color reference detected by the calibration engine 606 within the captured image is indicia that can be used to retrieve the color value(s) from the calibration data store 620 in order to correct the colors of the captured image for calibration purposes.
  • the calibrated images are saved in a data store, such as user data store 614, and can be subsequently used for product selection (e.g., hair color, lipstick color, eye shadow color, etc.), diagnosis, such as skin condition diagnosis, or for other purposes, such as facial recognition applications.
  • the mobile computing device 504 may be provided with other engines for increased functionality.
  • the mobile computing device 504 includes a skin condition engine 608.
  • the skin condition engine 608 is configured to analyze the calibrated images to determine one or more skin conditions (e.g., acne, eczema, psoriasis, etc.) of the user 502.
  • the skin condition engine 608 may retrieve data from the skin condition data store 618 during the analysis.
  • a recommendation engine 612 may also be provided, which recommends a treatment protocol, products for treatment, etc., based on the results of the analysis carried out by the skin condition engine 608. In doing so, the recommendation engine 612 can access data from the product data store 616.
  • the mobile computing device 504 includes an image analysis engine 622 as well as the skin condition engine 608.
  • Image analysis engine 622 includes, among other things, the functionality of the image analysis engine 206 described above with reference to FIG. 2.
  • the image analysis engine 622 in some embodiments is configured to compare two or more images.
  • the image analysis engine 622 checks the timestamps of the images and runs a similar/difference algorithm or image processing routine.
  • the similar/difference algorithm determines or detects changes in size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc.
  • image analysis engine 622 compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious."
  • the lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., an amount greater than a predetermined threshold.
  • This subset of lesions can be highlighted on the image, represented in a skin condition map or profile, etc.
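The threshold-based flagging described in the preceding bullets might look something like the following. The attribute names, lesion identifiers, and threshold values here are invented for illustration and are not from the disclosure:

```python
def flag_suspicious(previous, current, thresholds):
    """Flag lesions whose tracked attributes (size, shape, color,
    uniformity, ...) changed by more than a predetermined threshold
    between two time-separated images, plus any newly detected lesion."""
    suspicious = []
    for lesion_id, attrs in current.items():
        prev = previous.get(lesion_id)
        if prev is None:
            suspicious.append(lesion_id)  # lesion not present in earlier image
            continue
        # Any single attribute exceeding its threshold marks the lesion.
        if any(abs(attrs[a] - prev[a]) > limit for a, limit in thresholds.items()):
            suspicious.append(lesion_id)
    return suspicious
```

A lesion absent from `current` but present in `previous` could likewise be reported, per the "absence of previously detected lesions" case above.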
  • the image analysis engine 622 can identify the changes in the images as acne blemishes, which can also be highlighted on the image, represented in a skin condition map or profile, etc.
  • the skin condition engine 608 is configured to analyze the skin condition map or profile generated by the image analysis engine 622, and can determine, for example, the stages of acne for each region of the image. In doing so, the skin condition engine 608 can access data from the skin condition data store 618. In some embodiments, the skin condition engine 608 identifies a progression of a skin condition, such as acne (determined from an analysis of the images).
  • the skin condition engine 608 can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.).
  • the recommendation engine 612 is configured to recommend a treatment protocol and/or product for each region based on the determined skin condition (e.g., stage of acne, etc.) in some embodiments. In doing so, the recommendation engine 612 can access data from the product data store 616. Any recommendation generated by the recommendation engine 612 can be presented to the user in any fashion via the user interface engine 610 on display 602.
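A per-region recommendation lookup of this kind might be sketched as follows. The condition-to-product mapping stands in for the product data store 616 and is entirely hypothetical:

```python
# Hypothetical product data, standing in for the product data store 616.
PRODUCT_DATA = {
    ("acne", 1): ["gentle foaming cleanser"],
    ("acne", 2): ["salicylic acid spot treatment"],
    ("acne", 3): ["benzoyl peroxide gel"],
}

def recommend_for_region(condition, level):
    """Return product recommendations for one region of the area of
    interest, keyed by the determined skin condition and its level."""
    return PRODUCT_DATA.get((condition, level), ["consult a dermatologist"])
```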
  • a facial recognition engine (not shown) is provided, which is configured to identify the identity of, or other attribute, of the user.
  • a cosmetic recommendation engine (not shown) is provided, which can simulate product color, such as hair color, lipstick, etc., on the user for aid in product selection, product recommendation, etc.
  • the cosmetic recommendation engine is part of the recommendation engine 612 and can access data from the product data store 616. Any recommendation generated by the recommendation engine 612 can be presented to the user in any fashion via the user interface engine 610 on display 602.
  • FIG. 7 is a block diagram that illustrates various components of a non-limiting example of an optional server computing system 508 according to an aspect of the present disclosure.
  • the server computing system 508 includes one or more computing devices that each include one or more processors, non-transitory computer-readable media, and network communication interfaces that are collectively configured to provide the components illustrated below.
  • the one or more computing devices that make up the server computing system 508 may be rack-mount computing devices, desktop computing devices, or computing devices of a cloud computing service.
  • image processing and/or storage of the captured images can be additionally or alternatively carried out at an optional server computing device 508.
  • the server computing device 508 can receive captured and/or processed images from the mobile computing device 504 over the network 510 for processing and/or storage.
  • the server computing device 508 optionally includes a calibration engine 706, a skin condition engine 708, a recommendation engine 712, and one or more data stores, such as a user data store 714, a product data store 716, a skin condition data store 718, and/or a calibration data store 720.
  • the calibration engine 706, the skin condition engine 708, the recommendation engine 712, and the one or more data stores are substantially identical in structure and functionality as the calibration engine 606, the skin condition engine 608, the recommendation engine 612, and one or more data stores, such as the user data store 614, the product data store 616, the skin condition data store 618, and/or the calibration data store 620 of the mobile computing device 504 illustrated in FIG. 6.
  • FIG. 8 is a flowchart that illustrates a non-limiting example embodiment of a method 800 for calibrating images of a user according to an aspect of the present disclosure. It will be appreciated that the following method steps can be carried out in any order or at the same time, unless an order is set forth in an express manner or understood in view of the context of the various operation(s). Additional process steps can also be carried out. Of course, some of the method steps can be combined or omitted in example embodiments.
  • the method 800 proceeds to block 802, where calibrated images are generated by the mobile computing device 504 and/or the server computing system 508 with the aid of calibration data from the calibration device 506.
  • the user 502 can operate the calibration device 506 in some embodiments to generate data indicative of, for example, ambient lighting conditions.
  • the calibration device 506 may additionally or alternatively generate color temperature data of the user 502.
  • the user 502 can scan an area of interest (e.g., face) with a sweeping movement. This can occur, for example, during a face cleansing or make-up application/removal routine just prior to, contemporaneously with, or just after image capture by the mobile computing device 504.
  • the calibration device 506 records light meter data generated by the photosensor(s). If equipped, the calibration device 506 alternatively or additionally records color meter data of the user via an appropriate sensor. The light meter data and/or color meter data can then be transferred (wired or wirelessly) to the mobile computing device 504 and/or server computing system 508.
  • the calibration engine can calibrate either the camera 604 of the mobile computing device 504 or the images captured by the camera.
  • the mobile computing device 504 can receive the calibration data (e.g., light meter data, color meter data, etc.) from the calibration device 506 via any wired or wireless protocol and adjust the appropriate camera settings to calibrate the camera 604 prior to image capture.
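The setting adjustment in this step could look something like the following sketch. The target illuminance, the white-balance clamp range, and the returned setting names are all assumptions for illustration, not values from the disclosure:

```python
import math

def settings_from_calibration_data(lux, color_temp_k):
    """Derive illustrative camera adjustments from light-meter and
    color-meter data received from the calibration device."""
    TARGET_LUX = 500.0       # assumed target scene illuminance
    WB_RANGE = (2500, 7500)  # assumed supported white-balance range (kelvin)
    # Exposure compensation in stops: brighten dim scenes, darken bright ones.
    exposure_stops = math.log2(TARGET_LUX / max(lux, 1.0))
    # Clamp the measured color temperature to the camera's supported range.
    white_balance_k = min(max(color_temp_k, WB_RANGE[0]), WB_RANGE[1])
    return {
        "exposure_compensation": round(exposure_stops, 2),
        "white_balance_k": white_balance_k,
    }
```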
  • the mobile computing device can generate calibrated image(s) of an area of interest of the user 502.
  • the calibration engine can use the light meter data and/or color meter data obtained from the calibration device 506 to calibrate the images captured by the camera 604.
  • the images captured are of an area of interest to the user 502.
  • the area of interest can be one of the face, the neck, the arm, etc., for tracking skin conditions, such as moles, sun spots, acne, eczema, etc.
  • an attribute of the calibration device 506 can be used by the calibration engine to generate calibrated images.
  • the mobile computing device 504 captures at least one image of the user 502 in the presence of the calibration device 506.
  • the at least one image to be captured is of an area of interest to the user 502.
  • the area of interest can be one of the face, the neck, the arm, etc., for tracking skin conditions, such as lesions, moles, sun spots, acne, eczema, etc.
  • the user 502 can capture an image of themselves (a "selfie") holding the calibration device 506.
  • the calibration device 506 may include a color card, a color chip or other feature that can provide a reference for calibration purposes.
  • the calibration engine 606 can extract calibration data and can then generate a calibrated image.
  • the calibrated image is generated by adjusting the appropriate camera settings to calibrate the camera 604.
  • With the calibrated camera settings, the mobile computing device generates calibrated images.
  • the user interface engine captures an image to be used for calibration purposes.
  • the camera can be used to capture calibrated images for skin condition applications, facial recognition applications, etc.
  • a calibrated image is generated via image processing techniques by adjusting one or more image attributes (e.g., white balance, brightness, color values, etc.) of the image after image capture.
  • the calibration engine automatically detects a color reference (such as a color card) within the captured image and uses the color reference in order to correct the colors of the captured image for calibration purposes. For example, the calibration engine in some embodiments compares the image captured by the camera 604 to a reference image stored in the calibration data store 620, 720.
  • the reference image contains some or all of the color calibration data of the captured image.
  • the calibration device 506 may be, e.g., cosmetic packaging, product literature, an appliance handle, etc.
  • the calibration device 506 in the captured image may include a color card, a color chip, or other color reference, etc., to be compared to the reference image stored in calibration data store.
  • the color of the calibration device 506 has a known color value.
  • the calibration device 506 includes one or more colors with a known color value that can be retrieved from the calibration data store via indicia visibly associated with the calibration device 506.
  • the color reference detected by the calibration engine 606 within the captured image is indicia that can be used to retrieve the color value(s) from the calibration data store 620 in order to correct the colors of the captured image for calibration purposes. The calibrated images generated by the calibration engine are then stored in the user data store 614 of the mobile computing device 504 for subsequent retrieval.
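The indicia-based retrieval described here could be sketched like this. The store contents, key format (e.g., a QR code payload), and function name are hypothetical:

```python
# Hypothetical calibration data store keyed by indicia such as a QR code payload.
CALIBRATION_STORE = {
    "CAL-CARD-01": {"reference_rgb": (128, 128, 128)},
    "CAL-CARD-02": {"reference_rgb": (250, 240, 230)},
}

def reference_color_from_indicia(indicia):
    """Retrieve the known color value associated with indicia detected in
    the captured image, for use in correcting the image's colors."""
    entry = CALIBRATION_STORE.get(indicia)
    if entry is None:
        raise KeyError(f"no calibration data for indicia {indicia!r}")
    return entry["reference_rgb"]
```

The returned known color value would then feed a color-correction step comparing it against the color observed in the captured image.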
  • additional image processing (e.g., filtering, transforming, compressing, etc.) can also be performed on the captured images.
  • the captured images can be transferred to the server computing device 508 over the network 510 for storage at the user data store 714.
  • the calibrated images can be analyzed for any suitable application, including any of those set forth above.
  • the calibrated images can be analyzed to determine a skin condition of the area of interest.
  • the skin condition engine 608 of the mobile computing device 504 or the skin condition engine 708 of the server computing device 508 analyzes the calibrated images and determines, for example, acne, age spots, dry patches, etc., for each region of the area of interest. In doing so, the skin condition engine can access data from the skin condition data store 618, 718.
  • the example of the method 800 then proceeds to block 806, where an optional treatment protocol and/or product for each region of the area of interest is recommended based on the determined skin condition (e.g., acne, dry skin, age spots, etc.).
  • the recommendation engine 612 of the mobile computing device 504 or the recommendation engine 712 of the server computing device 508 recommends a treatment protocol and/or product for each region of the area of interest based on the determined skin condition(s). In doing so, data can be accessed from the product data store 616, 716. Different products and/or treatment protocols can be recommended for regions with different skin conditions. Any recommendation generated by the recommendation engine can be presented to the user in any fashion via the user interface engine on display 602. In some embodiments, the efficacy of the recommendation can be tracked, which can be used to train the recommendation engine and/or data stored in the product data store for improved recommendations in subsequent uses.
  • any processing accomplished at the mobile computing device 504 can be additionally or alternatively carried out at the server computing device 508.
  • the method 800 then proceeds to an end block and terminates.
  • the calibration device and/or mobile computing device could also include positional sensors and inertial measurement sensors for generating additional data to be used to calibrate the images.
  • FIG. 9 is a block diagram that illustrates aspects of an exemplary computing device 900 appropriate for use as a computing device of the present disclosure. While multiple different types of computing devices were discussed above, the exemplary computing device 900 describes various elements that are common to many different types of computing devices. While FIG. 9 is described with reference to a computing device that is implemented as a device on a network, the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other devices that may be used to implement portions of embodiments of the present disclosure. Moreover, those of ordinary skill in the art and others will recognize that the computing device 900 may be any one of any number of currently available or yet to be developed devices.
  • the computing device 900 includes at least one processor 902 and a system memory 904 connected by a communication bus 906.
  • the system memory 904 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or similar memory technology.
  • system memory 904 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 902.
  • the processor 902 may serve as a computational center of the computing device 900 by supporting the execution of instructions.
  • the computing device 900 may include a network interface 910 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure may access basic services that utilize the network interface 910 to perform communications using common network protocols.
  • the network interface 910 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WIFI, 2G, 3G, LTE, WiMAX, Bluetooth, Bluetooth low energy, and/or the like.
  • the network interface 910 illustrated in FIG. 9 may represent one or more wireless interfaces or physical communication interfaces described and illustrated above with respect to particular components of the computing device 900.
  • the computing device 900 also includes a storage medium 908.
  • the storage medium 908 depicted in FIG. 9 is represented with a dashed line to indicate that the storage medium 908 is optional.
  • the storage medium 908 may be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, and/or the like.
  • computer-readable medium includes volatile and non-volatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer readable instructions, data structures, program modules, or other data.
  • system memory 904 and storage medium 908 depicted in FIG. 9 are merely examples of computer-readable media.
  • FIG. 9 does not show some of the typical components of many computing devices.
  • the computing device 900 may include input devices, such as a keyboard, keypad, mouse, microphone, touch input device, touch screen, tablet, and/or the like. Such input devices may be coupled to the computing device 900 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, Bluetooth low energy, USB, or other suitable connection protocols.
  • the computing device 900 may also include output devices such as a display, speakers, printer, etc. Since these devices are well known in the art, they are not illustrated or described further herein.
  • the present application may reference quantities and numbers. Unless specifically stated, such quantities and numbers are not to be considered restrictive, but exemplary of the possible quantities or numbers associated with the present application. Further in this regard, the present application may use the term “plurality” to reference a quantity or number. In this regard, the term “plurality” is meant to be any number that is more than one, for example, two, three, four, five, etc. The terms “about,” “approximately,” “near,” etc., mean plus or minus 5% of the stated value.
  • the phrase "at least one of A, B, and C,” for example, means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C), including all further possible permutations when greater than three elements are listed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Examples of methodologies and technologies for determining changes in one or more skin conditions of a user over time are described herein. Any changes in skin conditions over time may be used as a diagnosis and/or treatment aid for a physician. Any changes in skin conditions over time may be also used in a computer implemented method that provides diagnosis and/or treatment recommendations. In some examples, improved image quality is attained via calibration techniques and methodologies. In these examples, improved image accuracy and quality is provided by addressing issues relating to unpredictable and inconsistent lighting conditions, among others. In an example, the system includes a mobile computing device and an object with known lighting and/or color attributes (e.g., a reference). Such an object acts as a calibration device for images to be captured by the mobile computing device.

Description

IMAGE PROCESS SYSTEMS FOR SKIN CONDITION DETECTION
CROSS-REFERENCE TO RELATED APPLICATIONS This application claims the benefit of U.S. Provisional Application No. 62/955159, filed December 30, 2019, and U.S. Provisional Application
No. 62/955128, filed December 30, 2019, the disclosures of which are incorporated herein in their entirety.
TECHNICAL FIELD Embodiments of the present disclosure relate to image processing. In some embodiments, image processing techniques are employed for skin condition diagnostics and/or treatment. In order to provide improved image processing, calibration techniques can be employed.
SUMMARY OF THE DISCLOSURE This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In accordance with an aspect of the present disclosure, a computer implemented method for determining changes in a skin condition of a subject is provided. In an embodiment, the method comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, wherein each image taken is separated in time by a time period; and determining one or more differences between the plurality of images.
In any embodiment, the method may further comprise generating an image map of the area of interest, the image map indicative of the differences between the plurality of images.
In any embodiment, the image map indicates changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest.
In any embodiment, the method may further comprise determining a skin condition based on the image map.
In any embodiment, the method may further comprise recommending one of a treatment or a product based on the determined skin condition.
In any embodiment, the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
In any embodiment, the time period is selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year.
In any embodiment, the method may further comprise notifying the user that a change has been detected if the difference detected is greater than a preselected threshold value.
In any embodiment, the user is notified via an email, a text message, or a banner notification on a user device.
In any embodiment, the method may further comprise determining the area of interest based on at least one of the captured images.
In any embodiment, the area of interest includes a back, a face, an arm, a neck, a shoulder, or regions thereof.
In any embodiment, the method may further comprise determining a skin condition based on the determined differences; and recommending one of a treatment or a product based on the determined skin condition.
In any embodiment, the plurality of images are obtained by a camera, the camera including one or more adjustable settings, and wherein the method further comprises: automatically calibrating the camera by adjusting one or more of the camera settings based on an associated calibration device.
In any embodiment, the method may further comprise calibrating the captured images based on calibration data presented in the one or more of the captured images.
In any embodiment, the calibration data is provided by a calibration device captured in the images.
In any embodiment, the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
In any embodiment, the attribute is a color.
In any embodiment, the attribute is indicia indicative of a color, and wherein the method further comprises obtaining calibration data based on the indicia by accessing a data store linked to the indicia.
In any embodiment, the indicia includes a QR code or a bar code.
In any embodiment, the method may further comprise obtaining calibration data from a calibration device captured by the image.
In any embodiment, the calibration device is a cosmetics apparatus or packaging associated therewith.
In accordance with another aspect of the disclosure, a system for determining changes in a skin condition of a subject is provided. In an embodiment, the system includes a camera configured to capture one or more images; and one or more processing engines including circuitry configured to: cause the camera to capture one or more images of an area of interest associated with the subject, the one or more images taken sequentially over time so as to obtain a plurality of images separated in time by a time period selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year; determine one or more differences between the captured images, the differences indicative of changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest; and determine a skin condition based on the determined differences or identify the object for subsequent analysis if the differences are greater than a preselected threshold.
In any embodiment, the one or more processing engines include circuitry configured to: determine the skin condition based on the determined differences; and recommend a treatment protocol or a product based on the determined skin condition.
In any embodiment, the one or more processing engines include circuitry configured to determine changes in one or more of: size, shape, color, uniformity of an existing lesion, detect new lesions, detect the absence of previously detected lesion(s), or detect a progression of a lesion.
In any embodiment, the one or more processing engines include circuitry configured to: detect a progression of a lesion from the detected differences in the plurality of images; and determine one or more stages of the lesion based on the detected progression of the lesion.
In any embodiment, the one or more processing engines includes: a user interface engine including circuitry configured to cause the camera to capture the plurality of images; an image analysis engine including circuitry for comparing two or more images using a similar/difference algorithm to determine one or more differences between the images; and a skin condition engine including circuitry configured for analyzing an image map of the determined one or more differences to locate a lesion, and for determining the stage of the lesion located in the image map.
In any embodiment, the one or more processing engines further includes: a recommendation engine including circuitry configured to recommend a treatment protocol and/or product for each region based at least on the determined skin condition.
In any embodiment, the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
In any embodiment, the system may further comprise a calibration engine including circuitry configured to calibrate the camera prior to image capture for generating calibrated images or to calibrate the images captured by the camera for generating calibrated images, the calibration engine obtaining calibration data from a calibration device.
In any embodiment, the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference.
In any embodiment, the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
In any embodiment, the attribute is a reference color.
In any embodiment, the attribute is indicia indicative of a color or color data, wherein the calibration engine is configured to retrieve the calibration data based on the indicia by accessing a data store linked to the indicia.
In any embodiment, the indicia include a QR code or a bar code.
In accordance with another aspect of the present disclosure, a computer- implemented method for determining changes in a skin condition of a subject is provided. In an embodiment, the method comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over a time with each taken image separated in time by a time period; determining a skin condition based on at least the plurality of images; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
In any embodiment, the obtaining, by a first computing device, a plurality of images of an area of interest associated with the subject includes capturing, by a camera of the first computing device, the plurality of images.
In any embodiment, the determining a skin condition based on at least the plurality of images or the determining at least one product recommendation based on at least the determined skin condition is carried out by a second computing device remote from the first computing device.
In any embodiment, the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
In accordance with another aspect of the present disclosure, a computer-implemented method is provided for determining changes in a skin condition of a subject. In an embodiment, the method comprises obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, with each taken image separated in time by a time period; calibrating the plurality of images based on calibration data obtained via the images; determining a skin condition based on at least the plurality of images calibrated; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
In accordance with another aspect of the present disclosure, a computer-implemented method is provided for accurate skin diagnosis. In an embodiment, the method includes calibrating, by a computing device, one or more images of an area of interest associated with a subject; and determining a skin condition based on the one or more calibrated images.
In any embodiment, the one or more images includes a plurality of images taken sequentially over a period of time, and wherein the determining a skin condition is based on the plurality of images.
In any embodiment, the method may further comprise generating a treatment protocol and/or product recommendation for an area of interest of the subject based on the determined skin condition.
In any embodiment, the calibrating, by a computing device, one or more images of an area of interest associated with a subject includes obtaining calibration data from a calibration device; calibrating a camera based on the calibration data; and capturing the one or more images of a user with the calibrated camera. In any embodiment, the method may further comprise generating, via the calibration device, light meter data or color temperature data of the subject; receiving the calibration data from the calibration device; and adjusting one or more camera settings for calibrating the camera prior to image capture.
In any embodiment, the calibrating, by a computing device, one or more images of an area of interest associated with a subject includes capturing the one or more images via a camera associated with the computing device; obtaining calibration data from a calibration device associated with the one or more images captured by the camera; and calibrating the one or more images captured by the camera based on the calibration data.
In any embodiment, the method may further comprise generating, via the calibration device, light meter data or color temperature data of the subject; receiving the light meter data and/or color meter data from the calibration device, and using the light meter data and/or color meter data obtained from the calibration device to calibrate the captured images.
In any embodiment, the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference, the method further comprising capturing at least one image of the subject in the presence of the calibration device.
In any embodiment, the calibration device is a cosmetics apparatus.
In accordance with another aspect of the present disclosure, a method is provided. In an embodiment, the method includes obtaining calibration data from a calibration device; and generating, by a computing device, calibrated images by one of: calibrating a camera of a mobile computing device based on the calibration data and capturing one or more images of a user with the calibrated camera; or calibrating one or more images captured with the camera based on the calibration data.
In any embodiment, the method may further comprise determining a skin condition based on the one or more calibrated images. In any embodiment, the method may further comprise recommending one or more of: a skin treatment protocol; and a product configured to treat the skin condition.
In any embodiment, the method may further comprise operating the calibration device to generate data indicative of light meter data or color temperature data of the subject.
In any embodiment, the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data as calibration data.
In any embodiment, the images captured include an area of interest associated with the user, the images of the area of interest usable for tracking skin conditions of the subject.
In any embodiment, the method may further comprise capturing the one or more images of the subject in the presence of the calibration device.
In any embodiment, the calibration device includes calibration data represented by one selected from the group consisting of: a color card, a color chip and a calibration reference.
In any embodiment, the method may further comprise generating the calibrated image by adjusting one or more camera settings based on the calibration data.
In any embodiment, the calibration engine automatically detects a color reference associated with the captured image and uses the color reference in order to correct the colors of the captured image to generate calibrated images.
In any embodiment, the method may further comprise comparing the image captured to a reference image stored in a calibration data store.
In accordance with another aspect of the present disclosure, a computer system is provided. In an embodiment, the system includes a user interface engine including circuitry configured to cause an image capture device to capture images of the user; a calibration engine including circuitry configured to calibrate the image capture device prior to image capture for generating calibrated images or to calibrate the images captured by the image capture device for generating calibrated images, the calibration engine obtaining calibration data from a calibration device; and a skin condition engine configured to determine a skin condition of the user based on the generated calibrated images.
In any embodiment, the system may further comprise a recommendation engine including circuitry configured to recommend a treatment protocol or a product based at least on the determined skin condition.
In any embodiment, the calibration device includes one or more sensors configured to generate data indicative of calibration data, and wherein the calibration engine is configured to receive the calibration data and adjust one or more suitable camera settings for calibrating the camera prior to image capture.
In any embodiment, the calibration device includes an attribute suitable for use by the calibration engine to generate the calibrated images.
In any embodiment, the attribute is a color or indicia indicative of a color, the calibration engine configured to obtain calibration data based on the indicia.
In any embodiment, the calibration engine includes circuitry configured to obtain the calibration data from the image captured by the image capture device, the image captured including an image of the calibration device.
In any embodiment, the calibration device is a cosmetics apparatus or packaging associated therewith.
In any embodiment, the calibration engine is configured to: automatically detect a color reference associated with the captured image; and use the color reference in order to correct the colors of the captured image to generate calibrated images.
In any embodiment, the calibration device includes one selected from the group consisting of: a color card, a color chip and a calibration reference.
In any embodiment, the calibration engine includes circuitry configured to extract calibration data from the captured image and generate the calibrated images.
In any embodiment, the calibration engine includes circuitry configured to adjust one or more image attributes of the image after image capture for calibrating the image. In any embodiment, the image attributes include white balance, brightness, and/or color values.
In any embodiment, the user interface engine includes circuitry configured for creating a subject profile and storing the subject profile in a subject data store.
In any embodiment, the subject profile includes previously provided recommendations, biographical data of the subject, area(s) of interest of the subject, current medicaments used, cosmetic products used, and/or treatment protocols used.
In any embodiment, the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data.
In any embodiment, the generated light meter data and/or color meter data is used by the calibration engine to calibrate either the image capture device or the images captured by the image capture device.
In any embodiment, the calibration engine is configured to receive the calibration data from the calibration device and adjust one or more camera settings for calibrating the camera prior to image capture.
In any embodiment, the calibration engine is configured to use the light meter data and/or color meter data obtained from the calibration device to calibrate the images captured by the image capture device.
In any embodiment, the captured images are calibrated by adjustment to one or more settings of the image capture device based on the calibration data.
In any embodiment, the calibration engine includes circuitry configured to compare the images captured to one or more reference images stored in the calibration data store to determine color calibration data for the captured images.
In any embodiment, the calibration device includes one or more colors with a known color value, and wherein the calibration engine includes circuitry configured to retrieve from a calibration data store color data associated with the known color value.
In accordance with another aspect of the present disclosure, a computing device is provided. In an embodiment, the computing device is configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine at least one product recommendation based on the at least one difference between the first image and the second image; and provide the at least one product recommendation to the subject.
In accordance with another aspect of the present disclosure, a computing device is provided. In an embodiment, the computing device is configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine if the difference between the first image and the second image is greater than a preselected value; identify the area of the images in which the differences are greater than the preselected value; and recommend review of the area identified.
DESCRIPTION OF THE DRAWINGS
The foregoing aspects and many of the attendant advantages of disclosed subject matter will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
FIG. 1 is a schematic diagram that illustrates a non-limiting example of a system for detecting and/or diagnosing skin conditions of a user according to an aspect of the present disclosure;
FIG. 2 is a block diagram that illustrates a non-limiting example of a mobile computing device according to an aspect of the present disclosure;
FIG. 3 is a block diagram that illustrates a non-limiting example of a server computing device according to an aspect of the present disclosure;
FIG. 4 is a flowchart that illustrates a non-limiting example of a method for detecting and/or diagnosing a skin condition according to an aspect of the present disclosure;

FIG. 5 is a schematic diagram that illustrates a non-limiting example of a system for calibrating images of a user according to an aspect of the present disclosure, the calibrated images being suitable for use in applications such as diagnosing skin conditions, facial recognition, cosmetic recommendations, etc., an example of which is set forth in FIG. 1;
FIG. 6 is a block diagram that illustrates a non-limiting example of a mobile computing device according to various aspects of the present disclosure;
FIG. 7 is a block diagram that illustrates a non-limiting example of a server computing device according to an aspect of the present disclosure;
FIG. 8 is a flowchart that illustrates a non-limiting example of a method for generating calibrated images according to an aspect of the present disclosure; and
FIG. 9 is a block diagram that illustrates a non-limiting example of a computing device appropriate for use as a computing device with embodiments of the present disclosure.
DETAILED DESCRIPTION
Examples of methodologies and technologies for determining changes in one or more skin conditions of a user over time are described herein. Any changes in skin conditions over time may be used as a diagnosis and/or treatment aid for a physician. Any changes in skin conditions over time may be also used in a computer implemented method that provides diagnosis and/or treatment recommendations.
In employing some of the methodologies and technologies for determining changes in one or more skin conditions of a user over time, improved image capture methodologies and technologies can be used. Of course, these examples of methodologies and technologies for improved image capture are non-limiting. For example, the methodologies and technologies for improved image capture can be additionally or alternatively used in other applications, such as product selection, facial recognition, etc., or any application that would benefit by improved image capture.
In the following description, numerous specific details are set forth to provide a thorough understanding of the examples. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to "one example" or "one embodiment" or similar terminology means that a particular feature, structure, or characteristic described in connection with the example is included in at least one example of the present disclosure. Thus, the appearances of the phrases "in one example" or "in one embodiment," for example, in various places throughout this specification are not necessarily all referring to the same example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more examples.
The disclosed subject matter provides examples of systems and methods for detecting a skin condition, such as acne, by looking at multiple images of a user taken at different points in time (e.g., once a day for 1-2 weeks, once a day for a month, etc.) and using image processing techniques to detect changes of size, shape, color, uniformity, etc., of areas of the image to determine whether the changes represent characteristics (e.g., blemishes) caused by a skin condition (e.g., acne). For example, the images can be captured by a camera of the consumer product (e.g., mobile phone, tablet, etc.) and then transferred to a computer system that stores the images for subsequent access and analysis. In some examples, the computer system is part of the consumer product (e.g., mobile phone, tablet, etc.). After a number of images are collected, the computer system compares the images for detecting changes in the images over time (e.g., from the earliest image to the latest image). If any changes are detected, skin condition analysis can be carried out in some embodiments to determine how many acne blemishes exist, how severe the user's acne is, what stage of acne each blemish is in, etc.
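The change-detection step described above can be illustrated with a minimal sketch. This sketch assumes already-aligned, same-size captures; the function name and threshold value are illustrative only and are not part of the disclosure:

```python
import numpy as np

def changed_regions(earlier, later, threshold=30):
    """Return a boolean mask of pixels that changed noticeably between
    two aligned images (HxWx3 uint8 arrays) of the same area of interest.

    threshold is the per-pixel mean absolute channel difference (0-255)
    above which a pixel counts as changed (an illustrative value).
    """
    diff = np.abs(earlier.astype(np.int16) - later.astype(np.int16))
    return diff.mean(axis=2) > threshold

# Synthetic example: a 10x10 patch darkens between the two capture
# times, simulating a blemish appearing on otherwise unchanged skin.
t1 = np.full((100, 100, 3), 200, dtype=np.uint8)
t2 = t1.copy()
t2[40:50, 40:50] = 120
mask = changed_regions(t1, t2)
print(mask.sum())  # → 100 changed pixels
```

In practice the sequential images would first be registered so that static features (nose, moles, freckles, etc.) line up before differencing.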
With this information, the system and methods in some examples can recommend a treatment based on results of the skin condition analysis. The treatment recommendation can include one or more treatment protocols and may include, for example, one or more product recommendations. In some examples, the systems and methods can track the efficacy of the recommendation and can train the system for improved recommendations in subsequent uses.
In general, features on the face, for example, are static (e.g., location of nose, lips, chin, moles, freckles, etc.) relative to acne blemishes. Acne blemishes last anywhere from 5-10 days to months, and during this span the acne blemish follows an understood trajectory (e.g., blocked pore, black head, white head, papule, pustule, lesion, scar). Each stage of the blemish has unique colors and sizes relative to the other stages. By understanding the overall lifespan of the acne blemish and taking multiple, sequential images of the face (e.g., once a day, once a week, etc.), a skin condition (e.g., acne, etc.) map or profile can be generated.
For example, multiple images of an area of interest of the user taken over time can be analyzed via image processing techniques for determining changes in skin condition(s). If the changes to certain areas (e.g., pixel groups) of the images match, for example, the progression of a known skin condition (e.g., an acne blemish), the systems and methods in some examples identify groups of pixels as a blemish and can create an acne profile of the user associated with this area of interest. The profile may include, for example, assignment of an acne stage(s) to each blemish or sections thereof. This profile can then be matched to suggested products and treatment protocols to address the skin condition. While the face is described in some embodiments, other body locations of the user can be monitored, such as the back, the chest, arms, etc. Of course, multiple areas of interest can be analyzed, and an acne profile can be generated for each area of interest.
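As a sketch of how a blemish might be assigned a stage by its color and size, a nearest-prototype rule can be used. The prototype colors and radii below are placeholders for illustration only; a real system would derive them from labeled clinical images:

```python
import numpy as np

# Illustrative stage prototypes: (mean RGB color, typical radius in px).
# These values are invented for demonstration, not taken from the disclosure.
STAGE_PROTOTYPES = {
    "blackhead": (np.array([60, 50, 45]), 2),
    "whitehead": (np.array([220, 210, 190]), 3),
    "papule": (np.array([190, 90, 80]), 5),
    "pustule": (np.array([210, 160, 120]), 6),
}

def classify_stage(mean_color, radius):
    """Assign a blemish to the nearest stage prototype using a simple
    combined color/size distance (an illustrative metric)."""
    def dist(proto):
        color, size = proto
        return np.linalg.norm(mean_color - color) + 10 * abs(radius - size)
    return min(STAGE_PROTOTYPES, key=lambda s: dist(STAGE_PROTOTYPES[s]))

# A reddish blemish about 5 px in radius classifies as a papule.
print(classify_stage(np.array([195, 95, 85]), 5))  # → papule
```

Applying such a rule to each detected pixel group over successive captures yields a per-blemish stage assignment of the kind the profile described above would contain.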
In other examples, the system and methods again capture images of an area of interest (e.g., the back) taken at different points in time. In these examples, the time period is extended (e.g., every 6 months, every year). The images are then transferred to a computer system that stores the images for subsequent access and analysis. In some examples, the computer system is part of the image capture device (e.g., mobile phone, tablet, etc.).
After a number of images are collected over time, the computer system can compare the images to identify, for example, new lesions (e.g., moles, sun spots, aging spots, etc.) that did not exist before, or flag lesions that underwent a change (e.g., in size, shape, color, uniformity, etc.) greater than a predetermined threshold (e.g., a 2-5% change). With the computer system, suspicious lesions can be identified and flagged for closer examination by a dermatologist or by other methods. With the lesions identified by the system, the dermatologist will be better able to identify and focus on the most concerning lesions.

Examples of the present disclosure also relate to systems and methods for generating more accurate image(s) of a user via a camera of a consumer product (e.g., mobile phone, tablet, laptop, etc.) for subsequent use in, for example, computer-implemented applications, such as skin diagnosis, including the examples briefly described above, or for facial recognition or cosmetic simulation, selection and/or recommendation, etc. Examples of the systems and methods improve image accuracy and quality by addressing issues relating to unpredictable and inconsistent lighting conditions, among others. In an example, the system includes a mobile computing device and an object with known lighting and/or color attributes (e.g., a reference). Such an object acts as a calibration device for images to be captured by the mobile computing device.
In some examples, the calibration device can provide light or color meter data, color card data, or other calibration data to the mobile computing device. By accessing or receiving calibration data from the calibration device, the mobile computing device can generate calibrated images to compensate for non-uniform lighting conditions, for example. In some embodiments, the calibration data can be used prior to image capture for camera setting(s) adjustment. In other embodiments, the calibration data can be alternatively used after image capture for calibrating the images when the captured images are processed for storage.
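One minimal form of such post-capture calibration is a per-channel gain computed from a reference patch of known color. The function and values here are an illustrative sketch; production systems typically fit a full color-correction matrix rather than a simple gain:

```python
import numpy as np

def calibrate_image(image, measured_ref, known_ref):
    """Color-correct a captured image using a calibration reference.

    measured_ref: mean RGB of the reference patch as it appears in the
        captured image.
    known_ref: the patch's true RGB value (e.g., from a color card's
        published values).
    """
    gain = np.asarray(known_ref, dtype=float) / np.asarray(measured_ref, dtype=float)
    corrected = image.astype(float) * gain  # per-channel gain correction
    return np.clip(np.rint(corrected), 0, 255).astype(np.uint8)

# Example: a warm color cast makes a neutral gray card (128, 128, 128)
# read as (150, 128, 110); the per-channel gain restores neutrality.
img = np.full((4, 4, 3), (150, 128, 110), dtype=np.uint8)
print(calibrate_image(img, (150, 128, 110), (128, 128, 128))[0, 0])  # → [128 128 128]
```

The same gain derived from the reference patch is applied to the subject's skin pixels, compensating for the non-uniform lighting described above.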
Accordingly, examples of the systems and methods provide an extremely powerful tool that can be deployed on a simple consumer product, such as a smart phone, tablet, etc., with optional cloud or server storage systems for assisting dermatologists in identifying potential problems, such as cancer. And since the systems and methods can be deployed in consumer products owned by or accessible to most users, these systems and methods can be utilized to assist the user in tracking the changes over time (e.g., reduction) of individual lesions (blemishes, acne lesions, dark spots, etc.) to demonstrate the effectiveness of their cosmetic interventions and to provide encouragement to continue such treatment by demonstrating the actual changes over time. If such treatment is shown by the systems and methods of the present disclosure to be ineffective, the user is able to change treatment protocols sooner than without such tools.
In some examples, the methodologies and technologies are carried out by a computing system that includes, for example, a handheld smart device (e.g., a smart phone, tablet, laptop, game console, etc.) with a camera and memory. An optional cloud data store can be accessed by the system for storage of images of the user at different time points with appropriate metadata (e.g., date, user ID, user annotations, etc.). The computing system also includes one or more image processing algorithms or engines that are either local to the handheld smart device or remote to the handheld smart device (e.g., server/cloud system) for analyzing the captured images.
In some embodiments, an image processing algorithm or engine compares and interprets the gross changes of lesions over time to determine and flag (e.g., identify, highlight, mark, etc.) a subset of lesions that are categorized as "suspicious." In other embodiments, an image processing algorithm or engine compares and interprets the changes of lesions over time for generating a skin condition profile (e.g., acne profile). A user interface can be presented by the handheld smart device to aid the user in image capture, image storage, and access to previously stored images, to support interaction with the analysis engines, and to notify the user of and/or display any lesions flagged as suspicious by the system.
In some examples, some methodologies and technologies of the disclosure are provided to a user as a computer application (i.e., an "App") through a mobile computing device, such as a smart phone, a tablet, a wearable computing device, or other computing devices that are mobile and are configured to provide an App to a user. In other examples, the methodologies and technologies of the disclosure may be provided to a user on a computer device by way of a network, through the Internet, or directly through hardware configured to provide the methodologies and technologies to a user.
FIG. 1 is a schematic diagram that illustrates a non-limiting embodiment of a system for detecting changes in the skin condition of a user according to an aspect of the present disclosure. In the system 100, a user 102 interacts with a mobile computing device 104. The mobile computing device 104 may be used to capture one or more images of the user 102, from which at least one skin condition, such as acne, eczema, or psoriasis, or a suspicious lesion, can be diagnosed. As will be described in more detail below, the mobile computing device 104 can be used to capture one or more image(s) of the user's area of interest (e.g., back, face, neck, etc.) at different points in time (e.g., once a week, once a month, once every six months, once a year, etc.).
In some embodiments, the mobile computing device 104 is used to process the collected images in order to determine changes of the area of interest over a selected period of time. The selected period of time can be, for example, one week, one month, one year, etc. In some embodiments, the results of the processed images can then be used for diagnostic purposes by a physician. For example, the results of the processed images may indicate a suspicious lesion. The physician can then use the results to determine whether a biopsy or other further analysis should be made.
In some other embodiments, the mobile computing device 104 analyzes the changes reflected in the processed images for determining skin conditions associated with the area of interest. With this skin condition information, the mobile computing device may also be used for determining a product recommendation, treatment protocol, etc., to be presented to the user 102. The efficacy of the treatment protocol, product usage, etc., may then be tracked with subsequent image capture and analysis by the mobile computing device 104.
As will be described in more detail below, some of the functionality of the mobile computing device 104 can be additionally or alternatively carried out at an optional server computing device 108. For example, the mobile computing device 104 in some embodiments transmits the captured images to the server computing device 108 via a network 110 for image processing and/or storage. In some embodiments, the network 110 may include any suitable wireless communication technology (including but not limited to Wi-Fi, WiMAX, Bluetooth, 2G, 3G, 4G, 5G, and LTE), wired communication technology (including but not limited to Ethernet, USB, and FireWire), or combinations thereof.
FIG. 2 is a block diagram that illustrates a non-limiting example embodiment of a system that includes a mobile computing device 104 according to an aspect of the present disclosure. The mobile computing device 104 is configured to collect information from a user 102 in the form of images of an area of interest. The area of interest can be a specific body part of the user, such as the back, face, arm, neck, etc., or can be region(s) thereof, such as the forehead, chin, or nose of the face, the shoulder, dorsum, or lumbus of the back, etc.
In some embodiments, the mobile computing device 104 may be a smartphone. In some embodiments, the mobile computing device 104 may be any other type of computing device having the illustrated components, including but not limited to a tablet computing device or a laptop computing device. In some embodiments, the mobile computing device 104 may not be mobile, but may instead be a stationary computing device such as a desktop computing device or computer kiosk. In some embodiments, the illustrated components of the mobile computing device 104 may be within a single housing. In some embodiments, the illustrated components of the mobile computing device 104 may be in separate housings that are communicatively coupled through wired or wireless connections (such as a laptop computing device with an external camera connected via a USB cable). The mobile computing device 104 also includes other components that are not illustrated, including but not limited to one or more processors, a non-transitory computer-readable medium, a power source, and one or more communication interfaces.
As shown, the mobile computing device 104 includes a display device 202, a camera 204, an image analysis engine 206, a skin condition engine 208, a user interface engine 210, a recommendation engine 212, and one or more data stores, such as a user data store 214, a product data store 216 and/or skin condition data store 218. Each of these components will be described in turn.
In some embodiments, the display device 202 is an LED display, an OLED display, or another type of display for presenting a user interface. In some embodiments, the display device 202 may be combined with or include a touch-sensitive layer, such that a user 102 may interact with a user interface presented on the display device 202 by touching the display. In some embodiments, a separate user interface device, including but not limited to a mouse, a keyboard, or a stylus, may be used to interact with a user interface presented on the display device 202.
In some embodiments, the user interface engine 210 is configured to present a user interface on the display device 202. In some embodiments, the user interface engine 210 may be configured to use the camera 204 to capture images of the user 102. Of course, a separate image capture engine may also be employed to carry out at least some of the functionality of the user interface engine 210. The user interface presented on the display device 202 can aid the user in capturing images, storing the captured images, accessing the previously stored images, interacting with the other engines, etc. The user interface presented on the display device 202 can also present one or more lesions that were flagged as suspicious by the system, and/or can present a treatment protocol to the user 102 with or without product recommendations. In some embodiments, the user interface engine 210 may also be configured to create a user profile. Information in the user profile may be stored in a data store, such as the user data store 214. Data generated and/or gathered by the system 100 (e.g., images, analysis data, statistical data, user activity data, or other data) may also be stored in the user data store 214 from each session in which the user 102 utilizes the system 100. The user profile information may therefore incorporate information the user provides to the system through an input means, such as a keyboard, a touchscreen, or any other input means. The user profile may further incorporate information generated or gathered by the system 100, such as statistical results and recommendations, and may include information gathered from social network sites, such as Facebook™, Instagram, etc.
The user may input information such as the user's name, the user's email address, social network information pertaining to the user, the user's age, user's area of interest, and any medications, topical creams or ointments, cosmetic products, treatment protocol, etc., currently used by the user, previously recommended treatments and/or products, etc.
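A user profile of the kind described above might be represented by a simple record type. The field names below are assumptions chosen for illustration and do not appear in the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubjectProfile:
    """Illustrative sketch of a user profile record; the field names
    are assumptions for demonstration, not part of the disclosure."""
    name: str
    email: str
    age: int
    areas_of_interest: List[str] = field(default_factory=list)
    current_products: List[str] = field(default_factory=list)
    prior_recommendations: List[str] = field(default_factory=list)

profile = SubjectProfile(name="Jane", email="jane@example.com", age=30,
                         areas_of_interest=["face", "back"])
print(profile.areas_of_interest)  # → ['face', 'back']
```

Such a record would be persisted in the user data store 214 and updated as new recommendations and treatment protocols are generated for the user.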
In some embodiments, the camera 204 is any suitable type of digital camera that is used by the mobile computing device 104. In some embodiments, the mobile computing device 104 may include more than one camera 204, such as a front-facing camera and a rear-facing camera. Generally herein, any reference to images being utilized by embodiments of the present disclosure should be understood to reference video, one or more images, or both, as the present disclosure is operable to utilize video, one or more images, or both in the methods and systems described herein.
In some embodiments, the mobile computing device 104 may use an image capture engine (not shown) to capture images of the user. In some embodiments, the image capture engine is part of the user interface engine 210. In an embodiment, the image capture engine is configured to capture one or more images of an area of interest. The area of interest can be, for example, the back, the face, the neck, or the chest of the user 102, or sections thereof. The images can be captured by the user 102 as a "selfie," or the mobile computing device 104 can be used by a third party for capturing images of a user 102. In some embodiments, the image capture engine timestamps the captured image(s) and stores the images according to the user profile with other data, such as flash/camera settings. The image capture engine may also send the images with the associated information to the server computer device 108 for storage, optional processing, and subsequent retrieval, as will be described in more detail below.
In some embodiments, the image analysis engine 206 is configured to compare two or more images. The image analysis engine 206 checks the timestamps of the images and runs a similar/difference algorithm or image processing routine. In some embodiments, the similar/difference algorithm determines or detects changes in size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc. In some embodiments, the image analysis engine 206 compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious." The lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., by an amount greater than a predetermined threshold. This subset of lesions can be highlighted on the image, represented in a skin condition map or profile, etc. In some embodiments, the image analysis engine 206 can identify the changes in the images as acne blemishes, which can also be highlighted on the image, represented in a skin condition map or profile, etc.
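By way of illustration, the comparison and flagging logic described above can be sketched as follows. The attribute names, measurements, and 3% threshold are hypothetical values chosen only for the example; the disclosure does not prescribe a particular representation or threshold.

```python
def percent_change(old, new):
    """Relative change of a lesion attribute between two timestamped images."""
    if old == 0:
        return float("inf") if new else 0.0
    return abs(new - old) / old * 100.0


def flag_suspicious(prev, curr, threshold_pct=3.0):
    """Flag lesions whose measured attributes changed by more than the threshold.

    prev/curr map a lesion id to a dict of measured attributes (size, color, etc.).
    New lesions and lesions absent from the later image are always flagged.
    """
    suspicious = set()
    for lesion_id in prev.keys() | curr.keys():
        if lesion_id not in prev or lesion_id not in curr:
            suspicious.add(lesion_id)  # new lesion, or previously detected lesion now absent
            continue
        for attr, old_val in prev[lesion_id].items():
            if percent_change(old_val, curr[lesion_id].get(attr, old_val)) > threshold_pct:
                suspicious.add(lesion_id)
                break
    return suspicious
```

The flagged subset returned here corresponds to the lesions that would be highlighted on the image or represented in a skin condition map or profile.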
In some embodiments, the skin condition engine 208 is configured to analyze, for example, the skin condition map or profile, and can determine, for example, the stages of acne for each region of the image. In doing so, the skin condition engine 208 can access data from the skin condition data store 218. In some embodiments, the skin condition engine 208 identifies a progression of a skin condition, such as acne (e.g., determined from an analysis of the images). If the changes to certain areas (e.g., pixel groups) of the images match, for example, the progression of a known skin condition (e.g., an acne blemish) accessed from the skin condition data store 218, the skin condition engine 208 can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.). Of course, some of the functionality of the skin condition engine 208 can be shared or carried out by the image analysis engine 206, and vice versa.
With the results of the analysis, the recommendation engine 212 in some embodiments is configured to recommend a treatment protocol and/or product (e.g., a topical formula, such as an ointment, cream, lotion, etc.) for each region based at least on the determined skin condition (e.g., stage of acne, etc.). In doing so, the recommendation engine 212 can access data from the product data store 216 and/or the user data store 214. Any recommendation generated by the recommendation engine 212 can be presented to the user in any fashion via the user interface engine 210 on display 202.
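A per-region recommendation lookup of the kind described above can be sketched as a simple mapping from the determined skin condition level to an entry in the product data store. The stage names, products, and protocols below are placeholders for illustration only, not actual recommendations:

```python
# Hypothetical product data store contents keyed by skin condition level.
PRODUCT_DATA_STORE = {
    "mild": {"product": "salicylic acid cleanser", "protocol": "apply daily"},
    "moderate": {"product": "benzoyl peroxide gel", "protocol": "apply twice daily"},
    "severe": {"product": "dermatologist referral", "protocol": "consult before treatment"},
}


def recommend(region_stages):
    """Map each analyzed region of the area of interest to a recommendation.

    region_stages maps a region name (e.g., "chin") to its determined stage.
    """
    return {region: PRODUCT_DATA_STORE[stage]
            for region, stage in region_stages.items()}
```

In practice, the recommendation engine could also weigh user-profile data (e.g., previously administered treatments) before selecting an entry.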
Further details about the actions performed by each of these components are provided below.
"Engine" refers to refers to logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASP, Microsoft .NET™, Go, and/or the like. An engine may be compiled into executable programs or written in interpreted programming languages. Software engines may be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines or can be divided into sub-engines. The engines can be stored in any type of computer-readable medium or computer storage device and be stored on and executed by one or more general purpose computers, thus creating a special purpose computer configured to provide the engine or the functionality thereof.
"Data store" refers to any suitable device configured to store data for access by a computing device. One example of a data store is a highly reliable, high-speed relational database management system (DBMS) executing on one or more computing devices and accessible over a high-speed network. Another example of a data store is a key-value store. However, any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries may be used, and the computing device may be accessible locally instead of over a network or may be provided as a cloud-based service. A data store may also include data stored in an organized manner on a computer-readable storage medium, such as a hard disk drive, a flash memory, RAM, ROM, or any other type of computer-readable storage medium. One of ordinary skill in the art will recognize that separate data stores described herein may be combined into a single data store, and/or a single data store described herein may be separated into multiple data stores, without departing from the scope of the present disclosure.
FIG. 3 is a block diagram that illustrates various components of a non-limiting example of an optional server computing device 108 according to an aspect of the present disclosure. In some embodiments, the server computing device 108 includes one or more computing devices that each include one or more processors, non-transitory computer-readable media, and network communication interfaces that are collectively configured to provide the components illustrated below. In some embodiments, the one or more computing devices that make up the server computing device 108 may be rack-mount computing devices, desktop computing devices, or computing devices of a cloud computing service.
In some embodiments, image processing and/or storage of the captured images can be additionally or alternatively carried out at an optional server computing device 108. In that regard, the server computing device 108 can receive captured and/or processed images from the mobile computing device 104 over the network 110 for processing and/or storage. As shown, the server computing device 108 optionally includes an image analysis engine 306, a skin condition engine 308, a recommendation engine 312, and one or more data stores, such as a user data store 314, a product data store 316, and/or a skin condition data store 318. It will be appreciated that the image analysis engine 306, skin condition engine 308, recommendation engine 312, user data store 314, product data store 316, and skin condition data store 318 are substantially identical in structure and functionality to the image analysis engine 206, skin condition engine 208, recommendation engine 212, user data store 214, product data store 216, and skin condition data store 218 of the mobile computing device 104 illustrated in FIG. 2.
FIG. 4 is a flowchart that illustrates a non-limiting example embodiment of a method 400 for determining changes in skin conditions of a user according to various aspects of the present disclosure. In some embodiments, the method 400 also analyzes the changes in skin conditions and optionally recommends a treatment protocol and/or product to treat the user 102. It will be appreciated that the following method steps can be carried out in any order or at the same time, unless an order is set forth in an express manner or understood in view of the context of the various operation(s). Additional process steps can also be carried out. Of course, some of the method steps can be combined or omitted in example embodiments.
From a start block, the method 400 proceeds to block 402, where a mobile computing device 104 captures image(s) of the user 102 at a time (Tn). In some embodiments, the mobile computing device 104 uses the camera 204 to capture at least one image. In some embodiments, more than one image with different lighting conditions may be captured in order to allow an accurate color determination to be generated. In some embodiments, the captured image is of an area of interest to the user 102. For example, the area of interest can be one of the face, the neck, the back, etc., for tracking lesions such as moles, sun spots, acne, eczema, etc., or for skin condition analysis.
The one or more images can be stored in the user data store 214 at the mobile computing device 104 and/or server computer 108. When stored, additional data collected at the time of image capture can be associated with the images. For example, each image is time stamped, and may include other information, such as camera settings, flash settings, etc., area of interest captured, etc.
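The per-image record described above (timestamp, camera settings, flash settings, area of interest) can be sketched as a simple data structure; the field names here are assumptions chosen for illustration, not a schema defined by the disclosure:

```python
from dataclasses import dataclass, field
import time


@dataclass
class CapturedImage:
    """Hypothetical record associated with a stored image in the user data store."""
    image_path: str
    area_of_interest: str                                # e.g., "face", "back", "neck"
    timestamp: float = field(default_factory=time.time)  # when the image was captured
    camera_settings: dict = field(default_factory=dict)  # white balance, exposure, etc.
    flash_used: bool = False
```

A record like this could be serialized and transmitted over the network 110 when images are stored at the server computer 108 instead of locally.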
For new users, the user interface engine 210 can be used to create a user profile, as described above. At the time of image capture, the user interface engine 210 may query the user to enter the intended location (e.g., back, face, arm, neck, etc.) so that the captured image can be associated with the user's area of interest. The area of interest can be a specific body part of the user, such as the back, face, arm, neck, etc., or can be regions thereof, such as the forehead, chin, or nose of the face, the shoulder, dorsum, or lumbus of the back, etc. If the user has more than one area of interest, the user interface engine 210 can be repeatedly used until all images are captured. The captured images are stored in the user data store 214. If stored at the server computer 108 in user data store 314, the mobile computing device 104 can transmit the images over the network 110.
Images of the same area of interest are then captured sequentially over a period of time (T1, T2, T3, ... Tn) at block 402. For example, the images can be captured daily, weekly, bi-weekly, monthly, bi-monthly, semi-annually, annually, etc.
Of course, the period of image capture can change during observation of the area of interest. For example, if an area of interest is flagged by the system and the user is notified, or if the user notices changes when reviewing one or more of the captured images, the frequency of image capture can be adjusted accordingly.
Next, at block 404, the images captured over a period of time are processed by the image analysis engine 206 of the mobile computing device 104 or the image analysis engine 306 of the server computing device 108. In that regard, the images collected over time are processed, for example, to detect differences or changes in the images by comparing each image to the other images. In some embodiments, the image analysis engine is initiated by user input (e.g., via user interface 210). In other embodiments, the image analysis engine may automatically analyze the images once the images are stored in user data store 214 and/or 314. If differences are determined, the image analysis engine is configured to notify the user. For example, if the determined differences are greater than a preset threshold value, the user is notified. Notification can be carried out via email, text message, banner notification via the user interface, etc., the preference of which can be set up in the user profile.
If the user does not enter the area of interest to be associated with the captured image, the image analysis engine can employ one or more image processing techniques to determine the area of interest of the user. In some embodiments, the image analysis engine may access information from a data store to assist in this determination. For example, the captured images may be compared to images with known static body (e.g., facial) features, such as the eyes, nose, and ears in order to determine the area of interest. In some embodiments, registration between captured images is performed to improve the analysis. This can be accomplished in some embodiments by referencing static body (e.g., facial) features present in each of the images to be analyzed. In some embodiments, one or more of these processes can be trained.
The example of the method 400 proceeds to block 406, where an image map is generated depicting changes to the area of interest over time. In some embodiments, the image analysis engine determines or detects changes in one or more of size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc. In some embodiments, the image analysis engine compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious." The lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., by an amount greater than a predetermined threshold (e.g., 1-3%, 2-4%, 3-5%, etc.). This subset of lesions can be represented in an image map in the form of a skin condition map or profile, etc. In some embodiments, the image analysis engine can identify the changes in the images as acne blemishes, or other skin conditions, which can also be represented in a skin condition map or profile, etc.
Next, at block 408, a skin condition of the area of interest is determined based on the skin condition map or profile. In some embodiments, the skin condition engine 208 of the mobile computing device 104 or the skin condition engine 308 of the server computing device 108 analyzes the skin condition map or profile and determines, for example, the stages of acne for each region of the area of interest. In doing so, the skin condition engine can access data from the skin condition data store 218, 318. In some embodiments, the skin condition engine identifies a progression of a skin condition, such as acne (determined from an analysis of the images). In other embodiments, this step can be carried out, at least in part, by the image analysis engine.
If the changes to certain areas (e.g., pixel groups) of the images match, for example, the progression of a known skin condition (e.g., an acne blemish) accessed from the skin condition data store, the skin condition engine (or optionally, the image analysis engine) can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.).
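The matching of an observed pixel-group change sequence against known progressions can be sketched as a nearest-template comparison. The progression templates, the per-capture redness metric, and the sum-of-squared-errors distance below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical progression templates from the skin condition data store:
# mean redness of a pixel group sampled at successive captures.
KNOWN_PROGRESSIONS = {
    "acne_stage_1": [0.10, 0.25, 0.40],
    "acne_stage_2": [0.30, 0.55, 0.70],
}


def match_progression(observed):
    """Return the known progression closest to the observed change sequence.

    Distance is the sum of squared errors between the observed values and
    each stored template; the best-matching template's label (e.g., an acne
    stage) is assigned to the pixel group.
    """
    def sse(template):
        return sum((o - t) ** 2 for o, t in zip(observed, template))

    return min(KNOWN_PROGRESSIONS, key=lambda name: sse(KNOWN_PROGRESSIONS[name]))
```

A real implementation would likely compare richer features (size, shape, color uniformity) and could be trained rather than hand-built, as the disclosure contemplates.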
The example of the method 400 then proceeds to block 410, where a treatment protocol and/or product are recommended for each region of the area of interest based on the determined skin condition (e.g., stage of acne, etc.). In doing so, data can be accessed from the product data store 216, 316, user data store 214, 314, etc. Different products and/or treatment protocols can be recommended for regions with different skin condition levels. Any recommendation generated by the recommendation engine can be presented to the user in any fashion via the user interface engine 210 on display 202. The recommendation can be saved in the user’s profile in user data store 214, 314. In some embodiments, previous recommendations and/or treatments administered by the user can be used in the product and/or treatment protocol recommendation. In some embodiments, the efficacy of the recommendation can be tracked, which can be used to train the recommendation engine and/or data stored in the product data store for improved recommendations in subsequent uses.
The method 400 then proceeds to an end block and terminates.
FIG. 5 is a schematic diagram that illustrates a non-limiting example of a system 500 for generating calibrated images of a user according to various aspects of the present disclosure. In some embodiments, the system 500 or aspects thereof may use the calibrated images for diagnosing a skin condition of a user, for example. In a certain embodiment, one or more aspects of the system 500 can be employed by the system 100 described above. For example, a calibration routine from system 500 can be carried out by the system 100 prior to area of interest image analysis. Once calibrated, the images captured by the camera can be employed by the system 100 in order to carry out the skin condition determination method 400 described above with reference to FIG. 4. Alternatively, system 500 can be configured to carry out the skin condition methodologies and technologies of system 100, as shown schematically in FIGS. 6 and 7.
In other embodiments, the system 500 can use the calibrated images for other applications, such as facial recognition applications, for example. In yet other embodiments, the calibrated images may be utilized for generating a recommendation for cosmetic products. For example, such cosmetic products may be for skin, anti-aging, face, nails, and hair, or any other beauty or health product. As a further example, products may include creams, cosmetics, nail polish, shampoo, conditioner, other hair products, vitamins, any health-related products of any nature, or any other product that offer results visible in a person's appearance, such as a person's skin, hair, nails or other aspects of a person's appearance. Examples of treatments may include diet treatments, physical fitness treatments, acupuncture treatments, acne treatments, appearance modification treatments, or any other treatment that offers results visible in a person's appearance.
For more information on suitable uses for the calibrated data, all of which are within the scope of and are embodiments of the disclosure, please see U.S. Patent No. 9,760,925, the disclosure of which is incorporated by reference in its entirety.
In the system 500, a user 502 interacts with a mobile computing device 504 and a calibration device 506. In one example, the mobile computing device 504 is used to capture one or more images of the user 502 in the presence of the calibration device 506. The calibration device 506 is associated with or generates calibration data, such as light meter data, color meter data, color card (e.g., color reference) data, etc. The calibration data is used by the system 500 to generate calibrated images via the mobile computing device 504, for example. In an example, the calibration device 506 can be used to calibrate the mobile computing device 504 (e.g., a camera of the mobile computing device) prior to image capture in order to generate calibrated images. In other embodiments, the calibration data can be used by the mobile computing device 504 after image capture for generating calibrated images. Because of the calibration data provided by the calibration device, the images can be either captured or processed in a way to obtain, for example, true colors of the user, regardless of the lighting conditions, etc., in which the image was taken.
In the embodiment shown, the calibration device 506 is a cosmetic, such as lipstick. In this embodiment, the cosmetic packaging includes one or more colors that can be used as a color calibration reference. In some embodiments, the color(s) is chosen from a list of colors from the Macbeth chart. Generally, the Macbeth chart comprises a number of colors with known color values. Other color reference systems can be also used. In some embodiments, the color(s) can be on the exterior of the cosmetic packaging or on a part thereof (e.g., cap, lid, etc.) that can be visible to the mobile computing device 504.
In some embodiments, the calibration data of the calibration device 506 can be associated with other material obtained at the point of sale, for example, the box or other container/packaging, the product literature, etc. In some embodiments, the associated material includes a color card, or parts thereof, for example. The color card can include colors, for example, of the Macbeth chart. In other embodiments, the associated material includes one or more colors and/or associated indicia. The associated indicia (e.g., QR code, bar code, symbol, etc.) can be used by the system to obtain, for example, the color value(s) of the one or more colors included in the associated material or of the cosmetic packaging for calibration purposes. In one example, the associated indicia can be linked to color value(s) in a calibration data store.
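Resolving associated indicia to stored color values can be sketched as a lookup against a calibration data store; the decoded indicia payload and the sRGB values below are hypothetical examples, not actual product data:

```python
# Hypothetical calibration data store: decoded indicia payload (e.g., from a
# QR code or bar code on the packaging) -> known sRGB values of the
# reference color patches associated with that product.
CALIBRATION_DATA_STORE = {
    "LIPSTICK-ROUGE-001": [(243, 243, 242), (200, 82, 97)],
}


def color_values_for(indicia):
    """Look up the reference color values linked to the scanned indicia."""
    values = CALIBRATION_DATA_STORE.get(indicia)
    if values is None:
        raise KeyError(f"no calibration entry for indicia {indicia!r}")
    return values
```

In a deployed system, this lookup could be served remotely (e.g., by the server computing device 508) so that updated color values follow the product packaging.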
In some embodiments, the calibration device 506 includes one or more sensors configured to generate color meter data, light meter data, etc. For example, the calibration device 506 in one embodiment includes one or more photosensors (e.g., photodiodes) configured to sense light conditions and generate light calibration data. Additionally or alternatively, the calibration device 506 in other embodiments includes one or more photosensors (e.g., filtered photodiodes) configured to sense color temperature and generate color calibration data. In other embodiments, the mobile device may include such sensors, and may be used to capture such calibration-affecting data.
Of course, the calibration device 506 can take many forms and serve many functions. For example, the calibration device 506 can be a cosmetic, such as a lipstick, eyeshadow, foundation, etc., a hair brush, a toothbrush, etc., or an appliance, such as a Clarisonic branded skin care appliance. In other embodiments, the only function of the calibration device 506 is to provide calibration data.
In some embodiments, the calibration device 506 is configured to transmit the calibration data to the mobile computing device 504. In some embodiments, the calibration device 506 can be coupled (e.g., wired or wirelessly) in data communication with the mobile computing device 504 according to any known or future developed protocol, such as universal serial bus, Bluetooth, WiFi, Infrared, ZigBee, etc. In an embodiment, the calibration device 506 includes a transmitter for transmitting the calibration data.
In some embodiments, once the calibration device 506 is turned on and in range of the mobile computing device 504, it automatically pairs and sends the calibration data to the mobile computing device 504. In other embodiments, the mobile computing device 504 pulls the calibration data from the calibration device 506 via a request or otherwise. In yet other embodiments, the mobile computing device 504 obtains the calibration data from a local data store or a remote data store, such as the calibration data store, based on the associated indicia of the calibration device 506.
As will be described in more detail below, the mobile computing device 504 in some embodiments can carry out a device calibration routine to adjust camera settings, such as white balance, brightness, contrast, exposure, aperture, flash, etc., based on the provided calibration data prior to image capture. As will be also described in more detail below, the calibration data can be also used after image capture in some embodiments. For example, an image captured along with calibration data can be adjusted via image processing. In one embodiment in which color data is obtained via a color reference, the image can be compared to a reference image that also contains the color reference with the same color value(s). From the comparison(s), various attributes of the image(s) can be adjusted to calibrate the image. In other embodiments, the calibration device includes associated indicia that can be used to retrieve color values of the calibration device. From the retrieved color value(s), various attributes of the image(s) can be adjusted to calibrate the captured image. In yet other embodiments, the calibration device can transmit light and/or color meter data to the mobile computing device. With the light and/or color meter data, calibrated images are generated by the mobile computing device from the captured images.
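The post-capture correction described above can be sketched minimally: per-channel gains are derived from a reference patch as observed in the image versus its known color value, then applied to every pixel. Working directly on 8-bit sRGB values (rather than in a linear color space) is a simplifying assumption for illustration:

```python
def channel_gains(observed_patch, known_patch):
    """Per-channel gains mapping the observed RGB patch to its known color value."""
    return tuple(k / o if o else 1.0 for o, k in zip(observed_patch, known_patch))


def apply_gains(pixels, gains):
    """Apply per-channel gains to an iterable of RGB pixels, clamping to 0-255."""
    return [tuple(min(255, round(c * g)) for c, g in zip(px, gains))
            for px in pixels]
```

For example, if the white reference patch appears reddish under warm room lighting, the red-channel gain comes out below 1.0 and the correction pulls the whole image back toward neutral, approximating the "true colors" the disclosure describes.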
As a result, the images captured and/or processed by the mobile computing device 504 would look the same whether the user has taken the photo in a dark room, a bright room, or a room with non-uniform and highly angled lighting. Thus, calibrated images are generated by the mobile computing device. This standardization process can lead to a reduction or elimination in the variability in the quality of images used for applications ranging from diagnosing skin conditions and/or cosmetic recommendations to facial recognition, for example.
As will be described in more detail below, some of the functionality of the mobile computing device 504 can be additionally or alternatively carried out at an optional server computing device 508. For example, the mobile computing device 504 in some embodiments transmits the captured images to the server computing device 508 via a network 510 for image processing (e.g., calibration, skin condition diagnosis, product recommendation, facial recognition, etc.) and/or storage. In some embodiments, the network 510 may include any suitable wireless communication technology (including but not limited to Wi-Fi, WiMAX, Bluetooth, 2G, 3G, 4G, 5G, and LTE), wired communication technology (including but not limited to Ethernet, USB, and FireWire), or combinations thereof.
For example, with the captured images received from the mobile computing device 504, the server computing device 508 may process the captured images for calibration purposes and/or store the calibrated images for subsequent retrieval. In other embodiments, calibrated images are transmitted to the server computing device 508 for storage and/or further processing, such as skin condition diagnosis, etc. In some embodiments, the server computing device 508 can serve calibration data to the mobile computing device 504 for local processing.
FIG. 6 is a block diagram that illustrates a non-limiting example of a mobile computing device 504 according to an aspect of the present disclosure. In some embodiments, the mobile computing device 504 may be a smartphone. In some embodiments, the mobile computing device 504 may be any other type of computing device having the illustrated components, including but not limited to a tablet computing device or a laptop computing device. In some embodiments, the mobile computing device 504 may not be mobile, but may instead be a stationary computing device such as a desktop computing device or computer kiosk. In some embodiments, the illustrated components of the mobile computing device 504 may be within a single housing. In some embodiments, the illustrated components of the mobile computing device 504 may be in separate housings that are communicatively coupled through wired or wireless connections (such as a laptop computing device with an external camera connected via a USB cable). The mobile computing device 504 also includes other components that are not illustrated, including but not limited to one or more processors, a non-transitory computer-readable medium, a power source, and one or more communication interfaces.
As shown, the mobile computing device 504 includes a display device 602, a camera 604, an optional calibration engine 606, a skin condition engine 608 (optional), a user interface engine 610, a recommendation engine 612 (optional), and one or more optional data stores, such as a user data store 614, a product data store 616 and/or skin condition data store 618, and a calibration data store 620. In an embodiment, the mobile computing device 504 further includes an image analysis engine 622. Each of these components will be described in turn.
In some embodiments, the display device 602 is an LED display, an OLED display, or another type of display for presenting a user interface. In some embodiments, the display device 602 may be combined with or include a touch-sensitive layer, such that a user 502 may interact with a user interface presented on the display device 602 by touching the display. In some embodiments, a separate user interface device, including but not limited to a mouse, a keyboard, or a stylus, may be used to interact with a user interface presented on the display device 602.
In some embodiments, the user interface engine 610 is configured to present a user interface on the display device 602. In some embodiments, the user interface engine 610 may be configured to use the camera 604 to capture images of the user 502. For example, the user 502 may take a "selfie" with the mobile computing device 504 via camera 604. Of course, a separate image capture engine may also be employed to carry out at least some of the functionality of the user interface engine 610. The user interface presented on the display device 602 can aid the user in capturing images, storing the captured images, accessing the previously stored images, interacting with the other engines, etc. In some embodiments, the camera 604 is any suitable type of digital camera that is used by the mobile computing device 504. In some embodiments, the mobile computing device 504 may include more than one camera 604, such as a front-facing camera and a rear-facing camera. In some embodiments, the camera 604 includes adjustable settings, such as white balance, brightness, contrast, exposure, aperture, and/or flash, etc. Generally herein, any reference to images being utilized by the present disclosure should be understood to reference video, images (one or more images), or video and images (one or more images), as the present disclosure is operable to utilize video, images (one or more images), or video and images (one or more images) in its methods and systems described herein.
In some embodiments, the calibration engine 606 is configured to calibrate the camera 604 of the mobile computing device 504 based on calibration data obtained from at least one of the calibration device 506 or the calibration data store 620. In some embodiments, the calibration engine 606 is configured to adjust the settings of the camera 604 prior to image capture. In other embodiments, instead of calibrating the camera 604 prior to image capture, the calibration engine 606 is configured to calibrate the images after image capture. For example, calibration data from the calibration device 506 can be used when processing the captured images prior to or during storage.
In some embodiments, the calibration engine 606 detects a color reference (such as a color card) within the captured image and uses the color reference in order to correct the colors of the captured image for calibration purposes. For example, the calibration engine 606 in some embodiments compares the image captured by the camera 604 to a reference image stored in the calibration data store 620. The reference image contains some or all of the color calibration data of the captured image. For example, the calibration device 506 (e.g., cosmetic packaging, product literature, appliance handle, etc.) in the captured image may include a color card, a color chip, or other color reference, etc., to be compared to the reference image stored in calibration data store 620.
In other embodiments, the color of the calibration device 506 has a known color value. In yet other embodiments, the calibration device 506 includes one or more colors with a known color value that can be retrieved from the calibration data store 620 via indicia visibly associated with the calibration device 506. In some embodiments, the color reference detected by the calibration engine 606 within the captured image is indicia that can be used to retrieve the color value(s) from the calibration data store 620 in order to correct the colors of the captured image for calibration purposes.
After calibration, the calibrated images are saved in a data store, such as user data store 614, and can be subsequently used for product selection (e.g., hair color, lipstick color, eye shadow color, etc.,), diagnosis, such as skin condition diagnosis, or for other purposes, such as facial recognition applications.
The mobile computing device 504 may be provided with other engines for increased functionality. For example, in the embodiment shown, the mobile computing device 504 includes a skin condition engine 608. The skin condition engine 608 is configured to analyze the calibrated images to determine one or more skin conditions (e.g., acne, eczema, psoriasis, etc.) of the user 502. The skin condition engine 608 may retrieve data from the skin condition data store 618 during the analysis. In some of these embodiments, a recommendation engine 612 may also be provided, which recommends a treatment protocol, products for treatment, etc., based on the results of the analysis carried out by the skin condition engine 608. In doing so, the recommendation engine 612 can access data from the product data store 616.
In some embodiments, the mobile computing device 504 includes an image analysis engine 622 as well as the skin condition engine 608. Image analysis engine 622 includes, among other things, the functionality of the image analysis engine 206 described above with reference to FIG. 2. In that regard, the image analysis engine 622 in some embodiments is configured to compare two or more images. The image analysis engine 622 checks the timestamps of the images and runs a similar/difference algorithm or image processing routine. In some embodiments, the similar/difference algorithm determines or detects changes in size, shape, color, uniformity, etc., of existing lesions (e.g., moles, acne, dark spots, etc.), detects new lesions, detects the absence of previously detected lesions, detects a progression of a lesion, etc. In some embodiments, image analysis engine 622 compares and interprets the gross changes of the lesions over time so as to decide and flag a subset of lesions as "suspicious." The lesions that are flagged as suspicious have changed in size, shape, color, uniformity, etc., by an amount greater than a predetermined threshold. This subset of lesions can be highlighted on the image, represented in a skin condition map or profile, etc. In some embodiments, the image analysis engine 622 can identify the changes in the images as acne blemishes, which can also be highlighted on the image, represented in a skin condition map or profile, etc.
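The threshold-based flagging performed by the image analysis engine can be illustrated with a minimal sketch. This example is hypothetical (the metric names, the 25% threshold, and the lesion identifiers are assumptions, not taken from the present disclosure): lesion measurements extracted from two time-stamped images are compared, and any lesion whose relative change in a tracked metric exceeds the threshold is flagged as suspicious.

```python
def flag_suspicious(before, after, threshold=0.25):
    """Return ids of lesions whose tracked metrics changed by more than
    `threshold` (relative change) between the earlier and later images."""
    suspicious = []
    for lesion_id, old in before.items():
        new = after.get(lesion_id)
        if new is None:
            continue  # lesion no longer detected; handled separately
        for metric in ("size_mm2", "color_delta"):
            old_v, new_v = old[metric], new[metric]
            if old_v and abs(new_v - old_v) / old_v > threshold:
                suspicious.append(lesion_id)
                break
    return suspicious

before = {"m1": {"size_mm2": 4.0, "color_delta": 1.0},
          "m2": {"size_mm2": 3.0, "color_delta": 1.0}}
after  = {"m1": {"size_mm2": 6.0, "color_delta": 1.1},   # grew 50%
          "m2": {"size_mm2": 3.1, "color_delta": 1.05}}  # within threshold
print(flag_suspicious(before, after))  # ['m1']
```

A full similar/difference algorithm would operate on registered pixel data rather than precomputed per-lesion metrics; the sketch only shows the thresholding step that separates "suspicious" lesions from ordinary variation.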
In some embodiments, the skin condition engine 608 is configured to analyze the skin condition map or profile generated by the image analysis engine 622, and can determine, for example, the stages of acne for each region of the image. In doing so, the skin condition engine 608 can access data from the skin condition data store 618. In some embodiments, the skin condition engine 608 identifies a progression of a skin condition, such as acne (determined from an analysis of the images). If the changes to certain areas (e.g., pixel groups) of the images match, for example, the progression of a known skin condition (e.g., an acne blemish) accessed from the skin condition data store 618, the skin condition engine 608 can identify these groups of pixels as a blemish and can assign the blemish a skin condition level (e.g., acne stage, etc.).
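The staging step can be sketched as follows. The stage boundaries below are hypothetical stand-ins for data that would be held in the skin condition data store 618, and the growth test is a deliberately simplified proxy for matching a known progression; neither is specified by the present disclosure.

```python
ACNE_STAGES = [  # (minimum blemish area in mm^2, stage label) - hypothetical
    (0.0, "stage 1"),
    (2.0, "stage 2"),
    (5.0, "stage 3"),
]

def assign_stage(area_mm2, stages=ACNE_STAGES):
    """Pick the highest stage whose minimum area the blemish meets."""
    label = stages[0][1]
    for minimum, name in stages:
        if area_mm2 >= minimum:
            label = name
    return label

def is_growing(areas):
    """A blemish whose measured area strictly increases across images
    is treated as matching a known progression."""
    return all(a < b for a, b in zip(areas, areas[1:]))

history = [1.0, 2.5, 5.5]  # measured areas across three time-stamped images
stage = assign_stage(history[-1]) if is_growing(history) else None
```

Here the pixel group's history matches the assumed progression, so it is identified as a blemish and assigned the stage corresponding to its most recent measurement.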
With the results of the analysis, the recommendation engine 612 is configured to recommend a treatment protocol and/or product for each region based on the determined skin condition (e.g., stage of acne, etc.) in some embodiments. In doing so, the recommendation engine 612 can access data from the product data store 616. Any recommendation generated by the recommendation engine 612 can be presented to the user in any fashion via the user interface engine 610 on display 602.
In other embodiments, a facial recognition engine (not shown) is provided, which is configured to determine the identity of, or another attribute of, the user. In yet other embodiments, a cosmetic recommendation engine (not shown) is provided, which can simulate product color, such as hair color, lipstick, etc., on the user for aid in product selection, product recommendation, etc. In some embodiments, the cosmetic recommendation engine is part of the recommendation engine 612 and can access data from the product data store 616. Any recommendation generated by the recommendation engine 612 can be presented to the user in any fashion via the user interface engine 610 on display 602.
Further details about the actions performed by each of these components are provided below.
FIG. 7 is a block diagram that illustrates various components of a non-limiting example of an optional server computing system 508 according to an aspect of the present disclosure. In some embodiments, the server computing system 508 includes one or more computing devices that each include one or more processors, non-transitory computer-readable media, and network communication interfaces that are collectively configured to provide the components illustrated below. In some embodiments, the one or more computing devices that make up the server computing system 508 may be rack-mount computing devices, desktop computing devices, or computing devices of a cloud computing service.
In some embodiments, image processing and/or storage of the captured images can be additionally or alternatively carried out at an optional server computing device 508. In that regard, the server computing device 508 can receive captured and/or processed images from the mobile computing device 504 over the network 510 for processing and/or storage. As shown, the server computing device 508 optionally includes a calibration engine 706, a skin condition engine 708, a recommendation engine 712, and one or more data stores, such as a user data store 714, a product data store 716, a skin condition data store 718, and/or a calibration data store 720. It will be appreciated that the calibration engine 706, the skin condition engine 708, the recommendation engine 712, and the one or more data stores, such as the user data store 714, the product data store 716, the skin condition data store 718, and/or the calibration data store 720, are substantially identical in structure and functionality to the calibration engine 606, the skin condition engine 608, the recommendation engine 612, and the one or more data stores, such as the user data store 614, the product data store 616, the skin condition data store 618, and/or the calibration data store 620, of the mobile computing device 504 illustrated in FIG. 6.
FIG. 8 is a flowchart that illustrates a non-limiting example embodiment of a method 800 for calibrating images of a user according to an aspect of the present disclosure. It will be appreciated that the following method steps can be carried out in any order or at the same time, unless an order is set forth in an express manner or understood in view of the context of the various operation(s). Additional process steps can also be carried out. Of course, some of the method steps can be combined or omitted in example embodiments.
From a start block, the method 800 proceeds to block 802, where calibrated images are generated by the mobile computing device 504 and/or the server computing system 508 with the aid of calibration data from the calibration device 506. For example, the user 502 can operate the calibration device 506 in some embodiments to generate data indicative of, for example, ambient lighting conditions. The calibration device 506 may additionally or alternatively generate color temperature data of the user 502. For example, in one embodiment in which the calibration device 506 includes one or more photosensors, the user 502 can scan an area of interest (e.g., face) with a sweeping movement. This can occur, for example, during a face cleansing or make-up application/removal routine just prior to, contemporaneously with, or just after image capture by the mobile computing device 504. During the scan, the calibration device 506 records light meter data generated by the photosensor(s). If equipped, the calibration device 506 alternatively or additionally records color meter data of the user via an appropriate sensor. The light meter data and/or color meter data can then be transferred (wired or wirelessly) to the mobile computing device 504 and/or server computing system 508.
With the generated light meter data and/or color meter data, the calibration engine can calibrate either the camera 604 of the mobile computing device 504 or the images captured by the camera. For example, the mobile computing device 504 can receive the calibration data (e.g., light meter data, color meter data, etc.) from the calibration device 506 via any wired or wireless protocol and adjust the appropriate camera settings to calibrate the camera 604 prior to image capture. With the calibrated camera, the mobile computing device can generate calibrated image(s) of an area of interest of the user 502. Alternatively, the calibration engine can use the light meter data and/or color meter data obtained from the calibration device 506 to calibrate the images captured by the camera 604.
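The post-capture alternative — using light meter data to correct the images rather than the camera settings — can be illustrated with a simplified sketch. The target lighting level, the linear gain formula, and the grayscale pixel values below are all assumptions for illustration; they are not specified by the present disclosure.

```python
TARGET_LUX = 400.0  # hypothetical lighting level the pipeline is tuned for

def brightness_gain(measured_lux, target_lux=TARGET_LUX):
    """Scale factor compensating for ambient light dimmer or brighter
    than the target level reported by the calibration device."""
    return target_lux / max(measured_lux, 1e-6)

def correct_brightness(pixels, gain):
    """Scale grayscale pixel values by `gain`, clamped to the 0-255 range."""
    return [min(255, round(p * gain)) for p in pixels]

# The calibration device reports dim ambient light (200 lux), so the
# under-exposed pixels are scaled up by 2x:
gain = brightness_gain(200.0)
corrected = correct_brightness([40, 90, 130], gain)
```

The same measurement could instead drive the pre-capture path described above, for example by mapping the measured lux to an exposure or ISO setting before the camera 604 captures the image.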
In some embodiments, the images captured are of an area of interest to the user 502. For example, the area of interest can be one of the face, the neck, the arm, etc., for tracking skin conditions, such as moles, sun spots, acne, eczema, etc.
In another embodiment, an attribute of the calibration device 506 can be used by the calibration engine to generate calibrated images. In this embodiment, the mobile computing device 504 captures at least one image of the user 502 in the presence of the calibration device 506. In some embodiments, the at least one image to be captured is of an area of interest to the user 502. For example, the area of interest can be one of the face, the neck, the arm, etc., for tracking skin conditions, such as lesions, moles, sun spots, acne, eczema, etc.
For example, the user 502 can capture an image of themselves (a "selfie") holding the calibration device 506. In this embodiment, the calibration device 506 may include a color card, a color chip or other feature that can provide a reference for calibration purposes. From the captured image, the calibration engine 606 can extract calibration data and can then generate a calibrated image. In some embodiments, the calibrated image is generated by adjusting the appropriate camera settings to calibrate the camera 604. With the calibrated camera settings, the mobile computing device generates calibrated images. For example, the user interface engine captures an image to be used for calibration purposes. Once calibrated, the camera can be used to capture calibrated images for skin condition applications, facial recognition applications, etc. In some other embodiments, a calibrated image is generated via image processing techniques by adjusting one or more image attributes (e.g., white balance, brightness, color values, etc.) of the image after image capture.
In some embodiments, the calibration engine automatically detects a color reference (such as a color card) within the captured image and uses the color reference in order to correct the colors of the captured image for calibration purposes. For example, the calibration engine in some embodiments compares the image captured by the camera 604 to a reference image stored in the calibration data store 620, 720. The reference image contains some of, all of, etc., the color calibration data of the captured image. For example, the calibration device 506 (e.g., cosmetic packaging, product literature, appliance handle, etc.) in the captured image may include a color card, a color chip, or other color reference, etc., to be compared to the reference image stored in calibration data store.
In other embodiments, the color of the calibration device 506 has a known color value. In yet other embodiments, the calibration device 506 includes one or more colors with a known color value that can be retrieved from the calibration data store via indicia visibly associated with the calibration device 506. In some embodiments, the color reference detected by the calibration engine 606 within the captured image is indicia that can be used to retrieve the color value(s) from the calibration data store 620 in order to correct the colors of the captured image for calibration purposes. The calibrated images generated by the calibration engine are then stored in the user data store 614 of the mobile computing device 504 for subsequent retrieval. During storage of the captured images of the user, additional image processing (e.g., filtering, transforming, compressing, etc.) can be undertaken, if desired. Additionally or alternatively, the captured images can be transferred to the server computing device 508 over the network 510 for storage at the user data store 714.
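The indicia-based retrieval described above can be sketched as a simple keyed lookup. The store contents, the key format, and the decoded indicia string below are hypothetical; in practice the indicia might be, for example, a decoded QR code or bar code payload associated with the calibration device 506.

```python
# Hypothetical calibration data store mapping decoded indicia to the
# known color value(s) of the associated calibration device.
CALIBRATION_DATA_STORE = {
    "device:506:rev-a": {"reference_gray": (128, 128, 128)},
}

def lookup_reference(indicia):
    """Return the known color value registered for the decoded indicia."""
    entry = CALIBRATION_DATA_STORE.get(indicia)
    if entry is None:
        raise KeyError(f"no calibration data registered for {indicia!r}")
    return entry["reference_gray"]

reference = lookup_reference("device:506:rev-a")
```

Once retrieved, the known color value serves the same role as a color card detected directly in the image: the calibration engine compares it against the corresponding measured colors to correct the captured image.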
Next, at block 804, the calibrated images can be analyzed for any suitable application, including any of those set forth above. For example, the calibrated images can be analyzed to determine a skin condition of the area of interest. In some embodiments, the skin condition engine 608 of the mobile computing device 504 or the skin condition engine 708 of the server computing device 508 analyzes the calibrated images and determines, for example, acne, age spots, dry patches, etc., for each region of the area of interest. In doing so, the skin condition engine can access data from the skin condition data store 618, 718.
The method 800 then proceeds to block 806, where a treatment protocol and/or product is optionally recommended for each region of the area of interest based on the determined skin condition (e.g., acne, dry skin, age spots, etc.). In some embodiments, the recommendation engine 612 of the mobile computing device 504 or the recommendation engine 712 of the server computing device 508 recommends a treatment protocol and/or product for each region of the area of interest based on the determined skin condition(s). In doing so, data can be accessed from the product data store 616, 716. Different products and/or treatment protocols can be recommended for regions with different skin conditions. Any recommendation generated by the recommendation engine can be presented to the user in any fashion via the user interface engine on display 602. In some embodiments, the efficacy of the recommendation can be tracked, which can be used to train the recommendation engine and/or data stored in the product data store for improved recommendations in subsequent uses.
Of course, any processing accomplished at the mobile computing device 504 can be additionally or alternatively carried out at the server computing device 508.
The method 800 then proceeds to an end block and terminates. Other embodiments are contemplated. For example, the calibration device and/or mobile computing device could also include positional sensors and inertial measurement sensors for generating additional data to be used to calibrate the images.
FIG. 9 is a block diagram that illustrates aspects of an exemplary computing device 900 appropriate for use as a computing device of the present disclosure. While multiple different types of computing devices were discussed above, the exemplary computing device 900 describes various elements that are common to many different types of computing devices. While FIG. 9 is described with reference to a computing device that is implemented as a device on a network, the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other devices that may be used to implement portions of embodiments of the present disclosure. Moreover, those of ordinary skill in the art and others will recognize that the computing device 900 may be any one of a number of currently available or yet-to-be-developed devices.
In its most basic configuration, the computing device 900 includes at least one processor 902 and a system memory 904 connected by a communication bus 906. Depending on the exact configuration and type of device, the system memory 904 may be volatile or nonvolatile memory, such as read only memory ("ROM"), random access memory ("RAM"), EEPROM, flash memory, or similar memory technology. Those of ordinary skill in the art and others will recognize that system memory 904 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 902. In this regard, the processor 902 may serve as a computational center of the computing device 900 by supporting the execution of instructions.
As further illustrated in FIG. 9, the computing device 900 may include a network interface 910 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure may access basic services that utilize the network interface 910 to perform communications using common network protocols. The network interface 910 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WIFI, 2G, 3G, LTE, WiMAX, Bluetooth, Bluetooth low energy, and/or the like. As will be appreciated by one of ordinary skill in the art, the network interface 910 illustrated in FIG. 9 may represent one or more wireless interfaces or physical communication interfaces described and illustrated above with respect to particular components of the computing device 900.
In the exemplary embodiment depicted in FIG. 9, the computing device 900 also includes a storage medium 908. However, services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 908 depicted in FIG. 9 is represented with a dashed line to indicate that the storage medium 908 is optional. In any event, the storage medium 908 may be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, and/or the like.
As used herein, the term "computer-readable medium" includes volatile and non-volatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer readable instructions, data structures, program modules, or other data. In this regard, the system memory 904 and storage medium 908 depicted in FIG. 9 are merely examples of computer-readable media.
Suitable implementations of computing devices that include a processor 902, system memory 904, communication bus 906, storage medium 908, and network interface 910 are known and commercially available. For ease of illustration and because it is not important for an understanding of the claimed subject matter, FIG. 9 does not show some of the typical components of many computing devices. In this regard, the computing device 900 may include input devices, such as a keyboard, keypad, mouse, microphone, touch input device, touch screen, tablet, and/or the like. Such input devices may be coupled to the computing device 900 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, Bluetooth low energy, USB, or other suitable connections protocols using wireless or physical connections. Similarly, the computing device 900 may also include output devices such as a display, speakers, printer, etc. Since these devices are well known in the art, they are not illustrated or described further herein.
The present application may reference quantities and numbers. Unless specifically stated, such quantities and numbers are not to be considered restrictive, but exemplary of the possible quantities or numbers associated with the present application. Further in this regard, the present application may use the term "plurality" to reference a quantity or number. In this regard, the term "plurality" is meant to be any number that is more than one, for example, two, three, four, five, etc. The terms "about," "approximately," "near," etc., mean plus or minus 5% of the stated value. For the purposes of the present disclosure, the phrase "at least one of A, B, and C," for example, means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C), including all further possible permutations when greater than three elements are listed.
The above description of illustrated examples of the present disclosure, including what is described in the Abstract, is not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed. While specific embodiments of, and examples for, the present disclosure are described herein for illustrative purposes, various equivalent modifications are possible without departing from the broader spirit and scope of the present disclosure, as claimed. Indeed, it is appreciated that the specific example voltages, currents, frequencies, power range values, times, etc., are provided for explanation purposes and that other values may also be employed in other embodiments and examples in accordance with the teachings of the present disclosure.
These modifications can be made to examples of the disclosed subject matter in light of the above detailed description. The terms used in the following claims should not be construed to limit the claimed subject matter to the specific embodiments disclosed in the specification and the claims. Rather, the scope is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation. The present specification and figures are accordingly to be regarded as illustrative rather than restrictive.

Claims

The embodiments of the disclosed subject matter in which an exclusive property or privilege is claimed are defined as follows:
1. A computer implemented method for determining changes in a skin condition of a subject, comprising: obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over time, wherein each image taken is separated in time by a time period; and determining one or more differences between the plurality of images.
2. The computer implemented method of Claim 1, further comprising: generating an image map of the area of interest, the image map indicative of the differences between the plurality of images.
3. The computer implemented method of Claim 2, wherein the image map indicates changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest.
4. The computer implemented method of Claims 2 or 3, further comprising: determining a skin condition based on the image map.
5. The computer implemented method of Claim 4, further comprising: recommending one of a treatment or a product based on the determined skin condition.
6. The computer implemented method of Claim 4, wherein the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
7. The computer implemented method of Claims 1-6, wherein the time period is selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year.
8. The computer implemented method of Claims 1-7, further comprising: if the difference detected is greater than a preselected threshold value, notifying the user that a change has been detected.
9. The computer implemented method of Claim 8, wherein the user is notified via an email, a text message, or a banner notification on a user device.
10. The computer implemented method of Claims 1-9, further comprising: determining the area of interest based at least on the captured images.
11. The computer implemented method of Claims 1-10, wherein the area of interest includes a back, a face, an arm, a neck, a shoulder, or regions thereof.
12. The computer implemented method of Claims 1-11, further comprising: determining a skin condition based on the determined differences; and recommending one of a treatment or a product based on the determined skin condition.
13. The computer implemented method of Claims 1-12, wherein the plurality of images are captured by a camera, the camera including one or more adjustable settings, the method further comprising: automatically calibrating the camera by adjusting one or more of the camera settings based on an associated calibration device.
14. The computer implemented method of Claims 1-13, further comprising: calibrating the captured images based on calibration data presented in the one or more of the captured images.
15. The computer implemented method of Claim 14, wherein the calibration data is provided by a calibration device captured in the images.
16. The computer implemented method of Claim 15, wherein the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
17. The computer implemented method of Claim 16, wherein the attribute is a color.
18. The computer implemented method of Claim 16, wherein the attribute is indicia indicative of a color, the method further comprising obtaining calibration data based on the indicia by accessing a data store linked to the indicia.
19. The computer implemented method of Claim 18, wherein the indicia includes a QR code or a bar code.
20. The computer implemented method of Claims 1-12, further comprising obtaining calibration data from a calibration device captured by the image.
21. The computer implemented method of Claim 20, wherein the calibration device is a cosmetics apparatus or packaging associated therewith.
22. A system for determining changes in a skin condition of a subject, comprising: a camera configured to capture one or more images; one or more processing engines including circuitry configured to: cause the camera to capture one or more images of an area of interest associated with the subject, the one or more images taken sequentially over time so as to obtain a plurality of images separated in time by a time period selected from the group consisting of 24 hours, one week, one month, two months, three months, four months, five months, six months, and one year; determine one or more differences between the captured images, the differences indicative of changes in one or more of a size, a shape, a color, and uniformity of an object contained in the area of interest; and determine a skin condition based on the determined differences or identifying the object for subsequent analysis if said differences are greater than a preselected threshold.
23. The system of Claim 22, wherein the one or more processing engines include circuitry configured to: determine the skin condition based on the determined differences; and recommend a treatment protocol or a product based on the determined skin condition.
24. The system of Claims 22 or 23, wherein the one or more processing engines includes circuitry configured to: determine changes in one or more of a size, a shape, a color, or a uniformity of an existing lesion; detect new lesions; detect the absence of previously detected lesion(s); or detect a progression of a lesion.
25. The system of Claims 22-24, wherein the one or more processing engines includes circuitry configured to: detect a progression of a lesion from the detected differences in the plurality of images; and determine one or more stages of the lesion based on the detected progression of the lesion.
26. The system of Claims 22-25, wherein the one or more processing engines includes: a user interface engine including circuitry configured to cause the camera to capture the plurality of images; an image analysis engine including circuitry for comparing two or more images using a similar/difference algorithm to determine one or more differences between said images; and a skin condition engine including circuitry configured for analyzing an image map of the determined one or more differences to locate a lesion, and for determining the stage of the lesion located in the image map.
27. The system of Claims 22-26, wherein the one or more processing engines further includes: a recommendation engine including circuitry configured to recommend a treatment protocol and/or product for each region based at least on the determined skin condition.
28. The system of Claims 22-27, wherein the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
29. The system of Claims 22-28, further comprising a calibration engine including circuitry configured to calibrate the camera prior to image capture for generating calibrated images or to calibrate the images captured by the camera for generating calibrated images, said calibration engine obtaining calibration data from a calibration device.
30. The system of Claims 22-29, wherein the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference.
31. The system of Claims 22-29, wherein the calibration device includes an attribute usable to obtain the calibration data for calibrating the images.
32. The system of Claim 31, wherein the attribute is a reference color.
33. The system of Claim 31, wherein the attribute is indicia indicative of a color or color data, wherein the calibration engine is configured to retrieve the calibration data based on the indicia by accessing a data store linked to the indicia.
34. The system of Claim 33, wherein the indicia includes a QR code or a bar code.
35. A computer-implemented method for determining changes in a skin condition of a subject, the method comprising: obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over a time with each taken image separated in time by a time period; determining a skin condition based on at least the plurality of images; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
36. The computer-implemented method of Claim 35, wherein said obtaining, by a first computing device, a plurality of images of an area of interest associated with the subject includes capturing, by a camera of the first computing device, the plurality of images.
37. The computer-implemented method of Claims 35 or 36, wherein said determining a skin condition based on at least the plurality of images or said determining at least one product recommendation based on at least the determined skin condition is carried out by a second computing device remote from the first computing device.
38. The method of Claims 35-37, wherein the skin condition is selected from a group consisting of dermatitis, eczema, acne, and psoriasis.
39. A computer-implemented method for determining changes in a skin condition of a subject, the method comprising: obtaining a plurality of images of an area of interest associated with the subject, the plurality of images taken sequentially over a time with each taken image separated in time by a time period; calibrating the plurality of images based on calibration data obtained via the images; determining a skin condition based on at least the calibrated plurality of images; determining at least one product recommendation based on at least the determined skin condition; and providing the at least one product recommendation to the subject.
40. A computer implemented method for accurate skin diagnosis, comprising: calibrating, by a computing device, one or more images of an area of interest associated with a subject; and determining a skin condition based on the one or more calibrated images.
41. The computer implemented method of Claim 40, wherein the one or more images includes a plurality of images taken sequentially over a period of time, and wherein said determining a skin condition is based on the plurality of images.
42. The computer implemented method of Claims 40-41, further comprising generating a treatment protocol and/or product recommendation for an area of interest of the subject based on the determined skin condition.
43. The computer implemented method of Claims 40-42, wherein said calibrating, by a computing device, one or more images of an area of interest associated with a subject includes obtaining calibration data from a calibration device; calibrating a camera based on the calibration data; and capturing the one or more images of a user with the calibrated camera.
44. The computer implemented method of Claim 43, further comprising: generating, via the calibration device, light meter data or color temperature data of the subject; receiving the calibration data from the calibration device; and adjusting one or more camera settings for calibrating the camera prior to image capture.
45. The computer implemented method of Claims 40-42, wherein said calibrating, by a computing device, one or more images of an area of interest associated with a subject includes capturing the one or more images via a camera associated with the computing device; obtaining calibration data from a calibration device associated with the one or more images captured by the camera; and calibrating the one or more images captured by the camera based on the calibration data.
46. The computer implemented method of Claim 45, further comprising generating, via the calibration device, light meter data or color temperature data of the subject; receiving the light meter data and/or color meter data from the calibration device, and using said light meter data and/or color meter data obtained from the calibration device to calibrate the captured images.
47. The computer implemented method of Claims 45 or 46, wherein the calibration device includes one selected from the group consisting of a color card, a color chip and a calibration reference, the method further comprising capturing at least one image of the subject in the presence of the calibration device.
48. The computer implemented method of Claim 47, wherein the calibration device is a cosmetics apparatus.
49. A method, comprising: obtaining calibration data from a calibration device; generating, by a computing device, calibrated images by one of: calibrating a camera of a mobile computing device based on the calibration data and capturing one or more images of a user with the calibrated camera; or calibrating one or more images captured with the camera based on the calibration data.
50. The method of Claim 49, further comprising determining a skin condition based on the one or more calibrated images.
51. The method of Claims 49 or 50, further comprising recommending one or more of: a skin treatment protocol; and a product configured to treat the skin condition.
52. The method of Claims 49-51, further comprising operating the calibration device to generate data indicative of light meter data or color temperature data of the user.
53. The method of Claims 49-51, wherein the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data as calibration data.
54. The method of Claims 49-53, wherein the images captured include an area of interest associated with the user, the images of the area of interest usable for tracking skin conditions of the user.
55. The method of Claims 49-51, further comprising capturing the one or more images of the user in the presence of the calibration device.
56. The method of Claims 49-51, wherein the calibration device includes calibration data represented by one selected from the group consisting of: a color card, a color chip and a calibration reference.
57. The method of Claims 49-51, further comprising generating the calibrated image by adjusting one or more camera settings based on the calibration data.
58. The method of Claims 49-51, further comprising: automatically detecting a color reference associated with the captured image; and using the color reference to correct the colors of the captured image to generate calibrated images.
59. The method of Claims 49-51, further comprising comparing the image captured to a reference image stored in a calibration data store.
60. A computer system, comprising: a user interface engine including circuitry configured to cause an image capture device to capture images of the user; a calibration engine including circuitry configured to calibrate the image capture device prior to image capture for generating calibrated images or to calibrate the images captured by the image capture device for generating calibrated images, said calibration engine obtaining calibration data from a calibration device; and a skin condition engine configured to determine a skin condition of the user based on the generated calibrated images.
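Illustrative only, not part of the claims: the engine arrangement recited in claim 60 (a calibration engine producing calibrated images that a skin condition engine then evaluates) can be sketched as a minimal pipeline. All class names, the gain-based calibration form, and the placeholder redness heuristic are assumptions introduced for illustration.

```python
class CalibrationEngine:
    """Applies per-channel gain factors derived from calibration data."""

    def __init__(self, calibration_data):
        # calibration_data: (R, G, B) gain factors (assumed form)
        self.gains = calibration_data

    def calibrate(self, image):
        # image: nested list of (R, G, B) pixel tuples
        return [[tuple(min(255, int(v * g)) for v, g in zip(px, self.gains))
                 for px in row] for row in image]


class SkinConditionEngine:
    """Placeholder classifier operating on a calibrated image."""

    def determine(self, calibrated_image):
        pixel_count = len(calibrated_image) * len(calibrated_image[0])
        mean_red = sum(px[0] for row in calibrated_image for px in row) / pixel_count
        return "redness" if mean_red > 150 else "normal"


def analyze(image, calibration_data):
    # Calibrate first, then classify -- mirroring the claimed engine order.
    calibrated = CalibrationEngine(calibration_data).calibrate(image)
    return SkinConditionEngine().determine(calibrated)
```

The point of the sketch is the ordering: classification consumes only calibrated pixel data, so the same classifier behaves consistently across capture conditions.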
61. The computer system of Claim 60, further comprising a recommendation engine including circuitry configured to recommend a treatment protocol or a product based at least on the determined skin condition.
62. The computer system of Claims 60 or 61, wherein the calibration device includes one or more sensors configured to generate data indicative of calibration data, and wherein the calibration engine is configured to receive the calibration data and adjust one or more suitable camera settings for calibrating the camera prior to image capture.
63. The computer system of Claims 60-62, wherein the calibration device includes an attribute suitable for use by the calibration engine to generate the calibrated images.
64. The computer system of Claim 63, wherein the attribute is a color or indicia indicative of a color, the calibration engine configured to obtain calibration data based on the indicia.
65. The computer system of Claims 60-64, wherein the calibration engine includes circuitry configured to obtain the calibration data from the image captured by the image capture device, the image captured including an image of the calibration device.
66. The computer system of Claims 60-65, wherein the calibration device is a cosmetics apparatus or packaging associated therewith.
67. The computer system of Claims 60-66, wherein the calibration engine is configured to: automatically detect a color reference associated with the captured image; and use the color reference in order to correct the colors of the captured image to generate calibrated images.
68. The computer system of Claims 60-67, wherein the calibration device includes one selected from the group consisting of: a color card, a color chip and a calibration reference.
69. The computer system of Claims 60-68, wherein the calibration engine includes circuitry configured to extract calibration data from the captured image and generate the calibrated images.
70. The computer system of Claims 60-69, wherein the calibration engine includes circuitry configured to adjust one or more image attributes of the image after image capture for calibrating the image.
71. The computer system of Claim 70, wherein the image attributes include white balance, brightness, and/or color values.
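Illustrative only, not part of the claims: one conventional way to adjust white balance after capture, as contemplated by claims 70-71, is the gray-world method, which assumes the scene's average color should be neutral gray. This specific algorithm is an assumption for illustration; the claims do not prescribe it.

```python
import numpy as np

def gray_world_white_balance(image):
    """Scale each channel so the image's average color becomes neutral
    gray (the gray-world assumption)."""
    img = np.asarray(image, dtype=float)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means  # per-channel gains
    balanced = img * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```

For skin imaging, a scene-independent reference (as in claim 67) is generally preferred over the gray-world assumption, since skin-dominated frames are not color-neutral on average; the sketch simply shows the post-capture attribute adjustment mechanism.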
72. The computer system of Claims 60-71, wherein the user interface engine includes circuitry configured for creating a subject profile and storing said subject profile in a subject data store.
73. The computer system of Claims 60-72, wherein the subject profile includes previously provided recommendations, biographical data of the subject, area(s) of interest of the subject, current medicaments used, cosmetic products used, and/or treatment protocols used.
74. The computer system of Claims 60-61, wherein the calibration device includes one or more sensors configured to generate light meter data and/or color temperature data.
75. The computer system of Claim 74, wherein the generated light meter data and/or color meter data is used by the calibration engine to calibrate either the image capture device or the images captured by the image capture device.
76. The computer system of Claims 60 or 61, wherein the calibration engine is configured to receive the calibration data from the calibration device and adjust one or more camera settings for calibrating the camera prior to image capture.
77. The computer system of Claims 60 or 61, wherein the calibration engine is configured to use the light meter data and/or color meter data obtained from the calibration device to calibrate the images captured by the image capture device.
78. The computer system of Claims 60 or 61, wherein the captured images are calibrated by adjustment to one or more settings of the image capture device based on the calibration data.
79. The computer system of Claims 60 or 61, wherein the calibration engine includes circuitry configured to compare the images captured to one or more reference images stored in a calibration data store to determine color calibration data for the captured images.
80. The computer system of Claims 60 or 61, wherein the calibration device includes one or more colors with a known color value, and wherein the calibration engine includes circuitry configured to retrieve from a calibration data store color data associated with the known color value.
81. A computing device configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine at least one product recommendation based on the at least one difference between the first image and the second image; and provide the at least one product recommendation to the subject.
82. A computing device configured to: analyze a first image taken at time T1 and a second image taken at time T2 of an area of interest of a subject to determine at least one difference between the first image and the second image, wherein time T2 is subsequent to the time T1; determine if the difference between the first image and the second image is greater than a preselected value; identify the area of the images in which the differences are greater than the preselected value; and recommend review of the area identified.
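Illustrative only, not part of the claims: the thresholded comparison of claim 82 (flag image regions whose change between times T1 and T2 exceeds a preselected value) can be sketched as a block-wise mean-absolute-difference test. The function name, block size, and grayscale input are assumptions introduced for illustration.

```python
import numpy as np

def flag_changed_regions(img1, img2, threshold, block=2):
    """Compare two same-sized grayscale images block by block and return
    the (row, col) block indices whose mean absolute difference exceeds
    the preselected threshold."""
    a = np.asarray(img1, dtype=float)
    b = np.asarray(img2, dtype=float)
    flagged = []
    for r in range(0, a.shape[0], block):
        for c in range(0, a.shape[1], block):
            diff = np.abs(a[r:r + block, c:c + block]
                          - b[r:r + block, c:c + block]).mean()
            if diff > threshold:
                flagged.append((r // block, c // block))
    return flagged
```

The flagged block coordinates identify the "area of the images in which the differences are greater than the preselected value", which can then be surfaced to the subject for review.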
PCT/US2020/067548 2019-12-30 2020-12-30 Image process systems for skin condition detection Ceased WO2021138477A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962955159P 2019-12-30 2019-12-30
US201962955128P 2019-12-30 2019-12-30
US62/955,159 2019-12-30
US62/955,128 2019-12-30

Publications (1)

Publication Number Publication Date
WO2021138477A1 true WO2021138477A1 (en) 2021-07-08

Family

ID=74285580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/067548 Ceased WO2021138477A1 (en) 2019-12-30 2020-12-30 Image process systems for skin condition detection

Country Status (1)

Country Link
WO (1) WO2021138477A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230105912A1 (en) * 2021-10-06 2023-04-06 Gpower Inc. Customized skin care method and system
WO2024133203A1 (en) 2022-12-21 2024-06-27 F. Hoffmann-La Roche Ag Analytical method of determining at least one property of a sample of a bodily fluid

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120321759A1 (en) * 2007-01-05 2012-12-20 Myskin, Inc. Characterization of food materials by optomagnetic fingerprinting
US9760925B2 (en) 2013-01-02 2017-09-12 Quintiles Ims Incorporated Rating and ranking controlled substance distribution stakeholders
US9996923B2 (en) * 2015-04-24 2018-06-12 Canfield Scientific, Incorporated Methods and apparatuses for dermatological feature tracking over multiple images
WO2018146450A1 (en) * 2017-02-07 2018-08-16 Anthropics Technology Limited A method of matching colours
US20190183233A1 (en) * 2011-12-23 2019-06-20 L'oreal Method for delivering cosmetic advice
US20190214127A1 (en) * 2018-01-10 2019-07-11 International Business Machines Corporation Sub-optimal health detection and alert generation using a time series of images
US20190290187A1 (en) * 2015-01-27 2019-09-26 Healthy.Io Ltd. Measuring and Monitoring Skin Feature Colors, Form and Size

Similar Documents

Publication Publication Date Title
JP6518711B2 (en) Skin condition measurement analysis information management system and skin condition measurement analysis information management method
US9946887B2 (en) Method and apparatus for determining privacy policy based on data and associated values
CN104105953B (en) Apparatus and method for color calibration
AU2015201759B2 (en) Electronic apparatus for providing health status information, method of controlling the same, and computer readable storage medium
EP2972155B1 (en) Systems and methods for specifying and formulating customized topical agents
US8891841B2 (en) Mobile dermatology collection and analysis system
CN105592792B (en) Surface state measures analytical information management system and surface state measures analysis approaches to IM
CN107106018B (en) Information processing apparatus, information processing method, and computer-readable storage medium
US20140316235A1 (en) Skin imaging and applications
US20210196186A1 (en) Acne detection using image analysis
Ruminski et al. Interactions with recognized patients using smart glasses
Hartmann et al. Basic principles of artificial intelligence in dermatology explained using melanoma
EP4196908A1 (en) Systems and methods for acne counting, localization and visualization
JP2023538014A (en) Monitoring system
WO2021138477A1 (en) Image process systems for skin condition detection
US12190623B2 (en) High-resolution and hyperspectral imaging of skin
Vrânceanu et al. Gaze direction estimation by component separation for recognition of eye accessing cues
CN112330528B (en) Virtual makeup trial method, device, electronic device and readable storage medium
Hsu A customer-oriented skin detection and care system in telemedicine applications
KR102123598B1 (en) Apparatus and system for skin diagnosis and method thereof
US20210201492A1 (en) Image-based skin diagnostics
Van Molle et al. Dermatologist versus artificial intelligence confidence in dermoscopy diagnosis: Complementary information that may affect decision‐making
CN112912925B (en) Medium, information processing device, quantification method, and information processing system
CN117242528A (en) Systems and methods for processing images for skin analysis and visual skin analysis
JP7727088B2 (en) IMPLEMENTATION OF SKIN ANALYSIS SYSTEMS AND METHODS

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20845852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20845852

Country of ref document: EP

Kind code of ref document: A1