CN106203466B - Food identification method and device - Google Patents

Food identification method and device Download PDF

Info

Publication number
CN106203466B
CN106203466B (granted from application CN201610474639.2A; published as CN106203466A)
Authority
CN
China
Prior art keywords
food
information
user
image
identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610474639.2A
Other languages
Chinese (zh)
Other versions
CN106203466A (en)
Inventor
高欢欢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Xingji Meizu Technology Co ltd
Original Assignee
Meizu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meizu Technology Co Ltd filed Critical Meizu Technology Co Ltd
Priority to CN201610474639.2A priority Critical patent/CN106203466B/en
Publication of CN106203466A publication Critical patent/CN106203466A/en
Application granted granted Critical
Publication of CN106203466B publication Critical patent/CN106203466B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/68Food, e.g. fruit or vegetables

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention relates to a food identification method comprising the following steps: identifying food through a camera; acquiring health influence information, for the user, corresponding to the identified food; and displaying the health influence information on a camera preview interface, so that food can be identified simply and quickly. A corresponding food identification device is also provided.

Description

Food identification method and device
Technical Field
The invention relates to the technical field of computer application, in particular to a method and a device for food identification.
Background
Nature provides human beings with an extremely rich variety of food materials, yet people can hardly know all of them comprehensively; eating unfamiliar foods blindly may therefore cause unpredictable harm to the body.
With the continuous development of the economy and society, more and more people pay attention to a healthy diet and a healthy life. Conventionally, in order to learn about a food, people generally search for it on the network and judge from the returned information whether the food suits them. This approach is time-consuming and often inaccurate, and it fails entirely for foods whose names are unknown.
Disclosure of Invention
In view of the above, there is a need to provide a food identification method and apparatus capable of quickly identifying food.
A method of food identification, the method comprising:
identifying food through a camera;
acquiring health influence information on a user corresponding to the identified food;
and displaying the health influence information on the user on a camera preview interface.
In one embodiment, before identifying the food by the camera, the method further comprises:
receiving physiological data input by a user;
obtaining health impact information on a user corresponding to the identified food, comprising:
matching the health influence information of the user corresponding to the identified food with the physiological data of the user to acquire matching degree information;
the displaying the health influence information on the user on a camera preview interface comprises:
and displaying the matching degree information on a camera shooting preview interface.
In one embodiment, the physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the requirement information is the user's requirements aimed at improving his or her own physical condition.
In one embodiment, the method further comprises:
acquiring a trigger operation acting on the camera shooting preview interface;
and acquiring matched detail information corresponding to the matching degree information according to the triggering operation.
In one embodiment, the identifying the food by the camera includes:
acquiring an image of food shot by a camera, and matching the shot image of the food with food image information prestored in a server to obtain a matching value;
displaying the pre-stored food image with the matching value not less than the set value;
and detecting whether a confirmation instruction of a user is acquired within a preset time, wherein the confirmation instruction is used for selecting the displayed food image, if so, the selected food image is confirmed as an identification result, and otherwise, the food image with the maximum matching value is confirmed as the identification result.
An apparatus for food identification, the apparatus comprising:
the food identification module is used for identifying food through the camera;
the health influence information acquisition module is used for acquiring health influence information on the user corresponding to the identified food;
and the information display module is used for displaying the health influence information on the user on a camera shooting preview interface.
In one embodiment, the apparatus further comprises:
the physiological data acquisition module is used for receiving physiological data input by a user;
the health influence information acquisition module is also used for matching the health influence information of the user corresponding to the identified food with the physiological data of the user to acquire matching degree information;
the information display module is also used for displaying the matching degree information on a camera shooting preview interface.
In one embodiment, the physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the requirement information is the user's requirements aimed at improving his or her own physical condition.
In one embodiment, the apparatus further comprises:
the food detail information display module is used for acquiring trigger operation acting on the camera shooting preview interface; and acquiring the detail information of the identified food and the matched detail information corresponding to the matching degree information according to the triggering operation.
In one embodiment, the food identification module comprises:
the image matching module is used for matching the shot food image with food image information prestored in the server to obtain a matching value;
the matching image display module is used for displaying the pre-stored food image of which the image matching value is not less than a set value;
and the food image determining module is used for detecting whether a confirmation instruction of the user is acquired within preset time, wherein the confirmation instruction is used for selecting the displayed food image, if so, the selected food image is confirmed as an identification result, and otherwise, the food image with the maximum matching value is confirmed as the identification result.
According to the food identification method and device, food is identified through the camera; health influence information on the user corresponding to the identified food is acquired; and the health influence information is displayed on the camera preview interface. Food can thus be identified quickly through the camera, and the corresponding health influence information (the food's attributes, suitable groups, calorie information, sugar content and the like) is retrieved for the identified food and displayed to the user, who can then fully understand the food and make an accurate eating judgment. The method is simple and quick and gives the user clear eating guidance.
Drawings
FIG. 1 is a diagram of an application environment of a food identification method in one embodiment;
FIG. 2 is a flow diagram of a food identification method in one embodiment;
FIG. 3 is a flow chart of a food identification method in another embodiment;
FIG. 4 is a flow diagram of presenting detailed information in one embodiment;
FIG. 5 is a block diagram of a food recognition device in one embodiment;
FIG. 6 is a block diagram showing the structure of a food recognition apparatus in another embodiment;
FIG. 7 is a block diagram showing the structure of a food recognition apparatus in still another embodiment;
fig. 8 is a block diagram of the structure of a food identification module in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In one embodiment, as shown in FIG. 1, a diagram of an application environment in which a method of food identification operates is provided. The application environment includes a terminal 110 and a server 120, wherein the terminal 110 and the server 120 communicate through a network, and the terminal 110 may be at least one of a smart phone, a tablet computer, a notebook computer, and a desktop computer, but is not limited thereto. The server 120 may be an independent physical server or a server cluster including a plurality of physical servers. The server 120 stores food information such as image information and health influence information of various foods. The terminal 110 is internally provided with a camera, the camera can acquire the image information of the food, and the terminal can acquire the image information of the food acquired by the camera and acquire and display the health influence information matched with the image information returned by the server.
In one embodiment, as shown in fig. 2, there is provided a method of food identification, the method comprising:
step S202: food is identified by the camera.
Specifically, a camera is arranged in the terminal, and the food to be identified is photographed through the camera. If the food to be identified has many distinguishing features or is large in volume, it is video-recorded with the camera so that images of the food from multiple angles are obtained.
The terminal acquires the shooting information or the shooting and recording information of the camera to the food to be identified, and extracts the image information of the food to be identified in the shooting information or the shooting and recording information.
Step S204: health impact information on the user corresponding to the identified food is obtained.
The server is prestored with various food information, including image information of foods and, associated with each image, basic information of the food (name, identification characteristics, shape, color, preservation method and the like) and health influence information on users. The health influence information on users refers to information about a food that guides a healthy diet, such as the food's attributes (hot, cold, warm, neutral and the like), nutritional value (ingredients, sugar content, calories and the like), suitable groups, and eating taboos.
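The pre-stored food records described above can be pictured as a simple data structure. The field names and sample values below are illustrative assumptions for the sketch, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class FoodRecord:
    """One server-side record associated with a stored food image (fields assumed)."""
    name: str
    nature: str               # food attribute, e.g. "hot", "cold", "warm", "neutral"
    sugar_content_g: float    # grams of sugar per 100 g (illustrative)
    calories_kcal: float      # kilocalories per 100 g (illustrative)
    suitable_for: list = field(default_factory=list)
    taboos: list = field(default_factory=list)

# Example entry such as a server database might hold (values illustrative).
lychee = FoodRecord(
    name="lychee", nature="hot", sugar_content_g=15.2,
    calories_kcal=66, suitable_for=["general population"],
    taboos=["diabetics should limit intake"],
)
```

A record like this supplies both the "basic information" shown to the user and the health influence fields matched against the user's physiological data in the later embodiments.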
The terminal sends the acquired image information of the food to be identified to the server, and the server searches the image information of the food matched with the image of the food to be identified in the database.
In one embodiment, the server and/or the terminal extracts image characteristic attributes from the image of the food to be recognized, matches them against the image characteristic attributes of the food images pre-stored in the server, calculates a matching value between the image of the food to be recognized and each pre-stored image, and obtains the food images whose matching values are not less than a preset value, wherein the image characteristic attributes comprise color characteristics and texture characteristics.
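One minimal way to realize the color-feature matching described above is a normalized color histogram comparison. The patent does not specify an algorithm, so the histogram-intersection sketch below is purely illustrative of how a matching value in [0, 1] could be computed.

```python
def color_histogram(pixels, bins=8):
    """Quantize (r, g, b) pixels into a normalized joint color histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def matching_value(hist_a, hist_b):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint ones."""
    return sum(min(a, b) for a, b in zip(hist_a, hist_b))
```

In practice texture features (the patent's other named attribute) would be combined with the color score, and candidates would be kept only when the combined matching value reaches the preset threshold.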
In one embodiment, the number of pre-stored food images matching the image of the food to be recognized may be plural; when plural images are recognized, the pre-stored food images whose matching values are not less than the set value are displayed, and it is detected whether a confirmation instruction of the user, used for selecting one of the displayed food images, is acquired within a preset time. If so, the selected food image is confirmed as the identification result; otherwise, the food image with the maximum matching value is confirmed as the identification result.
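The candidate-selection step above, which waits for a user confirmation within a preset time and otherwise falls back to the best match, can be sketched as follows; the function and parameter names are assumptions for illustration.

```python
def resolve_identification(candidates, user_choice=None):
    """Pick the recognition result from candidate matches.

    `candidates` is a list of (food_name, matching_value) pairs whose
    matching values already passed the set threshold.  If the user selected
    a candidate within the preset time, `user_choice` holds that name and
    wins; otherwise the candidate with the largest matching value is taken.
    """
    if user_choice is not None:
        return user_choice
    return max(candidates, key=lambda c: c[1])[0]

candidates = [("apple", 0.82), ("peach", 0.91), ("nectarine", 0.88)]
assert resolve_identification(candidates) == "peach"           # timeout: best match wins
assert resolve_identification(candidates, "apple") == "apple"  # explicit user selection wins
```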
After the target image prestored in the server matched with the food to be identified is determined, the server can search health influence information corresponding to the determined target image on the user according to the determined target image.
The searched target image can be more accurately determined through the selection of the user, and the identification result is more accurate.
Step S206: and displaying the health influence information on the user on a camera preview interface.
Specifically, the terminal displays the health influence information of the corresponding user found by the server on a camera preview interface. In one embodiment, an image display window and a text display window are arranged on the camera preview interface, wherein the image display window is used for displaying the shot images to be recognized and the recognized food images, and the text display window is used for displaying health influence information of the user.
In the embodiment, the food is quickly identified through the camera, the corresponding health influence information of the user is taken out according to the identified food and displayed to the user, the user can fully know the food according to the health influence information (the attribute of the food, the suitable crowd, the calorie information, the sugar content and the like) and make accurate eating judgment, and the food identification method is simple and quick and can give clear eating guidance to the user.
In one embodiment, as shown in fig. 3, the food identification method further includes:
step S302: physiological data input by a user is received.
The physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the basic physiological attribute information comprises sex, age, height, weight and the like; the health condition information comprises medical history, current physical state and the like; and the requirement information covers the user's self-improvement goals, such as a weight-loss requirement or a skin-whitening requirement.
Step S302 may be performed before or after the step of identifying the food through the camera. In this embodiment, step S302 is performed before that step, followed by:
step S304: food is identified by the camera.
Step S306: and acquiring health influence information on the user corresponding to the identified food, matching the health influence information of the user corresponding to the identified food with the physiological data of the user, and acquiring matching degree information.
Specifically, the matching degree information reflects whether the user is suitable for eating the food to be identified.
For example, if the health condition information input by the user is diabetes, and if the sugar content of the food displayed in the health influence information is high, the matching degree information may be: the diabetic patient is not suitable for eating the food, or needs to control the amount of eating the food not to exceed a certain value, and the like.
In one embodiment, whether the user is overweight or obese can be judged according to the physiological attribute information input by the user; the user's weight state is matched against the health influence information corresponding to the food to determine whether the user should eat it, and if the food may be eaten, an eating suggestion is given according to the weight state.
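The weight-state check in this embodiment can be illustrated with a simple BMI rule. The thresholds and suggestion strings below are illustrative assumptions, not values from the patent.

```python
def matching_degree(height_m, weight_kg, food_calories_kcal):
    """Match a user's weight state against a food's calorie content (sketch)."""
    bmi = weight_kg / (height_m ** 2)
    if bmi >= 25 and food_calories_kcal > 300:
        return "Not recommended: high-calorie food while overweight."
    if bmi >= 25:
        return "Suitable in moderation; watch portion size."
    return "Suitable."

assert "Not recommended" in matching_degree(1.70, 85, 450)  # BMI ≈ 29.4, calorie-dense food
assert matching_degree(1.75, 65, 450) == "Suitable."        # BMI ≈ 21.2
```

A real implementation would combine several such rules (medical history, taboos, sugar content for diabetics, and so on) into the displayed matching degree information.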
Step S308: and displaying the matching degree information on a camera shooting preview interface.
The matching degree information is directly displayed on a camera shooting preview interface, a user can acquire the matching degree information after finishing shooting the food to be recognized, and diet decision can be made according to the matching degree information.
In the embodiment, the user does not need to judge whether the food is suitable for eating or not according to the characteristics of the identified food, the matching degree information of the food and the user is directly given by combining the physiological information of the user, and the user obtains more intuitive eating guidance according to the matching degree information.
In one embodiment, as shown in fig. 4, after step S308, the method further includes:
step S402: and acquiring a trigger operation acted on the camera shooting preview interface.
Step S404: and acquiring matched detail information corresponding to the matching degree information according to the triggering operation.
The detail information includes details of the food itself and information such as the analysis process that produced the matching degree. For example: basic information of the food (name, identification characteristics, form, shape, color, preservation method and the like), nutritional ingredients, medicinal value, and eating methods. The matching analysis information is the data obtained by matching the food information against the physiological data input by the user.
In this embodiment, the detailed information corresponding to the matching degree information can be further acquired through the triggering operation, and the user can know the food and the matching condition of the food and the self state more specifically and completely through the detailed information.
In one embodiment, as shown in fig. 5, there is provided an apparatus for food identification, the apparatus comprising:
and a food identification module 510 for identifying food through a camera.
A health impact information obtaining module 520, configured to obtain health impact information on the user corresponding to the identified food.
And an information display module 530, configured to display the health impact information on the user on a camera preview interface.
In one embodiment, as shown in fig. 6, the apparatus further comprises:
the physiological data acquisition module 610 is configured to receive physiological data input by a user.
The health-related influence information obtaining module 520 is further configured to match the health-related influence information of the user corresponding to the identified food with the physiological data of the user, and obtain matching degree information.
The information displaying module 530 is further configured to display the matching degree information on a camera preview interface.
In one embodiment, the physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the requirement information is the requirement of the user for the purpose of perfecting the body of the user.
In one embodiment, as shown in fig. 7, the apparatus further comprises:
the food detail information display module 540 is used for acquiring the trigger operation acting on the camera shooting preview interface; and acquiring the detail information of the identified food and the matched detail information corresponding to the matching degree information according to the triggering operation.
In one embodiment, as shown in fig. 8, the food identification module 510 comprises:
and the image matching module 710 is used for matching the shot food image with the food image information prestored in the server to obtain a matching value.
And a matching image display module 720 for displaying the pre-stored food image with an image matching value not less than a set value.
The food image determining module 730 is used for detecting whether a confirmation instruction of the user, used for selecting one of the displayed food images, is acquired within a preset time; if so, the selected food image is confirmed as the identification result, and otherwise the food image with the maximum matching value is confirmed as the identification result.
It will be understood by those skilled in the art that all or part of the processes in the methods of the embodiments described above may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, for example in the storage medium of a computer system, and executed by at least one processor of the computer system to implement the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. A method of food identification, the method comprising:
identifying food through a camera;
acquiring health influence information on a user corresponding to the identified food;
displaying the health influence information on the user on a camera preview interface;
wherein the obtaining health impact information on the user corresponding to the identified food comprises: the terminal sends the acquired image information of the food to be identified to the server, and the server searches the image information of the food matched with the image of the food to be identified in the database and determines the food image matched with the food to be identified;
the image display window is used for displaying the shot images to be identified and the identified food images, and the text display window is used for displaying the health influence information of the user;
the identifying food through the camera comprises:
acquiring an image of food shot by a camera, and matching the shot image of the food with food image information prestored in a server to obtain a matching value;
displaying the pre-stored food image with the matching value not less than the set value;
and detecting whether a confirmation instruction of a user is acquired within a preset time, wherein the confirmation instruction is used for selecting the displayed food image, and if so, confirming the selected food image as a recognition result.
2. The method of claim 1, further comprising:
receiving physiological data input by a user;
obtaining health impact information on a user corresponding to the identified food, comprising:
matching the health influence information of the user corresponding to the identified food with the physiological data of the user to acquire matching degree information;
displaying the health influence information on the user on a camera preview interface, wherein the health influence information comprises:
and displaying the matching degree information on a camera shooting preview interface.
3. The method of claim 2, wherein the physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the requirement information is the requirement of the user for the purpose of improving the body form of the user.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
acquiring a trigger operation acting on the camera shooting preview interface;
and acquiring matched detail information corresponding to the matching degree information according to the triggering operation.
5. The method of claim 1, wherein the identifying the food item by the camera further comprises:
and if the confirmation instruction of the user is not obtained within the preset time, confirming the food image with the maximum matching value as the recognition result.
6. An apparatus for food identification, the apparatus comprising:
the food identification module is used for identifying food through the camera;
the health influence information acquisition module is used for acquiring health influence information on the user corresponding to the identified food;
the information display module is used for displaying the health influence information on the user on a camera shooting preview interface;
wherein the obtaining health impact information on the user corresponding to the identified food comprises: the terminal sends the acquired image information of the food to be identified to the server, and the server searches the image information of the food matched with the image of the food to be identified in the database and determines the food image matched with the food to be identified;
the image display window is used for displaying the shot images to be identified and the identified food images, and the text display window is used for displaying the health influence information of the user;
the food identification module includes:
the image matching module is used for matching the shot food image with food image information prestored in the server to obtain a matching value;
the matching image display module is used for displaying the pre-stored food image of which the image matching value is not less than a set value;
the food image determining module is used for detecting whether a confirmation instruction of a user is acquired within preset time, the confirmation instruction is used for selecting the displayed food image, and if the confirmation instruction is acquired, the selected food image is confirmed as a recognition result.
7. The apparatus of claim 6, further comprising:
the physiological data acquisition module is used for receiving physiological data input by a user;
the health influence information acquisition module is also used for matching the health influence information of the user corresponding to the identified food with the physiological data of the user to acquire matching degree information;
the information display module is also used for displaying the matching degree information on a camera shooting preview interface.
8. The apparatus of claim 7, wherein the physiological information input by the user comprises basic physiological attribute information, health condition information and requirement information, wherein the requirement information is the user's requirements aimed at improving his or her own physical condition.
9. The apparatus of claim 7 or 8, further comprising:
the food detail information display module is used for acquiring trigger operation acting on the camera shooting preview interface; and acquiring the detail information of the identified food and the matched detail information corresponding to the matching degree information according to the triggering operation.
10. The apparatus according to claim 6, wherein the food image determining module is further configured to determine the food image with the largest matching value as the recognition result if no confirmation instruction of the user is obtained within a preset time.
11. A storage medium on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of food identification according to any one of claims 1 to 5.
12. A terminal device comprising a storage medium, a processor and a computer program stored on the storage medium and executable on the processor, the processor implementing the method of food identification according to any one of claims 1 to 5 when executing the program.
CN201610474639.2A 2016-06-23 2016-06-23 Food identification method and device Active CN106203466B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610474639.2A CN106203466B (en) 2016-06-23 2016-06-23 Food identification method and device

Publications (2)

Publication Number Publication Date
CN106203466A CN106203466A (en) 2016-12-07
CN106203466B true CN106203466B (en) 2020-02-11

Family

ID=57461891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610474639.2A Active CN106203466B (en) 2016-06-23 2016-06-23 Food identification method and device

Country Status (1)

Country Link
CN (1) CN106203466B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106791392B (en) * 2016-12-20 2020-12-15 美的集团股份有限公司 Food information acquisition method and device and terminal
CN108205664B (en) * 2018-01-09 2021-08-17 美的集团股份有限公司 Food identification method and device, storage medium and computer equipment
CN110069652A (en) * 2018-08-30 2019-07-30 Oppo广东移动通信有限公司 Prompting method and device, storage medium and wearable device
CN109887574B (en) * 2019-01-31 2023-07-07 广州市格利网络技术有限公司 Food material identification-based diet therapy guiding method and device
CN110412124A (en) * 2019-07-30 2019-11-05 Oppo(重庆)智能科技有限公司 Method, apparatus, mobile terminal and the storage medium of substance detection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021207A (en) * 2014-06-18 2014-09-03 厦门美图之家科技有限公司 Food information providing method based on image
CN104537662A (en) * 2014-12-24 2015-04-22 百度在线网络技术(北京)有限公司 Method and device for providing photographed image
CN104809472A (en) * 2015-05-04 2015-07-29 哈尔滨理工大学 SVM-based food classifying and recognizing method
CN104899814A (en) * 2015-05-08 2015-09-09 努比亚技术有限公司 Method for intelligently reminding healthy diet and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8363913B2 (en) * 2008-09-05 2013-01-29 Purdue Research Foundation Dietary assessment system and method

Similar Documents

Publication Publication Date Title
CN106203466B (en) Food identification method and device
US10803315B2 (en) Electronic device and method for processing information associated with food
US9008390B2 (en) Similar case searching apparatus, relevance database generating apparatus, similar case searching method, and relevance database generating method
CN110580278B (en) Personalized search method, system, equipment and storage medium according to user portraits
CN114372215B (en) Search result display and search request processing method and device
US20220093234A1 (en) Systems, methods and devices for monitoring, evaluating and presenting health related information, including recommendations
AU2018370069A1 (en) Meal service management system and operating method therefor
US20170311053A1 (en) Identifying entities based on sensor data
JP2008242963A (en) Health analysis display method and health analysis display device
CN113537248B (en) Image recognition method and device, electronic equipment and storage medium
KR20200066278A (en) Electronic device and method for processing information associated with food
US20170061821A1 (en) Systems and methods for performing a food tracking service for tracking consumption of food items
CN109752496A (en) Food material detection method and device and storage device
CN107993695A (en) Methods of exhibiting, device, equipment and the storage medium of diagnosis and treatment data
CN110085319A (en) For providing the system and method for the instruction to individual health
US20190221134A1 (en) Meal Management System
KR20210075955A (en) Methods for management of nutrition and disease using food images
CN114556444A (en) Training method of combined model and object information processing method, device and system
CN111863194A (en) A display method, device, device and storage medium for dietary information
CN112419432A (en) Method and device for controlling food in refrigerator, electronic equipment and storage medium
JP2013029877A (en) Data management system, data management method and program
JP2015028767A (en) Recipe recommendation device
KR20190048922A (en) Smart table and controlling method thereof
JP2018049584A (en) Meal size estimation program, meal size estimation method, and meal size estimation apparatus
EP4325512A1 (en) Artificial intelligence-based meal monitoring method and apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240923

Address after: 430000, 14th floor, No. 181 Chunxiao Road, Wuhan Economic and Technological Development Zone, Hubei Province

Patentee after: Wuhan Xingji Meizu Technology Co.,Ltd.

Country or region after: China

Address before: 519085 Guangdong Zhuhai science and technology innovation coastal Meizu Technology Building

Patentee before: MEIZU TECHNOLOGY Co.,Ltd.

Country or region before: China