Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
The embodiments of the application apply to a scenario in which a user manages articles with an article storage device, where the article storage device may be, for example, a smart refrigerator or a smart freezer, and the user may store articles into the device and take articles out of it. In the embodiments of the application, when the user manages an article, the article storage device captures images of the storing or taking-out operation with a camera disposed on the device and, using the method of the embodiments, performs article management based on image recognition, and/or starts the bar code scanner on the device to manage the article based on bar code recognition. Specifically, the method pushes the identified article information to the user when the user stores an article and updates the currently stored article set after the user confirms the information; when the user takes an article out, the method judges whether the taking-out operation succeeded, pushes corresponding prompt information, and updates the currently stored article set according to the user's response.
Fig. 1a is an exemplary front-view structural diagram of an article storage device according to an embodiment of the present application, and Fig. 1b is an exemplary side-view structural diagram of the same device. As shown in Fig. 1a and Fig. 1b, taking a smart refrigerator as an example of the article storage device, a camera is disposed on the top of the smart refrigerator with its field of view facing downward. The bar code scanner is located on the cabinet body between the upper refrigerating chamber, the lower refrigerating chamber, and the temperature-changing chamber. Alternatively, the bar code scanner may be located on the outer side of the smart refrigerator, on a door body, or between door bodies. The application does not limit the installation position of the bar code scanner, provided the scanner is within the field of view of the camera. The camera collects images while the user manages an article, and the bar code scanner scans the bar code on the article to acquire the article's information.
For convenience of description, the following embodiments will be described by taking the article storage device as an example of a smart refrigerator. In addition, in the following embodiments, the "article storage device" will be simply referred to as "device".
Fig. 2 is a schematic flow chart of an article management method according to an embodiment of the present application, where an execution subject of the method is an article storage device. As shown in fig. 2, the method includes:
S201, acquiring an image sequence when a user manages the articles in the article storage device, wherein the images of the image sequence comprise hand image areas of the user and article image areas.
After the user opens a door body, the camera arranged at the top of the smart refrigerator collects multiple frames of images in real time while the user operates, and these frames form an image sequence. Each frame in the sequence contains at least the user's hand and the article the user holds, so each image includes at least a hand image area and an article image area.
S202, performing target tracking on the hand image area to determine the management action of the user on the article and whether the user performs a code scanning operation on the article, wherein the management action comprises storing the article into the article storage device or taking the article out of the article storage device.
Optionally, the image sequence acquired by the camera contains the movement information of the user's hand while the user deposits or takes out an article, so the device can determine, through target tracking of the hand image area in each image of the sequence, whether the user performed a depositing or taking-out operation and whether the user performed the code scanning operation.
The specific implementation of target tracking on the hand image area will be described in detail in the following embodiments.
S203, determining the actual type of the article according to whether the user performs the code scanning operation on the article.
By performing target tracking on the hand image area, the article storage device can automatically recognize whether the user performed the code scanning operation and, based on this information, automatically recognize the actual type of the article in the corresponding manner. For example, if the user performed the code scanning operation, the actual type of the article is identified by bar code recognition; if the user did not, the actual type is identified by image recognition.
The actual type of the article may include, for example, the brand, specification, and category. For example, the actual type of an article is 200 ml milk of brand A.
It should be noted that step S202 may be completed either before step S203 is performed or after step S203 has started. For example, once step S202 recognizes that the user performed the code scanning operation, step S203 may be started while step S202 continues in order to recognize the management action of the user.
S204, pushing prompt information and updating a current stored article set of the article storage device, wherein the prompt information is used for indicating the actual type of the article and the management action of the user.
The article storage device can display the articles in the currently stored article set to the user on demand, so that the user can know at any time whether certain articles need to be replenished.
In this step, after the management action of the user and the type of the article managed this time are identified, on the one hand the action and the article type are pushed to the user, so that the user knows the article storage device has recognized this operation; on the other hand the device updates the currently stored article set accordingly, keeping the set consistent with the articles actually stored in the device, that is, ensuring the accuracy of the currently stored article set. For example, when a user stores article A in the smart refrigerator, the refrigerator displays the prompt "You have deposited article A" on its display screen and adds article A to the currently stored article set; when the user takes article B out of the smart refrigerator, the refrigerator displays the prompt "You have taken out article B" and deletes article B from the currently stored article set.
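The add-on-deposit, delete-on-take-out bookkeeping described above can be sketched as follows. This is a minimal illustration; the class and method names are assumptions for the sketch, not part of the embodiment:

```python
# Hedged sketch of maintaining the currently stored article set:
# a deposit adds the identified article, a take-out removes it.
from collections import Counter

class StoredItemSet:
    def __init__(self):
        self._items = Counter()  # article type -> count currently stored

    def deposit(self, kind: str) -> str:
        # Add the article and return the prompt pushed to the user.
        self._items[kind] += 1
        return f"You have deposited {kind}"

    def take_out(self, kind: str) -> str:
        # Remove the article if present; otherwise report the mismatch.
        if self._items[kind] > 0:
            self._items[kind] -= 1
            if self._items[kind] == 0:
                del self._items[kind]
            return f"You have taken out {kind}"
        return f"{kind} is not in the stored item set"

    def contains(self, kind: str) -> bool:
        return self._items[kind] > 0
```

A user depositing article A and later taking it out would drive `deposit("article A")` followed by `take_out("article A")`, leaving the set empty again.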
In this embodiment, the article storage device collects images while the user manages an article and performs target tracking to automatically recognize the management action of the user and whether the user performed the code scanning operation; it then automatically identifies, by the corresponding method, the actual type of the managed article according to whether the code scanning operation was performed, pushes the information of this article management to the user, and updates the currently stored article set. Throughout this process the user completes article management and views the result without performing any additional operation, which greatly simplifies the operation process of article management and improves the user experience.
When managing an article, the user may or may not perform the code scanning operation. If the user performs the code scanning operation, the article storage device obtains the actual type of the article by bar code recognition; if not, the actual type is obtained by image recognition. In a first scenario, the user performs the code scanning operation when depositing an article but not when taking it out. In a second scenario, the user does not perform the code scanning operation when depositing an article but does when taking it out. In both scenarios the device uses different recognition modes when the article is deposited and when it is taken out, and the two modes may yield different results for the same food material; consequently, when the user takes an article out, the device may fail to match the identified article type against the currently stored article set while updating the set. In view of this problem, as an alternative embodiment, when performing step S204, before updating the currently stored article set, the device may first judge whether the operation result of the user's management action on the article is operation success or operation failure. For a depositing operation, operation success means the article's information can be successfully added to the currently stored article set. For a taking-out operation, operation success means the actual type of the identified article can be found in the currently stored article set, and operation failure means it cannot be found.
For the depositing operation, once the article type is identified it can be added to the currently stored article set, so the operation result of depositing can be regarded directly as operation success. For the taking-out operation, the operation result can be judged from the actual type of the article and the currently stored article set through the following article association inference flow. Specifically, if the currently stored article set includes an article matching the identified article, the operation is determined to be successful.
Fig. 3 is a schematic flow chart of an article management method according to an embodiment of the present application. As shown in Fig. 3, the association inference flow includes:
S301, querying whether an article of the same type as the identified actual type exists in the currently stored article set; if so, executing step S303, and if not, executing step S302.
For example, articles of the same type may mean that the brand, specification, and category are all the same. For instance, if the identified actual type is 200 ml milk of brand A and an article of the type 200 ml milk of brand A exists among the stored articles, the operation is judged successful, that is, the taking-out succeeds.
S302, querying whether the currently stored article set contains an article whose type differs from the identified actual type by less than a preset threshold; if so, executing step S303, and if not, executing step S304.
By way of example, the preset threshold may correspond to a difference in one of the brand, specification, and category.
The difference between a type in the stored article set and the identified actual type being smaller than the preset threshold may be, for example:
(1) The brands are the same, the specifications are different, and the categories are the same.
For example, if cola of brand A with a specification of 300 ml exists in the stored article set and the identified actual type is cola of brand A with a specification of 250 ml, the difference between the two is smaller than the preset threshold, and the operation is successful.
(2) The brands are different, the specifications are the same, and the categories are the same.
For example, if cola of brand A with a specification of 300 ml exists in the stored article set and the identified actual type is cola of brand B with a specification of 300 ml, the difference between the two is smaller than the preset threshold, and the operation is successful.
In another example, although article A and article B belong to very different categories, they are similar in appearance, so the difference in type between them can still be considered smaller than the preset threshold, and the operation is successful.
S303, determining that the operation is successful.
S304, determining that the operation fails.
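The association inference flow S301-S304 can be sketched as follows. This is a hedged illustration that models an article type as a (brand, specification, category) triple and interprets "difference smaller than the preset threshold" as at most one differing attribute; both are assumptions, only one possible reading of the embodiment:

```python
# Illustrative sketch of the association inference flow (S301-S304).
# An article type is modelled as a (brand, specification, category) tuple.

def judge_take_out(identified, stored_set):
    """Return (success, matched_item) for a taking-out operation."""
    # S301: exact match on brand, specification, and category.
    if identified in stored_set:
        return True, identified
    # S302: near match - at most one attribute differs (assumed threshold).
    for item in stored_set:
        differing = sum(a != b for a, b in zip(identified, item))
        if differing <= 1:
            return True, item  # S303: operation success
    # S304: no match found, the operation fails.
    return False, None
```

For example, with brand A 300 ml cola in the set, an identified brand A 250 ml cola matches at S302 (only the specification differs), so the take-out is judged successful against the 300 ml record.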
After the operation result of the user's management action on the article is judged to be operation success or operation failure through the above flow, the device can interact with the user based on the result and update the currently stored article set, thereby completing article management.
Fig. 4 is a flow chart of interacting with the user and updating the currently stored article set according to the operation result. As shown in Fig. 4, for an operation-success result of depositing or taking out, that is, a successful deposit or take-out, the user can be prompted to confirm the prompt information, and the set is updated after the user confirms. For an operation-failure result of taking out, that is, a failed take-out, the user can be prompted to correct the article type, and the set is updated after the user corrects it.
Optionally, if the operation result is operation success, the currently stored article set is updated according to the user's confirmation of the prompt information. If the operation result is operation failure, the actual type of the article is updated according to the user's correction of the prompt information, and the currently stored article set is updated according to the corrected actual type.
The user's correction of the prompt information may correct either the identified actual type or the corresponding record in the currently stored article set.
In one example, when a user deposits article A, the user performs the code scanning operation, and the device identifies the article as article A by bar code recognition and adds it to the stored article set. When the user takes article A out without performing the code scanning operation, the device identifies it as article B by image recognition; the user can then correct the identified article B to article A, and the device deletes article A from the currently stored article set according to the user's correction.
In another example, when a user deposits article A without performing the code scanning operation, the device identifies it as article B by image recognition and adds article B to the stored article set. When the user takes article A out, the device identifies it as article A by bar code recognition; the user can then correct the article B recorded in the currently stored article set to article A, after which the device deletes article A from the set.
When the article storage device interacts with the user, the interaction may take place through an interface of the device, through an application program interface on a terminal device bound to the article storage device, or through voice. The following description takes interface interaction on the article storage device as an example.
Fig. 5 is an exemplary diagram of the interface when a deposit succeeds. As shown in Fig. 5, the article storage device recognizes that the user has deposited article A; because the action is a deposit, the device directly determines the operation result as operation success, that is, a successful deposit. The device therefore displays the prompt "You have deposited article A, please confirm" on its screen; the user clicks the confirm button under the prompt, and upon receiving the user's confirmation the device adds article A to the currently stored article set.
Fig. 6 is an exemplary diagram of the interface when a take-out fails. As shown in Fig. 6, the article is article A, and the information recorded when it was deposited is "article A". When the user takes the article out, the device recognizes that the user has taken out article B; since no article B exists in the currently stored article set, the device displays the prompt "Did you take out article B?" together with an input box on its screen. The user enters the correction for the actual type of the taken-out article, namely "article A", in the input box and clicks the confirm button below, and the device deletes article A from the currently stored article set according to the correction input by the user.
The following describes the process in which the device determines, in step S203 above, the actual type of the article according to whether the user performed the code scanning operation on the article.
The actual type of the article may be determined in either of the following two modes.
In the first mode, the device determines a target recognition mode according to whether the user performed the code scanning operation, where the target recognition mode is either the bar code recognition mode or the image recognition mode. The actual type of the article is then identified using the target recognition mode.
In this mode, the recognition mode corresponding to whether the user performed the code scanning operation is selected before any recognition is carried out, which saves computation cost.
Specifically, if the target recognition mode is the image recognition mode, the device selects one frame of the image sequence and performs image recognition on the article image area of that frame to obtain the actual type of the article. If the target recognition mode is the bar code recognition mode, the bar code scanner performs bar code recognition, and the recognition result is taken as the actual type of the article.
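A minimal sketch of the first mode follows, assuming the two recognizers are injected as callables; these are stand-ins for illustration, not real scanner or camera drivers:

```python
# Hedged sketch of the first mode: pick the target recognition mode
# from whether a code scanning operation was detected, then recognize.

def recognize_kind(scan_performed, barcode_recognize, image_recognize, image_seq):
    if scan_performed:
        # Bar code recognition result is taken as the actual type.
        return barcode_recognize()
    # Otherwise, image-recognize the article area of one frame.
    return image_recognize(image_seq[0])
```

Only the selected recognizer runs, which is the computation saving the embodiment describes.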
Fig. 7 is a schematic flow chart of identifying the actual type of the article in the first mode when the user deposits the article. As shown in Fig. 7, the flow includes:
S701, identifying the code scanning operation of the user.
S702, judging whether the bar code scanner started successfully; if yes, executing step S703, otherwise executing step S704.
S703, setting the code scanning flag bit to 1.
S704, setting the code scanning flag bit to 0.
S705, recognizing that the management action of the user is depositing.
S706, judging whether the code scanning flag bit is 1; if yes, executing step S707, otherwise executing step S709.
S707, identifying the actual type of the deposited article by bar code recognition.
S708, setting the code scanning flag bit to 0.
After the actual type of the article is obtained, the code scanning flag bit is reset to 0 so that the next recognition process can proceed.
S709, identifying the actual type of the deposited article by image recognition.
S710, obtaining the actual type of the deposited article.
Fig. 8 is a schematic flow chart of identifying the actual type of the article in the first mode when the user takes out the article. As shown in Fig. 8, the flow includes:
S801, recognizing that the management action of the user is taking out.
S802, identifying the code scanning operation of the user.
S803, judging whether the bar code scanner started successfully; if yes, executing step S804, otherwise executing step S805.
S804, setting the code scanning flag bit to 1.
S805, setting the code scanning flag bit to 0.
S806, judging whether the code scanning flag bit is 1; if yes, executing step S807, otherwise executing step S809.
S807, identifying the actual type of the taken-out article by bar code recognition.
S808, setting the code scanning flag bit to 0.
After the actual type of the article is obtained, the code scanning flag bit is reset to 0 so that the next recognition process can proceed.
S809, identifying the actual type of the taken-out article by image recognition.
S810, obtaining the actual type of the taken-out article.
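The scan-flag logic shared by the flows of Fig. 7 and Fig. 8 can be sketched as follows; the class and method names are illustrative assumptions, while the flag semantics follow steps S702-S709 and S803-S809:

```python
# Hedged sketch of the code scanning flag bit: set to 1 when the bar
# code scanner starts successfully, checked to pick the recognition
# mode, and reset to 0 so the next recognition round starts clean.

class ScanFlagRecognizer:
    def __init__(self):
        self.scan_flag = 0

    def on_scan_operation(self, scanner_started: bool):
        # S702/S803: set the flag from the scanner start result.
        self.scan_flag = 1 if scanner_started else 0

    def recognize(self, barcode_result, image_result):
        if self.scan_flag == 1:
            self.scan_flag = 0     # S708/S808: reset for the next round
            return barcode_result  # S707/S807: bar code recognition
        return image_result        # S709/S809: image recognition
```

Resetting the flag inside `recognize` mirrors the note in the flows: once the actual type is obtained, the flag returns to 0 so a subsequent management action without scanning falls back to image recognition.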
In the second mode, the device first performs image recognition on the article image area of one frame of the image sequence to obtain an optional type of the article, and then determines the actual type of the article according to whether the user performed the code scanning operation and the optional type.
Specifically, if the user performed the code scanning operation, the optional type is updated according to the bar code recognition result, and the updated optional type is taken as the actual type of the article. If the user did not perform the code scanning operation, the optional type is taken as the actual type of the article.
In this mode, image recognition of the type starts as soon as the recognition process begins, which improves recognition speed and thus the user experience.
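The decision step of the second mode can be sketched as follows; the function and parameter names are assumptions for illustration:

```python
# Hedged sketch of the second mode: image recognition runs first and
# yields a provisional "optional type"; if a code scanning operation
# was performed, the bar code result overrides it.

def determine_actual_type(optional_type, scan_performed, barcode_result=None):
    if scan_performed and barcode_result is not None:
        return barcode_result  # S910/S1010: update with bar code result
    return optional_type       # S911/S1011: keep image-recognized type
```

Because the optional type is available before the scan result, the device can show a provisional answer early, which is the speed benefit the embodiment describes.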
Fig. 9 is a schematic flow chart of identifying the actual type of the article in the second mode when the user deposits the article. As shown in Fig. 9, the flow includes:
S901, identifying the code scanning operation of the user.
S902, identifying the optional type of the deposited article by image recognition.
S903, judging whether the bar code scanner started successfully; if yes, executing step S904, otherwise executing step S905.
S904, setting the code scanning flag bit to 1.
S905, setting the code scanning flag bit to 0.
S906, recognizing that the management action of the user is depositing.
S907, judging whether the code scanning flag bit is 1; if yes, executing step S908, otherwise executing step S911.
S908, performing bar code recognition.
S909, setting the code scanning flag bit to 0.
After the actual type of the article is obtained, the code scanning flag bit is reset to 0 so that the next recognition process can proceed.
S910, updating the optional type with the bar code recognition result to obtain the actual type.
S911, taking the optional type as the actual type.
S912, obtaining the actual type of the deposited article.
Fig. 10 is a schematic flow chart of identifying the actual type of the article in the second mode when the user takes out the article. As shown in Fig. 10, the flow includes:
S1001, recognizing that the management action of the user is taking out.
S1002, identifying the optional type of the taken-out article by image recognition.
S1003, identifying the code scanning operation of the user.
S1004, judging whether the bar code scanner started successfully; if yes, executing step S1005, otherwise executing step S1006.
S1005, setting the code scanning flag bit to 1.
S1006, setting the code scanning flag bit to 0.
S1007, judging whether the code scanning flag bit is 1; if yes, executing step S1008, otherwise executing step S1011.
S1008, performing bar code recognition.
S1009, setting the code scanning flag bit to 0.
After the actual type of the article is obtained, the code scanning flag bit is reset to 0 so that the next recognition process can proceed.
S1010, updating the optional type with the bar code recognition result to obtain the actual type.
S1011, taking the optional type as the actual type.
S1012, obtaining the actual type of the taken-out article.
In both of the above modes, when the type is identified by bar code recognition, the bar code on the article is scanned by the bar code scanner illustrated in Fig. 1a and Fig. 1b. The bar code scanner may be a charge-coupled device (CCD) scanner, a laser scanner, a light pen scanner, or the like. A CCD scanner illuminates the whole bar code with a fixed light beam (usually a flood light source of light-emitting diodes), projects the reflected bar code symbol onto a photosensitive element array, and recognizes the symbol through photoelectric conversion. Some CCD scanners can recognize not only one-dimensional bar codes and row-type two-dimensional bar codes but also matrix-type two-dimensional bar codes. A laser scanner uses laser light as its light source; it can scan from a long distance with high precision. A light pen scanner is a handheld contact-type bar code reader: the user touches the light pen to the surface of the bar code, and as the light spot emitted by the pen passes over the bar code from left to right, the light hitting the "space" parts is partly reflected while the light hitting the "bar" parts is partly absorbed. After photoelectric conversion, the electric signal is amplified and shaped and then fed to a decoder.
When the type is identified by image recognition, in one approach a feature extractor and a classifier can be used for feature extraction, classifier training, and test recognition. The feature extractor may use a histogram of oriented gradients (HOG), a local binary pattern (LBP), a deformable parts model (DPM), and the like. The classifier may use a support vector machine (SVM), AdaBoost, a decision tree, a Bayesian network, a neural network, etc. In another approach, deep learning methods can be used for training and recognition, for example Fast R-CNN, Faster R-CNN, YOLO, YOLO9000, and the like.
The following describes the process of performing target tracking on the hand image area in step S202 above to obtain the management action of the user on the article and whether the user performed the code scanning operation on the article.
Specifically, the device performs target tracking on the hand image area to obtain the hand movement trajectory, and from the trajectory derives the management action of the user and whether the user performed the code scanning operation.
Fig. 11 is a schematic diagram of the process of obtaining the management action and whether the user performed the code scanning operation through target tracking. As shown in Fig. 11, the device first identifies the hand image area in the image. Second, it performs target tracking on that area and, from the hand tracking trajectory, identifies three sub-actions: go, return, and stay. The management action of the user can then be identified from the go and return sub-actions, and whether the user performed the code scanning operation can be identified from the stay sub-action.
When identifying the hand image area in an image, an image area containing both the hand and the article can be detected. The hand area may be detected with the feature-extractor-and-classifier method described above, or with a deep-learning-based method.
When performing target tracking, methods such as Kalman filtering or particle filtering can be applied to the hand image area, and the three sub-actions of go, return, and stay are identified from the tracking trajectory. Fig. 12a is an example diagram of recognizing the go sub-action, Fig. 12b of recognizing the return sub-action, and Fig. 12c of recognizing the stay sub-action. As shown in Fig. 12a, when the tracking trajectory runs from the refrigerator door direction toward the refrigerator body direction and crosses the dividing line between the body and the door, the sub-action is go and the management action is depositing. As shown in Fig. 12b, when the trajectory runs from the refrigerator body direction toward the refrigerator door direction and crosses the dividing line, the sub-action is return and the management action is taking out. As shown in Fig. 12c, when the dwell time of the trajectory in the code scanning stay area exceeds a preset duration, the sub-action is determined to be stay, and it is further judged that the user performed the code scanning operation.
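The sub-action rules above can be sketched as follows, under simplifying assumptions about coordinates that are not specified in the embodiment: each trajectory point is an (x, y, t) sample, y grows from the door toward the cabinet interior, `divide_y` is the body/door dividing line, and the code scanning stay area is an axis-aligned rectangle:

```python
# Hedged sketch of classifying the go/return/stay sub-actions from a
# tracked hand trajectory; geometry and thresholds are assumptions.

def classify_sub_actions(track, divide_y, scan_region, dwell_s=1.0):
    actions = []
    # "go": crosses the dividing line from the door side to the body side.
    if track[0][1] < divide_y <= track[-1][1]:
        actions.append("go")      # management action: depositing
    # "return": crosses from the body side back to the door side.
    if track[0][1] >= divide_y > track[-1][1]:
        actions.append("return")  # management action: taking out
    # "stay": dwells in the code scanning area beyond the threshold.
    in_region = [(x, y, t) for x, y, t in track
                 if scan_region[0] <= x <= scan_region[1]
                 and scan_region[2] <= y <= scan_region[3]]
    if in_region and in_region[-1][2] - in_region[0][2] >= dwell_s:
        actions.append("stay")    # user performed the code scanning operation
    return actions
```

In practice the trajectory points would come from a Kalman or particle filter tracker; here they are passed in directly to keep the rule logic visible.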
Fig. 13 is a block diagram of an article management apparatus according to an embodiment of the present application. As shown in Fig. 13, the apparatus includes:
The processing module 1301 is configured to: collect an image sequence while a user manages an article in an article storage device, where the images of the sequence include a hand image area of the user and an article image area; perform target tracking on the hand image area to obtain the management action of the user on the article and whether the user performs a code scanning operation on the article, where the management action includes storing the article into the article storage device or taking the article out of the article storage device; and determine the actual type of the article according to whether the user performs the code scanning operation on the article.
A pushing module 1302, configured to push a prompt message and update a currently stored item set of the item storage device, where the prompt message is used to indicate an actual kind of the item and a management behavior of the user.
As an alternative embodiment, the processing module 1301 is specifically configured to:
Determine a target recognition mode according to whether the user performs the code scanning operation, where the target recognition mode includes the bar code recognition mode or the image recognition mode, and identify the actual type of the article using the target recognition mode.
As an alternative embodiment, the processing module 1301 is specifically configured to:
If the target recognition mode is the image recognition mode, perform image recognition on the article image area of one frame of the image sequence to obtain the actual type of the article.
As an alternative embodiment, the processing module 1301 is specifically configured to:
If the target identification mode is the barcode identification mode, take the barcode identification result as the actual type of the article.
As an alternative embodiment, the processing module 1301 is specifically configured to:
Perform image recognition on the article image area of one frame of the image sequence to obtain the selectable types of the article; and determine the actual type of the article according to whether the user performs the code scanning operation and the selectable types of the article.
As an alternative embodiment, the processing module 1301 is specifically configured to:
If the user performs the code scanning operation, update the selectable types according to the barcode identification result, and take the updated selectable type as the actual type of the article.
As an alternative embodiment, the processing module 1301 is specifically configured to:
If the user does not perform the code scanning operation, take the selectable type of the article as the actual type of the article.
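The identification logic in the alternative embodiments above can be sketched as follows. The sketch assumes hypothetical recognizer interfaces: `image_recognize` stands in for the image identification mode (returning selectable candidate types, best first) and `scan_barcode` for the barcode identification mode; neither interface is specified by the embodiments themselves.

```python
def determine_actual_type(scanned, frame, image_recognize, scan_barcode):
    """Return the actual type of the article.

    scanned: True if the user performed the code scanning operation.
    frame: one frame of the image sequence containing the article image area.
    image_recognize(frame) -> list of selectable (candidate) types,
        ordered best first (hypothetical interface).
    scan_barcode() -> type string decoded from the barcode
        (hypothetical interface).
    """
    if scanned:
        # Barcode identification mode: the barcode result determines
        # (updates) the selectable types, so it is taken as the actual type.
        return scan_barcode()
    # Image identification mode: take the selectable type obtained from
    # the article image area as the actual type.
    candidates = image_recognize(frame)
    return candidates[0] if candidates else None
```

For example, with candidates `["milk", "yogurt"]`, the actual type is the barcode result when a scan occurred and `"milk"` otherwise.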
As an alternative embodiment, the pushing module 1302 is specifically configured to:
If the operation result of the management action of the user on the article is judged to be successful, update the currently stored article set according to the user's confirmation of the prompt information.
As an alternative embodiment, the pushing module 1302 is specifically configured to:
If the operation result of the management action of the user on the article is judged to be a failure, update the actual type of the article according to the user's correction of the prompt information, and update the currently stored article set according to the updated actual type of the article.
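The update logic of the two alternative embodiments above can be sketched as follows. This is a simplified illustration: the prompt-and-response interaction is reduced to plain values, and representing the stored article set as a set of type strings is an assumption for the sketch.

```python
def update_stored_set(stored, action, actual_type, success, user_response):
    """Update the currently stored article set in place and return it.

    stored: set of article types currently in the storage device.
    action: 'store' or 'take_out' (the management action).
    actual_type: recognized actual type of the article.
    success: True if the operation result was judged successful.
    user_response: on success, True means the user confirmed the prompt
        information; on failure, the user's correction (the corrected
        article type) or None if no correction was given.
    """
    if success:
        if user_response:  # user confirmed the prompt information
            if action == "store":
                stored.add(actual_type)
            elif action == "take_out":
                stored.discard(actual_type)
    elif user_response is not None:
        # Correction information updates the actual type before the
        # stored set is updated.
        actual_type = user_response
        if action == "store":
            stored.add(actual_type)
        elif action == "take_out":
            stored.discard(actual_type)
    return stored
```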
As an alternative embodiment, the processing module 1301 is further configured to:
Judge the operation result of the management action of the user on the article according to the actual type of the article and the currently stored article set.
As an alternative embodiment, the processing module 1301 is specifically configured to:
If the management action is taking the article out of the article storage device and the currently stored article set includes an article matching the article, determine that the operation result of the management action of the user on the article is successful.
The articles matching the article include articles of the same kind as the actual kind, or articles whose kind differs from the actual kind by less than a preset threshold.
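The matching criterion above can be sketched as follows. The embodiments do not specify how the difference between two kinds is measured, so `kind_distance` below is a purely illustrative placeholder metric (character-level difference between kind labels), and the threshold value is likewise an assumption.

```python
def kind_distance(a, b):
    """Illustrative placeholder metric: number of differing character
    positions between two kind labels, plus the length difference."""
    return sum(c1 != c2 for c1, c2 in zip(a, b)) + abs(len(a) - len(b))

def take_out_succeeded(stored, actual_kind, threshold=2):
    """Judge the take-out operation result: success if the currently
    stored set includes a matching article, i.e. one of the same kind
    or one whose kind differs from the actual kind by less than the
    preset threshold."""
    return any(kind == actual_kind or kind_distance(kind, actual_kind) < threshold
               for kind in stored)
```

The near-match branch allows the judgment to tolerate small recognition discrepancies between the stored record and the newly recognized kind.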
As an alternative embodiment, the processing module 1301 is specifically configured to:
Obtain the management action of the user and whether the user performs the code scanning operation according to the hand movement track.
The article management device provided by the embodiment of the present application may perform the method steps in the above method embodiment, and its implementation principle and technical effects are similar, and are not described herein again.
It should be noted that the division of the modules of the above apparatus is merely a division by logical function; in actual implementation, the modules may be fully or partially integrated into one physical entity or may be physically separated. The modules may all be implemented in the form of software invoked by a processing element, or all in the form of hardware; alternatively, some modules may be implemented in software invoked by a processing element and others in hardware. For example, the determining module may be a separately disposed processing element, may be integrated into a chip of the above apparatus, or may be stored in a memory of the above apparatus in the form of program code to be invoked by a processing element of the above apparatus to execute the functions of the determining module. The implementation of the other modules is similar. In addition, all or part of the modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit having signal processing capability. In implementation, each step of the above method or each of the above modules may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented by a processing element scheduling program code, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can invoke the program code. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid-state disk (SSD)).
The embodiment of the application also provides an intelligent refrigerator, which comprises:
a camera, located at the top of the intelligent refrigerator and configured to collect images when a user manages articles in the intelligent refrigerator; a barcode scanner, located within the field of view of the camera; and a processor, configured to: identify, according to the images collected by the camera, the article management action of the user and whether the user performs a code scanning operation; according to whether the user performs the code scanning operation, identify the actual type of the article based on an image identification mode, and/or start the barcode scanner to identify the actual type of the article in a barcode identification mode; and push prompt information and update a currently stored article set, where the prompt information is used to indicate the actual type of the article and the management action of the user.
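Putting the pieces together, one management cycle of the intelligent refrigerator could be orchestrated roughly as below. Every component here is an assumed stand-in: the camera capture, the action/scan recognizer, the type identifier, and the prompt interaction are passed in as hypothetical callables, since the embodiments describe their roles but not their interfaces.

```python
def manage_article(capture_sequence, recognize_action_and_scan,
                   identify_type, push_prompt, stored):
    """One management cycle: capture -> recognize -> identify -> prompt -> update.

    capture_sequence() -> list of frames from the top camera.
    recognize_action_and_scan(frames) -> ('store' | 'take_out', scanned_flag).
    identify_type(frames, scanned) -> actual article type, using the
        barcode and/or image identification mode.
    push_prompt(actual_type, action) -> True if the user confirms the
        prompt information.
    stored: currently stored article set (set of type strings).
    """
    frames = capture_sequence()
    action, scanned = recognize_action_and_scan(frames)
    actual_type = identify_type(frames, scanned)
    confirmed = push_prompt(actual_type, action)  # prompt indicates type and action
    if confirmed:
        if action == "store":
            stored.add(actual_type)
        elif action == "take_out":
            stored.discard(actual_type)
    return stored
```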
Fig. 14 is a schematic structural diagram of an article storage device 1400 according to an embodiment of the present application. As shown in fig. 14, the article storage device may include a processor 141, a memory 142, a communication interface 143, a system bus 144, a camera 145 and a barcode scanner 146, where the memory 142, the communication interface 143, the camera 145 and the barcode scanner 146 are connected to the processor 141 through the system bus 144 and communicate with one another. The memory 142 is used to store computer-executable instructions, the communication interface 143 is used to communicate with other devices, and the camera 145 is used to capture images when a user manages articles and send them to the processor 141. The barcode scanner 146 is used to scan a barcode on an article to obtain the actual type of the article and send it to the processor 141. When executing the computer-executable instructions, the processor 141 implements the methods of the embodiments shown in fig. 2 to fig. 12 described above.
The system bus referred to in fig. 14 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or one type of bus. The communication interface is used to enable communication between the apparatus and other devices (e.g., clients, read-write libraries, and read-only libraries). The memory may include a random access memory (RAM) and may also include a non-volatile memory, such as at least one disk memory.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
Optionally, an embodiment of the present application further provides a storage medium, where instructions are stored in the storage medium, and when the instructions are run on a computer, the computer is caused to perform the method of the embodiments shown in fig. 2 to fig. 12.
Optionally, an embodiment of the present application further provides a chip for executing instructions, where the chip is configured to perform the method of the embodiments shown in fig. 2 to fig. 12.
An embodiment of the present application further provides a program product, where the program product includes a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, the method of the embodiments shown in fig. 2 to fig. 12 is implemented.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it; in a formula, the character "/" indicates a "division" relationship between the associated objects before and after it. "At least one of the following" or a similar expression means any combination of these items, including any combination of a single item or a plurality of items. For example, "at least one of a, b, or c" may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b and c may each be singular or plural.
It will be appreciated that the various numbers referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiments of the present application, the sequence numbers of the processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the above embodiments, those skilled in the art should understand that the technical solutions described in the above embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.