Disclosure of Invention
The technical problem to be solved by the invention is how to design a technical scheme for a NAS device that can process very large-scale models or data sets without being limited by the local hardware configuration, and that can realize data security and privacy protection in a multi-user access environment at low cost.
In a first aspect, an embodiment of the present invention provides an AI processing method for a NAS device. The method is used in a NAS control unit in the NAS device, where the NAS control unit implements docking with an AI model by communicatively connecting with an AI processing module. The method includes: S1, invoking a user operation layer of a first form, generating a first AI requirement, and sending the first AI requirement to the AI model; S2, receiving the first AI requirement by the AI model, converting the first AI requirement into an AI instruction, and feeding back the AI instruction to the user operation layer of the first form; S3, converting the user operation layer of the first form into a user operation layer of a second form based on the AI instruction; and S4, invoking the user operation layer of the second form by the NAS control unit as an AI processing result.
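The following is a minimal, non-limiting sketch of how the four steps S1-S4 might be orchestrated in software. The class and method names (NASControlUnit, AIModel, UserOperationLayer) and the concrete field values are illustrative assumptions, not part of the claimed scheme.

```python
# Minimal sketch of the S1-S4 flow; all names and values are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class UserOperationLayer:
    """A user operation layer; 'form' distinguishes the first and second forms."""
    form: str
    config: dict = field(default_factory=dict)

    def generate_requirement(self) -> dict:
        # S1: the first-form layer produces the first AI requirement.
        return {"params": {"task": "adjust_file_position"},
                "scene_id": "file_management",
                "permissions": {"user": "alice", "level": "read_write"}}

    def apply_instruction(self, instruction: dict) -> "UserOperationLayer":
        # S3: convert this first-form layer into a second-form layer.
        return UserOperationLayer(form="second", config=instruction.get("target_state", {}))


class AIModel:
    def to_instruction(self, requirement: dict) -> dict:
        # S2: parse the requirement and return a standardized AI instruction.
        return {"op_params": requirement["params"],
                "target_state": {"layout": "auto_arranged"}}


class NASControlUnit:
    def __init__(self, ai_model: AIModel):
        self.ai_model = ai_model

    def process(self) -> dict:
        layer_1 = UserOperationLayer(form="first")                # S1: invoke first form
        requirement = layer_1.generate_requirement()              # S1: first AI requirement
        instruction = self.ai_model.to_instruction(requirement)   # S2: AI instruction
        layer_2 = layer_1.apply_instruction(instruction)          # S3: second form
        return {"form": layer_2.form, "config": layer_2.config}   # S4: AI processing result


result = NASControlUnit(AIModel()).process()
```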
In the above scheme, the user operation layer of the first form is converted into the user operation layer of the second form based on the AI instruction, and can automatically adjust the file position or automatically adjust the picture information; the user operation layer can issue a series of instructions, such as AI content generation, AI dialogue, AI knowledge base, AI document management, AI video entertainment, and the like. Further, the AI processing module has AI capability, which is defined as the capability of automatically operating and generating content, and further receiving a certain degree of intelligent reply, by means of module installation, local software deployment, or external requests through a network interface.
The NAS control unit is further internally provided with a first module group comprising a NAS manufacturer intelligent module A, a third-party intelligent module B, and a self-developed intelligent module C, and is further internally provided with a second module group comprising a container virtual intelligent module D, a manufacturer-integrated intelligent module E, and a non-manufacturer intelligent module F. In S1, the NAS control unit invokes the user operation layer of the first form to generate the first AI requirement and sends the first AI requirement to the AI model; specifically, the NAS control unit invokes the user operation layer of the first form to generate the first AI requirement corresponding to the user operation layer, packages the first AI requirement into a standardized data packet through a preset interface and protocol, and sends the requirement parameters, the application scene identifier, and the user permission information in the data packet to the AI model.
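An illustrative sketch of packaging the first AI requirement into a standardized data packet follows. The field names and the use of JSON serialization are assumptions made for clarity; the patent does not fix a packet format.

```python
# Illustrative sketch: packaging the first AI requirement into a standardized
# data packet. The field names and the use of JSON are assumptions for clarity.
import json
import time
import uuid


def package_requirement(params: dict, scene_id: str, permissions: dict) -> bytes:
    """Wrap the requirement parameters, application scene identifier and user
    permission information into one standardized packet sent to the AI model."""
    packet = {
        "packet_id": str(uuid.uuid4()),
        "timestamp": time.time(),
        "requirement_params": params,      # e.g. {"task": "adjust_file_position"}
        "scene_id": scene_id,              # application scene identifier
        "user_permissions": permissions,   # user permission information
    }
    return json.dumps(packet).encode("utf-8")


packet = package_requirement({"task": "adjust_picture_info"},
                             "album", {"user": "bob", "level": "read_only"})
```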
The NAS control unit generates a module starting response according to the requirement parameters, the application scene identifier, and the user permission information, and starts the first module group and/or the second module group.
The method further comprises: the NAS control unit generates a module starting response and starts the first module group; if the module starting response relates to the NAS manufacturer intelligent module A, the NAS manufacturer intelligent module A directly invokes the user operation layer through an API integrated in the NAS system and sends the requirement to the AI model; if the module starting response relates to the third-party intelligent module B, the third-party intelligent module B encrypts the first AI requirement through a third-party interface compatible with the NAS system and sends the first AI requirement to the AI model; and if the module starting response relates to the self-developed intelligent module C, the self-developed intelligent module C sends the first AI requirement to the AI model in a specific format through a custom script and interface.
The method further comprises: the NAS control unit generates a module starting response and starts the second module group; if the module starting response relates to the container virtual intelligent module D, the container virtual intelligent module D runs in a container environment and sends the first AI requirement to the AI model through a network interface inside the container; if the module starting response relates to the manufacturer-integrated intelligent module E, the manufacturer-integrated intelligent module E sends the first AI requirement to the AI model by script invocation through a tool script developed by the manufacturer; and if the module starting response relates to the non-manufacturer intelligent module F, the non-manufacturer intelligent module F sends the first AI requirement to the AI model through a tool script developed by a third party over a preset protocol.
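The routing of a module starting response to modules A-F could be expressed as a simple dispatch table, as in the sketch below. The dispatch keys and handler names are illustrative assumptions, and the six handlers are placeholder stubs standing in for the channels described above.

```python
# Sketch of how a module starting response might be routed to modules A-F.
# The dispatch keys and handler names are illustrative assumptions; the
# handlers are placeholder stubs for the channels described in the text.
from typing import Callable, Dict


def send_via_vendor_api(req: bytes) -> None: ...       # module A: API integrated in the NAS system
def send_encrypted_third_party(req: bytes) -> None: ...  # module B: compatible third-party interface
def send_via_custom_script(req: bytes) -> None: ...    # module C: self-developed script and interface
def send_from_container(req: bytes) -> None: ...       # module D: in-container network interface
def send_via_vendor_script(req: bytes) -> None: ...    # module E: manufacturer tool script
def send_via_external_script(req: bytes) -> None: ...  # module F: third-party tool script, preset protocol

DISPATCH: Dict[str, Callable[[bytes], None]] = {
    "A": send_via_vendor_api, "B": send_encrypted_third_party, "C": send_via_custom_script,
    "D": send_from_container, "E": send_via_vendor_script, "F": send_via_external_script,
}


def start_modules(start_response: dict, requirement: bytes) -> None:
    """Start every module named in the module starting response and forward
    the first AI requirement through that module's own channel."""
    for module_id in start_response.get("modules", []):
        DISPATCH[module_id](requirement)


start_modules({"modules": ["A", "D"]}, b'{"requirement_params": {}}')
```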
In S2, the AI model receives the first AI requirement, converts the first AI requirement into an AI instruction, and feeds the AI instruction back to the user operation layer of the first form. Specifically, the AI model receives the first AI requirement sent by the NAS control unit, parses the first AI requirement through a preset parsing module, and extracts the requirement parameters, the application scene identifier, and the user permission information in the first AI requirement; according to the extracted requirement parameters and in combination with a preset AI algorithm library, it matches a corresponding AI algorithm and generates a corresponding AI instruction; and it feeds the generated AI instruction back to the user operation layer of the first form in a standardized instruction format through a preset feedback channel, where the feedback channel comprises a network communication interface and a local communication interface.
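A hedged sketch of S2 follows: parse the packet, match an algorithm from a preset library, and emit a standardized AI instruction. The library contents and the matching rule (keyed by task name) are assumptions, not the claimed method.

```python
# Hedged sketch of S2: parse the first AI requirement, match an algorithm from
# a preset library, and emit a standardized AI instruction.
import json

ALGORITHM_LIBRARY = {
    "adjust_file_position": "layout_optimizer_v1",
    "adjust_picture_info": "image_metadata_editor_v1",
}


def requirement_to_instruction(packet: bytes) -> dict:
    req = json.loads(packet)                       # parse the first AI requirement
    params = req["requirement_params"]
    algorithm = ALGORITHM_LIBRARY.get(params.get("task"), "default_responder")
    return {                                       # standardized instruction format
        "algorithm": algorithm,
        "op_params": params,
        "target_state": {"scene": req["scene_id"], "status": "converted"},
        "permitted": req["user_permissions"].get("level") != "none",
    }
```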
In S3, the user operation layer of the first form is converted into the user operation layer of the second form based on the AI instruction. After receiving the AI instruction fed back by the AI model, the user operation layer of the first form parses the AI instruction through a preset instruction parsing module and extracts the operation parameters and the target state in the instruction; according to the extracted operation parameters, the user operation layer of the first form invokes a preset scene conversion module and gradually adjusts its configuration and state according to a preset conversion rule; during the conversion, the user operation layer of the first form monitors the conversion state in real time, and when the conversion is completed and a preset second-form condition is met, the layer is confirmed as the user operation layer of the second form, where the second-form condition comprises preset configuration parameters and a state identifier.
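The stepwise conversion with real-time state monitoring could look like the sketch below. The conversion rules, the fields of the second-form condition, and the step granularity are illustrative assumptions.

```python
# Sketch of S3: stepwise conversion of the first-form layer with a state check
# against the second-form condition after each step.
def convert_layer(instruction: dict, second_form_condition: dict) -> dict:
    layer_state = {"form": "first", "config": {}, "state_id": "idle"}
    conversion_rules = [                       # preset conversion rule, applied step by step
        ("config", instruction["op_params"]),
        ("config", instruction["target_state"]),
        ("state_id", second_form_condition["state_id"]),
    ]
    for key, value in conversion_rules:
        if key == "config":
            layer_state["config"].update(value)
        else:
            layer_state[key] = value
        # real-time monitoring: check the second-form condition after each step
        if (layer_state["state_id"] == second_form_condition["state_id"]
                and second_form_condition["required_keys"] <= layer_state["config"].keys()):
            layer_state["form"] = "second"     # conversion confirmed
            break
    return layer_state


condition = {"state_id": "ready", "required_keys": {"scene", "status"}}
```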
A further technical scheme includes: in S4, the NAS control unit invokes the user operation layer of the second form as the AI processing result. Specifically, the NAS control unit detects, through a preset scene detection module, that the user operation layer of the second form has been successfully converted and is in a stable state; the NAS control unit then invokes the user operation layer of the second form, obtains the current state information and processing result data of the user operation layer through a preset interface, encapsulates the obtained processing result data to generate the AI processing result, and sends the AI processing result to a preset user terminal or storage system through a preset output interface, where the output interface comprises a network interface and a local storage interface.
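A sketch of S4 follows: confirm the second-form layer is stable, encapsulate its data as the AI processing result, and return the result for output. The stability check (two identical consecutive reads) and the output targets are assumptions for illustration.

```python
# Sketch of S4: stability check, encapsulation and handoff of the AI processing result.
import json
import time


def is_stable(read_state, attempts: int = 2, interval: float = 0.1) -> bool:
    """Treat the layer as stable if consecutive reads of its state are identical."""
    snapshots = []
    for _ in range(attempts):
        snapshots.append(read_state())
        time.sleep(interval)
    return all(s == snapshots[0] for s in snapshots)


def collect_result(layer_state: dict, read_state=None) -> bytes:
    """Encapsulate the second-form layer's state and data as the AI processing
    result; the caller would then write it to the preset output interface
    (a network socket or a local storage path)."""
    read_state = read_state or (lambda: layer_state)
    if layer_state["form"] != "second" or not is_stable(read_state):
        raise RuntimeError("second-form user operation layer is not ready")
    result = {"state": layer_state["state_id"],   # current state information
              "data": layer_state["config"]}      # processing result data
    return json.dumps({"ai_processing_result": result}).encode("utf-8")
```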
In a second aspect, the present invention provides an AI processing method for a NAS device, and a NAS system that, using the NAS device as a data storage core, incorporates an AI processing module external or internal to the NAS. The AI processing module has AI capability, which is defined as the capability of automatically operating and generating content, and further receiving a certain degree of intelligent reply, by means of module installation, local software deployment, or external requests through a network interface. Through a modularized design, the framework comprises a NAS storage management module, a data preprocessing module, and an AI processing module, and seamless cooperation is realized among the modules through standardized interfaces. The system supports efficient model training and inference, has the characteristics of expansibility, low latency, and intelligence, and is suitable for various application scenes.
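One way the "standardized interfaces" between the three modules could be expressed is sketched below. The method names and signatures are assumptions used only to show how the modules might cooperate; they are not prescribed by the invention.

```python
# Hedged sketch of standardized interfaces between the storage management,
# data preprocessing and AI processing modules. Names are assumptions.
from typing import Protocol, Iterable


class StorageManagement(Protocol):
    def read(self, path: str) -> bytes: ...
    def write(self, path: str, data: bytes, user: str) -> None: ...


class DataPreprocessing(Protocol):
    def preprocess(self, raw: bytes) -> bytes: ...    # format conversion, denoising, enhancement


class AIProcessing(Protocol):
    def train(self, samples: Iterable[bytes]) -> None: ...
    def infer(self, sample: bytes) -> dict: ...


def run_inference(storage: StorageManagement, pre: DataPreprocessing,
                  ai: AIProcessing, path: str) -> dict:
    """One end-to-end pass: read from NAS storage, preprocess, then infer."""
    return ai.infer(pre.preprocess(storage.read(path)))
```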
The functions of each module of the invention include: a NAS storage management module, which is responsible for storing the original data and intermediate results and provides multi-user data access permission management, version control, and backup functions; a data preprocessing module, which is connected with the AI processing module through a protocol and is used for preprocessing the stored data, such as format conversion, denoising, and data enhancement; and an AI processing module, which is deployed on a local server or in the cloud and is responsible for training and inference of the model. The system communicates with the NAS device in a wireless or wired manner through a high-speed network to ensure efficient data transmission, and each module function supports distributed computing and edge computing architectures to adapt to task demands of different scales.
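An illustrative sketch of a data preprocessing module for image data stored on the NAS follows: format conversion, denoising, and data enhancement. The use of NumPy and these particular operations are assumptions; the patent does not fix the preprocessing steps.

```python
# Illustrative preprocessing sketch: format conversion, denoising, enhancement.
import numpy as np


def to_float_gray(image: np.ndarray) -> np.ndarray:
    """Format conversion: 8-bit RGB array -> single-channel float in [0, 1]."""
    return image.astype(np.float32).mean(axis=-1) / 255.0


def denoise(image: np.ndarray, kernel: int = 3) -> np.ndarray:
    """Simple mean-filter denoising with a sliding window."""
    pad = kernel // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = padded[i:i + kernel, j:j + kernel].mean()
    return out


def augment(image: np.ndarray) -> list:
    """Data enhancement: horizontal flip and a small brightness shift."""
    return [image, np.fliplr(image), np.clip(image + 0.1, 0.0, 1.0)]


batch = augment(denoise(to_float_gray(np.random.randint(0, 256, (32, 32, 3)))))
```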
The NAS system comprises a NAS device and a computer device that are connected with each other, and is characterized in that:
First, efficient storage-computation separation: independent expansion of data and computing resources is realized by decoupling the storage and computation modules.
Second, modular design: system components are interconnected through standardized interfaces, and functional modules can be flexibly replaced or expanded.
Third, low-latency communication: network communication protocols are used to ensure data transfer efficiency between storage and computation.
Fourth, intelligent task scheduling: computing resources and file positions are dynamically allocated according to file types and resource utilization through an AI scheduling system (an illustrative sketch follows this list).
Fifth, cross-platform compatibility: multiple operating systems and development frameworks are supported, so users can deploy in multiple environments.
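The sketch below illustrates the intelligent task scheduling item: a scheduler that places work according to file type and resource utilization. The scoring rule is a hand-written stand-in for whatever model the AI scheduling system actually uses; node names and the file-type table are assumptions.

```python
# Minimal sketch of AI-assisted scheduling by file type and resource utilization.
from dataclasses import dataclass
from typing import List


@dataclass
class Node:
    name: str
    cpu_utilization: float   # 0.0 - 1.0
    has_gpu: bool


FILE_TYPE_NEEDS_GPU = {".mp4": True, ".jpg": True, ".txt": False, ".csv": False}


def schedule(file_ext: str, nodes: List[Node]) -> Node:
    """Pick the least-utilized node that satisfies the file type's needs."""
    needs_gpu = FILE_TYPE_NEEDS_GPU.get(file_ext, False)
    candidates = [n for n in nodes if n.has_gpu or not needs_gpu]
    return min(candidates, key=lambda n: n.cpu_utilization)


nodes = [Node("nas-local", 0.35, False), Node("edge-gpu", 0.60, True)]
target = schedule(".mp4", nodes)   # -> edge-gpu, since video analysis needs a GPU here
```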
The invention mainly aims to integrate the NAS system and AI computation efficiently through a NAS with an internal or external AI architecture. The architecture not only makes full use of the storage capacity of the NAS, but also provides users with a certain degree of data analysis and intelligent decision support, while avoiding the risk of inconsistent output caused by multiple users operating the same AI; in addition, it gives a traditional NAS system (without AI) a certain AI analysis and output capability.
In summary, by combining the efficient storage capability of the NAS and the intelligent computing capability of the AI module, the scheme of the invention provides a flexible, extensible, and efficient intelligent service scheme and offers users personalized and diversified AI-side services. The NAS data-side services (such as automatically adjusting file positions or automatically adjusting picture information) have AI dialogue capability, can conveniently answer users' questions, and can help users manage the NAS better, so the scheme has wide market prospects and application value.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to one or any and all possible combinations of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "once", "in response to a determination", or "in response to detection", depending on the context. Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting the [described condition or event]", or "in response to detecting the [described condition or event]".
In the present specification and the appended claims, the same technical features or technical terms may have various expressions, for example, upper-level generalizations, lower-level definitions, or synonymous substitutions. Those skilled in the art can clearly understand the substantially identical technical meaning pointed to by the different expressions based on their technical knowledge and in combination with the whole content of the specification and the accompanying drawings; the differences between the different expressions are only reflected in diversity at the textual level, do not constitute a substantial modification or limitation of the technical scheme, and do not affect the certainty of the scope of protection of the claims or the full disclosure of the technical content of the specification.
Example 1
Referring to fig. 1 to 4, fig. 1 is a flowchart of an AI processing method for a NAS device according to an embodiment of the present invention. The method is used in a NAS control unit in the NAS device, and the NAS control unit is configured to implement docking with an AI model by communicatively connecting with an AI processing module; the NAS device is an original NAS device with no built-in AI module and no AI capability. The method includes: S1, the NAS control unit invokes a user operation layer of a first form to generate a first AI requirement and sends the first AI requirement to the AI model; S2, the AI model receives the first AI requirement, converts the first AI requirement into an AI instruction, and feeds back the AI instruction to the user operation layer of the first form; S3, the user operation layer of the first form is converted into a user operation layer of a second form based on the AI instruction; and S4, the NAS control unit invokes the user operation layer of the second form as an AI processing result. In the above scheme, the user operation layer of the first form is converted into the user operation layer of the second form based on the AI instruction, and can automatically adjust the file position or automatically adjust the picture information; the user operation layer can issue a series of instructions, such as AI content generation, AI dialogue, AI knowledge base, AI document management, AI video entertainment, and the like. Further, the AI processing module has AI capability, which is defined as the capability of automatically operating and generating content, and further receiving a certain degree of intelligent reply, by means of module installation, local software deployment, or external requests through a network interface.
In an embodiment, a first module group is arranged in the NAS control unit and comprises a NAS manufacturer intelligent module A, a third-party intelligent module B, and a self-developed intelligent module C; a second module group is also arranged in the NAS control unit and comprises a container virtual intelligent module D, a manufacturer-integrated intelligent module E, and a non-manufacturer intelligent module F. In S1, invoking the user operation layer of the first form to generate the first AI requirement and sending the first AI requirement to the AI model specifically comprises: the NAS control unit invokes the user operation layer of the first form to generate the first AI requirement corresponding to the user operation layer, packages the first AI requirement into a standardized data packet through a preset interface and protocol, and sends the requirement parameters, the application scene identifier, and the user permission information in the data packet to the AI model.
In an embodiment, sending the requirement parameters, the application scene identifier, and the user permission information to the AI model specifically comprises: the NAS control unit generates a module starting response according to the requirement parameters, the application scene identifier, and the user permission information, and starts the first module group and/or the second module group.
In an embodiment, the method further comprises: the NAS control unit generates a module starting response and starts the first module group; if the module starting response relates to the NAS manufacturer intelligent module A, the NAS manufacturer intelligent module A directly invokes the user operation layer through an API integrated in the NAS system and sends the requirement to the AI model; if the module starting response relates to the third-party intelligent module B, the third-party intelligent module B encrypts the first AI requirement through a third-party interface compatible with the NAS system and sends the first AI requirement to the AI model; and if the module starting response relates to the self-developed intelligent module C, the self-developed intelligent module C sends the first AI requirement to the AI model in a specific format through a custom script and interface.
In an embodiment, the method further comprises: the NAS control unit generates a module starting response and starts the second module group; if the module starting response relates to the container virtual intelligent module D, the container virtual intelligent module D runs in a container environment and sends the first AI requirement to the AI model through a network interface inside the container; if the module starting response relates to the manufacturer-integrated intelligent module E, the manufacturer-integrated intelligent module E sends the first AI requirement to the AI model by script invocation through a tool script developed by the manufacturer; and if the module starting response relates to the non-manufacturer intelligent module F, the non-manufacturer intelligent module F sends the first AI requirement to the AI model through a tool script developed by a third party over a preset protocol.
In an embodiment, S2, in which the AI model receives the first AI requirement, converts the first AI requirement into an AI instruction, and feeds the AI instruction back to the user operation layer of the first form, specifically comprises: the AI model parses the first AI requirement through a preset parsing module and extracts the requirement parameters, the application scene identifier, and the user permission information in the first AI requirement; according to the extracted requirement parameters and in combination with a preset AI algorithm library, it matches a corresponding AI algorithm and generates a corresponding AI instruction; and it feeds the generated AI instruction back to the user operation layer of the first form in a standardized instruction format through a preset feedback channel, where the feedback channel comprises a network communication interface and a local communication interface.
In an embodiment, S3, in which the user operation layer of the first form is converted into the user operation layer of the second form based on the AI instruction, specifically comprises: after receiving the AI instruction fed back by the AI model, the user operation layer of the first form parses the AI instruction through a preset instruction parsing module and extracts the operation parameters and the target state in the instruction; according to the extracted operation parameters, the user operation layer of the first form invokes a preset scene conversion module and gradually adjusts its configuration and state according to a preset conversion rule; during the conversion, the user operation layer of the first form monitors the conversion state in real time, and when the conversion is completed and a preset second-form condition is met, the layer is confirmed as the user operation layer of the second form, where the second-form condition comprises preset configuration parameters and a state identifier.
In an embodiment, S4, in which the NAS control unit invokes the user operation layer of the second form as the AI processing result, specifically comprises: the NAS control unit detects, through a preset scene detection module, that the user operation layer of the second form has been successfully converted and is in a stable state; the NAS control unit invokes the user operation layer of the second form, obtains the current state information and processing result data of the user operation layer through a preset interface, encapsulates the obtained processing result data to generate the AI processing result, and sends the AI processing result to a preset user terminal or storage system through a preset output interface, where the output interface comprises a network interface and a local storage interface.
In the above scheme, the NAS control unit is provided with a first module group including a NAS manufacturer intelligent module A, a third-party intelligent module B, and a self-developed intelligent module C, and is also provided with a second module group including a container virtual intelligent module D, a manufacturer-integrated intelligent module E, and a non-manufacturer intelligent module F.
The class A application program corresponding to the NAS manufacturer intelligent module A may be a file manager, a video center, an album, a virtual machine, synchronous backup, music, an application store, and the like; for example, the NAS manufacturer provides a file manager application, program, or script with an AI artificial intelligence function.
The class B application program corresponding to the third-party intelligent module B may be an application program or script that is developed by a party other than the NAS vendor, can run in the NAS, and has an AI artificial intelligence function, such as Thunder (Xunlei), Zhihu, WeChat, Douyin, and the like.
The class C application program corresponding to the self-developed intelligent module C may be an application program or script with an AI artificial intelligence function that the NAS holder (including a NAS purchaser) installs in the NAS or autonomously develops for use or invocation.
The class D application program corresponding to the container virtual intelligent module D may be an application program or script with an AI artificial intelligence function that is developed or used in a container or virtual machine of the NAS, such as Docker, a virtual machine, or VMware; for example, container items such as Alist, Emby, Firefox, Chrome, and the like with an AI artificial intelligence function added to them.
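A hedged sketch of how a containerized class D module might forward the first AI requirement over the container's network interface is given below. The endpoint URL and payload fields are hypothetical, and only standard-library calls are used.

```python
# Hedged sketch: a containerized module D posts the first AI requirement to the
# AI model over the container's network interface. URL and fields are hypothetical.
import json
import urllib.request

AI_MODEL_ENDPOINT = "http://ai-model:8080/v1/requirements"   # hypothetical in-container address


def forward_requirement(params: dict, scene_id: str, permissions: dict) -> dict:
    body = json.dumps({"requirement_params": params,
                       "scene_id": scene_id,
                       "user_permissions": permissions}).encode("utf-8")
    request = urllib.request.Request(AI_MODEL_ENDPOINT, data=body,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read())      # the AI instruction fed back by the model
```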
The class E application program corresponding to the manufacturer-integrated intelligent module E may be a script with an AI artificial intelligence function that is developed or purchased by the NAS manufacturer and integrated in the NAS; for example, the NAS manufacturer provides AI-capable scripts in the NAS for automatic device switching, password resetting, alarm clocks, fault archiving, fault handling, problem troubleshooting, file archiving/classification, and the like.
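The sketch below illustrates one such class E script: AI-assisted file archiving and classification on the NAS. The keyword rules stand in for whatever model the manufacturer actually integrates; the paths and categories are assumptions.

```python
# Illustrative class E script: AI-assisted file archiving/classification on the NAS.
from pathlib import Path
import shutil

CATEGORY_RULES = {
    "invoices": ("invoice", "receipt"),
    "reports": ("report", "summary"),
    "media": (".jpg", ".png", ".mp4"),
}


def classify(filename: str) -> str:
    """Choose an archive category from simple keyword rules (model stand-in)."""
    name = filename.lower()
    for category, keywords in CATEGORY_RULES.items():
        if any(k in name for k in keywords):
            return category
    return "unsorted"


def archive_folder(source: Path, archive_root: Path) -> None:
    """Move every file in `source` into an archive sub-folder chosen by classify()."""
    for file in source.iterdir():
        if file.is_file():
            target_dir = archive_root / classify(file.name)
            target_dir.mkdir(parents=True, exist_ok=True)
            shutil.move(str(file), target_dir / file.name)
```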
The class F application program corresponding to the non-manufacturer intelligent module F may be a script with an AI artificial intelligence function that is developed or purchased by a party other than the NAS manufacturer and integrated in the NAS, such as AI-capable scripts for local or remote data synchronization, information archiving/searching, generative question answering, and the like in the NAS.
In one embodiment, the present invention provides an AI processing method for a NAS device and a NAS system that, using the NAS device as a data storage core, incorporates an AI processing module external or internal to the NAS. The AI processing module has AI capability, which is defined as the capability of automatically operating and generating content, and further receiving a certain degree of intelligent reply, by means of module installation, local software deployment, or external requests through a network interface. Through a modularized design, the framework comprises a NAS storage management module, a data preprocessing module, and an AI processing module, and seamless cooperation is realized among the modules through standardized interfaces. The system supports efficient model training and inference, has the characteristics of expansibility, low latency, and intelligence, and is suitable for various application scenes.
The functions of each module of the invention include: a NAS storage management module, which is responsible for storing the original data and intermediate results and provides multi-user data access permission management, version control, and backup functions; a data preprocessing module, which is connected with the AI processing module through a protocol and is used for preprocessing the stored data, such as format conversion, denoising, and data enhancement; and an AI processing module, which is deployed on a local server or in the cloud and is responsible for training and inference of the model. The system communicates with the NAS device in a wireless or wired manner through a high-speed network to ensure efficient data transmission, and each module function supports distributed computing and edge computing architectures to adapt to task demands of different scales.
In summary, by combining the efficient storage capability of the NAS and the intelligent computing capability of the AI module, the scheme of the invention provides a flexible, extensible, and efficient intelligent service scheme and offers users personalized and diversified AI-side services. The NAS data-side services (such as automatically adjusting file positions or automatically adjusting picture information) have AI dialogue capability, can conveniently answer users' questions, and can help users manage the NAS better, so the scheme has wide market prospects and application value.
Example 2
Referring to fig. 5, fig. 5 is a block diagram of an electronic device according to the present invention. The electronic device may be a terminal or a server, where the terminal may be an electronic device having a communication function, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, or a wearable device. The electronic device comprises a processor 111, a communication interface 112, a memory 113, and a communication bus 114, where the processor 111, the communication interface 112, and the memory 113 communicate with each other through the communication bus 114.
The memory 113 is configured to store a computer program.
In one embodiment of the present invention, the processor 111 is configured to implement the method provided in any of the foregoing method embodiments when executing the program stored on the memory 113.
It should be appreciated that in embodiments of the present application, the processor 111 may be a central processing unit (CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
Those skilled in the art will appreciate that all or part of the flow in the methods of the above embodiments may be accomplished by a computer program instructing the relevant hardware. The computer program may be stored in a storage medium, which is a computer-readable storage medium. The computer program is executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, a unit or component may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be combined, divided and deleted according to actual needs. In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The integrated unit may be stored in a storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a terminal, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and for those portions of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.