
CN114912778B - Training program testing method, device, electronic equipment and storage medium - Google Patents

Training program testing method, device, electronic equipment and storage medium

Info

Publication number
CN114912778B
CN114912778B
Authority
CN
China
Prior art keywords
information
pet
interaction
model
virtual pet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210463793.5A
Other languages
Chinese (zh)
Other versions
CN114912778A
Inventor
贾彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Ruipeng Pet Healthcare Group Co Ltd
Original Assignee
New Ruipeng Pet Healthcare Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Ruipeng Pet Healthcare Group Co Ltd filed Critical New Ruipeng Pet Healthcare Group Co Ltd
Priority to CN202210463793.5A priority Critical patent/CN114912778B/en
Publication of CN114912778A publication Critical patent/CN114912778A/en
Application granted granted Critical
Publication of CN114912778B publication Critical patent/CN114912778B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53Querying
    • G06F16/535Filtering based on additional data, e.g. user or group profiles
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract


The present application relates to the field of artificial intelligence technology, and specifically discloses a cultivation scheme inspection method, apparatus, electronic device and storage medium, wherein the method comprises: obtaining personal information of a breeder, pet information of a pet raised by the breeder, and environment information of the raising space in which the pet is raised; constructing a rearing model according to the personal information and the environment information, wherein the rearing model is used to simulate the living environment of the raising space; constructing an initial virtual pet according to the pet information so as to simulate the growth of the pet; inputting the cultivation scheme to be checked and the initial virtual pet into the rearing model, and performing cultivation simulation on the initial virtual pet to obtain a target virtual pet; obtaining the difference between the target virtual pet and the initial virtual pet to obtain difference information; and checking the cultivation scheme to be checked according to the difference information to determine the credibility of the cultivation scheme to be checked.

Description

Cultivation scheme checking method, device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a culture scheme checking method, a device, electronic equipment and a storage medium.
Background
As living standards improve, people increasingly care about the appearance of the pets they raise and hope to raise pets whose appearance matches their own preferences. Besides coat colors determined congenitally by genes, a pet's appearance is also influenced by acquired rearing factors such as diet, daily routine, and daily activities.
However, ordinary breeders lack professional feeding knowledge and can only raise their pets by searching for feeding cases similar to their own and imitating the cultivation schemes in those cases. This approach does not allow the cultivation scheme to be verified in advance, nor does it guarantee that the scheme in a case is actually responsible for that case's outcome. That is, the credibility of the cultivation scheme in a case cannot be confirmed beforehand. Since every pet's actual condition differs, a scheme determined in this way may have different effects on different pets, so the final rearing result may fall short of the breeder's original expectations.
Disclosure of Invention
In order to solve the above problems in the prior art, embodiments of the present application provide a cultivation scheme checking method, apparatus, electronic device, and storage medium, which can check the reliability of a cultivation scheme before formal raising, and ensure that a raising person selects a cultivation scheme that meets his own expectations.
In a first aspect, embodiments of the present application provide a culture protocol verification method comprising:
acquiring personal information of a raising person, pet information of a pet raised by the raising person and environment information of a raising space for raising the pet;
Constructing a rearing model according to the personal information and the environment information, wherein the rearing model is used for simulating the living environment of the rearing space;
Constructing an initial virtual pet according to the pet information so as to simulate the growth of the pet;
inputting a culture scheme to be tested and an initial virtual pet into a rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet;
obtaining the difference between the target virtual pet and the initial virtual pet to obtain difference information;
and carrying out inspection treatment on the cultivation scheme to be inspected according to the difference information so as to determine the credibility of the cultivation scheme to be inspected.
In a second aspect, embodiments of the present application provide a culture protocol verification apparatus comprising:
an acquisition module, used for acquiring personal information of a breeder, pet information of a pet raised by the breeder, and environment information of a raising space for raising the pet;
a construction module, used for constructing a rearing model according to the personal information and the environment information, wherein the rearing model is used for simulating the living environment of the raising space, and for constructing an initial virtual pet according to the pet information so as to simulate the growth of the pet;
The simulation module is used for inputting a culture scheme to be tested and an initial virtual pet into the rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet;
The verification module is used for obtaining the difference between the target virtual pet and the initial virtual pet, obtaining difference information, and carrying out verification processing on the cultivation scheme to be verified according to the difference information so as to determine the credibility of the cultivation scheme to be verified.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor coupled to a memory, the memory for storing a computer program, the processor for executing the computer program stored in the memory to cause the electronic device to perform a method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, the computer program causing a computer to perform the method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method as in the first aspect.
The implementation of the embodiment of the application has the following beneficial effects:
In the embodiment of the application, a rearing model that simulates the living environment, environmental climate changes, and interaction events of the raising space is constructed by acquiring the personal information of the breeder and the environment information of the raising space in which the pet is raised. Meanwhile, the pet information of the pet raised by the breeder is acquired, and a virtual pet identical to that pet is constructed so as to simulate its growth. Then, the cultivation scheme to be checked and the initial virtual pet are input into the rearing model, and, combining the cultivation scheme with the breeder's rearing habits and the rearing environment, a cultivation simulation is performed on the initial virtual pet to obtain the target virtual pet. Finally, the difference between the target virtual pet and the initial virtual pet is obtained as difference information, and the cultivation scheme to be checked is then checked according to the difference information to determine its credibility. In this way, the credibility of the cultivation scheme is checked before formal raising, and the raising effect can be predicted in advance, so that when selecting a raising scheme the breeder can intuitively observe the expected raising effect on his or her own pet and select a cultivation scheme that meets his or her expectations.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a hardware structure of a culture scheme inspection device according to an embodiment of the present application;
FIG. 2 is a system frame diagram of a method for inspecting a cultivation plan of a pet in a scenario for inspecting a cultivation plan of the pet according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a method for testing a culture scheme according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a method for adjusting a point cloud model to obtain a raising model according to climate change information, interaction time and interaction rules according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interaction timeline according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a climate timeline provided by an embodiment of the present application;
FIG. 7 is a schematic illustration of another climate timeline provided by an embodiment of the present application;
FIG. 8 is a functional block diagram of a culture scheme inspection device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the present application. All other embodiments, based on the embodiments of the application, which are apparent to those of ordinary skill in the art without inventive faculty, are intended to be within the scope of the application.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims and drawings are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, result, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly understand that the embodiments described herein may be combined with other embodiments.
First, referring to fig. 1, fig. 1 is a schematic diagram of a hardware structure of a culture scheme inspection device according to an embodiment of the present application. The culture scheme inspection apparatus 100 comprises at least one processor 101, a communication line 102, a memory 103, and at least one communication interface 104.
In this embodiment, the processor 101 may be a general-purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program according to the present application.
Communication line 102 may include a pathway to transfer information between the above-described components.
The communication interface 104, which may be any transceiver-like device (e.g., antenna, etc.), is used to communicate with other devices or communication networks, such as ethernet, RAN, wireless local area network (wireless local area networks, WLAN), etc.
The memory 103 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In this embodiment, the memory 103 may be independently provided and connected to the processor 101 via the communication line 102. Memory 103 may also be integrated with processor 101. The memory 103 provided by embodiments of the present application may generally have non-volatility. The memory 103 is used for storing computer-executable instructions for executing the scheme of the present application, and is controlled by the processor 101 to execute the instructions. The processor 101 is configured to execute computer-executable instructions stored in the memory 103 to implement the methods provided in the embodiments of the present application described below.
In alternative embodiments, the computer-executable instructions may also be referred to as application code, which is not specifically limited in this application.
In alternative embodiments, processor 101 may include one or more CPUs, such as CPU0 and CPU1 in fig. 1.
In alternative embodiments, the culture scheme inspection apparatus 100 may include multiple processors, such as the processor 101 and the processor 107 in fig. 1. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In an alternative embodiment, the culture scheme inspection apparatus 100 may be a server, for example a stand-alone server, or a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), and basic cloud computing services such as big data and artificial intelligence platforms. The culture scheme inspection apparatus 100 may further comprise an output device 105 and an input device 106. The output device 105 communicates with the processor 101 and may display information in a variety of ways. For example, the output device 105 may be a liquid crystal display (LCD), a light-emitting diode (LED) display device, a cathode ray tube (CRT) display device, a projector, or the like. The input device 106 communicates with the processor 101 and may receive user input in a variety of ways. For example, the input device 106 may be a mouse, a keyboard, a touch screen device, a sensing device, or the like.
The culture scheme inspection apparatus 100 may be a general-purpose device or a special-purpose device. Embodiments of the present application do not limit the type of the culture scheme inspection apparatus 100.
Secondly, it should be noted that the method for inspecting a cultivation scheme provided by the application can be applied to the situations of inspecting various cultivation schemes such as a cultivation scheme of pets, a cultivation scheme of plants, a cultivation scheme of poultry and the like. In this embodiment, a scenario of inspecting a cultivation scheme of a pet will be taken as an example, and a cultivation scheme inspection method provided by the present application will be described, and a method of inspecting a cultivation scheme in other scenarios is similar to a cultivation scheme inspection method in a scenario of inspecting a cultivation scheme of a pet, and will not be described herein.
Finally, fig. 2 is a system framework diagram of a method for inspecting a cultivation scheme of a pet in a scenario of inspecting the cultivation scheme of a pet according to an embodiment of the present application. Specifically, the system may include a data acquisition device 201, a simulated cultivation device 202, and a model database 203. The data acquisition device 201 may be a smart phone (such as an Android phone, an iOS phone, a Windows Phone, etc.), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), etc., and is configured to receive the personal information input by the breeder, the pet information of the pet raised by the breeder, the environment information of the breeder's raising space, and the cultivation scheme to be checked, and to display the final simulation result and checking result. The simulated cultivation device 202 may be a server configured to receive, from the data acquisition device 201, the personal information of the breeder, the pet information of the pet, the environment information of the raising space, and the cultivation scheme to be checked, to acquire model data from the model database 203 for model construction and simulated cultivation, and to send the simulation result and the checking result to the data acquisition device 201. At the same time, the simulated cultivation device 202 also updates and maintains the model database 203.
In this embodiment, the breeder can log in to the simulation system through the data acquisition device 201 to fill in and upload related materials, such as room pictures at different viewing angles, a house plane drawing, and positioning information as the environment information of the raising space; whole-body images of the pet at different viewing angles and medical record information as the pet information; the breeder's own historical rearing information as the personal information; and the text or video of the cultivation scheme to be checked. After the data acquisition device 201 acquires the relevant information, it sends the information to the simulated cultivation device 202 so that the simulated cultivation device 202 can analyze it, and model data in the model database 203 is called according to the analysis result to construct a rearing model and a simulated pet to be cultivated. The effect of the cultivation scheme to be checked is then inspected according to the simulation result, and the checking result and the simulation result are returned to the data acquisition device 201. Specifically, the simulated cultivation device 202 constructs a point cloud model of the room from the room pictures at several different viewing angles and the house plane drawing. Meanwhile, it determines the climate characteristics of the room's location according to the positioning information and generates a climate time axis simulating the climate change at that location during the cultivation period. According to the breeder's historical rearing information, the breeder's interaction rules and interaction times are determined, and an interaction time axis simulating the interaction events between the breeder and the pet during the cultivation period is then generated. The climate time axis and the interaction time axis are then superimposed onto the point cloud model, so that the climate changes and the occurrence of events in the point cloud model are controlled through the two time axes, forming the rearing model. Meanwhile, the body type of the pet is determined based on the whole-body images of the pet at different viewing angles; combining the breed and age of the pet in the pet information, a corresponding standard body model is matched in the model database 203 and adjusted to generate an initial body model. The initial body model is then adjusted a second time according to the medical record information so that it better fits the real state of the pet; the coat color characteristics of the pet are determined from the whole-body images at different viewing angles, and corresponding maps are matched in the model database 203 and overlaid on the adjusted body model, thereby generating the virtual pet.
After the rearing model and the virtual pet are constructed, the simulated cultivation device 202 inputs the virtual pet and the cultivation scheme to be checked into the rearing model, and adjusts the interaction events in the interaction time axis according to the cultivation scheme to be checked, thereby performing the simulated cultivation of the virtual pet. After obtaining the simulation result, the simulated cultivation device 202 may compare the simulation result with the input virtual pet, record the difference between them, and then compare the difference with the cultivation effect claimed in the cultivation scheme to be checked, so as to determine the credibility of the scheme. The obtained simulation result and credibility are then sent to the data acquisition device 201, so that after reviewing them the breeder can decide whether to cultivate his or her own pet with the cultivation scheme to be checked.
In this embodiment, the credibility of the cultivation scheme is checked before formal raising, and the raising effect can be predicted in advance, so that when selecting a raising scheme the breeder can intuitively observe the expected raising effect on his or her own pet and select a cultivation scheme that meets his or her expectations.
The cultivation scheme inspection method disclosed in the present application will be described below by taking a scenario of inspecting a cultivation scheme of a pet as an example:
referring to fig. 3, fig. 3 is a schematic flow chart of a method for testing a culture scheme according to an embodiment of the application. The culture scheme inspection method comprises the following steps:
301, personal information of a raising person, pet information of a pet raised by the raising person and environment information of a raising space for raising the pet are acquired.
In this embodiment, the personal information of the breeder may include work-and-rest information, feeding habit information, and historical interaction information with the pet. The feeding habit information is used to record the time and frequency with which the breeder feeds the pet, for example, whether the pet's daily ration is put out all at once in the morning or fed in several portions, and the historical interaction information is used to record the breeder's daily interaction time and interaction content with the pet. The pet information may include whole-body images, medical record information, and historical behavior information. The whole-body images may be several whole-body or partial images taken at different viewing angles, which can be photographed and uploaded by the breeder; the medical record information can be obtained through the pet hospital system where the pet has been treated; and the historical behavior information may be historical monitoring data from a surveillance camera installed in the breeder's home, which the breeder can export and upload, or a related textual description filled in manually by the breeder when entering the information. The environment information may include house information, house position information, and indoor furnishing information. The house information may be several house pictures at different viewing angles and a plane drawing of the house, which can be photographed and uploaded by the breeder; the house position information may be GPS positioning information; and the indoor furnishing information may be obtained by performing feature recognition on the house pictures at different viewing angles.
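For illustration only, the three kinds of input information could be organized as simple records such as the following minimal sketch; every field name here (routine, feeding_habits, gps_location, etc.) is an assumption for the example, not something prescribed by the embodiment.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BreederInfo:
    routine: str                 # daily work-and-rest schedule of the breeder
    feeding_habits: str          # time and frequency of feeding the pet
    interaction_history: str     # daily interaction time and content with the pet

@dataclass
class PetInfo:
    body_images: List[str]       # whole-body images at different viewing angles
    medical_records: str         # age, breed and health information from the pet hospital
    behavior_history: str        # monitoring footage or textual description of behaviour

@dataclass
class EnvironmentInfo:
    house_images: List[str]      # room pictures at different viewing angles
    floor_plan: str              # plane drawing of the house with size information
    gps_location: Tuple[float, float]   # house position (latitude, longitude)
    furnishings: str             # indoor furnishing information derived from the pictures
```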
Specifically, in this embodiment, since the data acquisition device 201 may be a personal device such as a smart phone, a tablet computer, a palmtop computer, or a notebook computer, the breeder may install an application program associated with the cultivation scheme inspection on the data acquisition device 201 and then submit the personal information, pet information, and environment information after registering and logging in; or may submit them by accessing a WeChat applet associated with the cultivation scheme inspection and logging in through authenticated WeChat login; or may submit them by accessing a website for the cultivation scheme inspection and then registering and logging in.
Specifically, taking an information filling interface of the application program as an example, the work-and-rest information, feeding habit information, historical interaction information with the pet, and historical behavior information in text form can be entered in text boxes; the whole-body images, house information, indoor furnishing information, and historical behavior information in video form can be uploaded by clicking a file selection button and selecting the corresponding files; and the medical record information and house position information can be selected by clicking a secondary menu button behind the text box. After the breeder has filled in the information, it can be submitted to the simulated cultivation device 202 by clicking an OK button. After receiving the information, the simulated cultivation device 202 may begin to analyze it and construct the corresponding models for subsequent simulated cultivation.
302, Constructing a rearing model according to personal information and environment information.
In this embodiment, the feeding model is used to simulate the living environment of the feeding space. In particular, the model may simulate the climate of the space and the interactive events occurring therein, in addition to simulating the space of the living environment of the pet.
In this embodiment, first, a point cloud model of the raising space is constructed from the house information and the indoor furnishing information. Specifically, by performing feature extraction on each of the several house pictures, distinctive positions such as windows, doors, corners, and wall decorations are identified in each house picture as its feature points. The feature points at the same positions in the different house pictures are combined to construct a rough point cloud space as an initial point cloud model. The initial point cloud model is then adjusted according to the size information of the house, windows, doors, etc. recorded in the plane drawing of the house, so that the sizes of the doors, windows, walls, etc. in the initial point cloud model match the sizes recorded in the drawing. Next, the several house pictures are analyzed to determine the furniture they contain and its placement positions as the indoor furnishing information. The size information of the furniture in each house picture is determined from the size information of the house, windows, doors, etc. recorded in the plane drawing, corresponding furniture point cloud models are generated, and the furniture point cloud models are placed at the corresponding positions in the adjusted initial point cloud model according to the indoor furnishing information, so as to obtain the point cloud model of the raising space.
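A minimal sketch of the last two steps of this construction, rescaling the rough point cloud to the sizes recorded in the plane drawing and merging furniture point clouds at their furnishing positions; the feature extraction and matching that produce the rough cloud are abstracted away, and all numbers are placeholders.

```python
import numpy as np

def scale_to_floor_plan(points: np.ndarray, model_span: float, plan_span_m: float) -> np.ndarray:
    """Uniformly rescale the rough point cloud so that a reference span
    (e.g. one wall) matches the length recorded in the plane drawing."""
    return points * (plan_span_m / model_span)

def place_furniture(room: np.ndarray, furniture: np.ndarray, position_m: np.ndarray) -> np.ndarray:
    """Translate a furniture point cloud to its furnishing position and merge
    it into the room point cloud."""
    return np.vstack([room, furniture + position_m])

# Toy example: a unit-sized rough room rescaled to a 4 m wall, plus a small
# furniture block placed 1 m from the corner.
rough_room = np.random.rand(1000, 3)
room = scale_to_floor_plan(rough_room, model_span=1.0, plan_span_m=4.0)
sofa = np.random.rand(200, 3) * 0.8
raising_space = place_furniture(room, sofa, np.array([1.0, 0.5, 0.0]))
```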
Climate change information for the feeding space may then be determined from the house position information. Specifically, the location information of the house may be GPS location information, through which the region where the house is located may be quickly determined, and then the historical weather information and weather prediction information of the region are acquired as the weather change information.
Then, the interaction time and interaction rules between the breeder and the pet can be determined according to the work-and-rest information, the feeding habit information, and the historical interaction information. In this embodiment, as described above, the work-and-rest information may be the times at which the breeder gets up, goes to sleep, goes out, and so on each day; the feeding habit information records the time and frequency with which the breeder feeds the pet, for example, whether the pet's daily ration is put out all at once in the morning or fed in several portions; and the historical interaction information records the breeder's daily interaction time and interaction content with the pet. Thus, by analyzing the work-and-rest information, feeding habit information, and historical interaction information, the times at which the breeder is present in the raising space each day, the interaction events with the pet, the feeding habits, and so on can be determined as the interaction time and interaction rules between the breeder and the pet. For example, on a working day, a breeder prepares the day's dry ration and drinking water at 7:30 am, interacts with the pet from 8 pm to 9 pm, and feeds a small amount of canned food at 7 pm; on rest days, the breeder stays at home all day, prepares different types of ration on time, and accompanies the pet to play.
And finally, adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain a rearing model. In the embodiment, the influence of the breeder and weather on the growth of the pet is added to the point cloud model by establishing a time axis and adding corresponding events on the time axis to form an event time axis. Specifically, as shown in fig. 4, the adjustment method includes:
And 401, establishing a time axis, determining at least one interaction interval in the time axis according to the interaction time, and generating at least one interaction event according to the interaction rule.
In this embodiment, the at least one interaction event corresponds one-to-one to the at least one interaction interval. Specifically, first, the time axis may be divided into sections by day, and each divided section may be determined to be a working day or a rest day based on the current time and the breeder's occupation information, with the determined type taken as the section type of that section. Then, based on the interaction time and interaction rules determined in step 302, and in combination with the section type of each section, interaction intervals are divided within each section. For example, for section 1, whose section type is a working day, according to the interaction time and interaction rules exemplified in step 302, section 1 can be divided into four interaction intervals: interaction interval 1, 7:30-19:00, the breeder is out at work and the pet is home alone, no interaction; interaction interval 2, 19:00-20:00, the breeder comes home, eats, and washes up, weak interaction; interaction interval 3, 20:00-21:00, the breeder plays with and feeds the pet, strong interaction; interaction interval 4, 21:00-7:30 (the next day), the breeder is asleep, weak interaction.
In this embodiment, after the interaction interval is determined, according to the time period corresponding to the interaction interval and the analyzed interaction rule of the breeder and the pet in the time period, the interaction event corresponding to the interaction rule may be randomly generated.
And 402, filling each interaction event of the at least one interaction event into the corresponding interaction interval to generate an interaction time axis.
In this embodiment, the interaction interval may be placed at a corresponding position in the time axis according to a time period corresponding to the interaction interval, and a specific interaction event corresponding to the interaction interval is filled into the interaction interval, so as to obtain an interaction time axis as shown in fig. 5.
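A minimal sketch of steps 401 and 402, assuming the example working-day intervals above and a simple weekday/weekend rule for the section type; the interval times, event descriptions, and function names are illustrative, not part of the claimed method.

```python
from datetime import date, timedelta

# Example interaction intervals per day type (time range, event, interaction strength);
# values follow the working-day example above and are illustrative only.
WORKDAY_INTERVALS = [
    ("07:30-19:00", "breeder out at work, pet home alone", "none"),
    ("19:00-20:00", "breeder returns home, eats, washes up", "weak"),
    ("20:00-21:00", "breeder plays with and feeds the pet", "strong"),
    ("21:00-07:30", "breeder asleep", "weak"),
]
RESTDAY_INTERVALS = [
    ("08:00-22:00", "breeder at home all day, feeds and plays with the pet", "strong"),
    ("22:00-08:00", "breeder asleep", "weak"),
]

def build_interaction_timeline(start: date, days: int, workdays=(0, 1, 2, 3, 4)):
    """Divide the time axis by days, decide each day's section type, and fill
    the day with the interaction intervals/events for that type."""
    timeline = []
    for offset in range(days):
        day = start + timedelta(days=offset)
        intervals = WORKDAY_INTERVALS if day.weekday() in workdays else RESTDAY_INTERVALS
        timeline.append({"date": day, "intervals": intervals})
    return timeline

interaction_timeline = build_interaction_timeline(date(2022, 4, 1), days=7)
```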
403, Determining the climate characteristics in the first time period according to the climate change information, the current time and the preset simulated cultivation duration.
In this embodiment, the starting time of the first time period is the current time, and the duration of the first time period is the simulated cultivation duration. Simply put, the simulated cultivation duration may be determined by the cultivation scheme to be tested, or may be set by the breeder. After the simulated cultivation duration is determined, the first time period can be determined in combination with the current time. For example, if the simulated cultivation duration is 3 months and the current time is March 31, 2022, the first time period is from April 1, 2022 to June 30, 2022.
Meanwhile, in this embodiment, the climate change information may be historical weather information and weather prediction information. Specifically, the historical weather information is the weather information of the region over a historical period, such as the past 5 years, while the weather prediction information is the weather information issued by the meteorological station for a period in the future. The weather prediction information is accurate, but the duration it covers is generally short. On this basis, when the first time period is short, that is, less than or equal to the accurately predicted duration, the weather prediction information can be used directly as the climate feature of the first time period. When the first time period is long, that is, longer than the accurately predicted duration, the first time period can be split into one segment equal to the accurately predicted duration and another segment exceeding it. For the segment equal to the accurately predicted duration, the weather prediction information is used as the climate feature; for the segment exceeding the accurately predicted duration, historical weather information for the same calendar dates can be looked up and used as the climate feature of that segment. For example, if the weather prediction information covers the coming week and the first time period is 2 weeks, the first time period can be split into week 1 and week 2. The weather prediction information is used as the climate feature for week 1. For week 2, the dates are first determined, for example March 31, 2022 to April 6, 2022, and the historical weather information for the same March 31 to April 6 period can then be looked up and used as the climate feature of week 2.
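A small sketch of how the first time period might be split between the accurately predicted segment and the historical-weather segment; following the example, the period is taken to start on the day after the current date, and the function name and parameters are assumptions for illustration.

```python
from datetime import date, timedelta

def split_first_period(current: date, simulated_days: int, forecast_days: int):
    """Split the first time period into a segment covered by accurate weather
    prediction and a remainder that falls back on historical weather for the
    same calendar dates."""
    start = current + timedelta(days=1)
    end = start + timedelta(days=simulated_days - 1)
    forecast_end = min(end, current + timedelta(days=forecast_days))
    forecast_part = (start, forecast_end)
    historical_part = None
    if forecast_end < end:
        historical_part = (forecast_end + timedelta(days=1), end)
    return forecast_part, historical_part

# Example from the text: a 3-month (91-day) simulation starting after March 31, 2022,
# with a one-week accurate forecast.
forecast_part, historical_part = split_first_period(date(2022, 3, 31), simulated_days=91, forecast_days=7)
```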
And 404, randomly generating at least one climate event according to the climate characteristics and randomly filling it into the time axis to generate a climate time axis.
In this embodiment, when the climate feature is weather prediction information, climate events can be generated in order according to the weather prediction information and filled into the time axis in the order of generation. For example, if the weather prediction information is the weather forecast for the coming week, specifically [March 31, sunny], [April 1, sunny], [April 2, overcast], [April 3, light rain], [April 4, heavy rain], [April 5, overcast], and [April 6, sunny], the generated climate time axis is as shown in fig. 6.
Similarly, in this embodiment, when the climate feature is historical weather information, the weather characteristics and change characteristics of the first time period can be determined from the historical weather information, and corresponding climate events can then be generated randomly. For example, if the historical weather information indicates that the weather in a certain period is mainly overcast and rainy and changes rapidly, with frequent switching between weather types, a weather sequence of [March 31, light rain], [April 1, overcast], [April 2, light rain], [April 3, heavy rain], [April 4, heavy rain], [April 5, overcast], and [April 6, heavy rain] can be generated according to these characteristics, and a climate time axis as shown in fig. 7 is generated at the same time.
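When only historical weather is available, the random generation described above can be sketched as a per-day weighted draw from a weather profile derived for the period; the profile weights, weather labels, and function name below are illustrative assumptions, not the embodiment's actual procedure.

```python
import random

def generate_climate_events(dates, weather_profile, seed=None):
    """Randomly generate one climate event per day, weighted by the weather
    profile derived from historical weather information for the period."""
    rng = random.Random(seed)
    kinds, weights = zip(*weather_profile.items())
    return [(d, rng.choices(kinds, weights=weights)[0]) for d in dates]

# Illustrative profile for a rainy, fast-changing period (weights are assumed).
rainy_profile = {"light rain": 0.35, "heavy rain": 0.25, "overcast": 0.3, "sunny": 0.1}
dates = ["March 31", "April 1", "April 2", "April 3", "April 4", "April 5", "April 6"]
climate_timeline = generate_climate_events(dates, rainy_profile, seed=0)
```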
And 405, superposing the climate time axis and the interaction time axis into the point cloud model to obtain a rearing model.
In the present embodiment, the interaction event corresponding to the interaction time axis may be generated in the feeding model through the interaction time axis, and the climate event corresponding to the climate time axis may be generated in the feeding model through the climate time axis.
And 303, constructing an initial virtual pet according to the pet information.
In this embodiment, the virtual pet may simulate the growth process of the pet in cooperation with the rearing model. Specifically, the pet information may include whole body images, medical history information, and historical behavior information. The whole-body image may be a plurality of whole-body images at multiple viewing angles. Thus, feature extraction is performed on the whole body image to obtain the bodily form feature and at least one appearance feature of the pet. The medical record information can be obtained through the authorization of the feeders and the pet hospitals connected with the pets for medical treatment. The medical record information records age information, variety information and health information of the pet. The historical behavior information may be a surveillance video of the pet raised by the raising person or a common behavior of the pet filled by the raising person, whereby the character characteristics of the pet may be determined by analysis of the historical behavior information.
Based on the above, in the process of constructing the initial virtual pet, the corresponding body database can be firstly obtained according to the variety information recorded by the medical record information, wherein the body database is used for storing standard body models of the pets of the corresponding varieties in different age stages and different health states. And then matching a corresponding initial body model in a body database according to the age information and the health information, and adjusting the initial body model according to the body type characteristics extracted by the real photo to obtain the body model.
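A minimal sketch of this body-model matching step, assuming a small in-memory body database keyed by (breed, age bracket, health state); the keys, measurements, and the cubic rescaling used to adapt the standard model to the extracted body-type feature are all assumptions for illustration.

```python
# Hypothetical body database; in the embodiment this would come from the model database 203.
BODY_DATABASE = {
    ("british_shorthair", "adult", "healthy"): {"shoulder_height_cm": 30.0, "weight_kg": 4.5},
    ("british_shorthair", "adult", "obese"):   {"shoulder_height_cm": 30.0, "weight_kg": 7.0},
}

def match_initial_body_model(breed: str, age_bracket: str, health_state: str,
                             measured_shoulder_height_cm: float) -> dict:
    """Match a standard body model by breed, age and health, then rescale it to
    the body-type feature extracted from the whole-body images."""
    standard = dict(BODY_DATABASE[(breed, age_bracket, health_state)])
    scale = measured_shoulder_height_cm / standard["shoulder_height_cm"]
    standard["shoulder_height_cm"] = measured_shoulder_height_cm
    standard["weight_kg"] = round(standard["weight_kg"] * scale ** 3, 2)  # illustrative rescaling
    return standard

initial_body = match_initial_body_model("british_shorthair", "adult", "healthy", 28.0)
```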
And then, matching in a map library according to the at least one appearance characteristic to obtain at least one appearance map corresponding to the at least one appearance characteristic one by one. Specifically, prior to extracting the appearance features, the pet may be subjected to a division of its body parts into head, torso (back and abdomen), limbs (left forelimb, right forelimb, left hindlimb and right hindlimb), and tails (if present). And extracting the appearance characteristics of the divided areas separately to obtain the appearance characteristics corresponding to each area, and adding the names of the positions corresponding to each appearance characteristic into the appearance characteristics as position labels.
Based on this, in this embodiment, a corresponding portion map library may be obtained according to a portion tag of each of the at least one appearance feature, where the portion tag is used to identify a portion of the pet body corresponding to each appearance feature, and the portion map library is used to store map materials of the corresponding portion. And then, matching is carried out in the corresponding position mapping library according to each appearance characteristic, and a corresponding mapping material group is obtained. And finally, carrying out mapping generation processing on the mapping material group corresponding to each appearance characteristic according to each appearance characteristic to obtain at least one appearance mapping.
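The part-tag routing described above can be sketched as a lookup into per-part map libraries; the library contents, file paths, and the coat-pattern feature used for matching are hypothetical and stand in for the actual map-generation processing.

```python
from typing import Dict, List

# Hypothetical per-part map libraries; matching is reduced to a key lookup for illustration.
MAP_LIBRARIES: Dict[str, Dict[str, str]] = {
    "head":  {"silver_tabby": "maps/head_silver.png", "cream": "maps/head_cream.png"},
    "torso": {"silver_tabby": "maps/torso_silver.png", "cream": "maps/torso_cream.png"},
}

def match_appearance_maps(appearance_features: List[Dict[str, str]]) -> Dict[str, str]:
    """For each appearance feature, use its part tag to select the part map
    library and pick the matching map material."""
    selected = {}
    for feature in appearance_features:
        library = MAP_LIBRARIES[feature["part_tag"]]
        selected[feature["part_tag"]] = library[feature["coat_pattern"]]
    return selected

maps = match_appearance_maps([
    {"part_tag": "head", "coat_pattern": "silver_tabby"},
    {"part_tag": "torso", "coat_pattern": "silver_tabby"},
])
```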
And finally, constructing a first virtual pet according to at least one appearance map and the body model, and adjusting the first virtual pet according to the health information and the character characteristics to obtain an initial virtual pet. Specifically, according to the position label of the appearance characteristic corresponding to each appearance map, the appearance map is covered to the position corresponding to the position label on the body model, and then the first virtual pet is constructed. Meanwhile, bones, muscles and viscera of the first virtual pet are adjusted according to the health information and the variety information of the pet, for example, the health information shows that a certain pet is obese, is accompanied by visceral fat, and has certain hyperosteogeny in the left forelimb. The bone portion of the left forelimb of the first virtual pet can be adjusted for hyperosteogeny and the internal organ portion and the muscle portion can be filled with a certain fat. And finally, generating a behavior strategy of the pet according to the character characteristics, and adding the behavior strategy into the first virtual pet so as to control the first virtual pet to make corresponding actions when a climate event or an interaction event occurs. Therefore, the comprehensive and high-precision simulation of the appearance, the health state and the character of the pet in reality is realized, so that the simulation can be more in line with the actual development, and the reality of the simulation result is improved.
And 304, inputting the culture scheme to be tested and the initial virtual pet into a rearing model, and carrying out culture simulation on the initial virtual pet to obtain the target virtual pet.
Specifically, in this embodiment, parameter adjustment may be performed on the interaction events in the interaction time axis in the feeding model according to the cultivation scheme to be checked. For example, the feeding food in the interactive event is replaced with the feeding food in the cultivation scheme to be checked, and the play event in the interactive event is replaced with the cultivation event in the cultivation scheme to be checked.
On this basis, after the virtual pet is input, the adjusted interaction time axis and climate time axis can be started, so that the rearing model generates the corresponding interaction events and climate events as time passes, interacts with the virtual pet in the rearing model, and simulates the real rearing process. Specifically, settlement may be performed every 24 hours, and the parameters of the virtual pet may be adjusted based on the interaction events and climate events that occurred in the rearing model during those 24 hours. For example, 3 feedings were made within 24 hours, at 8 am, 12 noon, and 6 pm; the morning feed was a dry ration, the noon feed was a nutritional meal home-made from chicken breast, tofu, carrot, lettuce, cabbage, and the like, and the evening feed was a wet ration. Meanwhile, because the weather was clear, the pet was taken out for a walk between 2 pm and 4 pm. On this basis, after the 24-hour period has elapsed, the impact of each meal and of the afternoon walk on the pet's body is calculated, and the parameters of the virtual pet, such as muscle, coat gloss, and fat, are then adjusted based on that impact, thereby visualizing the effect of the interaction events and climate events occurring within the 24-hour period on the pet. In addition, the growth law of the pet is taken into account and the pet's growth is calculated, so that the parameters of the virtual pet are adjusted synchronously, thereby realizing the cultivation simulation of the real pet.
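A minimal sketch of the 24-hour settlement step, assuming each interaction or climate event contributes a fixed per-day increment to a few pet parameters; the event names and effect values are invented for illustration and are not taken from the embodiment.

```python
def settle_day(pet_params: dict, day_events: list) -> dict:
    """Apply the accumulated effect of one 24-hour period of interaction and
    climate events to the virtual pet's parameters."""
    EFFECTS = {  # assumed per-event effects, for illustration only
        "dry_food":      {"fat": +0.02, "coat_gloss": +0.00, "muscle": +0.01},
        "homemade_meal": {"fat": +0.01, "coat_gloss": +0.02, "muscle": +0.02},
        "wet_food":      {"fat": +0.02, "coat_gloss": +0.01, "muscle": +0.01},
        "walk_sunny":    {"fat": -0.03, "coat_gloss": +0.01, "muscle": +0.03},
    }
    updated = dict(pet_params)
    for event in day_events:
        for key, delta in EFFECTS.get(event, {}).items():
            updated[key] = round(updated[key] + delta, 4)
    return updated

pet = {"fat": 0.30, "coat_gloss": 0.50, "muscle": 0.40}
example_day = ["dry_food", "homemade_meal", "wet_food", "walk_sunny"]  # the example day above
pet = settle_day(pet, example_day)
```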
And 305, obtaining the difference between the target virtual pet and the initial virtual pet to obtain difference information.
In the embodiment, the difference information may include a body type difference, a hair glossiness difference, a health degree difference, and the like, and the corresponding difference information may be obtained by extracting and comparing features related to the initial virtual pet and the target virtual pet.
And 306, performing inspection treatment on the cultivation scheme to be inspected according to the difference information so as to determine the credibility of the cultivation scheme to be inspected.
In this embodiment, keyword extraction may be performed on the culture scheme to be tested, so as to obtain at least one piece of effect information corresponding to the culture scheme to be tested. And then splitting the difference information according to the at least one piece of effect information to obtain at least one piece of sub-difference information corresponding to the at least one piece of effect information one by one. And then determining the standard reaching degree of the effect information corresponding to each piece of sub-difference information according to each piece of sub-difference information in at least one piece of sub-difference information, and obtaining at least one standard reaching rate. And splitting at least one piece of effect information into at least one effect reaching the standard and at least one effect not reaching the standard according to a preset threshold value and at least one standard reaching rate. And finally, generating the credibility of the culture scheme to be checked according to at least one effect reaching the standard and at least one effect not reaching the standard.
Specifically, as described in step 305, the difference information may include a difference in body type, a difference in hair glossiness, a difference in health degree, and the like, but the culture scheme to be examined may have an influence only on a part of the differences. Therefore, the corresponding sub-difference information can be extracted from the difference information by the culture effect described in the culture scheme to be checked to make a correspondence. For example, if the effect of enhancing the constitution of the pet and improving the hair luster of the pet is mentioned in the culture scheme to be tested, the health difference in the difference information can be extracted as sub-difference information corresponding to the effect of enhancing the constitution of the pet, and the hair luster difference in the difference information can be extracted as sub-difference information corresponding to the effect of improving the hair luster of the pet.
Further, in this embodiment, according to the description of the culture effect in the culture scheme to be tested, the features corresponding to the recorded effect can be extracted and compared, so as to obtain the corresponding sub-difference information, thereby improving the comparison efficiency and reducing the complexity of the system.
Then, the difference degree of the difference information corresponding to each effect and the improvement degree recorded by each effect are compared, and the standard reaching rate of each effect is determined, specifically, the standard reaching rate can be represented by a formula ①:
p = a / b ①
wherein p represents the achievement rate of each effect, a represents the degree of difference of the difference information corresponding to each effect, and b represents the degree of improvement recorded for each effect.
After the achievement rate of each effect is determined, the effects can be divided into effects that reach the standard and effects that do not reach the standard according to a preset threshold. For example, if it is set in advance that an effect is considered achieved, rather than falsely advertised, when the actual effect reaches 60% of the recorded effect, the corresponding threshold is 0.6. On this basis, effects with an achievement rate greater than or equal to 0.6 are classified as effects that reach the standard, and effects with an achievement rate less than 0.6 are classified as effects that do not reach the standard.
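A short sketch of the achievement-rate computation of formula ① and the threshold split described above, with p = a / b and a threshold of 0.6; the effect names and numbers are illustrative.

```python
def split_effects(effects: dict, threshold: float = 0.6):
    """Compute the achievement rate p = a / b for each claimed effect and split
    the effects into those that reach the threshold and those that do not."""
    reached, not_reached = {}, {}
    for name, (achieved_a, claimed_b) in effects.items():
        p = achieved_a / claimed_b
        (reached if p >= threshold else not_reached)[name] = round(p, 2)
    return reached, not_reached

# (achieved difference a, claimed improvement b) per effect; values are illustrative.
effects = {
    "improve coat gloss": (0.08, 0.10),
    "strengthen physique": (0.05, 0.15),
}
reached, not_reached = split_effects(effects)  # {'improve coat gloss': 0.8}, {'strengthen physique': 0.33}
```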
In this embodiment, the ratio of the number of up-to-standard effects to the total effect number may be directly used as the reliability of the cultivation scheme to be tested and fed back to the breeder.
In an alternative embodiment, the effects of the cultivation scheme to be tested may also be presented to the breeder in the form of a list, with the importance of each effect determined by the breeder; for example, three options of high, medium, and low may be provided after each effect, and the importance of each effect is then determined according to the breeder's choice. Next, the weight of each effect is determined based on its importance, for example, a weight of 3 for high importance, 2 for medium importance, and 1 for low importance. The ratio of the sum of the weights of the effects that reach the standard to the sum of the weights of all effects is then taken as the credibility of the cultivation scheme to be tested. Specifically, the credibility of the cultivation scheme to be tested can be expressed by formula ②:
c = (d_1 + d_2 + … + d_n) / (e_1 + e_2 + … + e_m) ②
wherein c represents the credibility of the cultivation scheme to be tested, d_i represents the weight corresponding to the i-th standard-reaching effect among the at least one standard-reaching effect of the cultivation scheme to be tested, n represents the number of standard-reaching effects, e_j represents the weight corresponding to the j-th effect among all effects of the cultivation scheme to be tested, m represents the number of all effects, and i and j are integers greater than or equal to 1.
Thus, the breeder's points of attention are incorporated into the credibility, so that the resulting credibility more clearly reflects how well the cultivation scheme to be tested meets the breeder's concerns.
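A sketch of the weighted credibility of formula ②, assuming the high/medium/low weights of 3/2/1 mentioned above; the effect names are illustrative.

```python
def weighted_credibility(effect_importance: dict, reached: set) -> float:
    """Credibility c = (sum of weights of effects that reach the standard) /
    (sum of weights of all effects), with weights chosen by importance."""
    WEIGHT = {"high": 3, "medium": 2, "low": 1}
    total = sum(WEIGHT[level] for level in effect_importance.values())
    achieved = sum(WEIGHT[level] for name, level in effect_importance.items() if name in reached)
    return round(achieved / total, 2)

effect_importance = {"improve coat gloss": "high", "strengthen physique": "low"}
credibility = weighted_credibility(effect_importance, reached={"improve coat gloss"})  # 3 / 4 = 0.75
```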
In summary, in the cultivation scheme inspection method provided by the invention, a rearing model that simulates the living environment, environmental climate changes, and interaction events of the raising space is constructed by acquiring the personal information of the breeder and the environment information of the raising space in which the pet is raised. Meanwhile, the pet information of the pet raised by the breeder is acquired, and a virtual pet identical to that pet is constructed so as to simulate its growth. Then, the cultivation scheme to be checked and the initial virtual pet are input into the rearing model, and, combining the cultivation scheme with the breeder's rearing habits and the rearing environment, a cultivation simulation is performed on the initial virtual pet to obtain the target virtual pet. Finally, the difference between the target virtual pet and the initial virtual pet is obtained as difference information, and the cultivation scheme to be checked is then checked according to the difference information to determine its credibility. In this way, the credibility of the cultivation scheme is checked before formal raising, and the raising effect can be predicted in advance, so that when selecting a raising scheme the breeder can intuitively observe the expected raising effect on his or her own pet and select a cultivation scheme that meets his or her expectations.
Referring to fig. 8, fig. 8 is a functional block diagram of a cultivation scheme inspection device according to an embodiment of the present application. As shown in fig. 8, the cultivation scheme inspection device 800 includes:
An acquisition module 801, configured to acquire personal information of a raising person, pet information of the pet raised by the raising person, and environment information of the raising space for raising the pet;
A construction module 802, configured to construct a rearing model according to the personal information and the environment information, the rearing model being used to simulate the living environment of the rearing space, and to construct an initial virtual pet according to the pet information so as to simulate the growth of the pet;
A simulation module 803, configured to input a cultivation scheme to be tested and the initial virtual pet into the rearing model and perform a cultivation simulation on the initial virtual pet to obtain a target virtual pet;
An inspection module 804, configured to obtain the difference between the target virtual pet and the initial virtual pet as difference information, and to inspect the cultivation scheme to be tested according to the difference information so as to determine its credibility.
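The interplay of the four modules can be pictured with the sketch below; the class and method names are hypothetical, and only the data flow between modules 801-804 is shown, not their internals.

```python
# Hypothetical sketch of the cultivation scheme inspection device 800.
# Module internals are placeholders; only the data flow is illustrated.

class CultureSchemeInspectionDevice:
    def __init__(self, acquisition, construction, simulation, inspection):
        self.acquisition = acquisition    # module 801
        self.construction = construction  # module 802
        self.simulation = simulation      # module 803
        self.inspection = inspection      # module 804

    def inspect(self, scheme_to_test):
        personal, pet, environment = self.acquisition.collect()
        rearing_model = self.construction.build_rearing_model(personal, environment)
        initial_pet = self.construction.build_initial_virtual_pet(pet)
        target_pet = self.simulation.simulate(rearing_model, initial_pet, scheme_to_test)
        difference = self.inspection.diff(target_pet, initial_pet)
        return self.inspection.credibility(scheme_to_test, difference)
```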
In this embodiment of the invention, the personal information may include work-and-rest information, feeding habit information and historical interaction information with the pet, where the feeding habit information is used to record the times and frequency at which the breeder feeds the pet, and the environment information may include house information, house position information and indoor furnishing information. On this basis, in constructing the rearing model from the personal information and the environment information, the construction module 802 is specifically configured to:
Constructing a point cloud model of a raising space according to house information and indoor furnishing information;
determining climate change information of the raising space according to the house position information;
determining the interaction time and interaction rule between the breeder and the pet according to the work-and-rest information, the feeding habit information and the historical interaction information;
and adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain a rearing model.
In this embodiment of the invention, in terms of adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain the rearing model, the construction module 802 is specifically configured to:
Establishing a time axis, determining at least one interaction interval in the time axis according to the interaction time, and generating at least one interaction event according to the interaction rule, wherein the at least one interaction event corresponds to the at least one interaction interval one by one;
filling each interaction event of the at least one interaction event into the corresponding interaction interval to generate an interaction time axis;
Determining climate characteristics in a first time period according to climate change information, current time and preset simulated culture duration, wherein the starting time of the first time period is the current time, and the duration of the first time period is the simulated culture duration;
Randomly generating at least one climate event according to the climate characteristics, randomly filling the at least one climate event into a time axis, and generating a climate time axis;
And superposing the climate time axis and the interaction time axis into the point cloud model to obtain a rearing model, so as to generate interaction events corresponding to the interaction time axis in the rearing model through the interaction time axis, and generate climate events corresponding to the climate time axis in the rearing model through the climate time axis.
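Under assumed data structures, the timeline construction and superposition described above might look like the following sketch; the interval format, event representations and the point-cloud layout are all assumptions, not the patented data model.

```python
import random
from datetime import datetime, timedelta

# Illustrative sketch only; event types and interval formats are assumptions.

def build_interaction_timeline(interaction_intervals, interaction_rule):
    # One interaction event per interaction interval, generated by the rule.
    return [{"interval": interval, "event": interaction_rule(interval)}
            for interval in interaction_intervals]

def build_climate_timeline(climate_features, start, simulated_days):
    # Randomly scatter one climate event per climate feature over the
    # first time period [start, start + simulated_days].
    events = [{"time": start + timedelta(days=random.uniform(0, simulated_days)),
               "event": feature}
              for feature in climate_features]
    return sorted(events, key=lambda e: e["time"])

def build_rearing_model(point_cloud_model, interaction_timeline, climate_timeline):
    # The rearing model is the point cloud plus the two timelines that drive
    # interaction events and climate events during the simulation.
    return {"space": point_cloud_model,
            "interactions": interaction_timeline,
            "climate": climate_timeline}

# Example usage with toy inputs.
model = build_rearing_model(
    point_cloud_model={"points": []},
    interaction_timeline=build_interaction_timeline(
        [("08:00", "08:30")], lambda interval: "feeding"),
    climate_timeline=build_climate_timeline(["rainy day"], datetime(2022, 4, 28), 30))
print(len(model["climate"]))  # 1
```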
In an embodiment of the present invention, the pet information may include a whole body image, medical record information and historical behavior information. Based on this, in constructing the initial virtual pet from the pet information, the construction module 802 is specifically configured to:
Extracting features of the whole body image to obtain body type features and at least one appearance feature of the pet;
determining character characteristics of the pet according to the historical behavior information;
Constructing a body model of the pet according to the body type characteristics and the medical record information;
Matching in a map library according to at least one appearance characteristic to obtain at least one appearance map, wherein the at least one appearance map corresponds to the at least one appearance characteristic one by one;
And constructing a first virtual pet according to the at least one appearance map and the body model, and adjusting the first virtual pet according to the health information and the character characteristics to obtain an initial virtual pet.
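Put together, the construction of the initial virtual pet can be sketched as follows; the feature extractor, map matcher and adjustment step are injected placeholders rather than the actual implementation. Passing the individual steps in as parameters keeps the orchestration readable without committing to any particular extractor or map library.

```python
# Hypothetical outline of assembling the initial virtual pet; every helper
# passed in is a placeholder standing for the corresponding step above.

def build_initial_virtual_pet(pet_info, extract_features, infer_character,
                              build_body_model, match_appearance_map, adjust):
    body_type, appearance_features = extract_features(pet_info["whole_body_image"])
    character = infer_character(pet_info["historical_behavior"])
    body_model = build_body_model(body_type, pet_info["medical_record"])
    appearance_maps = [match_appearance_map(f) for f in appearance_features]  # one map per feature
    first_virtual_pet = {"body": body_model, "maps": appearance_maps}
    health = pet_info["medical_record"].get("health")  # health information comes from the record
    return adjust(first_virtual_pet, health, character)
```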
In an embodiment of the present invention, in terms of constructing the body model of the pet according to the body type characteristics and the medical record information, the construction module 802 is specifically configured to:
determining age information, variety information and health information of the pets according to the medical record information;
Acquiring a corresponding body database according to the variety information, wherein the body database is used for storing standard body models of pets of the corresponding variety in different age stages and different health states;
according to the age information and the health information, matching a corresponding initial body model in a body database;
and adjusting the initial body model according to the body type characteristics to obtain the body model.
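A minimal sketch of this matching step follows, assuming a toy body database keyed by breed, age stage and health state; the breed, fields and figures are made up for illustration.

```python
# Hypothetical body database keyed by breed; each entry stores standard body
# models for (age stage, health state) combinations.

BODY_DATABASES = {
    "corgi": {("adult", "healthy"): {"length": 0.55, "height": 0.28, "weight": 11.0}},
}

def match_body_model(breed, age_stage, health_state, body_type_features):
    database = BODY_DATABASES[breed]
    initial = dict(database[(age_stage, health_state)])  # standard model for this breed/age/health
    # Adjust the standard model toward the measured body type (scale factor assumed).
    scale = body_type_features.get("scale", 1.0)
    return {key: value * scale for key, value in initial.items()}

print(match_body_model("corgi", "adult", "healthy", {"scale": 1.1}))
```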
In an embodiment of the present invention, the construction module 802 is specifically configured to, in terms of matching in a map library according to at least one appearance feature to obtain at least one appearance map:
Obtaining a corresponding position mapping library according to the position label of each appearance feature in at least one appearance feature, wherein the position label is used for marking the position of the pet body corresponding to each appearance feature, and the position mapping library is used for storing mapping materials of the corresponding position;
Matching in the corresponding position mapping library according to each appearance characteristic to obtain a corresponding mapping material group;
and carrying out mapping generation processing on the mapping material group corresponding to each appearance characteristic according to each appearance characteristic to obtain at least one appearance mapping.
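The part-label lookup might be sketched like this; the libraries, field names and the exact-colour matching rule are assumptions for illustration only.

```python
# Illustrative part map libraries; the matching rule and material fields are assumptions.

PART_MAP_LIBRARIES = {
    "head": [{"colour": "black", "texture": "short"}, {"colour": "white", "texture": "short"}],
    "back": [{"colour": "black", "texture": "long"}],
}

def match_appearance_maps(appearance_features):
    maps = []
    for feature in appearance_features:  # each feature carries a part label
        library = PART_MAP_LIBRARIES[feature["part"]]
        group = [m for m in library if m["colour"] == feature["colour"]]
        maps.append({"part": feature["part"], "materials": group})  # map generated from the group
    return maps

print(match_appearance_maps([{"part": "head", "colour": "black"}]))
```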
In the embodiment of the present invention, in terms of performing inspection processing on the cultivation scheme to be tested according to the difference information to determine the credibility of the cultivation scheme to be tested, the inspection module 804 is specifically configured to:
Extracting keywords from the culture scheme to be tested to obtain at least one piece of effect information corresponding to the culture scheme to be tested;
Splitting the difference information according to at least one piece of effect information to obtain at least one piece of sub-difference information, wherein the at least one piece of sub-difference information corresponds to the at least one piece of effect information one by one;
determining the standard reaching degree of the effect information corresponding to each piece of sub-difference information according to each piece of sub-difference information in at least one piece of sub-difference information to obtain at least one standard reaching rate;
Splitting at least one piece of effect information into at least one effect reaching the standard and at least one effect not reaching the standard according to a preset threshold value and at least one standard reaching rate;
And generating the credibility of the culture scheme to be checked according to at least one effect reaching the standard and at least one effect not reaching the standard.
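A compact sketch of the threshold split described above is given below, assuming each effect's standard-reaching rate is the ratio of the simulated change to the change promised in the scheme; the field names are hypothetical, while the 0.6 threshold follows the example given earlier in the text.

```python
# Sketch of the threshold split; the "promised" field and effect names are assumptions.

THRESHOLD = 0.6  # e.g. "actual effect >= 60% of the recorded effect" counts as up to standard

def split_effects(effect_info, sub_differences):
    reached, not_reached = [], []
    for effect, observed in zip(effect_info, sub_differences):
        rate = observed / effect["promised"] if effect["promised"] else 0.0
        (reached if rate >= THRESHOLD else not_reached).append(effect["name"])
    return reached, not_reached

reached, not_reached = split_effects(
    [{"name": "weight gain", "promised": 2.0}, {"name": "coat gloss", "promised": 1.0}],
    [1.5, 0.4])
print(reached, not_reached)  # ['weight gain'] ['coat gloss']
print(len(reached) / 2)      # unweighted credibility: 0.5
```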
Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 9, the electronic device 900 includes a transceiver 901, a processor 902 and a memory 903, which are connected to one another by a bus 904. The memory 903 is used to store computer programs and data, and the data stored in the memory 903 may be transferred to the processor 902.
The processor 902 is configured to read a computer program in the memory 903 to perform the following operations:
acquiring personal information of a raising person, pet information of a pet raised by the raising person and environment information of a raising space for raising the pet;
Constructing a rearing model according to the personal information and the environment information, wherein the rearing model is used for simulating the living environment of the rearing space;
Constructing an initial virtual pet according to the pet information so as to simulate the growth of the pet;
inputting a culture scheme to be tested and an initial virtual pet into a rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet;
obtaining the difference between the target virtual pet and the initial virtual pet to obtain difference information;
and carrying out inspection treatment on the cultivation scheme to be inspected according to the difference information so as to determine the credibility of the cultivation scheme to be inspected.
In this embodiment of the invention, the personal information may include work-and-rest information, feeding habit information and historical interaction information with the pet, where the feeding habit information is used to record the times and frequency at which the breeder feeds the pet, and the environment information may include house information, house position information and indoor furnishing information. On this basis, in constructing the rearing model from the personal information and the environment information, the processor 902 is specifically configured to perform the following operations:
Constructing a point cloud model of a raising space according to house information and indoor furnishing information;
determining climate change information of the raising space according to the house position information;
determining the interaction time and interaction rule between the breeder and the pet according to the work-and-rest information, the feeding habit information and the historical interaction information;
and adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain a rearing model.
In this embodiment of the invention, in terms of adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain the rearing model, the processor 902 is specifically configured to perform the following operations:
Establishing a time axis, determining at least one interaction interval in the time axis according to the interaction time, and generating at least one interaction event according to the interaction rule, wherein the at least one interaction event corresponds to the at least one interaction interval one by one;
filling each interaction event of the at least one interaction event into the corresponding interaction interval to generate an interaction time axis;
Determining climate characteristics in a first time period according to climate change information, current time and preset simulated culture duration, wherein the starting time of the first time period is the current time, and the duration of the first time period is the simulated culture duration;
Randomly generating at least one climate event according to the climate characteristics, randomly filling the at least one climate event into a time axis, and generating a climate time axis;
And superposing the climate time axis and the interaction time axis into the point cloud model to obtain a rearing model, so as to generate interaction events corresponding to the interaction time axis in the rearing model through the interaction time axis, and generate climate events corresponding to the climate time axis in the rearing model through the climate time axis.
In an embodiment of the present invention, the pet information may include a whole body image, medical record information and historical behavior information. Based on this, the processor 902 is specifically configured to perform the following operations in constructing the initial virtual pet from the pet information:
Extracting features of the whole body image to obtain body type features and at least one appearance feature of the pet;
determining character characteristics of the pet according to the historical behavior information;
Constructing a body model of the pet according to the body type characteristics and the medical record information;
Matching in a map library according to at least one appearance characteristic to obtain at least one appearance map, wherein the at least one appearance map corresponds to the at least one appearance characteristic one by one;
And constructing a first virtual pet according to the at least one appearance map and the body model, and adjusting the first virtual pet according to the health information and the character characteristics to obtain an initial virtual pet.
In an embodiment of the present invention, the processor 902 is specifically configured to perform the following operations in terms of constructing the body model of the pet according to the body type characteristics and the medical record information:
determining age information, variety information and health information of the pets according to the medical record information;
Acquiring a corresponding body database according to the variety information, wherein the body database is used for storing standard body models of pets of the corresponding variety in different age stages and different health states;
according to the age information and the health information, matching a corresponding initial body model in a body database;
and adjusting the initial body model according to the body type characteristics to obtain the body model.
In an embodiment of the present invention, the processor 902 is specifically configured to perform the following operations in terms of matching in a map library according to at least one appearance feature to obtain at least one appearance map:
Obtaining a corresponding position mapping library according to the position label of each appearance feature in at least one appearance feature, wherein the position label is used for marking the position of the pet body corresponding to each appearance feature, and the position mapping library is used for storing mapping materials of the corresponding position;
Matching in the corresponding position mapping library according to each appearance characteristic to obtain a corresponding mapping material group;
and carrying out mapping generation processing on the mapping material group corresponding to each appearance characteristic according to each appearance characteristic to obtain at least one appearance mapping.
In an embodiment of the present invention, the processor 902 is specifically configured to perform the following operations in terms of performing an inspection process on the cultivation scheme to be inspected according to the difference information to determine the credibility of the cultivation scheme to be inspected:
Extracting keywords from the culture scheme to be tested to obtain at least one piece of effect information corresponding to the culture scheme to be tested;
Splitting the difference information according to at least one piece of effect information to obtain at least one piece of sub-difference information, wherein the at least one piece of sub-difference information corresponds to the at least one piece of effect information one by one;
determining the standard reaching degree of the effect information corresponding to each piece of sub-difference information according to each piece of sub-difference information in at least one piece of sub-difference information to obtain at least one standard reaching rate;
Splitting at least one piece of effect information into at least one effect reaching the standard and at least one effect not reaching the standard according to a preset threshold value and at least one standard reaching rate;
And generating the credibility of the culture scheme to be checked according to at least one effect reaching the standard and at least one effect not reaching the standard.
It should be understood that the cultivation scheme inspection device in the present application may be a smartphone (such as an Android phone, an iOS phone or a Windows phone), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a robot, a wearable device, and so on. The devices listed above are merely examples and are not exhaustive; the cultivation scheme inspection device includes, but is not limited to, them. In practical applications, the cultivation scheme inspection device may also be an intelligent vehicle-mounted terminal, a computer device, and the like.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present invention may be implemented in software in combination with a hardware platform. With such an understanding, all or part of the technical solution of the present invention that contributes over the background art may be embodied in the form of a software product, which may be stored in a storage medium such as a ROM/RAM, a magnetic disk or an optical disk, and which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the various embodiments, or in parts of the embodiments, of the present invention.
Accordingly, an embodiment of the present application also provides a computer-readable storage medium storing a computer program that is executed by a processor to implement some or all of the steps of any one of the incubation protocol verification methods described in the above method embodiments. For example, the storage medium may include a hard disk, a floppy disk, an optical disk, a magnetic tape, a magnetic disk, a flash memory, etc.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the incubation protocol verification methods described in the method embodiments above.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are alternative embodiments, and that the acts and modules involved are not necessarily required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and for those portions of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a division by logical function, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices or units, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software program modules.
The integrated units, if implemented in the form of software program modules and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product that is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The memory includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing related hardware, where the program may be stored in a computer-readable memory, and the memory may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, etc.
The embodiments of the present application have been described in detail above, and specific examples have been used herein to explain the principles and implementations of the application; the description of the above embodiments is intended only to help in understanding the method of the application and its core idea, and is in no way limiting. In summary, the content of this specification should not be construed as limiting the application.

Claims (10)

1. A method of testing a culture protocol, the method comprising:
Acquiring personal information of a raising person, pet information of a pet raised by the raising person and environment information of a raising space for raising the pet;
constructing a rearing model according to the personal information and the environment information, wherein the rearing model is used for simulating living environment, climate and interaction events of the rearing space;
Constructing an initial virtual pet according to the pet information so as to simulate the growth of the pet, wherein the pet information comprises a whole body image, medical record information and historical behavior information;
Inputting a culture scheme to be tested and the initial virtual pet into the rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet;
obtaining the difference between the target virtual pet and the initial virtual pet to obtain difference information;
performing inspection treatment on the cultivation scheme to be inspected according to the difference information so as to determine the credibility of the cultivation scheme to be inspected;
Wherein the inputting a culture scheme to be tested and the initial virtual pet into the rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet, comprises:
performing parameter adjustment on the interaction events in an interaction time axis in the rearing model according to the culture scheme to be tested, so that the rearing model generates corresponding interaction events and climate events over time and interacts with the initial virtual pet in the rearing model to simulate the real rearing process;
and adjusting parameters of the initial virtual pet according to the results of the interaction of the interaction events and climate events with the initial virtual pet, to obtain the target virtual pet.
2. The method of claim 1, wherein
The personal information comprises information of work and rest, feeding habit information and historical interaction information with the pet, wherein the feeding habit information is used for recording time and frequency of feeding the pet by the rearer;
the environment information comprises house information, house position information and indoor decoration information;
the building of a rearing model according to the personal information and the environment information comprises the following steps:
Constructing a point cloud model of a raising space according to the house information and the indoor furnishing information;
determining climate change information of the raising space according to the house position information;
determining the interaction time and interaction rule between the rearer and the pet according to the work-and-rest information, the feeding habit information and the historical interaction information;
and adjusting the point cloud model according to the climate change information, the interaction time and the interaction rule to obtain the rearing model.
3. The method of claim 2, wherein the adjusting the point cloud model according to the climate change information, the interaction time and the interaction law to obtain the feeding model comprises:
Establishing a time axis, determining at least one interaction interval in the time axis according to the interaction time, and generating at least one interaction event according to the interaction rule, wherein the at least one interaction event corresponds to the at least one interaction interval one by one;
filling each interaction event of the at least one interaction event into the corresponding interaction interval, and generating the interaction time axis;
determining climate characteristics in a first time period according to the climate change information, the current time and a preset simulated culture duration, wherein the starting moment of the first time period is the current time, and the duration of the first time period is the simulated culture duration;
Randomly generating at least one climate event according to the climate characteristics, randomly filling the at least one climate event into the time axis, and generating a climate time axis;
And superposing the climate time axis and the interaction time axis into the point cloud model to obtain the feeding model, so as to generate interaction events corresponding to the interaction time axis in the feeding model through the interaction time axis, and generate climate events corresponding to the climate time axis in the feeding model through the climate time axis.
4. The method of claim 1, wherein said constructing an initial virtual pet from said pet information comprises:
extracting features of the whole body image to obtain body type features and at least one appearance feature of the pet;
determining character features of the pets according to the historical behavior information;
Constructing a body model of the pet according to the body type characteristics and the medical record information;
Matching in a map library according to the at least one appearance feature to obtain at least one appearance map, wherein the at least one appearance map corresponds to the at least one appearance feature one by one;
And constructing a first virtual pet according to the at least one appearance map and the body model, and adjusting the first virtual pet according to the health information and the character characteristics to obtain the initial virtual pet.
5. The method of claim 4, wherein the constructing the body model of the pet from the body type characteristics and the medical record information comprises:
determining age information, variety information and health information of the pets according to the medical record information;
acquiring a corresponding body database according to the variety information, wherein the body database is used for storing standard body models of pets of corresponding varieties in different age stages and different health states;
according to the age information and the health information, matching a corresponding initial body model in the body database;
and adjusting the initial body model according to the body type characteristics to obtain the body model.
6. The method of claim 4, wherein the matching in the map library according to the at least one appearance feature to obtain the at least one appearance map comprises:
Obtaining a corresponding part map library according to the part label of each appearance feature in the at least one appearance feature, wherein the part label is used for marking the part of the pet body corresponding to each appearance feature, and the part map library is used for storing map materials of the corresponding part;
matching in the corresponding part map library according to each appearance characteristic to obtain a corresponding map material group;
and carrying out mapping generation processing on the mapping material group corresponding to each appearance characteristic according to each appearance characteristic to obtain at least one appearance mapping.
7. The method according to any one of claims 1 to 6, wherein the performing the inspection process on the cultivation scheme to be inspected based on the difference information to determine the credibility of the cultivation scheme to be inspected comprises:
extracting keywords from the culture scheme to be inspected to obtain at least one piece of effect information corresponding to the culture scheme to be inspected;
Splitting the difference information according to the at least one piece of effect information to obtain at least one piece of sub-difference information, wherein the at least one piece of sub-difference information corresponds to the at least one piece of effect information one by one;
Determining the standard reaching degree of the effect information corresponding to each piece of sub-difference information according to each piece of sub-difference information in the at least one piece of sub-difference information, and obtaining at least one standard reaching rate;
Splitting the at least one piece of effect information into at least one effect reaching the standard and at least one effect not reaching the standard according to a preset threshold value and the at least one standard reaching rate;
and generating the credibility of the culture scheme to be checked according to the at least one effect reaching the standard and the at least one effect not reaching the standard.
8. A culture protocol verification device, the device comprising:
the system comprises an acquisition module, a control module and a control module, wherein the acquisition module is used for acquiring personal information of a raising person, pet information of a pet raised by the raising person and environment information of a raising space for raising the pet;
The building module is used for building a rearing model according to the personal information and the environment information, wherein the rearing model is used for simulating living environment, climate and interaction events of the rearing space and building an initial virtual pet according to the pet information so as to simulate the growth of the pet, and the pet information comprises whole body images, medical record information and historical behavior information;
The simulation module is used for inputting a culture scheme to be tested and the initial virtual pet into the rearing model, and carrying out culture simulation on the initial virtual pet to obtain a target virtual pet;
the verification module is used for obtaining the difference between the target virtual pet and the initial virtual pet, obtaining difference information, and carrying out verification processing on the cultivation scheme to be verified according to the difference information so as to determine the credibility of the cultivation scheme to be verified;
Wherein, in the aspect of inputting the cultivation scheme to be checked and the initial virtual pet into the rearing model, the initial virtual pet is cultivated and simulated to obtain the target virtual pet, the simulation module is specifically configured to:
according to the culture scheme to be checked, parameter adjustment is carried out on the interaction events in the interaction time axis in the rearing model, so that the rearing model generates corresponding interaction events and climate events over time and interacts with the initial virtual pet in the rearing model to simulate the real rearing process;
And adjusting parameters of the initial virtual pet according to the interaction event and the climate event and the interaction result of the initial virtual pet to obtain a target virtual pet.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured for execution by the processor, the one or more programs comprising instructions for performing the steps of the method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of any of claims 1-7.