
CN114036568A - Computing task processing method and device based on cloud communication - Google Patents

Info

Publication number
CN114036568A
CN114036568A
Authority
CN
China
Prior art keywords
target
computing task
task
user
approval
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111401119.6A
Other languages
Chinese (zh)
Inventor
张翅飞
卢彬彬
蔡冠男
张乃元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd filed Critical Alibaba China Co Ltd
Priority to CN202111401119.6A
Publication of CN114036568A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6227 Protecting access to data where protection concerns the structure of data, e.g. records, types, queries
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q 40/02 Banking, e.g. interest calculation or account maintenance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/083 Network architectures or network communication protocols for network security for authentication of entities using passwords

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Bioethics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Databases & Information Systems (AREA)
  • Signal Processing (AREA)
  • General Business, Economics & Management (AREA)
  • Technology Law (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiments of this specification provide a computing task processing method and device based on cloud communication. The method can be applied to an intermediate service platform and includes: receiving a computing task submitted by a customer, where the computing task includes an authorization code and at least one user identifier, the authorization code indicates a target label the customer has been authorized to use, and the target label is associated with a target supplier of the customer; sending a score query request to the target supplier's server, where the score query request includes the at least one user identifier and the target label; receiving a query result returned by the server, where the query result includes a user identifier from the at least one user identifier and the score, under the target label, of the user indicated by that identifier, the score being predicted by a scoring model associated with the target label in the server; and generating, according to the query result, a calculation result corresponding to the computing task.

Figure 202111401119

Description

Computing task processing method and device based on cloud communication
Technical Field
The embodiment of the specification relates to the technical field of cloud communication, in particular to a computing task processing method and device based on cloud communication.
Background
At present, enterprises generally need data to improve the quality and efficiency of their services, but their own data is often insufficient or incomplete. Their suppliers, by contrast, generally hold large amounts of data, yet for reasons such as data security and user privacy, that data cannot be output in an online, efficient, and compliant way.
Therefore, a reasonable and reliable scheme is urgently needed that can help suppliers output their own data legally and compliantly, providing services for enterprises.
Disclosure of Invention
The embodiments of the specification provide a computing task processing method and device based on cloud communication, which can help a supplier output its own data legally and compliantly to provide services for enterprises.
In a first aspect, an embodiment of the present specification provides a computing task processing method based on cloud communication, which is applied to an intermediate service platform in a cloud communication platform, and includes: receiving a computing task submitted by a customer, the computing task including an authorization code and at least one user identification, the authorization code indicating a target tag that the customer is authorized to use, the target tag being associated with a target provider of the customer; sending a scoring query request to a server of the target provider, the scoring query request including the at least one user identifier and the target tag; receiving a query result returned by the server, wherein the query result comprises a user identifier in the at least one user identifier and a score of a user indicated by the user identifier under the target tag, and the score is predicted by a scoring model associated with the target tag in the server; and generating a calculation result corresponding to the calculation task according to the query result.
In a second aspect, an embodiment of the present specification provides a computing task processing method, which is applied to an intermediate service platform, and includes: receiving a computing task submitted by a customer, the computing task including an authorization code and at least one user identification, the authorization code indicating a target tag that the customer is authorized to use, the target tag being associated with a target provider of the customer; sending a scoring query request to a server of the target provider, the scoring query request including the at least one user identifier and the target tag; receiving a query result returned by the server, wherein the query result comprises a user identifier in the at least one user identifier and a score of a user indicated by the user identifier under the target tag, and the score is predicted by a scoring model associated with the target tag in the server; and generating a calculation result corresponding to the calculation task according to the query result.
In some embodiments, before receiving the computing task submitted by the customer, the method further includes: receiving a use application submitted by the customer for the target tag; sending a first approval request to a first approval end according to the use application; receiving an approval result returned by the first approval end; and, in response to the approval result indicating that the customer is allowed to use the target tag, generating the authorization code for the customer and returning the approval result to the customer.
In some embodiments, the computing task is a real-time computing task or an offline computing task.
In some embodiments, the computing task is the real-time computing task; and after the generating of the calculation result corresponding to the calculation task, further comprising: and returning the calculation result to the client.
In some embodiments, the computing task is the offline computing task, and after generating the calculation result corresponding to the computing task, the method further includes: receiving a request from the customer to obtain the calculation result, and returning the calculation result to the customer; and/or, in response to the offline computing task being a cyclic task, when the next computing period after the current computing period arrives, again executing the sending of the scoring query request to the server of the target provider.
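The cyclic offline task behavior described above can be sketched as a minimal loop. This is an illustrative sketch only, not the patented implementation; the function name `run_cycles` and its parameters are assumptions for demonstration.

```python
from typing import Callable

def run_cycles(task_is_cyclic: bool, periods: int, send_query: Callable[[], None]) -> int:
    """Execute send_query once per computing period. A cyclic offline task
    re-issues the score query when each new period arrives; a non-cyclic
    offline task runs exactly once."""
    runs = 0
    for _ in range(periods if task_is_cyclic else 1):
        send_query()  # re-execute "sending the scoring query request"
        runs += 1
    return runs

calls = []
run_cycles(True, 3, lambda: calls.append("query"))  # issues the query 3 times
```

In a real deployment, the loop would be driven by a scheduler that fires at each computing period rather than iterating immediately.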
In some embodiments, the computing task is the offline computing task, the offline computing task further includes a model identification of the scoring model, and the scoring model is stored in the intermediate service platform; and before sending the scoring query request to the server of the target provider, the method further includes: sending, according to the offline computing task, a second approval request to a second approval end, so that after the offline computing task passes task feasibility approval, the second approval end deploys the scoring model to the server through a model processing end; and receiving the model deployment completion notification returned by the second approval end.
In some embodiments, the scoring model is generated according to modeling requirements of the customer; and before receiving the computing task submitted by the customer, the method further includes: receiving a model creation request from the customer, where the model creation request includes the modeling requirement; sending, according to the model creation request, a third approval request to the second approval end, so that after the modeling requirement passes feasibility approval, the second approval end obtains the scoring model through the model processing end according to at least the modeling requirement; receiving the scoring model returned by the second approval end; and generating the model identification for the scoring model.
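The model-creation flow above can be sketched as follows. The registry structure, function name, and stubbed "scoring model" are all illustrative assumptions, not identifiers from the patent; the actual approval and model-training steps are external services that are faked here.

```python
import uuid

MODEL_REGISTRY: dict[str, dict] = {}  # model identification -> scoring model

def handle_model_creation(modeling_requirement: str) -> str:
    """Simulate: forward the modeling requirement for feasibility approval,
    receive the resulting scoring model, and generate a model identification."""
    # Third approval request: assume the requirement passes feasibility review
    # and the model processing end returns a trained scoring model (stubbed).
    scoring_model = {"requirement": modeling_requirement, "weights": [0.1, 0.9]}
    model_id = uuid.uuid4().hex  # generate the model identification
    MODEL_REGISTRY[model_id] = scoring_model
    return model_id

mid = handle_model_creation("predict repayment intention from behavior data")
```

The returned model identification is what a later offline computing task would carry so the platform can have the model deployed to the provider's server.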
In some embodiments, the computing task further comprises a task type; and generating a calculation result corresponding to the calculation task according to the query result, wherein the calculation result comprises: and generating the calculation result according to the task type and the query result.
In some embodiments, the task type is any one of: user sorting, user filtering, normalization calculation, sub-label score prediction.
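A hedged sketch of how three of the four task types listed above might turn a query result (user identifier to score) into a calculation result; sub-label score prediction is omitted because it depends on the provider's model. The function name and threshold parameter are illustrative assumptions, not from the patent.

```python
def apply_task_type(task_type: str, scores: dict[str, float], threshold: float = 0.5):
    if task_type == "user_sorting":
        # Sort user identifiers by descending score.
        return sorted(scores, key=scores.get, reverse=True)
    if task_type == "user_filtering":
        # Keep only users whose score clears the threshold.
        return {u: s for u, s in scores.items() if s >= threshold}
    if task_type == "normalization":
        # Min-max normalize scores into [0, 1].
        lo, hi = min(scores.values()), max(scores.values())
        span = (hi - lo) or 1.0
        return {u: (s - lo) / span for u, s in scores.items()}
    raise ValueError(task_type)

apply_task_type("user_sorting", {"a": 0.2, "b": 0.9})  # -> ["b", "a"]
```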
In some embodiments, the target tag belongs to a scenario of an industry; and/or each user identification is any one of: a mobile phone number, an International Mobile Equipment Identity (IMEI), an Identifier for Advertisers (IDFA), or an Anonymous Device Identifier (OAID); and/or the scoring model is provided to the target provider by the intermediate service platform.
In a third aspect, an embodiment of the present specification provides a method for applying for using a tag, including: receiving a use application submitted by a client aiming at a target label; sending a first approval request to a first approval end according to the use application; receiving an approval result returned by the first approval end; and in response to the approval result being that the client is allowed to use the target label, generating an authorization code associated with the target label for the client, and returning the approval result to the client.
In a fourth aspect, an embodiment of the present specification provides a computing task processing device based on cloud communication, which is applied to an intermediate service platform in a cloud communication platform, and includes: a first receiving unit configured to receive a computing task submitted by a customer, the computing task including an authorization code and at least one user identification, the authorization code indicating a target tag that the customer is authorized to use, the target tag being associated with a target provider of the customer; a first sending unit configured to send a scoring query request to a server of the target provider, the scoring query request including the at least one user identifier and the target tag; a second receiving unit, configured to receive a query result returned by the server, where the query result includes a user identifier in the at least one user identifier, and a score of a user indicated by the user identifier under the target tag, where the score is predicted by a scoring model associated with the target tag in the server; and the generating unit is configured to generate a calculation result corresponding to the calculation task according to the query result.
In a fifth aspect, an embodiment of the present specification provides a computing task processing apparatus, which is applied to an intermediate service platform, and includes: a first receiving unit configured to receive a computing task submitted by a customer, the computing task including an authorization code and at least one user identification, the authorization code indicating a target tag that the customer is authorized to use, the target tag being associated with a target provider of the customer; a first sending unit configured to send a scoring query request to a server of the target provider, the scoring query request including the at least one user identifier and the target tag; a second receiving unit, configured to receive a query result returned by the server, where the query result includes a user identifier in the at least one user identifier, and a score of a user indicated by the user identifier under the target tag, where the score is predicted by a scoring model associated with the target tag in the server; and the generating unit is configured to generate a calculation result corresponding to the calculation task according to the query result.
In a sixth aspect, an embodiment of the present specification provides an apparatus for applying for use of a tag, including: the first receiving unit is configured to receive a use application submitted by a client aiming at a target label; the sending unit is configured to send a first approval request to a first approval end according to the use application; the second receiving unit is configured to receive the approval result returned by the first approval end; a processing unit configured to generate an authorization code associated with the target tag for the customer in response to the approval result being that the customer is allowed to use the target tag, and return the approval result to the customer.
In a seventh aspect, the present specification provides a computer-readable storage medium, on which a computer program is stored, wherein when the computer program is executed in a computer, the computer is caused to execute the method described in any implementation manner of the first aspect to the third aspect.
In an eighth aspect, the present specification provides a computing device, including a memory and a processor, where the memory stores executable code, and the processor executes the executable code to implement the method described in any implementation manner of the first to third aspects.
In a ninth aspect, the present specification provides a computer program, wherein when the computer program is executed in a computer, the computer is caused to execute the method described in any implementation manner of the first to third aspects.
The cloud communication-based computing task processing method and device provided by the above embodiments of this specification enable the intermediate service platform to receive a computing task submitted by a customer, obtain, according to the computing task, score data output by a target provider of the customer in an online, efficient, and compliant manner, and generate a computing result corresponding to the computing task based on the score data. Subsequently, the computing result may be returned to the customer. The intermediate service platform thus bridges customer requirements and the compliant use of supplier data, helping suppliers output their own data legally and compliantly to provide services for enterprises.
Drawings
To more clearly illustrate the technical solutions of the embodiments disclosed in this specification, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments disclosed in this specification; those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is an exemplary system architecture diagram to which some embodiments of the present description may be applied;
FIG. 2 is a schematic diagram of one embodiment of a computing task processing method;
FIG. 3 is a schematic diagram of a process for applying for the use of a target tag;
FIG. 4 is a schematic diagram of a tag usage application submission process;
FIG. 5 is a schematic view of a label square interface;
FIG. 6 is another schematic view of a label square interface;
FIG. 7 is a schematic diagram of a model creation process and a model deployment process;
FIG. 8 is a schematic diagram of a configuration of a computing task processing device.
Detailed Description
The present specification will be described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described here are merely illustrative of the relevant invention, not restrictive of it. The described embodiments are only some, not all, of the embodiments of this specification. All other embodiments obtained by a person skilled in the art based on the embodiments in this specification without any inventive step fall within the scope of the present application.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present description may be combined with each other without conflict.
As mentioned above, enterprises generally need data to improve their service quality and efficiency, but their own data is often insufficient or incomplete. Their suppliers generally hold a great deal of data, yet for reasons of data security and user privacy, they have no way to output that data online, efficiently, and in compliance.
Based on this, some embodiments of this specification provide a computing task processing method that helps a provider output its own data legally and compliantly to provide services for enterprises. FIG. 1 illustrates an exemplary system architecture diagram suitable for these embodiments.
As shown in fig. 1, the system architecture may include a terminal device of the customer, an intermediate service platform, and the respective servers of at least one supplier of the customer. A single customer may be an organization, which may include, but is not limited to, an enterprise. The intermediate service platform may be, for example, a sub-platform of the cloud communication platform, and may communicate with the terminal device and the servers, respectively.
For any of the at least one supplier, at least one scoring model may be deployed in that supplier's server, and the at least one scoring model may be associated with different tags. It should be noted that the supplier typically has its own user data, such as user attribute information and/or historical behavior data of multiple users, and may have the ability to score users based on that data: for example, the ability to produce a loan intention score, a credit card application intention score, a repayment intention score, an automobile purchase intention score, and/or a financial product purchase intention score, among others. Based on this, the tags respectively associated with the at least one scoring model may be obtained by packaging the capabilities possessed by the supplier.
It should be noted that the tags obtained by packaging the loan intention degree scoring capability, the credit card application intention degree scoring capability, the repayment intention degree scoring capability, the automobile purchase intention degree scoring capability and the financial product purchase intention degree scoring capability respectively may be referred to as a loan intention degree tag, a credit card application intention degree tag, a repayment intention degree tag, an automobile purchase intention degree tag and a financial product purchase intention degree tag in sequence.
In practice, the labels respectively associated with the at least one scoring model can be provided to the customer through the intermediate service platform. For a label the client requires, the client can first apply to the intermediate service platform for permission to use the label; after determining that the client is allowed to use the label, the intermediate service platform can generate an authorization code associated with the label for the client, thereby granting the client permission to use the label. It is noted that this process may be referred to as a tag application process.
After the client obtains permission to use the tag, if there is a business requirement related to the tag, such as a risk assessment requirement or an information pushing requirement, the client may submit a computing task related to the tag to the intermediate service platform through its terminal device. The intermediate service platform then obtains score data related to the tag from a server of a target provider of the client according to the computing task, and generates a computing result corresponding to the computing task according to the score data. The server deploys a scoring model related to the tag, and the score data is predicted by that scoring model. It should be noted that this processing of the computing task may be referred to as a tag usage procedure.
Take as an example a customer that is bank A, with the at least one supplier including suppliers B1, …, BN (N being a natural number greater than 1), where suppliers B1 and BN each deploy a scoring model associated with the repayment intention label. The following further describes the label usage process of bank A after it obtains permission to use the repayment intention label.
Assuming that user C applies for a loan from bank A, bank A typically needs to assess user C's repayment ability before approving the loan application, to avoid risks such as capital loss. In practice, user C's repayment ability may be measured based on user C's willingness to repay. Based on this, bank A may submit a calculation task related to the repayment intention label to the intermediate service platform through its terminal device; the task may include the user identification of user C, such as the mobile phone number shown in fig. 1, and an authorization code associated with the repayment intention label.
Then, the intermediate service platform may determine the repayment intention label according to the authorization code in the calculation task, determine that suppliers B1 and BN have deployed scoring models associated with the repayment intention label, and then send score query requests to the servers of suppliers B1 and BN respectively according to the calculation task. Each score query request may include user C's mobile phone number and the repayment intention label.
The servers of suppliers B1 and BN may then each return a query result in response to the score query request. A query result may include, for example, user C's mobile phone number and user C's score under the repayment intention label, where the score is predicted by the scoring model. The score may be predicted in advance by the scoring model, or predicted on the spot based on user C's user data, and is not specifically limited here.
Then, the intermediate service platform can generate a calculation result corresponding to the calculation task according to the query result. The calculation result may include the mobile phone number and the target score of the user C. In one example, the target score may be a score in a query result returned by a server of either of the vendors B1, BN. In another example, the target score may be the highest score among the query results returned by the providers B1, BN, respectively. In yet another example, the target score may be an average of the scores in the query results returned by vendors B1, BN, respectively.
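The three target-score strategies described above (any one provider's score, the highest score, or the average) can be captured in a small helper. This is an illustrative sketch; the function name and strategy labels are assumptions, not from the patent.

```python
def aggregate_target_score(provider_scores: list[float], strategy: str = "max") -> float:
    """Combine the scores returned by multiple providers into a single target score."""
    if strategy == "any":
        return provider_scores[0]  # score from any one provider's query result
    if strategy == "max":
        return max(provider_scores)  # highest score among the query results
    if strategy == "avg":
        return sum(provider_scores) / len(provider_scores)  # average of the scores
    raise ValueError(f"unknown strategy: {strategy}")

aggregate_target_score([0.62, 0.58], "max")  # -> 0.62
```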
Subsequently, the intermediate service platform may actively return the calculation result to the bank a, or return the calculation result to the bank a in response to a request for obtaining the calculation result from the bank a. For example, when the calculation task is a real-time calculation task, after a calculation result corresponding to the real-time calculation task is generated, the calculation result may be immediately returned to the bank a. When the calculation task is an offline calculation task, after a calculation result corresponding to the offline calculation task is generated, the calculation result may be returned to the bank a in response to an acquisition request of the bank a for the calculation result.
By adopting the label usage process described above, namely the computing task processing process, the intermediate service platform bridges customer requirements and the compliant use of supplier data, helping suppliers output their own data legally and compliantly to provide services for enterprises.
The following describes specific implementation steps of the above method with reference to specific examples.
Referring to FIG. 2, a diagram of one embodiment of a computing task processing method is shown. The method comprises the following steps:
Step 202: the intermediate service platform receives a computing task submitted by a customer, where the computing task includes an authorization code and at least one user identifier, the authorization code indicates a target label the customer is authorized to use, and the target label is associated with a target supplier of the customer;
Step 204: the intermediate service platform sends a score query request to a server of the target provider, where the score query request includes the at least one user identifier and the target label;
Step 206: the intermediate service platform receives a query result returned by the server, where the query result includes a user identifier from the at least one user identifier and the score, under the target label, of the user indicated by that identifier, the score being predicted by a scoring model associated with the target label in the server;
Step 208: the intermediate service platform generates a calculation result corresponding to the computing task according to the query result.
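The four steps above (202-208) can be sketched end to end as follows. This is an illustrative sketch under stated assumptions, not the patented implementation: the class and function names, the lookup tables, and the fake deterministic provider scores are all invented for demonstration; a real platform would issue an RPC or HTTP call to the provider's server.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ComputingTask:
    authorization_code: str
    user_ids: list[str]

# Assumed lookup tables populated during the tag application process.
AUTH_CODE_TO_TAG = {"auth-123": "repayment_intention"}
TAG_TO_PROVIDER = {"repayment_intention": "provider-B1"}

def query_provider(provider: str, user_ids: list[str], tag: str) -> dict[str, float]:
    """Stand-in for the score query request (step 204) and the returned
    query result (step 206): here, a deterministic fake score per user."""
    return {uid: (int(hashlib.md5(uid.encode()).hexdigest(), 16) % 100) / 100
            for uid in user_ids}

def process_task(task: ComputingTask) -> dict[str, float]:
    # Step 202: resolve the target tag from the authorization code.
    tag = AUTH_CODE_TO_TAG[task.authorization_code]
    provider = TAG_TO_PROVIDER[tag]
    # Steps 204-206: query the provider's scoring model for each user.
    scores = query_provider(provider, task.user_ids, tag)
    # Step 208: generate the calculation result from the query result.
    return scores

result = process_task(ComputingTask("auth-123", ["13800000000"]))
```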
The above steps are further explained below.
In step 202, the intermediate service platform may receive a computing task submitted by a customer; the computing task may include an authorization code and at least one user identifier. Each user identifier indicates a user, typically a user of the customer.
A single customer may be an organization, which may include, but is not limited to, an enterprise. An individual user identifier may be, for example, a numeric identifier. Further, an individual user identifier may be, for example, a mobile phone number, an International Mobile Equipment Identity (IMEI), an Identifier for Advertisers (IDFA), or an Anonymous Device Identifier (OAID), among others.
The authorization code may indicate that the customer has been authorized to use a target tag, which may be associated with a target provider of the customer. In practice, the target provider has the ability to score the user for the target tag, which can be obtained by packaging the ability.
In some embodiments, the authorization code may be generated through an approval process in response to the customer's application to use the target tag. Specifically, the application process for use of the target tag may be as shown in fig. 3. The application process includes the following steps:
Step 302: the intermediate service platform receives a use application submitted by a client for a target label;
Step 304: the intermediate service platform sends a first approval request to the first approval end according to the use application;
Step 306: the intermediate service platform receives the approval result returned by the first approval end;
Step 308: in response to the approval result indicating that the client is allowed to use the target label, the intermediate service platform generates an authorization code associated with the target label for the client;
Step 310: the intermediate service platform returns the approval result to the client.
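The approval flow in steps 302-310 can be sketched as follows. All names here are illustrative assumptions; in particular, the first approval end is simulated by the `approved` argument rather than a real approval service, and the authorization-code format is an arbitrary choice.

```python
import secrets

AUTHORIZATIONS: dict[str, str] = {}  # authorization code -> target label

def approve_use_application(client: str, target_label: str, approved: bool) -> dict:
    """Simulate steps 306-310: receive the approval result and, if the client
    is allowed to use the label, generate an associated authorization code."""
    result = {"client": client, "label": target_label, "approved": approved}
    if approved:
        code = secrets.token_hex(8)  # step 308: generate the authorization code
        AUTHORIZATIONS[code] = target_label
        result["authorization_code"] = code
    return result  # step 310: return the approval result to the client

resp = approve_use_application("bank-A", "loan_intention", True)
```

The recorded mapping from authorization code to label is what lets step 202 of the tag usage flow later resolve a submitted authorization code back to its target label.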
In practice, the intermediate service platform may provide the customer with an interface that presents labels, which may be referred to as a label square interface. Alongside each label, the label square interface may present a corresponding use application entry. The client can access the label square interface through its terminal device, for example through a browser. When the client finds the required target label, it can create and submit a use application for that label through the corresponding application entry.
Referring to fig. 4, a schematic diagram of a tag application submission process is shown. The label square interface indicated by reference numeral 401 in fig. 4 presents a loan intention label and a credit card application intention label, and the two labels respectively correspond to the application entry, i.e., the application opening button. Assuming that the loan intention label is a target label required by the customer, the customer may click an application opening button corresponding to the loan intention label and enter an application opening window indicated by the reference numeral 402. Thereafter, the customer may enter specified information in the application opening window, such as the application title, the application reason, the application material, and check a check box to approve the opening and use the loan intention label service. The customer may then click on the submit application button to submit a use application for the loan intention label to the intermediate service platform.
In some embodiments, after the scoring capability is packaged into a label as described above, the label may be further classified under one or more industries, and within each industry the label may belong to a specific scenario. For example, the loan intention label may be classified under industries such as finance and real estate, and in both the financial industry and the real estate industry the loan intention label may belong to a loan scenario. As another example, the credit card application intention label may be classified under the financial industry and belong to a credit card scenario in that industry.
On this basis, the label square interface may also present the industry and scenario to which each label belongs, and the label square interface indicated by reference numeral 401 in fig. 4 may be further refined into the interface shown in fig. 5. It should be noted that the same label in different industries/scenarios may correspond to different scoring models or different data sources, which is not specifically limited herein.
It should be understood that the label square interface and the application opening window shown in fig. 4, as well as the label square interface shown in fig. 5, are merely exemplary; both may be designed according to actual requirements and are not specifically limited herein.
Based on the above label application submission process, in step 302, the intermediate service platform may receive a use application created and submitted by the customer through the use application entry corresponding to the target tag.
Next, in step 304, the intermediate service platform may send a first approval request to the first approval end according to the use application. The first approval request may be a request to approve whether the customer is allowed to use the target tag, and may include, for example, a customer identifier of the customer, the target tag, and at least part of the content entered by the customer in the use application, such as the application reason and the application material.
It should be noted that the first approval end may be, for example, a client or a server used by the operation department of the intermediate service platform for use-application approval, and is not specifically limited herein. In one example, the first approval end may automatically approve whether the customer is allowed to use the target label according to the first approval request and a deployed application approval algorithm. In another example, manual intervention may be employed at the first approval end to approve whether the customer is allowed to use the target label. It should be understood that various approval methods may be adopted when approving the use application, and are not specifically limited herein.
After completing the approval according to the first approval request, the first approval end may return an approval result to the intermediate service platform. The approval result may be either that the customer is allowed to use the target label or that the customer is not allowed to use the target label. If the approval result is that the customer is allowed to use the target tag, the intermediate service platform may generate an authorization code associated with the target tag for the customer by performing step 308, and may then return the approval result to the customer by performing step 310. In some embodiments, the intermediate service platform may also return the authorization code to the customer in step 310.
After obtaining an approval result indicating that use of the target label is allowed, the customer knows that it has obtained the right to use the target label and can subsequently use it, for example by submitting a computing task related to the target label to the intermediate service platform.
Returning to fig. 2, after receiving the computing task submitted by the customer, the intermediate service platform may obtain, according to the authorization code in the computing task, the target tag associated with the authorization code and the target provider associated with the target tag. For example, the intermediate service platform may maintain a first data table characterizing the association between authorization codes and tags, and a second data table characterizing the association between tags and providers. The intermediate service platform may first look up, in the first data table, the target tag associated with the authorization code carried in the computing task, and then look up, in the second data table, the target provider associated with the target tag.
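A minimal sketch of this two-table lookup, with the table contents and the dictionary representation assumed purely for illustration:

```python
# First data table: authorization code -> associated tag (assumed contents).
auth_code_to_tag = {"AUTH-0001": "loan_intention"}

# Second data table: tag -> associated provider (assumed contents).
tag_to_provider = {"loan_intention": "provider_A"}

def resolve(auth_code):
    """Look up the target tag in the first table, then the target provider
    associated with that tag in the second table."""
    target_tag = auth_code_to_tag[auth_code]
    target_provider = tag_to_provider[target_tag]
    return target_tag, target_provider
```

In a real deployment these tables would be database tables rather than in-memory dictionaries; only the two-step lookup order comes from the description above.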
The intermediate service platform may then send a scoring query request to the target provider's server by performing step 204. The scoring query request may include the at least one user identification and the target tag. Further, when the target tag belongs to a certain scene of a certain industry, the scoring query request may further include the industry and the scene to which the target tag belongs.
In practice, a scoring model associated with the target tag is deployed on the server of the target provider. The scoring model may be a model owned by the target provider or a model provided to the target provider through the intermediate service platform, and it may be a set of scoring rules or a machine learning model for score prediction, which is not limited herein.
For each user identifier in the at least one user identifier, if the server has already predicted, using the scoring model, the score of the user indicated by that user identifier under the target tag, the server may retrieve the ready-made score for the user; otherwise, the server may obtain the user data of the user and use the scoring model to predict, on the fly, the user's score under the target tag from that user data.
In one example, the target label may be a training label used in training the scoring model. In this case, the scoring model predicts, from the input user data, the score of the corresponding user under the target label, so the user's score under the target tag is usually a single score.
In another example, the target tag may have multiple sub-tags. For example, when the target tag is a user profile, its sub-tags may include, but are not limited to, male, female, prefers shopping, prefers wealth management, and the like. The sub-labels may be the training labels used in training the scoring model; in this case, the scoring model predicts, from the input user data, a score for the corresponding user under each of the sub-labels, so the user's score under the target tag is typically a plurality of scores.
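The two cases can be sketched as follows; the callable model interface `(user_data, label) -> score` is an assumption, not the disclosed model interface:

```python
def score_under_tag(model, user_data, sub_tags=None):
    """Without sub-tags, the model yields one score for the target tag;
    with sub-tags, it yields one score per sub-tag.

    `model` is assumed to be any callable taking (user_data, label) -> score.
    """
    if not sub_tags:
        # Single-score case: the target tag itself was the training label.
        return model(user_data, "target")
    # Multi-score case: one score per sub-tag (e.g. user-profile sub-tags).
    return {sub: model(user_data, sub) for sub in sub_tags}
```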
After obtaining, according to the scoring query request, the scores corresponding to the at least one user identifier, the server of the target provider may return a query result to the intermediate service platform. The query result may include each user identifier in the at least one user identifier and the score, under the target tag, of the user indicated by that identifier. Further, when the target tag has multiple sub-tags and the scoring model associated with the target tag predicts scores for those sub-tags, the query result may also include the sub-tag corresponding to each score.
Thereafter, the intermediate service platform may receive the query result returned by the server of the target provider by performing step 206.
Then, the intermediate service platform may generate a calculation result corresponding to the calculation task according to the query result by executing step 208.
Specifically, in one example, if the target provider is a single provider, the intermediate service platform may directly take the query result as the calculation result in step 208. In another example, if the target provider comprises a plurality of providers, in step 208 the intermediate service platform may employ various processing means to generate the calculation result from the query results returned by the plurality of providers.
For example, the intermediate service platform may take the query result of any one of the plurality of providers as the calculation result. Alternatively, the intermediate service platform may perform the following generating steps: step a, merging the query results of the plurality of providers into a first query result; step b, de-duplicating the first query result, and for any tag (e.g., the target tag, or a sub-tag of the target tag) corresponding to a plurality of different scores, performing either of the following processing operations: retaining the maximum of the plurality of different scores in the first query result and deleting the others; or averaging the plurality of different scores and replacing them with the single average; step c, taking the processed first query result as the calculation result.
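Generating steps a-c above can be sketched as follows, assuming (as an illustration only) that each provider's query result is represented as a dict mapping `(user_id, tag)` to a score:

```python
from collections import defaultdict
from statistics import mean

def merge_provider_results(results, strategy="max"):
    """Sketch of generating steps a-c for multiple providers.

    `results` is a list with one dict per provider, each mapping
    (user_id, tag) -> score.
    """
    # Step a: merge all per-provider results into one first query result.
    combined = defaultdict(list)
    for provider_result in results:
        for key, score in provider_result.items():
            combined[key].append(score)
    # Step b: de-duplicate -- keep only the maximum score, or replace the
    # duplicates with their single average, per the chosen operation.
    resolve = max if strategy == "max" else mean
    # Step c: the processed first query result is the calculation result.
    return {key: resolve(scores) for key, scores in combined.items()}
```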
In some embodiments, the computing task may further include a task type, and in step 208 the intermediate service platform may generate the calculation result according to both the task type and the query result.
In general, the task type may be, for example, user ranking, user filtering, normalization calculation, or sub-label score prediction. In practice, when the target tag has no sub-tags, i.e., when the scoring model associated with the target tag predicts a score for the target tag itself, the task type may be, for example, user ranking, user filtering, or normalization calculation. When the target tag has sub-tags, i.e., when the scoring model predicts scores for the sub-tags, the task type may be, for example, sub-label score prediction.
When the task type of the computing task is normalization computing or sub-tag score prediction, the intermediate service platform may perform the specific implementation described above with respect to step 208 to generate the computing result.
When the at least one user identifier comprises a plurality of user identifiers and the task type of the computing task is user ranking, the computing task may further include a quantity upper limit, which may be expressed either as a percentage or as a natural number, and is not specifically limited herein. In this case, in step 208, the intermediate service platform may first determine, from the query result, the final score under the target tag of the user indicated by each of the plurality of user identifiers. It may then sort the plurality of user identifiers in descending order of score to obtain a user identifier sequence, and select, starting from the position of the user identifier with the highest score, at least part of the user identifiers according to the quantity upper limit, taking the selected identifiers as the calculation result.
Specifically, when the quantity upper limit is expressed as a percentage, the intermediate service platform may determine the product of the quantity upper limit and the number of user identifiers as a target quantity, and select that many user identifiers from the start of the user identifier sequence. When the quantity upper limit is a natural number, the intermediate service platform may select that number of user identifiers from the start of the sequence. In some embodiments, when the number of user identifiers is less than or equal to the target quantity or the quantity upper limit, the intermediate service platform may directly take the whole user identifier sequence as the calculation result.
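The user-ranking task type can be sketched as follows; representing the percentage form as a fraction, and truncating the percentage product to an integer, are assumptions (the disclosure does not fix a rounding rule):

```python
def rank_users(final_scores, upper_limit):
    """Sketch of the user-ranking task type.

    `final_scores` maps user_id -> final score under the target tag;
    `upper_limit` is either a fraction (percentage form, e.g. 0.5 for 50%)
    or a natural number of users.
    """
    # Sort user identifiers in descending order of score.
    sequence = sorted(final_scores, key=final_scores.get, reverse=True)
    if isinstance(upper_limit, float):
        target = int(upper_limit * len(sequence))   # percentage form
    else:
        target = upper_limit                        # natural number
    if len(sequence) <= target:
        return sequence   # fewer users than the limit: return the whole sequence
    return sequence[:target]
```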
When the task type of the computing task is user filtering, in step 208, the intermediate service platform may first determine, from the query result, the final score under the target tag of the user indicated by each of the at least one user identifier, then judge whether that score exceeds a score threshold, and generate the calculation result according to the judgment. The calculation result may include each user identifier and a flag indicating whether the final score of the indicated user under the target tag exceeds the score threshold. By way of example, when the score exceeds the score threshold, the flag may be, for example, "YES", "Y", or "yes"; when it does not, the flag may be, for example, "NO", "N", or "no", which is not specifically limited herein.
It should be noted that the score threshold may be preset by the intermediate service platform or specified by the customer (for example, carried in the computing task), and is not specifically limited herein.
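The user-filtering task type can be sketched as follows; "Y"/"N" is one of the example flag encodings mentioned above, and "exceeds" is read as strictly greater than:

```python
def filter_users(final_scores, threshold):
    """Sketch of the user-filtering task type: flag each user according to
    whether the final score under the target tag exceeds the threshold."""
    return {user_id: ("Y" if score > threshold else "N")
            for user_id, score in final_scores.items()}
```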
In some embodiments, after step 208, the intermediate service platform may then perform step 212 as shown in FIG. 2, returning the results of the computation to the customer.
In practice, the computing task may be a real-time computing task or an offline computing task. The intermediate service platform may provide the customer with a first creation entry for real-time computing tasks and a second creation entry for offline computing tasks, and the computing task in step 202 may be created and submitted by the customer through either entry.
By way of example, the first creation entry may be presented on the label square interface described above, and the second creation entry may be presented on a dedicated offline task management interface, which is not specifically limited herein.
Further, in addition to a use application entry, each label on the label square interface may also correspond to a first creation entry, for example the immediate use button shown in fig. 6, which is another schematic diagram of the label square interface. It should be noted that, for any label presented on the label square interface, when the customer has obtained the right to use that label, the corresponding immediate use button is generally clickable; when the customer has not, the button is generally not clickable.
When the computing task is a real-time computing task, the intermediate service platform may actively return the calculation result: after performing step 208, it may immediately perform step 212 to return the calculation result to the customer. When the computing task is an offline computing task, the intermediate service platform may return the calculation result passively: after performing step 208, it may first receive the customer's acquisition request for the calculation result by performing step 210 as shown in fig. 2, and then return the calculation result by performing step 212.
It should be noted that, when the calculation task is a real-time calculation task, the aforementioned scoring model associated with the target tag may be specifically a scoring model associated with the target tag in a real-time calculation service. When the calculation task is an offline calculation task, the scoring model associated with the target tag may specifically be a scoring model associated with the target tag in an offline calculation service. It should be noted that the scoring models respectively associated with the target tag in the real-time computing service and the offline computing service may be the same model or different models, and are not specifically limited herein.
In addition, the server of the target provider may provide the real-time query interface and the offline query interface to the intermediate service platform. When the computing task is a real-time computing task, in step 204, the intermediate service platform may send a scoring query request to the server through the real-time query interface. When the computing task is an offline computing task, in step 204, the intermediate service platform may send a scoring query request to the server through the offline query interface.
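The interface selection in step 204 can be sketched as follows; the dictionary-based task and provider representations, and the interface key names, are assumptions for illustration:

```python
def send_scoring_query(task, provider):
    """Dispatch the scoring query request to the provider interface that
    matches the task type: real-time tasks go through the real-time query
    interface, offline tasks through the offline query interface."""
    if task["type"] == "realtime":
        return provider["realtime_interface"](task)
    return provider["offline_interface"](task)
```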
In some embodiments, an offline computing task may be either a single task or a looping task. As the names imply, a single task is executed only once, while a looping task is executed periodically. When the offline computing task is a looping task, after performing step 208 for it, the intermediate service platform may perform step 204 for it again when the next computing period after the current computing period arrives.
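A minimal sketch of this single/looping distinction (the task representation and field names are assumptions):

```python
from datetime import datetime, timedelta

def next_execution(task, completed_at):
    """A single task has no next execution; a looping task re-enters
    step 204 one computing period after the current one completes."""
    if task["kind"] == "single":
        return None                       # executed only once
    return completed_at + task["period"]  # next computing period
```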
With the computing task processing method provided by the embodiment corresponding to fig. 2, the intermediate service platform can receive a computing task submitted by a customer, obtain the scoring data output by the customer's target provider in an online, efficient, and compliant manner according to the computing task, generate the calculation result corresponding to the computing task from that scoring data, and subsequently return the calculation result to the customer. The intermediate service platform thus connects customer requirements with compliant use of provider data, helping providers output their own data legally and compliantly in order to serve enterprises.
In addition, the method reduces the difficulty for customers of using provider data, shielding them from the access complexity of different providers' data sources and from the differences between different providers' structured data. It also lets customers use provider data efficiently: accessing a new provider's data requires no major modification of the platform, and once a provider has completed access, customers can use the new data labels by themselves.
In practice, for the offline computing service, the intermediate service platform may allow the customer to specify a model identifier when creating an offline computing task, so that the scoring data predicted by the specified scoring model (the model indicated by the model identifier) can be obtained when the task is executed. It should be noted that, if the scoring model indicated by the model identifier has not been pre-deployed on the server of the target provider, the intermediate service platform may first deploy it there after receiving the offline computing task submitted by the customer.
On this basis, in some embodiments, the computing task in step 202 may be an offline computing task that also includes the model identifier of a scoring model. After step 202 and before step 204, when the scoring model is stored on the intermediate service platform and has not been pre-deployed to the server of the target provider, a model deployment flow as shown in fig. 7 may be performed. The model deployment flow comprises the following steps:
step 2032, the intermediate service platform sends a second approval request to the second approval end according to the offline computing task, so that after the offline computing task passes the task feasibility approval, the second approval end deploys the scoring model to the server of the target provider through the model processing end;
step 2034, the intermediate service platform receives the model deployment completion notification returned by the second approval end.
Specifically, in step 2032, the second approval request may be a request to approve the feasibility of the task, and may include, for example, the offline computing task itself or at least part of it, which is not specifically limited herein. The second approval end may be, for example, a client or a server used by the aforementioned operation department for task approval; the second approval end and the first approval end may be the same approval end or different approval ends, which is not limited herein.
In one example, the second approval end may automatically approve the task feasibility of the offline computing task according to the second approval request and a deployed task approval algorithm. In another example, manual intervention may be employed at the second approval end to perform the task feasibility approval. It should be understood that various approval methods may be adopted when performing the task feasibility approval of an offline computing task, and are not specifically limited herein.
After completing the approval according to the second approval request, if the offline computing task fails the task feasibility approval, the second approval end may return a task creation failure result to the intermediate service platform, which may forward it to the customer. If the offline computing task passes the task feasibility approval, the second approval end may send a model deployment request to the model processing end.
The model deployment request may include, but is not limited to, the model identifier. Further, the offline computing task may also include the provider identifier of the target provider, in which case the model deployment request may also include that provider identifier. It should be noted that the model processing end may be a server used for model generation and model deployment by the research and development department of the intermediate service platform, or by a third-party research and development team, and is not specifically limited herein.
After receiving the model deployment request, the model processing end may deploy the scoring model to the server of the target provider using various deployment methods. In one example, the model processing end may automatically obtain the scoring model indicated by the model identifier according to the model deployment request and an existing model deployment algorithm, and deploy it to the server of the target provider. In another example, manual intervention may be employed to further communicate the model deployment requirements with the customer and then perform the deployment based on the final requirements.
It should be noted that modeling is a labor-intensive task; therefore, the modeling service can be provided to the customer on the intermediate service platform by a third-party research and development team. In this way, the intermediate service platform can rapidly scale its service to customers by leveraging third-party research and development teams.
After the scoring model is successfully deployed, the model processing end may return a model deployment completion notification to the second approval end, which in turn returns it to the intermediate service platform. On this basis, in step 2034, the intermediate service platform may receive the model deployment completion notification returned by the second approval end. Optionally, after step 2034, the intermediate service platform may forward the notification to the customer.
In practice, for the offline computing service, the intermediate service platform may also support customer-defined scoring models. Specifically, the customer may provide modeling requirements to the intermediate service platform, and the intermediate service platform may create a scoring model according to those requirements.
On this basis, in some embodiments, the scoring model indicated by the model identifier may be a model owned by the intermediate service platform or a model generated according to the customer's modeling requirements. When the scoring model is generated according to the customer's modeling requirements, a model creation flow as shown in fig. 7 may also be performed before step 202. The model creation flow comprises the following steps:
step 2012, the intermediate service platform receives a model creation request from the customer, the model creation request including the modeling requirements;
step 2014, the intermediate service platform sends a third approval request to the second approval end according to the model creation request, so that after the modeling requirements pass the feasibility approval, the second approval end obtains, through the model processing end, a scoring model generated at least according to the modeling requirements;
step 2016, the intermediate service platform receives the scoring model returned by the second approval end;
step 2018, the intermediate service platform generates a model identifier for the scoring model.
Specifically, in step 2012, the model creation request may be submitted by the customer through, for example, a model creation entry provided by the intermediate service platform. The model creation request includes at least the modeling requirements, which may cover, for example, the purpose of the model and its applicable industries and scenarios, and are not specifically limited herein. Optionally, the model creation request may also include a positive sample set, a negative sample set, and/or feature screening rules, or identification information of at least one of the three; when the model creation request includes the identification information, the customer has uploaded the corresponding content to the intermediate service platform in advance. It is noted that the positive samples in the positive sample set and the negative samples in the negative sample set may both be user identifiers, such as mobile phone numbers, IMEIs, IDFAs, or OAIDs.
Next, in step 2014, the intermediate service platform may send a third approval request to the second approval end according to the model creation request. The third approval request may be a request to approve the feasibility of the modeling requirements and may include the modeling requirements; optionally, it may further include the positive sample set, the negative sample set, and/or the feature screening rules described above.
In one example, the second approval end may automatically approve the feasibility of the modeling requirements according to the third approval request and a deployed feasibility approval algorithm for modeling requirements. In another example, manual intervention may be employed at the second approval end to approve the feasibility of the modeling requirements. It should be understood that various approval methods may be adopted in the feasibility approval of modeling requirements, and are not specifically limited herein.
After completing the approval according to the third approval request, if the modeling requirements fail the feasibility approval, the second approval end may return a model creation failure result to the intermediate service platform, which may forward it to the customer. If the modeling requirements pass the feasibility approval, the second approval end may send a modeling request to the model processing end as described above.
The modeling request may include the modeling requirements and, optionally, the positive sample set, the negative sample set, and/or the feature screening rules described above. After receiving the modeling request, the model processing end may create the scoring model using various modeling methods. In one example, the model processing end may automatically create the scoring model based on the modeling request and a deployed modeling algorithm. In another example, the scoring model may be created from the modeling request with manual intervention.
After the scoring model is successfully created, the model processing end may return it to the second approval end, which, for example, returns it to the intermediate service platform. On this basis, the intermediate service platform may receive the scoring model returned by the second approval end by performing step 2016, and may then generate a model identifier for the scoring model by performing step 2018. Subsequently, the intermediate service platform may also return a model creation completion notification to the customer; the notification may include, but is not limited to, the model identifier.
With further reference to FIG. 8, the present specification provides one embodiment of a computing task processing device that may be applied to an intermediary service platform as shown in FIG. 1.
As shown in fig. 8, the calculation task processing device 800 of the present embodiment may include: a first receiving unit 801, a first transmitting unit 802, a second receiving unit 803, and a generating unit 804. The first receiving unit 801 is configured to receive a computing task submitted by a customer, where the computing task includes an authorization code and at least one user identifier, the authorization code indicates that the customer has been authorized to use a target tag, and the target tag is associated with a target provider of the customer; the first sending unit 802 is configured to send a scoring query request to a server of a target provider, where the scoring query request includes the at least one user identifier and a target tag; the second receiving unit 803 is configured to receive a query result returned by the server, where the query result includes a user identifier in the at least one user identifier, and a score of the user indicated by the user identifier under the target tag, where the score is predicted by a scoring model associated with the target tag in the server; the generating unit 804 is configured to generate a calculation result corresponding to the calculation task according to the query result.
In some embodiments, the device 800 may further include a second sending unit for sending information to the customer, a third sending unit for sending information to a first approval end, a third receiving unit for receiving information returned by the first approval end, a fourth sending unit for sending information to a second approval end, a fourth receiving unit for receiving information returned by the second approval end, and so on.
In some embodiments, the first receiving unit 801 may be further configured to receive, before the computing task submitted by the customer is received, a use application submitted by the customer for the target tag; the third sending unit may be configured to send a first approval request to the first approval end according to the use application; the third receiving unit may be configured to receive an approval result returned by the first approval end; the generating unit 804 may be further configured to generate the authorization code for the customer in response to the approval result being that the customer is allowed to use the target tag; and the second sending unit may be configured to return the approval result to the customer.
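The use-application flow above — application, first-approval decision, authorization-code generation, result returned to the customer — can be sketched as below. The `secrets.token_hex` code format and the `approve` callback standing in for the first approval end are illustrative assumptions, not details from the patent.

```python
import secrets

AUTH_CODES = {}  # (customer id, target tag) -> issued authorization code

def handle_use_application(customer_id, target_tag, approve):
    """Process a customer's use application for a target tag.

    `approve` stands in for the first approval end's decision logic.
    """
    if not approve(customer_id, target_tag):
        return {"approved": False}
    code = secrets.token_hex(8)  # opaque authorization code for this customer/tag
    AUTH_CODES[(customer_id, target_tag)] = code
    return {"approved": True, "authorization_code": code}

def verify_authorization(customer_id, target_tag, code):
    """Later, a submitted computing task can be checked against the issued code."""
    return AUTH_CODES.get((customer_id, target_tag)) == code
```

A submitted computing task would then carry the returned code, and the platform would call `verify_authorization` before querying the provider.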
In some embodiments, the computing task is a real-time computing task or an offline computing task.
In some embodiments, the computing task is a real-time computing task, and the second sending unit may be configured to return the calculation result to the customer after the generating unit 804 generates the calculation result corresponding to the computing task.
In some embodiments, the computing task is an offline computing task; the first receiving unit 801 may be further configured to receive, after the generating unit 804 generates the calculation result corresponding to the computing task, a request from the customer to obtain the calculation result; and the second sending unit may be configured to return the calculation result to the customer.
In some embodiments, the offline computing task is a recurring task, and the first sending unit 802 may be further configured to send the scoring query request to the server of the target provider again when the next computation cycle following the current computation cycle of the offline computing task arrives.
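A recurring offline task simply re-issues the scoring query once per computation cycle. A toy sketch, with a plain loop standing in for a real scheduler (cron, a timer queue, etc.) and `send_query` standing in for the request to the target provider's server:

```python
def run_recurring(send_query, cycles):
    """Re-issue the scoring query for each of `cycles` computation cycles.

    `send_query(cycle_index)` is a hypothetical stand-in for "send a scoring
    query request to the server of the target provider"; in practice each
    iteration would be triggered when the next cycle's start time arrives,
    not executed back-to-back as here.
    """
    results = []
    for cycle in range(cycles):
        results.append(send_query(cycle))
    return results
```

Each cycle's result could then be stored until the customer requests it, matching the offline delivery model described above.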
In some embodiments, the computing task is an offline computing task that further includes a model identifier of a scoring model, the scoring model being stored in the intermediate service platform. The fourth sending unit may be configured to send, before the first sending unit 802 sends the scoring query request to the server of the target provider, a second approval request to the second approval end according to the offline computing task, so that after the offline computing task passes task feasibility approval, the second approval end deploys the scoring model to the server via a model processing end. The fourth receiving unit may be configured to receive model deployment completion notification information returned by the second approval end.
In some embodiments, the scoring model is generated according to modeling requirements of the customer. The first receiving unit 801 may be further configured to receive, before the computing task submitted by the customer is received, a model creation request from the customer that includes the modeling requirements. The fourth sending unit may be configured to send a third approval request to the second approval end according to the model creation request, so that after the modeling requirements pass feasibility approval, the second approval end obtains the scoring model via the model processing end at least according to the modeling requirements. The fourth receiving unit may be configured to receive the scoring model returned by the second approval end, and the generating unit 804 may be further configured to generate the model identifier for the scoring model.
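The model lifecycle implied here — a scoring model is created on request, given a model identifier by the platform, and later referenced by an offline task — can be sketched with a hypothetical in-memory registry. The function names, the `uuid`-based identifier format, and the dict-shaped task are illustrative assumptions:

```python
import uuid

MODEL_REGISTRY = {}  # model identifier -> scoring model (opaque object here)

def register_scoring_model(model):
    """Store a newly created scoring model and generate its model identifier."""
    model_id = uuid.uuid4().hex
    MODEL_REGISTRY[model_id] = model
    return model_id

def build_offline_task(authorization_code, user_identifiers, model_id):
    """An offline task carries the model identifier so the platform can later
    have the referenced model deployed to the provider's server."""
    if model_id not in MODEL_REGISTRY:
        raise KeyError(f"unknown scoring model: {model_id}")
    return {"authorization_code": authorization_code,
            "user_identifiers": user_identifiers,
            "model_id": model_id}
```

On submission of such a task, the platform would look the model up by its identifier and trigger the deployment approval flow before any scoring query is sent.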
In some embodiments, the computing task further includes a task type, and the generating unit 804 may be further configured to generate the calculation result according to the task type and the query result.
In some embodiments, the task type may be any one of the following: user ranking, user filtering, normalization calculation, or sub-tag score prediction.
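One plausible reading of three of these task types is sketched below (sub-tag score prediction depends on the model itself and is omitted). The `threshold` parameter and the min-max interpretation of "normalization calculation" are assumptions for illustration:

```python
def generate_result(task_type, query_result, threshold=0.5):
    """Post-process the query result according to the task type.

    `query_result` maps each user identifier to that user's score
    under the target tag, as returned by the provider's server.
    """
    if task_type == "user_ranking":
        # Users ordered from highest to lowest score.
        return sorted(query_result, key=query_result.get, reverse=True)
    if task_type == "user_filtering":
        # Keep only users whose score clears the (hypothetical) threshold.
        return [uid for uid, s in query_result.items() if s >= threshold]
    if task_type == "normalization":
        # Min-max scale the scores into [0, 1].
        lo, hi = min(query_result.values()), max(query_result.values())
        span = (hi - lo) or 1.0
        return {uid: (s - lo) / span for uid, s in query_result.items()}
    raise ValueError(f"unsupported task type: {task_type}")
```

For example, a customer running a marketing campaign might use `user_filtering` to obtain only the identifiers whose score under the target tag exceeds the threshold.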
In some embodiments, the target tag may belong to a specific scenario in a specific industry; and/or a single user identifier may include a mobile phone number, an International Mobile Equipment Identity (IMEI), an advertising identifier (IDFA), or an anonymous device identifier (OAID); and/or the scoring model may be provided by the intermediate service platform to the target provider.
For the device embodiment corresponding to FIG. 8, the detailed processing of each unit and its technical effects can be found in the description of the foregoing method embodiments, and are not repeated here.
An embodiment of the present specification further provides a computing task processing method based on cloud communication, applied to an intermediate service platform in a cloud communication platform, including: receiving a computing task submitted by a customer, where the computing task includes an authorization code and at least one user identifier, the authorization code indicates a target tag that the customer has been authorized to use, and the target tag is associated with a target provider of the customer; sending a scoring query request to a server of the target provider, where the scoring query request includes the at least one user identifier and the target tag; receiving a query result returned by the server, where the query result includes a user identifier among the at least one user identifier and a score, under the target tag, of the user indicated by that user identifier, the score being predicted by a scoring model associated with the target tag in the server; and generating, according to the query result, a calculation result corresponding to the computing task.
An embodiment of the present specification further provides a computing task processing device based on cloud communication, applied to an intermediate service platform in a cloud communication platform, including: a first receiving unit configured to receive a computing task submitted by a customer, the computing task including an authorization code and at least one user identifier, the authorization code indicating a target tag that the customer has been authorized to use, the target tag being associated with a target provider of the customer; a first sending unit configured to send a scoring query request to a server of the target provider, the scoring query request including the at least one user identifier and the target tag; a second receiving unit configured to receive a query result returned by the server, the query result including a user identifier among the at least one user identifier and a score, under the target tag, of the user indicated by that user identifier, the score being predicted by a scoring model associated with the target tag in the server; and a generating unit configured to generate, according to the query result, a calculation result corresponding to the computing task.
An embodiment of the present specification further provides a tag use application method, including: receiving a use application submitted by a customer for a target tag; sending a first approval request to a first approval end according to the use application; receiving an approval result returned by the first approval end; and in response to the approval result being that the customer is allowed to use the target tag, generating an authorization code associated with the target tag for the customer and returning the approval result to the customer.
An embodiment of the present specification further provides a tag use application device, including: a first receiving unit configured to receive a use application submitted by a customer for a target tag; a sending unit configured to send a first approval request to a first approval end according to the use application; a second receiving unit configured to receive an approval result returned by the first approval end; and a processing unit configured to, in response to the approval result being that the customer is allowed to use the target tag, generate an authorization code associated with the target tag for the customer and return the approval result to the customer.
An embodiment of the present specification further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed in a computer, causes the computer to perform the computing task processing method and the tag use application method described in the above method embodiments.
An embodiment of the present specification further provides a computing device including a memory and a processor, where the memory stores executable code, and the processor, when executing the executable code, implements the computing task processing method and the tag use application method described in the above method embodiments.
An embodiment of the present specification further provides a computer program which, when executed in a computer, causes the computer to perform the computing task processing method and the tag use application method described in the above method embodiments.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the embodiments disclosed herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The foregoing describes in further detail the objects, technical solutions, and advantages of the embodiments disclosed in the present specification. It should be understood that the above are only specific embodiments of the disclosure and are not intended to limit its scope; any modifications, equivalent substitutions, improvements, and the like made on the basis of the technical solutions of the disclosed embodiments shall fall within the scope of the disclosed embodiments.

Claims (14)

1. A computing task processing method based on cloud communication, applied to an intermediate service platform in a cloud communication platform, comprising: receiving a computing task submitted by a customer, the computing task comprising an authorization code and at least one user identifier, the authorization code indicating a target tag that the customer has been authorized to use, the target tag being associated with a target provider of the customer; sending a scoring query request to a server of the target provider, the scoring query request comprising the at least one user identifier and the target tag; receiving a query result returned by the server, the query result comprising a user identifier among the at least one user identifier and a score, under the target tag, of the user indicated by that user identifier, the score being predicted by a scoring model associated with the target tag in the server; and generating, according to the query result, a calculation result corresponding to the computing task.
2. A computing task processing method, applied to an intermediate service platform, comprising: receiving a computing task submitted by a customer, the computing task comprising an authorization code and at least one user identifier, the authorization code indicating a target tag that the customer has been authorized to use, the target tag being associated with a target provider of the customer; sending a scoring query request to a server of the target provider, the scoring query request comprising the at least one user identifier and the target tag; receiving a query result returned by the server, the query result comprising a user identifier among the at least one user identifier and a score, under the target tag, of the user indicated by that user identifier, the score being predicted by a scoring model associated with the target tag in the server; and generating, according to the query result, a calculation result corresponding to the computing task.
3. The method according to claim 2, further comprising, before the receiving of the computing task submitted by the customer: receiving a use application submitted by the customer for the target tag; sending a first approval request to a first approval end according to the use application; receiving an approval result returned by the first approval end; and in response to the approval result being that the customer is allowed to use the target tag, generating the authorization code for the customer and returning the approval result to the customer.
4. The method according to claim 2, wherein the computing task is a real-time computing task or an offline computing task.
5. The method according to claim 4, wherein the computing task is the real-time computing task, and the method further comprises, after the generating of the calculation result corresponding to the computing task: returning the calculation result to the customer.
6. The method according to claim 4, wherein the computing task is the offline computing task, and the method further comprises, after the generating of the calculation result corresponding to the computing task: receiving a request from the customer to obtain the calculation result, and returning the calculation result to the customer; and/or, in response to the offline computing task being a recurring task, continuing to perform the sending of the scoring query request to the server of the target provider when the next computation cycle following the current computation cycle of the offline computing task arrives.
7. The method according to claim 4, wherein the computing task is the offline computing task, the offline computing task further comprises a model identifier of the scoring model, and the scoring model is stored in the intermediate service platform; and the method further comprises, before the sending of the scoring query request to the server of the target provider: sending a second approval request to a second approval end according to the offline computing task, so that after the offline computing task passes task feasibility approval, the second approval end deploys the scoring model to the server via a model processing end; and receiving model deployment completion notification information returned by the second approval end.
8. The method according to claim 7, wherein the scoring model is generated according to modeling requirements of the customer; and the method further comprises, before the receiving of the computing task submitted by the customer: receiving a model creation request from the customer, the model creation request comprising the modeling requirements; sending a third approval request to the second approval end according to the model creation request, so that after the modeling requirements pass feasibility approval, the second approval end obtains the scoring model via the model processing end at least according to the modeling requirements; receiving the scoring model returned by the second approval end; and generating the model identifier for the scoring model.
9. The method according to claim 2, wherein the computing task further comprises a task type; and the generating of the calculation result corresponding to the computing task according to the query result comprises: generating the calculation result according to the task type and the query result.
10. The method according to claim 9, wherein the task type is any one of the following: user ranking, user filtering, normalization calculation, and sub-tag score prediction.
11. The method according to claim 2, wherein the target tag belongs to a specific scenario in a specific industry; and/or a single user identifier comprises any one of the following: a mobile phone number, an International Mobile Equipment Identity (IMEI), an advertising identifier (IDFA), and an anonymous device identifier (OAID); and/or the scoring model is provided by the intermediate service platform to the target provider.
12. A computing task processing device, applied to an intermediate service platform, comprising: a first receiving unit configured to receive a computing task submitted by a customer, the computing task comprising an authorization code and at least one user identifier, the authorization code indicating a target tag that the customer has been authorized to use, the target tag being associated with a target provider of the customer; a first sending unit configured to send a scoring query request to a server of the target provider, the scoring query request comprising the at least one user identifier and the target tag; a second receiving unit configured to receive a query result returned by the server, the query result comprising a user identifier among the at least one user identifier and a score, under the target tag, of the user indicated by that user identifier, the score being predicted by a scoring model associated with the target tag in the server; and a generating unit configured to generate, according to the query result, a calculation result corresponding to the computing task.
13. A computer-readable storage medium on which a computer program is stored, wherein when the computer program is executed in a computer, the computer is caused to perform the method according to any one of claims 1-11.
14. A computing device, comprising a memory and a processor, wherein executable code is stored in the memory, and the method according to any one of claims 1-11 is implemented when the processor executes the executable code.
CN202111401119.6A 2021-11-19 2021-11-19 Computing task processing method and device based on cloud communication Pending CN114036568A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111401119.6A CN114036568A (en) 2021-11-19 2021-11-19 Computing task processing method and device based on cloud communication


Publications (1)

Publication Number Publication Date
CN114036568A true CN114036568A (en) 2022-02-11

Family

ID=80145299


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102196012A (en) * 2010-03-17 2011-09-21 华为技术有限公司 Service opening method, system and service opening server
CN103916395A (en) * 2014-04-09 2014-07-09 北京京东尚科信息技术有限公司 Method, device and system for service calling
CN110543498A (en) * 2019-08-20 2019-12-06 武汉稀云科技有限公司 An event-triggered multi-party data association query method and device
CN110825539A (en) * 2019-11-07 2020-02-21 中国联合网络通信集团有限公司 Business processing method and device
CN111027086A (en) * 2019-12-16 2020-04-17 支付宝(杭州)信息技术有限公司 Private data protection method and system
CN111046425A (en) * 2019-12-12 2020-04-21 支付宝(杭州)信息技术有限公司 Method and device for risk identification by combining multiple parties
CN111523812A (en) * 2020-04-24 2020-08-11 同盾控股有限公司 Model life cycle management method and system, equipment and storage medium
CN111815439A (en) * 2020-07-23 2020-10-23 睿智合创(北京)科技有限公司 Credit scoring system based on cloud platform



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40067042

Country of ref document: HK